
MYO armband lets users control Macs and PCs through gestures


We might still have a ways to go before gesture control becomes the norm, but we've at least been making headway toward making it more accessible to everyone. There's the Kinect, the Leap, and now there's also the MYO armband by Thalmic Labs.

To use the MYO, you strap it somewhere above the elbow and perform gestures to issue commands. The company claims it will work out of the box with Macs and PCs when it ships sometime in late 2013, and it can also be used with iOS and Android. It has the potential to be used in even more applications: in the video below, you can see the MYO being used to fire a video game gun with one's hand and to control a small military vehicle.

You can also see the different gesture controls that work with the MYO, comprising a number of arm, hand, and finger gestures. To scroll pages, for instance, you wave two fingers in the air, similar to how you'd do it on a Mac trackpad. You can also hold your palm up in a classic stop gesture to pause a video, then gesture backward to rewind. Unlike the Kinect, the MYO doesn't use a camera, so gestures are limited to the arm the high-tech band is attached to.

The armband communicates with whatever computer or device you're using via a low-power Bluetooth connection. While the armband itself won't be available until late in the year, the company is making the API available today to iOS and Android developers in hopes that they'll create applications for it. If you want to get your hands on one of the limited number of MYO armbands at release, you can preorder a unit today via Thalmic Labs' official website for $149.
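Since Thalmic Labs hasn't published the API's details yet, any code at this point is speculative. Still, as a rough sketch of the kind of app developers might build with it, here's what mapping recognized gestures to media commands could look like. Every name below (the `myo_sdk` module, the pose names, the `connect`/`on`/`run_forever` calls) is invented purely for illustration and is not the actual MYO API:

```python
# Hypothetical sketch only: the real MYO API had not shipped when this was
# written, so the "myo_sdk" module and every call below are invented to
# illustrate the gesture-to-command idea described in the article.
import myo_sdk  # hypothetical SDK module

# Map the poses described above to media-player commands.
POSE_ACTIONS = {
    "two_finger_wave": "scroll",   # scroll a page, trackpad-style
    "palm_up_stop": "pause",       # classic stop gesture pauses a video
    "backward_sweep": "rewind",    # gesture backward to rewind
}

def on_pose(pose_name: str) -> None:
    """Called by the (hypothetical) SDK whenever a gesture is recognized."""
    action = POSE_ACTIONS.get(pose_name)
    if action is not None:
        print(f"Sending '{action}' command to the media player")

# Pair with the armband over Bluetooth LE and register the callback.
band = myo_sdk.connect()   # hypothetical: finds and pairs with the armband
band.on("pose", on_pose)   # hypothetical event registration
band.run_forever()         # hypothetical event loop
```

Whatever shape the real API ends up taking, the core pattern, listening for recognized poses and translating them into OS-level commands, is likely to look something like this.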

Mariella Moon
Former Digital Trends Contributor