
Man hacks Kinect to play World of Warcraft with gestures


You know you’ve got a hit device on your hands when it inspires people to start hacking it and getting creative. Microsoft’s Kinect is new to the market, but it is already making its presence felt beyond video games. A group of researchers and students at the University of Southern California’s Institute for Creative Technologies has hacked the motion camera and retooled it to…wait for it…play World of Warcraft! That’s right, the Xbox 360 video game peripheral can now double as a PC game peripheral.

Fortunately, the group has nobler goals than pwning troggs in Loch Modan. They’re creating software that maps a person’s skeletal movements to keys on a keyboard. If the project succeeds, you might be able to do almost anything on a computer with motions and gestures, and the software could potentially help people recovering from traumatic accidents as well. An early toolkit is already available for anyone hoping to experiment or contribute.
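To make the idea concrete, here is a minimal sketch of gesture-to-keypress mapping in Python. Everything in it is an illustrative assumption: the joint layout, the pose rules, and the send_key stub are not the USC toolkit’s actual interface, just a toy version of the same concept.

```python
# Hypothetical sketch: read one frame of Kinect skeleton data and translate
# simple poses into key presses. Not the real toolkit API.
from dataclasses import dataclass


@dataclass
class Joint:
    x: float  # position in meters, relative to the sensor
    y: float
    z: float


def send_key(key: str) -> None:
    """Stand-in for a real key-injection call (e.g. a virtual keyboard driver)."""
    print(f"pressing {key!r}")


def map_pose_to_keys(head: Joint, left_hand: Joint, right_hand: Joint) -> None:
    # Right hand raised above the head -> move forward ('w').
    if right_hand.y > head.y:
        send_key("w")
    # Left hand raised above the head -> trigger an action bar slot ('1').
    if left_hand.y > head.y:
        send_key("1")


# Example frame: right hand raised, left hand at the player's side.
map_pose_to_keys(
    head=Joint(0.0, 1.6, 2.0),
    left_hand=Joint(-0.3, 1.0, 2.0),
    right_hand=Joint(0.3, 1.8, 2.0),
)
```

A real implementation would run this check on every skeleton frame the sensor delivers and inject actual OS-level key events, but the core loop is exactly this: compare joint positions, fire the mapped key.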

Six years ago, technology like this seemed like science fiction. Thanks to the Wii and the Xbox 360, Minority Report-style airscreens could practically happen today. It’s amazing how video games are helping to advance technology and interfaces, and it’s great to see a university recognizing and celebrating the possibilities of Kinect. They may be doing a better job than Microsoft at this point. Though it initially threatened legal action against some hackers, Microsoft has recently backpedaled and begun supporting Kinect innovations.

Jeffrey Van Camp