
Hover Junkers devs show how to turn your pet into an alien, with the power of VR

One of the reasons Valve and HTC added the Chaperone and camera system to the Vive virtual reality headset was to stop people from tripping over their pets while flailing around in VR. Stress Level Zero, the developer of Hover Junkers, went a step further, however, and put a dog in the game by attaching a controller to its back.

Hover Junkers is a first-person shooter that makes good use of the Vive’s hand-tracked controllers to offer gun battles with real-world accuracy in a bleak desert world. But when the servers aren’t populated with robots and scrap hunters battling it out for various junk around the wasteland, it can be quite a lonely place – so the developers added their dog to the game.

Admittedly, when you put the headset on, the little guy doesn’t look much like a dog anymore. In fact, he’s transformed into a grotesque worm/grub-thing, but he’s animated and, more importantly, fully tracked within the game.

Although this is more of an experiment than a realistic way to track your pet, it does open up some interesting potential for in-game pets, especially when you consider that the actual Lighthouse tracking sensors are smaller than a quarter. It’s not beyond the realm of possibility to imagine a dog collar with a few of them on it, giving you a heads-up should your pet wander into the room.

Since we’re talking VR-tracked pets, we do need to give props to arguably the first person to (at least publicly) perform this feat. Charles Alexander, the developer of Into the Arcade, managed to get his dog tracked in-game too – and even placed a virtual laser on the dog’s back!

Into Arcade™ for HTC Vive -- Mixed-Reality Gameplay Teaser (dog-tracking thrown in for fun)

While this might be one way to keep your pet safe while you flail in VR, we can imagine even greater possibilities. A VR game that turns your real pet into a virtual gameplay companion could be a hoot – and unlike anything we’ve seen in gaming before.

Jon Martindale