
The greasiest VR headset yet? McDonald’s launches viewer made from Happy Meal box

It may well be the greasiest VR viewer out there, but McDonald’s is confident its Happy Goggles will be a hit with the kids who get a chance to try them out.

The Happy Goggles promotion, which takes place this month at 14 of the fast-food restaurant’s outlets across Sweden, lets the littl’uns convert their Happy Meal box into a set of VR goggles. Think of it as a smelly version of Google Cardboard.

All it takes to create the device is a few simple folds and tears along perforated lines, followed by the insertion of the provided lens pack. To complete the setup, you’ll have to be cool with slipping your smartphone into a box that a few minutes earlier contained a cheeseburger and fries, but if it’s going to keep your kid quiet for another 10 minutes, then it may be hard to resist.


McDonald’s has launched the Happy Goggles to mark the Happy Meal’s 30th anniversary in the Scandinavian country. It’s also created a skiing game for the headset called “se upp i backen” (in English: “watch out on the slopes”) that aims “to help kids understand the need to be alert on the slopes and avoid obstacles – including other skiers.” It’s even been endorsed by the Swedish alpine ski team.

McDonald’s is just the latest organization to experiment with virtual reality, though, being a fast-food chain, it’s using the platform as a promotional tool rather than as a way to enhance its service.

If its VR deal has your eyes spinning in excitement, then be patient: the Happy Goggles will be rolled out globally only if the kids of Sweden respond positively. If, on the other hand, the offer has you licking your lips in anticipation, you’re in luck – the Happy Meal is available everywhere.

Trevor Mogg
Contributing Editor