If ‘The Onion’ is correct, Samsung wearable gives new meaning to the term ‘hardware’ (please be correct, Onion)

It’s less than 24 hours before vacation begins, so let’s start the day on a lighthearted note with The Onion’s latest satirical report on wearable computing. As you know, wearable tech is currently all the rage: Fitness bands, watches, Google Glass, and mind-reading headbands are just the beginning. But these devices fulfill needs we didn’t know we had. Tech companies are always trying to push new desires with their latest products, as if we couldn’t have lived before being able to snap photos from computing glasses strapped around our faces. The Onion’s version of Samsung, however, knows you better.

Introducing the (fake, please note this is fake) Samsung Apex: A new wearable computing device that streams video into one eye, Internet into the other, and simultaneously gives you a blow job. That’s not a typo, and yes, ladies, there are models designed for you too. Can we give fake Samsung the, heh, “Hardware of the Year” award right now?

The Onion’s Tech Trends segment is one of our favorites (aside from the obvious “Hey, they’re making fun of my industry!”) and the Samsung Apex is totally on point with today’s culture. After all, someone who spends that much time with their gadgets couldn’t possibly have a fulfilling sex life on their own… could they?

Just joking, we’re not gonna judge. Watch the hilarious video below, and beware of NSFW language.

Aaron Colter
Former Digital Trends Contributor