Virginia Tech grad builds an iPhone-controlled beer cannon

If you are not a Jedi but do enjoy a cold beer, you may want to check this out. Ryan Rusnak, a 25-year-old Virginia Tech graduate, and some friends have invented something we could all use: an iPhone-controlled beer launcher called the BeerBot. The device, a dorm-sized refrigerator topped with a table, lets you select the beer you'd like and then tosses it to you, reports Asylum. Where can we buy one?

The device, which would have handily won best of show at CES had it made an appearance, uses an ioBridge IO-204 microcontroller to connect to an iPhone app that lets you choose a beer, aim the machine, and fire. Compressed air at 50 psi launches the can across the room. The fridge can hold four types of beer, uses a webcam for aiming, and can even tweet every time it tosses a beer.
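How might the app talk to the fridge? Here's a rough sketch of that select-aim-fire flow in Python. To be clear, the address, routes, and parameter names below are invented for illustration; Rusnak's actual ioBridge configuration isn't public, so treat this as a guess at the shape of the thing, not the real code.

```python
# Hypothetical sketch of the BeerBot's select-aim-fire flow. The URL,
# routes, and parameters are all made up for illustration; this is not
# Rusnak's actual ioBridge setup.
import requests

BEERBOT_URL = "http://beerbot.local"  # placeholder address for the fridge

def select_beer(slot: int) -> None:
    """Pick one of the four beer types loaded in the fridge."""
    requests.post(f"{BEERBOT_URL}/select", data={"slot": slot}, timeout=5)

def aim(pan_degrees: float) -> None:
    """Swing the launcher; the webcam feed is what you line the shot up with."""
    requests.post(f"{BEERBOT_URL}/aim", data={"pan": pan_degrees}, timeout=5)

def fire() -> None:
    """Trigger the 50 psi blast of compressed air that sends the can flying."""
    requests.post(f"{BEERBOT_URL}/fire", timeout=5)

# Example: grab a beer from slot 2, swing 15 degrees toward the couch, fire.
select_beer(2)
aim(15.0)
fire()
```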

“Once I had a working Internet vending machine, I wasn’t satisfied,” said Rusnak. “I called Graham to decide how we could get the beer from the fridge to the couch. After a few days of deliberation, we somehow arrived that compressed air would just be awesome.”

It took Rusnak and his friends more than three months to get the beer vending machine and cannon working, but he claims it was totally worth the effort.

“The cool part of the microcontroller is that, in theory, anyone could log in and control the fridge,” Rusnak told The Register. “Our next big idea is throwing a party where people could log in and throw cans of beer at us.”

Now all we have to do is get these kids to drink better beer.

Jeffrey Van Camp
Former Digital Trends Contributor