Bots_alive uses your smartphone to drive artificially intelligent spider robots

Artificial intelligence is all the rage in robotics these days, and for good reason: Properly implemented, it has the potential to program ‘bots on the fly. That’s the promise behind Cozmo, the AI-powered robot from Anki. And it’s the conceit of the Professor Einstein, the intelligent toy from Hanson Robotics.

But those toys and others react in predictable ways to changing contexts and situations. One startup claims to have developed an algorithm capable of generating entirely new behaviors dynamically.


It’s called bots_alive, and it’s the brainchild of Brad Knox. Knox, who completed a doctoral dissertation in artificial intelligence at the University of Texas at Austin, worked with the Personal Robotics Group at MIT’s Media Lab on “Learning from the Wizard,” a project in which a robot learns to emulate its human puppeteer’s control. That research informed the development of bots_alive, a low-cost AI robotics platform.


The impetus, Knox said, was to design a robot that behaved in a personable, human-like way. “We all want robots we can interact with, but there aren’t any products on the market that come close,” he said. “It came out of conversations about complex AI in research. We wanted to make something that’s valuable now — deliver on the promise of machine learning, given the limitations of current technology.”


The novelty of bots_alive lies in the way it interacts with its surroundings. AI programmers typically give robots personalities with decision trees, Knox explained, hand-authoring the rules by which they behave in any given situation. The kind of artificial intelligence embodied by bots_alive, by contrast, is entirely free-form. “We don’t always know what the robot will do,” he said.
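To make the contrast concrete, here is a minimal sketch of the hand-authored, decision-tree style of robot "personality" described above. All function and action names are illustrative, not from bots_alive; the point is that every behavior is an explicit rule, so the same inputs always produce the same action.

```python
def decision_tree_policy(sees_blue: bool, sees_red: bool, red_distance: float) -> str:
    """Return an action for a (greatly simplified) perception state."""
    if sees_red and red_distance < 0.2:
        return "turn_away"        # rule: treat nearby red blocks as obstacles
    if sees_blue:
        return "drive_forward"    # rule: approach blue blocks
    return "wander"               # rule: default exploratory motion
```

Because the mapping from situation to action is fixed by the programmer, the robot's reactions are fully predictable — exactly the quality Knox says bots_alive avoids.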

It requires a bit of human guidance, initially. An “improviser” operates the robot over a long period of time, generating data in what Knox calls “puppet sessions.” From that data, the bots_alive machine learning algorithm generates a model, assigning probabilities to outcomes. The end result, Knox said, is “lifelike authenticity” — a robot personality that reacts subtly but differently to changing environmental conditions.
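The learning step Knox describes can be sketched in miniature. This is not bots_alive's actual algorithm — it is a toy illustration, with made-up state and action names, of the general idea: count how often the human puppeteer chose each action in each observed situation, then sample actions in proportion to those counts.

```python
import random
from collections import Counter, defaultdict

def fit_policy(demonstrations):
    """Build per-state action counts from (state, action) puppeteer logs."""
    counts = defaultdict(Counter)
    for state, action in demonstrations:
        counts[state][action] += 1
    return counts

def sample_action(counts, state, rng=random):
    """Sample an action with probability proportional to its observed count."""
    actions = counts[state]
    r = rng.random() * sum(actions.values())
    for action, n in actions.items():
        r -= n
        if r <= 0:
            return action

# Toy "puppet session" data: the puppeteer usually drives toward blue,
# but occasionally pauses — the model preserves that variation.
demos = [("blue_ahead", "forward"), ("blue_ahead", "forward"),
         ("blue_ahead", "pause"), ("red_ahead", "turn_left")]
policy = fit_policy(demos)
```

Because actions are sampled rather than fixed, the robot sometimes pauses and sometimes drives forward in the same situation — one plausible mechanism behind the subtle, "lifelike" variation the article describes.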

It’s alive!

Knox demonstrated the technology’s potential during a Skype conversation. He placed the robot near a handful of blue blocks and red blocks, and defined two simple rules: The robot was to move toward blue blocks and perceive red blocks as barriers.

First, he placed a blue block in the center of the robot’s vision. It moved imperfectly, hesitantly toward it. (Knox described the motion as “authentic” and “organic.”) Then, Knox placed a blue block behind a wall of red blocks. The robot easily charted a path around the wall.

“Through real-world interaction, we were able to affect the development of its behavior.”

The next scenario was a little more challenging: an unbroken ring of red blocks encircling the robot, with a blue block just beyond reach. Impressively, the robot inched backward and forward until it nudged open a gap in the ring and escaped through it.

It’s an example of spontaneous behavior, Knox said — of the robot doing something the team didn’t train it to do. “Through real-world interaction, we were able to affect the development of its behavior.”

It’s not the only example. In play tests, users have placed blue blocks on top of stacked red blocks, Knox said, and the robot has knocked the stacks over. “Nowhere in the operations data is it told to push the blocks,” he said.


And this is just the beginning. Over-the-air software updates will enable new features like nonverbal signs of social interaction between robots, Knox said. If the Kickstarter campaign reaches its first stretch goal, users will be able to pit two robots against each other in a robot battle to the death. And enterprising programmers will be able to teach the robots new skills.

Knox believes these robots have disruptive potential. That’s thanks both to their ease of use, he said — bots_alive leverages a smartphone for processing and a system of QR codes to track the positions of the robot and its cubes — and, crucially, to their price point. “It’s a fun and varied user experience,” he said, “and it’s affordable compared to other robots with cutting-edge AI.”

And when it comes to the software’s applicability, the sky’s the limit, Knox said. “It’s very easily translatable to any remote-controlled robot that’s controlled via Bluetooth,” he said. “We don’t have explicit plans, but one of the main things that we’re looking forward to in the Kickstarter campaign is what people would value. If there’s a very strong, resounding call, then we’ll consider it.”

Bots_alive launches on January 24. It’s expected to ship later this year. For $60, you get the full kit, including the Hexbug Spider, decals, five vision blocks, an IR blaster, and the mobile app. If you pay $85, you get the same kit plus an extra Hexbug Spider. You can learn more on the company’s website or back it on Kickstarter right now.

Kyle Wiggers
Former Digital Trends Contributor