
Leaps, bounds, and beyond: Robot agility is progressing at a feverish pace

Cassie robot learns to hop, run and skip

When Charles Rosen, the A.I. pioneer who founded SRI International’s Artificial Intelligence Center, was asked to come up with a name for the world’s first general-purpose mobile robot, he thought for a moment and then said: “Well, it shakes like hell when it moves. Let’s just call it Shakey.”

Some variation of this idea has persisted through much of the history of modern robotics. Robots, we often assume, are clunky machines with as much grace as an atheist’s Sunday lunch. Even science fiction movies have repeatedly imagined robots as ungainly creations that walk with slow, halting steps.

That idea simply no longer lines up with reality.

Recently, a group of researchers from the Dynamic Robotics Laboratory at Oregon State took one of the university’s Cassie robots, a pair of walking robot legs that resembles the lower extremities of an ostrich, to a sports field to try out the lab’s latest “bipedal gait” algorithms. Once there, the robot hopped, walked, cantered, and galloped, switching seamlessly between each type of motion without having to slow down. It was an impressive demonstration, and one that speaks to the agility of current legged robots — especially when a bit of deep learning-based training is involved.

OSU/Agility Robotics

“Usually, when people apply deep reinforcement learning to robotics, they use reward functions that boil down to rewarding the neural network for closely mimicking a reference trajectory,” Jonah Siekmann, one of the researchers on the project, told Digital Trends. “Collecting this reference trajectory in the first place can be pretty difficult, and once you have a ‘running’ reference trajectory, it’s not very clear if you can also use that to learn a ‘skipping’ behavior, or even a ‘walking’ behavior.”
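As a rough illustration of the reference-trajectory approach Siekmann describes, an imitation-style reward scores each timestep by how closely the robot’s joints track a pre-recorded trajectory. This is a minimal sketch, assuming the exponential-of-tracking-error form common in locomotion reinforcement learning; the function name, scale factor, and state representation are illustrative, not the lab’s actual code.

```python
import numpy as np

def imitation_reward(joint_angles, reference_angles, scale=5.0):
    """Hypothetical reference-trajectory reward: the more closely the
    robot's joint angles mimic the reference at this timestep, the
    higher the reward (1.0 for a perfect match, decaying toward 0)."""
    error = np.sum((joint_angles - reference_angles) ** 2)
    return np.exp(-scale * error)
```

The drawback Siekmann points to is visible here: the reward is defined entirely by one recorded motion, so a network trained against a “running” reference has no obvious route to skipping or walking.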

In the OSU work, the team created a reward paradigm that scrapped the idea of reference trajectories completely. Instead, it breaks up chunks of time into “phases,” penalizing the robot for having a specific foot on the ground during a certain phase, while allowing it to do so at other points. The neural network then figures out “all the hard stuff” — such as the position the joints should be in, how much torque to apply at each joint, how to remain stable and upright — to create a reward-based design paradigm that makes it easy for robots like Cassie to learn just about any bipedal gait found in nature.
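A minimal sketch of what such a phase-based reward might look like, assuming a fixed cycle in which the left foot’s stance occupies the first half-phase and the right foot’s the second; the phase boundaries, weights, and function name are hypothetical, not the OSU implementation:

```python
def phase_reward(phase, left_contact, right_contact):
    """Hypothetical phase-based gait reward. `phase` cycles in [0, 1);
    each foot is rewarded for ground contact during its stance window
    and penalized for contact during its swing window."""
    left_stance = phase < 0.5       # left foot allowed on ground first
    right_stance = not left_stance  # right foot allowed on ground second
    reward = 0.0
    reward += 1.0 if left_contact == left_stance else -1.0
    reward += 1.0 if right_contact == right_stance else -1.0
    return reward
```

Changing which phases permit contact yields different gaits without any reference trajectory: alternating stance windows encourage walking or running, while giving both feet the same window would encourage hopping.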

Predicting the future

It’s an impressive feat, to be sure. But it also raises a larger question: How on Earth did robots get so agile? While there is still no shortage of videos online showing robots collapsing when things go wrong, there is also no doubt that the overall path they are on is one headed toward impressively smooth locomotion. Once, the idea of a robot cantering like a pony or performing a picture-perfect athletic routine would have seemed far-fetched even for a movie. In 2020, robots are getting there.

Predicting these advances isn’t easy, however. There is no simple Moore’s Law-type observation that makes it easy to map out the path robots are taking from clunky machines to smooth operators.

Moore’s Law refers to the observation made in 1965 by Gordon Moore, who went on to co-found Intel, that every one to two years, the number of components that could be squeezed onto an integrated circuit would double. While there’s an argument to be made that we may now be reaching the limits of Moore’s Law, a researcher in, say, 1991 could realistically work out, on the back of an envelope, where computer capabilities might be, in terms of calculations, in 2021. Things are more complex for robots.
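That back-of-envelope calculation is easy to reproduce. Assuming a two-year doubling period and an illustrative 1991 starting point of about 1.2 million transistors (roughly a 486-class chip):

```python
# Back-of-envelope Moore's Law projection from 1991 to 2021.
# Starting count and doubling period are illustrative assumptions.
transistors_1991 = 1.2e6
doubling_period_years = 2
years = 2021 - 1991

projected_2021 = transistors_1991 * 2 ** (years / doubling_period_years)
print(f"{projected_2021:.2e}")  # ~3.93e10, i.e. tens of billions
```

Fifteen doublings over 30 years lands in the tens of billions of transistors, which is in the right ballpark for flagship 2021 chips. There is no comparably simple recurrence for legged robots.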

Anybotics

“Even though Moore’s Law forecasted the trend in compute power astonishingly well, forecasting a trend in legged robots is like gazing into a crystal ball,” Christian Gehring, chief technology officer at ANYbotics AG, a Swiss company making legged robots that are already being used for tasks like autonomously inspecting offshore energy platforms, told Digital Trends. “In essence, legged robots are highly integrated systems relying on many different technologies like energy storage, sensing, acting, computing, networking and intelligence.”

It’s advances in this confluence of different technologies working together that make today’s robots so powerful. It is also what makes them tough to predict as far as the road map of future development goes. To build the kinds of robots that roboticists would like, there need to be advances in the creation of small and lightweight batteries, sensing and perception capabilities, cellular communications, and more. All of these will need to work together with advances in fields like deep-learning A.I. to create the kinds of machines that will forever banish images of the clunky science fiction bots we grew up watching on TV.

Smaller, cheaper, better

The good news is that it’s happening. While Moore’s Law-style gains in compute power drive advances on the software side, essential hardware components are getting smaller and cheaper, too. It’s not as neat as Gordon Moore’s formulation, but it is happening.

“Even with our Atreus science demonstrator [robot] from six or eight years ago, the power amplifiers to run our motors were these three-pound bricks; they were big,” Jonathan Hurst, co-founder of Agility Robotics, which built the aforementioned Cassie robot, told Digital Trends. “Since then, we’ve got these little, tiny amplifiers that have the same amount of current, the same amount of voltage, and give us very good control over the torque output of our motors. And they’re tiny — only an inch by two inches by a half-inch high or something like that. We’ve got 10 of those on Cassie. That adds up. You’ve got a three-pound brick that’s six inches by four inches by four inches versus maybe a couple ounces that’s an inch by two inches. It makes a big difference with things like the power electronics.”
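Working through Hurst’s rough figures shows why this matters: across the ten amplifiers on Cassie, the totals shrink by more than an order of magnitude. The numbers below come straight from the quote and are ballpark estimates only.

```python
# Comparing ten old three-pound power amplifiers against ten modern
# "couple ounce" ones, using Hurst's rough dimensions from the quote.
OUNCES_PER_POUND = 16

old_mass_lb = 10 * 3.0                      # ten 3 lb bricks
new_mass_lb = 10 * 2.0 / OUNCES_PER_POUND   # ten 2 oz amplifiers
old_volume_in3 = 10 * (6 * 4 * 4)           # 6 x 4 x 4 in bricks
new_volume_in3 = 10 * (1 * 2 * 0.5)         # 1 x 2 x 0.5 in amplifiers

print(old_mass_lb, new_mass_lb)         # 30.0 lb vs 1.25 lb
print(old_volume_in3, new_volume_in3)   # 960 in^3 vs 10.0 in^3
```

On these estimates, the amplifiers alone drop from about 30 pounds to just over a pound, and from roughly 960 to 10 cubic inches; for a bipedal robot that has to carry its own electronics, that is the difference between a lab tether and a sports field.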

UW ECE Research Colloquium, October 20, 2020: Jonathan Hurst, Oregon State University

Hurst said he believes legged robots are still in the early stages of their path to becoming ubiquitous technologies that can not only move in a naturalistic way like humans, but also function seamlessly alongside them. Some of the remaining challenges go way beyond cute (but extremely impressive) demos like making robots canter like ponies. But building smarter machines that can master different kinds of movement, and be trusted to operate in the real world, is certainly an important step.

It’s a step (or steps) that walking robots are getting better and better at all the time.

Luke Dormehl