Inside the mind of an autonomous delivery robot

In the summer of 2014, Ahti Heinla, one of the software engineers who helped develop Skype, started taking photos of his house.

There is nothing particularly unusual about this, of course. Only he kept on doing it. Month after month, as summer turned to fall and fall gave way to winter, Heinla went out to the same exact spot on the sidewalk and snapped new, seemingly identical pictures of his home. Was the man who had played a crucial role in building a multibillion-dollar telecommunications app losing his mind? As it turned out, there was an entirely logical reason for Heinla’s actions — although it might have nonetheless sounded a bit crazy to anyone who asked what he was doing. Ahti Heinla was helping future autonomous robots learn how to see.

More than half a decade later, the world (or, at least, a few select parts of it) is reaping the rewards of Heinla’s seemingly oddball experiment. As the co-founder of a startup called Starship Technologies, alongside Skype co-founder Janus Friis, Heinla has helped build a fleet of self-driving delivery robots. These robots, which resemble six-wheeled coolers, have traveled tens of thousands of miles around the world, making more than 100,000 deliveries in the process. They are particularly prevalent on a growing number of university campuses, although they have also traversed streets in cities ranging from San Francisco to Milton Keynes in the U.K.

To order something from one of Starship’s delivery robots, a customer simply selects the item they want from one of Starship’s delivery partners. For a small delivery fee, the robot will then pick up the item and autonomously deliver it to the spot of the customer’s choosing. All the customer has to do is unlock the robot using the app and retrieve the order. Simple, right?

As with any such solution, however, the simpler things seem from the user’s perspective, the more complex they are technologically. Here in 2020, we’re used to hearing about self-driving cars that can navigate the world with impressive ease. As one of the first companies to roll out self-driving vehicles without human safety drivers, Starship Technologies has played a key role in making autonomous technologies like these a part of everyday life.

We shouldn’t take these tools for granted, though. Not only are they amazing feats of engineering and computer science, but the choices currently being made surrounding these technologies will help determine the future of human and robot interactions.

Maps aren’t built for robots

Do you remember the overwhelming feeling of starting a new school as a kid and having to navigate your way around? Perhaps, if you lived close by, you even walked from home to school on your own or with friends. Normally, those trips were preceded by ones on which we were accompanied by a parent or guardian who could give us tips about how to navigate the world around us. They might have walked with us the first few times to ensure we were familiar with a certain path. They probably pointed out certain landmarks, such as signs or particularly memorable buildings. Before long, we formed a mental map of where we were going and how to navigate there.

The shortest route (green) is not always the fastest and safest. The robot will prefer a route that is longer in distance but quicker and safer.

This ability, which most of us take for granted, is what Starship Technologies has worked hard to develop for its robots. In some ways, it is a surprisingly complicated one. Take maps, for instance. When Starship’s robots set out to navigate from point A to point B, they start by using satellite imagery to help them plan out the journey. A routing algorithm is then used to figure out the shortest and safest path for the robot to take. So far, so simple, right? Except that it isn’t.

As Heinla says: “We can’t use a lot of existing maps because they are not really made for robots; they are made for humans.” Existing mapping systems assume a level of human knowledge, such as an understanding of which part of the road we should walk on, and how we should maneuver on a busy sidewalk. These are all things a robot doesn’t necessarily understand. There are plenty of added complexities.

For example, think about how your behavior while walking across a driveway differs from your behavior on a regular sidewalk. We might not think of them as especially different, but they are. If one of Starship’s robots encounters an obstacle on the sidewalk, its response is to stop in its tracks, because stopping is the safest thing to do. But stopping on a driveway, or while crossing a street, blocks access for vehicles. It requires learning a totally different type of behavior.
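In software terms, this might look like a policy keyed on the surface type beneath the robot. The sketch below is purely illustrative — the surface categories and response names are assumptions, not Starship’s actual control logic:

```python
from enum import Enum, auto

class Surface(Enum):
    SIDEWALK = auto()
    DRIVEWAY = auto()
    CROSSING = auto()

def obstacle_response(surface: Surface) -> str:
    """Choose a response to an unexpected obstacle based on surface type.

    On a sidewalk, stopping in place is the safest option. On a
    driveway or road crossing, stopping would block vehicles, so the
    robot should clear the vehicle path instead of freezing.
    """
    if surface is Surface.SIDEWALK:
        return "stop_and_wait"
    return "clear_vehicle_path"

print(obstacle_response(Surface.DRIVEWAY))  # clear_vehicle_path
```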

To help understand the kind of behavior its robots should use, Starship has developed machine learning tools that segment maps into a series of interconnected colored lines representing sidewalks (in green), crossings (in red), and driveways (in purple). Rather than simply selecting the shortest route in terms of distance, the robot determines the quickest route by attaching a cost to every scenario it will encounter over the course of a journey.
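The article doesn’t describe the planner itself, but a plausible minimal version is a Dijkstra-style shortest-path search where each map segment’s cost reflects its type rather than raw distance. The per-meter costs below are invented for illustration:

```python
import heapq

# Illustrative per-meter costs: crossings and driveways are "slower"
# (riskier) than sidewalks, so the planner avoids them when it can.
COST_PER_METER = {"sidewalk": 1.0, "driveway": 1.5, "crossing": 4.0}

def cheapest_route(graph, start, goal):
    """Dijkstra over a map graph whose edges are (neighbor, meters, kind)."""
    frontier = [(0.0, start, [start])]
    visited = set()
    while frontier:
        cost, node, path = heapq.heappop(frontier)
        if node == goal:
            return cost, path
        if node in visited:
            continue
        visited.add(node)
        for neighbor, meters, kind in graph.get(node, []):
            if neighbor not in visited:
                step = meters * COST_PER_METER[kind]
                heapq.heappush(frontier, (cost + step, neighbor, path + [neighbor]))
    return float("inf"), []

# Two routes from A to D: a short one over a crossing, a longer sidewalk detour.
graph = {
    "A": [("B", 20, "crossing"), ("C", 50, "sidewalk")],
    "B": [("D", 10, "sidewalk")],
    "C": [("D", 15, "sidewalk")],
}
print(cheapest_route(graph, "A", "D"))  # picks the longer A-C-D sidewalk route
```

With weights like these, the planner reproduces the trade-off shown in the routing image above: the 65-meter all-sidewalk detour beats the 30-meter route over a crossing.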

Recognizing the world around them

After this, Starship’s robots head out into the real world, using a bevy of 10 cameras to observe their surroundings in 360 degrees. Special image-recognition systems divide the world up into thousands of lines, giving each robot a simplified wireframe view to use as guideposts. Over time, as the company’s robots spend longer in one area, they build up collaborative three-dimensional wireframe maps of entire areas, making it far easier for future robots to understand the scenery around them.
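Starship hasn’t published its vision stack, but the general class of technique — reducing each camera frame to line segments — can be sketched with off-the-shelf tools such as OpenCV’s Canny edge detector and probabilistic Hough transform:

```python
import cv2
import numpy as np

def frame_to_lines(frame: np.ndarray) -> np.ndarray:
    """Reduce a camera frame to straight line segments (x1, y1, x2, y2).

    Edge pixels are found with Canny, then grouped into segments with
    the probabilistic Hough transform. Persistent segments (curbs,
    building edges) can then serve as landmarks across drives.
    """
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, 50, 150)
    lines = cv2.HoughLinesP(edges, 1, np.pi / 180, 80,
                            minLineLength=30, maxLineGap=10)
    return lines if lines is not None else np.empty((0, 1, 4), dtype=np.int32)

# Synthetic test frame: two white "edges" on a black background.
frame = np.zeros((240, 320, 3), dtype=np.uint8)
cv2.line(frame, (20, 200), (300, 185), (255, 255, 255), 2)  # a "curb"
cv2.line(frame, (60, 10), (60, 150), (255, 255, 255), 2)    # a "building edge"
print(len(frame_to_lines(frame)), "segments found")
```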

Different-colored lines (yellow and blue) represent the edges that different robots detected while driving. Later, the server figures out that the lines from different robots match; the robot’s location is then known, and those pieces of driving can be put together like a puzzle.

“It’s just like the way you might direct a person: continue until you hit a yellow building, then turn right and continue until the church,” Heinla said. “The robot also has landmarks, but they’re not yellow buildings or churches; they’re abstract shapes.”
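A toy version of the server-side matching described in the caption above might greedily pair line segments from two drives by endpoint distance and orientation; enough low-cost pairs imply both robots saw the same scenery. The similarity metric and thresholds here are made up for illustration:

```python
import math

def segment_cost(a, b):
    """Crude dissimilarity between two segments (x1, y1, x2, y2):
    endpoint distances plus a penalty for differing orientation."""
    dist = math.dist(a[:2], b[:2]) + math.dist(a[2:], b[2:])
    angle_a = math.atan2(a[3] - a[1], a[2] - a[0])
    angle_b = math.atan2(b[3] - b[1], b[2] - b[0])
    return dist + 50 * abs(angle_a - angle_b)

def match_segments(drive_a, drive_b, max_cost=40.0):
    """Greedily pair line segments observed on two different drives.

    Enough low-cost pairs imply both robots saw the same scenery,
    pinning down where the two drives overlap (the "puzzle" fit).
    """
    matches, used = [], set()
    for seg in drive_a:
        candidates = [(segment_cost(seg, other), j)
                      for j, other in enumerate(drive_b) if j not in used]
        if not candidates:
            break
        cost, j = min(candidates)
        if cost <= max_cost:
            used.add(j)
            matches.append((seg, drive_b[j]))
    return matches

a = [(0, 0, 10, 0), (5, 5, 5, 20)]
b = [(1, 0, 11, 1), (40, 40, 60, 40)]
print(len(match_segments(a, b)))  # 1: only the first pair of segments aligns
```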

The last stage of the robots’ mapping process is to work out exactly where the sidewalk is and how wide it is. This is done using both the onboard cameras and the 2D map derived from satellite imagery.

“Even something as simple as walking down the sidewalk is something we’ve learned from the time that we were very young,” Heinla said. “We take it for granted. But for machines, it’s something that needs to be taught. There are things like whether you pass an approaching person on the left or the right. If someone slower than you is walking ahead, do you slow down or pass them? If you slow down, how close should you get to the other person? If you get too close, the other person will get uncomfortable. All of these we have to teach to the machine.”
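Rules like these could be written down as an explicit policy, though in practice they are more likely learned from data. A deliberately simple sketch, with invented speed and clearance thresholds:

```python
def passing_decision(robot_speed, pedestrian_speed, lateral_clearance_m,
                     comfort_gap_m=0.8):
    """Toy version of the social rules Heinla describes.

    Overtake only if the pedestrian ahead is meaningfully slower AND
    there is room to pass without entering their comfort zone;
    otherwise match their pace.
    """
    slower_ahead = pedestrian_speed < 0.8 * robot_speed
    room_to_pass = lateral_clearance_m >= comfort_gap_m
    if slower_ahead and room_to_pass:
        return "overtake"
    if slower_ahead:
        return "slow_down_and_follow"  # too tight to pass comfortably
    return "maintain_speed"

print(passing_decision(1.5, 1.0, 1.2))  # overtake
print(passing_decision(1.5, 1.0, 0.5))  # slow_down_and_follow
```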

Should all go according to plan (and, to date, it has), Starship’s robots will be able to navigate to the destination users select on the map.

How do we want robots to interact with people?

This isn’t a challenge that’s unique to Starship Technologies. A number of other companies, ranging from Nuro to BoxBot, are exploring their own self-driving robot-delivery services. But it goes far beyond robots that can bring us takeout or groceries when we’re too busy (or lazy) to go to the shops. As robots play a bigger role in our lives, the question of how to integrate them within our world is becoming more pressing.

There is no need to take into account small static obstacles, such as these poles, when defining the drivable area for the robot. They are mapped using sensor input while driving, and the robot will avoid them automatically later on.

Robots have traditionally performed very well in lab conditions, where every variable can be perfectly controlled. They have also been largely separated from people for safety reasons. Now they are moving into the real world in a big way. If we’re not used to the sight of robots on our streets now, we certainly will be by the time the 2020s come to an end.

“Every week in our autonomous driving team, we have a meeting where, for one hour, our safety team shows the autonomous driving engineers some of the most interesting things that have happened during the last [seven days],” Heinla said. “These interesting things are either places where there has been some discomfort, the robot has done exceptionally well driving, or [where there have been] some unusual weather conditions or objects.”

Some of these problems involve robots being able to comprehend our world. That is what Heinla was testing when he took photos outside his house in the early days of Starship Technologies. He wanted to know whether a robot would be able to recognize his house as, well, his house, regardless of whether it was a sunny summer’s day or a rainy winter’s evening. It turns out that it could — and that insight helped spawn a whole company (or maybe even an entire delivery industry).
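We don’t know what method Heinla actually used, but his experiment is easy to approximate today with classical feature matching — for example, ORB descriptors, which are fairly robust to lighting changes. The file names and thresholds below are placeholders:

```python
import cv2

def looks_like_same_place(path_a: str, path_b: str,
                          min_good_matches: int = 25) -> bool:
    """Match ORB features between two photos of the same spot.

    If enough strong keypoint matches survive a distance cut, the
    scenes are judged to be the same place despite seasonal changes.
    """
    imgs = [cv2.imread(p, cv2.IMREAD_GRAYSCALE) for p in (path_a, path_b)]
    if any(img is None for img in imgs):
        raise FileNotFoundError("supply two photos of the same location")
    orb = cv2.ORB_create(nfeatures=1000)
    descriptors = [orb.detectAndCompute(img, None)[1] for img in imgs]
    if any(d is None for d in descriptors):
        return False  # no usable features in one of the images
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    good = [m for m in matcher.match(*descriptors) if m.distance < 50]
    return len(good) >= min_good_matches

# Placeholder file names; any two photos of the same scene will do.
print(looks_like_same_place("house_summer.jpg", "house_winter.jpg"))
```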

Research such as this — part engineering, part sociology — is all about finding answers to how humans and machines can better coexist. Is it worse for a robot to be overly cautious or too reckless? What happens when delivery robots encounter guide dogs? Data from this new field of research is being gathered and used to tweak the algorithms that power robots made by companies such as Starship Technologies.

One day, we’ll thank them for it. For now, though, it’s just important that we understand the decisions they make — and the reasons they make them.
