The companies racing to deploy autonomous cars on the world’s roads got a reality check in the 2010s, but multimillion-dollar development efforts continue across the automotive and tech industries. German supplier Bosch is notably moving full speed ahead with its quest to make driverless cars a reality. Kay Stepper, Bosch’s senior vice president of automated driving, sat down with Digital Trends to talk about the state of autonomous driving in 2020 and what’s next for the artificial intelligence technology that powers the prototypes the company is testing.
Bosch has never made a car, so it brings its innovations to the market through partnerships with automakers. It chose Mercedes-Benz parent company Daimler to test autonomous technology in real-world conditions via a ridesharing pilot program in San Jose, California, close to one of the company’s research centers. Stepper explained that, while engineers learn a lot from software-based simulations, field testing is still crucial.
“We’re trying to learn [from the program] what are some of the use cases we need to continue to refine and tune for testing and validation purposes,” Stepper said. For example, autonomous cars struggle to recognize logging trucks whose trailers have an axle at the front, one at the rear, and only wood in between, because there’s no metal connecting the two ends. San Jose certainly isn’t the epicenter of America’s logging industry, but it’s the kind of hurdle that needs to be cleared to bring autonomous technology beyond urban centers. Tracking pedestrians, chickens, motorcyclists, and anything else that could cross a road with little or no warning is challenging, too.
Sociology also plays a role in developing self-driving technology. Michael from Billings, Montana, doesn’t have the same driving habits as Haruto in Tokyo, or Jean-Pierre in France, so it stands to reason that autonomous cars need to adapt to their environment. That’s where artificial intelligence comes to the rescue.
“The aim is to really predict what’s going to be the next move for traffic participants — cars, bicyclists, scooters, you name it,” Stepper explained. Adapting to local traffic conditions has its limits: Engineers won’t program a car to speed or run a red light, even if the locals do it without thinking twice.
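To make the idea concrete, here is a deliberately simplified, hypothetical sketch of what “predicting the next move” of a traffic participant can look like in code: a constant-velocity extrapolation of each tracked road user’s position. Real prediction stacks, Bosch’s included, rely on learned, context-aware models; every name and number below is invented purely for illustration.

```python
# Hypothetical, minimal illustration of motion prediction: extrapolate each
# tracked road user's position assuming it keeps its current velocity.
# Real systems use learned, context-aware models; this is only a toy sketch.
from dataclasses import dataclass


@dataclass
class TrackedObject:
    label: str   # "car", "bicyclist", "scooter", ...
    x: float     # position in meters (ego-vehicle frame)
    y: float
    vx: float    # velocity in meters per second
    vy: float


def predict_position(obj: TrackedObject, horizon_s: float) -> tuple[float, float]:
    """Constant-velocity guess of where the object will be in `horizon_s` seconds."""
    return obj.x + obj.vx * horizon_s, obj.y + obj.vy * horizon_s


if __name__ == "__main__":
    cyclist = TrackedObject(label="bicyclist", x=12.0, y=-3.5, vx=-1.2, vy=4.0)
    print(predict_position(cyclist, horizon_s=2.0))  # -> (9.6, 4.5)
```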
An expensive endeavor
Navigating this complex latticework requires a tremendous amount of hardware and software not found in regular-production cars, which makes a car that drives itself extremely expensive. Bosch is working on bringing costs down in the coming years, in part by developing its own lidar in-house. The unit will complement the radars, cameras, and other sensors currently fitted to the prototypes participating in the San Jose pilot program.
“For us, level 4 won’t happen without lidar. Maybe in the future, we’ll find other sensing solutions, but at the moment, we need all four,” Stepper explained. Most automakers agree, though a few, like Tesla, are bypassing lidar entirely.
Level 4 is the second-highest level on the autonomous driving scale created by the Society of Automotive Engineers (SAE). It corresponds to a system in which the car drives itself without any human input when the right conditions are met. The next and final level, 5, denotes a car that operates autonomously all of the time, regardless of the weather, the road it’s on, and so forth. Stepper uses a more straightforward scale: Instead of numbers, he cleverly refers to the various levels as “feet off, hands off, eyes off, mind off, and human off.”
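For readers who want that mapping spelled out, here is a minimal, purely illustrative Python sketch pairing each SAE level with Stepper’s shorthand. The names, comments, and the exact level-to-phrase pairing are inferred from the article (level 4 is “mind off,” level 5 is “human off”), not taken from any Bosch software.

```python
# Illustrative only: SAE J3016 driving-automation levels paired with
# Kay Stepper's informal shorthand, as described in the article.
from enum import IntEnum


class SAELevel(IntEnum):
    DRIVER_ASSISTANCE = 1       # "feet off"  — e.g., adaptive cruise control
    PARTIAL_AUTOMATION = 2      # "hands off" — car steers and accelerates, driver supervises
    CONDITIONAL_AUTOMATION = 3  # "eyes off"  — car drives itself, driver must be ready to take over
    HIGH_AUTOMATION = 4         # "mind off"  — no human input needed when conditions are met
    FULL_AUTOMATION = 5         # "human off" — autonomous everywhere, in all conditions


STEPPER_SHORTHAND = {
    SAELevel.DRIVER_ASSISTANCE: "feet off",
    SAELevel.PARTIAL_AUTOMATION: "hands off",
    SAELevel.CONDITIONAL_AUTOMATION: "eyes off",
    SAELevel.HIGH_AUTOMATION: "mind off",
    SAELevel.FULL_AUTOMATION: "human off",
}

if __name__ == "__main__":
    for level, nickname in STEPPER_SHORTHAND.items():
        print(f"SAE level {int(level)}: {nickname}")
```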
A gradual process
The S-Class sedans participating in the San Jose pilot program run level 4 technology (so, mind off in Stepper-speak), but an engineer rides in the driver’s seat at all times to take over in case something goes wrong. They’re also prototypes; you can’t buy one, lease one, or take one for a spin. Odds are you can’t even hail one to travel across town, because the program is only open to Bosch and Daimler associates via a purpose-designed app. The partners will make the program available to a wider audience as soon as possible, and the experience gained while teaching cars how to drive will echo far beyond the boundaries of transportation.
“We need to make artificial intelligence beneficial to everyone, and to human lives on a daily basis. Home-use health care products, for example. We can train them to diagnose medical conditions, detect warning signs, and so on. Or, the SoundSee robot currently floating around the International Space Station, 250 miles above us. It might seem far away for most humans, but this research affects all of us,” Stepper said.
In the meantime, he predicted autonomous technology will gradually trickle into American cities over the next five years, and it will slowly spread across the nation’s interstates. “We will deploy autonomous taxis in other cities, and we’ll apply the technology to other modes of transportation, like commercial vehicles,” he concluded.
In other words, your next car won’t be autonomous, and neither will the one you replace it with, but it might travel from the factory to your town on the back of a self-driving truck, and you could hail a lounge-like driverless cab to go pick it up.