Tesla issues stark warning to drivers using its Full Self-Driving mode

Tesla in recent days rolled out a long-awaited update to its Full Self-Driving (FSD) mode that gives its vehicles a slew of driver-assist features.

But in a stark warning to owners who’ve forked out for the premium FSD feature, Tesla said that the software is still in beta and therefore “may do the wrong thing at the worst time.” It insisted that drivers should keep their “hands on the wheel and pay extra attention to the road.”

https://twitter.com/tesla_raj/status/1413764413165772803

The message comes after a number of fatal accidents over the years involving Tesla vehicles where the driver may not have been paying attention while the car was in Autopilot or FSD mode.

Critics of the California-based electric-car company have long said that using terms such as Autopilot and Full Self-Driving can lead some drivers into believing that their vehicle is fully autonomous, prompting them to take their eye off the road.

Drivers are supposed to keep their hands on the wheel at all times, with Tesla incorporating a safety system that emits warnings and eventually slows the car to a halt if it detects that the driver is not touching the wheel. But videos online, as well as a recent test by a leading U.S. consumer organization, suggest the system can be easily tricked into thinking someone has their hands on the wheel or is in the driver’s seat.

Consumer Reports carried out its test shortly after a Tesla Model S crashed into a tree in Spring, Texas, in April, killing two men inside. A police report said the two occupants were found in the back seat, with no one else in the vehicle. The suggestion was that the vehicle was operating in Autopilot (the owner hadn’t purchased the FSD feature) at the time of the crash, though Tesla chief Elon Musk said at the time that vehicle data logs indicated that the car was not in Autopilot mode when the accident occurred.

Notably, a preliminary report on the crash released by the National Transportation Safety Board said that when investigators attempted to recreate the moments leading up to the fatal crash, they were unable to engage the Autosteer element of Autopilot along the stretch of road where the accident happened. It also noted that security footage from the home of the Model S owner showed both men climbing into the front seats of the vehicle just minutes before the crash. Several investigations into the accident are ongoing.

The National Highway Traffic Safety Administration revealed in June that since 2016 it has launched 30 investigations into Tesla crashes involving 10 fatalities in which the driver assistance system may have been in use. To date, it has ruled out the involvement of Tesla’s Autopilot system in three of the crashes and published reports on two of the accidents.

https://twitter.com/elonmusk/status/1413306409693892613

In a tweet posted last week, Musk said that the recent beta update for FSD mode “addresses most known issues,” but also warned that “there will be unknown issues.” He urged drivers to “please be paranoid,” adding, “Safety is always top priority at Tesla.”

Trevor Mogg
Contributing Editor