Tesla pulls latest Full Self-Driving beta less than a day after release

False collision warnings and other issues have prompted Tesla to pull the latest version of its Full Self-Driving (FSD) beta less than a day after rolling it out for some vehicle owners.

Tesla decided to temporarily roll back to version 10.2 of FSD on Sunday following reports from some drivers of false collision warnings, sudden braking without any apparent reason, and the disappearance of the Autosteer option, among other issues.

In a tweet, Tesla chief Elon Musk confirmed that his team had decided to pull version 10.3 until the bugs have been dealt with.

“Seeing some issues with 10.3, so rolling back to 10.2 temporarily,” Musk said in his tweet, adding: “Please note, this is to be expected with beta software. It is impossible to test all hardware configs in all conditions with internal QA, hence public beta.”

Tesla debuted FSD 12 months ago for a small number of select drivers. Despite the name, FSD is actually a driver-assistance feature, with drivers expected to keep their hands on the wheel and their eyes on the road at all times.

FSD version 10.2 launched earlier this month, with the beta software made available to drivers who scored perfect marks on Tesla’s new Safety Score test, which analyzes vehicle data to calculate a safety rating for each driver. For 10.3, Tesla relaxed the entry requirement slightly, allowing eligible owners with a score of 99/100 to receive the FSD software. The system is designed to give Tesla a degree of confidence that the FSD beta is being tested by drivers who exhibit a high level of responsibility behind the wheel.

The performance of vehicles with driver-assistance systems is closely monitored by regulators. Tesla has faced criticism for the names that it uses for its own systems — Autopilot and Full Self-Driving — which some believe can lead Tesla owners to be less attentive when behind the wheel.

In August, the National Highway Traffic Safety Administration launched a probe into Tesla’s Autopilot system after a number of crashes that saw its cars slam into emergency vehicles. The safety agency has also been looking into other crashes involving Tesla vehicles.

With the ongoing scrutiny and the automaker’s reputation at stake, Tesla has taken the cautious route by quickly reversing the latest FSD beta release until it’s able to resolve the reported issues.

Trevor Mogg
Contributing Editor