
Tesla’s handling of braking bug in self-driving test raises alarms

Tesla pushed out a new version of its experimental self-driving software suite on Oct. 23 through an “over the air” update to approved drivers. The next morning, Tesla learned the update had altered cars’ behavior in a way the company’s engineers hadn’t intended.

In a recall report to federal safety regulators dated Oct. 29, Tesla described the problem this way: The company discovered a software glitch that “can produce negative object velocity detections when other vehicles are present.”

In everyday English, Tesla’s automatic braking system was engaging for no apparent reason, causing cars to rapidly slow as they traveled down the highway, putting them at risk of being rear-ended. Forward collision warning chimes were ringing too, even though there was no impending collision to warn about.
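For readers who want a concrete picture of how a “negative object velocity detection” could cascade into unwanted braking, the short Python sketch below models a generic forward-collision heuristic. Everything in it, the function names, the thresholds and the time-to-collision rule, is an invented simplification for illustration only; Tesla has not disclosed how its system actually works.

```python
# Hypothetical, simplified illustration of how a bad velocity estimate can
# trigger phantom braking. Names and thresholds are invented for clarity;
# this is not Tesla's code or algorithm.

FCW_TTC_SECONDS = 2.5   # warn if estimated time-to-collision drops below this
AEB_TTC_SECONDS = 1.2   # brake automatically below this

def time_to_collision(gap_m: float, closing_speed_mps: float) -> float:
    """Seconds until impact if the closing speed stays constant."""
    if closing_speed_mps <= 0:          # gap is steady or growing: no collision
        return float("inf")
    return gap_m / closing_speed_mps

def assess(ego_speed_mps: float, lead_speed_mps: float, gap_m: float) -> str:
    closing = ego_speed_mps - lead_speed_mps
    ttc = time_to_collision(gap_m, closing)
    if ttc < AEB_TTC_SECONDS:
        return "AUTOMATIC BRAKING"
    if ttc < FCW_TTC_SECONDS:
        return "FORWARD COLLISION WARNING"
    return "no action"

# Lead car correctly perceived at 30 m/s, 40 m ahead: nothing to do.
print(assess(ego_speed_mps=30.0, lead_speed_mps=30.0, gap_m=40.0))   # no action

# Same scene, but a glitch reports the lead car's velocity as negative
# (driving backward toward us): the closing speed looks huge, the estimated
# time-to-collision collapses, and the car brakes for no real reason.
print(assess(ego_speed_mps=30.0, lead_speed_mps=-30.0, gap_m=40.0))  # AUTOMATIC BRAKING
```

In this toy version, a single sign error on the other car’s velocity is enough to turn an empty highway into an apparent imminent crash, which is consistent with the behavior drivers reported.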

The company said no crashes or injuries were reported. Still, the incident demonstrates how complicated these systems are: Even a small change can affect how something as vital but seemingly simple as automatic braking will function. The incident raises the question of whether there is a safe way to test self-driving vehicles at mass scale on public roads, as Tesla has been doing.

Tesla, which has disbanded its media relations department, could not be reached for comment.

Tesla’s response to the glitch raises its own concerns. While its engineers worked to fix the software, they turned off automatic braking and forward collision warning for the software testers over the weekend, the company said. According to numerous messages posted on Twitter, owners were not informed that those safety systems had been deactivated, finding out only after scrolling through the menus on their cars’ dashboard screens.

By Oct. 25, Tesla had knocked out a software fix and zapped it to 11,704 drivers enrolled in the Full Self-Driving program. The program is the company’s attempt to develop a driverless car, and is markedly different from its driver assist system called Autopilot. The latter, introduced in 2015, automates cruise control, steering and lane changing.

Autopilot is the subject of a federal safety investigation into why a dozen Teslas have crashed into police cars and other emergency vehicles parked by the roadside. Those crashes resulted in 17 injuries and one death.

Investigators are trying to learn why Tesla’s automatic emergency braking systems apparently did not engage to prevent or mitigate such crashes. The National Highway Traffic Safety Administration is looking into which system software elements control automatic braking when a car is on Autopilot and a crash is imminent. Experts have raised the possibility that Tesla is suppressing automatic braking when Autopilot is on, possibly to avoid phantom braking of the sort drivers experienced after the Full Self-Driving update.

Tesla has billed Full Self-Driving as the culmination of its push to create a car that can navigate itself to any destination with no input from a human driver. Tesla Chief Executive Elon Musk has promised for years that driverless Teslas are imminent.

The regulations on deploying such technology on public roads are spotty. There is no federal law — legislation on driverless technology has been gummed up in Congress for years, with no action expected soon.

And though California requires companies testing driverless technology on public roads to report even minor crashes and system failures to the state Department of Motor Vehicles, Tesla doesn’t do so, according to DMV records. Companies including Argo AI, Waymo, Cruise, Zoox, Motional and many others comply with DMV regulations. The Times has asked repeatedly over several months to speak with department director Steve Gordon to explain why Tesla gets a pass, but every time he’s been deemed unavailable.

In May, the DMV announced a review of Tesla’s marketing practices around Full Self-Driving. The department has declined to discuss the matter beyond saying, as it did Tuesday, that the review continues.

Like Tesla, other companies developing autonomous driving systems use human drivers to supervise public road testing. But where they employ trained drivers, Tesla uses its customers.

Tesla charges customers $10,000 for access to periodic iterations of Full Self-Driving Capability software. The company says it qualifies beta-test drivers by monitoring their driving and applying a safety score, but has not clarified how the system was developed.

YouTube is loaded with dozens of videos showing Tesla beta-test software piloting cars into oncoming traffic or other dangerous situations. In one video, as a beta-test car tried to cross the road into the path of another vehicle, a passenger remarked, “it almost killed us,” and the driver said, “FSD, it tried to murder us.”

The recall that resulted from the automatic braking bug, carried out entirely through over-the-air software with no visit to the dealer necessary, marks the beginning of a major change in how many recalls are handled. Tesla has taken the lead in automotive software delivery, and other carmakers are trying to catch up.

Federal safety regulators are only beginning to grasp the profound changes that robot technology and its animating software are bringing about. NHTSA recently named Duke University human-machine interaction expert Missy Cummings as senior adviser for safety.

Asked for comment, NHTSA said the agency “will continue its conversations with Tesla to ensure that any safety defect is promptly acknowledged and addressed according to the National Traffic and Motor Vehicle Safety Act.”
