
Tesla is under investigation because its cars keep hitting emergency vehicles

Christopher Goodney/Bloomberg/Getty Images
An instrument panel with the Tesla Motors Inc. 8.0 software update illustrates the road ahead using radar technology inside a Model S P90D vehicle.

By Chris Isidore and Peter Valdes-Dapena, CNN Business

Federal safety regulators are investigating at least 11 accidents in which Tesla cars using Autopilot or other self-driving features crashed into emergency vehicles at the scene of an earlier crash.

The National Highway Traffic Safety Administration said seven of these accidents resulted in 17 injuries and one death.

All of the Teslas in question had the self-driving Autopilot feature or the traffic-aware cruise control engaged as they approached the crashes, the NHTSA said.

Tesla stock fell 5% in morning trading following news of the probe.

The accidents under investigation occurred between January 22, 2018, and July 10, 2021, across nine different states. They took place mostly at night, and the post-accident scenes all included control measures like first responder vehicle lights, flares, an illuminated arrow board and road cones.

Tesla did not immediately respond to a request for comment about the probe.

The safety of Tesla’s Autopilot feature has been questioned before. The National Transportation Safety Board, a separate agency that also investigates plane crashes and other fatal accidents, found Autopilot partly to blame in a 2018 fatal crash in Florida that killed a Tesla driver.

Police in a Houston suburb said there was no one in the driver’s seat of a Tesla that crashed and killed two people in the car earlier this year, a claim that Tesla has denied. But Lars Moravy, Tesla’s vice president of vehicle engineering, confirmed in April in comments to investors that the car’s adaptive cruise control was engaged and had accelerated to 30 mph before the crash.

Tesla has been seeking to offer full self-driving technology to its drivers. While the company says its data shows cars using Autopilot have fewer accidents per mile than human-driven cars, it also warns that “current Autopilot features require active driver supervision and do not make the vehicle autonomous.”

The safety agency said its investigation will allow it to “better understand the causes of certain Tesla crashes,” including “the technologies and methods used to monitor, assist, and enforce the driver’s engagement with driving while Autopilot is in use.” It will also look into any contributing factors in the crashes.

“NHTSA reminds the public that no commercially available motor vehicles today are capable of driving themselves,” said the agency in a statement. “Every available vehicle requires a human driver to be in control at all times, and all state laws hold human drivers responsible for operation of their vehicles. Certain advanced driving assistance features can promote safety by helping drivers avoid crashes and mitigate the severity of crashes that occur, but as with all technologies and equipment on motor vehicles, drivers must use them correctly and responsibly.”

The investigation covers Tesla Model Y, X, S and 3 vehicles from model years 2014 to 2021.

Gordon Johnson, an analyst and vocal critic of Tesla, wrote in a note to clients Monday that the issue isn’t just about Autopilot users — but also other non-Tesla drivers on the road who could be injured by cars using the feature.

“NHTSA is zeroing in on a particular danger that Tesla creates for people outside the vehicle — ie, those who never agreed to be Autopilot ‘guinea pigs,’” Johnson wrote. “Thus, to simply say ‘Tesla drivers accept Autopilot’s risks,’ as has been used in the past, does not appear to be a defense here.”

Driver-assistance features such as Tesla’s Autopilot, or the adaptive cruise control available on a wide range of automakers’ vehicles, do a good job of slowing a vehicle down when the car in front is slowing down, said Sam Abuelsamid, an expert in self-driving vehicles and principal analyst at Guidehouse Insights.

But Abuelsamid said those systems are designed to ignore stationary objects when traveling at more than 40 mph, so they don’t slam on the brakes when approaching an overpass or a car stopped on the shoulder. Fortunately, most vehicles with some form of automatic braking do stop for stationary objects when they’re moving more slowly, Abuelsamid said.

The real problem, he said, is that Tesla owners are far more likely than drivers of other cars with automatic braking and similar safety features to assume their cars can, in fact, drive themselves. And the cues a driver would see when approaching an accident site, such as road flares or flashing lights, make more sense to a human than they might to an automated driving system.

“When it works, which can be most of the time, it can be very good,” said Abuelsamid, about Tesla’s Autopilot feature. “But it can easily be confused by things that humans would have no problem with. Machine visions are not as adaptive as humans. And the problem is that all machine systems sometimes make silly errors.”

