Tesla’s Autopilot and Full Self-Driving linked to hundreds of crashes and dozens of deaths

NHTSA found that Tesla’s driver-assist features are insufficient at keeping drivers engaged in the task of driving, which can often have fatal results.

Image: The left side of a Tesla Model 3’s main screen, showing a computer-generated view of an intersection as the Model 3 follows another car. Owen Grove / The Verge

In March 2023, a North Carolina student was stepping off a school bus when he was struck by a Tesla Model Y traveling at “highway speeds,” according to a federal investigation published today. The Tesla driver was using Autopilot, the automaker’s advanced driver-assist feature that Elon Musk insists will eventually lead to fully autonomous cars.

The 17-year-old student was transported to a hospital by helicopter with life-threatening injuries. But the investigation, after examining hundreds of similar crashes, found a pattern of driver inattention, combined with shortcomings in Tesla’s technology, that resulted in hundreds of injuries and dozens of deaths.

Drivers using Autopilot or the system’s more advanced sibling, Full Self-Driving, “were not sufficiently engaged in the driving task,” and Tesla’s technology “did not adequately ensure that drivers maintained their attention on the driving task,” NHTSA concluded.

In total, NHTSA investigated 956 crashes from January 2018 through August 2023. In those crashes, some of which involved other vehicles striking the Tesla, 29 people died. There were also 211 crashes in which “the frontal plane of the Tesla struck a vehicle or obstacle in its path.” These crashes, which were often the most severe, resulted in 14 deaths and 49 injuries.

NHTSA was prompted to launch its investigation after several incidents of Tesla drivers crashing into stationary emergency vehicles parked on the side of the road. Most of these incidents took place after dark, with the software ignoring scene control measures, including warning lights, flares, cones, and an illuminated arrow board.

In its report, the agency found that Autopilot, and in some cases FSD, was not designed to keep the driver engaged in the task of driving. Tesla says it warns customers that they need to pay attention while using Autopilot and FSD, which includes keeping their hands on the wheel and eyes on the road. But NHTSA says that in many cases, drivers would become overly complacent and lose focus. And when it came time to react, it was often too late.

In 59 crashes examined by NHTSA, the agency found that Tesla drivers had enough time to react, “five or more seconds,” before crashing into another object. In 19 of those crashes, the hazard was visible for 10 or more seconds before the collision. Reviewing crash logs and data provided by Tesla, NHTSA found that in a majority of the crashes analyzed, drivers failed to brake or steer to avoid the hazard.

“Crashes with no or late evasive action attempted by the driver were found across all Tesla hardware versions and crash circumstances,” NHTSA said.

NHTSA also compared Tesla’s Level 2 (L2) automation features to products available in other companies’ vehicles. Unlike other systems, Autopilot would disengage rather than allow drivers to adjust their steering. This “discourages” drivers from staying involved in the task of driving, NHTSA said.

“A comparison of Tesla’s design choices to those of L2 peers identified Tesla as an industry outlier in its approach to L2 technology by mismatching a weak driver engagement system with Autopilot’s permissive operating capabilities,” the agency said.

Even the brand name “Autopilot” is misleading, NHTSA said, conjuring up the idea that drivers are not in control. While other companies use some version of “assist,” “sense,” or “team,” Tesla’s products lure drivers into thinking they are more capable than they are. California’s attorney general and the state’s Department of Motor Vehicles are both investigating Tesla for misleading branding and marketing.

NHTSA acknowledges that its probe may be incomplete based on “gaps” in Tesla’s telemetry data. That could mean there are many more crashes involving Autopilot and FSD than what NHTSA was able to find.

Tesla issued a voluntary recall late last year in response to the investigation, pushing out an over-the-air software update to add more warnings to Autopilot. NHTSA said today it was launching a new investigation into the recall after a number of safety experts said the update was inadequate and still allowed for misuse.

The findings cut against Musk’s insistence that Tesla is an artificial intelligence company on the cusp of releasing a fully autonomous vehicle for personal use. The company plans to unveil a robotaxi later this year that is supposed to usher in this new era for Tesla. During this week’s first-quarter earnings call, Musk doubled down on the notion that his vehicles were safer than human-driven cars.

“If you’ve got, at scale, a statistically significant amount of data that shows conclusively that the autonomous car has, let’s say, half the accident rate of a human-driven car, I think that’s difficult to ignore,” Musk said. “Because at that point, stopping autonomy means killing people.”
