Tesla Driver Called 911 After Running Down Motorcyclist While Browsing Phone on Full Self-Driving Mode
Full Self-Distraction
A Tesla driver ran down a motorcyclist outside Seattle last year, pinning him under the car; the 28-year-old victim was pronounced dead at the scene.
Extracted data from the crash, obtained by NPR through a public records request, paint a damning picture of the events leading up to the fatal collision.
"I'm the driver," Scott Hunter, the 56-year-old owner of the Tesla Model S, told dispatchers. "I'm not sure how it happened, but I am freaking out. So please get here."
As it turns out, Hunter had Tesla's misleadingly named "Full Self-Driving" driver-assistance feature turned on, making it yet another instance of a Tesla owner being lulled into a false sense of security by the tech before an entirely avoidable tragedy struck.
According to NPR, Hunter spent more than a minute without his hands on the steering wheel before the crash, admitting to police that he had been "distracted" by his phone.
Unpaid Attention
Tesla's FSD driver assistance software has been under heavy scrutiny by regulators for quite a few years now, spawning several government investigations.
Despite FSD and its overarching software suite Autopilot being linked to hundreds of crashes and dozens of deaths, Tesla is still allowed to have its customers test out the software on public roads.
Most recently, the company released an oxymoronically named update called "Full Self-Driving (Supervised)." According to independent testing, the software still causes cars to run red lights, swerve across double yellow lines, or suddenly stop without warning.
Meanwhile, Tesla maintains that its flawed software is safer on the whole than human drivers, with executives railing against regulatory crash report requirements — which the incoming Trump administration is expected to gut.
Regulators, however, are clear that customers are getting the wrong idea and quite literally handing over the controls, even when Tesla instructs them not to — in large part thanks to Tesla's misleading marketing of its driver-assistance software.
"The drivers would see a car that's doing pretty good," former National Highway Traffic Safety Administration adviser and George Mason University robotics professor Missy Cummings told NPR. "They get a false sense of security, and they don't understand that they should be paying attention."
Tesla CEO Elon Musk has promised that the company will be moving on to testing an "unsupervised" version of FSD as soon as this year. Whether it will prevent crashes like this one remains unclear at best.
And if anything, now that Musk has been put in charge of the so-called Department of Government Efficiency (DOGE), regulations surrounding the use of driver-assistance software could soon be on the chopping block.
More on FSD: Godfather of AI Says Elon Musk Is Lying About Self-Driving Teslas
The post Tesla Driver Called 911 After Running Down Motorcyclist While Browsing Phone on Full Self-Driving Mode appeared first on Futurism.