Tesla owner says car in ‘full self-driving mode’ failed to detect a moving train
Shocking footage shows the moment a Tesla in “Full Self Driving” mode narrowly avoided a crash on rail tracks in Ohio after the vehicle failed to detect a passing train.
The vehicle’s owner, Craig Doty II, said he took responsibility for the near-miss incident on the morning of May 8 in Camden, Ohio, and that he may have become “complacent” with the technology.
The footage, first aired by NBC News, showed the vehicle driving down a road in heavy fog towards a set of rail tracks where a freight train was passing. The car did not slow down as it approached and only swerved to avoid the train at the last second, smashing into a safety barrier.
The close shave was caught from multiple angles by the Tesla’s cameras. A photo provided by Mr Doty later showed the car with heavy damage on its front right side.
Despite taking responsibility for control of the vehicle, Mr Doty told NBC News that he believed the Full Self-Driving (FSD) mode in his vehicle was defective. “I am in control of the vehicle, I don’t go around causing mayhem and getting in wrecks and driving outlandishly,” Mr Doty said.
“I was the only one in the car. I was the only car in the accident. So yes, it was my fault, it had to be... But I feel it was more that the damn car didn’t recognize the train.”
“You do get complacent that [the technology] knows what it’s doing. And usually it’s more cautious than I would be as a driver.”
According to a Tesla crash report shared with NBC by Mr Doty, he was driving at around 60mph at the time of the incident; the speed limit was 55mph. Drivers can request crash reports from Tesla, which are generated using data that individual cars send to Tesla servers.
Tesla offers two partially automated systems, Autopilot and the more sophisticated “Full Self Driving”, but the company says neither can fully drive the car. The company has previously faced multiple lawsuits stemming from crashes involving the Autopilot system.
Safety advocates have long expressed concern that Autopilot, which keeps a vehicle in its lane and at a safe distance from objects in front of it, was not designed to operate on roads other than limited-access highways.
The widow of a man who died after his Tesla veered off the road and crashed into a tree while he was using the partially automated driving system is now suing the carmaker, claiming its marketing of the technology is dangerously misleading.
The Autopilot system prevented Hans Von Ohain from being able to keep his Tesla Model 3 on a Colorado road in 2022, according to a lawsuit filed by Nora Bass in state court earlier this month. Von Ohain died after the car hit a tree and burst into flames, but a passenger was able to escape, the suit stated.
Last month, Tesla paid an undisclosed amount of money to settle a separate lawsuit that made similar claims, brought by the family of a Silicon Valley engineer who died in a 2018 crash while using Autopilot. Walter Huang’s Model X veered out of its lane and began to accelerate before barreling into a concrete barrier located at an intersection on a busy highway in Mountain View, California.
In December, US auto safety regulators pressured Tesla into recalling more than two million vehicles to fix a defective system that was supposed to ensure drivers pay attention when using the Autopilot function.
The Independent has contacted Tesla and the Camden Police Department for comment about the incident in Ohio.