- Tesla’s Full Self-Driving systems have come under scrutiny again
- Uber’s former head of self-driving crashed his Model X while using FSD
- A Cybertruck also crashed violently into an overpass barrier
Two recent high-profile Tesla accidents have once again put the spotlight on the company’s Full Self-Driving (FSD) technology. They highlight the problem of “asking people to monitor systems designed to make monitoring feel pointless,” as Uber’s former head of self-driving cars wrote in an article for The Atlantic after his Model X crashed into a wall.
Raffi Krikorian, Mozilla CTO and former head of Uber’s self-driving division, said he was navigating residential streets in the San Francisco Bay Area with his Model X in Full Self-Driving mode when the steering wheel “jerked one way, then the other.”
Moments before colliding with a wall, Krikorian says he spun the wheel to take over, but it was too late.
In a similar but far more frightening incident, dashcam footage from Justine Saint Amour’s Cybertruck captures the moments just before the vehicle struck an overpass barrier at speed, nearly sending it over the edge and potentially killing the driver and her one-year-old child.
‘TERRIFYING’: Dashcam video shows the moment a Tesla Cybertruck, allegedly operating in self-driving mode, nearly sent a Houston mother and her infant off a bridge before crashing violently into an overpass barrier. The woman claims she suffered multiple injuries from the incident… pic.twitter.com/DgcnHp2FtZ – 17 March 2026
Saint Amour says the vehicle was in Full Self-Driving mode before the incident, but her attorney, Bob Hilliard, acknowledged that his client disengaged the system moments before impact. Tesla CEO Elon Musk immediately seized on this fact.
He took to X to write: “Log shows driver disengaged autopilot four seconds before crashing” – an argument he has used in FSD’s defense a number of times before.
But Krikorian claims in his article that drivers need “five to eight seconds to mentally re-engage after an automated driving system gives back control.” In his view, this middle ground simply does not work.
“A machine that constantly fails keeps you sharp. A machine that works perfectly needs no supervision. But a machine that works almost perfectly? That’s where the danger lies,” he writes.
Saint Amour suffered two herniated discs in her lower back and one in her neck, sprained tendons in her wrist, and numbness and weakness in her right hand, according to Electrek. She is now suing Tesla for more than $1 million.
Analysis: Tesla’s messaging has long been misleading
Tesla may have stopped referring to its advanced cruise control system as “Autopilot,” but that hasn’t stopped the company from pursuing and promoting its Full Self-Driving technology.
Despite adding ‘Supervised’ to the feature’s name, Elon Musk has long peddled the myth that the technology is more capable than it actually is.
As far back as 2013, he claimed that a Tesla should be able to drive autonomously for “90 percent of the miles” driven, and he has said on several occasions – most recently late last year – that drivers could soon “go to sleep in your car and wake up at your destination.”
The growing number of lawsuits and high-profile crashes suggests we are still far from that point, and despite Tesla’s website warning Full Self-Driving users not to “get complacent,” customers are clearly relying on the system more than they should.
As Raffi Krikorian writes: “when a car’s marketing says ‘self-driving’, but the small print says ‘responsible driver’, it’s a warning sign”.