Tesla’s self-driving systems have been a subject of debate for years. Fans of Autopilot and Full Self-Driving praise the convenience they provide, but critics have blamed the technology for causing accidents. Just days ago, a federal judge refused to overturn a $243-million verdict related to a Tesla Model S crash, leaving the automaker to bear at least part of the blame for that incident. Now, a Tesla Cybertruck owner is seeking over $1 million in a Texas lawsuit after her vehicle allegedly ran into a concrete barrier.
Truck Almost Drove Off Overpass

The plaintiff, Justine Saint Armour, purchased her Cybertruck from a used car dealer in February 2025. The truck was equipped with the company’s optional FSD suite; the lawsuit, however, states the plaintiff was “using the Autopilot” in August 2025 when the crash occurred. Autopilot is the brand’s more basic driving suite, lacking FSD’s full range of capabilities.
“More specifically, the subject Cybertruck was driving on 69 Eastex Freeway approaching 256 Eastex Park and Ride while on Autopilot,” states the lawsuit, according to Car Complaints. “On this ‘Y’ shaped overpass, where the vehicle should have followed the curve to the right onto 256 Eastex Park and Ride, the Cybertruck attempted to drive straight ahead into the concrete barrier and the freeway below, which caused Plaintiff to disengage the self-driving mode and take control of the wheel but it was too late and crashed into the barrier, causing the injuries and damages complained of herein.”

In the lawsuit, the plaintiff also accused Tesla boss Elon Musk of being “an aggressive and irresponsible salesman, who has a long history of making dangerous design choices, and overpromising features of his products.”
Several aspects of the lawsuit remain unclear, such as the truck’s speed at the time of the crash and the details of the plaintiff’s injuries. The lawsuit also accuses the company of failing to equip the vehicle with LiDAR sensors (Tesla relies on a series of cameras to identify its surroundings) and of misleading “self-driving” advertising.
What It Means

While any accident is unfortunate, the plaintiff appears to have avoided an even more dire outcome—the lawsuit claims the truck nearly went off the overpass, which would likely have had far more serious consequences.
Whether the Cybertruck was entirely at fault may come to light as the case progresses, but the gulf between what some drivers expect of self-driving systems and what those systems can actually do remains a problem. It’s why Tesla was recently forced to drop the Autopilot branding from cars in California, after regulators found the system’s name misleading and threatened the automaker with a 30-day sales ban.
Tesla’s self-driving systems may be capable of thousands of incident-free trips with minimal driver intervention, but the number of crashes involving the technology still points to serious safety concerns. This particular lawsuit, Justine Saint Armour v. Tesla, Inc., et al., was filed in district court in Harris County, Texas. It adds to years of legal scrutiny surrounding Tesla’s self-driving tech, which is especially pertinent as the company plans to sell the Cybercab—an entirely self-driving vehicle with no manual driving controls at all.