A Tesla Cybertruck owner from Houston, Texas is suing the automaker for $1 million after a terrifying incident that she claims put her and her infant’s lives in danger.
The all-electric truck, which was allegedly operating in Full Self-Driving (FSD) mode when the incident happened in August 2025, was caught on camera nearly going over an overpass barrier on Houston’s 69 Eastex Freeway.
As the footage shows, had the Cybertruck not hit a lighting pole that sent it ricocheting back, it would probably have flown off the overpass, plunging more than 30 feet and likely crashing into other vehicles. Even so, the impact was violent, and the woman claims she suffered multiple injuries, which is why she sued Tesla last month in a liability and negligence case.
Tesla Telemetry Data Holds the Key to This Case
“On August 18, 2025, our client Justine Saint Amour was driving her Tesla Cybertruck on Houston’s 69 Eastex Freeway with Autopilot engaged,” attorney Bob Hilliard told FOX Business. “Something terrifying happened, without warning, the vehicle attempted to drive straight off an overpass.”
Dashcam footage of the incident shows the vehicle driving normally before it was supposed to follow a right-hand curve at a Y-shaped overpass. Instead of doing so, the Tesla veered slightly right and continued straight ahead, crashing violently into a concrete barrier on the overpass. Parts of the vehicle can be seen flying off in the video as it ricocheted from the impact.

Now here comes the tricky part. The woman’s lawyer claims she tried to disengage the driver-assistance feature and take control of the wheel just before the crash. However, the Cybertruck was already too far into the maneuver for any intervention to be effective, the law firm said. In essence, the argument is that the driver tried to take over when she saw the vehicle heading in the wrong direction, but the system didn’t allow her to.
It remains to be seen whether this argument holds up in court, as Tesla may counter that the accident happened on the driver’s watch after FSD had disengaged. Much depends on whether FSD was active at the moment of the crash, and Tesla’s telemetry data should show that.
“She tried to take control, but crashed into the barrier and was seriously injured (mostly her shoulder, neck, and back),” lawyer Bob Hilliard said. Saint Amour also suffered two herniated discs in her lower back and one in her neck, as well as sprained tendons in her wrist and nerve damage to her right hand. Fortunately, the 1-year-old child, who was in the back seat during the incident, was unharmed, according to local TV station KHOU 11.
Yet Another Case Alleging Tesla Is Misrepresenting Its ADAS Capabilities

The lawsuit, filed in Harris County District Court, accuses Tesla of misrepresenting the capabilities of its driver-assistance system and of negligence in the design of its Autopilot feature. It also alleges that the EV maker failed to incorporate safety mechanisms such as more effective emergency braking or LiDAR.
The lawyer said Tesla’s self-driving “relies on cheap video cameras alone” and the vehicle “lacks a proper driver alert system to ensure drivers are ready to take over driving.”
Tesla hasn’t commented on this particular case yet. The EV maker was recently forced to comply with California regulations over false advertising claims related to its Autopilot and Full Self-Driving features.
The California DMV said the use of these terms is misleading and a violation of state law. As a result, Tesla now refers to Full Self-Driving as “Full Self-Driving (Supervised)” and to Autopilot as “Traffic Aware Cruise Control” in its California marketing. That said, Tesla filed a lawsuit last month seeking to reverse the “false advertiser” ruling, arguing the DMV did not prove consumers were actually confused.