The Crash That Went Viral
A dashcam video circulating online shows a Tesla Cybertruck traveling at highway speed and colliding with a concrete overpass barrier on a Houston freeway. The driver claims the vehicle was operating under Full Self-Driving (FSD) when the crash occurred. The video has reignited one of the most contested debates in electric vehicle circles: what exactly is FSD responsible for, and when?
Elon Musk responded to the video on social media, citing Tesla's internal telemetry logs. According to Musk, the data shows the driver disengaged FSD approximately four seconds before the collision — placing the crash outside the envelope of autonomous operation by a narrow margin. Tesla's supporters quickly seized on this to call the story FUD, short for fear, uncertainty, and doubt, a term often used in Tesla communities to dismiss critical coverage.
What the Video Actually Shows
A careful viewing of the dashcam footage tells a more complicated story. The Cybertruck appears to travel in a straight line on the highway, with no sudden lane changes, no erratic movement, and no obvious driver input visible in the seconds leading up to impact. The vehicle then drifts into the barrier with little apparent braking or evasive action.
Critics argue that even if logs confirm the driver technically disengaged FSD four seconds before impact, that framing misses the point. A driver who has been hands-off and mentally disengaged for an extended period — relying on the system to manage the vehicle — cannot be expected to fully regain situational awareness and react appropriately in four seconds. This phenomenon, known as automation complacency or the automation surprise effect, is well-documented in aviation and automotive safety research.
The core concern is not whether FSD was technically active at the moment of impact. It is whether FSD's design — and the way Tesla markets the system — creates conditions that leave drivers unprepared to intervene when the system fails or disengages.