The Incident
A widely shared video captured a Tesla Model 3 operating in Full Self-Driving (FSD) mode driving straight through lowered railroad crossing barriers in the Los Angeles area. The vehicle's forward camera footage, shared by the driver on social media, shows the car approaching the crossing as the barriers descend, then accelerating through them without any apparent attempt to stop or to alert the driver.
The driver, who said they were monitoring the system but trusted it to respond to the crossing signals, reported that FSD gave no visual or auditory warning before proceeding through the barriers. The vehicle's display continued to show a clear road ahead, suggesting the system failed to detect or classify the crossing gate as an obstacle requiring a stop.
The video has been viewed millions of times across social media platforms and has reignited debate about the readiness of Tesla's autonomous driving technology for public roads.
NHTSA Investigation Timing
The incident's timing is particularly notable: it occurred on the same day as the National Highway Traffic Safety Administration's deadline for Tesla to turn over critical data for the agency's ongoing investigation into FSD traffic violations. That investigation specifically includes railroad crossing failures as a category of concern.
NHTSA opened its probe into FSD traffic violations after receiving multiple reports of Tesla vehicles on FSD running red lights, failing to stop at stop signs, and ignoring railroad crossings. The agency has been accumulating data from Tesla's fleet to determine whether the software poses an unreasonable risk to motor vehicle safety, which could trigger a formal recall.
Tesla received two extensions on the data submission deadline before the March 9 cutoff. The company has not publicly commented on whether it met the deadline or the contents of its submission.
How FSD Handles Railroad Crossings
Railroad crossings present a particularly challenging scenario for vision-based driving systems like Tesla's FSD, which relies entirely on cameras without lidar or radar backup. Crossing barriers are thin, often partially transparent structures that can be difficult for computer vision systems to distinguish from background clutter, especially in variable lighting conditions.
Most railroad crossings also use flashing red lights and audible bells as warning signals. While these are obvious to human drivers, they require specific training data and detection algorithms for AI systems. If FSD's object detection model has not been sufficiently trained on railroad crossing signals in all their variations — different barrier styles, lighting conditions, approach angles — failures are possible.
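To make that failure mode concrete, below is a minimal, hypothetical sketch in Python of a conservative crossing-detection policy. The Detection class, the label names, and the confidence thresholds are invented for illustration and do not describe Tesla's actual perception stack; the point is only that a system requiring a single high-confidence detection can miss an unfamiliar barrier style, while a policy that also acts on corroborating weak evidence defaults to stopping.

```python
from dataclasses import dataclass

@dataclass
class Detection:
    """One object detection from a vision stack (hypothetical schema)."""
    label: str         # e.g. "crossing_barrier", "flashing_red_light"
    confidence: float  # 0.0 to 1.0

def crossing_requires_stop(detections: list[Detection],
                           confirm_threshold: float = 0.8,
                           suspect_threshold: float = 0.3) -> bool:
    """Decide whether to treat a rail crossing as active.

    Conservative policy: stop on a confident detection of either the
    barrier or the flashing signal, and also stop when there is weak
    evidence of both, rather than requiring one high-confidence hit.
    """
    barrier = max((d.confidence for d in detections
                   if d.label == "crossing_barrier"), default=0.0)
    signal = max((d.confidence for d in detections
                  if d.label == "flashing_red_light"), default=0.0)

    if barrier >= confirm_threshold or signal >= confirm_threshold:
        return True
    # A barrier style underrepresented in training data may only produce
    # a weak score; corroborating weak signals is treated as enough to stop.
    return barrier >= suspect_threshold and signal >= suspect_threshold

# Example: an unusual barrier design scores low, but a flashing light is seen.
frame = [Detection("crossing_barrier", 0.35), Detection("flashing_red_light", 0.4)]
print(crossing_requires_stop(frame))  # True under this illustrative policy
```

Under this toy policy the weakly detected barrier still triggers a stop; a pipeline that instead demands a single high-confidence detection would proceed, which is consistent with the kind of failure described above.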
Tesla has previously said that FSD is trained on billions of miles of driving data and that railroad crossing detection has been improved in successive software updates. However, because Tesla releases the software to hundreds of thousands of vehicles while continuing to refine it, edge cases like railroad crossings can pose real safety risks during the improvement process.
Regulatory Implications
The viral video adds urgency to an already tense regulatory situation. NHTSA has the authority to mandate a recall of FSD if it determines the software poses an unreasonable safety risk, and the agency already ordered one FSD recall in 2023, remedied with a software update that changed how the system behaves around intersections and stop signs.
A railroad crossing failure is among the most serious potential consequences of an autonomous driving system malfunction. Collisions between vehicles and trains are frequently fatal, and the Federal Railroad Administration has long identified railroad crossing safety as a critical priority.
Several states have introduced legislation that would require additional safety certifications for autonomous driving systems operating on public roads, and the railroad crossing issue has become a focal point for advocates of stricter regulation.
Tesla's Position
Tesla maintains that FSD is a driver assistance system that requires constant driver supervision and that the driver is ultimately responsible for the vehicle's operation at all times. The company's terms of service and in-car messaging emphasize that drivers must keep their hands on the steering wheel and be prepared to take over at any time.
Critics argue that Tesla's marketing of the system as "Full Self-Driving" creates an expectation of autonomous capability that the current technology cannot reliably deliver, leading drivers to trust the system in situations where it may fail. The disconnect between the product's name and its actual capabilities has been a persistent source of controversy and regulatory scrutiny.
Whether the latest incident accelerates NHTSA's investigation timeline or leads to additional regulatory action remains to be seen, but the video's viral spread ensures that the issue will remain in public view as the agency weighs its options.
This article is based on reporting by Electrek.