[Digital Today intern reporter Kyeongmin Hong (홍경민)] Controversy is growing after a video surfaced showing Tesla's driver-assistance system Full Self-Driving (FSD) failing to recognise a railroad crossing barrier and driving into it.
Electrek, an electric-vehicle outlet, reported on March 9 local time that a Model 3 in FSD mode drove into a lowered railroad crossing barrier near West Covina, California on March 8. The incident is further fuelling controversy because it coincides with the deadline for Tesla to submit data demanded by the U.S. National Highway Traffic Safety Administration (NHTSA) for its investigation into FSD-related traffic violations.
Dashcam footage posted on the social media platform Threads showed no sign that the car detected the barrier ahead or tried to slow down, even though it was travelling at about 37 km/h (23 mph). The driver braked just before impact, avoiding a more serious collision, and shared the video with the caption "FSD nearly killed me today," warning of the system's risks.
The incident also intersects with a broad FSD-related investigation already under way at NHTSA. In October 2025, NHTSA opened the probe based on 58 incidents linked to FSD, including 14 crashes and 23 injury cases. The probe covers about 2.88 million Tesla vehicles equipped with FSD and includes not only failures to recognise railroad crossings but also alleged serious violations such as running signals and entering oncoming lanes. The investigation is also tied to criticism that Tesla expanded FSD commercially and advertised it under the name "Full Self-Driving" even though the system's actual safety level does not match that name.
Problems involving FSD malfunctions at railroad crossings had been raised repeatedly before this incident. An NBC News investigation found more than 40 cases reported on social media alone, and in Pennsylvania, FSD led a vehicle onto the tracks, resulting in a major collision with a train. As the situation grew more serious, Senators Ed Markey and Richard Blumenthal sent a letter to NHTSA urging a formal government investigation into crossing safety, putting pressure on Tesla.
Against this backdrop, NHTSA is requiring Tesla to submit detailed records covering the 30 seconds preceding each violation. The requested data include the software version in use, whether driver warnings were issued, and outcomes such as casualties caused by crashes. Tesla says the materials are difficult to prepare because more than 8,000 records require manual review, but the submission is expected to serve as key evidence for future safety assessments and regulatory decisions, well beyond a routine filing.
Compounding the concern, the same software involved in this incident also underpins Tesla's robotaxi service in Austin. The outcome of the data submission could therefore directly affect safety evaluations and regulatory decisions for vehicles already in commercial operation, drawing global attention.
As the crash video spread quickly online, the debate over FSD safety reignited. Some users stressed driver responsibility, citing that FSD is a Level 2 system that requires continuous driver supervision.
Others said that using the name "Full Self-Driving" for a system that cannot identify even an obvious obstacle such as a crossing barrier is deceptive, and raised doubts about the reliability of Tesla's technology. Experts see the incident as a case that exposes FSD's limits and underscores the need for regulation of autonomous-driving technology.
Electrek's embedded post reads: "A Threads user named Laushi Liu posted dashcam footage from his Tesla Model 3 on Sunday, March 8, showing the vehicle on 'Full Self-Driving' mode at 23 mph near West Covina, California. In the video, the car approaches a railroad crossing where barriers have just come down — and…"