U.S. federal regulators have launched a new investigation into Tesla’s Full Self-Driving (FSD) system after 58 reports of vehicles running red lights and colliding with other cars. The National Highway Traffic Safety Administration (NHTSA) has opened an inquiry into the safety of 2.88 million Tesla vehicles running the FSD software, citing potential traffic-law violations and crashes. According to Reuters, the 58 reports describe Teslas that blew through red lights, drifted into the wrong lane, and crashed at intersections. Fourteen of these cases involved actual collisions, resulting in 23 injuries.
In one notable pattern, six Tesla vehicles reportedly ran red lights before colliding with other cars. A Houston driver reported that the FSD system failed to recognize traffic signals, stopping at green lights but driving through red ones. The driver said the issue was flagged during a test drive but never resolved. The NHTSA is also reviewing new reports of Teslas failing to handle railroad crossings safely, including a near-collision with an oncoming train.
This is not the first time Tesla has faced regulatory scrutiny. The company is already dealing with several investigations into its Autopilot and FSD systems. In a recent case, a Florida jury awarded $329 million in damages after an Autopilot-related crash killed a woman. Another investigation is examining Tesla’s Robotaxi service in Austin, Texas, where passengers reported erratic driving and speeding even with human safety drivers on board. Tesla is also fighting a false advertising suit from California’s DMV, which claims the term ‘Full Self-Driving’ is misleading because the software requires constant driver supervision. The company recently renamed the feature ‘Full Self-Driving (Supervised)’ to reflect this reality.
Tesla’s latest FSD software update was released just days before the investigation began. The NHTSA states that the system has already induced vehicle behavior that violated traffic safety laws. This early-stage investigation could lead to a recall if the agency finds that the FSD software poses an unreasonable safety risk. If you drive a Tesla with FSD enabled, stay alert: the system is not fully autonomous and requires constant attention from the driver. The investigation is a reminder that ‘self-driving’ still means supervised driving.