The U.S. is cracking down on Tesla's ambitious self-driving technology, and the stakes are high: nearly 2.9 million vehicles are under scrutiny as the National Highway Traffic Safety Administration (NHTSA) investigates potential traffic violations and crashes linked to Tesla's Full Self-Driving (FSD) system.
But here's the catch: FSD is designed to assist drivers, not replace them. It requires drivers to stay alert and intervene when necessary. So, how did we get to a point where these vehicles are allegedly running red lights and driving against the flow of traffic?
The NHTSA has received a concerning number of reports: 58 incidents in which FSD allegedly committed traffic safety violations, including 14 crashes and 23 injuries. One driver's complaint captures the frustration: "FSD is not recognizing traffic signals... Tesla doesn't want to fix it." Tesla, for its part, maintains that FSD is a driver-assistance feature and does not make the car fully autonomous.
And this is where it gets controversial: Tesla's other automated features are also under the microscope. In January, NHTSA opened an investigation into 2.6 million Tesla vehicles over a remote-control parking feature known as Actually Smart Summon, and regulators are also reviewing the company's self-driving robotaxi pilot in Austin. Is this a case of over-regulation or a necessary safety measure?
The NHTSA's inquiry is a preliminary evaluation, the first step in a process that can escalate to an engineering analysis and, ultimately, a recall. Tesla has not commented publicly, though it did release an FSD software update this week. As the investigation unfolds, the question remains: are we ready for self-driving cars, and who should be held accountable when things go wrong?