Tesla’s Full Self-Driving Vehicles under the Scanner?
The National Highway Traffic Safety Administration (NHTSA) has launched an investigation into Tesla’s Full Self-Driving (FSD) feature after a pedestrian was struck and killed in one accident. The scrutiny follows multiple incidents in which Tesla vehicles crashed under low-visibility conditions.
The investigation centers on four accidents that took place while the vehicles were operating with the FSD feature engaged. In one of these incidents, a pedestrian was struck and killed; another resulted in an injury. According to the NHTSA’s statement, all of these accidents occurred in areas where visibility was limited by factors such as sun glare, fog, or airborne dust.
The investigation highlights ongoing concerns about the safety of Tesla’s self-driving technology and its effectiveness in difficult driving conditions.
Tesla’s Full Self-Driving (FSD) feature is integral to the company’s vision for future expansion and profitability. Offered as an $8,000 option, the feature requires drivers to remain in the driver’s seat and be prepared to take control of the vehicle at any time.
Despite this requirement, Tesla and its CEO, Elon Musk, claim that FSD is safer than human driving. The company has ambitious plans to develop fully autonomous vehicles that wouldn’t require traditional controls such as steering wheels, gas pedals, or brakes.
Musk recently announced plans for a fleet of self-driving “robotaxis.” These vehicles would operate autonomously and allow Tesla owners to rent out their cars when not in use, potentially generating income for the owners. Investor reaction to the announcement was lukewarm, however, and Tesla’s stock fell nearly 9% following Musk’s presentation.
This is not the first time the NHTSA has investigated Tesla’s self-driving features. In February 2023, the agency ordered a recall to update the FSD software on more than 360,000 Teslas in the U.S. The recall was initiated because the NHTSA found that FSD posed “an unreasonable risk to motor vehicle safety” due to its failure to consistently adhere to traffic laws. The agency noted that the feature could violate traffic laws at certain intersections before drivers could intervene.
In its recall notice, the NHTSA specifically pointed out that the FSD Beta system could allow vehicles to behave unsafely around intersections. Examples of unsafe maneuvering included traveling straight through an intersection from a turn-only lane, entering a stop-sign-controlled intersection without coming to a complete stop, and proceeding into an intersection during a steady yellow traffic light without due caution.
In December 2023, the NHTSA ordered another recall, this one affecting roughly 2 million Teslas on U.S. roads. That recall aimed to limit the use of a less advanced suite of driver-assist features known as Autopilot. The action followed a two-year investigation into roughly 1,000 crashes involving the Autopilot feature, raising further questions about the safety and reliability of Tesla’s automated driving technologies.
The ongoing investigations and recalls serve as a reminder of the complexities and challenges surrounding the development of self-driving technology. As Tesla continues to push the boundaries of what is possible in autonomous driving, the safety of these innovations remains a top priority for regulators and the public alike.