Tesla Autopilot Safety Under Scrutiny After Post-Recall Crashes


The National Highway Traffic Safety Administration (NHTSA), the U.S. government's auto safety regulator, has turned its investigative gaze on Tesla, the leading manufacturer of electric vehicles (EVs). The concern centres on Tesla's Autopilot driving system, which was subjected to a recall last year. The question now facing regulators is whether that recall did enough to ensure that drivers keep their attention on the road while using Autopilot.

Tesla has reported 20 additional crashes involving Autopilot since the recall was implemented, a fact that has fuelled the regulator's doubts about the effectiveness of the recall remedy. Notably, the recall encompassed more than 2 million vehicles – virtually all the vehicles Tesla had sold to that date.


NHTSA pressed Tesla into carrying out the recall after a two-year investigation. That investigation focused on Autopilot's driver monitoring system, which measures the torque applied to the steering wheel by the driver's hands. The agency had documented incidents in which vehicles using Autopilot collided with emergency vehicles parked along freeways.

How did Tesla attempt to resolve the issue? The remedy was an online software update intended to rein in the use of Autopilot and step up warnings to drivers. NHTSA, however, has reported evidence of crashes after the fix, as well as signs that Tesla pushed further software updates to address the issue following the recall. Doubts remain as to whether these additional updates were effective.

The new investigation will scrutinise why these updates were not introduced as part of the initial recall. It will also endeavour to determine whether their absence poses a safety risk to road users that could reasonably be deemed unacceptable.

The recall came under sharper scrutiny when, just a week ago, a Tesla – potentially operating on Autopilot – struck and killed a motorcyclist near Seattle. The tragedy has deepened doubts about whether the recall's measures ensure that drivers using Autopilot pay sufficient attention to the road ahead.

Safety advocates have long cautioned that Autopilot – which can keep a vehicle in its lane while maintaining a safe distance from objects in front – was not intended to operate on roads other than limited-access highways. Despite this concern, Tesla has stated that vehicle owners can opt in to parts of the recall remedy.

NHTSA closed its original investigation a week ago and has now turned to examining the recall's effectiveness. Regulators had hoped the recall would bring changes to driver engagement and tighter limits on Autopilot's capabilities, but it is clear that these concerns are far from put to rest. NHTSA's current objective is to evaluate the recall remedy, including the prominence of Autopilot's controls and the company's outreach to owners, in order to counter misuse.

All of these technological complications culminate in one undeniable human tragedy: the death of Jeffrey Nissen, a 28-year-old motorcyclist whose life was cut short by a Tesla vehicle. Investigators are working to confirm whether Autopilot was in fact engaged at the time of the deadly crash. As we approach the future promised by technology, it has never been more apparent that we must tread this path with care and due diligence.