Tesla Autopilot: NHTSA's Investigation Update


The National Highway Traffic Safety Administration (NHTSA) is nearing the conclusion of its extensive investigation into Tesla's Autopilot system. The probe was opened after a series of crashes in which Teslas using Autopilot struck stationary emergency vehicles at roadside scenes, and it has since expanded to encompass a broader range of safety concerns.

Origins of the Investigation

NHTSA's interest in Tesla's Autopilot system was piqued by a series of incidents, including a fatal crash in California involving a 2018 Tesla Model 3 sedan. That incident was not isolated: NHTSA had opened another investigation in March into a fatal California crash involving a 2014 Tesla Model S.

Ann Carlson, NHTSA's acting administrator, told Reuters that the investigation's findings would be made public soon. She underscored the importance of drivers remaining vigilant when using Advanced Driver Assistance Systems (ADAS) like Autopilot.

[Image: crashed Tesla; photo via Con Fire PIO]

Scope and Findings

In August 2021, NHTSA opened a preliminary evaluation of Tesla's Autopilot system, covering Tesla Model S, X, 3, and Y vehicles from model years 2014 through 2021. By June 2022, that preliminary evaluation had been upgraded to a more in-depth engineering analysis examining Autopilot's behavior across 830,000 vehicles.

During this phase, 191 crashes were analyzed, revealing patterns that went beyond the initial emergency-vehicle incidents. Of these, 85 were set aside, and of the remaining 106, comprehensive vehicle data was available for 43. In 37 of those 43 crashes, the data showed that drivers had their hands on the wheel shortly before impact.
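For readers keeping score, the breakdown above reduces to a simple funnel. The short Python sketch below merely tabulates the figures reported in this article; the stage labels are paraphrases, not NHTSA's own category names:

```python
# Tabulate the crash-data funnel from NHTSA's engineering analysis,
# using only the counts reported in the article above.
stages = [
    ("Crashes analyzed", 191),
    ("Set aside", 85),
    ("Remaining in scope", 106),
    ("With comprehensive vehicle data", 43),
    ("Hands on wheel shortly before impact", 37),
]

for label, count in stages:
    print(f"{label:<40} {count:>4}")

# Share of well-documented crashes where hands were on the wheel (~86%).
print(f"{'Hands-on share of documented cases':<40} {37 / 43:>4.0%}")
```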

Despite Tesla CEO Elon Musk's claims about the safety of Tesla vehicles, the data tells a different story. Since 2013, there have been hundreds of fatal crashes involving Tesla electric vehicles. At least 32 deaths in the US and three abroad occurred while Autopilot was active.

Public Perception and Concerns

Tesla promotes Autopilot as a significant safety feature on its website. However, NHTSA data from 2022 contradicts this portrayal, revealing 273 crashes involving Autopilot between July 2021 and May 2022. Moreover, leaked data showed that Tesla was aware of over 3,000 customer complaints about Autopilot.

Tesla's "Full-Self Driving" feature has also been under scrutiny. Due to concerns about the system's propensity to break traffic laws, Tesla had to recall its FSD Beta for nearly 363,000 vehicles.

Unintended Acceleration Issue

In a surprising turn, NHTSA reopened an investigation into claims of unintended acceleration by Tesla EVs. A white paper by a safety researcher in Minnesota highlighted a potential design flaw in Tesla's inverter that could cause unintended acceleration. This paper prompted NHTSA to reconsider its previous stance on the issue.

Conclusion

As the NHTSA wraps up its investigation, the automotive world awaits its findings. The outcomes could have significant implications for Tesla and the broader electric vehicle industry. Safety remains paramount, and it's crucial for both Tesla and regulatory bodies to ensure the well-being of drivers and passengers.

Comments


Anthony Gee

Good evening. I have a 2020 Tesla Model 3, and months ago I purchased Full Self-Driving for £6,800 in the UK. I use the feature as often as possible. I know the system is not fully self-driving, and I take this very seriously, so when I use it I concentrate fully; because of this I have no issues with the feature's limitations.
The trouble is that some people are blind to the fact that you have to stay in full control of your Tesla and concentrate at all times. Sadly, drivers of self-driving cars take chances and don't concentrate, which is the fault of the driver and not entirely the car, as I find the feature great when used properly.
People also drive too fast when using Full Self-Driving. With its current limitations in the UK, you shouldn't be driving at the posted speed limits; you should drive slower for a better experience, as the steering doesn't currently react fast enough to take bends at the speed limit. People need to be aware of this, and it must be a major factor in Tesla crashes: drivers using the feature while going too fast. Again, it is not the fault of the Tesla but of the driver, for driving too fast and not staying in control of the Full Self-Driving feature.

Colin Dawson

This happens to me in my Model S P100D, and I keep getting told it's Google Maps. My god, the first time it happened the car took off from 60 to 100 km/h; I was shocked. Since that first time I'm aware, ready, and prepared to cancel Autopilot.

Michael Garoust

Our Model S with Autopilot has had a few scary occurrences. Once, as we approached a major intersection, the car correctly entered the left-turn lane but suddenly veered into the through lane, where another car was. Fortunately, I was able to wrestle the steering wheel away from that car. Otherwise, with proper supervision, it works rather well, especially on interstates.

Leave a comment