Analysis of NHTSA data: Tesla's FSD and Autopilot were involved in 736 US crashes since 2019, far more than systems from all other car makers combined (Washington Post)
https://www.washingtonpost.com/technology/2023/06/10/tesla-autopilot-crashes-elon-musk/
http://www.techmeme.com/230610/p12#a230610p12
@Techmeme @dsilverman As usual, the lede gets buried in the text.
“NHTSA reminds the public that all advanced driver assistance systems require the human driver to be in control and fully engaged in the driving task at all times. Accordingly, all state laws hold the human driver responsible for the operation of their vehicles.”
I’m probably not 100% correct, but every time I do further reading on one of these incidents, it turns out to be *driver error* — and sometimes even intentional misuse.
@bubbajet agreed 100%. I have never received a strike and rarely have issues with FSD. I also know what to expect and stay aware of my surroundings as best I can. Get off your damn phone — stop texting and FaceTiming — and pay attention to what the car is doing, or take public transportation. Grr.
@bubbajet @Techmeme Even then, Tesla’s characterization of the feature as “Full Self Driving” (along with Musk’s comments about it) is not only inaccurate, but can give owners a false sense of security, which can cause them to take risks they normally would not. So while drivers may be at fault, there’s a background hum encouraging the behavior.
@dsilverman @Techmeme Agreed. I’d add the use of “Autopilot” as the name for what is actually cruise control plus lane keeping.