- cross-posted to:
- [email protected]
Tesla Whistleblower Says ‘Autopilot’ System Is Not Safe Enough To Be Used On Public Roads::“It affects all of us because we are essentially experiments in public roads.”
Someone paying proper attention probably would be. But a huge chunk of accidents happen because idiots are looking at their phones or falling asleep at the wheel, and at least self-driving cars, even Teslas on Autopilot, won't do that.
No, they just relinquish control to a sleepy driver without a warning whenever they are about to crash.
We aren't at the point yet, with any self-driving car, where you should be behind the wheel unless you're absolutely ready to take over within seconds.
If you are referring to Autopilot: yeah, technically it does that. It turns off once it realises it can no longer do anything to avoid the collision, so that it doesn't speed off afterwards due to damaged sensors, glitches, etc. But the whole "Autopilot turns off so the crash doesn't show up in the statistics" claim was a blatant lie: Tesla counts all crashes where Autopilot was on shortly before the crash.
Do they count the times the human driver had to take control to avoid a crash?
If the crash happened more than 5 seconds after Autopilot was disengaged, or Autopilot was never used in the first place, it would be counted in the "Tesla vehicles not using Autopilot technology" part of the data.
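The attribution rule described above can be sketched in a few lines. This is a hypothetical illustration of the 5-second window as described in this thread, not Tesla's actual code; the function name and parameters are made up for clarity.

```python
from typing import Optional

# Per the thread: a crash counts toward the "Autopilot" bucket if
# Autopilot was active at impact or was disengaged within the
# preceding 5 seconds (assumed window, as described above).
ATTRIBUTION_WINDOW_S = 5.0

def attribute_crash(active_at_impact: bool,
                    seconds_since_disengagement: Optional[float]) -> str:
    """Classify a crash for the safety statistics.

    seconds_since_disengagement is None if Autopilot was never
    engaged on that drive.
    """
    if active_at_impact:
        return "autopilot"
    if (seconds_since_disengagement is not None
            and seconds_since_disengagement <= ATTRIBUTION_WINDOW_S):
        return "autopilot"
    return "not_autopilot"
```

Under this rule, a hand-off to the driver 3 seconds before impact still lands in the Autopilot bucket, which is the point being made above.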
As for automatically detecting the crashes that didn't happen, that's a bit harder to do, don't you think?