The incident in Chinatown comes down to the fact that the Waymo AI expects everyone to follow the law. By that logic, driving through a busy part of the city should be fine, because the streets will be clear as long as pedestrians obey the rules.
Was the Waymo AI wrong? No. Does it need to learn human behavior? You bet!
The cyclist was hit because the rider failed to follow the law and maintain a "safe clear distance" from the vehicle ahead; the rider was literally a few feet behind the big truck. By the time the AI could see the cyclist, it was too late, because nothing should have been that close to the truck.
The AI we have now is perfectly fine as long as every sentient creature out in public knows and follows the law. Therein lies the next part of AI training…