Safe Streets Rebel’s protest comes after autonomous vehicles were blamed for incidents including crashing into a bus and running over a dog. City officials in June said…
So…
Your car is at fault. Their kid is dead.
Who pays for the funeral?
Does your insurance cover programming glitches?
If your insurance determined that an autonomous vehicle will cause less damage over time than a human driver, they will do that, yes.
Autonomous logic doesn’t pay insurance, does it?
If so, who TF is paying the insurance behind the scenes, and who is responsible?
The owner of the vehicle is probably very openly paying.
Here’s a question: if you have to agree to terms of service for the vehicle to function (and I’m guessing you would), is it really your vehicle?
We’re talking about autonomous vehicles here, no driver, company owned.
So is Alphabet responsible?
Do your homework, these vehicles are owned by the parent company of Google and Apple, Alphabet. These vehicles have no private owner. So again, who TF is responsible?

So what? It’s not the gotcha you apparently believe you’ve found, companies can have insurance…
Companies also never seem to be held accountable. OceanGate anybody?..
That’s not a good example. Courts move slowly, and that just barely happened and AFAIK is still being investigated (plus, from what I’ve read, the participants signed waivers – though waivers don’t grant immunity from liability for negligence).
There are plenty of examples of companies being punished for negligence. It happens all the time when, say, their poorly constructed building collapses, their corner-cutting causes an oil spill in the Gulf of Mexico, they falsify their vehicle emissions reports, or they abuse their market dominance.
Corporations totally do get away with a lot, but I don’t see why you’d expect self driving cars to be a place where that would happen, especially since manually driven cars are already so regulated and require insurance. And everyone knows that driving is dangerous. Nobody is under any false impressions that self driving cars don’t have at least some of that same danger. I mean, even if the AI was utterly perfect, they’d still need insurance to protect against cases that aren’t the AI’s fault.
When the vehicle disobeys orders from the police, who is at fault?
https://piped.video/watch?v=5Jev_R-JVmA
Alphabet don’t own Apple.
I’ll take your word on that. I’ve edited my comment to reflect that, but last research I did a few years ago, both companies were under the umbrella of Alphabet.
Nope, Apple has never been owned by Alphabet. Alphabet is literally just Google’s new parent company, formed when they restructured.
https://en.wikipedia.org/wiki/Alphabet_Inc.
deleted by creator
I mean, why shouldn’t it? Is a programming glitch in a self driving all that different from a mechanical issue in a manually driven car?
AI-driven cars are just as prone to mechanical issues as well. Is the AI smart enough to deal with a flat tire? Will it pull over to the side of the road before phoning in for a mechanic, or will it just ignorantly hard-stop right in the middle of the interstate?
What does the AI do when there’s a police officer directing traffic around an accident or through a faulty red-light intersection? I’ve literally seen videos of that before – the AI couldn’t give two shits about a cop’s orders as to which way to drive the vehicle.