• Reality Suit@lemmy.one · 104 points · 4 months ago

    The company is responsible. Waymo should get the citation. If there were a live driver, the driver would get the citation. If companies want to start going down the route of AI, then whoever is in ownership or responsible for training, should be responsible for the actions of the AI.

    • FlowVoid@lemmy.world · 55 points · 4 months ago

      Arizona law does allow officers to give out tickets when a robotaxi commits a traffic violation while driving autonomously; however, officers have to give them to the company that owns the vehicle. Doing so is “not feasible,” according to a Phoenix police spokesperson.

      • Chozo@fedia.io · 49 points · 4 months ago

        I’m not sure why the police say it’s “not feasible” to issue Google a citation. Google is the registered owner of the vehicles and thus responsible for any actions they perform. Just mail them a ticket?

        • FlowVoid@lemmy.world · 29 points · 4 months ago

          I’m just speculating, but there is probably a very efficient workflow for sending a ticket to an individual (given the number of tickets police write and the revenue they generate), and I wouldn’t be surprised if that workflow doesn’t accommodate an AI-operated vehicle. Kind of like how a restaurant would need to restructure its workflow to accommodate DoorDash.

          In other words, “infeasible” might actually mean “would take extra effort”.

          • Pasta Dental@sh.itjust.works · 7 points · 4 months ago

            Yeah, they probably just use a 20-year-old, out-of-date system (like any self-respecting government agency) that doesn’t account for the possibility of a car without a driver.

          • SlopppyEngineer@lemmy.world · 3 points · 4 months ago

            I thought the laws in the USA prevented this. It’s why you have manned speed traps: citations must be handed to the driver personally, whereas other countries have automated speed check systems and send the ticket to the owner of the car, which can be a leasing company, for example.

            • Ferris@infosec.pub · 11 points · 4 months ago

              how about you tape/glue copies of the ticket over the lenses of any exposed cameras and allow Google to figure out the logistics of how to pay the ticket?

            • FlowVoid@lemmy.world · 5 points · 4 months ago

              citations must be handed over personally to the driver

              In Arizona, the operator of an AI vehicle must submit a law enforcement interaction plan that specifies how they will be ticketed.

              However, it’s quite possible that actually following the plan is a pain in the butt for traffic cops, and they simply don’t want to put in the effort.

            • Spiralvortexisalie@lemmy.world · 2 points · 4 months ago

              Generally in the United States you have an opportunity to cross-examine all evidence. These cameras are not calibrated regularly and are generally not kept up (arguably they are so low-budget they need no upkeep), so they become inadmissible when you challenge them. Many people win such challenges because the camera was last calibrated and cleaned when it was installed.

              • SlopppyEngineer@lemmy.world · 1 point · 4 months ago

                We have that opportunity too. You can opt not to accept the proposed (automated) settlement and challenge the citation itself. People have done that and won. However, the administrative fees for that are often higher than the proposed settlement, so it’s only worth it in special cases.

                • Spiralvortexisalie@lemmy.world · 1 point · 4 months ago

                  Can’t speak to other countries, but that generally offends American courts; it comes off as retaliation for exercising your rights and has been struck down numerous times in various venues. One of the most sacred rights in America is to be heard and reheard in front of a court of competent jurisdiction; we all have our day in court.

      • Reality Suit@lemmy.one · 12 points · 4 months ago

        How is it not feasible? Companies have addresses and records of employees. I know you’re just quoting, but something doesn’t sound right. I mean, we are talking about the Phoenix police, so that could explain it.

    • Flying Squid@lemmy.world · 35 points · 4 months ago

      Corporations are people until a crime is committed, at which point you can’t punish a corporation for a crime a person commits.

      I don’t understand it, but apparently that’s how it works.

  • iknowitwheniseeit@lemmynsfw.com · 40 points · 4 months ago

    If a human did this, they would at least get a ticket with a fine, and have the violation recorded on their license which would be revoked if it kept happening. With the computer controlled car, the cop called customer support and was like, “hey you might want to look into it or something.”

    I guess we can’t expect the people hired to protect capital to act against capital, but it’s still a bit disturbing.

    • Chozo@fedia.io · 46 points · 4 months ago

      I used to work on the software for these cars, so I can speak to this a little. For what it’s worth, I’m no longer with the project, so I have no reason to be sucking Google’s dick if these weren’t my honest opinions on the tech being used here. None of this is to excuse or defend Google, just sharing my insight on how these cars operate based on my experiences with them.

      Waymo’s cars actually do a really good job at self-navigation. Like, sometimes it’s scary how good they actually are when you see the conditions they can operate under. There are so many layers of redundancy that you could lose all of the camera feeds, GPS, and cellular data, and they’d still be able to navigate themselves through traffic using the LIDAR sensors. Hell, even if you removed the LIDAR from that scenario, those cars accurately know their location based on the last known location combined with how many times each tire has revolved (though the car would just run into everything along the way, at least it’d know where it was the entire time). All of the other sensors and data points collected by the cars end up making GPS the least accurate sensor on the car.
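That last fallback is classic wheel-odometry dead reckoning. A toy sketch of the idea; the circumference value and function name are my own inventions, not anything from Waymo's stack:

```python
import math

# Hypothetical tire circumference; a real system would use a calibrated value.
WHEEL_CIRCUMFERENCE_M = 2.0

def dead_reckon(x, y, heading_rad, revolutions, heading_change_rad=0.0):
    """Update a last-known position using only wheel-revolution counts.

    With GPS, cameras, and cellular all gone, distance traveled is just
    revolutions * circumference, projected along the current heading.
    """
    distance = revolutions * WHEEL_CIRCUMFERENCE_M
    heading_rad += heading_change_rad
    return (x + distance * math.cos(heading_rad),
            y + distance * math.sin(heading_rad),
            heading_rad)

# Start at the origin facing east (heading 0) and roll 100 revolutions straight.
x, y, h = dead_reckon(0.0, 0.0, 0.0, 100)  # x is now 200.0 m east
```

The error accumulates over time (tire wear, slip), which is why it only works as a stopgap on top of a good last-known fix.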

      That said, the article attributes the incident to “inconsistent construction signage,” which I’d assume to be pretty accurate from my own experience with these cars. Waymo’s cars are usually really good at detecting cone placements and determining where traffic is being rerouted. But… that’s generally only when the cones are where they’re supposed to be. I’ve seen enough roadwork in Phoenix to know that sometimes Mad Max rules apply, and even I wouldn’t know how to drive through some of those work zones. It was pretty rare that I’d have to remotely take over an SDC, but 9 times out of 10 when I did, it was because construction signs/equipment were in weird places and I’d have to K-turn the car back the way it came.

      That’s not to say that construction consistently causes the cars to get stuck, but I’d say it was one of the more common pain points. In theory, if somebody were to run over a cone and nobody picked it back up, an SDC might not interpret that obstruction properly and could make a dumb decision like going down the wrong lane, under the incorrect assumption that traffic has been temporarily rerouted that way. It sounds scary, and probably looks scary as hell if you saw it on the street, but even then it’s going to stop itself before coming anywhere near an oncoming car, even if it thinks it has right of way, since proximity to other objects takes priority over temporary signage.
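That priority ordering (proximity to physical objects outranks whatever the temporary signage seems to say) amounts to a tiny rule hierarchy. A sketch with made-up names and a made-up threshold:

```python
# Hypothetical safety threshold; nothing here reflects Waymo's actual values.
SAFETY_BUBBLE_M = 5.0

def choose_action(nearest_obstacle_m, signage_says_go):
    """Obstacle proximity is checked first and overrides signage inference."""
    if nearest_obstacle_m < SAFETY_BUBBLE_M:
        return "stop"  # an oncoming car wins over cones, always
    return "proceed" if signage_says_go else "stop"

print(choose_action(2.0, True))   # stop: the obstacle overrides the inferred reroute
print(choose_action(50.0, True))  # proceed
```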

      The “driving through a red light” part I’m assuming might actually be inaccurate. Cops do lie, after all. I 100% believe in a Waymo car going down the opposing lane after some sketchy road cones, but I have a hard time buying that it ran a red light, since they will not go if they don’t detect a green light. Passing through an intersection requires a positive detection of a green light; positive or negative detection of red won’t matter, it has to see a green light for its intended lane or it will assume it has to stop at the line.
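The "must positively see green" behavior described above is a default-deny check: failing to detect a red light is not treated as permission to go. A minimal sketch, with detection labels I made up for illustration:

```python
def may_enter_intersection(detections):
    """Proceed only on a positive green detection for the intended lane.

    Note the asymmetry: an occluded, missing, or ambiguous signal detection
    is treated exactly like a stop, not like an all-clear.
    """
    return detections.get("intended_lane_signal") == "green"

print(may_enter_intersection({"intended_lane_signal": "green"}))    # True
print(may_enter_intersection({}))                                   # False: no detection means stop
print(may_enter_intersection({"intended_lane_signal": "unknown"}))  # False
```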

      In the video, the cop says he turns on his lights and the SDC blows through a red light. While I was working there, red light violations were so rare that literally 100% of the red light violations we received were while a human was driving the car in manual mode. What I’d assume was likely going on is that the SDC was already in a state of “owning” the intersection for an unprotected left turn when the lights came on. When an SDC thinks it’s being pulled over, it’s going to go through its “pullover” process, which first requires exiting an intersection if currently in one. So what likely ended up happening is the SDC was already in the intersection preparing for a left turn, the light turns red while the SDC is in the box (and still legally has right of way to the intersection), cop turns on the sirens, SDC proceeds “forward” through the intersection until it’s able to pull over.
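The pullover process described above (clear any intersection you are already in, then find a safe place to stop) is essentially a fixed step ordering. A hypothetical sketch of why that sequencing can look like "running the red" from the outside:

```python
def pullover_plan(in_intersection):
    """Hypothetical ordering of the pullover process: an SDC already inside
    the box clears it before stopping, which can read as a red-light run if
    the light changed while the car was legally inside the intersection."""
    steps = []
    if in_intersection:
        steps.append("exit_intersection")  # finish the maneuver it legally started
    steps += ["find_safe_spot", "stop_with_hazards"]
    return steps

print(pullover_plan(True))   # ['exit_intersection', 'find_safe_spot', 'stop_with_hazards']
print(pullover_plan(False))  # ['find_safe_spot', 'stop_with_hazards']
```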

      But, that’s just my speculation based on my somewhat outdated understanding of the software behind these cars. I’d love to see the video of it, but I doubt Waymo will release it unless there’s a lawsuit.

      • Prison Mike@links.hackliberty.org · 12 points · 4 months ago

        The red light bit seems spot on. In every article stating “it blew through a red light” there’s always the caveat that it’s just trying to clear the intersection while getting pulled over. Technically people are allowed to do that (and/or move to a safer area, such as getting into the right lane when being pulled over in the left lane).

        I think media like to add the intersection stuff to rile people up.

      • rhythmisaprancer@moist.catsweat.com · 12 points · 4 months ago

        This is pretty interesting to read, thanks! I would think that Waymo employs an abundance of visual sensors that could give us an idea of what happened, if they chose to release the footage. Construction zones can be hard; maybe they need to own this one?

      • bitwaba@lemmy.world · 4 points · 4 months ago

        If you listen to the video of the interaction between the police officer and the two Waymo guys, it’s clear to me he’s not making anything up about the events that took place. The car did run through the intersection when he turned on his lights. He’s not trying to issue tickets or anything; he really is interacting with the Waymo people to let them know, “your car was behaving erratically; it needs to be off the road.” It’s very possible that the road construction uncertainty, plus being in an oncoming traffic lane, plus being lit up by the police triggered some very specific failure of process in the code.

      • Doubletwist@lemmy.world · 2 points · 4 months ago

        So I’ve been in situations where I was stopped at a red light, and emergency vehicles were coming and I was waved by a policeman to cross the intersection against the red light to clear the way.

        So what, is a self driving car going to just sit there and keep the intersection blocked?

        • Chozo@fedia.io · 3 points · 4 months ago

          (I’m assuming we’re talking about unprotected left turns.)

          I don’t know if I ever saw it happen, myself, so I can’t say for certain. My understanding of the SDC’s logic is that if it was already in the intersection, it would complete the turn, and then pull off to the right shoulder to let the emergency vehicle pass. If it hasn’t yet entered the intersection and detects siren lights behind it, I believe it will turn on the hazard lights and remain stationary unless honked at (I could be mistaken, but I think it’ll recognize being honked at by emergency vehicles, and will assume it to mean “move forward and clear a path”). The SDCs have an array of microphones around the car to detect honks, sirens, nearby crashes, etc, and can tell the direction the sounds are coming from for this purpose.

          That said, because it’s listening for sirens, the SDC will usually be aware that there’s an emergency vehicle heading toward it well ahead of time, and if they’ve got their lights on, the SDC will usually be able to determine which vehicle, specifically, is the emergency vehicle, so it can monitor its trajectory and make sure it’s staying out of the way when possible. Typically, they will be proactive about steering clear of anything with lights/sirens running.

          This would also be considered a higher-priority event; usually it will automatically ping a live human to remotely monitor the situation, and depending on the specific context, they may command the SDC to remain stationary, proceed forward, make a U-turn, or whatever else is necessary. In case the emergency vehicle has a loudspeaker, we’d be able to hear any requests they make of us as well.

          For what it’s worth, I know that Waymo also works pretty closely with the Phoenix PD and provides them with updates about any significant changes to the cars’ behavior, plus tips/tricks for dealing with a stuck car in an emergency, so if a situation got particularly sticky, the cops would know how to work around it. My understanding is that Phoenix PD has generally been very cooperative, though they’ve apparently had issues with state troopers who don’t seem to care to learn how to deal with the cars.

    • OsrsNeedsF2P@lemmy.ml · 14 points · 4 months ago

      It did recognize the patterns, but the construction signs were (allegedly) inconsistent

      Also, not that your comment was alluding otherwise, but self-driving cars only use AI for recognition; the decision-making is deterministic algorithms.
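That split, learned models for perception and plain deterministic code for decisions, might look roughly like this; the stub below is purely illustrative, not any real planner:

```python
def perceive(sensor_frame):
    """Stand-in for the learned half: raw sensor data in, labeled objects out.
    (In a real stack this is where the neural networks live; here it's a stub.)"""
    return sensor_frame["labels"]

def decide(objects):
    """The deterministic half: fixed rules, so the same inputs
    always produce the same plan, which makes it auditable."""
    if "pedestrian" in objects or "red_light" in objects:
        return "stop"
    return "proceed"

print(decide(perceive({"labels": ["red_light"]})))  # stop
print(decide(perceive({"labels": []})))             # proceed
```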

  • OsrsNeedsF2P@lemmy.ml · 15 points · 4 months ago

    Waymo told multiple outlets that the vehicle drove into the oncoming lane because of “inconsistent construction signage,” and that it “was blocked from navigating back into the correct lane.” The company said the car drove away from the cop “in an effort to clear the intersection” before pulling into the parking lot where the traffic stop took place

    Waymo didn’t immediately respond to The Verge’s request for comment. The company told Fox 10 Phoenix that its cars “are three-and-a-half times more likely” to avoid a crash than a human being.

    I actually rode in a Waymo yesterday. They’re quite cool, and as much as I hate the car centric society in the West, I hope they catch on

  • JeeBaiChow@lemmy.world · 3 points · 4 months ago

    Isn’t a company at least responsible for the safe operation and training of human drivers? Wouldn’t it be the same for the training of self driving cars?

  • benji@lemmy.world · 2 points · 4 months ago

    We’re still so far away from this technology being viable for everyday use, aren’t we?