“Translation: all the times Tesla has vowed that all of its vehicles would soon be capable of fully driving themselves may have been a convenient act of salesmanship that ultimately turned out not to be true.”

Another way to say that is that Tesla scammed all of its customers, since, you know, everyone saw this coming…

  • 🇰 🌀 🇱 🇦 🇳 🇦 🇰 ℹ️@yiffit.net · 2 months ago

    All the issues with self-driving could be solved if they actually gave a shit about making it work. You don’t let the machine choose; you give it hard fucking rules to follow. It doesn’t need to distinguish goose, human, ball, dog, or child and react differently to each; it should see an obstruction and stop, to avoid damaging both the fucking object and the car, regardless of what it is. They are making it way more complicated than it really has to be.

    • Eranziel@lemmy.world · 2 months ago

      You’re making it far simpler than it actually is. The essential first problem is recognizing what a thing is. Is that a child, a ball, a goose, a pothole, or a shadow that the cameras see? It would be absurd and an absolute show stopper if the car stopped for dark shadows.

      We take for granted the vast amount that the human brain does in this problem space. The system has to identify and categorize what it’s seeing, otherwise it’s useless.

      That leads to my actual opinion on the technology, which is that it’s going to be nearly impossible to have fully autonomous cars on roads as we know them. It’s fine if everything is normal, which is most of the time. But software can’t recognize and correctly react to the thousands of novel situations that can happen.

      They should be automating trains instead. (Oh wait, we pretty much did that already.)

      • 🇰 🌀 🇱 🇦 🇳 🇦 🇰 ℹ️@yiffit.net · 2 months ago

        “It would be absurd and an absolute show stopper if the car stopped for dark shadows.”

        That’s why they use LIDAR and not just visual cameras. They don’t need to know the difference between different objects; they just need to know an object is there, in the way, or even moving in a way that could potentially put it in the path of the vehicle.

        They’re making it more complicated by working on autonomous driving and general-purpose AI image recognition at the same time.
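The “an object is there, in the way” rule described above can be sketched in a few lines. This is a toy illustration, not any vendor’s actual code: it assumes hypothetical 2D LIDAR returns as (x, y) points in metres, with the vehicle at the origin facing +x, and made-up corridor and lookahead thresholds.

```python
def should_brake(points, corridor_half_width=1.2, lookahead=20.0):
    """Return True if any LIDAR return lies inside the vehicle's path.

    points: iterable of (x, y) tuples in metres; vehicle at origin, facing +x.
    corridor_half_width / lookahead: illustrative thresholds, not real tuning.
    """
    for x, y in points:
        # Ahead of the car, within range, and inside the lane-width corridor:
        # treat it as an obstruction regardless of what it is.
        if 0.0 < x <= lookahead and abs(y) <= corridor_half_width:
            return True
    return False

# A goose 5 m ahead in the lane triggers a stop; a wall 3 m off to the side doesn't.
print(should_brake([(5.0, 0.3)]))   # True
print(should_brake([(3.0, 4.0)]))   # False
```

Note this is exactly the behaviour the reply below objects to: the rule brakes for anything in the corridor, so an intransigent goose (or a blowing plastic bag) stops the car every time.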

        • Eranziel@lemmy.world · 2 months ago

          I agree that LIDAR or radar are better solutions than image recognition. I mean, that’s literally what those technologies are for.

          But even then, that’s not enough. LIDAR/radar can’t help the car identify its lane in inclement weather, drive well on gravel, and so on. These are the kinds of problems where automakers severely downplay the difficulty, and just how much a human driver actually does.

        • ristoril_zip · 2 months ago

          My point is that “if there’s an obstruction, stop” means these cars are going to be stopping and requiring human intervention all the time. That’s semi-autonomous at best.

          I don’t know if you’ve encountered intransigent geese in your driving adventures, but the only way to deal with them is to slowly drive through the flock until they move out of your way.

          Fully autonomous cars are never going to happen without major changes to our roads. We’d be better off investing in more buses and trains.