Mark Rober just set up one of the most interesting self-driving tests of 2025, and he did it by imitating Looney Tunes. The former NASA engineer and current YouTube mad scientist recreated the classic gag where Wile E. Coyote paints a tunnel onto a wall to fool the Road Runner. Only this time, the test […]

  • yesmeisyes@sopuli.xyz · 12 hours ago

    He didn’t even use Tesla’s Full Self-Driving. He used the ancient Autopilot software, and even with that he apparently manually disengaged it before impact. Seems pretty disingenuous to me.

    • Kissaki@beehaw.org · 5 hours ago

      and even with that he apparently manually disengaged it before impact

      Source?

    • Kissaki@beehaw.org · 5 hours ago

      They did the first kid test with both Autopilot and self-driving (or whatever you call it). Was that different for the later tests?

      • yesmeisyes@sopuli.xyz · 8 hours ago

        My point is that FSD is a much, much more advanced piece of software. It’s wrong to label the video as self-driving when you are not using FSD. Autopilot is just adaptive cruise control that keeps the car in its lane.

        • August27th@lemmy.ca · edited · 3 hours ago

          Autopilot is just adaptive cruise control that keeps the car in lane.

          Anyone who watches the video in question knows this statement is misleading. Autopilot also stops when it detects an obstacle in the way (well, it’s supposed to, but the video demonstrates otherwise). Furthermore, decades-old adaptive cruise from other brands will stop too, because even they have classic radar or laser range-finding (a rough sketch of that kind of range check follows this comment).

          If even the most basic go/no-go + steer operation based on computer vision can’t detect and stop before obstacles, why trust an even more complicated solution? If they don’t back-port some apparent detection upgrade from FSD to the basic case, that demonstrates even further neglect anyway.

          The whole point that everyone is dancing around is that Tesla gambled that cheaping out by using only cameras would be fine, but it cannot even match decades-old technology for the basic case.

          Did they test it against decades-old adaptive cruise? No, that’s been solved, but they did test it against that technology’s next generation, and it ran circles around vision not backed by a human brain.
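
To make the point above about radar and laser range-finding concrete: a range sensor reports a distance to whatever is ahead, regardless of what it looks like, so even a painted wall produces a closing-range reading. Below is a minimal illustrative sketch of a range-gated braking check; the function names and thresholds are made up for illustration and are not any manufacturer’s actual logic.

```python
# Minimal sketch of a range-gated braking check of the kind radar/lidar
# adaptive cruise and AEB systems perform (in far more sophisticated form).
# All names and thresholds are illustrative, not from any real system.

def time_to_collision(range_m: float, closing_speed_mps: float) -> float:
    """Seconds until impact if the closing speed stays constant."""
    if closing_speed_mps <= 0:
        return float("inf")  # not closing on the obstacle
    return range_m / closing_speed_mps

def should_brake(range_m: float, closing_speed_mps: float,
                 ttc_threshold_s: float = 2.0) -> bool:
    # The sensor reports a distance no matter what the obstacle looks like,
    # so a painted wall triggers the same check as a real one.
    return time_to_collision(range_m, closing_speed_mps) < ttc_threshold_s

# Example: wall 30 m ahead, closing at 20 m/s (~72 km/h) -> TTC of 1.5 s -> brake.
print(should_brake(30.0, 20.0))  # True
```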

          • yesmeisyes@sopuli.xyz · 3 hours ago

            Autopilot hasn’t received any updates for years. Tesla is only focusing on FSD. This makes your point invalid.

              • yesmeisyes@sopuli.xyz · 1 hour ago

                OK, so every car manufacturer is also demonstrating negligence, because they can’t even get their updates working in the first place.

        • Hawk@lemmy.dbzer0.com · 5 hours ago

          If you close your eyes, it doesn’t matter whether you’re wearing glasses or not.

          If the car’s sensors could not pick up the wall, it doesn’t matter which software version it’s running.

          • yesmeisyes@sopuli.xyz · 3 hours ago

            Tesla is only using vision. Software makes ALL the difference. If you don’t have a brain, it doesn’t matter whether you have eyes or not.

        • lemmyingly@lemm.ee · 6 hours ago

          It’s wrong to label a Tesla or any of its software as ‘full self-driving’.

          Quite clearly, Mark demonstrated that the safety systems were engaged in whatever mode he had it in; otherwise the vehicle would never have stopped for the obstacle in front of it.

          • Jaime Visser@mastodon.nl · 4 hours ago

            @lemmyingly @yesmeisyes Tesla’s safety systems only do emergency stops for certain stationary objects (cars, bicyclists, pedestrians). The real test would be to see whether FSD would actively plan to drive through that wall (a small sketch of that class-gated idea follows this comment).

            You can see that even AP wasn’t enabled in most of the test, so it’s not a test of FSD.
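
If the claim in the last comment is right and the emergency stop is gated on recognizing one of a few object classes, the failure mode is easy to see in miniature. The sketch below contrasts a hypothetical class-gated trigger with a purely range-gated one; the classes, names, and distances are assumptions for illustration, not Tesla’s actual implementation.

```python
# Hypothetical contrast between a class-gated emergency-stop trigger and a
# purely range-gated one. Classes, names, and thresholds are made up for
# illustration and do not describe any real vehicle's logic.

STOPPABLE_CLASSES = {"car", "bicyclist", "pedestrian"}

def class_gated_stop(detected_class: str | None, range_m: float) -> bool:
    # Brakes only when the detector assigns a known class to the obstacle.
    # A painted wall the vision stack classifies as nothing (None) never
    # triggers a stop, no matter how close it gets.
    return detected_class in STOPPABLE_CLASSES and range_m < 30.0

def range_gated_stop(range_m: float) -> bool:
    # Brakes on proximity alone; does not care what the obstacle is.
    return range_m < 30.0

print(class_gated_stop(None, 10.0))  # False: wall not assigned a known class, no stop
print(range_gated_stop(10.0))        # True: something is 10 m ahead, stop
```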