• RecallMadness@lemmy.nz · 8 months ago

    How long until we find out they’re being driven by 1,000 workers in India, like what was recently revealed about Amazon’s “staffless” stores?

  • Immersive_Matthew@sh.itjust.works · 8 months ago

    More proof that Elon is not the genius he claims to be. Well, we can agree he’s a genius at manipulating people, and at sketching bad truck designs on napkins, or whatever that story is.

    • Thorny_Insight@lemm.ee · 8 months ago · edited

      Drivers can activate Mercedes’s technology, called Drive Pilot, when certain conditions are met, including in heavy traffic jams, during the daytime, on specific California and Nevada freeways, and when the car is traveling less than 40 mph. Drivers can focus on other activities until the vehicle alerts them to resume control. The technology does not work on roads that haven’t been pre-approved by Mercedes, including on freeways in other states.

      Meanwhile, Teslas with FSD V12 drive just fine almost anywhere, including on roads that aren’t even mapped.
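
      For clarity, the Drive Pilot rules quoted above boil down to a strict AND of four conditions; a minimal sketch (the function name, signature, and types are illustrative, not Mercedes’s actual API):

      ```python
      # Sketch of the Drive Pilot activation gate described in the quoted
      # article. Function name, signature, and types are hypothetical.
      def drive_pilot_available(heavy_traffic: bool,
                                daytime: bool,
                                on_approved_freeway: bool,  # pre-approved CA/NV freeways only
                                speed_mph: float) -> bool:
          """Every condition must hold at once; failing any one keeps the system off."""
          return (heavy_traffic
                  and daytime
                  and on_approved_freeway
                  and speed_mph < 40.0)
      ```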

      • anyhow2503@lemmy.world · 8 months ago

        I can only think of that one video where some guy’s Tesla desperately wants to veer into oncoming traffic.

        • Thorny_Insight@lemm.ee · 8 months ago

          There are always going to be cases where these systems fail. Even with a self-driving car that’s ten times safer than the best human driver, there would still be around 4,000 fatal accidents a year in the US alone (roughly 40,000 people die on US roads annually). FSD is probably already a safer driver than a human. When it fails, that generally means it got stuck somewhere, not that it caused an accident. I haven’t seen the video in question, but it was probably an older version, or Autopilot rather than FSD.
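
          That 4,000 figure is straightforward division; a quick back-of-the-envelope (the ~40,000 baseline is an approximation of current annual US road fatalities, not a figure from this thread):

          ```python
          # Rough arithmetic behind the claim above. The baseline is an
          # approximation of annual US road fatalities, not an exact statistic.
          US_ROAD_DEATHS_PER_YEAR = 40_000  # approximate
          SAFETY_MULTIPLIER = 10            # "10x safer than the best human"

          remaining = US_ROAD_DEATHS_PER_YEAR / SAFETY_MULTIPLIER
          print(f"Expected fatal accidents per year: {remaining:,.0f}")  # -> 4,000
          ```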

          • anyhow2503@lemmy.world · 8 months ago

            It seems like a good decision, then, to limit self-driving systems to situations where they are less likely to fail.

            FSD is probably already a safer driver than a human.

            Even with the horrendous driving skills of some people, that’s a very bold claim without any actual evidence.

            When it fails, that generally means it got stuck somewhere, not that it caused an accident. I haven’t seen the video in question, but it was probably an older version, or Autopilot rather than FSD.

            It doesn’t make much difference what Tesla calls its latest beta software update, imho. If their Autopilot is enough to get you into dangerous situations, how is a system with even less human oversight going to be fundamentally different? I’ll need to see some more critical reviews of this system, after years of Tesla not delivering on its claims and only rolling features out to select beta testers to maintain plausible deniability.

            I couldn’t find the specific video of older versions trying really hard to drive into oncoming traffic, though there are plenty. I did find one of the FSD Beta from six months ago, though, where it can’t seem to decide which lane is correct.

            • Thorny_Insight@lemm.ee · 8 months ago

              It doesn’t make much difference what Tesla calls its latest beta software update, imho.

              Autopilot and FSD Beta are two different systems, of which Autopilot is the less advanced one. There’s only one death ever linked to the use of FSD Beta, and that includes the older versions as well.

              The only statistics available regarding the safety of FSD and Autopilot come from Tesla itself, which one should probably take with a grain of salt, but they seem to indicate it’s about 5x safer than the average American driver.
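
              For context, a multiplier like that is typically derived by comparing miles driven per crash; a sketch with purely made-up numbers (neither mileage figure below comes from Tesla’s actual report):

              ```python
              # How an "Nx safer" multiplier is computed from crash-rate data.
              # Both mileage figures below are hypothetical placeholders.
              miles_per_crash_fsd = 5_000_000     # hypothetical
              miles_per_crash_us_avg = 1_000_000  # hypothetical

              multiplier = miles_per_crash_fsd / miles_per_crash_us_avg
              print(f"{multiplier:.0f}x safer by this metric")  # -> 5x
              ```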

              Then there are, of course, plenty of independent YouTubers putting these systems to the test, such as AI DRIVR and CYBRLFT, who give pretty honest assessments of their strengths and weaknesses.

              • anyhow2503@lemmy.world · 8 months ago

                Autopilot and FSD Beta are two different systems, of which Autopilot is the less advanced one. There’s only one death ever linked to the use of FSD Beta, and that includes the older versions as well.

                I know. Tesla has already advertised that their newer system is fully based on an ANN. Factoring in their current track record doesn’t inspire any confidence in me. I’m not reading that paywalled article, but one death for a system that only had a limited rollout until very recently isn’t enough to make me believe it’s reasonably safe either. There just isn’t trustworthy, large-scale data out there yet. We need to keep perspective here: this is pretty much Tesla’s last chance to actually make good on their empty promises, and they have a lot to prove.

                At this point I’m not willing to take any statistical claim coming from Tesla, salt or not.