• stoy · 17 hours ago

    Cool, but using AI means that we won’t be able to completely trust the results

    • Flying Squid@lemmy.world (OP) · 17 hours ago

      It’s not the AI you’re thinking of. This has nothing to do with image creation or large language models.

      • MurrayL@lemmy.world · 16 hours ago

        Good (but sad) demo of how LLMs and Stable Diffusion have completely poisoned the term "AI" to the point that even legitimate use cases get shit on by association.

        • cabbage@piefed.social (edited) · 14 hours ago

          Or how it has normalized calling everything under the sun "AI" when what is actually going on is machine learning. "AI" implies that the machine is thinking for itself, which of course leads (reasonable) people to conclude that it's not a very reliable source.

        • _cnt0@sh.itjust.works · 10 hours ago

          Nothing in existence that has been labeled AI actually is AI. If AI (in the actual meaning of the words) exists, it's in some secret facility and we don't know about it. Outside of fiction, AI is a meaningless marketing term. That was already the case before LLMs and Stable Diffusion.