• 0laura@lemmy.dbzer0.com
      1 day ago

      Yes I’m referring to LLMs, and image classification models. And image generation models. And even the code that controls the Creepers in Minecraft. AGI isn’t a thing, but we’ve had AI for a looong time. It’s just not as flashy as it often looks in Sci-Fi movies.

      • KomfortablesKissen@discuss.tchncs.de
        1 day ago

        Okay, great. AI as you describe it exists, but these are still things, not sentient beings. They never will be. My point is that the only people who think they could be are people who humanize pencils. Or gods. Or other things.

        So yes, AI exist. But not as sentient beings.

        • 0laura@lemmy.dbzer0.com
          1 day ago

          What makes humans different? If someone perfectly simulated my entire brain, would that digital brain be sentient? What even is sentience? I think it’s strange to say that AI will never be sentient.

          • KomfortablesKissen@discuss.tchncs.de
            1 day ago

            Complexity, for one. A cramped foot has an influence on the brain, as apparently do the gut bacteria. Focusing on the brain is a starting point, and we don’t even understand that all that well.

            If someone perfectly simulated your entire brain, would that digital brain be sentient?

            I don’t know. It could be. For now I don’t think so. Are you comparing that to an LLM? That would be like comparing the paths of snail slime to a comic. One could compare storylines and art styles, but they point to something that just isn’t there. And never will be.

            What is sentience?

            Sentience is the ability to experience feelings and sensations (wiki). It’s a word not based on a clear understanding, but rather an attempt to categorize. Nonetheless, an LLM doesn’t experience anything. It uses pattern recognition and human-provided categorization to try to create different things, all within the confines of what it can recognize.

            I think it’s strange to say that AI will never be sentient.

            That’s why it’s important to distinguish between “AI” and “LLM”. AI, in the sense of AGI, is something we might be able to build one day. LLMs might be a step on the way there. But not the way they are now.

            • 0laura@lemmy.dbzer0.com
              16 hours ago

              You have a point with most of the things you said; it’s mostly a matter of perspective and how you define things. The only thing I really fundamentally disagree with is equating AI with AGI.