• FaceDeer@fedia.io · 2 months ago

    Yes, and I’m saying there’s nothing wrong with that “buzzword.” It’s accurate, just more generic.

    I see a lot of people these days objecting that LLMs and whatnot “aren’t really artificial intelligence,” because they’re operating from the definition of artificial intelligence they got from science-fiction TV shows, where it’s not AI unless it replicates or exceeds human intelligence in every meaningful way. The term has been widely used in computer science for 70 years, though, applying to a broad range of subjects. Machine learning is clearly within that range.

    • Ephera@lemmy.ml · 2 months ago

      There’s a distinction between “narrow AI” and “artificial general intelligence” (AGI).

      AGI is that sci-fi AI, whereas narrow AI is only intelligent within one task, like a pocket calculator or a robot arm or an LLM.

      And as you point out, saying that you’re doing narrow AI is absolutely not interesting. So I think it’s fair enough that people assume that when “AI” is used as a buzzword, it doesn’t mean the pocket-calculator kind.

      Not to mention that e.g. OpenAI explicitly states that they’re working towards AGI.

      • exocrinous@startrek.website · 2 months ago

        If I built a robot pigeon that can fly, scavenge for crumbs, sing mating calls, and approximate sex with other pigeons, is that an AGI? It can’t read or write or talk or compose music or draw or paint or do math or use the scientific method or debate philosophy. But it can do everything a pigeon can. Is it general or not? And if it’s not, what makes human intelligence general in a way that pigeon intelligence isn’t?