ChatGPT is losing some of its hype, as traffic falls for the third month in a row

August marked the third month in a row that the number of monthly visits to ChatGPT’s website worldwide was down, per data from Similarweb.

  • Prandom_returns@lemm.ee · ↑10 ↓9 · 1 year ago

    “I can suggest an equation that has been a while to get the money to buy a new one for you to be a part of the wave of the day I will be there for you”

    There, my phone keyboard “hallucinated” this by suggesting the next word.

    I understand that anthropomorphising is fun, but it gives the statistical engines more hype than they deserve.
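    The kind of next-word suggestion described above can be sketched as a toy bigram model: count which word most often follows each word in some text, then suggest the most frequent follower. This is a hypothetical illustration of roughly what a phone keyboard does, not any vendor's actual implementation; the corpus and function names are made up.

    ```python
    from collections import Counter, defaultdict

    # Made-up training text, loosely echoing the quoted suggestion above.
    corpus = (
        "i will be there for you and i will be a part of "
        "the wave of the day"
    ).split()

    # For each word, count which words followed it.
    following = defaultdict(Counter)
    for prev, nxt in zip(corpus, corpus[1:]):
        following[prev][nxt] += 1

    def suggest(word: str) -> str:
        """Return the statistically most frequent next word, or '' if unseen."""
        counts = following.get(word)
        return counts.most_common(1)[0][0] if counts else ""

    print(suggest("will"))  # "be" — it followed "will" both times in the corpus
    ```

    Chaining `suggest` on its own output produces fluent-looking but meaningless strings like the quoted one: each step is locally probable, with no model of what the sentence is about.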

    • chaircat@lemdro.id · ↑6 ↓1 · 1 year ago

      Your phone keyboard’s statistical engine is not an insightful comparison to the neural networks that power LLMs. They’re not the same technology at all; they share only the most superficial similarities.

      • Prandom_returns@lemm.ee · ↑5 ↓5 · 1 year ago

        Ah “neural networks” with no neurons?

        I’m not comparing technologies, I’m saying those are not “hallucinations”, the engines don’t “think” and they don’t “get something wrong”.

        The output is dependent on the input, statistically calculated and presented to the user.

        A parrot is, in the most literal of ways, smarter than the “Artificial intelligence” sentence generators we have now.
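        The claim that the output is “statistically calculated and presented to the user” can be sketched as a single sampling step. This is a hedged toy sketch, not an LLM: the probability table is entirely made up, standing in for whatever distribution a real model would compute over candidate next tokens.

        ```python
        import random

        # Hypothetical next-token distribution (invented numbers).
        probs = {"the": 0.5, "a": 0.3, "parrot": 0.2}

        def sample_token(probs: dict, rng: random.Random) -> str:
            """Draw one token according to its assigned probability."""
            tokens, weights = zip(*probs.items())
            return rng.choices(tokens, weights=weights, k=1)[0]

        rng = random.Random(0)
        print([sample_token(probs, rng) for _ in range(5)])
        ```

        Whatever one calls the result, the mechanism at this step is a weighted draw from a distribution, which is the sense in which the output is “statistically calculated”.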

          • Womble@lemmy.world · ↑4 ↓1 · 1 year ago

            Because they are being wilfully obtuse. They suggest that “neural network”, a term for a computational method going back over half a century, doesn’t apply to anything without biological neurons, and they likewise apply an overly narrow definition of “hallucination”, which has a clear meaning in this context: stating textually probable but incorrect statements.