• sudneo@lemm.ee · 4 days ago

    There is a fair amount of research showing that model improvement is marginal relative to the extra energy and/or training data it demands. OpenAI itself mentioned about a month ago that they are seeing a smaller improvement from GPT-4 to Orion (I believe) than there was from GPT-3 to GPT-4. We are also running out of quality data to use for training.

    Essentially, what I mean is that the big improvements we saw in the past seem to be over; now improving a little costs a lot. Considering that the costs are exorbitant and the gains small enough, it's not impossible to imagine that companies will eventually give up if they can't monetize this stuff.
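
    As a rough illustration of what "improving a little costs a lot" looks like, here is a toy sketch of a power-law scaling curve. The functional form follows the shape reported in scaling-law papers (e.g. Kaplan et al. 2020), but the constants `a` and `alpha` are invented for illustration, not fitted to any real model:

    ```python
    # Toy illustration of diminishing returns under a power-law scaling curve.
    # The shape L(C) = a * C**(-alpha) matches scaling-law literature; the
    # constants a and alpha are made up for illustration only.

    def loss(compute, a=10.0, alpha=0.05):
        """Hypothetical validation loss as a function of training compute (FLOPs)."""
        return a * compute ** -alpha

    prev = None
    for exponent in range(20, 27):  # training compute from 1e20 to 1e26 FLOPs
        c = 10.0 ** exponent
        l = loss(c)
        gain = prev - l if prev is not None else 0.0
        print(f"compute 1e{exponent}: loss {l:.3f} (gain over previous 10x: {gain:.3f})")
        prev = l
    ```

    Each additional 10x of compute buys a smaller absolute gain than the one before it.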

    • icecreamtaco@lemmy.world · 3 days ago
      Compare Llama 1 to the current state-of-the-art local AIs. They're on a completely different level.

        • sudneo@lemm.ee · 2 days ago
        Yes, because at the beginning there was tons of room for improvement.

        I mean, take OpenAI's word for it: ChatGPT-5 is not seeing the same improvement over 4 that 4 showed over 3, and it's costing a fortune and taking forever. Logarithmic curve, it seems. Also, if we run out of data to train on, that's it.

    • theherk@lemmy.world · 3 days ago
      Surely you can see there is a difference between marginal improvement relative to energy cost and no improvement at all.

        • sudneo@lemm.ee · 2 days ago
        Yes, and I see that difference as hitting the tail of a logarithmic curve, which shows we are close to the limit. I also realize that exponential cost is a de facto limit on improvement. If improving again for ChatGPT-7 will cost 10 trillion, I don't think it will ever happen, right?
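
        Running the same toy power law in reverse shows why: each equal step of improvement costs a growing multiple of compute. Again, a sketch with invented constants, not real figures:

        ```python
        import math

        # Inverting the toy power law L = a * C**(-alpha) to ask how much
        # compute a given target loss costs. Constants are illustrative only.

        def compute_for_loss(target_loss, a=10.0, alpha=0.05):
            return (a / target_loss) ** (1.0 / alpha)

        prev = None
        for target in (1.0, 0.9, 0.8, 0.7, 0.6):
            c = compute_for_loss(target)
            step = c / prev if prev else 1.0
            print(f"loss {target:.1f}: ~1e{math.log10(c):.1f} FLOPs ({step:.0f}x the previous step)")
            prev = c
        ```

        Each equal 0.1 step down in loss costs roughly 8x, then 11x, 14x, 22x more compute than the step before it; at some point the next step simply isn't affordable.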