• Kazumara@discuss.tchncs.de

    Hm, even with DeepSeek being more efficient, wouldn’t that just mean the rich corps throw the same amount of hardware at it to achieve a better result?

    In the end I’m not convinced this would even reduce hardware demand. It’s funny that this of all things deflates part of the bubble.

    • UnderpantsWeevil@lemmy.world

      Hm, even with DeepSeek being more efficient, wouldn’t that just mean the rich corps throw the same amount of hardware at it to achieve a better result?

      Only up to the point where the AI models yield value (which is already heavily speculative). If nothing else, DeepSeek makes Altman’s plan for $1T in new data centers look like overkill.

      The revelation that you can get 100x gains by optimizing your code rather than throwing endless compute at your model means the value of graphics cards goes down relative to the value of PhD-tier developers. Why burn through a hundred warehouses full of cards to do what a university mathematics department can deliver in half the time?
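      A rough back-of-envelope sketch of that point (every number below is a made-up placeholder, not DeepSeek’s actual figures): if software-side optimizations cut the compute a training run needs, the number of cards needed to finish it in the same time window drops proportionally.

      # Back-of-envelope sketch with invented placeholder numbers, not real
      # DeepSeek or Nvidia figures. Shows how a software efficiency gain
      # shrinks the GPU count needed to hit a fixed training budget on time.

      TARGET_TRAINING_FLOP = 1e25               # assumed total compute for one training run
      FLOP_PER_GPU_SECOND = 1e15                # assumed effective throughput of one card
      TRAINING_WINDOW_SECONDS = 90 * 24 * 3600  # finish the run in roughly 90 days

      def gpus_needed(efficiency_gain: float) -> int:
          """GPUs required if optimizations cut the compute bill by `efficiency_gain`x."""
          effective_flop = TARGET_TRAINING_FLOP / efficiency_gain
          return round(effective_flop / (FLOP_PER_GPU_SECOND * TRAINING_WINDOW_SECONDS))

      for gain in (1, 10, 100):
          print(f"{gain:>3}x more efficient -> ~{gpus_needed(gain):,} GPUs")

      With these placeholder numbers, a 100x gain turns a warehouse of roughly 1,300 cards into about 13. Whatever the real figures are, the hardware demand scales down by the same factor the software scales up.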

      • AppleTea

        you can get 100x gains by optimizing your code rather than throwing endless compute at your model

        woah, that sounds dangerously close to saying this is all just developing computer software. Don’t you know we’re trying to build God???

        • UnderpantsWeevil@lemmy.world

          Altman insisting that once the model is good enough, it will program itself was the moment I wrote the whole thing off as a flop.

    • peereboominc@lemm.ee

      Maybe, but it also means that if a company needs a datacenter with 1,000 GPUs to meet its AI demand, it will now buy 500.

      Next year it might need more, but by then AMD could have better GPUs.

    • sith

      It will probably not reduce demand. But it will for sure make it impossible to sell insanely overpriced hardware. Now I’m looking forward to buying a PC with a Chinese open-source RISC-V CPU and GPU. Bye bye Intel, AMD, ARM and Nvidia.