• li10@feddit.uk · 1 year ago

      It makes sense to me; AI needs GPUs, and there’s an AI boom.

      Probably won’t be as bad as the crypto situation, but I imagine it will have an effect on GPU availability at some point.

      • beefcat@kbin.social · 1 year ago

        The AI boom started last year; it’s the reason prices are already high. So I’m skeptical that they’re going to get higher, seeing as this whole AI thing feels like a bubble.

        • Vilian@lemmy.ca · 1 year ago

          True. If prices are going to peak, it’s right now; a lot of news debunking AI, talk of regulation, reports of ChatGPT getting worse, etc. are appearing.

      • redcalcium@lemmy.institute · 1 year ago

        AI needs GPUs with a huge amount of VRAM. Nvidia’s move to put a criminally low amount of VRAM in its low- and mid-range gaming GPUs will ensure those cards won’t be hoarded by companies for AI stuff. Whether gamers actually want to buy GPUs with such a low amount of VRAM is a different matter, though.

  • li10@feddit.uk · 1 year ago

    Remember when PC gaming used to be hands down better than console?

    Anyone who still believes that is stuck in the past.

    PC certainly has its benefits and is my platform of choice, but if somebody was getting into gaming or just casually interested then a PS5 is a much better choice these days.

  • armchair_progamer@programming.dev · edited · 1 year ago

    But aren’t the GPUs used for AI different from the GPUs used by gamers? 8GB of VRAM isn’t enough to run even the smaller LLMs; you need specialized GPUs with 80+GB, like A100s and H100s.
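
    As a rough back-of-envelope for the weights alone (my own illustrative numbers, assuming fp16, i.e. 2 bytes per parameter; the KV cache and activations add more on top):

        # Illustrative VRAM math for just the weights of an LLM at fp16
        # (2 bytes per parameter). Real requirements are higher once the
        # KV cache and activations are counted.
        def weights_vram_gb(params_billions: float, bytes_per_param: int = 2) -> float:
            return params_billions * 1e9 * bytes_per_param / 1024**3

        for size in (7, 13, 70):
            print(f"{size}B params -> ~{weights_vram_gb(size):.0f} GB (weights only)")
        # 7B  -> ~13 GB  (already past an 8GB card)
        # 13B -> ~24 GB
        # 70B -> ~130 GB (A100/H100 territory)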

    The top-tier consumer models like the 3090 and 4090 have 32GB; with them you can train and run smaller LLMs locally. But there still isn’t much demand to do that, because you can rent GPUs in the cloud cheaply enough that the point where renting costs more than buying is very far off. For consumers, fine-tuning your own model is still too expensive, and startups and small businesses have enough money to rent the more expensive, specialized GPUs.
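
    To put a toy number on the rent-versus-buy point (both prices below are made-up placeholders, not real quotes):

        # Toy break-even sketch: how many rented hours equal the price of
        # buying the card outright. Both numbers are assumptions, purely
        # for illustration.
        card_price_usd = 1600.0     # hypothetical 4090-class card
        rental_usd_per_hour = 0.50  # hypothetical cloud rate for a similar GPU

        break_even_hours = card_price_usd / rental_usd_per_hour
        print(f"Renting beats buying until ~{break_even_hours:.0f} hours "
              f"(~{break_even_hours / 24:.0f} days of 24/7 use)")
        # -> ~3200 hours, about 133 days of nonstop use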

    Right now GPU prices aren’t extremely low, but you can actually buy them from retailers at market price. That wasn’t the case when crypto mining was popular.

    • acedelgado@kbin.social · 1 year ago

      They’re not that different, really. CUDA cores are the most used in AI training, and they’re the main processors in both Nvidia’s consumer desktop cards and its machine-learning enterprise cards. As “AI” is on the rise, more and more of the supply of CUDA processors and VRAM chips will be diverted to enterprise products that fetch a higher price through deals with corporations. That means less material available for the consumer-level GPU supply, which will drive prices up for normal consumers.

      Nvidia has been banking on this for a long time; that’s why they don’t care about overpricing the consumer market and have been pushing people toward cloud-based GeForce Now subscriptions, where you don’t even own the hardware and basically just rent the processing power to play games.

      Also, just to be anal, the 3090 and 4090 have 24GB of VRAM, not 32GB. And unlike gaming nowadays, you can distribute the workload across multiple GPUs in one system, or over a network of machines.
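
      For example, here’s a minimal sketch of spreading one model across every GPU in a box (assumes the Hugging Face transformers and accelerate libraries; the model name is just a placeholder):

          # Minimal multi-GPU inference sketch. device_map="auto" (via the
          # accelerate integration) shards the model's layers across all
          # visible GPUs instead of requiring one card to hold everything.
          from transformers import AutoModelForCausalLM, AutoTokenizer

          model_name = "some-org/some-llm"  # hypothetical checkpoint
          tokenizer = AutoTokenizer.from_pretrained(model_name)
          model = AutoModelForCausalLM.from_pretrained(
              model_name,
              device_map="auto",  # split layers across every available GPU
          )

          inputs = tokenizer("GPU prices are", return_tensors="pt").to(model.device)
          outputs = model.generate(**inputs, max_new_tokens=20)
          print(tokenizer.decode(outputs[0], skip_special_tokens=True))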