Do PC gamers feel 12GB of VRAM is simply not enough for the money in 2024?

  • Dra · 5 months ago

    I haven’t paid attention to GPUs since I got my 3080 on release day back in Covid.

    Why has the acceptable level of VRAM suddenly doubled vs 4 years ago? I don’t struggle to run a single game on max settings at high frames @ 1440p. What’s the benefit that justifies the cost of 20GB of VRAM outside of AI workloads?

    • Eccitaze@yiffit.net · 5 months ago

      An actual technical answer: while the PS5 and Xbox Series X are technically regular x86-64 machines, they use a unified memory architecture in which the CPU and GPU share a single 16GB pool with no copy overhead. That makes it trivial to allocate a shit load of that pool to texture data. The last generation shared memory too, but only had 8GB total, so as the industry shifts from targeting the PS4/Xbox One first to the PS5/XSX first, VRAM requirements on PC are spiking. It’s a lot easier to port a game if you keep the assumption that the GPU can hold 10-15 GB of texture data at once than it is to refactor your streaming code to reduce VRAM usage.
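      A quick back-of-the-envelope sketch of why 10-15 GB fills up fast. The texture count, resolution, and uncompressed RGBA8 format below are my own illustrative assumptions, not numbers from this thread:

```python
# Rough VRAM estimate for a pool of uncompressed 4K RGBA8 textures.
# (Real games use block compression, so treat this as an upper bound.)

def texture_bytes(width: int, height: int, bytes_per_pixel: int = 4,
                  mipmaps: bool = True) -> int:
    """Size of one texture; a full mip chain adds roughly 1/3 on top."""
    base = width * height * bytes_per_pixel
    return base * 4 // 3 if mipmaps else base

one_4k = texture_bytes(4096, 4096)      # one 4K texture with mips
total_gib = 150 * one_4k / 2**30        # 150 resident textures (assumed)
print(f"{one_4k / 2**20:.1f} MiB per texture, {total_gib:.1f} GiB total")
# → 85.3 MiB per texture, 12.5 GiB total
```

      So a game keeping ~150 uncompressed 4K textures resident already lands in that 10-15 GB window, which is easy on a console with 16GB of shared memory and painful on a 12GB card.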

      • Dra · 5 months ago

        Perfect answer thank you!

    • Asafum@feddit.nl · 5 months ago

      Lmao

      We have your comment: “what am I doing with 20GB of VRAM?”

      And one comment down: “it’s actually criminal there is only 20GB of VRAM.”

    • Blackmist@feddit.uk · 5 months ago

      Current gen consoles becoming the baseline is probably it.

      As games running on last gen hardware drop away, and expectations for games rise above 1080p, those Recommended specs quickly become an Absolute Minimum. Plus I think RAM prices have tumbled as well, meaning it’s almost Scrooge-like not to offer 16GB on a £579 GPU.

      That said, I think the pricing is still much more of an issue than the RAM. People just don’t want to pay these ludicrous prices for a GPU.

    • Space_Racer@lemm.ee · 5 months ago

      I’m maxed on VRAM in VR for the most part with a 3080. It’s my main bottleneck.

    • AlijahTheMediocre@lemmy.world · 5 months ago

      If only game developers optimized their games…

      The newest hardware is getting powerful enough that devs are banking on people just buying better cards to play their games.

    • Hadriscus@lemm.ee · 5 months ago

      Perhaps not the biggest market, but consumer cards (especially Nvidia’s) have been the preferred hardware in the offline rendering space (i.e. animation and VFX) for a good few years now. They’re the most logical investment for freelancers and small-to-mid studios thanks to hardware raytracing. CUDA and later OptiX may be anecdotal on the gaming front, but they completely changed the game over here.

    • Obi@sopuli.xyz · 5 months ago

      Personally I need it for video editing & 3D work, but I get that’s a niche case compared to the gaming market.