• FooBarrington@lemmy.world · 6 hours ago

    If a new driver came out that gave Nvidia 5090 performance to games on GTX 1080-equivalent hardware, would you still buy a new video card this year?

    It doesn’t make any sense to compare games and AI. Games have a well-defined upper bound for performance; even Crysis has “maximum settings” that you can’t go above. Supposedly, this doesn’t hold true for AI: scaling it up should keep improving it.
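
    (The claim usually behind this is the neural scaling laws reported by Kaplan et al. in 2020: test loss falls roughly as a power law in parameter count, so adding compute keeps buying measurable gains. A minimal sketch of that relationship, with the paper's approximate fitted exponent, purely for illustration:

    L(N) ≈ (N_c / N)^α, with α ≈ 0.076 for parameter count N

    Whether that curve actually continues to hold at ever-larger scale is exactly what's in dispute.)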

    So: yes, in your analogy, MS would still buy a new video card this year if they believed that progress was possible and reasonably likely.

    • Blue_Morpho@lemmy.world · 2 minutes ago

      Just as games have diminishing returns on better graphics (they’re already photorealistic; few people pay $2k for a GPU to render a few more hairs), AI has a plateau where it gives good-enough answers that people will pay for the service.

      If people are paying you money and the next level of performance is not appreciated by the general consumer, why spend billions that will take longer to recoup?

      And again, data centers aren’t just used for AI.