Lately we’ve been talking about games not performing well enough on current hardware. It’s had me wondering just what we should be asking for. I think the basic principle is that components from the last 5 years should be adequate to play current-generation titles at 1080p60. Not at max settings, of course, but certainly playable without resorting to DLSS and FSR.

It makes me wonder: is it really so much to ask? There are games from 10+ years ago that still look great, or at least acceptable. Should we expect new games like Starfield to be configurable down to the demands of an older game like Portal 2 or CS:GO? If the gameplay is what really matters, and games of the 2010s looked good at the time, why can't we expect current games to be configurable that low?

From what I've seen, GTX 1070 users need to play Starfield at 720p with FSR to get 60fps. Which is better: 60fps at 720p with FSR, or 1080p with reduced texture resolution and model detail?

It shouldn't even be that hard to pull off. It should be possible to automatically generate lower-detail models and textures, and other details can simply be turned off.
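To show what I mean on the texture side, here's a minimal sketch (assuming Python with Pillow; the folder names and scale factor are made up, and a real pipeline would also regenerate mipmaps, normal maps, and compressed formats):

```python
# Hypothetical pass that writes quarter-resolution copies of every texture.
from pathlib import Path

from PIL import Image  # Pillow


def build_lowres_textures(src_dir="textures", dst_dir="textures_low", factor=4):
    """Write a downscaled copy of each PNG in src_dir into dst_dir."""
    src, dst = Path(src_dir), Path(dst_dir)
    dst.mkdir(parents=True, exist_ok=True)
    for tex in src.glob("*.png"):
        img = Image.open(tex)
        small = img.resize(
            (max(1, img.width // factor), max(1, img.height // factor)),
            Image.LANCZOS,  # high-quality downscaling filter
        )
        small.save(dst / tex.name)


if __name__ == "__main__":
    build_lowres_textures()
```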

  • bstix@feddit.dk · 10 months ago

    There's a reason why Minecraft is the best-selling game of all time: it runs on your mother's old work laptop.

    If you want to have many players and sell many copies, this is the type of PC to aim for.

      • bstix@feddit.dk · 10 months ago

        The silly thing is that it runs perfectly fine on a ten-year-old cellphone, but my five-year-old PC has trouble.

        Anyway, the point is that people play what they can play (Roblox, Among Us, etc.), so the question of what minimum requirements are reasonable really depends on how many players the developer wants. While technology constantly advances, the market for games with high requirements is shrinking, because most gamers don't upgrade their hardware as often as they used to.

      • Acters@lemmy.world · 10 months ago

        My laptop with 1 GB of RAM, a dual-core CPU, and a 5400 RPM HDD would at least do 30 fps when I played Minecraft back in the 1.2.5 days, up through 1.7.10. I even played modded Minecraft at 15 fps sometimes.

  • ninjan@lemmy.mildgrim.com · 10 months ago

    Well, there are a couple of issues. One is size and complexity, though that's a fairly minor one, since games are already massive, and where size is an issue (console physical media) the extra assets don't need to be included.

    A bigger one is: what does it say about the product? People already use low settings disingenuously to claim that game X is ugly and not up to par; a potato setting would just exacerbate this. And from an art standpoint, does the developer really want to stand behind such a gimped version of their work?

    Then there's the problem that lowering details can't always solve the issue. A long draw distance can still be really rough on older components because of memory constraints or the sheer number of objects the hardware has to juggle. Sure, a draw-distance slider isn't uncommon, but if the game ties gameplay to that distance, say enemies spotting you from far away, you can't lower it without affecting the gameplay. Physics and other demanding systems can likewise make lowering settings hard, or give it a big gameplay impact.
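    A made-up sketch of that coupling (nothing here is from a real engine; the names and numbers are invented purely to illustrate the point):

    ```python
    # If AI perception reuses the render draw distance, lowering the slider for
    # performance quietly changes game balance as well.
    DRAW_DISTANCE_M = 300.0          # user-facing "draw distance" setting
    SPOT_RANGE_M = DRAW_DISTANCE_M   # reused so enemies never pop in unseen


    def enemy_can_spot_player(distance_m: float, has_line_of_sight: bool) -> bool:
        # Dropping DRAW_DISTANCE_M to 100.0 for old GPUs would also blind snipers
        # beyond 100 m, which is a gameplay change, not just a visual one.
        return has_line_of_sight and distance_m <= SPOT_RANGE_M
    ```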

    And finally, why isn't FSR acceptable for making the game playable? It seems like the perfect solution.

    • GenderNeutralBro@lemmy.sdf.org · 10 months ago

      And finally, why isn't FSR acceptable for making the game playable? It seems like the perfect solution.

      “Perfect” is a bit of a stretch. I've tried running games with FSR on my GTX 1080 and it looks like absolute ass. By lowering graphics settings instead, I was able to run at a much higher resolution and framerate while looking leaps and bounds better.

      I realize that newer GPUs will give better results with FSR, but if I’m getting a newer GPU then presumably it would have better native performance anyway.

      I’ll certainly give it another go when I upgrade to an Ada/RDNA3 GPU, but until then it’s just a marketing gimmick to me.

      • TwanHE@lemmy.world · 10 months ago

        Did you try FSR 1 or 2? The difference is quite noticeable for me, though still not enough to justify ever using it on a 1080p monitor. Maybe FSR 3 will change that.

      • ninjan@lemmy.mildgrim.com · 10 months ago

        Wouldn't a potato setting also look like absolute ass? I mean, few games are very pretty at 1080p Low, which OP thinks isn't going low enough to accommodate old hardware. In those situations FSR can be used to make the game run at 60 FPS with 1080p output. Sure, it won't be a pretty experience, but it will run, and it won't destroy gameplay mechanics.

        Now, I do believe the absolute best outcome would be developers getting better at building their games and graphics engines to be resource efficient, or “optimized” as the gaming community likes to call it. But that's quite an ask in reality: most studios don't build their own engine, and those that do generally aren't building it with resource efficiency in mind; they're focused on what kind of games they want to build with it. id Software really stands out, but they're engine-making wizards almost more than game developers, and they pride themselves on making good-looking games that run well on just about anything. Doom Eternal runs and looks ridiculously good on the Steam Deck (relative to what little juice that machine has), for example.

        I’m being awfully apologetic of developers here, but I really don’t think the issue is that you can’t run the games on older hardware, because FSR has mostly solved that imo.

        • GenderNeutralBro@lemmy.sdf.org · 10 months ago

          The specific example I experimented with the most was Firmament at 3440x1440, targeting 100fps.

          Using FSR at the recommended settings, it was a blotchy, blurry mess; text was barely readable. By turning down shadows and basically everything except antialiasing, I got native 3440x1440 at a pretty solid 100fps.

          It's a real shame that the game's default settings have FSR enabled, because I'm sure a lot of players just go with the defaults and think the game looks like shit, when it's actually very beautiful with the right settings, even on relatively modest hardware.

      • gregoryw3@lemmy.ml · 10 months ago

        It's fair that FSR and DLSS don't look great at 1080p, since they weren't designed for that use case at all. Ideally they're meant to upscale to 4K, where the base resolution is at minimum ~1080p and there are enough pixels to produce a good output. When the target is 720p or 1080p, the base resolution drops into the 300p range (maybe even lower), which just isn't enough, at least with today's models.
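        To put rough numbers on that (using FSR 2's published per-axis quality-mode scale factors; the presets a given game exposes may differ):

        ```python
        # Internal render height for each FSR 2 quality mode at two output resolutions.
        SCALE = {"Quality": 1.5, "Balanced": 1.7, "Performance": 2.0, "Ultra Performance": 3.0}

        for output_h in (2160, 1080):
            for mode, factor in SCALE.items():
                print(f"{output_h}p output, {mode}: renders at ~{round(output_h / factor)}p")

        # 2160p Performance renders internally at 1080p, which upscales well;
        # 1080p Ultra Performance renders at 360p, which is why it falls apart.
        ```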

        So I don't see FSR or DLSS as a solution to the hardware-longevity question at all. To me they just allow a weaker GPU to look nicer on a 4K screen (theoretically), not to increase FPS as a main feature.

        Maybe GPUs could get cinema upscalers? I don't know how they work or how feasible that would be, other than that for a while 4K Blu-rays were just 2K footage run through those upscalers.

    • whileloop@lemmy.world (OP) · 10 months ago

      A bigger one is: what does it say about the product? People already use low settings disingenuously to claim that game X is ugly and not up to par; a potato setting would just exacerbate this. And from an art standpoint, does the developer really want to stand behind such a gimped version of their work?

      To clarify, I'm not saying that games should be designed for the lowest level of hardware, but they should make it easier for people with lower-tier hardware to adjust the game to run better. Some people might use it disingenuously, but I really think people would see right through that if potato settings became common enough.

      You made another point about lower details not being enough. In cases where it isn't possible to improve performance without sacrificing gameplay, I think developers should put gameplay before making the game playable on older hardware. It's not about demanding that all games be playable on a mid-range machine from 2010; it's about asking developers to make an honest effort where possible.

      As for FSR, I don't think it's a full solution, since it still uses the same model and texture quality as the lowest preset (if not higher), which can mean more VRAM use than needed. I'd rather see FSR paired with downscaled textures and simplified models.

    • fidodo@lemm.ee · 10 months ago

      Another thing to consider is that developers consciously make design choices to suit the hardware available at the time, and these aren't things you can just turn on and off with settings. For example, maybe there's a zone transition that slows the game down on older hardware; in the past they would have added a subtle loading area like a tunnel, but that's no longer needed for their current target hardware. Should they completely change the game to be a bit smoother on 5+ year old hardware? You can get a ton of compatibility by changing settings, but getting the same level of optimization as games made when that old hardware was the target would oftentimes mean significantly changing the game itself.

    • whileloop@lemmy.world (OP) · 10 months ago

      I've been playing the RTX mod for Half-Life 1 recently. If it's fun to see old games with new techniques applied, what about new games made to look old? I'd love to see someone make a tool that does the opposite of RTX Remix: downscale all the textures, simplify the models, and strip out a ton of shaders. Probably never gonna happen, but it's a cool idea.
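      On the model side, the kind of starting point such a tool might use is vertex-clustering decimation. A rough numpy sketch (nowhere near production LOD generation, and it ignores UVs, normals, and materials entirely):

      ```python
      import numpy as np


      def decimate_vertex_clustering(vertices, faces, cell_size=0.25):
          """Crudely simplify a mesh by merging all vertices that share a grid cell.

          vertices: (N, 3) float array; faces: (M, 3) int array of vertex indices.
          Returns a new (vertices, faces) pair with fewer vertices and triangles.
          """
          # Assign every vertex to a coarse grid cell.
          cells = np.floor(vertices / cell_size).astype(np.int64)
          # One representative vertex per occupied cell: the mean of its members.
          _, inverse = np.unique(cells, axis=0, return_inverse=True)
          inverse = inverse.reshape(-1)  # flatten for compatibility across numpy versions
          n_cells = inverse.max() + 1
          counts = np.bincount(inverse, minlength=n_cells)
          new_vertices = np.stack(
              [np.bincount(inverse, weights=vertices[:, axis], minlength=n_cells) / counts
               for axis in range(3)],
              axis=1,
          )
          # Remap faces to the merged vertices and drop collapsed triangles.
          new_faces = inverse[faces]
          keep = (
              (new_faces[:, 0] != new_faces[:, 1])
              & (new_faces[:, 1] != new_faces[:, 2])
              & (new_faces[:, 0] != new_faces[:, 2])
          )
          return new_vertices, new_faces[keep]
      ```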

        • whileloop@lemmy.world (OP) · 10 months ago

          Hmm. I've never heard of a demake outside the context of an underpowered console or handheld, like 360 games being published for the Wii.

  • boletus@sh.itjust.works · 10 months ago

    Game dev working at a veteran studio here (not a veteran myself, but I get lots of exposure to vets).

    I agree, modern games spec their minimums a bit too high for what is possible. I'm very much against games judging their performance with DLSS and FSR enabled. They're perfectly good tools for getting MORE fps, or for using higher resolutions without tanking performance, but modern games need to stop using them as the baseline for performance.

    But the very last line you mentioned really isn't true at all. Rendering is incredibly complicated. Automatically creating lower-detail models and textures is not simple: LODs and lower-res assets are made easier with tools, but it's still a complex process that requires a lot of effort from many talented artists. Ensuring they work well in your engine is not an automatic process.

    I know somebody is going to mention something like blah blah Nanite blah blah Lumen blah blah Unreal Engine, but Unreal Engine is not a fix-all for everything. We don't want the games industry all using a single game engine; that's unhealthy for software and for the industry, and it locks all talent to a single piece of software. Also, Lumen and Nanite don't even help with performance on lower-end devices; they're mostly designed for mid-to-high-end graphics, as both are intensive processes in their own right.

    Then there's the whole thing about modern rendering techniques. Ever since the birth of real-time graphics, hardware has constantly improved, techniques have changed, and things have had to be left behind…

    • Fixed-pipeline versus programmable-pipeline graphics cards posed a huge challenge to developers at the time, because they suddenly had to support both types of cards. We no longer support fixed-pipeline cards, as they are obsolete.

    • Compute shaders allowed work to be offloaded to the GPU and enabled modern post effects and rendering techniques. For a while this meant games had to ship both a compute and a non-compute path for necessary graphics effects, which is not an easy process. Modern games tend to require compute support, since all current and last-gen consoles, as well as modern GPUs, support some form of compute shaders, I believe.

    The same will happen with modern rendering techniques and raw GPU power. The difference between a 980 Ti and a 4080 is absolutely insane, and the advent of ray tracing and AI cores has widened the gap even more. Devs need to make concessions and cut off a certain range of hardware to make their games achievable. Tech innovations let game devs use tools and methods, and realise concepts, that were previously either impossible or significantly compromised by technological limitations, but they can't make those innovations if they're held back by a much older set of hardware that can't do what modern hardware can. That balance is important, and some games (Teardown, for example) need to leave aging hardware behind so that the game is actually possible.

    That said, I know for a fact that if they can make a game run on a Switch, an Xbox One, or a PS4, then they can most definitely make it run on a graphics card of that era. Game devs do a lot of hacky shit to get games running on old hardware like that (the PS4 came out ten years ago), so I understand if a PC port doesn't quite reach that level of optimisation, but if your game runs on a PS4 it should run pretty well at low graphics on a 980 Ti.

    Anyway, I'm pretty tired, so mind any mistakes, but the issue isn't just “game devs are lazy”; there are many layers to it. The tools for making games nowadays are vast, but games are still incredibly hard to make as their complexity continues to rise, so the issues you see are likely issues the engineers themselves struggled to resolve. I'm not saying games like Starfield don't deserve criticism, just saying to be mindful and check your assumptions before concluding that it's a simple problem to solve.

    • AdamEatsAss@lemmy.world · 10 months ago

      It's my game; I should be able to play it how I like! I know nothing about game design, but I assume that if you already did the work to make a setting adjustable, it wouldn't be that much work to add a literally-unplayable notch to the slider.

  • fidodo@lemm.ee · 10 months ago

    I'm a little confused by your timeline. I agree that 5-year-old hardware should definitely support 1080p60, but the 1070 is 7 years old now. Since the 1070 could hit that target when it came out, and 1080p60 is a static target, I think we should expect the 1070 to support 1080p60 forever in games similar to the ones coming out at the time. But it's a bit unfair to compare Starfield to Portal 2 and CS:GO: those games take place in constrained, controlled environments, while Starfield is vast and open, and large environments definitely take a GPU toll, so you will lose some performance there compared to those games. I haven't played Starfield yet, so I don't know the details, but given the scope I know of, it doesn't sound unreasonable for it to miss the 1080p60 mark a bit.

  • Cyberwitch_7493@lemmy.dbzer0.com · 10 months ago

    Yeah, I'm all for picking up indie games on itch.io; they usually aren't demanding enough to require a dedicated graphics card, and integrated graphics usually run fine.

  • chemsed@lemmy.ca · 10 months ago

    This video (https://youtu.be/4_WIhy4jbr8?si=m9TlvuLm0uyyIwmF) shows how silly it is that these games are so demanding. We should be close to completing the transition to 1440p or 4K gaming, but some new games ask too much to be played at anything above 1080p.

    Also, the most recent APIs, DX12 and Vulkan, don't necessarily improve performance over DX11, and Unreal Engine 5 seems demanding to run (https://youtu.be/mtsxlKPMthI?si=lUVVDyZs_iG_8_z3).

  • Aux@lemmy.world · 10 months ago

    You said you want to play games on 5-year-old hardware. Fair enough! But then you mention the GTX 1070, which is a 6-year-old product. It's definitely out of date even by YOUR standards. Don't expect anything to run on it.

    • StijnVVL@lemmy.world · 10 months ago

      This is not true. Until a couple of months ago I ran a 10-year-old rig with a GTX 960, and it could handle Elden Ring, to name one, without problems.

        • StijnVVL@lemmy.world · 10 months ago

          That you shouldn't expect anything to run on that setup. The GTX 1070 was a phenomenal card back in the day, and I'm sure it can still handle a lot of newer games.