A prominent open-source dev publishes their findings as to what's going on with Starfield's performance, and it's pretty darn strange.
According to Hans-Kristian Arntzen, a prominent open-source developer working on vkd3d, a DirectX 12 to Vulkan translation layer, Starfield is not interacting properly with graphics card drivers.
The problem is so severe, in fact, that the aforementioned translation layer had to be updated specifically to handle Starfield as an exception to the usual handling of the issue.
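Translation layers routinely carry per-application workaround tables for exactly this sort of misbehavior. As a rough illustration of the pattern — the names and flags below are invented, not taken from the actual vkd3d source — a quirk table keyed on the executable name might look like this:

```cpp
#include <cstdint>
#include <cstring>

// Hypothetical per-application quirk flags, in the spirit of the app
// profiles translation layers maintain. Names and flags are illustrative,
// not the real vkd3d code.
enum AppQuirks : uint32_t
{
    QUIRK_NONE                   = 0,
    QUIRK_TOLERATE_INVALID_USAGE = 1u << 0, // app breaks an API rule drivers forgive
    QUIRK_DISABLE_FAST_PATH      = 1u << 1, // an optimization misbehaves on this title
};

struct AppProfile
{
    const char *exe_name;
    uint32_t    quirks;
};

// Consulted once at device creation, keyed on the executable name.
static const AppProfile app_profiles[] =
{
    { "Starfield.exe", QUIRK_TOLERATE_INVALID_USAGE | QUIRK_DISABLE_FAST_PATH },
};

uint32_t lookup_quirks(const char *exe_name)
{
    for (const AppProfile &p : app_profiles)
        if (std::strcmp(p.exe_name, exe_name) == 0)
            return p.quirks;
    return QUIRK_NONE;
}
```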
“I had to fix your shit in my shit because your shit was so fucked that it fucked my shit”
This is how games and drivers have worked for decades.
There are huge teams at AMD and Nvidia whose job it is to fix shit game code in the drivers. That’s why (a) drivers are massive and (b) you need new ones all the time if you play new games.
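For a sense of what “fixing game code in the driver” can mean: one classic trick is per-title shader replacement, where the driver recognizes a known-slow shader by a hash of its bytecode and silently swaps in a hand-tuned version. A toy sketch of the idea — every name here is invented, and real drivers are vastly more involved:

```cpp
#include <cstdint>
#include <map>
#include <vector>

using ShaderBlob = std::vector<uint8_t>;

// hash of known-bad bytecode -> hand-tuned replacement
static std::map<uint64_t, ShaderBlob> replacement_shaders;

// FNV-1a over the shader bytecode; real drivers use sturdier hashes.
static uint64_t hash_bytecode(const ShaderBlob &blob)
{
    uint64_t h = 0xcbf29ce484222325ull;
    for (uint8_t b : blob)
    {
        h ^= b;
        h *= 0x100000001b3ull;
    }
    return h;
}

// At pipeline creation: if we recognize the shader, use our version.
const ShaderBlob &select_shader(const ShaderBlob &app_shader)
{
    auto it = replacement_shaders.find(hash_bytecode(app_shader));
    return it != replacement_shaders.end() ? it->second : app_shader;
}
```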
I read an excellent post a while ago here, by Promit.
https://www.gamedev.net/forums/topic/666419-what-are-your-opinions-on-dx12vulkanmantle/5215019/
It’s interesting to see that in the 8 years since he wrote it, the “solution” to SLI/Crossfire has simply been to abandon it entirely, and that we still seem to be stuck in the same position with DX12. Your average game dev still has little idea how to get the best performance out of the hardware, and hardware vendors are still patching things under the hood so they don’t look bad on benchmarks.
I’ll give a different perspective on what you said: DX12 basically moved half of the complexity that would normally be managed by the driver onto the game/engine devs, who already have too much to do: making the game. The idea is that “the game dev knows best how to optimize for their specific usage”, but in reality game devs have no time to deal with hardware complexity, and this is the result.
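To make that concrete: state tracking that a D3D11 driver did silently is now the engine’s job under D3D12, and getting it wrong is undefined behavior rather than a hiccup the driver absorbs. A simplified sketch of that bookkeeping (real engines have to track this across threads and queues):

```cpp
#include <d3d12.h>

// Under D3D11 the driver tracked resource states behind the app's back.
// Under D3D12 the engine must know a resource's previous state and
// transition it explicitly before every new kind of use.
void transition(ID3D12GraphicsCommandList *cmd,
                ID3D12Resource *resource,
                D3D12_RESOURCE_STATES before,
                D3D12_RESOURCE_STATES after)
{
    D3D12_RESOURCE_BARRIER barrier = {};
    barrier.Type                   = D3D12_RESOURCE_BARRIER_TYPE_TRANSITION;
    barrier.Transition.pResource   = resource;
    barrier.Transition.Subresource = D3D12_RESOURCE_BARRIER_ALL_SUBRESOURCES;
    barrier.Transition.StateBefore = before; // wrong "before" = undefined behavior
    barrier.Transition.StateAfter  = after;
    cmd->ResourceBarrier(1, &barrier);
}

// e.g. before sampling a render target in a later pass:
// transition(cmd, rt, D3D12_RESOURCE_STATE_RENDER_TARGET,
//                     D3D12_RESOURCE_STATE_PIXEL_SHADER_RESOURCE);
```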
To attribute this most recent failure to an overabundance of hardware variety is a joke. This issue persists on all Nvidia and Intel cards. Why? Because it’s an oversight pertaining to the one thing they all have in common: their interaction with DirectX.
Let me repeat myself for the people in the back. The number of items they had to account for with this failure is one. One API.
This sounds more like hardware manufacturers haven’t provided a good enough abstraction layer across their devices, or they did (Vulkan) but everyone is just stuck on bad APIs that don’t properly map to the hardware’s abstractions. Or, even more likely, the publishers cheaped out and pushed something to release before it was ready, like they have forever.
It’s also a lack of specialized talent. There’s plenty of great “talent” at game studios and even middleware shops; there’s just not much that deals with renderers and API development. The vast majority of devs just lean on the middleware developer to push out the renderer codebase. In a situation like Bethesda running their own in-house engine, they just don’t have the right people for it. This plagued the ’90s, when people were trying to code for Glide, OpenGL, and DX 5 through 9. Many studios folded because they couldn’t get their tech working with hardware acceleration.
*for current wage
Excellent point.
Lol
PC gaming is and forever will be way better than gaming on consoles.
Why?
I’ve 3 letters for you.
R G B
( ͡° ͜ʖ ͡°)
Tbf, PC gaming was always a fight for performance. I never felt superior back in the day fighting with QEMM, IRQs for the Sound Blaster, or Glide; it’s always been a shitshow. It was a super shitshow in the nineties, a bit better in the 2000s, and nowadays it’s become a tad better again.
But somehow I enjoyed that shitshow. Still do.
As far as wedding vows go, they’re not the MOST romantic… 🤷
As far as I know, that’s what graphics drivers do, like, all the time. Every major title is handled specifically. I’m not a developer; I heard this from engine developers.
They released on two different platforms. PCs have so much variation in hardware that it’s not surprising there are issues.
It’s poorly optimized code, and the comments from the top brass have been “lol your PC sux” when they can’t even get it running right on their own hardware.
It’s not the variation in PCs that’s the issue; it’s a design and quality-control issue. DirectX and Vulkan are the bread and butter of PC gaming. Microsoft developed DirectX to establish a common graphics framework for Windows, and a Microsoft game studio still fucked up working with it.
They could have picked Khronos’ APIs. They think they’re smarter than everyone else, including GPU developers.
This is just classic corpo shit: developing their own proprietary stuff when no one asked for it. Apple does it with Metal too. Then it falls on developers to write abstraction layers.
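That abstraction layer usually ends up as a thin “RHI” interface with one backend per native API. A bare-bones sketch of the pattern (every name here is hypothetical):

```cpp
#include <cstddef>
#include <memory>

// Bare-bones sketch of the "RHI" pattern engines write to paper over
// D3D12 / Vulkan / Metal. Every name here is hypothetical.
struct GpuBuffer { virtual ~GpuBuffer() = default; };

struct RenderDevice
{
    virtual ~RenderDevice() = default;
    virtual std::unique_ptr<GpuBuffer> create_buffer(size_t bytes) = 0;
    virtual void submit() = 0;
};

enum class Backend { D3D12, Vulkan, Metal };

// One concrete RenderDevice per native API lives behind this factory;
// game code above this line never touches the vendor APIs directly.
std::unique_ptr<RenderDevice> create_device(Backend backend);
```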