What’s so hard to understand about sacrificing a bit of latency and a few visual nits for a much better and more consistent frame rate?
As real as the concern about developers getting lazy with hardware optimization may be, you have to ignore a lot to dismiss upscaling technology as snake oil.