Right, the 5090 isn't scaling linearly. Performance doesn't scale linearly with the number of GPUs in SLI/NVLink setups either.
I'm unimpressed. 33% better performance than the 4090 for 27% more watts is a joke, in my opinion.
The 4090 at least had 49% better performance than a 3090 in the most generous benchmark results, while using 29% more watts.
The 3090 was a 23% performance increase over the 2080 Ti while using 40% more watts 💀
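
For anyone who wants to sanity-check the trend, here's a quick back-of-the-envelope on perf per watt, taking the benchmark deltas above at face value:

    # Generational perf-per-watt change, from the deltas above.
    # efficiency change = (1 + perf gain) / (1 + power gain) - 1
    gens = {
        "2080 Ti -> 3090": (0.23, 0.40),
        "3090 -> 4090":    (0.49, 0.29),
        "4090 -> 5090":    (0.33, 0.27),
    }
    for name, (perf, power) in gens.items():
        eff = (1 + perf) / (1 + power) - 1
        print(f"{name}: {eff:+.1%} perf per watt")

    # 2080 Ti -> 3090: -12.1% perf per watt
    # 3090 -> 4090:    +15.5% perf per watt
    # 4090 -> 5090:    +4.7% perf per watt

By those numbers only the 4090 generation meaningfully improved efficiency; the 5090's uplift is mostly just more power.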
The writing has been on the wall since the 2000 series: NVIDIA can't hit its marketing perf uplift targets without substantially increasing power usage. I'm just not in their target market, because I'm not interested in any GPU that uses more than 300W stock.
The only thing I'm impressed by is that they can cool a 575W card in a two-slot form factor. Except they've disabled the junction (hot spot) temperature sensor for the 5000 series, so I'd say "we'll see" what the temps are like, but actually we won't. You're not allowed to see!
https://videocardz.com/pixel/nvidia-has-removed-hot-spot-sensor-data-from-geforce-rtx-50-gpus