The first salvo of RTX 50 series GPUs will arrive in January, with pricing starting at $549 for the RTX 5070 and topping out at an eye-watering $1,999 for the flagship RTX 5090. In between those are the $749 RTX 5070 Ti and $999 RTX 5080. Laptop variants of the desktop GPUs will follow in March, with pricing there starting at $1,299 for RTX 5070-equipped machines.
The prices are high, but what's really shocking is the power consumption. The 5090 is 575W(!!), while the 5080 is 360W, the 5070 Ti is 300W, and the 5070 is 250W.
If you are getting one of these, factor in the cost of a better PSU and your electric bill too. We’re getting closer and closer to the limit of power from a US electrical socket.
It’s clear what must be done: all US household sockets must be changed to 220V. Sure, it’ll be a notable expense, but it’s for the health of the gaming industry.
It’ll buy us about 8 more years. At this rate, the TGP is increasing at about 10% per year:
3090: Late 2020, 350W
4090: Late 2022, 450W
5090: Early 2025, 575W
Therefore, around 2037, a single 90-tier GPU will pop a 110V breaker, and by 2045, it will pop a 220V breaker too.
/s
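Sarcasm aside, the projection is easy to sanity-check in Python. This sketch assumes a flat 10%/yr compounding from the 2025 data point, and takes the circuit limits as the NEC 80% continuous-load rule on a 15A/120V and a 20A/240V circuit (12A × 120V = 1440W, 16A × 240V = 3840W):

```python
# Back-of-envelope TGP projection, assuming a flat 10%/yr growth rate
# (data points from the thread: 350W in 2020, 450W in 2022, 575W in 2025).
GROWTH = 1.10
START_YEAR, START_TGP = 2025, 575  # RTX 5090

def year_tgp_exceeds(limit_watts):
    """First year the projected flagship TGP exceeds limit_watts."""
    year, tgp = START_YEAR, START_TGP
    while tgp <= limit_watts:
        year += 1
        tgp *= GROWTH
    return year

# Continuous-load limits per the NEC 80% rule:
#   15A/120V circuit: 12A * 120V = 1440W
#   20A/240V circuit: 16A * 240V = 3840W
print(year_tgp_exceeds(1440))  # 2035
print(year_tgp_exceeds(3840))  # 2045
```

Strict 10% compounding crosses the 120V limit around 2035 and the 240V limit around 2045, so the dates above are in the right ballpark.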
Don’t be silly.
Just move your PC to your laundry room and plug it into the 240V dryer outlet.
Anyone getting a 5090 is most definitely not someone who worries about the electric bill.
I know plenty of people who’d get a 5090 and worry about the electric bill.
A 1000W PSU pulls at most 8.3A on a 120V circuit.
Residential circuits in the USA are 15-20A; very rarely are they 10A, though I’ve seen some super old ones and split 20A breakers in the wild.
A single duplex outlet must be rated to the same amperage as the breaker in order to be code, so with a 5090 PC you’re around half capacity of what you’d normally find, worst case. Nice big monitors take about an amp each, and other peripherals are negligible.
You could easily pop a breaker if you’ve got a bunch of other stuff on the same circuit, but that’s true for anything.
I think the power draw on a 5090 is crazy, crazy high don’t get me wrong, but let’s be reasonable here - electricity costs yes, but we’re not getting close to the limits of a circuit/receptacle (yet).
Actually, the National Electrical Code (NEC) limits continuous loads on 15 Aac receptacles to 12 Aac, and on 20 Aac receptacles to 16 Aac, iirc, because breakers are sized at 125% of the load (conversely, 1/125% = 80%, so loads should be at most 80% of the breaker rating).
So with a 15 Aac outlet and a 1000 Wac load at a minimum 95% power factor, you’re drawing 8.8 Aac, which is ~73% of the capacity of the outlet (8.8/12). For a 20 Aac outlet, 8.8 Aac is ~55% capacity (8.8/16).
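That arithmetic, as a quick Python sketch (the 1000W load and 0.95 power factor are the assumed figures above):

```python
# NEC 80% continuous-load arithmetic for a 1000W PSU on a 120V circuit.
PSU_WATTS = 1000
VOLTS = 120
POWER_FACTOR = 0.95  # assumed minimum power factor

amps = PSU_WATTS / (VOLTS * POWER_FACTOR)  # ~8.8A actually drawn
for breaker in (15, 20):
    continuous_limit = breaker * 0.80  # NEC 80% rule: 12A and 16A
    pct = amps / continuous_limit * 100
    print(f"{breaker}A breaker: {amps:.1f}A is {pct:.0f}% of the {continuous_limit:.0f}A limit")
```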
Nonetheless, you’re totally right. We’re not approaching the limit of the technology, unlike electric car chargers.
That’s just the GPU, assuming efficient everything else. Now if we do a 575W GPU + 350W CPU + 75W of RGB fans + 200W of monitors + a 20% buffer, we’re at 1440W, or 12A. Now we’re close to popping a breaker.
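A quick tally of that hypothetical worst-case build (the component wattages are the made-up numbers above, not measurements):

```python
# Hypothetical worst-case build from the comment above.
loads = {"GPU": 575, "CPU": 350, "RGB fans": 75, "monitors": 200}

total_watts = sum(loads.values()) * 1.20  # 20% buffer on top
amps = total_watts / 120

print(f"{total_watts:.0f}W -> {amps:.1f}A")  # 1440W -> 12.0A
# 12A is exactly the NEC continuous-load limit for a 15A/120V circuit.
```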
This makes me curious: What is the cheapest way to get a breaker that can handle more power? It seems like all the ways I can think of would be many 5090s in cost.
How many RGB fans does this theoretical build have to use 75W alone?
How else are you gonna cool 925W in a PC form factor? Ever seen fans for server racks?
Hire an electrician (or, depending on local laws, DIY) to add a dedicated 240V 20A outlet with 12/2 wire.
Out of curiosity, how much does this cost with an electrician?
How far out are we from GPUs that also dry your laundry?
Going to need to run a separate PSU on a different branch circuit at this rate.