- Next-Gen flagship GPUs NVIDIA AD102 and AMD NAVI31 might both draw more than 400W of power - VideoCardz.com
Hey, now all those 1000-1500W PSUs marketed around 2010 can finally be put to full use...
Yes, I know only 1% of these were actually used for those fine dual/tri/quad SLI/CFX (0.4-1.2 kW) configurations.
ID: h72y8m2
I was one of that 1%, as I was planning on running a quad-GPU setup way back in 2012/2013. I ended up with a GTX dual-SLI setup that I wanted to expand to AMD quad CrossFire eventually. Well, that day never came thanks to the Fury X's performance, but the PSU has worked well for a decade. 400W GPUs... I'm really not looking forward to that, but with power limits and FPS caps it shouldn't be so bad.
I'm replacing my 1300W SuperNOVA G2 with an 850W G5. Both have 10-year warranties. In my reviews I noted my power consumption, and even with an ancient platform I was under 700 watts, even with a highly OC'd older CPU + RTX 3080. Most PSUs can go above their rated wattage in some cases, but it's best not to gamble on that. I guess we will see what actually happens soon.
The way things are going it won't matter, since the majority probably won't get a chance to even buy the next generation of GPUs :).
ID: h74b7x2
In 2017 I picked up a 1000W Platinum modular PSU, the first time I got a fancy PSU. The thinking was I was gonna get dual badass GPUs for mGPU when it went mainstream. Still waiting, attempting to be patient.
I remember back in the Pascal generation, people were saying "no one needs over 550W PSUs, GPUs tend to use less power these days".
I told 'em the maximum NV and AMD would likely still stop at was 375W (2x 8-pin + PCIe slot), and yet here we are, LMAO.
ID: h73dnqn
I'm glad I went for a 650W PSU when I bought my current one in 2017. Because of that, I can safely upgrade to an RX 6700 XT without having to worry about not having the headroom for it.
I never understood people who chose PSUs that left them with almost zero headroom for the specs they chose, especially when good PSUs tend to outlive the builds they were originally bought for.
ID: h747ubb
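The headroom math behind these comments can be sketched as a quick estimate. All component wattages and the 60% load target below are illustrative assumptions, not measured values or an official sizing rule:

```python
# Rough PSU sizing sketch: sum estimated peak component draw, then add headroom
# so the PSU runs well below its rated wattage (assumed target: ~60% load).
components = {
    "cpu": 125,              # e.g. a mainstream 8-core under gaming load (assumed)
    "gpu": 230,              # e.g. an RX 6700 XT class card (assumed)
    "motherboard_ram": 50,   # rough allowance (assumed)
    "drives_fans": 30,       # rough allowance (assumed)
}

load = sum(components.values())   # estimated peak system draw in watts
recommended = load / 0.6          # PSU rating so sustained load sits near 60%

print(f"Estimated load: {load} W")                 # 435 W
print(f"PSU target:     {recommended:.0f} W+")     # 725 W+
```

Sizing this way is why a good PSU outlives several builds: a 750-850W unit bought for a 435W system still covers a later GPU upgrade.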
I got stuck in the "wait till next year" loop since getting my 1000W Platinum PSU back in 2017. Although the GPUs that caught my eye were released, I still need to actually see 'em.
More directly to the post: if the rumors are true of Navi31 getting chiplets, or in any case double the SIMD units, it makes sense.
ID: h75080q
I mean, unless you're buying a 4080 or 4090, the power usage for GPUs is still pretty normal. And power efficiency has been going up.
The last thing we need is a 0.5 kW slab of circuit board in the machine. Will it have a separate PSU? What? Why not?
ID: h72pjgh
Inb4 you need a separate power cable for the GPU only.
ID: h72yk2l
3dfx Voodoo 5 6000 all over again.
ID: h73crdj
Why would an increase from the current (stock) maximum of 300W to 400W require a separate PSU for your GPU?
There are PSUs that can provide up to 1500W.
Also, if current high-end GPU recommendations call for a 750W PSU, then an 850W PSU will be plenty for those upcoming flagships.
ID: h74a78b
Well, it makes sense if you think about it, and if the rumors are true, not just this one but the other things we're hearing. Navi31 will have double the cores of the current Navi21 with a node shrink, so it makes pretty good sense that it'll be power hungry.
ID: h73fdvt
300W is already crazy enough. Too crazy, actually.
They'll have to increase power consumption to provide the expected jump in performance. Datacenter cards have been at 450W for some time now.
We've been stuck at this arbitrary 300W max (often blowing past it), yet we keep wanting more performance.
Something has to give.
ID: h73atrf
These aren't for 'expected' performance increases, though. They're apparently trying to scale well beyond normal performance increases for these halo products.
I'm pretty sure we'll have something that gets a more 'expected' performance increase for a more reasonable TDP (at more 'normal' high pricing).
All I want is a fairly low-power GPU for 1440p at medium/high settings, 60-ish fps. My FreeSync range is 35-90, so anywhere in there is nice enough.
But that's too much to ask for, apparently. Going to remove my custom loop and put a Morpheus on my Vega instead.
ID: h734a31
Is the 5700 XT too power-demanding? Because it would work for that.
ID: h73wsrt
The lowest price in my country is 610 euros. And with current prices the way they are (674 euros for the cheapest 6700 XT), there's really no point, considering I paid MSRP for my Vega 64 four years ago. The 6700 XT is better, sure, but it's hardly worth what it costs.
My plan was to wait for a 6700, 6600 XT or something like that, but at current prices that seems way out of any value proposition.
ID: h73rg9u
The 5700 XT already smashes that, and a 6700 or higher is still plenty; heck, even the new 6600 XT is probably well sufficient.
It'd be different if you were looking at high refresh rates, but 1440p 60fps is not really extreme anymore!
How much is "fairly low power" in your mind? Vega is one hot machine, but the 5700 XT uses half the power for similar performance, and the 6700 XT improves on that again, so they're already fairly low power if the load is 1440p 60fps.
ID: h73x14y
The 5700 XT is over 600 euros here, so it's hardly worth upgrading from a value standpoint. It's not that my Vega struggles with my demands; it's more that I want to get rid of my custom loop. The 6600 XT was my original plan, but we'll see what the prices turn out to be.
Yeah, I'm never gonna justify having a PC that sucks down power like a microwave while I'm gaming.
It's not just the cost; it just feels very wrong. Gaming is already not exactly environmentally friendly, but it doesn't need to be egregiously awful, either.
ID: h73c5ee
> Gaming is already not exactly environmental friendly

The power consumption increase when you're gaming is insignificant in terms of environmental impact, especially since most people game a few hours per day at most.
A far more important reason why you might not want to increase your PC's power consumption is that that power is converted almost entirely into heat, which will end up in your room.
ID: h754uv6
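The heat point is just conservation of energy: essentially every watt a PC draws ends up as heat in the room. A back-of-the-envelope sketch, with both wattages chosen as illustrative assumptions:

```python
# Essentially all electrical power a PC consumes is dissipated as heat,
# so a gaming PC warms the room like an always-on space heater.
pc_draw_w = 600    # assumed total system draw while gaming
heater_w = 750     # a typical low setting on a small space heater (assumed)

fraction = pc_draw_w / heater_w
print(f"A {pc_draw_w} W gaming rig heats the room like a space heater "
      f"running at {fraction:.0%} power.")  # 80%
```

That is why the room-temperature argument holds regardless of how one weighs the environmental one.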
Uh no. That's an absurd fucking statement.
Power consumption when gaming isn't ever insignificant unless you're running a very efficient rig. I try to keep mine somewhat reasonable in this area.
If we start normalizing PCs that suck down microwave levels of power while gaming, that's very problematic. People run a microwave for minutes at a time. Running a PC at that level for hours at a time, even if it's only a few hours, is still very fucking significant.
But I can see how selfish your mindset is: "how much power we collectively use isn't important, but how much heat you might have in your room is totally super important". smh
ID: h7576x8
You're getting downvotes but I agree. The heat would be unbearable for me personally, but beyond that I'd just feel iffy nearly doubling the power usage of my PC when I'd likely get by with a weaker GPU.
LMAO, so do we need a separate PSU for the new GPUs?
ID: h73b8yd
No. You can actually get currently available cards to pull that much power, and since PSUs go up to 1500W, we are far from requiring a separate PSU for the GPU.
That's what competition will do.
Both Nvidia and AMD are going for the best they can do; it was obvious this would become an arms race.
IMO the most interesting part is the release window: could be Q3, could be Q4.
Whilst I'm sure these will be blazing fast, I really just don't want that much heat dump in my room. 250W cards already kick out a lot. 400W will be difficult for my case to handle and it'll be a nightmare for temperatures in my room.
Is a big performance increase necessary if power consumption is this big? I say no; they should improve ray tracing, not raster.