- [Hardware Unboxed] Nvidia Has a Driver Overhead Problem, GeForce vs Radeon on Low-End CPUs
-
This goes to show the absolute importance of benchmarking at 1080p (TechPowerUp made a mistake moving their main benchmark resolution to 4K after the 2080 Ti). An argument can even be made that Ultra settings make things more taxing and let the 3090 take the lead, but even then it was only marginally ahead of a 5700 XT.
Basically, the rule of thumb is: benchmark what people actually use. 4K, while interesting, is more niche than fucking VR. If you look back far enough you'll see me arguing the opposite, but that changed the minute I saw the 4K monitor numbers in the Steam hardware data.
-
Does anyone member when this was reversed? I member.
-
Seems like quite the claim after trying this with 2 games.
-
They tested a lot more than 2 games; extra information is available to their Floatplane/Patreon subscribers as well.
-
Let's be honest here, Nvidia's drivers actually suck behind the scenes. Even booting up the damn control panel: I have a 3950X and 3090 combo with 32 GB of RAM, NVMe and everything, and opening the driver control panel takes time. Even navigating within it takes time. Saving the settings takes even longer. It's absolutely horrific. But what can you expect when Nvidia doesn't really care about gamers? Yeah, they may have started as a gaming company made by gamers for gamers, but they have morphed into something that doesn't give two shits about gamers.
Yes, my 3090's performance is pretty fucking amazing. But I honestly miss the ease of use and the drivers from AMD. I also hate that Nvidia nerfs performance on FreeSync monitors on purpose, so I have to run a custom resolution to fix the issue... Funny, when I run an AMD GPU, I never have to run a custom resolution, it just works. (The issue being ghosting: Nvidia doesn't read the monitor's specs properly and tries to run outside the monitor's range, so you get ghosting. With a custom resolution you can plug in the proper data, like AMD GPUs use, and the monitor works correctly without ghosting.)
-
So if you have anything less than a fast CPU, use team RED!
-
I noticed this back when I had an FX processor and switched from a Radeon to a GeForce (because no Radeons were in stock for 4+ months), and I instantly saw worse performance.
-
TL;DW: the driver overhead issue affects Nvidia cards, both Turing and Ampere, specifically in CPU-bottlenecked scenarios. When the CPU is the bottleneck (in the video, a Ryzen 1600X, a 2600X and an Intel i3-10100), the Nvidia cards see a 20-30% reduction in average FPS compared to the AMD cards. The way to "fix" this is to eliminate the CPU bottleneck by introducing a "GPU bottleneck" (increasing the graphics settings); at higher GPU loads the cards tested perform as expected.
This is the same thing that happened with Navi vs. Turing; AMD cards tend to age like fine wine, in my opinion. That being said, it's a fairly minor problem as presented; I take it to mean that Nvidia drivers are more "sensitive" to CPU bottlenecking than AMD drivers.
The video also referenced an older video (from about 3 years ago, courtesy of NerdTech); this is not the first time this has happened with Nvidia cards.
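A rough way to picture that mechanism (not from the video; every millisecond cost below is a made-up assumption purely for illustration): treat each frame as finishing on whichever path is slower, the GPU rendering it or the CPU preparing it (game logic plus driver submission). With those assumed numbers you get the same pattern HUB measured: a roughly 20% gap at CPU-bound settings that vanishes once the GPU is the limit.

```python
# Toy model of a CPU/GPU bottleneck. Every millisecond value here is an
# assumption for illustration; only the structure (a frame is limited by
# the slower of the two paths) reflects the real mechanism.

def fps(gpu_ms, game_cpu_ms, driver_cpu_ms):
    # A frame can't finish faster than the slower of:
    #   - the GPU rendering it, or
    #   - the CPU preparing it (game logic + driver/draw-call submission).
    frame_ms = max(gpu_ms, game_cpu_ms + driver_cpu_ms)
    return 1000.0 / frame_ms

GAME_CPU_MS = 6.0                      # assumed game-logic cost per frame on a slow CPU
DRIVER_MS = {"lean driver": 1.0,       # assumed per-frame driver submission cost
             "heavy driver": 3.0}

# Raising quality settings mostly raises the GPU cost per frame.
SETTINGS_GPU_MS = {"1080p medium": 5.0, "1440p ultra": 12.0}

for setting, gpu_ms in SETTINGS_GPU_MS.items():
    results = {name: round(fps(gpu_ms, GAME_CPU_MS, drv_ms))
               for name, drv_ms in DRIVER_MS.items()}
    print(setting, results)

# 1080p medium -> CPU-bound: the heavy driver loses ~20% average FPS
# 1440p ultra  -> GPU-bound: both drivers land on the same FPS
```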
-
I still believe Nvidia has the upper hand in most DX9-11 games, particularly those that are more single-thread bound, as their approach to scheduling works wonders for alleviating the draw-call bottleneck on the main thread, and has ever since their 337.50 "Wonder Driver" from 2014. OpenGL is not even a question; AMD just stinks there.
AMD has caught up somewhat, but I think this improved CPU performance stems from modern APIs as well as from properly multithreaded DX11 games. I'd like to see this kind of testing done with CS:GO, ARMA 3, Kingdom Come: Deliverance, World of Tanks, GTA 5 or even Fortnite.
GameGPU.com has a GPU chart that can change depending on CPU. I recently took a look at the games with the i3-4330 selected, and the results were mixed, which is better than a few years ago, when you could see a clear split, with Nvidia at the top and AMD at the bottom in most games.
Also, a similar test with more games shows it's a mixed bag:
-
GameGPU.com has a GPU chart that can change depending on CPU.
That chart was fake. They "estimated" results rather than actually running tests. It's impossible to know which results are real and which are made up.
-
That chart was fake
How do you know?
-
Oh, the nvidiots are not gonna like this one. Paying more for a worse product, big yikes.
-
This is just stupid fanboyism. I own a 6800, but let's be honest: 3080 vs 6800 XT is the 3080 any day of the week.
We still don't even have any information about FidelityFX Super Resolution, which we were promised we'd learn about when the 6900 XT launched. Product >> promise.
-
Ah yes, there's the stupid fanboyism! The 3080 and its measly 10 GB of VRAM =) fools be blinded.
-
Not exactly paying for a worse product, more like paying for cards that are more sensitive to CPU bottlenecks than AMD cards (or, more specifically, Nvidia drivers are more sensitive to CPU bottlenecks than AMD drivers).
There is no question that Big Navi RT is leagues behind Ampere RT performance (but that is beside the point).
-
It's more like AMD scales better with lower-end CPUs when it comes to DX12/Vulkan; Nvidia is still king in DX11.
-
[deleted]
-
It's the worst GPU because it's not worth the price difference compared to the RX 6800 XT.
-
The 3080/3090 are the same then. The 3080 gives you 2 GB more VRAM than the 3070 and a small bit more speed for a lot more money.
And the 3090 is even worse, at more than double the price of the 3080.
-
[deleted]
-
If all GPUs were at MSRP, the 6900 XT and 3090 would both be shit for gamers.
The 6800/6800 XT and 3080 are much better bang for buck with similarly high performance. As for the 3090, at least it has good CUDA support, so prosumers can benefit from it. The 6900 XT isn't even ROCm supported!
So HUB's conclusion is very accurate.
-
[deleted]
-
The RTX 3070 is $500 MSRP and gives the same performance as a 5700 XT when paired with a Ryzen 3600 and run at 1080p medium / high refresh rate. The 3070 should be smoking the 5700 XT in every benchmark.
It should be noted that Ryzen 3600X + RTX 3070 is a very realistic build.
-
This comparison is also a bit pointless for $1000+ GPUs; you shouldn't pair a 2600X with a 6900 XT/3090 anyway.
hey! That's me!
-
Yeah. I mean, it's interesting in a way. But if you're pairing a 3080/3090 with a 1600/2600X or playing at 1080p, you should probably rethink your purchasing priorities. lol
So the TL;DR is essentially that Nvidia drivers aren't optimized to work with older hardware?
-
HUB has some strange concepts to say the least. I would advise you to form your own opinion.
-
Strange concepts like what? Paying 50% more than a 6800 XT for 10% more performance?
-
Let me guess... another "HUB is AMD fanboy" post?
The takeaway of the video is to go watch reviews for your particular combination of CPU and GPU; reviews on standard test benches are meant to evaluate raw GPU performance, so they have to eliminate the CPU bottleneck (which is why test benches usually use the highest-end CPU available to consumers).
While the reduced performance of RTX 30 series cards in CPU-bottlenecked scenarios is quite startling, it is "not a big deal" if you crank up the GPU workload.
-
I don't really think they have "strange concepts," as you put it. The 3090 gets more VRAM than the 6900 XT and 3080, more performance in both RT and rasterization, plus DLSS and a better encoder. The 3090 is good for those who want the best of the best and don't really care about price tags. The 6900 XT, on the other hand, isn't that much different from the 6800 XT, since the only difference is literally 8 CUs. You can OC a 6800 XT and it will match a 6900 XT.
-
I don't understand the testing methodology here. It has been known for a while that Nvidia GPUs are not a good combo with any pre-Zen 3 CPU. I think a better test would have been to use one powerful CPU, such as a 5800X/5900X, to test a 3090 and a 6900 XT: test both GPUs with the exact same game settings on the same CPU and figure out their average FPS. Then use RTSS to set a frame cap to something lower that both GPUs can hold for the whole duration of the test, and examine the CPU usage. This way the CPU and the GPUs have the same "game" load and we can observe differences in CPU usage caused by the drivers.
I think what this video actually shows is that the Nvidia drivers are hindered by pre-Zen 3 cache latency. In a perfect test scenario there will be driver overhead differences between AMD and Nvidia, but I don't think they will be anywhere near as large as what's being shown here.
-
I think you misunderstood what's being tested and why here.
The point is that Nvidia's drivers have considerable overhead, and when you run into a scenario where you become CPU limited (for whatever reason), performance basically takes a double hit: you're CPU limited for the game, and that same limit is now also holding back your GPU.
Which effectively means that as time goes by your performance degrades, assuming game engines' CPU and GPU loads increase, which has always been true.
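To put rough numbers on that "gets worse over time" point: if you assume the driver's CPU cost scales with the draw calls a game submits per frame (all figures below are invented, only the trend matters), the gap between a lean and a heavy driver widens as games get more CPU-demanding:

```python
# Invented numbers: the game's own per-frame CPU cost plus a driver cost
# assumed to be proportional to draw calls submitted per frame.
DRIVER_MS_PER_1K_CALLS = {"lean driver": 0.2, "heavy driver": 0.6}  # assumed

def cpu_bound_fps(game_ms, draw_calls_k, driver_ms_per_1k):
    # FPS ceiling when the CPU, not the GPU, is the limit.
    return 1000.0 / (game_ms + draw_calls_k * driver_ms_per_1k)

# Hypothetical "today's game" vs a heavier "future game" on the same CPU.
WORKLOADS = {"today": (6.0, 5), "future": (9.0, 12)}  # (game ms, k draw calls)

for label, (game_ms, calls_k) in WORKLOADS.items():
    ceiling = {name: round(cpu_bound_fps(game_ms, calls_k, cost))
               for name, cost in DRIVER_MS_PER_1K_CALLS.items()}
    print(label, ceiling)

# today  -> {'lean driver': 143, 'heavy driver': 111}  (~22% gap)
# future -> {'lean driver': 88,  'heavy driver': 62}   (~30% gap)
```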
-
I'm not misunderstanding the test; my argument is that I don't think it's a good test case for the claim that Nvidia has higher driver overhead. He tests CPU overhead by comparing performance across different CPUs, which introduces more variables. What I was trying to say in my previous post is that CPU overhead could have been tested and compared using a single CPU: if a game is capped to a frame rate that both GPUs can easily hit, then the game workload is the same for both GPUs, and we can then observe the CPU usage to determine which GPU needs more CPU to render the same number of frames.
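For what it's worth, the single-CPU, frame-capped comparison described above is easy to script around any benchmark pass. A minimal sketch (Python with psutil; the 120-second duration and 0.5 s sampling interval are arbitrary choices, and RTSS still has to supply the frame cap) that logs average and peak CPU usage so the two drivers can be compared:

```python
# Sample whole-system CPU utilisation while a frame-capped benchmark runs.
# Run the same game, settings and RTSS frame cap once per GPU/driver,
# then compare the averages. Duration and interval are arbitrary choices.
import time
import psutil

DURATION_S = 120   # length of the benchmark pass to monitor
INTERVAL_S = 0.5   # sampling interval

samples = []
psutil.cpu_percent(interval=None)   # prime the counter; the first reading is meaningless
end = time.time() + DURATION_S
while time.time() < end:
    time.sleep(INTERVAL_S)
    # Utilisation (%) across all cores since the previous call.
    samples.append(psutil.cpu_percent(interval=None))

print(f"samples taken:     {len(samples)}")
print(f"average CPU usage: {sum(samples) / len(samples):.1f}%")
print(f"peak CPU usage:    {max(samples):.1f}%")
```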
Nvidia GPUs don't like the duct tape in Zen 1. We already knew that. That's not to say Radeon isn't scaling better at lower resolutions; it is. Have at least a 6-core Intel from the last 4 years or Zen 2+ and you are fine. Zen 1 gaming performance was honestly trash, no matter if it's Radeon or GeForce. Who even runs that with GPUs of this caliber anyway?
They tested on Intel 10th gen too.
I think you missed the point of the video. Besides, they tested it with an i3-10100 (or whatever it is called).
Nvidia defenders assemble!
It's more about the fact that my Intel CPU is as old as Zen 1 but has no issues pushing 150+ fps. I mean, it makes you think about how trashy first-gen Ryzen was. People got bamboozled by the low per-core pricing.
I think Nvidia's software scheduler multithreading approach meant a lot of cross-CCX communication, which had quite the penalty in early Ryzen.
You are forgetting that GPU performance isn't static. As time goes on CPUs like the 1600(X) and 2600(X) are likely to be paired with new midrange GPUs which will perform similarly to higher end GPUs from older generations.
Also, it's an objective shortcoming of Nvidia's driver.
You also seem to have missed the part of the video where Steve confirmed that this wasn't just an issue with Zen 1 CPUs.
This issue will be present with a 2600X + 3060/3060 Ti combo, which should be a logical pairing and yet will still suffer from the issue in the same CPU-bound titles.
So it needs to be addressed...
People act like no one will pair a 2600X with a 3070/3080, and OK, I can get that, but I'm quite sure this issue will happen with any RTX 3000 series GPU.
Let's say the AMD equivalent of an RTX 3060 Ti gets better FPS with a Ryzen 2600X; then that's a lost customer for Nvidia.
You also seem to have missed the part of the video where Steve confirmed that this wasn't just an issue with Zen 1 CPUs.
But it is. I'm not buying that 4-core i3 result, which by itself is smashing Zen 1 anyway, but that's not the point. The point is that I'm not getting the results he showed. Hell, I can play at 720p at 200+ fps and be CPU-bound, but his video shows Ampere can barely push over 100 fps when CPU-bound. I mean, what the hell. Even with Zen 3 included, it still seems tied to Zen overall, since he already showed Intel vs AMD CPU-limited benchmarks. I don't know; when including the 5600X he could at least have included 10th gen as well.
Probably the most intelligent frame-rate benchmark testing I've seen in quite some time!...;)
Maybe Nvidia hasn't optimized their flagship drivers for lower-end CPUs, but it seems like a very weird thing to make a long video about. This has been a thing before; even with lower-end Intel there was a whole dual-core problem many people were having back in the day. I'm not sure making a $200 1600X from 5 years ago run better with a 2080 Ti needs to be a priority.
This is definitely something reviewers and buyers should keep in mind. There is little point in watching reviews that test a GPU with a high-end CPU if your build is not going to use that same CPU. The same thing will happen to the Ryzen 5 3600, and comparable CPUs, in a couple of years.
Also something to keep in mind if you are interested in high frame rates more than Ultra quality.
There is a difference between showing Zen 1, which has roughly Haswell-level gaming performance (an 8-year-old CPU), and showing Zen 2 (which they didn't, for a reason) or any Intel CPU based on Skylake from 2015 onward (which, again, they didn't) lol. All they showed from Intel was a 4-core i3. Also, all of the results you see in reviews are GPU limited: you see 1440p Ultra, not 1080p medium. People are not looking at 1080p medium when considering a 3070, a 3090 or RDNA2.
So, it's the same as it's always been. Here's a 20 minute video about it.
I'm not sure making a $200 1600X from 5 years ago run better with a 2080 Ti needs to be a priority
It is. Not everyone can upgrade to the new and shiny 5600X. So if you end up having to choose between a 3070 and a 6700 XT, this issue will severely affect one of those cards and influence its performance.
Of course not everyone can do it. Doesn't that highlight how ridiculous this video is even more, though? The last thing any of us have right now is the luxury of choice. If you want something new, then you're taking what's available or you're paying 2-3x MSRP. There's nothing budget about that either. It wasn't a great idea to run a 1600X with a 2080 Ti last year and it still isn't now. It should be no surprise that the new, more powerful cards are also a bad idea, regardless of the reason.
I just skimmed through that video again and unless I missed it, they never even test a 6700 XT, 6800, 6800 XT or 6900 XT in the entire video. Honestly, that just makes it that much more pointless when all the data they're showing only proves that the older Ryzens fit better with the Radeon 5700 XT and below. Granted, I didn't watch the whole thing again, but I clicked through all the timestamps for every game and only saw 3090, 3070, 5600 XT and 5700 XT results. I couldn't find a single 6000 series card in the lot. So, to reiterate, the data shows us that the 5600 XT and 5700 XT are better GPUs for the 2600X and 1600X. We don't even really know if the 6700 XT is good to go; maybe it is, or maybe it's going to be insanely bottlenecked as well. What about weaker Nvidia cards? Could they possibly work better with them? I bet there are a ton of people with 2060s and 2070 Supers who have those CPUs and are perfectly happy with the performance. Honestly, the more I think about it, the worse it gets.
It's also very suspicious to me that there are other comparison channels that already compared the Ryzen 1600, 2600, 3600 and 5600 up to 4 months ago. They're not hard to find. Can we get another round of applause for HUB and their unrivaled journalism!?
Source: https://www.reddit.com/r/Amd/comments/m2mv1d/hardware_unboxed_nvidia_has_a_driver_overhead/