[Hardware Unboxed] Nvidia Has a Driver Overhead Problem, GeForce vs Radeon on Low-End CPUs

1 : Anonymous2021/03/11 10:01 ID: m2mv1d
[Hardware Unboxed] Nvidia Has a Driver Overhead Problem, GeForce vs Radeon on Low-End CPUs
2 : Anonymous2021/03/11 21:09 ID: gqm8r0w

This goes to show the absolute importance of benchmarking at 1080p (TechPowerUp made a mistake changing their headline benchmark resolution to 4K after the 2080 Ti). An argument can even be made that Ultra settings make things more taxing so the 3090 takes the lead, but it was only super marginal when compared to a 5700 XT.

Basically, the rule of thumb is to benchmark what people actually use; 4K, while interesting, is more niche than fucking VR. If you look back far enough you'll see me arguing the opposite, but that changed the minute I saw the 4K monitor numbers in the Steam data.

3 : Anonymous2021/03/11 21:24 ID: gqmaon7

Does anyone member when this was reversed? I member.

5 : Anonymous2021/03/11 18:35 ID: gqlniy9

Seems like quite the claim after trying this with 2 games.

6 : Anonymous2021/03/11 21:27 ID: gqmb3in

They tested a lot more than 2 games; extra information is available to their Floatplane/Patreon subs as well.

7 : Anonymous2021/03/11 20:50 ID: gqm66ug

Let's be honest here, Nvidia drivers actually suck behind the scenes. Even booting up the damn control panel. I have a 3950X and 3090 combo with 32 GB of RAM, NVMe and everything. Opening the driver control panel takes time. Even navigating within it takes time. Saving the settings takes even longer. It's absolutely horrific. But what can you expect when Nvidia doesn't really care about gamers? Yeah, they may have been a gaming company made by gamers for gamers, but they have morphed and changed into something that doesn't give two shits about gamers. Yes, my 3090's performance is pretty fucking amazing. But I honestly miss the ease of use and drivers from AMD. I also hate that Nvidia nerfs performance on FreeSync monitors on purpose, so I have to run a custom resolution to fix the issue... Funny, when I run an AMD GPU, I never have to run a custom resolution, it just works. (The issue being ghosting: Nvidia doesn't read the monitor specs properly and tries to run outside the monitor's range, so you get ghosting. With a custom resolution you can plug in the proper data like AMD GPUs run, and the monitor works correctly without ghosting.)

8 : Anonymous2021/03/11 17:27 ID: gqle2k2

So if you have anything less than a fast CPU, use team RED!

9 : Anonymous2021/03/11 19:20 ID: gqlttkn

I noticed this back when I had an FX processor and switched from my Radeon to a GeForce (because no Radeons were in stock for 4+ months), and I instantly saw poorer performance.

10 : Anonymous2021/03/11 12:03 ID: gqkc0rq

TL;DW version: the driver overhead issue affects Nvidia cards, both Turing and Ampere, specifically in CPU-bottlenecked scenarios. In those cases (the video used a Ryzen 1600X and 2600X and an Intel i3-10100), Nvidia cards showed a 20-30% reduction in average FPS compared to AMD cards. The way to "fix" this is to eliminate the CPU bottleneck by introducing a "GPU bottleneck" (increase the graphics settings); at higher GPU loads, the cards tested performed as expected.

This is the same thing that happened with the Navi vs. Turing series; AMD cards tend to age like fine wine, in my opinion. That being said, this is a minor problem as presented; I think it means that Nvidia drivers are more "sensitive" to CPU bottlenecking than AMD drivers.

The video also referenced an older video (from about 3 years ago, courtesy of NerdTech); this is not the first time this has come up with Nvidia cards.
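
A minimal sketch of the bottleneck picture from the TL;DW above (all frame-time numbers are made up, purely to show why cranking the graphics settings hides the driver overhead):

```python
# Minimal sketch: delivered FPS is roughly capped by whichever of the CPU side
# (game + driver work) or the GPU side is slower; driver overhead only matters
# while the CPU side is the limit.

def delivered_fps(cpu_ms, driver_ms, gpu_ms):
    """Approximate FPS given per-frame CPU, driver, and GPU costs in milliseconds."""
    cpu_limited = 1000.0 / (cpu_ms + driver_ms)
    gpu_limited = 1000.0 / gpu_ms
    return min(cpu_limited, gpu_limited)

# CPU-bound case (1080p medium on a slow CPU): the heavier driver loses FPS.
print(delivered_fps(cpu_ms=8.0, driver_ms=2.0, gpu_ms=5.0))   # ~100 FPS
print(delivered_fps(cpu_ms=8.0, driver_ms=0.5, gpu_ms=5.0))   # ~118 FPS

# GPU-bound case (1440p/4K Ultra): both land on the GPU limit, overhead is hidden.
print(delivered_fps(cpu_ms=8.0, driver_ms=2.0, gpu_ms=14.0))  # ~71 FPS
print(delivered_fps(cpu_ms=8.0, driver_ms=0.5, gpu_ms=14.0))  # ~71 FPS
```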

11 : Anonymous2021/03/11 14:08 ID: gqknurr

I still believe Nvidia has the upper hand in most DX9-11 games, particularly those that are more single-thread bound, as their approach to scheduling works wonders for alleviating the drawcall bottleneck in the main thread, and has since their 337.50 "Wonder Driver" from 2014. OpenGL is not even a question, AMD just stinks there.
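
A back-of-the-envelope sketch of the scheduling argument above (the per-frame cost split and offload fraction are assumptions, not measured numbers): moving part of the draw-call submission off the game's main thread raises the CPU-limited frame rate in single-thread-bound titles, as long as there are spare cores to absorb the extra work.

```python
# Sketch: CPU-limited FPS when the driver can offload part of the draw-call
# submission cost to worker threads instead of leaving it all on the main thread.

def cpu_limited_fps(game_logic_ms, submit_ms, worker_threads=1, offload_fraction=0.0):
    """offload_fraction = share of submission cost pushed onto worker threads."""
    worker_ms = submit_ms * offload_fraction / max(worker_threads, 1)
    main_thread_ms = game_logic_ms + submit_ms * (1.0 - offload_fraction)
    # The frame is gated by whichever finishes last: main thread or workers.
    return 1000.0 / max(main_thread_ms, worker_ms)

# Everything serialized on the main thread (classic single-threaded submission).
print(cpu_limited_fps(game_logic_ms=6.0, submit_ms=4.0))                                           # ~100 FPS
# Most submission spread across three workers, in the spirit of the "Wonder Driver" approach.
print(cpu_limited_fps(game_logic_ms=6.0, submit_ms=4.0, worker_threads=3, offload_fraction=0.75))  # ~143 FPS
```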

AMD has caught up somewhat, but I think this improved CPU performance stems from modern APIs as well as properly multithreaded DX11 games. I'd like to see this kind of testing done with CS:GO, ARMA 3, Kingdom Come: Deliverance, World of Tanks, GTA 5, or even Fortnite.

GameGPU.com has a GPU chart that can change depending on CPU. I recently took a look at the games with the i3 4330 chosen, and the results were mixed, which is better than a few years ago, when you could see a clear split with Nvidia at the top and AMD at the bottom in most games.

Also, a similar test with more games shows it's a mixed bag:

12 : Anonymous2021/03/11 21:10 ID: gqm8w52

GameGPU.com has a GPU chart that can change depending on CPU.

That chart was fake. They "estimated" results rather than actually running tests. It's impossible to know which results are real and which are made up.

13 : Anonymous2021/03/12 01:31 ID: gqn4het

That chart was fake

How do you know?

14 : Anonymous2021/03/11 11:48 ID: gqkawyz

oh, the nvidiots are not gonna like this one, paying more for a worse product, big yikes

15 : Anonymous2021/03/11 15:47 ID: gql0byq

This is just stupid fanboyism. I own a 6800, but to be honest, 3080 vs 6800 XT is a 3080 any day of the week.

We still don't even have information about FidelityFX Super Resolution, which we were promised we'd learn about when the 6900 XT launched. Product >> promise.

16 : Anonymous2021/03/11 16:17 ID: gql4f6w

ah yes, there's the stupid fanboyism! The 3080 and its measly 10 GB of VRAM =) fools be blinded

17 : Anonymous2021/03/11 12:06 ID: gqkc8ja

Not exactly paying for a worse product, more like paying for cards that are more sensitive to CPU bottlenecks compared to AMD cards (or more specifically, Nvidia drivers are more sensitive to CPU bottlenecks than AMD drivers).

There is no question that Big Navi RT is leagues behind Ampere RT performance (but that is beside the point).

18 : Anonymous2021/03/11 12:38 ID: gqkeunn

It's more like AMD scales better with lower-end CPUs when it comes to DX12/Vulkan; Nvidia is still king in DX11.

19 : Anonymous2021/03/11 10:13 ID: gqk2p8m

[deleted]

20 : Anonymous2021/03/11 10:15 ID: gqk2vqq

It's the worst GPU because it's not worth the price difference compared to the RX 6800 XT.

21 : Anonymous2021/03/11 10:17 ID: gqk32wf

The 3080/3090 are the same then. 2GB more VRAM than the 3070 and a small bit more speed for 100% more price.

And the 3090 is even worse, with like 300% more price than the 3080.

22 : Anonymous2021/03/11 10:21 ID: gqk3ea2

[deleted]

23 : Anonymous2021/03/11 10:54 ID: gqk668w

If all GPUs were at MSRP, the 6900 XT and 3090 would both be shit for gamers.

The 6800/XT and 3080 are much better bang for buck with similarly high performance. As for the 3090, at least it has good CUDA support, so prosumers can benefit from it. The 6900 XT isn't even supported by ROCm!

So HUB's conclusion is very accurate.

24 : Anonymous2021/03/11 10:22 ID: gqk3hh4

[deleted]

25 : Anonymous2021/03/11 11:17 ID: gqk88r5

The RTX 3070 is $500 MSRP and gives the same performance as a 5700 XT when paired with a Ryzen 3600 and run at 1080p medium / high refresh rate. The 3070 should be smoking the 5700 XT in every benchmark.

Ryzen 3600X + RTX 3070 is a very realistic build, it should be noted.

26 : Anonymous2021/03/11 10:38 ID: gqk4vy4

This comparison is also a bit pointless for $1000+ GPUs; you shouldn't pair a 2600X with a 6900 XT/3090 anyway.

hey! That's me!

27 : Anonymous2021/03/11 11:38 ID: gqka8mi

Yeah. I mean, it's interesting in a way. But if you're pairing a 3080/3090 with a 1600/2600x or playing at 1080p you should probably rethink your purchasing priorities. lol

So the TL;DR is essentially that Nvidia drivers aren't optimized to work with older hardware?

28 : Anonymous2021/03/11 10:28 ID: gqk410u

HUB has some strange concepts to say the least. I would advise you to form your own opinion.

29 : Anonymous2021/03/11 11:28 ID: gqk98rr

Strange concepts like what? Paying 50% more than a 6800 XT for 10% more performance?

30 : Anonymous2021/03/11 12:17 ID: gqkd5h6

Let me guess... another "HUB is AMD fanboy" post?

The takeaway of the video is to watch other videos covering your particular combination of CPU and GPU; reviews on test benches are meant to evaluate raw GPU performance, which requires eliminating the CPU bottleneck (test benches often use the highest-end CPU available to consumers).

While the lower performance of RTX 30 series cards in CPU-bottlenecked scenarios is quite startling, it is "not a big deal" if you crank up the GPU workload.

31 : Anonymous2021/03/11 10:58 ID: gqk6leu

I don't really think they have "strange concepts" as you said. The 3090 has more VRAM than the 6900 XT and 3080, more performance in both RT and rasterization, plus DLSS and a better encoder. The 3090 is good for those who want the best of the best and don't really care about price tags. The 6900 XT, on the other hand, isn't that much different from the 6800 XT, since the only difference is literally 8 CUs. You can OC a 6800 XT and it will match a 6900 XT.

32 : Anonymous2021/03/11 19:17 ID: gqlthjm

I don't understand the testing methodology here. It has been known for a while that Nvidia GPUs are not a good combo with any pre-Zen 3 CPU. I think a better test would have been to use one powerful CPU such as a 5800X/5900X to test a 3090 and a 6900 XT. Test both GPUs with the exact same game settings on the same CPU and figure out what their average FPS is. Then use RTSS to set a frame cap low enough that both GPUs can hold it for the whole duration of the test, and examine the CPU usage. This way the CPU and GPUs have the same "game" load and we can observe differences in CPU usage caused by the drivers. I think what this video shows is that Nvidia's drivers are hindered by pre-Zen 3 cache latency. In a perfect test scenario there would be driver overhead differences between AMD and Nvidia, but I don't think they would be anywhere near as large as what's being shown here.

33 : Anonymous2021/03/11 21:26 ID: gqmazvz

I think you misunderstood what's being tested and why here.

The point is that Nvidia's drivers have considerable overhead, and when you run into a scenario where you become CPU limited (for whatever reason), performance basically takes a double hit: you're CPU limited for the game, and the driver's extra CPU cost now also holds back your GPU.

Which effectively means that as time goes by, your performance degrades, assuming game engines' CPU load and GPU load increase, which has always been true.

34 : Anonymous2021/03/11 22:23 ID: gqmiecp

I'm not misunderstanding the test; my argument is that I don't think it's a good test case for the claim that Nvidia has higher driver overhead. He tests CPU overhead by comparing performance across different CPUs, which introduces more variables. What I was trying to say in my previous post is that CPU overhead could have been tested and compared using a single CPU. If a game is capped to a frame rate that both GPUs can easily hit, then the game workload should be the same for both GPUs. We can then observe the CPU usage to determine which GPU needs more CPU to render the same number of frames.
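
A rough sketch of the fixed-frame-cap comparison proposed here (the log file names and the cpu_usage_percent column are assumptions about whatever logging tool you use, e.g. a HWiNFO or CapFrameX CSV export): cap both GPUs to the same frame rate on the same CPU, log CPU usage, and compare the averages.

```python
# Sketch: compare average CPU usage for two GPUs running the same game
# at the same frame cap on the same CPU, from per-sample CSV logs.
import csv

def average_cpu_usage(csv_path, column="cpu_usage_percent"):
    """Average the given CPU-usage column across all logged samples."""
    with open(csv_path, newline="") as f:
        samples = [float(row[column]) for row in csv.DictReader(f) if row.get(column)]
    return sum(samples) / len(samples)

# Hypothetical log files captured during identical 60 FPS-capped runs.
for gpu, log_file in [("RTX 3090", "3090_capped_60fps.csv"), ("RX 6900 XT", "6900xt_capped_60fps.csv")]:
    print(f"{gpu}: {average_cpu_usage(log_file):.1f}% average CPU usage at the same cap")
```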

35 : Anonymous2021/03/11 18:17 ID: gqll2d3

lol

https://www.reddit.com//comments/m2muts/hardware_unboxed_nvidia_has_a_driver_overhead/

36 : Anonymous2021/03/11 16:07 ID: gql32z9

Nvidia GPUs don't like the duct tape in Zen 1. We already knew that. That's not to say Radeon isn't scaling better at lower resolutions. It is. Have at least a 6-core Intel from the last 4 years or Zen 2+ and you are fine. Zen 1 gaming performance was honestly trash, no matter if it's Radeon or GeForce. Who even pairs that with GPUs of this caliber anyway?

37 : Anonymous2021/03/11 16:25 ID: gql5hrp

They tested on Intel 10th gen too.

38 : Anonymous2021/03/11 16:37 ID: gql75cc

I think you missed the point of the video. Besides, they tested it with an i3-10100 (or whatever it is called).

39 : Anonymous2021/03/11 16:59 ID: gqla3qj

Nvidia defenders assemble!

40 : Anonymous2021/03/11 17:34 ID: gqlf09o

It's more about the fact that my Intel CPU is as old as Zen 1 but has no issues pushing 150+ FPS. I mean, it makes you think about how trashy first-gen Ryzen was. People got bamboozled by the low per-core pricing.

41 : Anonymous2021/03/12 01:40 ID: gqn5gae

I think Nvidia's software scheduler's multithreading approach meant a lot of cross-CCX communication, which carried quite a penalty on early Ryzen.

42 : Anonymous2021/03/11 16:26 ID: gql5lxn

You are forgetting that GPU performance isn't static. As time goes on CPUs like the 1600(X) and 2600(X) are likely to be paired with new midrange GPUs which will perform similarly to higher end GPUs from older generations.

Also, it's an objective shortcoming of Nvidia's driver.

You also seemed to have missed the part of the video where Steve confirmed that this wasn't just an issue with Zen 1 CPUs.

43 : Anonymous2021/03/11 16:52 ID: gql95ku

This issue will be present with a 2600X + 3060/3060 Ti combo, which should be a logical pairing and yet will still suffer from the issue in the same CPU-bound titles,

so it needs to be addressed...

People act like no one will pair a 2600X with a 3070/3080. OK, I can get that, but I'm quite sure this issue will happen with any RTX 3000 series GPU.

Let's say an AMD GPU equivalent to the RTX 3060 Ti gets better FPS with a Ryzen 2600X; then that's a lost customer for Nvidia.

44 : Anonymous2021/03/11 17:39 ID: gqlfnpr

You also seemed to have missed the part of the video where Steve confirmed that this wasn't just an issue with Zen 1 CPUs.

But it is. I'm not buying that 4-core i3 result, which by itself is smashing Zen 1 anyway, but that's not the point. The point is that I'm not getting the results he showed. Hell, I can play at 720p with 200+ FPS and be CPU bound, but his video shows that Ampere can barely push over 100 FPS when CPU bound. I mean, what the hell. Even with Zen 3 it still seems tied to Zen overall, as he already showed Intel vs AMD CPU-limited benchmarks. Idk, when including the 5600X he could at least have included 10th gen as well.

45 : Anonymous2021/03/11 17:03 ID: gqlaoqy

Probably the most intelligent frame-rate benchmark testing I've seen in quite some time!...;)

46 : Anonymous2021/03/11 12:37 ID: gqkermz

Maybe Nvidia hasn't optimized their flagship drivers for lower-end CPUs, but it seems like a very weird thing to make a long video about. This has been a thing before. Even with lower-end Intel there was a whole dual-core problem many were having back in the day. I'm not sure making a $200 1600X from 5 years ago run better with a 2080 Ti needs to be a priority.

47 : Anonymous2021/03/11 13:19 ID: gqkikm3

This is definitely something reviewers and buyers should keep in mind. There is little point in watching reviews that test a GPU with a high-end CPU if your build is not going to use that same CPU. This same thing will happen to the Ryzen 5 3600, and comparable CPUs, in a couple of years.

Also something to keep in mind if you are interested in high frame rates more than Ultra quality.

48 : Anonymous2021/03/11 16:09 ID: gql3dmn

There is a difference between showing Zen 1, which has roughly Haswell-level gaming performance (an 8-year-old CPU), vs. showing Zen 2 (which they didn't, for a reason) or any Intel CPU based on Skylake since 2015 (which again they didn't) lol. All they showed from Intel was a 4-core i3. Also, all of the results you see in reviews are GPU limited. You see 1440p Ultra, not 1080p Medium. People are not looking at 1080p Medium when considering a 3070, 3090, or RDNA2.

49 : Anonymous2021/03/11 13:25 ID: gqkj79w

So, it's the same as it's always been. Here's a 20 minute video about it.

50 : Anonymous2021/03/11 21:13 ID: gqm97t4

I'm not sure making a $200 1600X from 5 years ago run better with a 2080 Ti needs to be a priority

It is. Not everyone can upgrade to the new and shiny 5600X. So if you end up having to choose between a 3070 and a 6700 XT, this issue will severely affect one of those cards and influence its performance.

51 : Anonymous2021/03/12 02:59 ID: gqnefuk

Of course not everyone can do it. Doesn't that highlight how ridiculous this video is even more, though? The last thing any of us have right now is the luxury of choice. If you want something new, then you're taking what's available or you're paying 2-3x MSRP. There's nothing budget about that either. It wasn't a great idea to run a 1600X with a 2080 Ti last year and it still isn't now. It should be no surprise that the new, more powerful cards are also a bad idea, regardless of the reason.

I just skimmed through that video again and unless I missed it, they never even test a 6700 XT, 6800, 6800 XT, or 6900 XT in the entire video. Honestly, it just makes it that much more pointless when all the data they're showing only proves that the older Ryzens fit better with the Radeon 5700 XT and below. Granted, I didn't watch the whole thing again, but I clicked through all the timestamps for every game and only saw 3090, 3070, 5600 XT, and 5700 XT results. Couldn't find a single 6000 series in the lot. So to reiterate, the data shows us that the 5600 XT and 5700 XT are better GPUs for the 2600X and 1600X. We don't even really know if the 6700 XT is good to go. Maybe it is, or maybe it's going to be insanely bottlenecked as well. What about weaker Nvidia cards? Could they possibly work better with them? I bet there are a ton of people with 2060s and 2070 Supers who have those CPUs and are perfectly happy with the performance. Honestly, the more I think about it the worse it gets.

It's also very suspicious to me that there are other comparison channels that already compared the Ryzen 1600, 2600, 3600, and 5600 up to 4 months ago. They're not hard to find. Can we get another round of applause for HU and their unrivaled journalism!?

Source: https://www.reddit.com/r/Amd/comments/m2mv1d/hardware_unboxed_nvidia_has_a_driver_overhead/
