- Nvidia Resizable BAR Tested, As Good as AMD SAM?
-
Timestamps:
01:24 - Test setup
05:35 - Assassin’s Creed Valhalla
06:02 - Forza Horizon 4
06:26 - Horizon Zero Dawn
06:42 - Borderlands 3
06:55 - Cyberpunk 2077
07:11 - Star Wars Jedi Fallen Order
07:31 - Wolfenstein Youngblood
07:44 - Shadow of the Tomb Raider
08:00 - Far Cry New Dawn
08:28 - Watch Dogs Legion
09:32 - Death Stranding
09:56 - 20 Game Average
10:36 - 1080p Breakdown
11:08 - 1440p Breakdown
11:30 - 4K Breakdown
11:47 - Radeon RX 6800 vs GeForce RTX 3080
12:43 - Final Thoughts
-
People are forgetting that AMD planned this when they were creating the 6000 series; Nvidia essentially patched in support afterwards so they could say they also have the "technology" available.
I wonder how much of this is not only prior optimization from AMD but also Infinity Cache, the 6000 series and the Zen 3 architecture all coming together in unison.
also, Infinity Cache, 6000 series
I would say only the first two points hold. I think Zen 3's unique design has less to do with the increased performance it offers over other processors than its raw performance level (that is, higher throughput and lower latency). Zen 3's best-in-class raw performance among x86 processors today definitely contributes to higher gaming performance regardless of which brand of card you have. Nonetheless, all the CPU does is copy data from the CPU's memory space into the GPU's memory space through the shared BAR (or I/O region), which is now larger than 256 MB. So I wouldn't say there is necessarily some secret sauce in Zen 3's microarchitecture that gives it a particular synergy specifically with AMD GPUs. If Intel comes along with a faster processor when Alder Lake is released, I would say only the first two points will still hold true, because (as you rightly observed) Navi's unique Infinity Cache-based memory topology allows it to more advantageously cache data from the enlarged shared I/O region (the BAR) in its GDDR6 memory into its unique on-die Infinity Cache.
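To make the "shared BAR larger than 256 MB" part concrete: on Linux you can see the actual apertures the kernel mapped for the card. A minimal sketch, assuming a sysfs layout like /sys/bus/pci/devices/&lt;addr&gt;/resource and a placeholder PCI address you would replace with whatever lspci reports for your GPU:

```python
# Minimal sketch (Linux only): list the BAR apertures of a PCI device by
# parsing /sys/bus/pci/devices/<addr>/resource. Each line of that file holds
# "start end flags" in hex; unused regions are all zeros. The device address
# below is a placeholder -- substitute the one `lspci` shows for your GPU.
from pathlib import Path

def bar_sizes(pci_addr: str = "0000:01:00.0"):
    """Yield (region_index, size_in_bytes) for each populated region."""
    resource = Path(f"/sys/bus/pci/devices/{pci_addr}/resource")
    for index, line in enumerate(resource.read_text().splitlines()):
        start, end, _flags = (int(field, 16) for field in line.split())
        if end > start:
            yield index, end - start + 1

if __name__ == "__main__":
    for index, size in bar_sizes():
        print(f"region {index}: {size / 2**20:.0f} MiB")
```

With SAM/ReBAR active, one of the memory regions should come out roughly the size of the card's VRAM rather than the traditional 256 MB window.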
I meant more that Zen 3 has gigantic cache levels that might have some interplay with Infinity Cache and the like.
But it's just my supposition.
This is false. nVidia was already using it for some of their Tesla compute cards, many of which are based on the same silicon as their gaming cards. An example from the 2016 Tesla P100 (it is GP100-based so there is no GeForce equivalent, but you may find it on other cards too):
https://www.nvidia.com/content/dam/en-zz/Solutions/Data-Cente-product-literature/NV-tesla-p100-pcie-PB-08248-001-v01.pdf
BAR1: 16 GB (Compute) or 256 MB (Graphics)
You may also find articles from nVidia about switching from graphics to compute mode, so I suppose that these sizes were fixed depending on the mode.
nVidia was forced to expose this feature on GeForces because AMD did so first, and people reacted as if it were a major feature, with everyone asking for SAM support. I don't see it as a major feature; I see it as "it was there all along and people JUST noticed that it was never enabled".
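For what it's worth, the NVIDIA driver will also report the current BAR1 aperture itself: `nvidia-smi -q -d MEMORY` includes a "BAR1 Memory Usage" section next to the framebuffer stats. A rough sketch of pulling that number out, assuming nvidia-smi is on PATH (the text layout can vary between driver versions, so the parsing is illustrative only):

```python
# Rough sketch: query the BAR1 aperture size reported by the NVIDIA driver.
# Assumes `nvidia-smi` is on PATH; the text layout can vary between driver
# versions, so the parsing below is illustrative rather than robust.
import re
import subprocess

def bar1_total_mib():
    """Return the BAR1 'Total' value in MiB, or None if it is not reported."""
    output = subprocess.run(
        ["nvidia-smi", "-q", "-d", "MEMORY"],
        capture_output=True, text=True, check=True,
    ).stdout
    _, _, bar1_block = output.partition("BAR1 Memory Usage")
    match = re.search(r"Total\s*:\s*(\d+)\s*MiB", bar1_block)
    return int(match.group(1)) if match else None

if __name__ == "__main__":
    total = bar1_total_mib()
    print(f"BAR1 aperture: {total} MiB" if total else "BAR1 info not reported")
```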
People are forgetting AMD planned this when they were creating 6000 series
According to TechPowerUp, that's AMD being "lucky"
Short answer: no
Welp, thanks for that TL;DR. So is this another case of driver overhead for Nvidia?
Not likely. Either there were more things in AMD GPUs that weren't already as optimized, so they profited more from Resizable BAR, or AMD implemented it better by taking bigger steps at the architectural level; both would result in the larger performance gap.
In an AC: Valhalla benchmark with an 11900K and RTX 3090 at 1440p there's a 12% improvement, and 17% at 720p. Not sure why there's such a big difference from the results they got.
720p bar off
720p bar on
1440p bar off
1440p bar on
Could be a number of things: one being where in the game they tested, another being that their config is different from yours. Also, 8% vs 12% isn't exactly a huge difference.
I'm pretty sure Resizable BAR has a bigger effect the more VRAM the card has; the 3080 is affected less than the 6800 XT because of 10 vs 16 GB, so it makes sense that the 3090, with its 24 GB of VRAM, would see the most benefit of the Nvidia cards.
The 3090 has more VRAM and thus better performance with ReBAR.
Some results are really strange: a performance downgrade at 1080p and 1440p, but an uplift at 4K. With just 10 GB of VRAM on the 3080, you'd expect it to be the other way around.
Now it would be interesting to see a 3080 (10 GB) vs 3090 (24 GB) comparison. Nvidia is rumoured to release an Ampere rebrand with more VRAM soon, so if the 3090's gains from ReBAR are better, the same gains could be expected from those new cards. Also, the gain on the 3060 12 GB should be higher too.
Some results are really strange: a performance downgrade at 1080p and 1440p, but an uplift at 4K. With just 10 GB of VRAM on the 3080, you'd expect it to be the other way around.
Resizable BAR doesn't mean you have less VRAM available for your games. It should not impact GPUs with little VRAM badly; even 4 GB cards would be OK with it.
It is only 256 MB (or smaller) cards that would crumble from it, hence why it exists.
TL;DR: AMD has better gains in titles where it works, and AMD has no regression in titles where it runs badly (except a 2% loss in one title).
Nvidia ended up being less than 1% faster until we get to 4K, where it was 2% better. But this isn't the whole story.
The takeaway from this is that once Nvidia actually blacklists the games that run worse in their driver, SAM/ReBAR will be worth turning on if you have an Nvidia GPU, but right now it is only worth it if you play specific games.
Personally, if I owned a GeForce I would turn it on, as the games I play, like Destiny 2, benefit greatly.
If you own AMD you should always turn it on anyway.
Looks like Nvidia needs to do more validation testing on their whitelist. I wonder if they can do it per-resolution aside from per-game.
Yep, just needs some time to mature (on both brands).
They may be able to do it by game engine rather than per game; I suspect a lot of games using the same back end (Unreal, Unity, etc.) could be optimised in the driver en masse.
It also seems to be reducing performance in games that are not on the whitelist. This shouldn't be happening.
If you look at the majority of the ones that lost raw FPS, they saw an uptick in 1% lows. To me that means it's addressing microstutter, so it appears to be a QoL improvement rather than a raw performance one.
For that matter, I really wonder how it performs with DLSS enabled. The same +1% as seen at 1080p, since it's displaying pseudo-4K rendered with DLSS, or the same +2% as playing at native 4K?
How do I enable SAM on my i7 8700K and RX 6800 XT combo?
You should check your motherboard's support page for a BIOS update that adds SAM support. I'm not sure whether SAM is possible with an 8700K CPU right now, though.
Resizable BAR has nothing to do with the processor. It has been an open standard part of PCIe for more than a decade; you just need the motherboard firmware to be updated. With an 8700K that's likely a Z370 or Z390 board. Some of those will get a BIOS update and others won't; some vendors like Asus are adding it in the next month or two for Z370/Z390.
So go check the motherboard support page under BIOS updates.
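Once the BIOS update is in, one way to check that it actually took effect (on Linux at least) is to look for the Resizable BAR capability that recent lspci versions print for the device. A hedged sketch below; reading the extended capabilities usually needs root, the exact wording depends on your pciutils version, and the PCI address is a placeholder for whatever lspci lists for your 6800 XT:

```python
# Hedged sketch: grep `lspci -vv` output for the Resizable BAR capability of
# a given device. Usually needs root to read the extended config space, and
# the exact wording depends on your pciutils version. The device address is
# a placeholder.
import subprocess

def rebar_lines(pci_addr: str = "01:00.0"):
    """Return the lspci output lines mentioning Resizable BAR / BAR sizes."""
    output = subprocess.run(
        ["lspci", "-vv", "-s", pci_addr],
        capture_output=True, text=True, check=True,
    ).stdout
    return [line.strip() for line in output.splitlines()
            if "Resizable BAR" in line or "current size" in line]

if __name__ == "__main__":
    for line in rebar_lines():
        print(line)
    # Expect something like "BAR 0: current size: 16GB ..." when it is active.
```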
It's called ReBAR (Resizable BAR); SAM is just a BS marketing term.
No shit, but I thought it would clarify that I use an AMD/Intel combo and not an Nvidia/Intel combo.
I have a feeling that nVidia won't like the editorial direction of this video.
Shame. Hopefully it gets better further down the line; I'll probably leave it off for now.
Why say no to free performance where there is some?
I guess just in case there are games I play where performance is worse
Because it's making some games that are not on the whitelist perform worse. What's the point of a whitelist that doesn't work? The whitelist was supposed to prevent a bad experience; now we have to test every game to make sure there isn't a regression, when Nvidia promised this wouldn't be the case.
To add insult to injury, even some games that Nvidia supposedly tested and added to the whitelist, like Watch Dogs Legion, perform worse.
Not the same guy, but I leave it off for now so the GPU BIOS has time to mature. Don't want to break my 3080 with a faulty update.
I do have dual BIOS if worse comes to worst, but the current numbers are not worth it and some games actually run worse.
Not playing AC:Valhalla, so it can wait.
Nvidia themselves said that it will be evolving in the coming months. Remember, AMD prepped their move a long time in advance, while Nvidia patched it in. Nvidia has way more money to throw at this, so I expect it to get a little better in the coming months.
I don't know why you're getting downvoted, look at 3Dvision, it's great now!
Is this sarcasm?
People can grasp that Nvidia has more money to throw at drivers than AMD. AMD did step things up last year after the black screen madness with the 5000 series. AMD has to manage CPU and GPU at the same time, so it's understandable.
SAM is probably part of MS's new console too, so they had time to play with it. It's just great that they have brought that tech into the PC world as well, which forced Nvidia to do the same in a kind of panic move. Just as AMD is working on a DLSS-type thing with MS right now.
It's awesome for us customers
I wonder whether Nvidia's implementation is "shit", whether the smaller increase is due to the 3080 already having enough bandwidth (thanks to GDDR6X) to satisfy its cores, or whether it's just unfinished (aside from the BAR-enabled games that shouldn't be). I remember before the 3090 reviews came out (it was released a week or two after the 3080) people were speculating that the 3090's higher G6X bandwidth would truly unlock Ampere's performance, but that wasn't the case: the GPU was only about 10% faster, roughly in line with how many more cores it has than the 3080, so the extra bandwidth didn't do much. Retesting with the 3070 and 3090 should give us a clearer picture. If my theory is right, the 3090 should benefit even less while the 3070 (having only G6) should benefit more. More testing is definitely needed, though I feel for the tech guys; the number of test combinations is getting kinda ridiculous.
I don't know why people are comparing it to AMD's implementation. What rules out the possibility that it's simply a change that benefits architectures like Navi more?
I don't see how your two sentences are related. Something being better on one platform than another doesn't mean they shouldn't be compared; in fact, it means they should be.
Sorry, I didn't express myself properly. What I meant is that we can't really call Nvidia's implementation "worse" just because it doesn't net the same gains.
It's the same technology; Resizable BAR is part of the PCIe spec. It comes down to driver optimization and who does it better.
Did you watch the video? Steve also speculates that AMD has better gains because they developed the RX 6000 series with SAM in mind.
it comes down to driver optimization and who does it better
Why? Nvidia's implementation could be "better", despite providing less of a performance advantage.
The way people downvote on this sub is peculiar.
We shall see
Hardware Unboxed has always had a Pro-AMD, Anti-NVidia slant
Their video was unsurprising.
What is surprising is that nobody's talking about the fact that ReBAR will only get better on Nvidia from here (AMD had a lot more time to optimise it), and it's a free real-world performance boost.
Benchmarks comparing the RX 6000 and RTX 3000 series will need to be updated, and will show an EVEN LARGER gap than before.
I want to like AMD, but NVidia is just better for gaming
After many years we are in a position where we can say that no company is universally better for gaming in any meaningful way at the moment. For some games AMD is better, for others Nvidia. For old but solid CPUs (like an R5 2600X) AMD is by far better because of the driver overhead. For extra features like DLSS, if someone has a good monitor and an average GPU and plays DLSS games, Nvidia is better. And for now, at least for ReBAR, AMD is better.
For me, if someone wants to buy a GPU nowadays (impossible, I know), there is no go-to GPU company anymore. It depends on availability (unfortunately), budget, what CPU you have, what games you want to play and what monitor you have.
Crying "shill" because findings based on objective measurements don't agree with your fanboy slant. Shiggydiggydoo
Can someone explain to me what Resizable BAR is?
Source: https://www.reddit.com/r/Amd/comments/mq05vy/nvidia_resizable_bar_tested_as_good_as_amd_sam/
InvincibleBird
Thanks for the timestamps, I forgot them.