-
I was curious how much of a difference it would really make, but couldn't find much data for more CPU-bound games. Initially I intended to grab a 5900X (580€), but my order was cancelled by the retailer after 4 months of waiting, so I grabbed a 5800X (420€) instead.
System uses 2x16 GB 3200 MHz CL16 dual-rank RAM (I might OC it again to 3600 CL16, but all in time). A Noctua NH-D15 was used for cooling. The CPU is paired with a 3080 TUF (non-OC) running at stock, and a 650W Gold PSU (knock on wood, no crashes yet).
The 3700X had PBO on and +200 MHz; the 5800X is stock (for now, I haven't started to fiddle).
Usual boost behavior:
3700X with PBO: all-core around 4100 MHz, single-core up to 4390 MHz. Max temperature was around 71°C during Cinebench runs.
5800X stock: all-core around 4600 MHz, single-core up to 4800 MHz. Max temperature is around 85°C during Cinebench runs (this thing runs hot!).
Performance in games was measured with the Nvidia Performance overlay:
| Benchmark | Metric | 3700X | 5800X | Diff % |
|---|---|---|---|---|
| Cinebench R23 | SC | 1277 | 1574 | +23.3% |
| | MC | 12452 | 14848 | +19.0% |
| Rocket League (250 fps limit, Freeplay) | Avg. FPS | 250 | 250 | +0.0% |
| | 99% FPS | 208 | 226 | +8.7% |
| 7 Days to Die (154 fps limit, offline, solo, not moving) | Avg. FPS | 126 | 154 | +22.2% |
| | 99% FPS | 116 | 138 | +19.0% |
| Grim Dawn (154 fps limit, offline, solo, not moving) | Avg. FPS | 112 | 153 | +36.6% |
| | 99% FPS | 106 | 140 | +32.1% |
| Apex Legends (154 fps limit, firing range, Lifeline, not moving) | Avg. FPS | 153 | 153 | +0.0% |
| | 99% FPS | 125 | 121 | -3.2% |
| Apex Legends (154 fps limit, dropship during ranked, looking over island) | Avg. FPS | 124 | 153 | +23.4% |
| | 99% FPS | 92 | 104 | +13.0% |
| Witcher 3 (154 fps limit, standing in forest looking at tents) | Avg. FPS | 148 | 149 | +0.7% |
| | 99% FPS | 129 | 130 | +0.8% |
| Valheim (154 fps limit, offline, solo, standing at spawn) | Avg. FPS | 93 | 103 | +10.8% |
| | 99% FPS | 66 | 79 | +19.7% |
| Escape from Tarkov (120 fps ingame limit, offline, no PvE, Customs near red warehouse, standing still) | Avg. FPS | 121 | 121 | +0.0% |
| | 99% FPS | 111 | 110 | -0.9% |
| Escape from Tarkov (same spot, quickly moving camera) | 99% FPS | 100 | 104 | +4.0% |
| Minecraft (155 fps ingame limit, OptiFine 1.16.3, BSL 7.2.01pre2 shader, empty private server, fullscreen, day, ingame stat counter) | Avg. FPS | 68 | 118 | +73.5% |
| | Minimum FPS | 54 | 109 | +101.9% |
| | Minimum FPS during dips | 28 | 39 | +39.3% |

So all in all still a benefit, despite running at 1440p with a 3080. Some results were disappointing, for example Witcher 3 (I thought I was still CPU bound, but it turns out the GPU was already very close to its limit). Others were quite surprising: Minecraft behaved really weirdly (I thought shaders would hit the GPU a lot harder, but with the 5800X it suddenly got a massive boost).
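For reference, the "99% FPS" figures are 1% lows: the FPS corresponding to the 99th-percentile frametime. A minimal Python sketch of that calculation, assuming a frametime log reduced to one millisecond value per line (the file name and layout are placeholders; tools like FrameView or CapFrameX export such logs, with varying column layouts):

```python
# Compute average FPS and "99% FPS" (1% low) from a frametime log.
# Assumes frametimes.csv holds one frametime in milliseconds per line.
import statistics

def fps_stats(frametimes_ms):
    avg_fps = 1000 / statistics.mean(frametimes_ms)
    # 99th-percentile frametime: only the slowest 1% of frames are worse.
    # Its reciprocal is the "99% FPS" / "1% low" figure.
    p99_ms = statistics.quantiles(frametimes_ms, n=100)[98]
    return avg_fps, 1000 / p99_ms

with open("frametimes.csv") as f:
    frametimes = [float(line) for line in f if line.strip()]

avg, low = fps_stats(frametimes)
print(f"Avg: {avg:.1f} fps, 1% low: {low:.1f} fps")
```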
On second thought, I should have benchmarked Witcher 3 in the middle of Novigrad, but now it's unfortunately too late.
This is subjective, but my system feels snappier. For example, opening the browser and going to YouTube is quick as hell now; before, it hung a little at times. Could be my imagination, but so far it feels great.
Disclaimer: I did my best to keep the benchmarks as similar as possible per game: exactly the same spot, looking at the same thing, waiting half a minute or so without moving before writing down the numbers. But these are all games that are anything but static. For example, in Valheim the fps keep changing over a full minute (weirdly going higher all the time), and of course something like Apex Legends is a mess to benchmark (so I only did a quick look down over the island at the start of the match and otherwise stuck to the firing range).
These are also "worst case" benchmarks for the 5800X, as I haven't played around with it yet. There are plenty of users saying theirs boosts above 5 GHz; at stock mine only does 4.8 GHz at most, so there might be some room for improvement. The 3700X was already at its limit.
Tiny update: Witcher 3 does run smoother; I had weird micro stutters before when running around. I still suck at Apex Legends, lol. I tried to OC the CPU (PBO), but my X570 Aorus Elite seems a bit broken here: neither the curve optimizer (undervolt) nor the maximum boost override seems to work (I'm always at stock speeds). Limiting the power did get me slightly better Cinebench scores though, as the CPU doesn't run as hot. Hopefully they'll fix the BIOS :-/
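(For reference: a quick way to sanity-check whether a boost override actually took is to log the highest clock seen while something like a Cinebench loop is running. A rough sketch using psutil, assuming it exposes current per-core frequencies on your OS; on Windows a tool like HWiNFO is more trustworthy:)

```python
# Sample CPU clocks for ~30 s under load and report the highest frequency
# seen, to check whether a PBO/boost-override change actually applies.
# Needs psutil (pip install psutil). Per-core readings are OS-dependent,
# and Windows may only expose a coarse value.
import time
import psutil

max_mhz = 0.0
for _ in range(30):
    freqs = psutil.cpu_freq(percpu=True) or [psutil.cpu_freq()]
    for f in freqs:
        if f.current and f.current > max_mhz:
            max_mhz = f.current
    time.sleep(1)

print(f"Highest clock observed: {max_mhz:.0f} MHz")
```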
-
Thanks for the results, nice to see gains even at 1440p. I'm also CPU bottlenecked at 1440p with my 2600X in some games. Will probably upgrade later this year to a 5600X, or perhaps to Warhol if it's already out by then.
-
Not seeing all your results yet, but I assume for 1440p and 4K gaming the fps difference will be quite low, maybe unnoticeable.
ID: gqwtpkd
-
Thx for sharing... 1440p adoption is spreading worldwide, especially the 1440p 144Hz panels.
ID: gqxvnf4
I recently upgraded to 1440p and I still find it weird how there's so little content out there at resolutions higher than 1080p.
ID: gqy4x52
Which is exactly why so many of us are happy to wait.
480x with a brand new 5600X. Will be trying for a 6700 XT. 1080p 144fps 144Hz all day, baby.
ID: gqy5yaf
There is plenty for 4K by now. Unfortunately Netflix doesn't give you the 4K content when you run 1440p (I already tried to get it, no chance) 🙁
ID: gqy3u56
1440p gaming monitors are pretty much the same price as 1080p gaming monitors. It all depends on what kind of GPU you have, and what FPS you want.
-
I mean... you are playing at 1440p, so you are more GPU bound.
-
Part of the performance gain is probably from Nvidia GPUs heavily favoring faster CPUs, as demonstrated by Hardware Unboxed.
-
I was thinking of upgrading from a 3600 to a 5800X, but I might not bother if it only gains about 10%. The only gains seem to be in lighter games that are already getting very high fps.
ID: gqyaxo7
Yep, it's not a massive upgrade, but at least noticeable. The whole system feels snappier.
I wanted to use PBO on the 5800X (others report clock speeds of up to 5 GHz), but my current BIOS doesn't seem to apply the max boost override :-/
So hopefully I can get a bit more performance out of the CPU in the future.
-
For those interested, I made the same upgrade (3700X to 5800X, but with 32 GB 3600 MHz CL16) and there was a significant gain in FPS in MSFS at 1440p. Running an EVGA 3080 XC3. Very slight gains in Cyberpunk 2077 with settings mostly on ultra and RTX/DLSS on.
ID: gqycpqe
Do you use PBO? On my X570 Aorus Elite (F33c BIOS) neither max boost override nor the curve optimizer seems to work; my CPU always boosts to stock speeds (around 4820 MHz single-core, 4600 MHz all-core) 🙁
ID: gqyihox
Yes, I'm using PBO and get between 4825 and 4850 MHz all-core. Using a Noctua U-12S, which seems to do an alright job, but I'll probably get the new 120mm Noctua and then do some proper benchmarking/tuning after that.
-
Made the same upgrade, it is noticeable, especially in competitive games. Zen 3 is a beast.
-
Witcher 3 greatly benefits from memory bandwidth. OC your RAM to 3600 or 3800 and you will see a difference.
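(A quick way to verify a RAM OC actually changed anything is a rough bandwidth measurement before and after. A minimal NumPy sketch of a STREAM-style "add" kernel follows; the absolute number is ballpark at best due to Python overhead, so only the relative change is meaningful, and a tool like AIDA64 measures this properly:)

```python
# Ballpark memory-bandwidth estimate (STREAM-style add kernel): a = b + c
# touches three large arrays, so bytes moved ≈ 3 * array size per run.
# Compare before/after a RAM OC; don't trust the absolute figure.
import time
import numpy as np

N = 100_000_000                # 800 MB per float64 array, 2.4 GB total
a = np.empty(N)
b = np.ones(N)
c = np.ones(N)

best = float("inf")
for _ in range(5):             # best of 5 runs to skip warm-up noise
    start = time.perf_counter()
    np.add(b, c, out=a)        # reads b and c, writes a, no temporaries
    best = min(best, time.perf_counter() - start)

print(f"~{3 * a.nbytes / best / 1e9:.1f} GB/s effective bandwidth")
```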
-
People who said upgrading the CPU at 1440p won't make a difference because you're GPU bound at that resolution can stfu now. I have a 3600X and can't wait to upgrade it; it's shit in Warzone with a 3080.
-
I don't really get the point of this: you're comparing an "overclocked" 3700X to a stock 5800X. The numbers aren't really useful outside of that specific use case.
It would have been more useful to see both CPUs represented in the same way.
ID: gqx18wb
There is zero difference in games for a 3700X between stock and OC. The only thing that changes is the MC benchmark score, because at stock the CPU hits its power limit.
The single-core score in Cinebench doesn't change by even a point when you use PBO.
But yeah, the 5800X might get even faster with tweaking; I just can't guarantee it would actually be 100% stable (I only got it today). The 3700X was stable, not a single crash in over a year of daily use.
Doing a quick and dirty 5800X OC and then claiming "it's X% faster!" would be irresponsible.
ID: gqx5djw
I still think for the sake of clarity they should be represented in the same way. Also, your title implies you're comparing a stock 3700X vs a 5800X.
If someone doesn't know about PBO and these things, it could be confusing.
-
Before OCing the RAM, check the die with Thaiphoon.
If it's anything other than Micron E-die or Samsung B-die, you can kill the RAM.
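(Side note: Thaiphoon is Windows-only. On Linux, decode-dimms from i2c-tools reads the same SPD data; a small wrapper sketch, assuming the SPD EEPROM driver, e.g. ee1004 for DDR4, is loaded. Exact field names in the output vary, so the filter is approximate:)

```python
# Print the manufacturer/part-number lines from decode-dimms (i2c-tools)
# to identify the DRAM vendor without Thaiphoon. Requires the SPD EEPROM
# kernel module (e.g. ee1004 for DDR4) to be loaded beforehand.
import subprocess

out = subprocess.run(["decode-dimms"], capture_output=True, text=True).stdout
for line in out.splitlines():
    if "Manufacturer" in line or "Part Number" in line:
        print(line.strip())
```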
ID: gqwwhyr
It's Micron E-die. I already had it OC'd to 3600 CL16 at 1.42V, but after half a year of usage it got unstable again (unstable for me is 1 error after 5 hours of Karhu memtest).
I tried to quickly fix it by loosening timings, but it somehow only got worse. So I went back to XMP and decided I'll only invest the time again with my new CPU.
RAM OC is a pain in the butt, I wish I had bought 3600 CL16 from the start.
ID: gqy3v1m
Update your flair.
Also, I would be interested in what your XMP looks like, timing-wise.
Now I'm done, editing a large table on Reddit is a pain in the butt.
Source: https://www.reddit.com/r/Amd/comments/m4yelm/3700x_to_5800x_with_rtx_3080_1440p_155hz/