I have to say I am fairly pleased with this new release from AMD, though I do have some recency bias from not having upgraded for four years. I'll post my specs below and detail a few observations I had.
AMD Ryzen 5600X
AMD Radeon 6700XT
G.Skill Ripjaws CL16 DDR4 32GB 3600MHz (and it actually runs at 3600MHz)
MSI B550-A PRO (SAM enabled)
Corsair RM750X PSU
At first glance, I noticed the GPU idling at 50C, but after further research I learned that this is normal because of the Zero RPM fan feature, which can be toggled on and off. Under full load it never exceeds 77C and never seems to have any issues loading textures or handling voltage changes. Smart Access Memory appears to be hit-or-miss, meaning it really does depend on the individual game whether it will give you a framerate increase. As for the sound, it really is a decently quiet card. In fact, my crappy AMD stock CPU heatsink was the number one contributor to fan noise.
I've run several games on it, and of course results depend on game variables such as CPU dependency, graphics intensity, and background processes, but at 1440p on ULTRA settings it gets around 115FPS. Comparing against specific benchmarks available on the web, I noticed that I get around 10FPS less than what they report. Not a huge margin of error for the average gamer with a 144Hz screen.
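To put that 10FPS gap in perspective (assuming the published benchmarks average around 125FPS, i.e. my 115 plus the 10 I'm missing):

```python
my_fps = 115
benchmark_fps = my_fps + 10  # roughly what the web benchmarks report
deficit = (benchmark_fps - my_fps) / benchmark_fps
print(f"{deficit:.0%} slower than the published numbers")  # -> 8%
```

A single-digit percentage deficit is well within the spread you'd expect from different CPUs, drivers, and background processes.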
I was pleasantly surprised with AMD's Radeon Software "Adrenalin 2020 Edition". Coming from an NVIDIA platform, the user interface was simple but had far more options and features than GeForce Experience. You can adjust clock speeds, control streaming and recording options, optimize game settings, see FPS averages, and monitor your metrics in real time. It really does the job of many programs and provides them in a single package. NOTE: I do wish they provided CPU temp metrics.
Damn, your specs are real nice! One thing I have to ask though: where did you get that 6700XT from?
Straight from AMD Direct. I have to say it was 90% luck and 10% effort. I was on the webpage about 10 minutes early and kept refreshing until the 9AM EST drop. However, they released it a minute late. Once I saw it appear I tried adding it to my cart. That didn't work; I kept refreshing, sometimes getting 503s, and after a minute it was added.
Then came the task of entering my shipping and billing details. Same process: constant refreshing and 503 errors for about a minute and a half. All the while I'm sweating profusely.
Once I got past billing came the order submission. Same process, and I made it out ALIVE!
This is the way
it was 90% luck and 10% effort
And 100% reason to remember the name?
Same thing happened to me. The button became available at 9:03, but I was nevertheless able to grab one as well.
There is a way..! X)
How do you figure out when the drops are?
It's essentially an RDNA 2 refresh of the 5700XT; it's slightly faster and more efficient. Not sure it's worth its MSRP, however, especially the reference versions.
It's a good card if you don't have a GPU, but for anyone wanting to upgrade from a 5700XT it's terrible value at its current prices.
I will agree with you, but I'll note my recency bias lol. However, it may be 2023 before we see regular availability.
Yeah, if you can find a GPU right now, grab it while you can, but under normal circumstances this card would be collecting dust on store shelves at its current MSRP.
I wouldn't have bought one if I didn't sell my 5700xt.
I was able to purchase a 6700XT reference for $479 from Microcenter and sold my 5700XT to a miner for $800, so I literally made money from upgrading. While I don't think it's worth the money, in this particular market it was worth it for me.
I took the upgrade because I could do it for "free".
Although, I can't believe I paid MSRP for a ROG Strix 6700XT. (Basically MSRP for a 3080/6800XT).
Still much, much less than what I find 3070/3060Ti/6700XTs at where I'm at... (~$1500)
It's not terrible value if you can sell your 5700XT for more than a new 6700XT costs at AMD (check second-hand prices in your area; they're absolutely insane here). The hard part is ordering one.
I got crazy lucky last year. My old 1050 rig died in August and I replaced it with a whole new Red Devil 5700XT build. Had I not done that and waited for the 6800XT, as originally planned, I'd have lost my mind completely by now.
Let’s be honest. Unless you have money burning a hole in your pocket, the performance jump in a single generation is never sufficient to justify the upgrade.
The 6700XT is just a hair shy of the GPU in the Xbox Series X. The OP refreshed their system for the first time in four years. Realistically, this card will last at least as long.
Very true. Although, with the current miner market being so insane, the mining capability of the 5700 XT has it selling for 3x its MSRP. So, if you can find a 6700 XT, you can theoretically upgrade, sell your 5700 XT, and still pocket money.
what were you using before?
NVIDIA GTX 1060
not bad for the time lol
NVIDIA GTX 1060
It feels like this was the most common card for that period. It's like everyone who upgraded in the last couple of years started with the 1060.
How does it work with the latest Linux?
Can't comment on the 6700XT specifically, but the 6800XT works well in Fedora. I did install from a respin instead of the original media, just to be sure I wouldn't have any Mesa or kernel issues that have since been fixed.
Wayland worked, but I've since had to drop back to Xorg (ugh) because DRM leasing isn't in Wayland yet.
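For anyone else checking their setup before troubleshooting, a quick sketch of how to confirm the kernel you're running and whether the amdgpu driver actually loaded (reads the standard sysfs module path on Linux; on other systems it will simply report the module as not loaded):

```python
import pathlib
import platform

# Kernel release: RDNA2 cards need a reasonably recent kernel for amdgpu support,
# which is why installing from an updated respin helps.
print("kernel:", platform.release())

# A loaded kernel module shows up as a directory under /sys/module.
amdgpu_loaded = pathlib.Path("/sys/module/amdgpu").exists()
print("amdgpu loaded:", amdgpu_loaded)
```

If the module shows as loaded but the desktop still misbehaves, the Mesa userspace side is the next thing to check.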
Honestly, have not tested that. Sorry!
I just got my AMD reference 6700XT yesterday as well. Also loving it; somehow, and I don't know how, I got the #1 spot on 3DMark Time Spy for a 6700XT paired with a Ryzen 3600.
Super impressed with it so far.
I've owned both new-gen AMD and NVIDIA cards and agree that Radeon Software is sooo much more user friendly. Navigating the NVIDIA Control Panel takes me back to 1967.
Each time I've installed an NVIDIA GPU, my game looks like shit due to weird-ass settings. The AMD GPUs were basically plug and play.
It's a world of difference; up until yesterday I was ignorant. Now I'm asking myself, did I live under a rock? Why didn't anyone tell me about Radeon Software!!
Very user friendly!!!!
Except the part where they tell you you need a 3950X or 5950X for maximum Among Us performance.
Weird. Never had any problem with my NVIDIA GPU. Maybe it was some game settings that made it look weird instead of the GPU itself? Maybe you hadn't installed the correct drivers?
I would download the latest drivers and use default settings. It would take a little tweaking, such as choosing "let the 3D application decide" or whatever.
Don't get me wrong though, NVIDIA kills it with their tech but in terms of plug n play and UI, AMD wins.
That's really good to hear! Mine should be out for delivery any time now and I'm stoked.
I'm psyched for you!
Very nice, but I'm waiting to see what Intel comes up with, since the latest leaks are making their high-end cards look like a real monster. Just look it up. 🙂
Wish I could buy one...
Sending my love your way. I know how it be.
This is good to see. I have a 6700xt (XFX Reference so same one essentially) sitting on my desk, and a 5600x arriving Sunday. The benchmarks seem to show it as a strong pairing.
Kind of an impulse buy as I already have a gaming laptop (10750h/1660ti Legion). Only have the laptop though because I sold my desktop in anticipation of all the new releases, and my old laptop died a week later... So now I have a 'pretty good' laptop, but it just doesn't hold a candle to a decent desktop.
The software is basically GeForce Experience + Afterburner + NVIDIA Control Panel all in one, and I kinda miss it now that I've switched to a 3070. It's a bit buggy sometimes, but it's really amazing nonetheless.
What FPS do you get in RDR2 at max settings?
65FPS at 1440p
That’s pretty good. I’m getting around 80 with some dips into the low 70s because I set resolution scaling to 1.5. It looks a lot better with that setting. Hair and animal fur look scuffed without it. And the stars at night look incredible with that setting.
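For reference, assuming the slider multiplies each axis (which is how most games implement resolution scaling), a 1.5 scale at 1440p means rendering internally at 4K:

```python
native_w, native_h = 2560, 1440   # 1440p display
scale = 1.5                       # in-game resolution scaling

# The game renders each axis at 1.5x, then downsamples to the native display.
render_w, render_h = int(native_w * scale), int(native_h * scale)
pixel_cost = (render_w * render_h) / (native_w * native_h)

print(f"render resolution: {render_w}x{render_h}")  # 3840x2160, i.e. 4K
print(f"pixel cost per frame: {pixel_cost:.2f}x")   # 2.25x as many pixels
```

Pushing 2.25x the pixels per frame would explain both the dips into the low 70s and why fine detail like hair, fur, and the night sky cleans up so much.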
Pretty similar build you got there; I'm also rocking a 5600X and 32GB of 3600MHz with a 750 Watt Corsair PSU.
Hell yeah dude, I'm rocking a 6800XT, 5600X, and MSI MAG X570 with 32GB 3600 Corsair Vengeance Pro. My games are butter smooth now.
Rocking a 5800X with a 6700XT. Congrats!!!
Re the 10 FPS: if you've turned virtualization on, that can cause FPS loss in games if Windows has automatically turned on Memory Integrity in the Core Isolation section of the Windows Security settings. Turning on Hyper-V in Windows has the same effect.
Got my 6700 XT Nitro+ just recently, up from my 5700 XT Nitro+ (basically swapped them), and I'm very pleased with it! It actually runs about 10C cooler than my old card while being quieter; both cards were/are running undervolted, this one at -100mV. Only thing is I get a fair bit of coil whine when gaming, which is a shame :/ Though I wear headphones, so I can't hear it.
I haven't experienced any coil whine; however, I have read that it can go away after some use.
GeForce Experience is crap. The actual options are in the NVIDIA Control Panel, which hasn't been improved in like a decade. On the one hand I wish they'd merge these. On the other, that would put things like antialiasing behind a password and login, which I also hate.
One thing I'd suggest on the 5600X is to upgrade to a cheap air cooler like the Gammaxx V2 or a similar cooler in the ~$25 range. Just doing that upped my friend's boost clocks, as the stock one under really heavy load was tiptoeing right at its Tjmax.
I was able to snag one too. Still hoping to get a 3080, but if not within the next week or so I'm just going to give up (been trying for months) and use the 6700XT. Thanks for reporting the positives; I was getting nervous that I would be disappointed, but I too am coming from not having upgraded in years, so it should be good 🙂