I recently acquired two cards at MSRP: an EVGA 3070 FTW3 and a ROG Strix 6700 XT (shout out to Memory Express in Canada).
My plan was to use them both for a couple of weeks, compare them, and decide which card I wanted to keep, knowing that reselling either wouldn't be difficult.
I used the 3070 for the past two weeks and was extremely happy with everything about the card. It looked great, ran cool and quiet, and kept my frames at 240 no matter the scenario in Dota 2 at 1440p on a 32-inch LG TN panel.
I just plugged in the 6700 XT last night. I immediately did a double and triple take. I was floored by the image. I couldn't put my finger on it, but the sharpness and color, both in games and in Windows, made me feel like I had gotten a new contact lens prescription and a new monitor at the same time. I didn't even know this was a thing until I hopped on Reddit to take a look.
Purely anecdotal evidence, but I came in with no idea that image quality would differ between brands, and out of the box, score a BIG win for AMD.
Reviewers make a huge deal about software support like DLSS and ray tracing, but the number one job of a graphics card is to produce an image, and seeing that one brand was clearly superior was shocking to me. I'm sure there are many variables and tweaks you can apply to an Nvidia setup to close this gap, but it made my choice of which card to keep an easy one. I'm surprised this isn't a bigger deal in the review landscape.
I have a 1060 and my gf has a 5500 XT. Her image looks better, but I thought that was just differences in monitors, or maybe image sharpening? But I've heard people talking about this long before AMD implemented RIS, so I have no clue.
I'm not sure either, but how things look out of the box is important. Might be worth swapping monitors to see, but I suspect the difference is in the cards.
Your monitor uses a TN panel. Most TN panels have banding issues and require dithering for smooth color transitions. AMD enables dithering by default; Nvidia requires a registry tweak.
AMD's driver defaults have slightly higher saturation than Nvidia's.
People are way overthinking this.
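To make the dithering point above concrete, here's a toy sketch (not what either driver actually does): a shade that falls between two displayable levels on a banding-prone 6-bit panel either snaps to the wrong band everywhere, or, with a little noise added first, averages out to roughly the true shade across neighbouring pixels.

```python
import random

LEVELS = 64                 # a 6-bit panel shows 64 shades per channel
STEP = 255 / (LEVELS - 1)   # spacing between displayable shades

def quantize(value):
    """Snap an 8-bit shade to the nearest displayable 6-bit level."""
    return round(round(value / STEP) * STEP)

def quantize_dithered(value, rng):
    """Add sub-step noise before snapping, so neighbouring pixels flip between
    the two adjacent levels and their average approximates the true shade."""
    noisy = min(255, max(0, value + rng.uniform(-STEP / 2, STEP / 2)))
    return round(round(noisy / STEP) * STEP)

rng = random.Random(0)
true_shade = 131.5                  # sits between two 6-bit levels
flat_area = [true_shade] * 1000     # a flat patch of that shade

banded = [quantize(v) for v in flat_area]
dithered = [quantize_dithered(v, rng) for v in flat_area]

err_banded = abs(sum(banded) / len(banded) - true_shade)
err_dithered = abs(sum(dithered) / len(dithered) - true_shade)
print(f"average error: banded={err_banded:.2f}, dithered={err_dithered:.2f}")
```

Without dithering every pixel lands on the same (wrong) level; with it, the patch averages much closer to the intended shade, which is why gradients stop banding.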
Time to try your monitor with her gpu
Washed-out colors and a blurry image on Nvidia: back in the day it was said that AMD had full RGB color support while Nvidia had only limited RGB that you needed to tweak manually, but that hasn't been the case since, I believe, the 1000 series. Another theory was that AMD gives you full 10-bit color on the card while Nvidia uses 8-bit and keeps 10-bit for their Quadro range or something. Another guess was that Nvidia has been using compression to save on memory, so maybe that. Maybe it's all of them combined. All I know is AMD GPUs put out a more pleasing image to my eyes, while Nvidia's looks like everything has had the color washed right out.
I have both a 3090 and a 6900. I notice no appreciable difference.
I had a 5600 XT and now I have a 2070 Super, same display, same cables, same everything. I see no difference either.
I think monitors are also a factor in what you might experience.
280X - Vega 64 - 2070 Super, on a Samsung PLS monitor and an AOC TN monitor: slight difference in colour, and slightly sharper on AMD. The image did feel more what I would call faded on the 2070S, but I never thought about it until then; something felt wrong.
For money reasons I had to sell the 2070s.
1660 Super - 6700 XT. Two MSI 271CQP monitors (DCI-P3 90%, sRGB 115%, reaching 400 nits).
The 1660S was using full range and tweaked settings with slight sharpening; the 6700 XT is on out-of-the-box settings, I just installed Adrenalin. One monitor runs HDMI, the other DisplayPort, same settings.
Colours have a much better range, the image became sharper, my monitors actually turn off as they're supposed to, and my idle PC actually goes into the sleep it's supposed to and instantly turns on when I move the mouse or press anything.
For pure gaming you'll be happy with both, but I do prefer the image AMD produces in videos and games.
I came in with no bias, completely ignorant that there might be a difference, and it was startling. But that was after two weeks of heavy use of the 3070. The differences might be more subtle comparing them side by side.
If you took screenshots from both, you could compare the raw pixel values, and also display the image from one card using the other card, just to see if the difference is in how it talks to the monitor.
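The comparison suggested above is easy to sketch. If both cards render identical framebuffers, screenshots from each should be pixel-identical, and any visible difference must be introduced later (output range, dithering, the monitor). Real screenshots would be loaded with an image library; here two tiny synthetic "screenshots" stand in, as 2D lists of (R, G, B) tuples.

```python
def pixel_diff(img_a, img_b):
    """Return a list of (x, y, pixel_a, pixel_b) for every differing pixel."""
    diffs = []
    for y, (row_a, row_b) in enumerate(zip(img_a, img_b)):
        for x, (pa, pb) in enumerate(zip(row_a, row_b)):
            if pa != pb:
                diffs.append((x, y, pa, pb))
    return diffs

# "Screenshot" from card A, and one from card B with a limited-range lift
# applied to a single black pixel to simulate a driver-side difference.
shot_a = [[(0, 0, 0), (255, 255, 255)],
          [(128, 64, 32), (10, 20, 30)]]
shot_b = [[(16, 16, 16), (255, 255, 255)],   # black lifted to 16: limited range
          [(128, 64, 32), (10, 20, 30)]]

for x, y, pa, pb in pixel_diff(shot_a, shot_b):
    print(f"pixel ({x},{y}) differs: card A={pa}, card B={pb}")
```

If the screenshots come back identical but the screen still looks different, the cause is downstream of rendering, which is exactly what the swap test would isolate.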
I had a similar experience switching from a 1080 to a 6900 XT. Something about the image from the AMD card is just better and I can't really put my finger on it. Obviously the performance is miles better in my case too, but even in low-motion scenes, like just walking around the city in Cyberpunk, the AMD card made it look more like the trailer than the actual gameplay I got from the 1080.
AMD doesn't compress textures the same way; the same image will require more bandwidth and a larger footprint in VRAM. This is measurable, but reviewers don't talk about it much. Nvidia says that memory compression doesn't affect image quality, so I guess this whole post is a figment of my imagination.
Scroll to the bottom, it started with Pascal and got more aggressive in Turing:
When you use lossless compression, you don't lose anything. That is what both AMD and Nvidia use in their VRAM compression.
These posts about image quality being different tend to come from users who don't install the proper color-calibrated profile for their monitor; after a driver swap, the resulting image might look wrong.
Or have another setting enabled/disabled that affects it.
There have been multiple articles written about this; at this point, with a properly calibrated and set-up monitor, you will have the exact same experience, at least on the desktop.
Obviously in-game, with DLSS etc., things will differ.
Could be the RGB setting (limited/full) or something similar.
Nvidia compresses colors to make up for sloppy optimization. That's why AMD cards have better IQ.
Image quality, or are they Mensa material?
Haha, IQ has been used for image quality for decades 😛
I am not an engineer, nor have I studied anything in the direction of GPU architectures. Some time ago someone explained it, and the essence is that AMD/Radeon manages the data a bit differently and doesn't compress the shit out of it. The result is a better picture.
Nvidia claims that it doesn't affect image quality, and because Nvidia would never tell a lie, you, sir, are a scoundrel.
I can't for the life of me find the damn video, but there was a diagnostic tool for HDMI output bandwidth that measured what was ACTUALLY being sent from the device to the display. The test was initially conducted to diagnose problems with cables and displays. Coincidentally, the person running it was testing set-top boxes and got curious about graphics cards. Suffice it to say, they discovered that the HDMI bitrate Nvidia's cards were spitting out to displays was lower than what AMD's cards were sending, which raised a few questions and also explained why some cables worked fine on Nvidia cards while having issues on AMD's. A 144 Hz display requires more data, but Nvidia somehow managed to send less while producing the "same" refresh rate at the same resolution, while AMD required more, so a bad cable would produce blanking, flickering, handshake issues, or even no signal at all (black screens). Replacing the cable with a better one would resolve AMD's issue.
Granted, it's merely one variable. But there is consistently a difference in colour and clarity between the two products, and always has been, since the days of the TNT and Nvidia's introduction of Digital Vibrance to combat their lack of, shall we say, colour accuracy. That Nvidia is stuck with a "slightly washed out appearance" is still a common statement from people today.
Colour profiles are irrelevant if nothing was set up to begin with. Simply attaching one card to the display and then the other (or even both at the same time, in the case of a high-end TV) and swapping between them, or doing a side-by-side, you can CLEARLY see there are differences, even at the desktop level.
Nvidia often opts for a limited range over HDMI, 16-235, rather than AMD's default 0-255. Manually changing this in the Nvidia Control Panel can and will help, but it still doesn't quite match the output AMD delivers.
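A quick sketch of what that range mismatch does: limited ("TV") range maps black to 16 and white to 235, so if the GPU sends limited-range values and the display expects full range, blacks look grey and whites look dim. Expanding limited to full is a linear stretch.

```python
def limited_to_full(v):
    """Map an 8-bit limited-range value (16..235) to full range (0..255)."""
    return max(0, min(255, round((v - 16) * 255 / 219)))

print(limited_to_full(16))    # limited black becomes true black (0)
print(limited_to_full(235))   # limited white becomes true white (255)
print(limited_to_full(128))   # mid grey stretches slightly upward
```

This is why a limited-range signal on a full-range display looks "washed out": everything between 16 and 235 is displayed as-is instead of being stretched across the panel's whole 0-255 range.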
It's rather strange, to be honest. It's certainly not placebo: enough customers who swap to or from one brand or the other make the same comments about the displayed colour differences, whether I call them or they call in.
I've sold an Nvidia user an AMD product and had them occasionally report that the display looks richer but they can't put their finger on it. In the opposite case, I sometimes get a complaint from a user who bought an Nvidia card to replace their AMD card about the display looking somewhat washed out (to which I point them to Digital Vibrance, and while it does help, it usually doesn't look right to them).
Meanwhile we see posts on even this subreddit from users asking where AMD's "Digital Vibrance" is, and I can't help but laugh a bit.
In terms of texture quality and such, AMD certainly tends to have the lead, and historically always has, not to mention other effects. Nvidia's often-touted frame rate leads can be traced back to things they've done in the past that were proven to be shortcuts or optimizations that impacted visuals either significantly (easily called out) or subtly, but fairly obviously once compared.
I owned a 3080 before getting a 6800 XT... I tried out DLSS and RT and then had the opportunity to compare it to the 6800 XT. Honestly, no matter how much calibrating I did on the Nvidia card, I could never get quite that pop that AMD's cards consistently deliver out of the box every time. I REFUSE to adjust my display's colours and settings to make a video card spit out a comparable image, as calibrating the display for Nvidia's cards buggers up the other devices and settings I may already have configured. I only had the card for about a week and a bit before I dumped it, as I mostly wanted to see how DLSS did, and personally, seeing the trash it produced on a 65" 4K display made it completely irrelevant and grossly overhyped. Maybe DLSS 3.0 might fix its glaring problems, but it's most definitely not a selling point for me. I'm perfectly happy with a 6800 XT spitting out 90-120 FPS at 4K (or higher) in most of the games I play.
Nvidia says DLSS is better than native, which is also a lie. Don't trust Nvidia.
Two cards at MSRP? Will you be taking part in the lottery next?
Lol, I know! I just phoned and they had gotten six of each in, and the champ actually held them for me until I could buy them.
Back in the day, the early 2000s, reviews used to rate 'image quality', and ATi regularly won out by a healthy margin. Rocking an Nvidia card usually meant more washed-out colors and everything looking a bit drab and not as sharp. I never understood why, but perhaps that's still the case? I've never done a back-to-back, but it wouldn't surprise me.
I mean, some people have TN monitors and think their colors and image look great. A LOT of people wouldn't be bothered by a slight lowering of image quality in exchange for higher performance.
This was evident 20 years ago, as well, believe it or not. It was plain to see even then. It goes all the way back to when 3dfx introduced FSAA to the industry--no one else had it, including nVidia. Right after 3dfx imploded, ATi came on the scene from literally nowhere with the darkhorse ArtX acquisition GPU, R300, and the GPU did FSAA & general image quality so much better than nVidia that of course nVidia criticized it heavily for a couple of years until it could begin to field competitive FSAA GPUs to compete with R300 and beyond. Despite nVidia's adamant protestations at the time, FSAA became a staple in the industry and is a checkbox feature today for all 3d GPUs, including nVidia's.
nVidia has always been a company absolutely obsessed with frame-rate benchmarks to the exclusion of the best image quality possible. Considering the number of people who aren't very demanding about their image quality, that approach has sold a ton of GPUs for nVidia. But you are right--and you aren't the first to have noticed what is often the stark difference between the competing families of GPUs.
If you care about your image quality, imo, and it's as important to you as frame rates, or even more important (that's me), you are going to go with AMD. DLSS is really sort of amusing to me...;) I game at 4k with a 5700XT (which I will be keeping until I can land a 6800XT at MSRP, and so that might be a long time to come...;)) Anyway, if I want "DLSS" I just drop the res to 1440P, pour on the FSAA to a desirable degree--and I'm there...;) Frame rates go up appreciably and the image quality is but a tad lower than native 4k. RayTracing? It is 100% optional in all of the tiny number of games that support it--and many people don't really understand that when you use ray-traced shadows and/or reflections in a given scene/frame, and even ray-trace a few object surfaces, maybe as much as 95% of everything else rendered on the screen is still rasterized, anyway! For instance--every object in every scene is rasterized--none are fully ray traced. None.
I don't have anything against DLSS or D3d ray tracing...but as you say, how many game reviewers ever bother to go that deep into the differences between D3d ray-tracing and the kind of ray-tracing special-effects companies rely on for movie production, in programs like Lightwave, for instance? I cannot think of a single one who really understands the enormous differences. The differences are vast. But that's OK--D3d RT is what it is. What's not OK is the idea that D3d RT'ing is a lot more than what it actually is...;)
So in the final analysis, all of these perceived differences boil down to marketing hyperbole--but you know, it's all in the eye of the beholder in terms of DLSS and D3d RT'ing. Shown the same exact frames, one guy will opine on how he has never seen anything quite so beautiful, while another will say, "Meh! I prefer to turn those features off, myself."
But in terms of general IQ--comparing the two GPUs on the same monitor in the same games with the same settings on the same desktop--it's about as clear as it gets! Yep, the win is definitely AMD's.
It's because AMD applies temporal dithering by default on everything, while Nvidia does not on Windows (it works on Linux with Nvidia).
The excuse Nvidia uses is "compression is OK as long as the graphics card runs faster," or something along those lines.
Search Google for nvidia/amd dithering and the same picture-quality-difference threads will pop up by the dozen.
I bet it's Radeon Image Sharpening, which is set to 80% by default. Try comparing with Nvidia's sharpening. For me, everything felt too grainy out of the box with RIS, so I lowered it to 40%.
RIS is off by default when you install a new GPU into your system. I just did this recently with the 6700 XT.
Boost, RIS, etc. you have to enable manually.
Nvidia compresses the shit out of the data as it's moved around the GPU; they have always claimed their compression is lossless. But I also recently used a 3070 for a week before selling it for a 6700 XT. The vibrant colors and crispness of the image are instantly noticeable.
There is something off about the texture filtering on the 3070; it's almost as if the AF cuts off earlier into the distance than on the 6700 XT, despite the same 16x setting.
No, it isn't the HDMI bug (which NV claims they fixed years ago, but it's still sometimes broken); I run over DP 1.4.
It was always known that AMD's image quality is just a lot better. It's because they don't use hardcore color compression like what happens on Nvidia.
I've gone crazy with review articles and videos for these products this generation. I just found it interesting that it's never mentioned.
Weird, I felt the same way recently: despite the 1070 Ti being far more powerful, the image was just plain better with the old 570. Please note I'm using a 1080p TV; games and movies were sharper and looked clearer.
Yes, I have multiple computers in my house with different generations of Nvidia and AMD graphics cards, and I've noticed this as well throughout the years. This is why I go with AMD cards in my main rig, since I don't do any high-FPS gaming with ray tracing. The image is more vibrant and the colors are richer.
It's like the colors are washed out on NVIDIA in comparison somehow.
However, I have deuteranomaly, a color blindness where red light bleeds too much into the green cones, so my opinion may be partly based on that.
I see red and green as clearly different colors; it's just some earthy hues where they start blending together. But I'm also less affected by it on AMD cards.
I've also had a long-standing fetish for high-end IPS monitors, and currently have 1x 40" 4K, 1x 42" 4K, and 2x 32" 165 Hz as my main monitors, plus an HP 34" ultrawide, all with high-end IPS panels, in my home office / home lab room.
So maybe my high-end IPS panel fetish and my selection of AMD cards have unconsciously been there to battle my less-than-stellar color sensitivity.
Yeah, I'm not sure why more well-known tech reviewers don't talk about these two GPUs having different image quality. I remember back when I had an R9 390 and later got a GTX 970, and I was shocked at how different they looked. I remember the R9 390 had better shadows, but the GTX 970 made colors pop more.
As for your two cards, isn't the 3070 pushing more FPS than your 6700 XT, though? Especially when OC'd? And now with Resizable BAR?
**edit: also, how did you buy through Memory Express? There are no checkout options?
Well, Dota 2 has a frame cap of 240 FPS and both cards were able to maintain that at 1440p max settings, so for me personally it was a wash.
As for Memory Express, I just happened to call the store in the morning and they had received stock that day. I never see stock on their website.
I too think that AMD manages to output a better-looking image in games. I had an RX 570 in my system for a couple of weeks while I lent a friend of mine my Titan, and I was actually shocked that the games I played looked so much more alive; I didn't do anything to the monitor. Only after getting my Titan back did I try to adjust my monitor to what I had experienced and gotten used to in those few weeks. I came close, but it never looked as good. Weird.
Though I have to mention, it literally took me a single day to readjust to my old GPU and its image quality. It's a difference for sure, but I wouldn't place it in the top three of my personal criteria for buying a GPU. Plus, I didn't feel like it did anything different in anything besides games.
Yes, this is true.
I bought a 3070 for my LG OLED CX with HDMI 2.1; the 3070 was the only 2.1 source when I got it. Image quality was OK; I had nothing to compare it to.
But I was always suspicious about the image quality of Nvidia cards. A few months later I managed to get a Sapphire 6800 XT, and when I swapped the cards, the difference in image quality was huge.
Note I did set RGB/Full on the 3070, but the 6800 XT out of the box just gave a better image, and you can really tell the difference on the OLED.
Though the Nvidia 3000 series HDMI 2.1 ports can do the full 48 Gbps, while the AMD 6000 series can only do 40 Gbps.
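For context on those port limits, here's a back-of-envelope estimate of the uncompressed video data rate: roughly width × height × refresh × bits-per-pixel, plus blanking overhead. The 20% overhead figure is an assumption for illustration; real numbers depend on the exact video timing, so treat this as a rough sketch.

```python
def data_rate_gbps(width, height, hz, bits_per_channel, overhead=1.2):
    """Estimate uncompressed RGB video data rate in Gbps.

    overhead is an assumed ~20% allowance for blanking intervals;
    actual figures vary with the timing standard in use.
    """
    bits_per_pixel = bits_per_channel * 3   # RGB, no chroma subsampling
    return width * height * hz * bits_per_pixel * overhead / 1e9

rate_4k120_10bit = data_rate_gbps(3840, 2160, 120, 10)
print(f"4K 120 Hz 10-bit RGB ~ {rate_4k120_10bit:.1f} Gbps")
```

By this estimate, 4K 120 Hz 10-bit RGB lands in the mid-30s of Gbps, which is why AMD's 40 Gbps ports still drive an OLED CX at full tilt and the 48 vs 40 Gbps gap rarely matters in practice.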
I scored a 6700 XT as well. I really wanted to give an AMD graphics card a try, and if not, I'd trade for a 3060 Ti/3070. But I have been thoroughly impressed with it. Coming from a 1660S I knew performance would be better, but like you said, there's just something about it. Glad to know I'm not the only one.
Same here. Jumped from a 3070 to a 6800XT Taichi. Performance and visual clarity overall improved. Very impressed !
Is your monitor connected via HDMI? Sounds like you were maybe running TV levels instead of PC levels.
DisplayPort for both.
This HDMI RGB range (16-235) bug was claimed fixed by NV drivers multiple times over the years. It's not related to that.
People have talked about this; some liked the color, some hated it. There isn't an official explanation for why the colors are a little different.
I found image sharpness to be different as well.
It's the texture filtering algorithm on AMD; it produces cleaner images, more obvious at higher resolutions.
On NV, to get the same quality you have to force it in the drivers: instead of the default, set texture filtering to High Quality.
This is something I have noticed for years, on the 980 Ti, the 1070, and recently the 3070.
Probably Radeon Image Sharpening turned on... turn it off and see if there's a difference.
You need to set the color range in the Nvidia drivers to Full; otherwise it comes capped out of the box, not sure why.
But yeah, out of the box, the AMD image has better color reproduction.
You need to set the color range on nvidia drivers to full
Where is this?
Nvidia Control Panel. I'm at work so I can't look up exactly where it is. I believe it's at the bottom of the resolution screen, where you can change the monitor's refresh rate; it's down there with a few other options for color and whatnot.
Vega 64 -> 3080, no difference. Something is up with your monitor.
I recently switched from a 5700 XT to a 2070 Super and haven't noticed any difference in image quality. I've heard about AMD having better image quality, but never saw it mentioned in reviews, though.
I noticed a difference when hooking up my brother's system with a 1070 to my second monitor (BenQ G2420HD). It looked bad compared to my Vega 56. Both systems were connected with HDMI and default settings.
I'm glad I am not the only one! I thought I was crazy. I got a 3080 and a 3070 and they both seemed washed out.
When I watch movies or shows on my PC I always tweak the settings, because I am a little color blind and enjoy more vivid colors. When I tried to adjust the settings with Nvidia, it was like the gamma was way off. For the life of me I couldn't get the settings the way I had them on my Vega 56.
I finally got a 6800 xt and the image quality is night and day to me. Seems that some here disagree with your findings but I am completely on board with it.
The only reason they would ever look different is if one was not set up properly.
Just going with the out of the box defaults.
Nvidia was GeForce full suite download and defaults and AMD the Adrenaline software defaults.
Check the colour settings in Nvidia. For example, if you use HDMI instead of DP, your colour will be limited instead of full by default. Also, I believe AMD has sharpening on by default now, while Nvidia doesn't.
50 comments and not a single screenshot or photo demonstrating the effect.
I said it was purely anecdotal evidence and I just wanted a discussion.
Change the Nvidia output setting to RGB Full; the default is limited because novideo is cheating benchmarks.
The last time I installed an AMD card image sharpening was on by default.
I haven't experienced this myself, but I've seen it brought up dozens of times over the years.
Switching between 6700 XT and 1660 Super looks the same to me. Is full RGB enabled on the NVIDIA card?
OP: Nvidia cards likely have Output Dynamic Range set to "Limited" instead of "Full". My 2070 Super was set to Limited when I first set up my PC and the colors looked washed out. You need to change it to "Full" in the Nvidia Control Panel.
They probably just have some gamma or black level setting automatically applied to make it look a certain way. You could 100% change your colour settings and get a similar image on the 3070.
This is total bunk. Sorry OP, but it reminds me of those 2006 hardware myths you'd see on forums.
You're onto my secret agenda. To come onto Reddit and make up stories.
Don't know what to say dude. It happened as described and I had no prior bias and no idea that cards could even look different.