DLSS Performance Overhead Causes FSR To Have Similar Performance At Higher Internal Resolutions (1440p vs 1620p)

1 : Anonymous2021/07/17 11:10 ID: om285n

FSR does have worse upsampling, that's a fact, but because it doesn't rely on machine learning it is much cheaper to run than DLSS, so you can use a higher internal resolution for near-equal or even greater performance. That means although the upsampling itself is worse, the result looks comparable thanks to the higher internal resolution. This advantage hasn't been compared at 1080p or 1440p in a side by side yet, though, so we will have to wait and see if it holds up. I know a post regarding the recent comparisons was already made here, but I wanted to share two things: first, this discussion, and second, the comparison slider images I made, which give a better view than a YouTube video.

Example: at one point in the video, at 4K with max quality settings for both upsampling solutions, FSR had 129 FPS while DLSS had 122 FPS.

4k Comparison Image 1

4k Comparison Image 2

Edit: FSR's internal resolution is 1662p, not 1620p, so it is 1662p vs 1440p. One renders at roughly 77% scale per axis, the other at 66.7%, so it seems FSR needs around a 10-percentage-point higher render scale to visually compete with DLSS (at 4K), but this is fine since FSR still performs similarly to it at that higher internal resolution.
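For reference, here's a quick sketch of where those two internal resolutions come from at a 2160p output, assuming the commonly cited per-axis scale factors for the FSR 1.0 and DLSS 2.x presets (treat the exact fractions as approximate):

```python
# Quick sketch: internal render heights at a 2160p output, assuming the
# commonly cited per-axis scale factors for FSR 1.0 and DLSS 2.x presets.
OUTPUT_HEIGHT = 2160

FSR_SCALE = {              # FSR 1.0 divides each axis by this factor
    "Ultra Quality": 1.3,
    "Quality": 1.5,
    "Balanced": 1.7,
    "Performance": 2.0,
}
DLSS_FRACTION = {          # DLSS renders at roughly this fraction per axis
    "Quality": 2 / 3,
    "Balanced": 0.58,
    "Performance": 0.5,
}

for mode, factor in FSR_SCALE.items():
    print(f"FSR  {mode:<13}: ~{round(OUTPUT_HEIGHT / factor)}p")
for mode, fraction in DLSS_FRACTION.items():
    print(f"DLSS {mode:<13}: ~{round(OUTPUT_HEIGHT * fraction)}p")

# FSR Ultra Quality works out to ~1662p and DLSS Quality to 1440p,
# which are the two internal resolutions compared in this thread.
```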

2 : Anonymous2021/07/17 17:15 ID: h5j99my

I have an RTX 3070, so no sour grapes over DLSS on my part, but I actually find FSR and DLSS in these to be identical, at least in these static images, from my normal viewing distance from my 1440p monitor (arm's length). I don't know what it looks like in motion and I'd have to play around with it in game, but wow, that's impressive.

ID: h5jogvc

Gotta get a game with both DLSS and FSR and have one feed into the other.

DLSS 480p—>1080p

FSR 1080p (from DLSS)—>2160p

It would be neat to see the resulting image quality.

ID: h5k4wu9

I would like to see that result as well... though I don't suggest anyone waste their time actually implementing it!

ID: h5k1v79

I don't know what it looks like in motion

Here's a timestamped link to the part of the video where they move around with both side by side

ID: h5k9jj5

I don't think the overall quality of this game can be used for such a comparison; the textures are muddy and low-res compared to what some games are achieving nowadays.

3 : Anonymous2021/07/17 21:17 ID: h5k2ezy

Nitpicking about FSR compared to DLSS 2.1 tells a really impressive story, doesn't it?

AMD did great with FSR. People keep zooming in 2-3x in comparisons to find the dirt.

Remember when DLSS 1.0 was out? Like you could see from miles away how shit it was. How much detail was lost in comparison to native...

Now DLSS 2.1 has improved heavily and AMD just released FSR... AND YOU CAN'T EVEN TELL A DIFFERENCE unless you're hard-comparing and zooming in.

That is a fucking huge achievement in my book. It works on all hardware, it is piss easy to implement, and gamers are loving it.

I can't wait to see improvements and additions in future FSR iterations, and to see how much AMD will up the game with its open-source upscaler.

But I am impressed. I was sceptical at first but this is huge.

ID: h5k9rvy

People keep zooming in 2-3x in comparisons to find the dirt.

I don't know, man. I mean, even nowadays most people are so blind they can't tell the difference between 30 and 60 fps, even less between 60 and 120 fps. And to me those people should be considered technically blind.

ID: h5kdc3w

Don't know what?

4 : Anonymous2021/07/17 13:39 ID: h5ija2g

I made a comparison at 3440x1440 with FSR on/off to complement this. I also had the frame counter activated so you can see what the performance increases on my RX 480 were.

ID: h5izfkd

Thanks for this, I am really interested in performance for older cards. People keep showing performance differences with 30-series cards; what is the point when they are getting above 60 FPS regardless?

ID: h5j1ki1

Maybe all of them have bought brand new 360Hz gaming monitors. 😉

ID: h5izxws

You've got a 3440x1440 monitor with an RX 480? You are a madlad, bro!

ID: h5j1fu3

Bought the monitor from the UK before Brexit for a price I couldn't say no to. Then I was hit by the current GPU craziness. For now my RX 480 8GB is on life support, with a new Ryzen 5800X keeping it going. I can actually play most games between 50-70 FPS on high/medium. The RX 480 is a trooper.

ID: h5jofel

I had a 290 with a monitor of the same resolution. Finally snagged a 5700 XT, but yeah, not much ran well at the native resolution, ha.

ID: h5j61wl

Is it just me, or is the 1440p shot quite blurry coming out of that little tunnel in the picture? Is it just ambient fog rolling around that was there for native but not present for FSR?

ID: h5kbqv5

The textures are just super blurry in general. Not much an upscaler can do when it's working with bad data.

5 : Anonymous2021/07/17 11:34 ID: h5i8daj

Worse... not really. I literally watched a video yesterday of a game with FSR and DLSS, and unless you zoom in to 3x no one can tell the difference.

Plus it's the first iteration, so for version 1.0 to be nearly indistinguishable from DLSS 2.1 is a massive achievement.

In a game you are not going to be analysing it anywhere near as much anyway.

And what about FSR 1.1 when it releases, or in another 12 months when 2.0 is available?

And it doesn't even need to be as good as DLSS, but it's already pretty damn close anyway.

ID: h5itd88

It depends on what display you are using. A 27" 4K monitor? Yeah no difference. But it should be noticeable on a 65" TV.

ID: h5ixdds

But would it be? You aren't pixel peeping on a TV. You're sitting probably 8+ feet away.

ID: h5iuj22

It may start to become noticeable on a 65" screen, but most gamers, even console gamers, don't play on a 65".

Consoles will always look worse than a PC regardless, so slightly worse graphics (it will still look better than lower-resolution native) is a fair trade for the extra performance that makes console gaming smoother.

I have a lot of newer AAA games that run above 50 FPS and sometimes hit the 60s on a 32" monitor; an extra 40% will make the whole experience a lot more enjoyable on my FreeSync monitor, smooth as butter, for an imperceptible difference. Even at 2.5x magnification there is a minimal amount of difference, so even on a 65" TV, FSR on will look better than native 1440p.

ID: h5ixydg

You can't really build on FSR. You can tweak things to visual preference, but there isn't much more you can do with it. It's basically an overhyped AA option added in.

I appreciate what it's doing for games, but FSR alone doesn't have the foundation and potential for improvements and tweaking like DLSS does.

ID: h5j3osr

And with what kind of experience and knowledge in the field of image processing are you saying that?

As someone with a bachelor's in computer science/applications, I believe that every solution in existence has room for improvement, no matter how "finished" it looks. AMD could improve FSR by adding a proper temporal reconstruction component to it. Sure, implementing that in video games would become harder, but on the other hand it would clearly be superior to what we have now. Then come all the other techniques that we still don't know about but that are being tested internally.

ID: h5j2j3u

You can't really say that; there is no law or reason why a software-based upscaling technique can't be as good as or better than a hardware-based one.

On its first iteration it's already 100x better than DLSS 1.0 was, so what reason would there be not to believe that at some point it will be better than DLSS 2.2?

6 : Anonymous2021/07/17 12:51 ID: h5ienpm

I mean DLSS also looks better, so I'm fine with that.

ID: h5jpo04

Eh... There are elements where DLSS is clearly superior (the chain in the first picture, for example, is completely artifacted with FSR, and the reflections are also worse). But there are some elements, like the text on signs, where I actually like FSR better. Overall DLSS is better, but not 100% of the time.

ID: h5igzj0

Maybe I'm crazy, but in those comparison shots I prefer FSR to DLSS. It's sharper and seems to have more detail. The text on the signs looks better.

ID: h5if6b8

I literally couldn't tell the difference besides DLSS being slightly softer; I don't really think either "looks better", they're virtually identical at 4K. At other resolutions like 1080p, though, I'm certain DLSS will upsample better. I mean, technically DLSS still upsamples better at 4K, FSR just had the advantage of a higher internal resolution, but I don't know if I'd want to lose some FPS for a virtually identical image.

So in this game at least 4k FSR > 1080p DLSS > 1440p???

Edit: Not sure what I said that was untrue or that people disagreed with; it was all pretty sensible. But the downvote button wasn't meant to be a disagree button.

ID: h5ihne0

I literally couldn't tell the difference besides DLSS being slightly softer; I don't really think either "looks better", they're virtually identical at 4K.

Believe me, you probably wouldn't be able to see the difference between FSR and a good old Lanczos filter either. Just goes to show how overrated native 4K is.

7 : Anonymous2021/07/17 11:59 ID: h5ia9hq

What game is that?

ID: h5ial5x

Necromunda: Hired Gun

ID: h5iiwt6

Necromunda. A bad one that I returned after playing with it for 45 minutes lol.

Edit: my opinion, folks. I like lots of games; I didn't return it for any reason other than that it's just not something I'd ever consider playing a lot. Seeing what it is and what kind of time I'd need to invest to make my guy "good", I just don't want to.

ID: h5iznaf

Downvotes are probably not because of your opinion about the game, but for not answering what game it is 😉

8 : Anonymous2021/07/17 11:24 ID: h5i7loz

Based on how FSR vs DLSS performs, I understand why DLSS only runs on RTX GPUs: with no tensor cores on GTX cards, the performance overhead would eat into the FPS gained from DLSS and could possibly make it even worse than native at certain quality presets. But I think it could work on GTX GPUs if the Quality option were disabled and Balanced or Performance were the highest presets available; obviously it wouldn't look as good as Quality, but it could run and it would help gamers, while still incentivizing them to upgrade to unlock higher presets and receive better gains from the tech. All the tensor cores do is accelerate it, so it should be possible to run without them. What do you think? Should NVIDIA give DLSS to GTX cards by limiting which quality options they can use? Or do you think the overhead would still be too much to get an FPS boost even at Performance or Ultra Performance mode? If I'm wrong in my assessment, I'd like to be educated.

ID: h5i9jvn

They easily could offer it to GTX cards, but the incentive isn't there. They offered ray tracing because it was so bad it could incentivize people to upgrade. DLSS was a way to sell gamers server GPUs while giving the tensor cores a use; otherwise they would just be permanently idle silicon on all of our cards. Now with FSR there is even less incentive for them to back-port DLSS. NVIDIA needs to double down, keep improving, and stay ahead of AMD. I do not think back-porting is the answer to that.

ID: h5iz5h4

They offered ray tracing because it was so bad it could incentivize people to upgrade.

I checked The Riftbreaker on my GTX 1070 at 3720x1080; if I set RT to max and use FSR in Performance mode I get 60 FPS - without FSR I get 40 FPS.

I would be really interested to see what FSR can do in more stressful games with RT. For example, Control with RT on with a 1070 is pretty much unplayable at sub-30 FPS, but with FSR perhaps I could play with RT on and get around 40 FPS.

Not allowing DLSS is Nvidia trying to force people to upgrade. To my understanding, CUDA cores could be used to handle DLSS, just as they can be used to do ray tracing. Of course not as well as tensor cores, but we will never know.

ID: h5iaz9w

The DLSS 2 (and onward) model wouldn't run fast enough without tensor cores to be worth the performance the shader units would give up from rasterizing.

None of the Ampere gaming GPUs share cores with any "server GPUs".

ID: h5is9zy

They offered ray tracing because it was so bad it could incentivize people to upgrade.

That was such a shitty little stunt by Nvidia. Peak Jensen moment

ID: h5j0ar5

The reason DLSS is as performant as it is is that Turing and Ampere GPUs are capable of executing on tensor cores and FP32 cores at the same time, so running the tensor cores on "RTX" GPUs is effectively free.

As interesting as it would be to see DLSS running on Pascal, if only from an educational standpoint, I suspect performance would go completely backwards.

ID: h5jf9i7

I wonder if we could get slightly worse, if not better, performance if the die space used for tensor cores were spent on more CUDA SMs instead.

9 : Anonymous2021/07/17 11:46 ID: h5i993w

FSR has no ghosting, end of story.

ID: h5ibqh9

No added ghosting.

FSR is dependent on the game's AA solution, so if they use a bad TAA implementation with ghosting, FSR will have it too.
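For context, here's a toy sketch of where FSR 1.0 sits in a typical post-process chain going by AMD's integration guidance; the stage names below are illustrative labels only, not a real rendering API:

```python
# Toy sketch of the stage ordering AMD recommends for FSR 1.0: the upscaler
# runs on a frame that has already been anti-aliased and tone-mapped, so any
# ghosting from the game's own TAA is in the image before FSR ever sees it.
# Stage names are illustrative only, not a real rendering API.

PIPELINE = [
    "render scene at internal resolution",
    "game's TAA (any ghosting happens here)",
    "tone mapping",
    "FSR EASU: spatial upscale to output resolution",
    "FSR RCAS: contrast-adaptive sharpening",
    "film grain / noise",
    "HUD and UI",
]

if __name__ == "__main__":
    for step_number, stage in enumerate(PIPELINE, start=1):
        print(f"{step_number}. {stage}")
```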

ID: h5iljhw

AMD themselves say that you need AA to get good results at all; the upscaling algorithm will make aliasing stronger otherwise.

ID: h5jubxa

Tons of games have ghosting anyway, since TAA often results in ghosting.

ID: h5iyksv

Neither does DLSS 2.2. If you are going to ignore the latest improvements to DLSS, you might as well pretend it's still 2018 and say it looks like shit, like many others here do.

ID: h5jdjsd

This is a blatant lie. 2.2.10 still ghosts, and it's terrible in fast-motion games. Sure, it's far less than it was in pre-2.2 builds, but to say ghosting doesn't exist is a blatant lie.

I do think the Ultra Quality mode coming to DLSS soon sounds neat, because it may actually reduce this further, but as of right now DLSS doesn't do well in FPS games. It also adds a lot of input lag in many engines, like in CoD.

Source: https://www.reddit.com/r/Amd/comments/om285n/dlss_performance_overhead_causes_fsr_to_have/
