Just got a game with FSR. Gotta say, holy shit, it’s so good.

1 : Anonymous2021/08/21 17:20 ID: p8vegc

I'm stuck with a GTX 1070, and Chernobylite runs at 40-50ish fps at 1440p on ultra. With FSR on Ultra Quality, it runs at 60-70 and I can't tell the difference at all. Fucking amazing. DLSS/FSR/whatever Intel's is gonna be called is the future. Thank you AMD for making it available on all platforms. If Nvidia doesn't wake up and make a platform-agnostic version, every game is gonna have FSR and they're gonna be sore about it.

EDIT: I can see the difference between Quality and Ultra Quality FSR, but I really gotta pixel peep. And I get even more FPS and more stable frametimes. I'm going with Quality.

2 : Anonymous2021/08/21 18:38 ID: h9tg7h9

Only game that has it for me is Edge of Eternity. It works well on my 6800 though not really necessary. It’s better on my 1660ti laptop but still kind of limited. The best use I’ve seen out of it ironically, is on my Alpha R2 HTPC. I tried it with the unofficial injection into GTA V and can play it on max settings 1080p 60fps and it still looks good. EoE might be just limited to the GPU being relatively low end though.

ID: h9wd0uc

It has a lot of potential in VR games, as the entry barrier for VR itself keeps dropping but you still kinda want a decent desktop for it, if you don't want to be limited to something like the Quest's own hardware, that is.

3 : Anonymous2021/08/22 03:01 ID: h9v6xzp

Not sure why people are starting to think Nvidia is going to fall behind in ML...to Intel and AMD. It's just not going to happen. Nvidia is still years ahead in R&D.

ID: h9vzg3u

FSR is unrelated to ML. It has mathematical similarities, but there is no learning per se.
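For context: AMD's published FSR 1.0 source describes the upscaling pass (EASU) as an adaptive, elliptically weighted filter in the Lanczos family; no network inference is involved. As a rough illustration only (the real shader is adaptive and written in HLSL/GLSL, not Python), a plain, non-adaptive Lanczos-2 kernel and a 1-D resample using it look like this:

```python
import math

def lanczos2(x: float) -> float:
    """Lanczos kernel with a = 2: sinc(x) * sinc(x/2), zero outside |x| < 2."""
    if x == 0.0:
        return 1.0
    if abs(x) >= 2.0:
        return 0.0
    px = math.pi * x
    return (math.sin(px) / px) * (math.sin(px / 2.0) / (px / 2.0))

def resample_1d(samples: list[float], t: float) -> float:
    """Reconstruct a value at fractional position t from neighbouring samples."""
    i0 = int(math.floor(t))
    total = weight_sum = 0.0
    for i in range(i0 - 1, i0 + 3):  # the 4 taps inside the kernel's support
        if 0 <= i < len(samples):
            w = lanczos2(t - i)
            total += w * samples[i]
            weight_sum += w
    return total / weight_sum
```

What makes EASU more than a fixed Lanczos resize is that it reshapes the kernel per pixel based on local gradients, but there is no training or learned weighting anywhere in the pipeline.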

ID: h9x6m2t

2014: Not sure why people are starting to think Intel is going to fall behind in fab technology... to TSMC. It's just not going to happen. Intel is still years ahead in R&D.

2021:

ID: h9w12ee

I mean, the AMD tech shows that it might be possible to make the ML redundant. If you can achieve almost the same result without it, and without the extra cores, on the first iteration...

All that's required now is support in games.

ID: h9w7lag

OP does not have an RTX series card to run DLSS so that is not even an option. Nvidia keeping DLSS proprietary in the long run won't work out well for them.

ID: h9wf5sk

[deleted]

4 : Anonymous2021/08/21 18:31 ID: h9tfabq

Games with FSR significantly reduced my 6800 XT temperatures (8-10C) on maximum setting at 144 FPS.

ID: h9tyfdh

Almost like it's lowering your GPU usage. Oh wait... 🙂

ID: h9unlxv

Lowering the resolution output also reduces your temps lol. Not a useful comment imo, but I’m also glad you’re happy with it at the same time.

ID: h9vfdcy

Y is this downvoted

ID: h9uobi4

Why would I need to lower my resolution? I was simply noting that FSR games performed better thermal-wise for me.

FSR has a dual effect in most cases with compatible games: it improves the image over plain upscaling while reducing the demand on the GPU. Typically this provides the benefit of lower thermals. It's an often-overlooked benefit of FSR.

5 : Anonymous2021/08/21 20:02 ID: h9tr2zm

I still believe that FSR in its current state is fundamentally worse than DLSS, but they may improve it in the future. DLSS was trash in its 1.0 version, yet it's really good after they reworked it.

I'm hoping AMD can do the same.

ID: h9tu393

Temporal FSR is almost certainly already in the pipe

ID: h9tuqfr

I'm hoping so; more competition is always good.

6 : Anonymous2021/08/21 17:30 ID: h9t774n

FSR isn't going to replace DLSS; it won't even compete (it's just fundamentally too different, and too limited in its current form). XeSS on the other hand is promising (it seems very similar to DLSS, but open).

ID: h9t8a3i

I'd assume AMD ain't sitting still, so we ought to see them go a similar route with future FSR implementations. I mean, the current iteration being labeled 1.0 is an obvious hint.

ID: h9tb9vc

I actually wonder where FSR will go now that XeSS (supposedly open source) is there. I imagine they will just adopt and maybe adapt XeSS to future RDNA architecture.

ID: h9ty335

It'd make sense. The way I see it, the current version of FSR was only meant to close the DLSS-shaped gap, with something better coming in the future. But seeing as most gamers seem to think "turning sharpening up to 11 = better quality", I don't see why they would bother making a temporal/AI-driven version of FSR now; in the average gamer's mind, AMD already "won".

ID: h9unvy3

I disagree to an extent. Version 1.0 of software usually has the same inputs and outputs as later versions. Version 1.0 of FSR has no temporal information as an input, which is limiting long term. I honestly think they would rebrand and call it a different product if they started to utilize temporal information, but I could be wrong.

ID: h9tcxej

FSR isn't going to replace DLSS, won't even compete

You're right. DLSS can't compete with FSR on my GTX 1070.

ID: h9twjmp

Definitely makes me feel much better, since I'm "stuck" on a 1070 as well - but between FSR and the lack of titles that _really_ require a next-gen GPU, it's not a bad place to be after all.

ID: h9u6tzf

If Intel plays their cards right, they will make both FSR and DLSS irrelevant.

ID: h9u3txz

"Freesync isn't going to replace GSYNC" ....oh wait

ID: h9uh774

That's the "I believe everything YouTube tells me" opinion. It doesn't matter how differently they work, they are aiming at getting the same results.

FSR is simple and great, better than the original DLSS. AMD is still working on it and the next iteration will be even better.

ID: h9ujb8r

FSR will be in every console game next year. Since PS5 and Xbox make up most of the home gaming market, it will be super popular.

ID: h9u7qh8

Yes, it will compete with it. It's already close in 4K image quality, but it offers different benefits. DLSS is good at lower framerates, but I recently talked with someone about how it behaves at high refresh rates. DLSS has a fixed per-frame cost that doesn't scale well, so you won't see benefits as big as before at 150+ fps, while FSR will still boost the framerate similarly to before. So if you have a 240 Hz monitor, FSR is actually more beneficial.
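The fixed-cost argument can be made concrete with purely illustrative numbers (the 1.5 ms figure below is an assumption for the sake of the arithmetic, not a measured DLSS cost):

```python
def fps_after_overhead(base_fps: float, overhead_ms: float) -> float:
    """Framerate after adding a fixed per-frame post-processing cost."""
    frame_ms = 1000.0 / base_fps          # frame budget in milliseconds
    return 1000.0 / (frame_ms + overhead_ms)

# A fixed 1.5 ms pass costs ~9% of a 60 fps frame budget (16.7 ms)...
print(round(fps_after_overhead(60, 1.5)))   # -> 55
# ...but over a third of a 240 fps budget (4.2 ms), so the net gain shrinks.
print(round(fps_after_overhead(240, 1.5)))  # -> 176
```

The point: any upscaler with a roughly constant per-frame cost eats proportionally more of the frame budget as the base framerate rises, regardless of vendor.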

7 : Anonymous2021/08/22 04:25 ID: h9vgb0l

If nvidia don't wake up and make a platform agnostic version every game is gonna have FSR and they're gonna be sore about it.

What I find weird is how much everybody bashed DLSS in terms of picture quality but FSR (which objectively is worse) is constantly praised here. Personally I haven't tried FSR but if it's worse than DLSS, then no thanks.

ID: h9vznkz

Everyone bashed DLSS when it was in its early days, and rightfully so as it first wasn’t good at all. Version 2.0 is good though and I am not hearing anyone “bashing” DLSS anymore. It was just that DLSS 1.0 was bad.

ID: h9w42ah

I find it rather hilarious. With DLSS it was too blurry, ghosting, etc. "Literally unplayable". FSR is blurry as hell, can have the same ghosting issues etc "holy shit it's so good". No bias in this sub. No sir.

ID: h9vsl09

Maybe because it's available on all GPUs, so people with shitty laptops and older PCs can run games more smoothly. My friend can play Dota at a playable framerate now and doesn't have to upgrade.

ID: h9vphol

which objectively is worse

DLSS can be objectively better in regards to generating even more details than native, but then suffer from worse ghosting and screwing up effects like bloom or even removing them outright, leading to people preferring FSR overall.

Also, as we move to higher resolutions FSR would look even better against DLSS.

ID: h9wc2vi

The beauty of machine learning is, it will improve with time, in fact, ghosting and particle bugs have been fixed in the newest DLSS version, can't say the same about conventional solutions.

ID: h9w2l4p

I haven't tried either but every review and publication I've read/watched says dlss is better hands down. Plus machine learning is just getting started, just wait a couple generations when tensor cores are even more powerful and dlss is refined even more and I just don't think FSR would be able to keep up at all (in terms of quality)

8 : Anonymous2021/08/21 22:01 ID: h9u6foa

As convenient as FSR is, it's not the way forward. Dlss isn't either. Open sourcing an AI driven technique we can all use driver side is the end goal. Intel claims they're best of both worlds, but time will tell.

9 : Anonymous2021/08/21 18:10 ID: h9tchu1

DLSS/FSR/whatever intel's is gonna be called is the future.

Throwback to when this sub knew it was just a gimmick.

ID: h9tz8qz

Ray tracing is still considered a "gimmick" round these parts; the whole industry is wrong about RT because AMD happens to do worse than Nvidia in RT. It's quite hilarious.

ID: h9u45kz

I want AMD to take the lead in RT so people here actually start discussing it rather than pretending anything AMD sucks at is a gimmick.

ID: h9tip8a

Bro, you forgot what DLSS looked like. At release it WAS a gimmick and it stayed that way for a long time until 2.0.

And proprietary tech can never be the future since it can't be used by everyone.

DLSS likely won't be, whereas XeSS, or something better that AMD comes up with using open-source ML, has better chances of being the future. Unless Nvidia makes DLSS open source... Even then it's not guaranteed which implementation will become the next widely used standard. We'll see.

ID: h9tuo7f

And proprietary tech can never be the future since it can't be used by everyone.

Nvidia has 80% of the market, and every card they ever release will have support. It absolutely can be the future. It doesn't really matter what AMD does or releases, or in what fashion, because their presence in the discrete GPU market is almost nonexistent.

10 : Anonymous2021/08/22 09:13 ID: h9w38vf

Idk why you put FSR next to DLSS and Intel's ML upscaling; FSR is more likely to compete with TAAU and other traditional techniques. It cannot compete with ML.

11 : Anonymous2021/08/21 17:39 ID: h9t8dmv

Unfortunately too many nvidia fanboys will disagree with you here claiming that dlss is the future and it’s the only thing that matters. As we see with people paying x2 or x3 the msrp for those cards “because dlss, rtx and nvenc”, forgetting that dlss was useless in its first form also with the 20 series.

They were the first ones to implement such a technology, but I also believe that FSR or some other version of such, but not hardware limited, is the future. FSR is still at its infancy so it’s not even a competition to dlss, but if enough game developers go with it, or whatever intel will have to offer, we could see the mighty nvidia supremacy go down in flames.

Now, my little rant is over.

ID: h9toq7n

Unfortunately too many nvidia fanboys will disagree with you here claiming that dlss is the future and it’s the only thing that matters.

No, but I also don't agree with OP that FSR will take over DLSS. FSR is great as a simple solution for titles where nothing else is available, but as reconstruction techniques become more standardized (and they will), FSR is not gonna be the go-to solution for any given game. It has too many weaknesses.

ID: h9u3omk

claiming that dlss is the future

Because it is. There are now 3 of these: DLSS, XeSS and TAAU. They will be everywhere in a couple of years.

ID: h9um7fb

dlss

DLSS is the Nvidia proprietary version. Sure similar technologies will be the future, but if it stays vendor locked it will have the same fate as PhysX.

ID: h9t9hct

Wish I could agree with you, but it's extremely likely FSR is another piece of AMD software that was thrown into open source as an excuse for them not to do any more work on it, especially when we already know an AI-enabled version is on the way (RDNA 3 is going to have "Matrix Cores") to compete with XeSS and DLSS.

ID: h9tvycx

Yep, a lot of AMD software ends up in the junk pile to be forgotten. They can't even update their driver suite so that 5000-series processors get a correct upgrade route. FSR will be open source just so AMD doesn't have to work on it anymore or put in effort to meet expectations. Reminds me of Radeon Boost, which is in like 18 games and has been forgotten by their marketing and development teams.

ID: h9tghtx

Are those "matrix cores" confirmed to be in RDNA 3?

ID: h9ttnxu

If XeSS is comparable to DLSS in quality per performance gain and AMD comes up with anything that is locked to specific hardware, it will be dead on arrival.

13 : Anonymous2021/08/22 08:25 ID: h9w00jx

You can enable FSR in games that do not support it. On Linux you can use Proton 6.13 GE or newer, and on Windows there is something called Magpie. I can't vouch for Magpie, since I don't use Windows and the entire GitHub page of the program is in what looks like Chinese. But on Linux I can enable FSR in every game without any issues, and in some cases get better quality than a game's own implementation (I get a sharper image in RE Village with Proton FSR compared to the game's Ultra Quality FSR setting, all while upscaling from a lower resolution). I also hear that Magpie adds about 1 frame of lag, which is something you don't get with Proton on Linux. I personally wouldn't use Magpie if I were a Windows user, because of that lag and the fact that I can't even read the GitHub page, but I just want to make the point that, thanks to the open-source nature of FSR, we don't have to rely on devs to implement the technology in their games; we can have tools that inject it instead.
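For reference, this is roughly how the Proton route is enabled at the time of writing (the WINE_FULLSCREEN_FSR variables are a Proton-GE feature, so check your build's release notes): set the game to a fullscreen resolution below native, and add this to the game's Steam launch options:

```shell
# Proton-GE Steam launch options: upscale the game's fullscreen
# resolution to native using FSR.
# WINE_FULLSCREEN_FSR_STRENGTH is the sharpening strength (0-5, lower = sharper).
WINE_FULLSCREEN_FSR=1 WINE_FULLSCREEN_FSR_STRENGTH=2 %command%
```

Proton then applies FSR during the fullscreen upscale, which is why it works in games that never shipped their own implementation.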

14 : Anonymous2021/08/21 20:18 ID: h9tt4oo

On that note, you might be interested in this comparison between FSR and TAAU in Chernobylite:

https://www.reddit.com/comments/ow2wg/

15 : Anonymous2021/08/21 23:09 ID: h9uerwv

Is FSR mainstream yet?

16 : Anonymous2021/08/22 02:50 ID: h9v5mry

I hope that Intel supports FSR, and that both AMD and Intel improve it to destroy that proprietary garbage DLSS. I own an RTX card and DLSS is supported by none of the games I've played, meanwhile FSR is already gaining ground so fast.

17 : Anonymous2021/08/22 04:21 ID: h9vfv8z

Nice one! Does it work on older GPUs? (I have two laptops, one with a GeForce 610M 2 GB and the other with a 1050 Ti 4 GB.)

18 : Anonymous2021/08/22 17:17 ID: h9xiamm

Yes, it works on all GPUs that support DX11 or later (correct me if I'm wrong)

19 : Anonymous2021/08/22 17:18 ID: h9xiim6

Will be trying it once I get my hands on it.

20 : Anonymous2021/08/22 04:27 ID: h9vgjjr

What's the difference between FSR and Display Scaling?

21 : Anonymous2021/08/22 17:19 ID: h9xiks8

FSR uses AI to make the quality loss much less noticeable, to the point where lower display-scaling percentages are basically impossible to tell apart, especially at higher resolutions where the AI has more pixels to work with.

22 : Anonymous2021/08/22 18:36 ID: h9xtcgv

AI? Lol, nice koolaid

23 : Anonymous2021/08/22 04:49 ID: h9vinw0

This is so awesome to hear.

24 : Anonymous2021/08/22 05:00 ID: h9vjpzl

DLSS has gotten better since its launch, for sure. I remember Control looked like shiiit, and I decided to go 1440p low settings because it looked so much better. Not to mention I'm an fps guy; I don't really care much about the graphics.

25 : Anonymous2021/08/22 05:07 ID: h9vke2y

Since hardware can't be purchased, I think there's going to be a renaissance in software optimization again. There will have to be, if this drought lasts well into 2022.

26 : Anonymous2021/08/22 11:55 ID: h9wevrd

Man, I really wish it was a global thing, given that the games I play are older but still taxing.

27 : Anonymous2021/08/22 20:31 ID: h9y954i

I tested injecting into RDR2 using Magpie and didn't like what I saw compared to native. I was upscaling 1080 to 1440. If I run 1440 native windowed mode, I get a weird white bar at the bottom of the display and not much of any performance uplift, so kinda pointless. However, if I run Terminator Resistance at 5K FSR (Using Virtual Super Resolution) I can lock it at 140 FPS, and it looks stunning, even when downscaled to my 1440p display.

28 : Anonymous2021/08/21 23:59 ID: h9ukx8s

I tried it with an old GTX 1060 3GB, and there is hardly any point at all: I gain 10 fps from lowering the resolution, and then lose 7 fps from using FSR. Totally different story for my RX 6600 XT, though. FSR is made to run on AMD cards, and the performance loss is much smaller. It's also better on generally more powerful cards, because the fixed amount of performance it takes is a much smaller cut of the overall compute budget.

29 : Anonymous2021/08/22 07:19 ID: h9vvag8

FSR leverages FP16, so it understandably runs better on an AMD card at least as recent as Vega. Maybe Nvidia could do something at the driver level to better optimize FSR for their GPUs, but I doubt they care that much about the 10 series at this point, especially when it's an AMD tech we're talking about, and when they even neglected the 10 series for their own technologies like Integer Scaling or RTX Voice (initially), which were only supported on the 20 series despite the 10 series being perfectly capable of running them.

Nvidia can't add fast FP16 support to the 10 series, because that's a hardware capability, but they could try to optimize FSR to better utilize the strengths their GPUs do have. Not that they will, but they most likely could improve FSR performance at least a bit.
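To illustrate the FP16 point: half precision stores a value in 16 bits instead of 32, so hardware with packed FP16 math (Vega's "Rapid Packed Math" and later) can process two values per 32-bit lane, roughly doubling shader throughput, while Pascal runs FP16 no faster than FP32. Python's standard `struct` module can show the storage half of that story:

```python
import struct

# The same value packed as IEEE half precision ('e') and single precision ('f').
half = struct.pack('e', 0.5)
single = struct.pack('f', 0.5)
print(len(half), len(single))  # -> 2 4

# Half precision has only 10 mantissa bits, so values round coarsely:
roundtrip = struct.unpack('e', struct.pack('e', 0.1))[0]
print(abs(roundtrip - 0.1) > 1e-6)  # -> True: 0.1 is not exactly representable
```

The precision loss is why FP16 suits image filtering (where small rounding errors are invisible) far better than, say, position math.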

30 : Anonymous2021/08/22 02:13 ID: h9v16ey

FSR is made to run on AMD cards

Being something baked in software, the companies behind each brand are encouraged to tweak it so it can run on old GTX cards too, but it's up to Nvidia to fix that; AMD does its part.

31 : Anonymous2021/08/22 03:18 ID: h9v8v38

Nvidia can't tweak their old architecture enough, and they won't bother promoting their competitor's tech either. That's just AMD marketing speak.

People have already tested this, and disabled the features on AMD cards that make it run better, just to get a representation of why it performs worse on older Nvidia hardware. It's just a physical feature missing from older Nvidia hardware. Same reason Intel's XeSS will likely run poorly on Pascal GPUs.

32 : Anonymous2021/08/22 17:10 ID: h9xhc9v

meh... FSR

I'm more interested in Intel XeSS, looks very good and runs on all GPUs.

Source: https://www.reddit.com/r/Amd/comments/p8vegc/just_got_a_game_with_fsr_gotta_say_holy_shit_its/
