I know it isn't as good as DLSS, but it's still a great performance boost. Unlike Nvidia, AMD is all for its gamers. They know people can't get the new GPUs and have to stick with their old ones, or buy the old ones for now. They didn't gatekeep this performance boost to their new GPUs, unlike Nvidia. This is why I love AMD.
The motivation is developer adoption. Invest in open source to encourage widespread developer adoption, tailor their hardware to the (hopefully) evolving software, then make money. They don't need to beat DLSS, just be good enough that developers adopt it in place of, or alongside, DLSS.ID: h07d8lj
I know people shit on stuff like AotS, but we need more things like that.
We need a game that does a good implementation of both DLSS and FSR (bonus if it has something like Epic's TAAU implementation), so we can compare them apples to apples.
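If such a side-by-side comparison ever happens, it needs an objective yardstick alongside screenshots. A simple, vendor-neutral starting point is PSNR of the upscaled frame against a native-resolution render of the same scene; real image-quality evaluations usually add perceptual metrics like SSIM or VMAF on top. A minimal sketch (the function name and API are mine, not from any vendor SDK):

```python
import numpy as np

def psnr(reference: np.ndarray, upscaled: np.ndarray, peak: float = 1.0) -> float:
    """Peak signal-to-noise ratio (in dB) between a native-resolution
    reference frame and an upscaler's output. Higher means closer to
    the reference; identical frames give infinity."""
    diff = reference.astype(np.float64) - upscaled.astype(np.float64)
    mse = np.mean(diff ** 2)  # mean squared per-pixel error
    if mse == 0.0:
        return float("inf")
    return 10.0 * np.log10(peak ** 2 / mse)
```

One caveat worth stating: PSNR is a crude proxy for perceived quality (a temporally unstable image can score well on stills), which is exactly why moving-image comparisons of DLSS and FSR matter.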
Forums like this often become a hellscape of negativity, with people who will endlessly chant about how all corporations/companies are evil and only want money, and will screw over anyone and anything to get it.
It's based in some truth: money as a motivator. But it's disingenuous to assume that all companies are equivalent in their disdain for their own customer base.
AMD is not Nvidia. Neither of them is Intel.
They are all motivated by money, yes, but that doesn't mean they can't have their own principles, or recognize the fact that they can make money (their primary goal) through fostering positive relationships with their consumers. Sometimes, quite often I believe, goodwill, customer loyalty, etc. will trump screwing over customers.
Just look at AMD's history of supporting open-source solutions, compared to Nvidia's 'black boxes' of code that comprise GameWorks.
Freesync vs. Gsync.
All AMD CPUs come unlocked, compared to Intel, where you have to pay a premium for that.
Now FSR is another example.
AMD has done well and earned respect, IMO. They're not perfect, far from it, and they deserve criticism for some decisions. Overall, however, they're miles apart from the practices of Nvidia and Intel.ID: h08v88m
Freesync was them just relabeling the VESA standard Adaptive Sync. Much like how they conjured up "SAM" when Resizable BAR was already there. Neither was developed by AMD; they just came up with a marketing catchphrase to make it appear that they did. lolID: h089nu7
So freesync vs g-sync is kinda like FSR vs DLSS.
Nvidia did it first, and locks it behind a hardware certification that ensures any monitor shipping it delivers the same experience (no flickering, adaptive sync with LFC). AMD releases an open but inferior product with no real requirements as to whether there is flickering or anything... BUT everyone can use it! So it definitely has way more widespread adoption!
I think FSR will be the same thing eventually. Nvidia released and perfected a product, AMD plays catch up and releases an inferior (currently) product in FSR but everyone can use it. It should have more widespread adoption eventually.
In those cases it's like "Nvidia does it right if you're willing to pay, AMD does it not quite as right but for everyone". I think it's good to have both approaches.
I would love to see both available, to be honest. Add them both into game engines and such! Let them both push each other further!ID: h08atnv
That's a pretty fair comparison.
Only thing I'd point out is that DLSS, when it released, was far from perfect. They had to iterate on it to get it where it is now, and it's very good now, though I'd argue it's hardly perfect. People talk about DLSS taking 1080p up to 4K, and I've seen that and been left confused. It's certainly good, but not as close to native 4K as some people would have you believe. 1440p to 4K is much better, obviously. I could game on that. Upscaling from 1080p? Less so.
But I digress.
We don't know a lot about FSR yet, but the images released so far do indeed show less impressive results than DLSS. Let's see how it improves, and how long it takes to get there.
Edit: I should also say I've had freesync on two different monitors over the past 3 years and never had a problem, flickering or otherwise, for what it's worth.ID: h07u4mp
How much did AMD pay you to defend them, bro? If AMD did what you're saying, where is the 5600, and why isn't the 5600X priced cheaper, like the 2600/3600? Bro, AMD/Intel/Nvidia are corporations and their sole objective is making money; they don't care about being the good guy or anything.ID: h07xyiq
They paid me zero dollars, but just today they offered a vendor-agnostic upscaling technique similar to DLSS. So, they paid me that much?
Making money/profit is not mutually exclusive with wanting to foster goodwill with customers. So no, making money is not their "sole objective". There is a difference between "sole objective" and "primary objective".
Making money is absolutely what they want to do. It doesn't mean they can't also want to have happy customers.
They're a company, yes, but they're also made up of people. Most of whom, I'd wager, are computing enthusiasts of some kind or another.
Remember, threadripper exists because people at AMD did it on the side cuz they thought it'd be cool and then higher ups thought it was also cool and could simultaneously make them profit. But it wasn't conceived of as a mere money-maker. Just an example there of how there can be multiple factors behind the things a company does beyond just "making money".
As for your question, "where is the 5600?", and the other one about the pricing of the 5600X: perhaps their wafer supply only merited certain SKUs this year. They're also pumping out consoles, mind you.
As for price, who can say? I don't have access to the unit costs for all that goes into a chip, but it's not hard to assume that maybe it's more expensive to make a 5600x than a 3600x? There's so many factors that I could spitball and probably loads more you or I have no clue about.
Maybe these chips are overpriced. I straight up said that AMD is not beyond criticism and they deserve it sometimes.
But no one, and no thing is black/white. No company is all pure evil. No company is all pure altruistic. Some are better than others though, and AMD has made some great moves for the PC enthusiast community as a whole.
Were they motivated by money? Absolutely.
Does that change that they did cool, good things that they didn't have to? No.ID: h07y2z5
You’ve used your bro quota up for the year, sorry!ID: h08x6ft
They're not evil, obviously, but I don't think they adopt open solutions for the sake of goodwill. The most obvious example is their adoption of OpenCL, which they never improved and essentially abandoned after Apple went to Metal. Now Nvidia owns most of the 3D rendering market because they built their own CUDA/OptiX solutions that run circles around AMD, whose OpenCL driver is far behind and somewhat broken on RDNA2. People generally dislike Nvidia's walled garden, but they don't really have a choice when the alternative is buggy and slower.
Freesync is also a similar example. Freesync is AMD's port of a VESA standard called adaptive sync and they really have no choice but to leave it open as it leverages VESA displayport capabilities. AMD's proprietary features in freesync 2 and freesync pro require AMD gpus and will fall back to basic freesync if you don't have an AMD gpu.
Nobody is giving out a free lunch, intel and nvidia are just bigger dicks about it since they owned the market.
We don't actually know how good it is. That they even support older Nvidia cards is a pretty cool move, and I hope that FSR can deliver.ID: h07bb9q
Yeah, from the showcases it seems pretty good. We will have to see once reviewers get their hands on itID: h07gyd1
The 1060 showcase is DLSS 1.0 level, or even worse, to be honest.ID: h08guw8
I don't actually care all that much what "reviewers" say about most anything... I care a lot more about what I think when I can get my hands on things...;) "Hands on" is the only way to fly. It's going to come with certain games, and the upcoming drivers will be supporting it for free this month, so people will be able to try it for themselves soon. I will be interested to see what the differences, if any, will be compared to dropping resolution while pouring on the FSAA. Since I now game at 4K, VSR doesn't do me much good anymore, as 5120x2880 is the only VSR resolution I can use internally (5700XT), unless I create one. VSR was great when I was still at 1080p or 1200p native res, but at 2160p natively, 2880p slows me down more than it improves the image quality. Anyway, it should be fun...;)
It's not "For Gamers", it's strategic. If you release something that only works on your hardware, at 30% or so of the market, the uptake won't be that good, even if everything else about it is amazing.
If it works on all hardware, then devs will want to implement it, since it's a feature you can tout for users of both new and old hardware, from all vendors.
If it works well to increase performance and fidelity, then you even get to grow your TAM to include people that otherwise might not have been able to play your game due to performance.
Additionally, it has to work on all hardware, since Nvidia will pay to implement gameworks features. As a company with less disposable income, you need to be able to convince devs to implement it even without paying them.
I like AMD and all but be realistic here.ID: h07dr13
They've done this twice in the past, with Freesync and most recently with resizable BAR. This has led to cheaper monitors, and to CPUs that can take advantage of the graphics card's memory. They didn't charge for either, unlike Nvidia, who makes people pay for their tech like G-Sync, which adds around 200 dollars to the price of a monitor.
I'm not sure your statement is true, given their past history.ID: h07fgzt
With Freesync, they were, again, behind Nvidia who launched GSync first. Not charging allowed them to get it in place in a large part of the monitor market.
It gave you incentive to move to AMD. You could either a) buy a (relatively) inexpensive monitor with Freesync, and have extra incentive to buy AMD, or b) buy an expensive monitor with G-Sync.
Nvidia moving over to support freesync after the fact was just a bonus that made them look good.
Resizable BAR has been part of the PCIe spec for some time (you may have seen it as "Above 4G decoding"). AMD just branded it as SAM, and may have added some parts to their driver to help with it, since it seems they benefit more than Nvidia.
We should probably stop labeling AMD "good" or "bad". It's a strategic move by a for-profit company.ID: h07v7mh
If all you focus on is "strategy implemented", you ignore the primary target: consumers. The "good" vs. "bad" is still a critical part of the equation, and simply casting it aside is frankly stupid, especially regarding current and potential future impact, which may be "profitable" for the company but greatly detrimental to the consumer.
What most consider good is open standards, universal support, and not requiring any kind of unnecessary proprietary hardware to make use of a feature. There are copious ways this is "good" and a few ways it can be "bad", but overall it tends to hugely favor the consumer, while being pretty much neutral for the company.
Ignoring a company's consumer-friendly practices is a foul mindset to have as a consumer; it's best to reward and acknowledge a company's moves to do what is in the best interest of the consumer and less so its own. Though AMD is certainly having to side with the consumer, since they are still arguably the serious underdog. As much as people despise that term and its use regarding AMD, it's the word that fits at the moment. Fundamentally, it's best to support companies that are making big wins for the consumer, not the anti-consumer ones. So many seem adamant about downplaying what AMD has often been doing for several years now, while praising the anti-competitive/monopolistic companies.
I'm always dubious of those who make such utterances as yours.ID: h080h2e
Bottom line, it will always boil down to making shareholders happy. Assuming a company is being moral will lead to greater disappointment, just as we saw with the price hikes of Zen 2 and later Zen 3 as AMD sought higher margins, getting labeled anti-consumer or otherwise "bad". AMD's strategic moves as the underdog will not always translate to what they'll do if they own even half of the respective market.
No company is “good” or “bad”, lol. They are all corporations trying to make a profit. If AMD was “good”, why didn't they make the 5600X cheaper (equal to the 2600/3600 pricing)?ID: h07ifat
Yeah, all these companies are just out here to take as much market share as they possibly can; neither AMD nor Nvidia is the consumer's best friend. They are both multibillion-dollar companies that are only interested in taking your money in exchange for their product.
AMD, with their abysmal market share of 25%, has no strategic choice but to take the open-source approach. It is a strategic move, no more than that. If they opt for proprietary tech, there is a bigger chance that it will evaporate like alcohol. They are gunning for adoption, but Nvidia is no fool like Intel.
AMD, with their dominance in consumer CPUs, is starting to become Intel as well.
There is no good or bad company; it's all strategy. If they are on top, they will use that leverage as well.
A good example is SAM. They implemented it only for specific CPUs and motherboards. When Nvidia announced that it would cater to both AMD and Intel CPUs, AMD made their move. Another is FidelityFX CAS and sharpening, initially only available for the 5700 series. After the outcry from people, they enabled it for Vega as well.
Bottom line, no company is good. They are after lightening your wallet using whatever leverage they have.ID: h07o752
Poor dude is salty at his GSYNC investments, and how a puny 25% market share company made his inventory obsolete with Freesync.
They ain't "good"; they're trying to get their tech adopted as the industry standard, which would decrease Nvidia's competitiveness. It's all about competing. If AMD were the ones with DLSS, they would not bring it to competitors.
Nvidia innovates with DLSS, RTX, G-Sync, NVENC, and Reflex, and tries to make themselves extremely exclusive, to be the cool kid with a high cost of entry, whereas AMD copies them and then makes it open source for everyone, to beat Nvidia. If AMD had Nvidia's engineers and AMD's business model, they would be unstoppable.
How are you going to add Tensor Cores to AMD GPUs and older Nvidia GPUs?ID: h07b7ru
FSR doesn't require Tensor Cores; it doesn't use the same techniques as DLSSID: h07bg92
That's exactly my pointID: h07i9az
FSR doesn't use AI for its upscaling method, therefore it doesn't require Tensor Cores.ID: h07lr89
You guys have a really hard time understanding that sentence, don't you?
I don't really think AMD did it with the intention of looking good to the crowd; that's all just a marketing/PR stunt.
More likely, they want to maximize the potential for mass adoption by making it available on almost every PC GPU from the past 5 years, instead of limiting it to just 1-2 generations of GPUs.
Nonetheless, as an Nvidia user, I still really appreciate it. It should be a nice alternative for us when DLSS isn't available in a particular game, although DLSS has been adding supported games quite rapidly for the past few months, and it won't stop now or in the future, with support in all popular current or upcoming big game engines such as UE4, UE5, Unity, etc.
FFX will always mean final fantasy X to me.ID: h07t33b
Well, if they made it Radeon-only, I doubt many developers would adopt the technology. But since they're making it available for almost every PC built in the last 5 years, it could (and probably will) surpass DLSS in terms of adoptionID: h082hrw
Not really; you are forgetting consoles, and that is checkmate right there.ID: h08cllg
Current-generation consoles are not supported at release of this tech from what I can tell, so that's a moot point at the moment.
However, in its current iteration it appears to be equal to or worse than DLSS 1.0, which was an abject failure. If that's the case, you're likely better off just running at a lower resolution, which means it may as well not even exist.
Time will tell though.
They support final fantasy x ?ID: h07t4x7
I guess it's "just a shader", so it might be hard to block? I mean they both support the same feature set for programmable shaders so...
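That "just a shader" point is why it's portable: a purely spatial upscaler is ordinary per-pixel math that any GPU with programmable shaders (or even a CPU) can run, with no special hardware units required. As a toy illustration only, not AMD's actual EASU/RCAS algorithms, here is the general shape of such a pipeline in Python/NumPy: a bilinear resample followed by a simple unsharp-mask sharpening pass (function names are mine):

```python
import numpy as np

def upscale_bilinear(img: np.ndarray, scale: int) -> np.ndarray:
    """Bilinearly upscale a 2D grayscale image by an integer factor."""
    h, w = img.shape
    # Sample coordinates in source-pixel space, centered on output pixels.
    ys = (np.arange(h * scale) + 0.5) / scale - 0.5
    xs = (np.arange(w * scale) + 0.5) / scale - 0.5
    y0 = np.clip(np.floor(ys).astype(int), 0, h - 1)
    x0 = np.clip(np.floor(xs).astype(int), 0, w - 1)
    y1 = np.minimum(y0 + 1, h - 1)
    x1 = np.minimum(x0 + 1, w - 1)
    # Interpolation weights, clamped at the image borders.
    wy = np.clip(ys - y0, 0.0, 1.0)[:, None]
    wx = np.clip(xs - x0, 0.0, 1.0)[None, :]
    top = img[y0][:, x0] * (1 - wx) + img[y0][:, x1] * wx
    bot = img[y1][:, x0] * (1 - wx) + img[y1][:, x1] * wx
    return top * (1 - wy) + bot * wy

def sharpen(img: np.ndarray, amount: float = 0.2) -> np.ndarray:
    """Unsharp-mask pass: boost the difference from a local average."""
    blurred = img.copy()
    blurred[1:-1, 1:-1] = (img[:-2, 1:-1] + img[2:, 1:-1] +
                           img[1:-1, :-2] + img[1:-1, 2:] +
                           img[1:-1, 1:-1]) / 5.0
    return np.clip(img + amount * (img - blurred), 0.0, 1.0)
```

The real shaders use a fancier edge-adaptive kernel and a contrast-adaptive sharpener, but the point stands: it's pixel arithmetic, not vendor-specific silicon, which is what makes blocking it on other hardware impractical.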
They had to. No one would adopt it otherwise when your competitor has 75% of the market
When you have 20% market share, you tend to be generous. AMD wouldn't be doing themselves or developers any favors by locking FidelityFX to AMD cards only.
AMD is a very good company. One day they sent some hookers here. Very good company.
How do you know it isn't as good as DLSS?--that remains to be seen, imo...;)
It's a good PR move, but let's not forget they are a company and are in it mostly for the profit.
These companies aren't your friends.
I dunno about you but Final Fantasy X was true shit! Classic game. Sad that Square dropped support but at least AMD is here to save the day.
It's not necessarily for the gamers. Provided the tech is even half-decent, it's a clever strategic play from AMD. DLSS is only available to one very niche market within a market, RTX 20 and 30 series PC gamers, so there isn't a great demand at the moment, hence the slow uptake in games.
FSR will be open source, available to a wide range of Nvidia and AMD GPUs within the PC market (as old as the 10-series and Vega cards) and, most importantly, to the console market. With AMD-powered consoles like the PS5 and Xbox, game developers will be far more likely to produce next-gen titles with FSR, because it'll be easier to implement and make cross-platform across console and PC.
DLSS might still be the more impressive technology, but if no one's using it, it'll eventually die a death. FSR doesn't need to beat it on quality; it just needs to be the more attractive proposition for game studios to develop with and profit from, and market demand will do the rest.
I imagine, Nvidia would consider this move... ruthless.
The scenario: Nvidia makes a nice piece of tech, and everyone demands AMD make a rival technology. But given the workload needed to make it work, Nvidia is probably paying developers to use DLSS, and AMD doesn't want to fight for every title with every developer.
Solution: kill DLSS. Make your alternative work on older cards and rival cards, and push for it to become the new standard. It raises the value of older cards struggling right now, and makes all the Pascal owners demand Nvidia support it. They've basically weaponised half of Nvidia's userbase against them.