- [OC3D] AMD's R&D spending has been boosted by over 40% Year-Over-Year
-
Fuckin' a. Throwing money at things doesn't necessarily do anything on its own, but it sure does help.
ID: h6y72gq
ID: h6y7qogI’m glad they’re not pulling an Intel and throwing money at securing a monopoly. I’ll never buy Intel after that fiasco.
ID: h6ya9neBetter than throwing money at sales and marketing and cutting engineering...
ID: h6yy7lwJust a reminder: they got these results simply by enabling their engineering team with a fraction of not only their current R&D budget but their main competitors' R&D budgets. Of course this will help, but as their competitors' R&D shows, simply having money thrown at a problem doesn't guarantee a win.
ID: h6y8d5mIntel has that now too, an engineer as leader. It had me excited.
Then they announced the new marketing initiative, rebranding the nanometers, and even talking about how they are going to use Angstrom-scale naming soon. It screams repeat of the 10nm++++ fiasco.
ID: h6y6cqoWell, look at how well they've done with the tiny R&D budget they've had for the past few years. I'm looking forward to what this increase will bring in a few years.
ID: h6z75esThrowing money at things is a major part of getting things done in this sort of industry.
ID: h71se3pNo - throwing money to get highly skilled, capable engineers is how you get things done in this industry.
Throwing money around can get you a pile of marketing that is selling a 5-year-old processor with a core speed 80% of that of the current market leader - and the market leader has 2x the number of cores available.
You need to use money effectively in order to see benefits.
-
I hope they put some of the R&D money in their software dept. too, pumping out hardware means nothing if the software is shit.
ID: h6y4ib4As long as it's to improve the quality of the software (both Linux and Windows), and not adding pointless Radeon Software features or UI overhauls.
ID: h6z5oohNote that this requires different breeds of engineers. You aren't going to have your front end people work on low level graphics API performance improvements.
ID: h6y6e6hthe drivers have improved a ton in the past few years
ID: h7051o2I feel like they've gotten worse. Once I upgraded to the new UI and shit, my Vega 56 got crazy unstable. It was totally fine before I updated them. Sucks.
ID: h6yrwlzReally? I got a 6800 XT and that thing crashes every 2 days on the dot because of shitty drivers.
ID: h6xxs6x^^^This
AMD needs to focus a LOT on this department right now. I'm sick and tired of them almost always chasing Nvidia; it would be nice to see them take a step ahead for once in software.
ID: h6y1qo8Until about 2 years ago AMD's software teams were cut to the absolute bone, and probably beyond. According to
they are ramping software spending, and I'm not just talking about graphics drivers. They don't really talk publicly about software at the executive level or give roadmaps, which is unfortunate.
ID: h6zu1v7They are finally getting the much-needed money from sales to do that. The reason Nvidia is ahead is because they have the money and can also recruit more talent, because they pay more.
ID: h71nracAMD absolutely needs better drivers - but they ARE absolutely delivering slowly, but surely. Re-writes, cleaning up old software stacks, and getting new people up to speed on complex projects isn't a 2 week process, it's more like 3-6 months, on top of a hiring process that can take 3-6 months.
Hiring high skilled workers and getting them going on projects in a meaningful way is not like hiring another kid out of high school to flip burgers - I mean ya, it's both hiring, but the vetting process, and so on is far more involved out of absolute necessity.
On top of this: Intel gave AMD a silver platter to kick-start their return to CPU competitiveness - NVIDIA has not done the same. If you want the prime demonstration of this, go look at Intel's absolute shit show of a response to the quick-succession launch of Ryzen / Epyc / Threadripper (which was basically out of left field). And AMD didn't stop - 16 cores on a mainstream platform is kind of insane when you consider that realistic RAM on a mainstream platform for MOST people is 16 or 32GB, you ideally want roughly 1-2GB of RAM per core, AND you need the bandwidth to actually feed all those cores when they are going at full tilt. Yeah, there is yet another reason why memory overclocking can benefit Ryzen chips so damn much.
So Yes, AMD has a ways to go. No one is saying differently. But if there are features you think AMD should have that NVIDIA does not have (and is not developing right now), go suggest them. But right now, AMD has to work on feature parity first. And if AMD can get to that extent, with at the very least "good enough" performance, then we can start talking about improving features and figuring out cool features people would love to see.
In other words: Catching Intel was catching a sleeping tiger - trivial. Catching NVIDIA? It's like trying to chase a cheetah with never-ending energy. Yeah, you can build a rocket to catch it - but you still have to figure out how to fuel that rocket, build it, and of course have it be successful. Catching NVIDIA is NOT going to be easy, and AMD is not going to be finding NVIDIA with its pants down anytime soon.
But to be blunt: This is a good thing. Competition pushes excellence. And with Intel aiming to enter the GPU market, we might finally have a contender that can create a decent product (eventually), one that - out of necessity in dealing with AMD and NVIDIA - competes on price.
ID: h6y6gakPumping out software means nothing if the hardware is shit. Hardware always needs to come first. I'm sure their engineering, consulting, marketing, and financial teams know what needs to be accomplished first for long-running profits. It's one thing at a time. They know how supply and demand works. The demand for hardware is high and supply is low, which means higher prices, and they build capital from that to restructure. When supply is high, they can lower prices, sell more hardware, and focus on software once they have the budget.
ID: h6yvvwvHardware means nothing if the software is missing; both are needed. They are missing the whole HPC market by not having any comparable alternative to CUDA (OpenCL is not comparable).
ID: h6yw77nFair, but their hardware isn't shit. Right now, software support/quality is the bigger gap for AMD. They don't NEED to compete with the 3090, or even the 3080, when those make up the top niche of the market. They'd likely find more success competing in the $200-500 range with great software than with mediocre software and the high-end offerings they have now.
Their VR support is still lagging, though that's ALSO a niche thing that doesn't need major addressing right now. Their general stability isn't quite at Nvidia's level, they're behind on ray tracing (hardware quality AND software support), and their FSR solution is behind DLSS. Yeah, those things take time to build out; I'm just noting that it's not like AMD has been putting out bad hardware. They have more room to catch up on the software side than on hardware, IMO.
ID: h6zy5dvWhat? That's the exact opposite of the truth. x86 endured RISC designs' performance gains for a long time just by holding legacy support, until AMD/Intel eventually got a lot of the overhead down. People continue to buy Nvidia GPUs specifically for CUDA, and these days also cuDNN, even with AMD GPUs shipping with larger memory sizes than everything except the 3090. GCN steamrolled Nvidia in compute/watt, but Nvidia still took over the market because their libraries were a thousand times better. As it gets harder and harder to shrink processors, software is just going to get even more important.
-
It’s great to see that after their success they are properly reinvesting back into the business.
-
Hopefully, this means AMD can start looking at helping us content creators and 3D artists out and give us some stronger alternatives to Nvidia!
ID: h6yq65vIntel would probably be an earlier option for you than AMD. They're investing a lot in GPU and their software team and budget is much larger.
-
As great as AMD has been doing, they realize that Intel and Nvidia are still the 800lb gorillas in this fight, so they cannot afford to remain complacent during this time of growth. Intel became complacent and used their funds to buy back stock instead of investing in more R&D; now they are paying for that mistake. With new leadership at Intel it looks like they are getting serious about the future of the company, and with Nvidia potentially buying Arm, AMD cannot afford not to invest more in R&D.
With that being said I really hope that AMD uses this increase in R&D to improve their drivers and software stack.
Edit: added the word "potentially"
ID: h6ypp7qNvidia buying Arm
They haven't actually bought Arm yet as the deal still has to get through government regulations and it looks like it might get blocked in the EU.
-
here's hoping it gets put to good use
-
May it continue to increase
-
They need to increase it to over a billion, because Nvidia spends 3 billion and Intel is at 13 billion.
ID: h6xqtyfNVIDIA spends close to $4B in "2021" (fiscal year), and AMD at the current quarterly rate is at a little over $2.6B, or about two thirds. So yes, there's a difference, but the situation is much better, and the difference isn't as big as it was.
Intel's budget isn't comparable. It has a lot of R&D that's completely outside CPUs and GPUs.
ID: h6xwh5oDon't forget that Nvidia's R&D is limited to their GPU department (or at least it was before the announcement of their ARM core thing), while AMD divides it between CPU and GPU.
ID: h6y4ltzIt's not the number, it's how you use it. You can spend 200 billion and get jack, or spend 2 billion and gain a whole new architecture. AMD might do more with less. It's hard to tell until, say, 5-10 years down the line.
ID: h6xvmsgMost of Intel's R&D budget is in their doomed foundry business. :/
ID: h6yh3x8Intel's foundry isn't doomed. They couldn't go fab-less if they wanted to. Samsung and TSMC just don't have the capacity Intel needs. Intel still has plenty of money to dump into getting EUV working. Plus it's important to remember that while fabs do have large input into machine design, they aren't the ones designing process machines. Fabs buy from the same machine designers. So if any single fab gets a new technology up and running, it's never too long before the others get it too, budget permitting.
ID: h6z7s23Fanboy shit like this is probably going to look really stupid in a few years.
ID: h6y2wb5If Nvidia had used TSMC 7nm, it would have been a blowout.
It's good to have competition, but we need cards. Even RX 590s on GloFo 12LP+ would be welcome: 20% more performance or 40% better efficiency over the original RX 590. That's 1660 Super performance.
ID: h6y61q9Seeing even 1050 Tis selling for 2x-3x their price, anything at this point would be welcome
ID: h6yrfwnThere would be like 5 cards worldwide if Nvidia had used 7nm too
ID: h6yfic8No, they'd just be about on par in terms of power consumption with AMD instead of considerably behind. Performance probably wouldn't have improved much if at all.
ID: h6zyu25Imagine spending 13b and the best thing R&D comes up with is renaming the nodes...
-
They managed to kill Intel with a shoestring budget and put up a great fight against Nvidia (don't care what the fanatics think, which, by the way, makes me wonder why you need to be in an AMD sub).
Imagine how much more they can do with an increase in that budget.
-
It's amazing what happens when a company is profitable and has a budget...
-
Fingers crossed that those $$$ spent will produce something useful. AMD spent a lot in the past on projects following K8. None of those materialized.
-
Financed by you overpaying for every single AMD product for the last year and a half.
-
I don't know if my slightly-less-than-3-year-old RX 580 8GB Nitro+ is dying or the driver is shit, but I'd like to rule the latter out. Hopefully they can also work on their drivers with those profits.
-
Things like this make RDNA 4/5, CDNA 3/4, Zen 5/6, and whatever else comes next only more exciting. Having significantly expanded budgets for future products hopefully means great things ahead.
-
Yes yes… more product launch is good for my portfolio
-
Now if they would spend that on finding a way for my drivers not to mess up every time Windows updates, it would be amazing. I love my 6800 XT and prefer Radeon's software overall, but damn, every month or so I get a sweet "drivers don't match" message and have to redo everything lol.
-
This is nothing less than AMAZING news. Intel's Alder Lake is coming, and rumors say the 3D cache will mainly just help games, so we need Zen 4. Thankfully AMD said RDNA 3.0 and Zen 4 are on track.
If ANYONE has an idea of what comes after Zen 5, I would like to know; as of today even the tech tubers have no idea.
-
even the tech tubers have no idea
Well, if even the tech tubers don't know, no one does.
-
I mean the ones that actually take a long look at the patents AMD files.
-
If ANYONE has an idea of what comes after Zen 5, I would like to know; as of today even the tech tubers have no idea.
I could make up some random bullshit and put it in video format for you if you'd like?
-
Again, patents. The AMD subreddit can be so toxic sometimes. lol
-
Zen 6?
-
No. AMD, at least in the beginning, said Zen 5 was the last Zen architecture revision.
-
Do we even have certainty of a Zen 5 at this point? Roadmaps that I can find go to Zen 4 in late 2021 / early 2022. That would put Zen 5 at mid-2023, or early 2024, depending on how everything pans out in terms of architecture development, chip shortages, etc.
Now - some of what follows is speculation.
Odds are, we are going to be looking at a ground-up architecture. By the end of 2025 Windows 10 is going to be EOL, and the goal there is basically to enforce that certain features be available. This will have a secondary effect: software makers will be able to expect certain instruction sets, allowing a general phase-out of legacy support from hardware.
On top of this, AMD for sure, and I presume Intel, both have ARM licences, and with Apple going ARM, Microsoft having dabbled in it, Intel going that way, and so on, it is very likely that we will see some form of ARM architecture design. This has a lot of implications for mobile devices w/ docks.
Another thing to consider is the RISC-V development track, which is coming along fast. As a very open, platform-focused effort, it seems to align strongly with GPUOpen in general, and we may very well see AMD pivot in this direction at some point.
What I can tell you with fair certainty is, by 2025, I will probably be considering a platform upgrade (CPU + Motherboard + Ram), and will probably be considering a GPU upgrade as well.
But honestly, it could just as easily be another Zen architecture project - say, a refresh - if a new development is hung up due to unexpected complications.
But to be perfectly blunt: No one fucking knows.
Source: https://www.reddit.com/r/Amd/comments/ottnc0/oc3d_amds_rd_spending_has_been_boosted_by_over_40/
To be fair, they have an engineer leader throwing money at the engineering team and getting results. That's the best kind of throwing-money tactic if I've ever seen one lol