Do you believe it's possible for APUs to overtake low end GPUs for gaming?

1 : Anonymous2021/10/24 14:53 ID: qetp8i

Just as the title says: I've been wondering if, even though APU development moves slower than GPU R&D, it's possible in the foreseeable future for APUs to close the gap with low end GPUs enough that APUs become a lot more relevant than they are today, even ignoring pricing and availability.
Or do you believe this gap will either remain or get larger, and APUs will never be able to do 60fps max graphics gaming at a standard resolution, because graphics development is always a few steps ahead?

2 : Anonymous2021/10/24 17:50 ID: hhvxog0

Considering that the chips inside the PS5 and Xbox series x are technically APUs, yes. Of course, bringing that level of performance to PC APUs would be more complicated.

3 : Anonymous2021/10/24 14:57 ID: hhv8v3l

Well, since APUs can't draw the power even a low end GPU is allowed to draw: no. But they are actually a good alternative to low end GPUs and have been for months, because they are available, relatively affordable and, although not as powerful, more than capable of running a lot of games on medium settings.

And since the tech used in APUs is derived from GPUs, how could there not be a gap? The iGPU is the low power version of a GPU.

ID: hhwmqeu

It really depends how you look at it. Take the GTX 1650: the card uses 65W, and part of that power goes to VRAM, which is off the chip in an APU scenario. So it would be totally realistic to see the graphics portion of an APU using 40 to 50W; paired with a 65W CPU, you end up with a 105 to 115W TDP part that would only use about 80 to 90W under normal gaming workloads. RDNA 2 scales a lot with voltage and power, so I would not be surprised if we get an APU with RDNA 3 cores that clocks at 2GHz for laptop parts and gets pushed to 3GHz for desktop parts. Finally, DDR5 will almost double the bandwidth of DDR4; while not a super big deal for the CPU side, it will unleash future APU designs.
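The power arithmetic above can be sketched out. All the wattage figures are the commenter's estimates, and the VRAM share in particular is an assumption, not a measured value:

```python
# Rough power-budget sketch for a hypothetical desktop APU,
# using the estimates from the comment above (not measured data).

GTX_1650_BOARD_POWER_W = 65   # total card power, includes VRAM
VRAM_POWER_W = 15             # assumed share drawn by GDDR; on an APU this moves to system RAM
CPU_PORTION_W = 65            # a typical 65W desktop CPU TDP

gpu_portion_w = GTX_1650_BOARD_POWER_W - VRAM_POWER_W   # ~50W for the GPU silicon alone
apu_tdp_w = gpu_portion_w + CPU_PORTION_W               # nominal combined TDP

# Under a gaming load the CPU rarely runs flat out, so real-world
# draw lands below the nominal TDP (0.6 is an illustrative factor).
gaming_draw_w = gpu_portion_w + 0.6 * CPU_PORTION_W

print(f"GPU portion: ~{gpu_portion_w} W")          # ~50 W
print(f"Combined TDP: ~{apu_tdp_w} W")             # ~115 W
print(f"Typical gaming draw: ~{gaming_draw_w:.0f} W")  # ~89 W, inside the 80-90W claim
```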

ID: hhx9l5r

That really makes no difference. Any advancement that applies to an APU also applies to a discrete GPU, and the discrete part doesn't have to worry about sharing its thermal envelope and socket with a CPU. Not to mention it's a lot more expensive to pin together a bunch of LPDDR5X to hit high bandwidth while keeping latency low for CPU ops, compared to just using GDDR.

The only thing that would cause a shift from low end discrete to APU is if they just gave up on producing low end discrete gpus, which is possible if the market stays on its current path, but it wouldn't be from any technological advancements.

4 : Anonymous2021/10/24 21:28 ID: hhwtti1

No because:

1) Power constraints - what's the max power consumption generally for APUs, 65-95W? Some of that will be taken by the CPU, let's assume about 25-45W depending on config. That severely limits APU performance vs discrete GPUs, which even at the really low end can use 75W+.

2) Memory bandwidth constraints - dual channel DRAM offers far less bandwidth than VRAM, and it also has to be shared with the CPU, meaning in reality an APU gets significantly less than even lower end GPUs. Zen, for example, can scale up in performance with dual channel DDR4 >3200 in certain workloads, which is 51.2GB/s. Even hypothetical DDR5-6400 will only double that to 102.4GB/s. Compare that to the 480 and 1060, midrange GPUs from 5 years ago: 256GB/s and 192GB/s respectively. Even the 6600, which has some Infinity Cache (and so a narrow 128 bit bus), has 224GB/s. There is no way for them to compete in memory bandwidth.
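The bandwidth numbers above all come from peak bandwidth = transfer rate × bus width. A quick sketch using the figures quoted in the comment (GPU bandwidths are the comment's numbers, not independently verified here):

```python
def dram_bandwidth_gbs(mt_per_s: int, bus_width_bits: int) -> float:
    """Peak bandwidth in GB/s = megatransfers/s * bytes per transfer / 1000."""
    return mt_per_s * (bus_width_bits / 8) / 1000

# Dual-channel system RAM: two 64-bit channels = 128-bit total bus
ddr4_3200 = dram_bandwidth_gbs(3200, 128)   # 51.2 GB/s
ddr5_6400 = dram_bandwidth_gbs(6400, 128)   # 102.4 GB/s

# VRAM bandwidths quoted in the comment, for comparison
gpu_vram_gbs = {"RX 480": 256.0, "GTX 1060": 192.0, "RX 6600": 224.0}

print(f"DDR4-3200 dual channel: {ddr4_3200} GB/s")
print(f"DDR5-6400 dual channel: {ddr5_6400} GB/s")
for card, bw in gpu_vram_gbs.items():
    print(f"{card}: {bw} GB/s ({bw / ddr5_6400:.1f}x DDR5-6400)")
```

Even the hypothetical DDR5-6400 system comes in at roughly half the bandwidth of a 5-year-old midrange card, before subtracting whatever the CPU consumes.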

3) Same arch - APUs aren't going to be some special magic arch; they're going to be the same arch as discrete GPUs. A much smaller, lower power GPU with far less available memory bandwidth is always going to be a lot slower than one on the same arch with a lot more of everything. That is fundamental, and it's why GPUs can scale from no power connector to 300W+ beasts. There is nothing that can change that.

4) Cost and size constraints - APUs are generally a lower end part for the mobile market, a compromise between a CPU and a discrete GPU. They have to make sacrifices to get there (like Zen 2 APU cache being reduced vs desktop, iirc), so you can't have a big APU at a small price, because it will quickly balloon in size. The current highest end APU has 8 CUs while the 6600 has 28, for example, and the rumours for next gen are 12 RDNA 2 CUs. The 5700G with 8 CUs is what, $350ish? What would an actually big APU cost, one that replaces low end GPUs that are many times faster?

5) Even if they did make a big APU to effectively replace low end GPUs, how are you actually going to feed it? Lots of soldered RAM with a wide bus, like consoles/MacBooks have now? Okay, that'll cost you, and you still probably can't feed it properly, so you'll need a lot of cache (Infinity Cache/V-Cache), and that'll also cost you. How are you going to cool this 150-200W+ APU? And who's going to make and sell these PCs? At this point they're going to have to be sold as complete systems, because even quad channel setups won't provide the bandwidth. What you'd have at that point is essentially a console, or a CPU and a discrete GPU.

The only way it's possible for them to "replace" lower end GPUs is if vendors continually raise GPU prices and make the low end essentially non-existent; then yeah, sure, they'll replace them. But currently the best APUs from Intel and AMD are a lot slower than Polaris and GP106, which are 5 year old midrange GPUs, so if we get to that point then we're screwed.

5 : Anonymous2021/10/24 16:28 ID: hhvlf5t

Low end GPUs are GT 710 and GT 1030, 5600G beats them in games.

AMD Ryzen APUs have been outperforming low end GPUs since the 2010s, idk what you are talking about.

Here is a video about the same.

You can't buy AMD's best product.

Not to mention, PS5 and Xbox literally use an APU.

ID: hhwggvv

I was thinking more on 1050ti or 1060 or 1660 levels. "Low end" for gaming, but not entry level.

ID: hhwtu25

Those GPUs use a 256 bit memory bus, so a dual channel DDR APU can't match that. The Steam Deck will use quad channel memory to close the gap though.

ID: hhxocsh

1060 is a midrange GPU. It was $250 at launch.

ID: hhxebgb

Bruh, you should've said that. 1060 or 1660 level performance from an APU makes no sense in terms of cost, VRAM buffer, power draw, or memory bandwidth lol.

But like I said, the PS5 and Xbox use an APU, so it is possible. It's just that those are designed with gaming in mind, unlike general purpose desktop APUs.

6 : Anonymous2021/10/24 15:19 ID: hhvbprz

As stated, no: a GPU will always be able to draw more power and have access to much faster memory. APUs need to access slower system memory, and are also limited in how much they can access. They are still great budget options, however. If you can't afford a GPU, an APU will get you into the game.

ID: hhw4jwb

He's asking about low end GPUs, not higher end ones, so cooling is less of a problem than marketing is. After all, the new Xbox and PlayStation are just AMD APUs made to custom specs; the only reason for AMD not to sell APUs as powerful as those is that their marketing research shows it wouldn't be profitable, not technical reasons.

ID: hhwla8r

Even a low end gpu will have faster memory. Even if it used ddr4 it would be faster because it wouldn't need to share memory bandwidth with the cpu.

Hooking an APU up with RAM as fast as consoles have is actually not trivial. You wouldn't be able to use GDDR6 easily, for example: you can't buy off-the-shelf DIMMs with it, so these systems would all have to use a fixed amount of RAM directly soldered to the motherboard. A very wide RAM bus also takes up space on the motherboard, which could push the price of said motherboard out of the budget range that APUs are generally made for.

You could design a monster motherboard with quad or six channel DDR4, but that's some expensive stuff to make. DDR5 "sort of" has twice as many channels, but the bus width per channel is also halved, so the effective width isn't doubled. Quad (or really 8) channel DDR5 might be fast enough for an actually good APU.
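The DDR5 channel point can be illustrated with a small sketch (widths per the JEDEC layouts; the speed grades are just example bins):

```python
# DDR5 splits each DIMM's single 64-bit channel into two independent
# 32-bit sub-channels, so total width per DIMM is unchanged; its
# bandwidth gains come from higher transfer rates, not a wider bus.

ddr4_dimm = {"channels": 1, "bits_per_channel": 64}
ddr5_dimm = {"channels": 2, "bits_per_channel": 32}

def dimm_width_bits(dimm: dict) -> int:
    return dimm["channels"] * dimm["bits_per_channel"]

def peak_gbs(mt_per_s: int, width_bits: int) -> float:
    return mt_per_s * width_bits / 8 / 1000

print(dimm_width_bits(ddr4_dimm), dimm_width_bits(ddr5_dimm))  # 64 64 - same width
print(peak_gbs(3200, 64))   # DDR4-3200 DIMM: 25.6 GB/s
print(peak_gbs(6400, 64))   # DDR5-6400 DIMM: 51.2 GB/s - doubled via speed alone
```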

7 : Anonymous2021/10/24 17:36 ID: hhvvj0p

Define 'low end' first.

Also, keep in mind that what's still perfectly usable today might not be in a couple of years, as true next gen games start coming in and requirements go up. That's always a problem when you're at the lower end of things - you are much closer to the verge of 'outdated' unless you only play older, less demanding games.

Anyway, a company like AMD could do a whole lot with APUs if they had the will. But you reach a point where you're going to need proper video memory for bandwidth, which would mean unified memory setups, new sockets, new motherboards and whatnot. I personally think AMD will take a decent step forward with RDNA 2 integrated GPUs soon, but I don't see them tackling anything that gets into traditional $250 GPU territory either. It's just a lot of hassle for a market I don't think is *that* big. Most people, I think, would prefer to have a GPU they can replace.

ID: hhy1okw

Low end is roughly the GTX 1000 series or RX 500 series. RDNA and Turing are low to mid range now.

Anything older than a GTX 1050 is probably worse than an APU by now.

ID: hhz8qrl

My GTX 960 beats all of the AMD APUs

ID: hhyjs3t

Can you play Far Cry at 1080p with an APU? You can on a GTX 750 Ti.

8 : Anonymous2021/10/24 18:07 ID: hhw07g4

Not unless the system memory (that APUs need) comes close to VRAM in specs. I mean, when all is said and done, the main bottleneck even with high end dGPUs is memory bandwidth. Nvidia went with GDDR6X and AMD put a huge cache on the die - both really expensive solutions.

Haven't even brought power into the equation and we are already facing many hurdles to APUs actually scaling.

I'm not taking the M1 Max into consideration. That chip is about 4 times the size of a usual x86 CPU and many times more expensive than your average APU, so it's not even in budget territory. You would need a special socket and motherboard just for it, so integration across multiple platforms would be a huge problem. Apple ofc doesn't care about that, since they don't sell CPUs/GPUs by themselves.

9 : Anonymous2021/10/24 17:22 ID: hhvte4r

If Apple makes them, yes.

ID: hhwlw6p

The M1 Max looks sooo impressive, I legit wanna see some emulator benchmarks.

ID: hhwzsmk

It does, but it's also not cost efficient at nearly $4000. You'd be better off just building a PC with a discrete GPU.

10 : Anonymous2021/10/24 18:58 ID: hhw8094

I think the question rests on a misunderstanding. The only reason APUs are close to competitive is that there haven't been any low end GPUs for a couple of generations; the APU only appears to be getting more competitive because the graphics tech in the new APUs is newer. Current AMD graphics technology is always going to do a lot better in a standalone GPU, since it doesn't need to compete for power and cooling with the CPU, which it does when they share a die.

Having said that, it's possible we'll see a narrowing of the delta between standalone and integrated graphics solutions, if Apple's new chips are anything to go by. Through the whole presentation I kept thinking that AMD should have been in this space years ago and have blown a potential leadership position, with Intel now catching up too. Apple look to be demonstrating that it's possible to do a lot more with graphics on a SoC if you optimise for that specifically - whereas RDNA 2 is (presumably?) optimised for standalone GPUs and adapted for APUs later.

11 : Anonymous2021/10/24 20:23 ID: hhwkjqj

Already happened with the MacBook Air. Just a matter of time till the costs come down to kill low end gpus.

12 : Anonymous2021/10/24 20:24 ID: hhwks3a

With the Steam Deck having RDNA 2 graphics, and the info leaking about Barcelo/Rembrandt, we are going to see a big uptick in iGPU performance on desktop PCs.

I think we may start seeing larger GPUs on APUs, and maybe we'll actually get something similar to the PS5 and Xbox Series X.

I guess that actually kinda answers your question: this generation of consoles has an AMD APU, and they are gaming at 4K with very decent quality!

13 : Anonymous2021/10/24 22:22 ID: hhx14bj

Yes, they will. But it's still 4 years or so off, and that assumes what's meant by low end doesn't move. Let's say around an RX 580/5500 XT.

Sticking to a generally familiar platform, dual channel DDR6 should give enough RAM bandwidth, and 3nm should be small enough to cram in enough CPU and GPU cores plus some Infinity Cache while staying within a decent power threshold. The problem is that when a chip doesn't have a guaranteed market of tens of millions like the Xbox and PS5 do, it's risky to plan a certain production volume 4 or so months before you can sell it.

So, on to something we haven't seen yet: CPU chiplet, GPU chiplet, IO die. If AMD can figure out chiplet GPUs, the base chiplet becomes the APU GPU and discrete GPUs are 2+ chiplets. AMD doesn't have to decide which until the wafers are done - the longest lead time step - and that reduces business risk.

A step further: stacked DRAM on package should hypothetically make a 256 bit wide bus easier to achieve than DIMMs do, but it wouldn't be upgradeable, so that's a maybe. It's an interesting thought for the next Xbox or PlayStation, though. A CPU, GPU, IO die, DRAM and 1TB of flash in an EPYC-sized package seems like it could happen in a game console in the not too distant future.

14 : Anonymous2021/10/25 10:23 ID: hhyx2ql

AMD has abandoned low end GPUs. The most recent one is the RX 550, but below that it's still the R7 240 (or its refresh, the R5 430), which was surpassed by R7 Kaveri APUs a while ago.

15 : Anonymous2021/10/25 15:06 ID: hhzr86q

Wanna know something funny? It's already happened. It's called a console. And forget low end, they're long past that level of fidelity with the current generation of machines. I know it's not the same though, and I know what you meant lol.

To answer your question, I don't see any of this happening in the foreseeable future. The way we think about CPUs, APUs, GPUs, and how they work would need to change entirely, at least in the desktop computer space. As another poster said, desktop APUs can't even draw the power required by budget desktop GPUs. I definitely think it's possible, but unless we land a breakthrough in tech that's affordable to produce and sell to consumers at scale and at a reasonable price, I don't see this happening within the current console generation, which will likely last a decade. If an APU is gonna pull that off, it's not gonna be an APU like the ones we think of today.

16 : Anonymous2021/10/24 15:03 ID: hhv9num

No. A dedicated GPU will always outperform an APU in the same generation.

That's also the reason why APUs will always be below the mainstream resolution/fps standard.

ID: hhvmukb

I love the concept of the old APUs and HD cards in CrossFire. It didn't improve performance much, but there were a bit of gains.

I think that's still a good market for the esports genre: APU + low end dGPU.

ID: hhvkg72

There is still the case of Kaby Lake G, where the CPU and GPU were on the same package and the GPU even had HBM VRAM on it.

ID: hhw761b

FWIW this is technically a dGPU solution; it just happens that the dGPU is on the same package, connected via a PCIe x8 link. It's not integrated into the CPU die itself - Intel UHD Graphics is still present on the actual CPU die.

ID: hhvxn3n

The Ryzen 7 5700G beats the old RX 550 in pretty much every game, so not really. Of course, that is likely to change when AMD finally releases Navi 24, but even then the price/performance crown will likely remain with APUs.

ID: hhwuqqg

If we have to go back half a decade for APUs to "win" vs lower end GPUs, which were not much more useful than video out devices, then the answer is no, they don't replace them lol.

ID: hhvyd6k

The 5700G and RX 550 aren't from the same generation.

17 : Anonymous2021/10/24 15:35 ID: hhvduf1

Apple's M1 Pro and Max are APUs? Those outperform even most high end GPUs.

ID: hhvi995

You're getting downvoted, but you have a point.

Apple was able to throw way more transistors at graphics problems using a lower clock rate and better process, all of which reduces the power draw. They also have GPU level memory bandwidth (400GB/s) on an APU.

All to say that it can be done, but it seems like Apple will be the only one who has the vertical integration and deep pockets needed to do it for a while, unless AMD or someone else can make a monster "wide but low clocks" APU that also has huge amounts of memory bandwidth or a lot of cache to make up for slow memory.

ID: hhvmz3u

it will simply always be cheaper to use less silicon and more power

ID: hhw6gl8

Also there are the console chips for the Xbox and PS5. Those could be marketed as APUs if AMD were willing to.

18 : Anonymous2021/10/24 16:41 ID: hhvndxq

In this mining age, absolutely; in other timelines, no.

19 : Anonymous2021/10/24 16:36 ID: hhvmpof

Integrated and dedicated graphics are technically the same thing, just packaged differently based on cost and other factors. Can they? Yes.

They can also be more powerful if you like: just stack on more cores, a bigger bus, etc. It's not like they're different in any way.

I feel like people need to take a little look at exactly what they think they're buying before they answer, because the only real differences are the core config and the RAM bus. This isn't a question worth answering, because efficiency can make a newer GPU (and process) surpass a previous gen offering.

What Apple have done is simply scale WAY up.

20 : Anonymous2021/10/24 17:45 ID: hhvwvvn

Well, my 5700G comfortably outperforms my GT 710.

21 : Anonymous2021/10/24 15:05 ID: hhv9v0f

That would depend mostly on the market. Currently there are no low end cards, so APUs are pretty much the only semi-budget buy (they're also not that cheap). Intel promises to have some low end GPUs next year, so that might change things.

The answer in a non-price-gouged market is: depends on your definition of "low end", "more relevant", etc. Sub-$100 cards are already underpowered compared to APUs. Sub-$200 cards will likely remain more powerful than APUs for a while. APUs play most games okay at 720p 30 FPS, and will likely continue doing so.

22 : Anonymous2021/10/24 17:37 ID: hhvvr8v

I imagine intel and amd are both watching the new Apple chips carefully. Neither will want to abandon the top end of the laptop market to Apple

23 : Anonymous2021/10/24 22:16 ID: hhx09cu

APUs already have. Every AMD APU since... well, forever. Intel's HD Graphics with eDRAM. Then look at mobile SoCs like those found in modern cellphones.

The limiting factors are power and system memory speed. DDR5 will close the gap from future APUs to modern day low end GPUs (like a GTX 650). And with technology like FSR we can absolutely close that gap today, just not in raw power.

24 : Anonymous2021/10/24 15:23 ID: hhvc9pm

I'll say yes.

And the reason I say yes is because rumor has it AMD and Nvidia are trying to kill the low-end market. Intel might be able to save us here.

25 : Anonymous2021/10/24 15:22 ID: hhvc4wi

60fps medium at 1080p next gen, yeah; ultra, no, and dGPUs are always gonna be larger. If your performance target is fixed and isn't going to follow the market, it's possible.

26 : Anonymous2021/10/24 16:37 ID: hhvmu5t

No. The GPUs on APUs typically draw 5 watts. A GT 1030 draws like 25 watts and is way more powerful. Not to mention it has faster and more memory.

27 : Anonymous2021/10/24 17:45 ID: hhvwvxk

It sure seems so; neither brand seems interested in launching new cards at the low and entry levels, and APUs are filling the gap nicely.

28 : Anonymous2021/10/24 18:10 ID: hhw0r57

Do you mean same generation GPU and APU? If so, no.

If you mean a past generation GPU compared to a new APU then at some point yes.

29 : Anonymous2021/10/24 18:27 ID: hhw39r8

A new APU may outperform an old low end GPU. However, GPUs will always have the advantage without a massive gulf in the technology used. The reasons for this are space, power and memory.

Integrated graphics have to fit the GPU die in with the CPU die, either as part of the same die or as another chip entirely, so low end GPUs can have larger chips.

Heat wise, if you tried to get a GTX 1650 into a CPU you would need to account for another 75W. That's extra work for the VRM, and the CPU cooler has extra heat to get rid of. That would make modest CPUs consume power like Intel's i7s/i9s, and you'd need the cooling to match.

Memory is also a problem. 2GB of shared system memory cannot match the bandwidth of even a GTX 1650's VRAM, and the memory bus is shared with the CPU. So you couldn't get low end GPU performance, as you'd be memory bottlenecked.

