Why does AMD not put a tiny GPU core in the IO chiplet?

1 : Anonymous2021/03/14 12:27 ID: m4ufq6

Intel still has the advantage of nearly all CPUs having graphics good enough for most desktop use. I would think it should be fairly easy for AMD to counter this. Why do you think they are not doing it?

2 : Anonymous2021/03/14 13:51 ID: gqwb69f

Because it isn't tiny.

I think for Intel consumer-level CPUs, the integrated GPU takes up about half of the die.

AMD could put a really small GPU in the IO die, but it doesn't make any sense outside of diagnostics. Without good ASICs (for hardware acceleration, etc) it would be largely useless.

ID: gqwl5wc

That's the problem: it's overbuilt. It could be smaller and way less powerful, just enough for a single 2D output.

ID: gqwkfg9

I suppose you're somewhat right; from a bit of googling I may have underestimated the size. An older (Zen 1 generation) 3 CU Vega block seems to take a bit less than 100mm², which would indeed significantly increase the IO die size. I'm not sure how much it could be stripped down and still be good enough. Maybe it will make more sense on a newer node...
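A rough back-of-envelope on those numbers, treating the ~100 mm² Vega figure above and a ~125 mm² Matisse client IO die as assumptions rather than official figures:

```python
# Rough back-of-envelope; both figures are assumptions, not official AMD numbers:
# ~125 mm^2 for the Matisse client IO die (GloFo 12nm), and the ~100 mm^2
# Vega estimate from the comment above.
io_die_mm2 = 125.0      # assumed Matisse client IO die area
vega_block_mm2 = 100.0  # assumed size of an older Vega block, per the comment

combined = io_die_mm2 + vega_block_mm2
growth_pct = 100.0 * vega_block_mm2 / io_die_mm2
print(f"IO die would grow from ~{io_die_mm2:.0f} to ~{combined:.0f} mm^2 "
      f"(+{growth_pct:.0f}%)")
```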

ID: gqygd5i

AMD's I/O dies are all made on older GlobalFoundries nodes, since according to AMD shrinking the I/O die doesn't improve performance much yet hurts yields (and is probably cheaper than a 7nm I/O die). They already draw a decent amount of power depending on core count and RAM overclocking, so add a GPU on top and you've got yourself a toaster.

They drop the I/O die and move Infinity Fabric into one monolithic chip for their APUs to get around the power draw problem.

ID: gqy4bgi

Vega 1 baby

3 : Anonymous2021/03/14 12:29 ID: gqw6t9x

There's the desktop APU.

ID: gqwqci6

AIUI it doesn't use chiplets?

ID: gqwu97j

Probably because AMD hasn't manufactured APU chiplets yet? It's on the roadmap, though, so I fully expect this to be one way they approach the integrated graphics problem.

ID: gqxzwx2

All APUs are monolithic.

ID: gqxwut1

The one you can't buy?

4 : Anonymous2021/03/14 12:36 ID: gqw79li

Because they have spent most of their R&D budget on high-performance GPUs, and have traditionally gone with APUs featuring cut-down dGPUs for CPUs that ship with graphics compute. To get an iGPU on the power-efficiency level of Intel's iGPUs would require even more R&D investment, including research into the best substrate layout, which is currently optimized for their chiplet design anyway.

And, again, AMD has technically already done this with APUs, but those serve the "budget friendly" market segment, and AMD has stated a few times now that they want to build their brand upwards and not focus on being the one company you buy when your CPU budget is like a hundred bucks.

5 : Anonymous2021/03/14 13:01 ID: gqw95nf

On the leaked roadmap, mainstream Zen 4 on desktop is listed as having an RDNA GPU. So they're going to do something like this, but so far they haven't.

6 : Anonymous2021/03/14 12:53 ID: gqw8j59

There's not a lot of value in imitating the Intel strategy of "an iGPU that's only good enough to drive Windows and little else" - the high-end market is always going to partner a CPU with a dedicated GPU, and the laptop/OEM space would rather have a more high-powered iGPU like AMD is already doing: it makes the computer more capable all by itself, and office computers don't mind the performance hit from having a smaller CPU cache.

If a company wants to buy a thousand office computers that only need to run Excel, there's no "middle-ground" where they'd get significant savings from AMD having created something like a Ryzen 3100i with four cores, eight threads and a single Vega compute unit, versus simply buying a thousand units of Ryzen 3400G.

ID: gqwboyn

I don't know how big the market is, but there are people who'd like to have a more beefy CPU than the 3400G while not needing that much GPU power. But even if that market is not big, it seems like a feature that can be pretty trivially added, so what reason is there not to do it?

ID: gqwd63a

A beefier CPU than the 3400G already exists: the Ryzen 4650G and the 4750G (or even the just-leaked Zen 3-based 5300G).

The problem may well be that AMD isn't offering these parts to enthusiast consumers, but there aren't going to be any cost savings or market opportunities from a 6-core with even fewer Vega cores than the 4650G's.

ID: gqwuov4

there are people who'd like to have a more beefy CPU than the 3400G while not needing that much GPU power.

This is kind of a niche market. At the low end, this is going to be served by APUs. At the medium end (at least in Asian markets), 4000-series APUs pretty much fill this gap.

Elsewhere, it's diminishing marginal returns. If you're chasing pure compute power, you need the space on the die for CPU cores. If you want graphics, you'll need to give up real estate for GPU cores. At a certain point, you point those users to Ryzen 5 3600 and up, and sell them a Radeon RX 550 and call it good.

7 : Anonymous2021/03/14 12:57 ID: gqw8tm7

Yeah, even with a dedicated card in your system it's really nice to just have extra monitor outputs, or even an immediate backup for when your card dies. Give me one single Vega CU. Put it on the chipset, I don't care; we've already brought chipset fans back from the early 00s anyway.

ID: gqwanfg

It's excellent too for those who want to use their GPU for compute workloads and don't want compute and desktop rendering interfering with each other.
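As an illustration of that split, compute can be pinned explicitly to the discrete card while the iGPU drives the desktop. A minimal sketch, assuming PyTorch is installed and the dGPU is enumerated as device 0 (both assumptions, not anything stated in the thread):

```python
# Minimal sketch: keep heavy compute on the dedicated GPU so it never
# competes with desktop rendering (assumed to be on the iGPU).
import torch

# Assumption: the discrete card is device 0; fall back to CPU if absent.
compute_device = torch.device("cuda:0" if torch.cuda.is_available() else "cpu")

x = torch.randn(4096, 4096, device=compute_device)
y = x @ x  # the matrix multiply runs entirely on the compute device
print(y.device)
```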

ID: gqx7sf4

Or those that want to run a virtual machine on Linux with VFIO GPU passthrough to game on Windows, without having to either buy a second GPU or unplug the monitor connection from the host system to use the virtual one.
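As a quick illustration of that setup, here is a small sketch (assuming a Linux host where VFIO is already configured) that lists which PCI devices are currently bound to the vfio-pci driver, i.e. reserved for passthrough, by reading standard sysfs paths:

```python
# Sketch: list PCI devices currently bound to the vfio-pci driver
# (i.e. reserved for passthrough to a VM). Reads standard Linux sysfs
# paths on the host; the output depends on your VFIO configuration.
import os

PCI_ROOT = "/sys/bus/pci/devices"

for dev in sorted(os.listdir(PCI_ROOT)):
    driver_link = os.path.join(PCI_ROOT, dev, "driver")
    if os.path.islink(driver_link):
        driver = os.path.basename(os.readlink(driver_link))
        if driver == "vfio-pci":
            print(f"{dev} is bound to vfio-pci (passed through)")
```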

ID: gqwy5ha

even with a dedicated card in your system it's really nice to just have extra monitor outputs

Yes, it would be, but I have not been able to get the iGPU to stay enabled alongside a dGPU with Ryzen-based APUs: a 2400G (ASRock B350), a 3400G (ASRock B350) and now a 4700G (Gigabyte A520). Asking on the ASRock and Gigabyte support forums never turned up a solution.

It used to work with no problem on Bulldozer-based APUs. Display out on the IO die would behave differently, I'm sure, but it's a shame I can't seem to make use of the APU's iGPU with a dGPU present.

ID: gqyb1px

My understanding is that you cannot use the APU's integrated GPU because the iGPU uses the same PCIe lanes that the dGPU plugs into, and only one device can use those lanes at a time. With no dGPU plugged in, the lanes go to the iGPU. With a dGPU in, the iGPU is disconnected and the lanes sent to the dGPU.

I could be wrong though, I haven't thoroughly investigated. But it would explain the behavior you're seeing.

ID: gqyewjd

Interesting. For a time I was using a 3400G with a discrete GPU. I was able to drive monitors from both the dGPU and iGPU at the same time. Perhaps it's a motherboard thing? I was not able to disable the iGPU for the life of me.

ID: gqwa7yp

Right. I don't think a tiny GPU would even need a cooler; smartphones don't need one either.

ID: gqwfadz

You can't compare those.

8 : Anonymous2021/03/14 12:45 ID: gqw7xgg

SIXTEEN CORES! SIXTEEN!

9 : Anonymous2021/03/14 12:30 ID: gqw6tzk

[deleted]

ID: gqxre21

We have hundreds of workstations at work that use Intel Core CPUs with no dedicated GPU.

Basically 99% of business laptops have only an iGPU. Even though in many of those cases a better GPU would help (if I'm on a WebEx call, GPU utilisation is pretty high for doing relatively little on an Intel HD something), the fact is most businesses will just assume they don't need a dGPU and buy the cheapest laptop with maybe an i5 that meets their specs.

ID: gqwlj33

This is true, but low-powered PCs vastly outnumber higher-end gaming machines in terms of sales. Most people don't game and just need a computer. AMD could vastly accelerate its growing market share with integrated graphics.

ID: gqwrk3l

They can't. The problem is supply and demand. Right now, AMD can't even keep up with demand because they are limited to TSMC and how many chips it can produce, which is far less than what Intel can produce. That would only be true if there were supply sitting out there not being bought up, but between COVID and miners everything is just gone right now, so Intel looks like it's gaining market share even though it isn't really; the pool of buyers is just larger and was filled in with whatever was left over.

10 : Anonymous2021/03/14 14:55 ID: gqwgux0

Indeed, a built-in GPU is a lifesaver if your discrete GPU is not functioning for some reason. It's also great for driving a secondary monitor so it won't affect the performance of the discrete GPU.

Adding a small GPU chiplet shouldn't be that expensive. And with DDR5, there should be plenty of bandwidth to make it work well.
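As a rough sanity check on that bandwidth point, theoretical peak numbers for a dual-channel desktop setup, assuming standard DDR4-3200 and DDR5-4800 transfer rates:

```python
# Theoretical peak bandwidth: channels * bus width in bytes * transfer rate.
def peak_gbs(mt_per_s, channels=2, bus_bits=64):
    return channels * (bus_bits / 8) * mt_per_s / 1000  # GB/s

print(f"DDR4-3200: {peak_gbs(3200):.1f} GB/s")  # ~51.2 GB/s
print(f"DDR5-4800: {peak_gbs(4800):.1f} GB/s")  # ~76.8 GB/s
```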

ID: gqxat2i

Having already sent 2 GPUs to silicon heaven, the priority is to get a minimum 1080p display output; gaming can take a back seat.

If my current GPU, which is on its last legs, conks out, I would be left with the odd CPU/GPU combo of a Ryzen 5 paired with a GeForce 8400 GS just to get a display up and running. Graphics cards, with the exception of the GT 710, are out of stock here in the UK unless you are willing to pay scalper prices on last gen and new gen.

ID: gqxc6nu

Yeah, when the VRM on my old AMD 290 put on a nice light show and released the magic smoke, I was pretty happy to have an Intel with a built-in GPU. I will go AMD next year with the introduction of DDR5, but I hope by then AMD will stop skimping on built-in graphics. It comes off as being cheap now.

11 : Anonymous2021/03/14 15:47 ID: gqwmdl1

That will allegedly finally come with Zen4/Raphael.

12 : Anonymous2021/03/14 22:00 ID: gqy7qvu

Guess you never heard of all the AMD APUs in existence, eh? Need to catch up, rapidly... ;) Where did you get this notion? ;) AMD has been selling CPUs with integrated GPUs for years. Not only that, but those APUs have been trouncing Intel iGPUs for years too: way superior at 3D acceleration, for instance.

13 : Anonymous2021/03/14 16:18 ID: gqwpxpz

You are looking at the wrong product then. AMD has APUs along with plain CPUs, so they are already doing it. Having extra stuff also costs money, so at the end of the day you would be paying slightly more to have both for no real gain. I agree I would like to see a basic graphics chip for testing when the GPU dies, but I wouldn't really focus on its performance, and maybe the motherboard could take on that role instead.

As for a company computer, while an iGPU would be nice, the 3000G-5000G would be more than enough for the everyday user, as most programs in this sector don't require much beyond running a few monitors and QuickBooks/Excel/etc. For anything more, I would look into a workstation/desktop setup, as there is a big jump between what an iGPU can provide even with proper cooling and what a full-on dGPU can do.

14 : Anonymous2021/03/14 13:56 ID: gqwbhkh

Intel's iGPU is good for two things: video encode/decode and virtualization (GVT-g). AMD unfortunately doesn't seem to care about its VCE/AMF implementations, despite ATI Radeons traditionally being the best video cards for accelerated video playback, video capture, etc.
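For context on what hardware encode looks like in practice, a sketch of offloading H.264 encoding to the GPU's video engine via VAAPI; it assumes an ffmpeg build with VAAPI support and a render node at /dev/dri/renderD128, both of which vary by system and driver:

```python
# Sketch: offload H.264 encoding to the GPU's video engine via VAAPI.
# Assumes ffmpeg was built with VAAPI support and that the GPU exposes
# a render node at /dev/dri/renderD128 (system-dependent).
import subprocess

cmd = [
    "ffmpeg",
    "-vaapi_device", "/dev/dri/renderD128",  # GPU render node (assumption)
    "-i", "input.mp4",
    "-vf", "format=nv12,hwupload",           # upload frames to the GPU
    "-c:v", "h264_vaapi",                    # hardware H.264 encoder
    "output.mp4",
]
subprocess.run(cmd, check=True)
```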

ID: gqwn9hp

You're forgetting the biggest upside to integrated graphics: no dGPU means cheaper computers. Most people just need a computer, not a gaming machine. This is why Intel still has such a large market share for the vast majority of non-specialized workstation PCs like the Dell OptiPlex. I know integrated graphics are of little consequence to gamers and enthusiasts, but if AMD could make integrated graphics the norm for their chips they would greatly accelerate how quickly their market share is growing.

ID: gqwovdj

...all Ryzen 3000/4000/5000 laptop parts have iGPUs.

15 : Anonymous2021/03/14 14:38 ID: gqwf7xu

They probably will do something similar for Zen 4/Zen 5.

Why didn't they with Zen 1/2/3? Cost, and too much to do in one go; they were very limited on time and R&D resources back then.

16 : Anonymous2021/03/14 15:27 ID: gqwk5u3

They will do that with Zen 4. In one of the leaked roadmaps, Navi 2 was mentioned for desktop Ryzen processors (Raphael, not the Phoenix APU).

17 : Anonymous2021/03/14 16:36 ID: gqwryvg

Because Zen dies and IO dies are designed for servers and repurposed for desktop.

18 : Anonymous2021/03/14 17:14 ID: gqwxgse

What I'd like is a dedicated video encoder in the chipset; that would be awesome.

19 : Anonymous2021/03/14 18:12 ID: gqx70g2

Agreed. For troubleshooting or Windows-only use it would be great.

20 : Anonymous2021/03/14 20:17 ID: gqxrowk

They do have a GPU built into many of the lower-range CPUs (APUs). I believe their thinking is that if someone needs 16 threads, they probably need a faster GPU as well.

21 : Anonymous2021/03/14 20:55 ID: gqxxr9v

Heat is the reason.

22 : Anonymous2021/03/14 21:09 ID: gqxzz1b

Sorry, but for the desktop market the question only arises because you're assuming this built-in GPU would be basically free. If you had, side by side, a CPU with and without graphics, and the price difference was the same as a basic dedicated GPU such as a GT 710 or 1030, would you still be interested in the CPU with built-in graphics? For the chipset it's even worse: are they going to offer motherboards with and without graphics? For desktops at least I think AMD's decision is the right one: just get a dedicated GPU for graphics, so you can choose exactly what you want. No point in including built-in graphics when most people won't even use them.

23 : Anonymous2021/03/14 21:51 ID: gqy6e0c

I wondered this myself; they could release 5600G, 5800G, etc. versions for a little bit of extra money, but I'm guessing maybe it wouldn't be worth it in comparison to Intel. I'd like to see some cool features incorporated with the iGPU; I remember that Adobe has iGPU acceleration, and the new 11th gen will have more encoding-related features if I remember correctly. Personally I was using the iGPU while mining on my everyday PC; it helped offload the GPU and gained a few MH/s.

24 : Anonymous2021/03/14 23:50 ID: gqykj2j

Pretty sure AMD planned to do that with RDNA 1, but they struggled with the MCM design and couldn't, otherwise they would have. RDNA 3 will have MCM, and apparently APUs with GTX 1080 performance are coming.

25 : Anonymous2021/03/14 23:50 ID: gqykjzn
ID: gqykkqj
27 : Anonymous2021/03/14 13:42 ID: gqwais1

You have Ryzens and Athlons with APUs...

Like the A6-9500.

28 : Anonymous2021/03/14 21:14 ID: gqy0w1f

For desktops you already have basic PCIe GPUs, some even running on a single PCIe x1 lane; no point in having more devices.

29 : Anonymous2021/03/14 21:32 ID: gqy3rd0

PCIe is an inferior solution in ease of use, general compatibility (for cases or boards with no room for such a card), and price.

Source: https://www.reddit.com/r/Amd/comments/m4ufq6/why_does_amd_not_put_a_tiny_gpu_core_in_the_io/
