6600 XT bus width only x8?

1 : Anonymous2021/08/11 10:14 ID: p2a6q0

how do i know before buying a gpu what bus width it has?

i tried looking at the spec sheets of a couple of aib 6600 xt cards but they all just say "gen 4" without disclosing that the bus width is only x8

were there other gpus in the past that were only x8 while physically looking like x16? i'm asking because i have a gen 3.0 motherboard and i'm afraid of upgrading to a gimped gpu in the future
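One practical way to verify the link, at least on a card already in hand, is to read the attributes the Linux kernel exposes in sysfs. A minimal sketch, assuming a Linux box; the PCI address 0000:03:00.0 is a placeholder for whatever `lspci` reports for your GPU:

```python
from pathlib import Path

# Placeholder PCI address; find your GPU's with `lspci | grep -i vga`
gpu = Path("/sys/bus/pci/devices/0000:03:00.0")

# The kernel exposes the negotiated and maximum link parameters
# as plain-text sysfs attributes on every PCIe device.
for attr in ("current_link_speed", "current_link_width",
             "max_link_speed", "max_link_width"):
    print(attr, "=", (gpu / attr).read_text().strip())

# Illustrative output for a 6600 XT sitting in a Gen3 slot:
#   current_link_speed = 8.0 GT/s PCIe   (Gen3 rate)
#   current_link_width = x8
#   max_link_speed = 16.0 GT/s PCIe      (the card itself is Gen4)
#   max_link_width = x8                  (the giveaway: the card tops out at x8)
```

Before buying, third-party spec databases are still the more practical route, since (as noted below) the official spec pages rarely list the link width.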

2 : Anonymous2021/08/11 10:18 ID: h8inxw3

Also the RX 5500 XT

ID: h8jq9sm

also RX460 / 560 🙂

3 : Anonymous2021/08/11 10:46 ID: h8ipysu

According to Hardware Unboxed, PCIe 4.0 x8 is 3% faster on average, or 5% if you include Doom Eternal, which seems to be heavily impacted by PCIe speed.

Although Doom Eternal does make you wonder if there are any other games out there that'll have similar results.

ID: h8iq90q

the doom eternal result is the one that got me worrying

ID: h8jn803

You probably should be worried; most games are moving towards being more demanding in the future. I expect this card may not age gracefully, kind of like how the 3060 Ti may age worse than a 3060 will due to VRAM.

Considering how well Doom Eternal is optimised, you would expect it to be one of the least-hit games performance-wise. But it may also be so well optimised that it can take advantage of any extra speed your GPU encounters, accentuating the performance difference between PCIe versions. Who knows, really.

ID: h8j04d1

Although Doom Eternal does make you wonder if there are any other games out there that'll have similar results.

Yes, a number of other games have been found where dropping to PCIe 3.0 has a serious impact, especially on frametimes / 1% low fps.

CoD: Black Ops Cold War, Death Stranding, Gears 5, Watch Dogs Legion

Only slightly affected were:

AC: Valhalla, Control, Horizon Zero Dawn

ID: h8j2737

This is for PCIe 3.0 x8, right? I'm running x8 on my current Titan X Pascal because I'm out of PCIe lanes. If/when I get a new GPU, I guess it would be worth upgrading my mobo and maybe my CPU if needed. I'm on a 5800X and an ASUS X470 Crosshair VII Hero. I already have two NVMe SSDs (one on the chipset lanes), my GPU at x8, and my network card at PCIe x8 (Intel X520).

ID: h8k6vaj

Wolfenstein Youngblood will probably yield similar results

4 : Anonymous2021/08/11 11:22 ID: h8isqb4
ID: h8l51qt

RX 6600 XT                      Gen4     Gen3     Gen3 vs Gen4

Fire Strike                     27019    26904    99.6%
Fire Strike Graphics            28873    28737    99.5%
Fire Strike Extreme             12827    12815    99.9%
Fire Strike Extreme Graphics    13407    13390    99.9%
Fire Strike Ultra                6756     6750    99.9%
Fire Strike Ultra Graphics       6672     6664    99.9%
Time Spy                        10236    10206    99.7%
Time Spy Graphics                9691     9650    99.6%
Port Royal (points)              4498     4452    99.0%
Port Royal (fps)                 20.8     20.6    99.0%

Source:

Only about a 1 percent difference, really.

ID: h8iyf7a

Alas, I think this testing is somewhat suboptimal; synthetic benchmarks like 3DMark barely stress the PCIe bus. Kinda like W1zzard's VRAM testing.

5 : Anonymous2021/08/11 10:22 ID: h8io6d5

This was also a problem with the 5500 XT on PCIe gen 3 boards (mostly with the 4GB version). As long as you stay under 8GB of VRAM usage, it shouldn't affect performance.

6 : Anonymous2021/08/11 10:39 ID: h8ipfiz

They probably reused the same PCB design they used for the 5500 XT, just to make it super cheap to manufacture.

7 : Anonymous2021/08/11 10:28 ID: h8ionkc

Go look at the LTT review at 5:35.

ID: h8j1nzq

They reused the 5500XT PCB

ID: h8l4247

They seem to assume you have a PCIe 4.0-capable system in that video. This card can only do Gen 4 x8 or Gen 3 x8. It cannot do Gen 3 x16 (that is, the normal bandwidth for all last-gen and earlier cards).

8 : Anonymous2021/08/11 10:35 ID: h8ip5ky

x8 is not rare; many AMD 128-bit cards had it: the RX 460, RX 5500 XT, and now the 6600 XT. For the 6600 XT, the performance difference on PCIe 3.0 vs 4.0 is not huge, ~3-5% on average at 1440p, and can probably be partially avoided by using slightly less bandwidth-intensive settings.

ID: h8jh13n

Good to see the $380 card is being held to the same standards as a 460

ID: h8l7lc8

They pretty clearly didn't design this chip for a $379 card; they designed a laptop chip, and when it was done, the market was such that they could charge $379 for it. Which they have done. If you don't have any options, this is what you buy. If you have a card, don't upgrade. $379 is what the vanilla 6700 should cost, or even the 6700 XT.

ID: h8iq0fc

i see, so the performance downgrade HUB ran into with Doom Eternal at 1080p is an exception?

ID: h8ir281

As Steve himself says, he picked a setting that is not useful/reasonable, just to show the bottleneck. It can be avoided, and it is an open question whether future games will make the issue more common, or whether wider use of, say, variable rate shading will reduce the pressure on VRAM bandwidth in games.

9 : Anonymous2021/08/11 18:40 ID: h8kd8a7

marketing it as a gaming card but gimping it with x8 should be a crime

10 : Anonymous2021/08/11 12:05 ID: h8iwmf6

I am running my 3090 at x8 PCIe 3.0 and encounter no significant performance issues. I checked it with benchmarks like Fire Strike and Time Spy. I get the same FPS numbers as other 3090 cards with the same clocks.

11 : Anonymous2021/08/11 10:34 ID: h8ip26e

You'll get around 5% less performance since you're using it on a gen 3.0 slot…

ID: h8ipqxo

I think it very much depends on the game and how memory-intensive it is. Many games do not show a difference at all, but in others (most notably Doom Eternal, but also Death Stranding, Gears 5, CoD: Black Ops Cold War) the difference is non-negligible.

Reviewers such as HWUB have expressed concerns that we could see the share of such games increasing going forward.

ID: h8ipxe3

Yeah, I just replied here after watching their video. Their six-game average showed around a 5% difference, and hence I mentioned the same…

12 : Anonymous2021/08/11 11:05 ID: h8ircxs

Yeah AMD product spec pages are a joke

13 : Anonymous2021/08/11 13:25 ID: h8j56mm

The only two I can think of that are x8 are the 5500 XT and the 6600 XT.

ID: h8khu6b

A lot of older AMD (and Nvidia?) cards are, but when I say older I'm talking RX 460. Absolutely no card should be x8 in today's market, especially when it just serves to waste components.

ID: h8l75ah

AMD does it for laptop chips, where it is both desirable (you need the few PCIe lanes you have available for other things, such as M.2 storage) and a cost saving. Nvidia has taken it one step further and made an x4 chip (GP108, used on the desktop in the GT 1030). That chip was designed for laptops and reused for the low-end desktop GPU. The only problem this time is that the card costs way too much for what it was designed for.

14 : Anonymous2021/08/11 18:26 ID: h8kb1r4

It probably doesn't matter that much for that card.

15 : Anonymous2021/08/11 21:52 ID: h8l4wox

It's really a one percent difference if you tax the PCIe 3.0 bus vs PCIe 4.0.

Basically, PCIe 4.0 x8 is technically PCIe 3.0 x16.

You won't miss out on a lot. And capable enough boards allow for PCIe bus overclocking; you can actually push the base clock to 112 MHz or so, and it provides a decent boost in bandwidth and speed.
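As a sanity check of that equivalence claim, a minimal sketch; the per-lane figures are the standard PCIe effective data rates, and linear scaling with the 100 MHz reference clock is an assumption:

```python
# Effective per-lane bandwidth in GB/s after 128b/130b encoding overhead
PER_LANE_GBPS = {"3.0": 0.985, "4.0": 1.969}

def link_bandwidth(gen: str, lanes: int) -> float:
    """Theoretical one-direction link bandwidth in GB/s."""
    return PER_LANE_GBPS[gen] * lanes

print(link_bandwidth("4.0", 8))   # ~15.75 GB/s
print(link_bandwidth("3.0", 16))  # ~15.76 GB/s -> effectively the same link
print(link_bandwidth("3.0", 8))   # ~7.88 GB/s -> what the 6600 XT gets on a Gen3 board

# Assumed linear scaling if a board lets you raise the 100 MHz
# PCIe reference clock to 112 MHz, as suggested above:
print(link_bandwidth("3.0", 8) * 112 / 100)  # ~8.83 GB/s, ~12% more
```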

16 : Anonymous2021/08/11 10:22 ID: h8io5u8

Unless you're gaming at 2K/4K, you're never going to use over x8. The card is meant for 1080p gaming after all, so it makes sense.

ID: h8ipyuy

you're never going to use over x8

If you mean PCIe 4.0 x8 then sure, that is not a bottleneck for the performance tier of this card. But OP has a PCIe 3.0 system, where even at 1080p the x8 can be limiting as found by several 6600XT reviews.

ID: h8iro2s

I suppose there would be roughly a 4% difference in performance when comparing PCIe 3.0 vs 4.0 x8 if you were maxing out all 8GB of video memory... Not the easiest task to perform at 1080p.

PCIe 4.0 x8 = 15752 MB/s
PCIe 3.0 x8 = 7880 MB/s

8192 - 7880 = 312 MB
312 / 8192 = 0.038
0.038 * 100 = 3.8%
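To put those numbers in frametime terms, a small sketch; the per-second spill figure is a made-up example, and treating the bus as a simple pipe is a big simplification:

```python
# Theoretical one-direction bandwidth in MB/s, from the figures above
GEN4_X8_MBPS = 15752
GEN3_X8_MBPS = 7880

def bus_time_ms(megabytes: float, bandwidth_mbps: float) -> float:
    """Milliseconds of bus time needed to move `megabytes`."""
    return megabytes / bandwidth_mbps * 1000

# Hypothetical game streaming 500 MB/s of textures that spilled past
# the 8GB frame buffer into system RAM:
spill_mb = 500
print(bus_time_ms(spill_mb, GEN4_X8_MBPS))  # ~31.7 ms of bus time per second
print(bus_time_ms(spill_mb, GEN3_X8_MBPS))  # ~63.5 ms -> twice as long, which is
# where the frametime spikes in VRAM-heavy games come from
```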

ID: h8ir66a

$380 1080p card. riiight

ID: h8iulqd

$380 is within price range for a low tier GPU. You could always go with the 3060 Ti for $549 and get 10-15 more FPS @ 1080p.

ID: h8iyofe

Unless you're gaming at 2K/4K you're never going to use over x8. The card is meant for 1080p gaming after all, so it makes sense.

Myth. Ironically, 1080p is the resolution that shows the biggest difference, except for cases where VRAM is heavily stressed.

At 1080p, while each frame's buffer is smaller, it is computed so fast that the overall traffic between the CPU and GPU is much higher than at 4K.

ID: h8iywrv

The card is meant for 1080p gaming after all, so it makes sense.

The card performs well at 1440p. Why care about what marketing says when the results speak for themselves?

ID: h8iq3mw

Unless you're gaming at 2K

The card is meant for 1080p

2k is 1080p, unless you think 2560 is closer to 2000 than 1920 is.

ID: h8iqnet

2k is 2560x1440 AKA 1440p.

ID: h8itwh4

Sorry, man. 2k is 1440.

ID: h8irzq6

2K is a resolution of 2560x1440 when referring to computer monitors; a resolution of 2048x1080 is used when referring to official cinema media content.

ID: h8ipqmt

Wait, wasn't it the other way around, i.e. the perf issues show up at high frame rates, not when you are bottlenecked by the GPU itself while it is struggling?

Source: https://www.reddit.com/r/Amd/comments/p2a6q0/6600_xt_bus_width_only_x8/
