Were the AMD FX 8000 series of CPUs bad?

1 : Anonymous2021/10/04 14:25 ID: q16gwk

If so what was wrong with them?

2 : Anonymous2021/10/04 15:04 ID: hfcw1w9

They functioned at the time. I remember once upon a time when I had an Xbox One X and my friend had the "8 core" FX chip with a 1060 6GB... I sure wished I had his PC over my console lol.

ID: hfg52kv

I still use an FX 8350, overclocked, paired with an RX 570 in a gaming rig I built for my son. Runs Titanfall 2, Raft, Fortnite and many other games quite well at medium to high settings at 60 fps.

ID: hfgbn1j

I only recently upgraded my 8320; it needed to be overclocked to at least 4 GHz to avoid stuttering issues in games like Battlefield 3 and 4.

The 8350 ran at 4 GHz base; the 8320 was only 3.5 GHz base, but it overclocked well.

ID: hfhbt26

That's awesome!

3 : Anonymous2021/10/04 15:10 ID: hfcwuew

Were the amd fx 8000 series of CPUs bad?

If so what was wrong with them?

The FX 8000 series is not bad by itself. Its IPC is only a little lower than K10's, and it was the first mainstream CPU with 8 cores.

But by the time AMD launched the FX 8000 series, Intel had launched chips using the Nehalem and then Sandy Bridge microarchitectures, which have significantly higher IPC; cf.

https://www.reddit.com//comments/5v11tm/ipc_performance_of_intel_and_amd_cpus_2004_to/

And single-core performance is crucial for games. In 2011-2014, no game used more than 4 cores.

ID: hfdxfzn

Only 8 cores in terms of how many integer units it had. Talking FPUs, it only had 4. The same goes for a lot of the supporting hardware.

ID: hfeb46f

And the L2 cache was shared too. Along with the entire front end.

ID: hfcx2ja

Yeah poor AMD. But now ryzen is king and games are using more than 4 cores

ID: hfd2ieq

Yeah but it's a cautionary tale of how hitting an unexpected cul-de-sac of performance gains on an architecture can set you back for years. AMD almost went bankrupt. Ryzen is king now because Intel have been stuck on 14nm for ages and that has compromised the efficiency of their entire product stack.

Things can very easily swing around again.

ID: hfgrbux

Even in modern games the FX 8000 would still get trounced by a 3770K.

ID: hfdnlt7

Bulldozer had terrible draw call performance. Piledriver had below-Core 2 performance at processing them.

ID: hfe347q

That seems like a software issue, as Oxide Games demonstrated that a Piledriver chip downclocked to 2 GHz could process far more draw calls, via Mantle, than the GPUs of the time could handle.

ID: hfemfiw

Apples to apples, yeah.

Apples to oranges, an FX CPU cost much less than an Intel Core, could socket into a much cheaper motherboard and could take the cheapest DDR3 memory you could find. Put all that together and the AMD system cost much less than the Intel system, money savings you could throw at your video card instead.

It was never FX vs. i5. It was FX + GTX N80 vs. i5 and GTX N60.

ID: hfgnlnl

And I really don't regret buying an i5 2500k with my 7850 back in the day, instead of an FX processor with a 7950.

The 2500k lasted me 2 more GPU upgrades; would the FX have done the same? (answer is a big fat no, tbh)

ID: hfd86kx

That's the thing. An item can be good or bad depending on what the competition is offering and the price.

Vs the Intel offerings it was pretty bad, but at a very low price it was okay-ish.

Nowadays, it's comparable to the 11400KF situation. The 11400KF is not a very good processor by any means, but its performance is around the 3600, AMD's last-gen budget processor, while being quite a bit cheaper.

As the saying goes, there are no bad products, just bad prices.

EDIT: Sigh, will people stop saying the 11400 is a good CPU. I was using it as an example of how pricing can make a bad product good.

The 11400 is only good due to its price. It's a current gen product that barely beats a previous gen budget CPU, but is a damn good deal due to its low price.

An 11400 vs a 5600 at similar prices would be a shitty deal. Again, the price is the deciding factor; hence, there are no bad products, only bad prices.

ID: hffd862

The 11400 is way, way closer to the competition than the FX 8000 series ever was.

ID: hfe6pqm

How is the 11400KF a bad processor by any means? Yeah, it's not top of the line, and it's not aiming to be. It gives you 95% of the 5600X for half the price. Unless that 5% is worth paying almost double for, that's on you.

ID: hfgr0hv

Edit: so every single pre-Zen 3 CPU was a bad gaming CPU, only made OK by price :/

Remember, the 3700X (mid 2019) still lost to the 8700K (late 2017) in gaming :/ not even "barely beating" it.

It straight up LOST.

ID: hfgy132

There's a reason the 11400 is super popular and in high demand: it's so cheap for the performance it gives.

ID: hfgq4di

The 11400F easily beats the 3600 in any game, and relative to the competition it's miles better than any construction-equipment CPU ever was.

ID: hfgqvaw

Nowadays, its comparable to the 11400KF situation

Not even remotely close.

The 11400 is faster than the 3600, offers comparable gaming performance to the 5600x and more importantly is cheaper than both of them (at least where I live).

Whereas the FX CPUs simply weren't competitive on any front.

Sigh, will people stop saying the 11400 is a good CPU. I was using it as an example of how pricing can make a bad product good.

But it isn't a bad product?

4 : Anonymous2021/10/04 17:02 ID: hfddikr

The many affordable cores made the 8000 series fantastic chips for office work and general use. This gave them a very long useful lifetime as more things became multithreaded. The biggest downside was that they ran hot and ate power. Also, a lot of true believers overhyped these chips, and the people fooled by them were in for a rude surprise in the latest games.

In demanding games they were completely outclassed by Intel's stuff, due to single-core performance. Intel's chips were much more efficient as well. However, they were double or triple the price; 2021 prices for everything can make that difference seem quaint now, but back then even $20 meant a lot and divided people into different groups.

I personally ran the Athlon X4 860K for many years, until Ryzen came out. The 4 cores simply wouldn't bog down unless I was doing ridiculous multitasking and the memory leaks in Firefox/Chrome reached critical mass. The only games I like are all old as hell, so that's never been a concern.

ID: hfdexrj

I had an FX 8350 in my home server. It was amazing and I would have kept using it had the cooler not failed and melted onto the chip. The replacement, a Ryzen 2200G, was actually slower for many of the things it was doing. I had to upgrade it to a 2700X to get past the FX performance.

5 : Anonymous2021/10/04 18:51 ID: hfdv6t1

If so what was wrong with them?

A lot of things were wrong with them. I wrote about this on /

a week ago, but here's an excerpt.

Here's what in general made Bulldozer bad for gaming, at a low level:


The front-end could decode only 4 instructions per cycle, shared per (2-core) module, down from 3 per core in K10. This got fixed in Steamroller, with each logical core getting its own dedicated 4-wide decoder. The front-end also didn't have any sort of macro-op (not micro-op) fusion, which lets some x86 instructions get combined before decoding, effectively widening the front-end.


The longer Bulldozer pipeline made the branch misprediction penalty higher (20 clocks!), higher than K10 (12), Core 2 (15), Nehalem (17) or Sandy Bridge (14-17). The cost of a longer pipeline shows up here. AMD's branch predictor was better than any of their previous designs, but still significantly worse than Intel's (though Intel had had better branch prediction for a fairly long time, even when AMD was competitive). Bulldozer also didn't have any sort of μop cache like Sandy Bridge did (though one was added in Steamroller), which exacerbated the branch misprediction penalty.
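To put those penalty numbers in rough perspective, here's a quick back-of-the-envelope Python sketch. The base CPI, branch frequency and misprediction rate are made-up illustrative figures, not measurements; only the 20 vs 14 cycle penalties come from the numbers above.

```python
# Illustrative cost of branch misprediction on the overall cycles-per-
# instruction (CPI). Assumptions: base CPI of 1.0, one branch every 5
# instructions, 5% of branches mispredicted.

def cpi_with_mispredicts(base_cpi, branch_freq, mispredict_rate, penalty):
    # Extra cycles per instruction lost to flushing the pipeline.
    return base_cpi + branch_freq * mispredict_rate * penalty

bulldozer = cpi_with_mispredicts(1.0, 1 / 5, 0.05, 20)  # 20-cycle penalty
sandy     = cpi_with_mispredicts(1.0, 1 / 5, 0.05, 14)  # 14-cycle penalty

print(f"Bulldozer-like CPI: {bulldozer:.2f}")     # 1.20
print(f"Sandy Bridge-like CPI: {sandy:.2f}")      # 1.14
```

Even with identical prediction accuracy, the longer pipeline alone costs a few percent; Bulldozer's worse predictor made the gap bigger still.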


The ALU was only 2-wide (vs 3-wide in K10), and the single 2-wide FPU was shared between two logical cores. Thus, per-core floating point execution resources were low.


High-latency L2 and L3 caches, more than double that of Sandy Bridge CPUs. Gaming is particularly sensitive to cache latency, just like most games are more sensitive to RAM latency than bandwidth. Low L1 cache associativity: a 2-way associative L1 instruction cache vs 8-way for Intel's CPUs. This was likely another die space tradeoff; the 2-way cache takes up less space, but it makes it more likely that one thread's instructions boot out the cached instructions of another thread, and cache misses are really painful. (This was improved to 3-way in Steamroller.)

So, in the end you get an architecture which is poor for gaming because of:

- Low per-core execution resources, and thus an architecture that's less able to extract instruction-level parallelism.
- Very high cache latencies compared to K10 and Sandy Bridge.
- High branch misprediction penalties.
- A 15-20% per-thread performance penalty in integer workloads for executing two threads on the same module, due to the various shared cache/prefetch/decode hardware.
- A 10-20% overall decrease in floating point performance when executing two threads on the same module, due to shared hardware and scheduler conflicts.
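Those module-sharing penalties are easy to put into rough numbers. A quick Python sketch; the 18% figure is just the midpoint of the 15-20% integer penalty range quoted above, so treat the result as illustrative only:

```python
# How a per-thread sharing penalty shrinks "8 cores" into fewer
# effective cores for fully-threaded integer work.

def effective_cores(physical_cores, per_thread_penalty):
    # Every thread loses a fixed fraction of its throughput when both
    # halves of each module are busy.
    return physical_cores * (1 - per_thread_penalty)

print(f"{effective_cores(8, 0.18):.2f}")  # 6.56
```

So in the worst case the "8-core" FX behaved more like a 6.5-core chip in integer workloads, on top of the low per-core IPC.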

Bulldozer was trash and almost bankrupted AMD.

ID: hfe5zgz

All of this made it especially disappointing compared to the Phenom II.

ID: hfdx94j

I see, man. Thanks for such a detailed answer. I have some questions though. What's decode, branch prediction, execution and cache?

ID: hfe4i27


After an instruction is fetched from memory, the CPU needs to figure out what it's supposed to do with the instruction. The decoder is the part of the CPU responsible for that task. The more instructions you can decode during a clock cycle, the better. But if decode isn't the bottleneck in the whole instruction cycle, improving decoder efficiency isn't that crucial. AMD chose to share the decoder between two cores, which often led to slowdowns, especially when both cores were heavily loaded.


Branch prediction is where the CPU tries to guess whether a conditional jump is taken before it's definitely known. This is done because waiting for the conditional jump to be confirmed wastes CPU cycles. So instead of waiting and doing no work at all, we try to get a head start. If we're right, we get a speedup and the job gets done earlier; if we're wrong, we have to start the execution all over again and empty the pipeline.

Bulldozer had a worse branch predictor than Sandy Bridge, so a Bulldozer chip had to start over more often than a Sandy Bridge chip. And since the pipeline was longer on Bulldozer, each branch mispredict led to a longer delay before the CPU started spitting out results.
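If you want to see the idea in code, here's a toy version of the classic 2-bit saturating-counter predictor, the basic building block that real predictors (in both Bulldozer and Sandy Bridge) are far more sophisticated versions of. This is a teaching sketch, not either chip's actual design:

```python
# Toy 2-bit saturating-counter branch predictor.
# States 0-1 predict "not taken", states 2-3 predict "taken"; each
# actual outcome nudges the counter one step toward itself.

def predict_run(outcomes, start_state=2):
    state = start_state
    mispredicts = 0
    for taken in outcomes:
        predicted_taken = state >= 2
        if predicted_taken != taken:
            mispredicts += 1
        # Saturating update toward the actual outcome.
        state = min(3, state + 1) if taken else max(0, state - 1)
    return mispredicts

# A typical loop branch: taken 9 times, then falls through once.
loop = [True] * 9 + [False]
print(predict_run(loop))  # 1 mispredict out of 10
```

The 2-bit counter handles loop-like branches well (one miss per loop exit) but gets every other prediction wrong on a strictly alternating pattern, which is exactly the kind of case modern history-based predictors were invented for.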


This is the part in the instruction cycle where we perform mathematical or logical functions on values. So, if we've fetched and decoded an instruction that says "ADD 2 to the register called A1", this is where we calculate A1 + 2.

I mentioned that the ALU was 2-wide in Bulldozer; this basically means that one core can perform two ALU operations at the same time. Their previous K10 could perform three. AMD said they removed the third ALU since it often went unused; however, this meant that an ALU-heavy and heavily threaded workload like rendering often saw performance regressions with Bulldozer.
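A simple way to see why width matters: given a batch of independent integer ops, a narrower core simply needs more cycles to issue them all. This ignores dependencies, scheduling and everything else a real core does; it's only meant to show the best-case arithmetic:

```python
import math

# Best-case cycles to issue a batch of fully independent ALU ops
# on cores of different issue widths.

def cycles_to_issue(num_ops, alu_width):
    return math.ceil(num_ops / alu_width)

print(cycles_to_issue(12, 2))  # 6 cycles on a 2-wide core (Bulldozer-like)
print(cycles_to_issue(12, 3))  # 4 cycles on a 3-wide core (K10-like)
```

In practice real code rarely keeps every ALU busy, which is why AMD's bet wasn't crazy on paper, but ALU-saturating workloads did regress exactly as described above.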


Caches are a type of memory used in CPUs. Caches are smaller than your main system memory (for example, a typical FX-8150 system would have something like 8GB or 16GB of DDR3 memory, but only 8MB of L3 cache). However, the cache is much faster.

Just like you keep the stuff you need most often on your desk, a CPU keeps the stuff it needs most often in the various levels of cache it has. But if we need something that can't fit inside of the cache (or on top of desk), we need to go to main memory (or to our storage closet).

If our caches are slow, it takes longer to fetch stuff from the cache, meaning we have to wait longer before we can start working. Bulldozer suffered badly from slow caches. Memory access was also slower on Bulldozer compared to Sandy Bridge. Sandy Bridge users had their storage closet closer to their desks, in other words, and they had an easier time picking up stuff from their desks too.
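The textbook way to quantify this is average memory access time (AMAT): hit time plus miss rate times the cost of going a level further out. The cycle counts below are rough illustrative ballpark numbers, not measured figures for either chip; the point is only that a slower L2 drags the average up:

```python
# Two-level AMAT sketch: L1 misses pay the L2 latency, L2 misses pay
# the trip to main memory. All latencies in CPU cycles (illustrative).

def amat(l1_hit, l1_miss_rate, l2_hit, l2_miss_rate, mem_latency):
    return l1_hit + l1_miss_rate * (l2_hit + l2_miss_rate * mem_latency)

slow_l2 = amat(4, 0.05, 21, 0.5, 200)  # Bulldozer-like high-latency L2
fast_l2 = amat(4, 0.05, 12, 0.5, 200)  # Sandy Bridge-like L2

print(f"{slow_l2:.2f} vs {fast_l2:.2f} cycles")  # 10.05 vs 9.60
```

A fraction of a cycle per access sounds small, but games issue enormous numbers of pointer-chasing memory accesses, so these averages compound into the frame-time spikes people remembered FX for.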

ID: hfegs4v

I had both, and the 8350 held its own at a lower price.

At the time I had an i7 3770k, i5 4460k, FX 8350 and FX 8310, and the AMD chips did fine. My wife preferred the AMD systems as they were snappier in Win 7.

6 : Anonymous2021/10/04 17:33 ID: hfdik2c

There are no bad products, just bad prices

7 : Anonymous2021/10/04 17:49 ID: hfdl26e

No. AMD bet on multicore and high integer loads; software took a bit too long to catch up, so the FX series was met with mostly negative criticism at the time of release.

8 : Anonymous2021/10/04 14:44 ID: hfctbu9

I had my 8320 for 7-ish years before I needed to upgrade. I had it paired with a 4GB 390X and had no issues playing games or doing anything else. My PC was much faster than my dad's way more expensive Intel-powered HP, which was likely because his PC came loaded with so much bloat. I think the only place my PC didn't shine was converting large video files.

9 : Anonymous2021/10/04 14:28 ID: hfcr78c

They weren't THAT bad performance-wise, but they had insanely bad efficiency (up to a 220W TDP at the top of the stack), and they were weaker in an era where single-threaded performance was important.

ID: hfcr9a3

So if they, like, came out today, would they be good?

ID: hfcrdzv

At the same clock speed, they're 52% slower than first gen Ryzen

ID: hfcrdb8

Unless they cost around a few bucks, nope. Even laptop CPUs would be better.

10 : Anonymous2021/10/05 02:18 ID: hffl7v3

I had the FX 9590 black edition, so I can't comment on the 8000 series, but I had that computer for 8 years; my GPU and hard drive space needed upgrading before the processor did. I formatted it and gave the computer to my buddy, who was composing music on a laptop that was midrange in 2012. So he got a major upgrade for free.

11 : Anonymous2021/10/05 07:31 ID: hfgdqrv


User POV: power hungry and quite slow.

AMD POV: sold for little, expensive to produce (the die was quite large at 315mm²).

The architecture was an oddity (8 integer cores, 4 FPUs) and was slower than its predecessor in most if not all single-threaded tests.

12 : Anonymous2021/10/05 08:29 ID: hfgha0t

Actually, the Hardware Unboxed video I wanted to post was this one, the latest on the channel. It covers Intel's advancement alongside AMD's, and explains better why the FX 8350, even though it wasn't that great to begin with, fell behind quite quickly.

13 : Anonymous2021/10/05 19:44 ID: hfin8nb

Bad? I wouldn't call them bad; they were functional chips, just behind the competition. Depending on the price, that made them either bad value or good value. If you were on a small budget, you could find good budget value. If you had a midrange or high-end budget, they were not a very good option.

The $1000 model's price made it a bad deal from any angle. Again, strictly speaking not a bad chip; the price just made it a horrible value that no one should ever have bought.

Generally speaking there are no bad products, just bad prices. This is when talking about 2 functional products, all other things being more or less equal except price and performance tier. Most products fall here, like the 8000 series, it depends on the performance tier you wanted, and the price you were willing to pay.

There are truly bad products: ones that are dangerous and/or do not function correctly. The 8000 series certainly does NOT fall into what I would define as a bad product.

14 : Anonymous2021/10/04 16:29 ID: hfd8fnx

FX chips are budget mid-2010s chips that are outperformed by budget Intel chips at gaming. Mostly not worth your time now. Stay away, as far away as you can, unless you have no choice in 2021.

The FX chips were a disaster. Released in 2012 with the FX 8350, this mainstream chip would not be replaced till 2017 with the Ryzen architecture.

That was how long it took AMD to change directions and get a new product, designed from the ground up, out the door. Still, 1st gen Ryzen did not match Intel. Intel beat AMD until 2020 and the launch of Zen 3. Now AMD is on top.

If you're looking for budget-build CPUs, look for a used 7700K or 6700K on eBay.

In reality, any build right now should choose either a current i5 or an R7/R9 Zen 3 chip.

ID: hfdnu9g

All great advice, except you cannot find a 7700K for cheap, period. But I agree all the way. Going for an i3 10100 would basically get you a 7700K but brand new. If you have the chance to get an FX 8350 for free vs paying for a new chip, then I would probably go for it.

ID: hfdpojj

Still, 1st gen Ryzen did not match Intel. Intel beat AMD until 2020 and the launch of Zen 3. Now AMD is on top.

Depends what you're looking for of course.

I've noticed that Reddit, and many other sites, are quite focused on gaming. And yeah, in gaming, latency, and single core performance are king.

Still, even with Zen1, AMD was very competitive in multicore performance.

What I'm saying is that no, Intel didn't beat AMD until 2020. Intel still had the performance crown in gaming, but there's more to a processor than gaming performance. Overall, AMD was slower in gaming by a few percent, but beat the Intel offerings in multicore by quite a margin.

What makes a processor better than the other is whatever is the best for your use case. And gaming/DIY market is actually quite niche.

All in all, I think from Zen+ onwards AMD and Intel were pretty much on par, with a slight lead for Intel, and AMD beating Intel hands down from Zen 2 onwards. There's a reason Zen 2 sold so well, pricing taken into consideration as well.

And speaking of gaming, the "budget" (not quite budget, more like bottom-of-the-stack) AMD CPU beats the top Intel SKU in gaming and annihilates it in multicore performance.

Like I said in a previous reply, there are no bad products, just bad pricing. And Intel finally caught up, offering a competitor to the last-gen Ryzen 3600 at a lower price.

15 : Anonymous2021/10/04 14:48 ID: hfctvxr

Take a look at this recent video from Hardware Unboxed. It should answer your question (to some extent).

Basically, the original FX 8100 family was somewhat slower than previous AMD chips in most tasks. The FX 8300 was better, and somewhat okay at release time, but Intel progressed quite quickly in performance during those years, while AMD didn't release anything better for 5 years.

16 : Anonymous2021/10/04 18:01 ID: hfdn4u7

They worked great; they just have a bad rep from shills that crapped on them.

17 : Anonymous2021/10/04 14:33 ID: hfcrx7e

For the time they released, yes. But today an FX 6300 > i5 2500K, even though the latter gives more FPS. The gameplay is smoother due to not having spikes, as the i5 is pegged at 100%.

18 : Anonymous2021/10/04 18:35 ID: hfdsicj

They weren't nearly as bad as people made them out to be. The Bulldozer chips (8320) were definitely worse than the Piledriver chips (8350), but ultimately they offered i7 multicore performance at the price point of an i5, so they certainly had a market. Nowadays they often hold up better than their Intel counterparts from the same generation: while they may not have had 8 traditional cores, they still had extra cores, so they hold up a lot better in today's core-heavy games.

ID: hfe2foo

Bulldozer was the 81x0 chips. Piledriver was the 83x0... and the 9000 series that had the 220W TDP.

ID: hfe3ekx

You are totally right, don't know what I was going on about there. Too many late nights recently.

19 : Anonymous2021/10/04 16:20 ID: hfd6y70

I had a 6300 myself, and my friends had an 8370/9590. Even so, I had more FPS than my friends with an i7 3770 or whatever i5 PCs, meanwhile we had roughly the same GPUs (580/590/1060, and the higher-end FX users Vega/Navi).

FX was kinda meh when released, but the fine wine is working very well with the DX12/Vulkan APIs.

ID: hfd7120

Nice. Good to see the FX series showing their worth now.

