-
If so what was wrong with them?
-
They functioned at the time. I sure remember once upon a time when I had an Xbox One X and my friend had the "8 core" FX chip with a 1060 6GB... I sure wished I had his PC over my console lol.
I only recently upgraded my 8320; it needed to be overclocked to at least 4GHz to avoid stuttering issues in games like Battlefield 3 and 4.
The 8350 ran at 4GHz base; the 8320 was only 3.5GHz base, but it overclocked well.
That's awesome!
-
Were the AMD FX 8000 series of CPUs bad?
If so what was wrong with them?
The FX 8000 series is not bad by itself. Its IPC is only a little lower than K10's, and it was the first mainstream CPU with 8 cores.
But when AMD launched the FX 8000 series, Intel had launched chips using the Nehalem and then Sandy Bridge microarchitectures, which have significantly higher IPC (cf. https://www.reddit.com/comments/5v11tm/ipc_performance_of_intel_and_amd_cpus_2004_to/). And single-core performance is crucial for games. And in 2011-2014 no game used more than 4 cores.
Only 8 cores in terms of how many integer units it had. Talking FPU, it only had 4. Same goes for a lot of the supporting hardware.
And the L2 cache was shared too. Along with the entire front end.
Yeah, poor AMD. But now Ryzen is king and games are using more than 4 cores.
Yeah but it's a cautionary tale of how hitting an unexpected cul-de-sac of performance gains on an architecture can set you back for years. AMD almost went bankrupt. Ryzen is king now because Intel have been stuck on 14nm for ages and that has compromised the efficiency of their entire product stack.
Things can very easily swing around again.
Even in modern games fx 8000 would still get trounced by a 3770K.
Bulldozer had terrible draw call performance. Piledriver had below-Core 2 performance at processing them.
That seems like a software issue, as Oxide Games demonstrated that a Piledriver downclocked to 2 GHz could process far more draw calls via Mantle than the GPUs of the time could handle.
Apples to apples, yeah.
Apples to oranges, an FX CPU cost much less than an Intel Core, could socket into a much cheaper motherboard, and could take the cheapest DDR3 memory you could find. Put all that together and the AMD system cost much less than the Intel system, money savings you could throw at your video card instead.
It was never FX vs. i5. It was FX + GTX N80 vs. i5 and GTX N60.
And I really don't regret buying an i5 2500k with my 7850 back in the day, instead of an FX processor with a 7950.
The 2500k lasted me 2 more GPU upgrades, would the FX have done the same? (answer is a big fat no, tbh)
That's the thing. An item can be good or bad depending on what the competition is offering and the price.
Vs the Intel offerings it was pretty bad, but at a very low price it was OK-ish.
Nowadays, it's comparable to the 11400KF situation. The 11400KF is not a very good processor by any means, but its performance is about on par with the AMD 3600, AMD's last-gen budget processor, while being quite a bit cheaper.
As the saying goes, there are no bad products, just bad prices.
EDIT: Sigh, will people stop saying the 11400 is a good CPU. I was using it as an example of how pricing can make a bad product good.
The 11400 is only good due to its price. It's a current gen product that barely beats a previous gen budget CPU, but is a damn good deal due to its low price.
An 11400 vs a 5600 at similar prices would be a shitty deal. Again, the price is the deciding factor; hence, there are no bad products, only bad prices.
The 11400 is way way way closer to the competition than the FX 8000 series were
How is the 11400KF a bad processor by any means? Yeah, it's not top of the line and it's not aiming to be. It gives you 95% of the 5600X for half the price. Now unless that 5% is worth paying almost double for, that's on you.
Edit: so every single pre-Zen 3 CPU was a bad gaming CPU, only made OK by price :/
Remember, the 3700X (mid 2019) still lost to the 8700K (late 2017) in gaming :/ Not even "barely beating".
It straight up LOST.
There's a reason the 11400 is super popular and in high demand: it's so cheap for the performance it gives.
The 11400F easily beats the 3600 in any game, and is miles better relative to the competition than any construction equipment CPU was.
Nowadays, it's comparable to the 11400KF situation
Not even remotely close.
The 11400 is faster than the 3600, offers comparable gaming performance to the 5600x and more importantly is cheaper than both of them (at least where I live).
Whereas the FX CPUs simply weren't competitive on any front.
Sigh, will people stop saying the 11400 is a good CPU. I was using it as an example of how pricing can make a bad product good.
But it isn't a bad product?
The many affordable cores made the 8000 series fantastic chips for office work and general use. This gave them a very long useful lifetime as more things became multithreaded. The biggest downside was that they ran hot and ate power. Also, a lot of true believers overhyped these chips, and the people fooled by them were in for a rude surprise in the latest games.
In demanding games they were completely outclassed by Intel's stuff, due to single-core performance. Intel's chips were much more efficient as well. However, they were double or triple the price; 2021 prices for everything can make that difference now seem quaint, but back then even $20 meant a lot and divided people into different groups.
I personally ran the Athlon X4 860K for many years, until Ryzen came out. The 4 cores simply wouldn't bog down unless I was doing ridiculous multitasking and the memory leaks in Firefox/Chrome reached critical mass. The only games I like are all old as hell, so that's never been a concern.
I had an FX 8350 in my home server. It was amazing and I would have kept using it had the cooler not failed and melted on the chip. The replacement, a Ryzen 2200G, was actually slower for many things it was doing. I had to upgrade it to a 2700X to get past the FX performance.
If so what was wrong with them?
A lot of things were wrong with them. I wrote about this on / a week ago, but here's an excerpt.
Here's what in general made Bulldozer bad for gaming, at a low level:
Decode
The front-end could decode only 4 instructions, shared per (2-core) module, down from 3 per core in K10. This got fixed in Steamroller, with each logical core getting its own dedicated 4-wide decoder. The front-end also didn't have any sort of macro-op (not micro-op) fusion, which lets some x86 instructions get combined before decoding, effectively widening the front-end.
Branch
The longer Bulldozer pipeline made the branch misprediction penalty higher (20 clocks!), higher than K10 (12), Core 2 (15), Nehalem (17) or Sandy Bridge (14-17). The cost of a longer pipeline shows up here. AMD's branch predictor was better than any of their previous designs, but still significantly worse than Intel's (though Intel had had better branch prediction for a fairly long time, even when AMD was competitive). Bulldozer also didn't have any sort of μop cache like Sandy Bridge did (though one was added in Steamroller), which exacerbated the branch misprediction penalty.
Execution
The ALU was only 2-wide (vs 3-wide in K10), and the single 2-wide FPU was shared between two logical cores. Thus, per-core floating point execution resources were low.
Cache
High-latency L2 and L3 caches, more than double that of Sandy Bridge CPUs. Gaming is particularly sensitive to cache latency, just like most games are more sensitive to RAM latency than bandwidth. Low L1 cache associativity too: a 2-way associative L1 instruction cache vs 8-way for Intel's CPUs. This was likely another die space tradeoff; the 2-way cache takes up less space, but it makes it more likely that one thread's instructions boot out the cached instructions of another thread, and cache misses are really painful. (This was improved to 3-way in Steamroller.)
So, in the end you get an architecture which is poor for gaming because of:
- Low per-core execution resources, so the architecture is less able to extract instruction-level parallelism
- Very high cache latencies compared to K10 and Sandy Bridge
- High branch misprediction penalties
- A 15-20% per-thread performance penalty in integer workloads for executing two threads on the same module, due to the various shared cache/prefetch/decode hardware (see the sketch below)
- A 10-20% overall decrease in floating point performance when executing two threads on the same module, due to shared hardware and scheduler conflicts
Bulldozer was trash and almost bankrupted AMD.
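If you want to actually measure that module-sharing penalty, here's a minimal sketch (assuming Linux with GNU pthreads, and that the kernel enumerates the two cores of an FX module as adjacent logical CPUs 0 and 1; the workload and iteration count are invented for illustration):

    /* Pin two integer-heavy threads to chosen logical CPUs and time them.
     * Compile: gcc -O1 -pthread module_test.c */
    #define _GNU_SOURCE
    #include <pthread.h>
    #include <sched.h>
    #include <stdio.h>
    #include <time.h>

    static void *int_work(void *arg) {
        (void)arg;
        volatile unsigned long acc = 1;                /* volatile keeps the loop alive */
        for (unsigned long i = 0; i < 500000000UL; i++)
            acc = acc * 2654435761UL + i;              /* integer multiply/add chain */
        return NULL;
    }

    static pthread_t spawn_on_cpu(int cpu) {
        pthread_t t;
        pthread_attr_t attr;
        cpu_set_t set;
        CPU_ZERO(&set);
        CPU_SET(cpu, &set);
        pthread_attr_init(&attr);
        pthread_attr_setaffinity_np(&attr, sizeof(set), &set);
        pthread_create(&t, &attr, int_work, NULL);
        return t;
    }

    int main(void) {
        struct timespec t0, t1;
        clock_gettime(CLOCK_MONOTONIC, &t0);
        /* 0 and 1 share a module on FX; change the pair to 0 and 2 to compare. */
        pthread_t a = spawn_on_cpu(0), b = spawn_on_cpu(1);
        pthread_join(a, NULL);
        pthread_join(b, NULL);
        clock_gettime(CLOCK_MONOTONIC, &t1);
        printf("elapsed: %.2fs\n", (t1.tv_sec - t0.tv_sec) +
                                   (t1.tv_nsec - t0.tv_nsec) / 1e9);
        return 0;
    }

Run it pinned to 0/1, then to 0/2 (separate modules), and compare the elapsed times; the gap is the shared front-end/cache cost.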
All of this made it especially disappointing compared to the Phenom II.
I see man. Thanks for such a detailed answer. I have some questions though. What's DECODE, BRANCH, EXECUTION and CACHE?
DECODE
After an instruction is fetched from memory, the CPU needs to figure out what it's supposed to do with the instruction. The decoder is the part of the CPU that's responsible for that task. The more instructions you can decode during a clock cycle the better, but if decode isn't the bottleneck in the whole instruction cycle, improving decoder efficiency isn't that crucial. AMD chose to share the decoder between two cores, which often led to slowdowns, especially when both cores were heavily loaded.
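If it helps to see the idea in code, here's a toy fetch/decode/execute loop for a made-up one-byte instruction set (everything here is invented for illustration; a real x86 decoder handles variable-length instructions, several per clock):

    #include <stdio.h>

    /* Toy ISA: each instruction is one byte. "Decode" maps the raw byte
     * to an operation; "execute" (the switch body) then performs it. */
    enum op { OP_HALT = 0, OP_INC = 1, OP_DEC = 2, OP_DOUBLE = 3 };

    int main(void) {
        unsigned char program[] = { OP_INC, OP_INC, OP_DOUBLE, OP_DEC, OP_HALT };
        int a = 0;                                   /* a single register   */
        for (int pc = 0; ; pc++) {                   /* fetch the next byte */
            enum op decoded = (enum op)program[pc];  /* decode              */
            switch (decoded) {                       /* execute             */
            case OP_INC:    a += 1; break;
            case OP_DEC:    a -= 1; break;
            case OP_DOUBLE: a *= 2; break;
            case OP_HALT:   printf("a = %d\n", a); return 0;
            }
        }
    }

Bulldozer's quirk was that one such decoder (4-wide, in its case) was shared by two cores, so when both were busy, each effectively got half a decoder.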
BRANCH (PREDICTION)
Branch prediction is a thing where the CPU tries to guess whether a conditional jump is taken before it's definitely known. This is done because waiting for the conditional jump to be confirmed wastes CPU cycles. So instead of waiting and doing no work at all, we try to get a head start. If we're right, we get a speedup and get the job done earlier; if we're wrong, we have to empty the pipeline and start the execution all over again.
Bulldozer had a worse branch predictor than Sandy Bridge, so a Bulldozer chip had to start over more often than the Sandy Bridge chip. And since the pipeline was longer on Bulldozer, each branch mispredict led to a longer delay before the CPU started spitting out results.
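You can feel this effect on any machine with the classic sorted-vs-unsorted experiment; here's a rough sketch (numbers are illustrative, and compile around -O1, since at higher optimization levels the compiler may turn the branch into a branchless conditional move and hide the effect):

    #include <stdio.h>
    #include <stdlib.h>
    #include <time.h>

    /* Sum only the values >= 128. Sorted data makes the branch predictable
     * (all "no", then all "yes"); random data makes it a coin flip, and
     * every mispredict flushes the pipeline (~20 clocks on Bulldozer). */
    #define N 10000000

    static double time_sum(const int *data) {
        struct timespec t0, t1;
        long sum = 0;
        clock_gettime(CLOCK_MONOTONIC, &t0);
        for (int i = 0; i < N; i++)
            if (data[i] >= 128)          /* the branch being predicted */
                sum += data[i];
        clock_gettime(CLOCK_MONOTONIC, &t1);
        printf("sum=%ld ", sum);         /* keeps the loop from being removed */
        return (t1.tv_sec - t0.tv_sec) + (t1.tv_nsec - t0.tv_nsec) / 1e9;
    }

    static int cmp(const void *a, const void *b) {
        return *(const int *)a - *(const int *)b;
    }

    int main(void) {
        int *data = malloc(N * sizeof *data);
        for (int i = 0; i < N; i++) data[i] = rand() % 256;
        printf("random: %.3fs\n", time_sum(data));
        qsort(data, N, sizeof *data, cmp);
        printf("sorted: %.3fs\n", time_sum(data));
        free(data);
        return 0;
    }

With random data the predictor is wrong about half the time; with sorted data it's nearly always right, and the identical loop runs several times faster.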
EXECUTION
This is the part in the instruction cycle where we perform mathematical or logical functions on values. So, if we've fetched and decoded an instruction that says "ADD 2 to the register called A1", this is where we calculate A1 + 2.
I mentioned that the ALU was 2-wide in Bulldozer; this basically means that one core can perform two ALU operations at the same time. Their previous K10 could perform three; AMD said they removed the third ALU since it often went unused. This however meant that an ALU-heavy and heavily threaded workload like rendering often saw performance regressions with Bulldozer.
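Here's a rough sketch of why that width matters (invented workload; exact ratios depend entirely on the CPU and compiler flags, so treat it as a demonstration, not a benchmark):

    #include <stdio.h>
    #include <time.h>

    /* The same number of multiply-adds, two ways. In the first loop every
     * step depends on the previous result, so even a wide core runs it one
     * step at a time. The second keeps four independent chains that the
     * scheduler can overlap -- if the core has the ALU width for it. */
    #define ITERS 400000000UL

    static double secs(struct timespec a, struct timespec b) {
        return (b.tv_sec - a.tv_sec) + (b.tv_nsec - a.tv_nsec) / 1e9;
    }

    int main(void) {
        struct timespec t0, t1;

        unsigned long a = 1;
        clock_gettime(CLOCK_MONOTONIC, &t0);
        for (unsigned long i = 0; i < ITERS; i++)
            a = a * 3 + i;                       /* one dependent chain */
        clock_gettime(CLOCK_MONOTONIC, &t1);
        printf("dependent:   %.2fs (a=%lu)\n", secs(t0, t1), a);

        unsigned long b0 = 1, b1 = 1, b2 = 1, b3 = 1;
        clock_gettime(CLOCK_MONOTONIC, &t0);
        for (unsigned long i = 0; i < ITERS; i += 4) {
            b0 = b0 * 3 + i;                     /* four independent chains */
            b1 = b1 * 3 + i;
            b2 = b2 * 3 + i;
            b3 = b3 * 3 + i;
        }
        clock_gettime(CLOCK_MONOTONIC, &t1);
        printf("independent: %.2fs (b=%lu)\n", secs(t0, t1), b0 + b1 + b2 + b3);
        return 0;
    }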
CACHE
Caches are a type of memory that's used in CPUs. Caches are smaller in size compared to your main system memory (for example, a typical FX-8150 system would have something like 8GB or 16GB of DDR3 memory, but only 8MB of L3 cache). However, the cache is much faster.
Just like you keep the stuff you need most often on your desk, a CPU keeps the stuff it needs most often in the various levels of cache it has. But if we need something that can't fit inside the cache (or on top of the desk), we need to go to main memory (or to our storage closet).
If we have caches that are slow, it means it takes longer for us to fetch stuff from the cache, meaning that we have to wait longer before we can start working. Bulldozer suffered badly from these slow caches. Memory access was also slower on Bulldozer compared to Sandy Bridge. Sandy Bridge users had their storage closet closer to their desks in other words, and they had an easier time picking up stuff from their desks.
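If you want to see those latency cliffs yourself, the standard trick is pointer chasing; a sketch like this (sizes and step counts are arbitrary; the Sattolo shuffle guarantees the chase is one big cycle that visits every slot):

    #include <stdio.h>
    #include <stdlib.h>
    #include <time.h>

    /* Chase a shuffled cycle of indices. Each load depends on the previous
     * one, so every step pays the full latency of whichever memory level
     * the array currently fits in: L1/L2 for small sizes, L3 or RAM for
     * big ones. */
    static void chase(size_t n, size_t steps) {
        size_t *next = malloc(n * sizeof *next);
        for (size_t i = 0; i < n; i++) next[i] = i;
        for (size_t i = n - 1; i > 0; i--) {        /* Sattolo: one cycle */
            size_t j = (size_t)rand() % i;
            size_t tmp = next[i]; next[i] = next[j]; next[j] = tmp;
        }
        struct timespec t0, t1;
        size_t p = 0;
        clock_gettime(CLOCK_MONOTONIC, &t0);
        for (size_t s = 0; s < steps; s++)
            p = next[p];                            /* dependent load chain */
        clock_gettime(CLOCK_MONOTONIC, &t1);
        double ns = ((t1.tv_sec - t0.tv_sec) * 1e9 +
                     (t1.tv_nsec - t0.tv_nsec)) / steps;
        printf("%8zu KiB: %5.1f ns/load (p=%zu)\n",
               n * sizeof *next / 1024, ns, p);
        free(next);
    }

    int main(void) {
        for (size_t n = 4096; n <= 16 * 1024 * 1024; n *= 8)
            chase(n, 20000000);                     /* 32 KiB .. 128 MiB */
        return 0;
    }

Once the working set outgrows a cache level, every hop pays the next level's full latency, and the output shows exactly where each cliff is.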
I had both and the 8350 held its own at a lower price.
At the time I had an i7 3770K, an i5 4460k, an FX 8350 and an FX 8310, and the AMD systems did fine. My wife preferred the AMD systems as they were snappier in Win 7.
There are no bad products, just bad prices
No. AMD bet on multicore and high integer loads; things took a bit too long to actually become that way, so the FX series was met with mostly negative criticism at the time of release.
I had my 8320 for 7-ish years before needing to upgrade. I had it paired with a 4GB 390X. I had no issues playing games or doing anything else. My PC was much faster than my dad's way more expensive Intel-powered HP, which was likely because his PC came loaded with so much bloat. I think the only place my PC didn't shine was converting large video files.
They weren't THAT bad performance-wise, but they had insanely bad efficiency (220W TDP only for them to be weaker, in an era where single-threaded performance was important).
So if they, like, came out today, would they be good?
At the same clock speed, they're 52% slower than first gen Ryzen
Unless they cost around a few bucks, nope. Even laptop CPUs would be better.
I had the FX 9590 Black Edition, so I can't comment on the 8000 series, but I had that computer for 8 years; my GPU and hard drive space needed upgrading before the processor did. I formatted it and gave the computer to my buddy, who was using a laptop that was midrange in 2012, for music composition. So he got a major upgrade for free.
Yes.
User POV: power hungry and quite slow.
AMD POV: sold for little, expensive to produce (the die was quite large at 315mm²).
The architecture was an oddity (8 ALUs, 4 FPUs) and was slower than its predecessor in most if not all single-thread tests.
Actually, the Hardware Unboxed video I wanted to post was this, the latest one on the channel. It covers Intel's advancement alongside AMD's. It explains better why the FX 8350, even though it wasn't that great to begin with, fell behind quite quickly.
Bad? I wouldn't call it bad, they were functional chips, behind the competition. Depending on the price, that either made them bad value, or good value. If you were on a small budget, you could find good budget value. If you had midrange or high end budget, then they were not a very good option.
The $1000 model? The price made that one a bad deal from any angle. Again, strictly speaking it was not a bad chip; the price just made it a horrible value that no one should have ever bought.
Generally speaking there are no bad products, just bad prices. This is when talking about 2 functional products, all other things being more or less equal except price and performance tier. Most products fall here, like the 8000 series, it depends on the performance tier you wanted, and the price you were willing to pay.
There are truly bad products, ones that are either dangerous and/or do not function correctly. The 8000 series certainly does NOT fall into what I would define as a bad product.
FX chips are budget mid-2010s chips that are outperformed by budget Intel chips at gaming. Mostly not worth your time back then. And stay away, as far away as you can, unless you have no choice in 2021.
The FX chips were a disaster. Releasing in 2012 with the FX 8350, this mainstream chip would not be replaced till 2017 with the Ryzen architecture.
That was how long it took AMD to change directions and get a new product, designed from the ground up, out the door. Still, 1st gen Ryzen did not match Intel. Intel beat AMD until 2020 and the launch of Zen 3. Now AMD is on top.
If you're looking for budget-build CPUs, look for a used 7700K or 6700K chip for less on eBay.
In reality, any build right now should choose either a current i5 or an R7/R9 Zen 3 chip.
All great advice, except you cannot find a 7700K for cheap, period. But I agree all the way. Going for an i3 10100 would basically be a 7700K but brand new. If you have the chance to get an FX 8350 for free vs paying for a new chip, then I would probably go for it.
Still, 1st gen Ryzen did not match Intel. Intel beat AMD until 2020 and the launch of Zen 3. Now AMD is on top.
Depends what you're looking for of course.
I've noticed that Reddit, and many other sites, are quite focused on gaming. And yeah, in gaming, latency, and single core performance are king.
Still, even with Zen1, AMD was very competitive in multicore performance.
What I'm saying is that no, Intel didn't beat AMD until 2020. Intel still had the performance crown in gaming, but there's more to a processor than gaming performance. Overall, AMD was slower in gaming by a few percent but beat the Intel offerings in multicore by quite the margin.
What makes a processor better than the other is whatever is the best for your use case. And gaming/DIY market is actually quite niche.
All in all, I think from Zen+ onwards AMD and Intel were pretty much on par, with a slight lead for Intel, and AMD beat Intel hands down from Zen 2 onwards. There's a reason Zen 2 sold so well, with pricing taken into consideration as well.
And speaking of gaming, the "budget" (not quite budget, more like bottom of the stack) AMD CPU beats the top Intel SKU in gaming and annihilates it in multicore performance.
Like I said in a previous reply, there are no bad products, just bad pricing. And Intel finally caught up, offering a competitor to the last-gen Zen 2 Ryzen 3600 at a lower price.
Take a look at this recent video from Hardware Unboxed. It should answer your question (to some extent).
Basically, the original FX 8100 family was somewhat slower than previous AMD chips in most tasks, and while the FX 8300 was better, and somewhat okay at release time, Intel progressed quite quickly in performance at that time, while AMD didn't release anything better for 5 years.
They worked great; they just have a bad rep from shills that crapped on them.
For the time they released, yes. But today an FX 6300 > i5 2500K, even though the latter gives more FPS. The gameplay is smoother due to not having spikes, as the i5 is pegged at 100%.
They weren't nearly as bad as people made them out to be. The Bulldozer chips (8320) were definitely worse than the Piledriver chips (8350), but ultimately they offered i7 multicore performance at the price point of an i5, so they certainly had a market. Nowadays they often hold up better than their Intel counterparts from the same generation: while they may not have had 8 traditional cores, they still had extra cores, so they hold up a lot better in today's core-heavy games.
Bulldozer was the 81x0 chips. Piledriver was the 83x0... and the 9000 series that had the 220W TDP.
You are totally right, I don't know what I was going on about there. Too many late nights recently.
I had a 6300 myself, and my friends had an 8370/9590. I even had more FPS than my friends with their i7 3770 or whatever i5 PCs, while we had roughly the same GPUs (580/590/1060, and the higher-end FX users had Vega/Navi).
FX was kinda meh when released, but the fine wine is working very well with the DX12/Vulkan APIs.
Nice. Good to see that fx series are showing their worth now
Source: https://www.reddit.com/r/Amd/comments/q16gwk/were_the_amd_fx_8000_series_of_cpus_bad/
I still use an overclocked FX 8350 paired with an RX 570 in a rig I built for my son to use for gaming. It runs Titanfall 2, Raft, Fortnite and many other games quite well at medium to high settings at 60 fps.