6600 XT is low-key the best mining GPU in history (a.k.a. MSRP never)

1 : Anonymous2021/08/11 10:39 ID: p2ai8r
6600 XT is low-key the best mining GPU in history (a.k.a. MSRP never)
2 : Anonymous2021/08/11 10:43 ID: h8ipphq

It only needs 55W?

ID: h8j1phz

Mining almost entirely depends on VRAM; the core is undervolted as far as possible for efficiency. That means smaller chips are actually better, which is why the 6800 is more profitable than the XT.

ID: h8j5547

For example, my 2060 KO draws 120-125W for 30-31 MH/s, while my 1660S/Ti only draw 70-80W for 31-32 MH/s.

ID: h8jahcw

It's not linear though. A balance of core IPC, memory bandwidth, memory frequency and even power delivery exists, and that's why the 8GB 3060Ti is so much better than the 12GB 3060 and even better RoI than the 8GB higher bandwidth 3070.

It's also why the RVII has been the champion for a while now (it probably had the best combination of all of those, and its ~1 TB/s of HBM2 memory bandwidth is especially favorable for daggerhashimoto).

ID: h8iywqi

He probably underclocked and undervolted it to 55W and still got that hash rate. A lot of the other power ratings are also way lower than official TBP.

ID: h8izuee

The numbers are off for a variety of cards. A 1660 Ti will do 31 MH/s at 77W, a 6800 XT pulls better PPW than an RX 580 in this chart, etc.

ID: h8j8iuc

this 100%.

Even a 3070 FE will happily pull 62 MH/s at 119W (O_o)...

ID: h8irmmh

in his dreams

ID: h8jcs22

It draws 55 watts when mining, not a joke.

ID: h8jtjg6

When mining, my 3080 only pulls 230-250W, which is far lower than when gaming.

3 : Anonymous2021/08/11 10:41 ID: h8ipl19

140W for the 6700 XT is a bit much; I know someone who got his 6700 XT down to around 72W while still getting 47 MH/s. That puts it firmly in the number one spot at a PPW of 0.6528.
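
The PPW figure quoted in these comments is just hash rate divided by board power. A minimal sketch (the 47 MH/s at 72W numbers are the commenter's claims; the 140W figure is from the chart):

```python
def ppw(mhs: float, watts: float) -> float:
    """Performance per watt: MH/s divided by board power in watts."""
    return mhs / watts

print(round(ppw(47, 72), 4))   # tuned 6700 XT from the comment -> 0.6528
print(round(ppw(47, 140), 4))  # the chart's 140W figure -> 0.3357
```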

ID: h8jh4h2

There's also no Vega on that chart. 50 MH/s @ 125W isn't too hard to get, which puts it upper middle; if you go by the best results people have gotten (~53 @ 116W), it's in 3rd place.

ID: h8ivt57

Yup, I mine when I'm not gaming and my 6700 XT is 47 MH/s at 89W, though I'm not sure how you'd get to 72W without some BIOS modding.

ID: h8ixjxb

MorePowerTool, not sure how they got to 72W but 80W is easy enough

4 : Anonymous2021/08/11 13:25 ID: h8j56p7

Not sure this is an accurate statement considering ETH mining is in its twilight. Right now efficiency is not as important as total hash rate. You want to be mining as much as you can, as fast as you can, regardless of efficiency, before it gets cut off.

ID: h8j6z5p

This. PPW does not reflect profitability at all. The hashrate difference (and hence revenue difference) between the 3060 Ti and 6600 XT massively dwarfs the difference in electricity cost between the two cards. Assuming all cards are available at MSRP, no miner is going to want to pay $379 for a 32 MH/s card when adding $20 gives you 58 MH/s instead.
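
A rough sketch of that comparison, assuming illustrative values of $0.08/day revenue per MH/s, $0.10/kWh electricity, and ~130W for a tuned 3060 Ti (none of these specific numbers come from the thread):

```python
def daily_profit(mhs, watts, usd_per_mh_day=0.08, usd_per_kwh=0.10):
    """Daily revenue minus 24 hours of electricity, all in USD (assumed rates)."""
    revenue = mhs * usd_per_mh_day
    electricity = watts / 1000 * 24 * usd_per_kwh
    return revenue - electricity

print(round(daily_profit(32, 55), 2))   # 6600 XT: 32 MH/s at 55W
print(round(daily_profit(58, 130), 2))  # 3060 Ti: 58 MH/s at ~130W (assumed)
```

Even with the 6600 XT's far better efficiency, the 3060 Ti's extra revenue is several times larger than its extra power cost, which is the commenter's point.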

ID: h8jf0pg

But there is no MSRP at the moment. The entire market, for the most part, reflects mining performance.

ID: h8jggo0

Try getting a 3060 Ti at anywhere near the price you can get a 6600 XT.

ID: h8kaad7

While I agree ETH mining is coming to an end, the sheer demand for a PoW coin to mine will drive up the value of another PoW coin to take ETH's place in that niche.

5 : Anonymous2021/08/11 12:26 ID: h8iyp6l

RTX 3060 doing 37 MH/s since when?

Let's be real here: RDNA2 does not scale as well for mining as past AMD graphics architectures; Ampere, on the other hand, is very good at mining.

ID: h8j6umw

Not RDNA2, but I've got my RX 5700 XT doing ~50 MH/s at 90W peak. Haven't tried lower yet.

6 : Anonymous2021/08/11 11:45 ID: h8iuqry

Some 1060 9Gbps cards with timing mods could do 28.5 MH/s at 67-68W at the wall. 1660 Ti and most 1660 Super cards are usually 29-32 MH/s at 65-73W. 5600 XT: 39 MH/s at 92-93W. The best of the best 580s, at 775mV (vdroop to 750mV), are 98-105W. All measured from the wall.

These numbers in the list are pretty conservative, if not inaccurate...

7 : Anonymous2021/08/11 13:39 ID: h8j6vxe

Look in the comments, all the hidden miners woke up.

Edit: first all the gamers cried and shit-talked "fuck scalpers & miners", and now everyone is mining too. You have all become one of them and everything is excusable now lol.

Freaking dishonest people and hypocrites everywhere. Disgusting.



ID: h8jh6n6

IMO people mining with their GPUs idle is not a problem and I don't mind it at all; it's people mass-buying GPUs that are the problem.

ID: h8ji1yq

imo people mining with their gpus idle is not a problem and i dont mind it at all, it's people mass buying gpus that are the problem

I agree. I wouldn't consider myself a miner; I just bought a GPU I feel is overpriced and decided to get some money back with mining.
edit: some grammar

ID: h8k4hxq

Except for the environment. But fuck that right?

ID: h8jwunj

Normal people mining in their idle time makes it harder for everyone else to mine coins, making mining less profitable. It should be encouraged.

ID: h8kqsha

Yup. So many people here bitch and moan about miners, but they're mining on the side with their gaming GPU to try to offset the scalper prices they paid for them.

The gamers with zero self control are the problem - they enable the scalpers and the ridiculous OEMs.

Dedicated miners are a drop in the bucket of the overall market, and they only prefer a select few models.

The chart in the OP is laughably wrong, so much so that my guess is they posted that horrible data to trigger a flood of posts from people saying "No way! My X card gets Y MH/s at Z Watts!".

8 : Anonymous2021/08/11 13:06 ID: h8j2ynh

This table is not that realistic. I get a stable 31.8 MH/s on my 1660S at 70W, which makes the PPW 0.45 and would put it at the top of the table.

9 : Anonymous2021/08/11 12:54 ID: h8j1md7

I honestly thought the tiny bus width was partly to make this less attractive to miners; I'm perplexed it mines ETH so well.

ID: h8k45f4

Bus bandwidth (CPU to GPU, right?): I don't think it matters at all.

I have a 5700, and since my CPU doesn't have that much bandwidth, the GPU is running at x8 PCIe 3.0.

At 100W or even 95W I can get 50 MH/s.

ID: h8l0xon

I think bus bandwidth here refers to the VRAM. That's 256 GB/s on the 6600 XT.

10 : Anonymous2021/08/11 11:10 ID: h8irr1b

Let me guess: using HWiNFO before they do an update for that card lol

11 : Anonymous2021/08/11 13:50 ID: h8j87lx

I knew it, it was fishy.

12 : Anonymous2021/08/11 15:29 ID: h8jlrsm

Cool. (in my best John Oliver voice)

You go buy all the 6600xt's and leave the good cards for the rest of us.

What algo do you plan to be mining this time next year? Go re-run the numbers for kawpow and autolykos2.

13 : Anonymous2021/08/11 10:59 ID: h8iqwrd

Mining was dead they said

ID: h8j1pfi

I bought my 3060 to play games, but I’m sure as hell going to mine as much as I can between sessions until ETH can’t be mined any longer.

ID: h8kb6pd

You will probably make more than you paid for the thing, the way crypto is going at the mo.

ID: h8isa3g

Haha yeah, with ETH at $3200 it's far from dead, and as GPU efficiency improves and crypto becomes more popular I can hardly see it dying.

ID: h8itb37

Didn't Ethereum 2.0 disable GPU-based mining or has it not been implemented yet?

14 : Anonymous2021/08/11 16:13 ID: h8js4z1

Good. Miners can buy the overpriced card that should have plenty of availability, then leave the other models alone for gamers.

15 : Anonymous2021/08/11 12:17 ID: h8ixtlo

fucking miners

ID: h8j1p86

Fuck them. Unless they do it for some extra bucks on a pc they use for other stuff.

16 : Anonymous2021/08/11 12:50 ID: h8j1a6c

My RTX 3060 does 41 MH/s stock tho, seems a bit low in this chart?

ID: h8jax5a

My rev 1 evga 3060 does 49mh/s with the dummy plug

17 : Anonymous2021/08/11 13:21 ID: h8j4pkn

The thing is, if you mine, the card will pay the difference.

I want a $400 3060 Ti. If I can find one for $800 (not from eBay) and use it to mine, it will take ~100 days of mining to make up the difference.
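
The break-even math above is simple markup divided by earnings; a sketch, assuming ~$4/day of net mining income (a hypothetical figure chosen to match the ~100-day estimate, not a number from the thread):

```python
def breakeven_days(price_paid, msrp, usd_per_day):
    """Days of mining income needed to cover the markup over MSRP."""
    return (price_paid - msrp) / usd_per_day

print(breakeven_days(800, 400, 4.0))  # -> 100.0
```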

18 : Anonymous2021/08/11 10:47 ID: h8iq1fx

55W is the reading in software. AMD cards are notorious for having inaccurate software readings, typically by 30W. This card is probably closer to 90W at the wall. Nvidia is typically very close between software and the wall, with any discrepancy being attributable to your power supply's efficiency rating.

This chart is also just wrong in numerous places - for example, the 5700 XT typically uses 100W in software, but shows 130W here, as if they added in the 30W software inaccuracy.

ID: h8ir582

Usually the software reading is for the GPU core only. It doesn't take into account VRAM etc., which easily adds ~40W.

ID: h8iz485

AMD drivers report the power draw of all of the components on the card; however, the reported power is the "Power Out" figure (from the card's VRMs) and not the "Power In", which would be measured from the PSU. In practice this means the VRM losses are omitted from the reported figures.

Because of that, for example the 6800 XT and 6900 XT that officially are 300W cards only have the default power limit set to 255W. Same applies to other Navi 2x cards as well.

ID: h8iwxnr

55W is the reading in software. AMD cards are notorious for having inaccurate software readings, typically by 30W.

Not inaccurate. They are accurate for the GPU core only; they don't take into account the VRAM and VRM losses.

AMD is bang on correct with their TBP.
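
The software-versus-wall arithmetic debated in this comment chain can be sketched as below; the ~30W VRAM/VRM overhead and 90% PSU efficiency are assumptions drawn loosely from the comments, not measurements:

```python
def wall_watts(software_watts, overhead_watts=30, psu_efficiency=0.90):
    """Estimate at-the-wall draw from a software power reading (assumed overhead/efficiency)."""
    return (software_watts + overhead_watts) / psu_efficiency

print(round(wall_watts(55)))  # the 6600 XT's 55W reading -> roughly 94W at the wall
```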

