Bitcoin Forum

Other => Beginners & Help => Topic started by: bitbeast on June 11, 2013, 05:46:21 AM



Title: Mining Performance Evaluation
Post by: bitbeast on June 11, 2013, 05:46:21 AM

Well, guys and gals - as the era of GPU mining seems to be close to its logical end and the first ASIC miners make their way to the first happy customers, I've decided to publish a small piece of research about the factors that really affect modern GPU mining performance.

I had an idea that on a class of tasks well suited to parallelization, like the SHA256-based bitcoin mining algorithm, only two GPU parameters should affect the total GPU chip speed. I made an assumption that these parameters are:

- the number of GPU-specific processors, the so-called 'shaders' (shader units);
- the clock frequency at which those shader units can run.

So, it is possible to evaluate the total computing power P of any GPU by a synthetic number - the number of shader units N multiplied by the clock frequency of the GPU at 100 % load, F:

P = N x F

- For example, the AMD Radeon HD 6670 has 480 shaders working at an 800 MHz (0.8 gigahertz) clock frequency in full-speed mode, i.e. at 100 % GPU load.
So the computing power of the Radeon HD 6670 may be evaluated by the formula (480 shaders x 0.8 GHz) = 384 'theoretical' units of computing power.

- In comparison, the AMD Radeon HD 7970, one of the most powerful GPUs on the market today, has 2048 shaders working at a 925 MHz (0.925 gigahertz) clock frequency at full GPU load.
The power of the Radeon HD 7970 may be evaluated as (2048 shaders x 0.925 GHz) = 1894 units of computing power, so the AMD Radeon HD 7970 looks to be about 5 times more powerful than the AMD Radeon HD 6670.
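For readers who want to check these numbers themselves, the calculation above can be sketched in a few lines of Python (the card specs are just the ones quoted above; the function name is mine):

```python
# Theoretical computing power P = N (shader units) x F (clock in GHz),
# as defined above.
def theoretical_power(shaders, clock_ghz):
    return shaders * clock_ghz

hd6670 = theoretical_power(480, 0.8)     # 384 units
hd7970 = theoretical_power(2048, 0.925)  # ~1894 units

# The ratio comes out at roughly 5x, as stated above.
print(hd6670, hd7970, hd7970 / hd6670)
```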

The next question is - how can these 'theoretical units of power' be compared with real GPU productivity on a task such as bitcoin mining?

The answer is simple - draw a 2D plot where the vertical axis represents the real computing power of the GPU on the bitcoin mining task, in megahashes per second (M, MH/s), while the horizontal axis represents the theoretical computing power of the GPU, calculated by the method above.

So, for every modern GPU we can get a simple pair of numbers and use them as 2D-plot coordinates - theoretical power P and real power M.

Here is the final table:

http://s21.postimg.org/ww90nbbon/GPU_Table.png

and the plot:

http://s17.postimg.org/xw88p0ydr/GPU_Speed.png

As we can see from the graph above, the plot of M versus P is pretty linear, which shows a direct dependence between the calculated 'theoretical computing power' and the real-world power, measured experimentally (in megahashes per second) on actual hardware.

As a result, you can now evaluate the mining speed / performance of any GPU in the AMD Radeon 6xx0 and 7xx0 series (and probably the forthcoming 8xx0 series) via a simple linear equation:

M (MH/s) = 0.2935 x (Shaders x Frequency in GHz)

where:

- (M) - the BTC mining speed in megahashes per second (MH/s);
- (Shaders x Frequency) - the number of shaders on the GPU multiplied by their stock frequency in gigahertz (GHz);
- (0.2935) - a scale factor.
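As a quick sketch, this linear estimate can be wrapped in a small Python function (the 0.2935 scale factor is the fitted value from the plot above; real-world speeds will of course also depend on the miner software, drivers and overclocking):

```python
# Estimated BTC mining speed in MH/s from the fitted linear model
# M = 0.2935 * shaders * clock_ghz (scale factor taken from the plot above).
SCALE = 0.2935

def estimated_mhps(shaders, clock_ghz):
    return SCALE * shaders * clock_ghz

# Example: Radeon HD 7970 (2048 shaders @ 0.925 GHz) - roughly 556 MH/s.
print(round(estimated_mhps(2048, 0.925)))
```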

That's all, folks - and happy mining!


Title: Re: Mining Performance Evaluation
Post by: flint78 on June 11, 2013, 07:36:00 AM
Interesting. I'd like to know whether a similar trend exists for Nvidia cards.


Title: Re: Mining Performance Evaluation
Post by: bitbeast on June 11, 2013, 07:53:31 AM

I'd like to know whether a similar trend exists for Nvidia cards.

- Plenty of data on bitcoin mining performance is available here:

https://en.bitcoin.it/wiki/Mining_Hardware_Comparison

But as a rule NVidia GPUs are significantly slower at BTC mining due to the specifics of NVidia's graphics hardware architecture.

That's why people prefer mining on AMD graphics cards...  :)


Title: Re: Mining Performance Evaluation
Post by: smolen on July 22, 2013, 07:40:50 AM
M (MH/s) = 0.2935 x (Shaders x Frequency in GHz)

where:

- (M) - the BTC mining speed in megahashes per second (MH/s);
- (Shaders x Frequency) - the number of shaders on the GPU multiplied by their stock frequency in gigahertz (GHz);
- (0.2935) - a scale factor.
And this scale factor is almost exactly 1 GHz / 1 MH/s / 3375 integer ops = 0.3034, see https://bitcointalk.org/index.php?topic=7964.msg550288#msg550288


Title: Re: Mining Performance Evaluation
Post by: bitbeast on July 22, 2013, 08:11:06 AM

Quote
- (0.2935) is a scale factor.
And this scale factor is almost exactly 1 GHz/1MHs/3375 integer ops = 0.3034, see https://bitcointalk.org/index.php?topic=7964.msg550288#msg550288

- Yes, the difference between the above numbers is only 3 %, which is probably within the margin of error.


Title: Re: Mining Performance Evaluation
Post by: choicatn on July 22, 2013, 04:56:31 PM
Interesting. I'd like to know whether a similar trend exists for Nvidia cards.

As would I, since I have a lot of NVidia hardware for gaming and not many AMD/ATI cards.


Title: Re: Mining Performance Evaluation
Post by: macmn on July 23, 2013, 03:26:43 PM
I am currently going with AMD 7950 cards, as they seem like the best price/performance cards out there right now.