Bitcoin Forum
Author Topic: Mining Performance Evaluation  (Read 2457 times)
bitbeast (OP)
Member
**
Offline

Activity: 70
Merit: 10
June 11, 2013, 05:46:21 AM
Last edit: June 11, 2013, 08:30:58 AM by bitbeast
 #1


Well, guys and gals - as the era of GPU mining seems to be nearing its logical end and the first ASIC miners make their way to their first happy customers, I've decided to publish a small study of the factors that really affect modern GPU mining performance.

My idea was that on a task that parallelizes as well as the SHA-256-based bitcoin mining algorithm, only two GPU parameters should determine the total chip speed. I assumed these parameters are:

- the number of GPU-specific processors, the so-called 'shaders' (shader units);
- the clock frequency at which those shader units are able to run.

So it is possible to estimate the total computing power P of every GPU with a single synthetic number - the product of the number of shader units N and the GPU clock frequency at 100% load F:

P = N x F

- For example, the AMD Radeon HD 6670 has 480 shaders running at an 800 MHz (0.8 gigahertz) clock frequency in full-speed mode, i.e. at 100% GPU load.
So the computational power of the Radeon HD 6670 may be estimated by the formula as (480 shaders x 0.8 GHz) = 384 'theoretical' units of computing power.

- In comparison, the AMD Radeon HD 7970, one of the most powerful GPUs on the market today, has 2048 shaders running at a 925 MHz (0.925 gigahertz) clock frequency at full GPU load.
Its power may be estimated as (2048 shaders x 0.925 GHz) = 1894 units of computing power, so the AMD Radeon HD 7970 looks to be about 5 times more powerful than the AMD Radeon HD 6670.
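
The same arithmetic in a few lines of Python, just as a sketch (the card names and numbers are simply the two examples above):

Code:
# Theoretical power P = N (shaders) x F (clock in GHz at full load)
cards = {
    "Radeon HD 6670": (480, 0.800),
    "Radeon HD 7970": (2048, 0.925),
}

for name, (shaders, freq_ghz) in cards.items():
    p = shaders * freq_ghz
    print(f"{name}: P = {shaders} x {freq_ghz} GHz = {p:.0f} units")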

The next question is: how can these 'theoretical units of power' be compared with the real productivity of a GPU on a task such as bitcoin mining?

The answer is simple - draw a 2D plot where the vertical axis represents the real computing power of the GPU on (for example) bitcoin mining, in megahashes per second (M, measured in MH/s), while the horizontal axis represents the theoretical computing power of the GPU calculated by the method above.

So for every modern GPU we get a simple pair of numbers to use as 2D plot coordinates: theoretical power P and real power M.

Here is the final table:

[table: GPU model, theoretical power P = shaders x GHz, measured hashrate M in MH/s]

and the plot:

[plot: measured hashrate M (MH/s) versus theoretical power P, one point per GPU]
As we can see from the graph above, the plot of M versus P is quite linear, which shows a direct dependence between the calculated 'theoretical computing power' and the real-world performance measured in megahashes per second on actual hardware.
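
If anyone wants to reproduce the fit, here is a minimal numpy sketch; the (P, M) pairs below are only illustrative placeholders that lie on the fitted line, not my measured data - take the real values from the table above:

Code:
import numpy as np

# Illustrative (P, M) pairs: P = shaders x GHz, M = measured hashrate in MH/s.
# Replace these with real measurements from the table above.
P = np.array([384.0, 1022.4, 1894.4])
M = np.array([113.0, 300.0, 556.0])

# Least-squares fit of a line through the origin: M = k * P
k = float(P @ M) / float(P @ P)
print(f"scale factor k = {k:.4f} MH/s per theoretical unit")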

As a result, you can now estimate the mining speed/performance of any AMD Radeon 6xx0 or 7xx0 GPU, and probably of the forthcoming 8xx0 series as well, via a simple linear formula (see the small Python sketch after the list below):

M (MH/s) = 0.2935 x (Shaders x Frequency in GHz)

where:

- M is the BTC mining speed in megahashes per second (MH/s);
- (Shaders x Frequency) is the product of the number of shaders on the GPU and their stock frequency in gigahertz (GHz);
- 0.2935 is a scale factor.
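
As a quick sketch, the whole formula fits in a couple of lines of Python (the function name is mine, just for illustration):

Code:
def estimate_hashrate_mhs(shaders, freq_ghz, k=0.2935):
    """Estimated BTC mining speed in MH/s from shader count and stock clock in GHz."""
    return k * shaders * freq_ghz

# Example: Radeon HD 7970, 2048 shaders at 0.925 GHz
print(estimate_hashrate_mhs(2048, 0.925))  # ~556 MH/s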

That's all, folks - and happy mining!

Time is money mining.
flint78
Newbie
*
Offline

Activity: 1
Merit: 0
June 11, 2013, 07:36:00 AM
 #2

Interesting. I'd like to know whether a similar trend exists for Nvidia cards.
bitbeast (OP)
Member
**
Offline

Activity: 70
Merit: 10
June 11, 2013, 07:53:31 AM
Last edit: July 20, 2013, 11:30:31 PM by bitbeast
 #3


Quote
I'd like to know whether a similar trend exists for Nvidia cards.

- Plenty of data on bitcoin mining performance is available here:

https://en.bitcoin.it/wiki/Mining_Hardware_Comparison

But as a rule, NVidia GPUs are significantly slower at BTC mining because of the specifics of the NVidia graphics hardware architecture.

That's why people prefer mining on AMD graphics boards...  Smiley

Time is money mining.
smolen
Hero Member
*****
Offline

Activity: 524
Merit: 500
July 22, 2013, 07:40:50 AM
 #4

Quote
M (MH/s) = 0.2935 x (Shaders x Frequency in GHz)

where:

- M is the BTC mining speed in megahashes per second (MH/s);
- (Shaders x Frequency) is the product of the number of shaders on the GPU and their stock frequency in gigahertz (GHz);
- 0.2935 is a scale factor.
And this scale factor is almost exactly 1 GHz/1MHs/3375 integer ops = 0.3034, see https://bitcointalk.org/index.php?topic=7964.msg550288#msg550288

Of course I gave you bad advice. Good one is way out of your price range.
bitbeast (OP)
Member
**
Offline

Activity: 70
Merit: 10
July 22, 2013, 08:11:06 AM
 #5


Quote
- 0.2935 is a scale factor.
And this scale factor is almost exactly 1 GHz/1MHs/3375 integer ops = 0.3034, see https://bitcointalk.org/index.php?topic=7964.msg550288#msg550288

- Yes, the difference between the above numbers is only about 3%, which is probably within the margin of error.
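
Just to show the arithmetic behind that "3%" (plain Python, nothing assumed beyond the two numbers above):

Code:
# Relative difference between the fitted scale factor and smolen's estimate
k_fit, k_est = 0.2935, 0.3034
print(f"{abs(k_est - k_fit) / k_fit * 100:.1f} %")  # ~3.4 %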

Time is money mining.
choicatn
Newbie
*
Offline

Activity: 14
Merit: 0
July 22, 2013, 04:56:31 PM
 #6

Quote
Interesting. I'd like to know whether a similar trend exists for Nvidia cards.

As would I, since I have a lot of NVidia hardware for gaming and not many AMD/ATI cards.
macmn
Newbie
*
Offline

Activity: 21
Merit: 0
July 23, 2013, 03:26:43 PM
 #7

I am currently going with AMD 7950 cards, as they seem like the best price/performance cards out there right now.