Bitcoin Forum
Author Topic: GK104: nVidia's Kepler to be the First Mining Card?  (Read 8300 times)
rjk
Sr. Member
****
Offline Offline

Activity: 448
Merit: 250


1ngldh


View Profile
March 20, 2012, 03:26:48 PM
 #21

The NDA is up and reviews are out. The 680 is shredding the 7970 in most applications... Can't wait to see the mining performance.
What games are most similar to bitcoin mining in that they use integers heavily? Perhaps we could compare such a game and make a little bit of a better determination. I doubt that many games make heavy use of integer ops though, at least not in a remotely similar way to how mining does.
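
For a sense of why mining is all integer work: the SHA-256 compression function is nothing but 32-bit adds, XORs, ANDs and rotates. A minimal sketch of one round follows, written from the FIPS 180 description rather than taken from any particular miner's kernel:

Code:
#include <stdint.h>

/* 32-bit right rotate */
static uint32_t rotr(uint32_t x, unsigned n) { return (x >> n) | (x << (32 - n)); }

/* One SHA-256 round (sketch, per FIPS 180). Every operation here is a 32-bit
   integer op, which is why mining speed tracks integer throughput, not FLOPs. */
static void sha256_round(uint32_t s[8], uint32_t k, uint32_t w)
{
    uint32_t S1  = rotr(s[4], 6) ^ rotr(s[4], 11) ^ rotr(s[4], 25);
    uint32_t ch  = (s[4] & s[5]) ^ (~s[4] & s[6]);
    uint32_t t1  = s[7] + S1 + ch + k + w;
    uint32_t S0  = rotr(s[0], 2) ^ rotr(s[0], 13) ^ rotr(s[0], 22);
    uint32_t maj = (s[0] & s[1]) ^ (s[0] & s[2]) ^ (s[1] & s[2]);
    uint32_t t2  = S0 + maj;

    s[7] = s[6]; s[6] = s[5]; s[5] = s[4]; s[4] = s[3] + t1;
    s[3] = s[2]; s[2] = s[1]; s[1] = s[0]; s[0] = t1 + t2;
}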

Mining Rig Extraordinaire - the Trenton BPX6806 18-slot PCIe backplane [PICS] Dead project is dead, all hail the coming of the mighty ASIC!
MrTeal
Legendary
*
Offline Offline

Activity: 1274
Merit: 1004


View Profile
March 20, 2012, 03:41:11 PM
 #22

[attached images: GTX 680 review slides, including a Sandra crypto (SHA-256) benchmark comparing the 6990 and 7970]
Power usage is awesome and it destroys the 7970 in gaming performance per watt, but I wouldn't expect it to mine well. The price of a 7970 might be dropping soon though.
DeathAndTaxes
Donator
Legendary
*
Offline Offline

Activity: 1218
Merit: 1079


Gerald Davis


View Profile
March 20, 2012, 03:47:48 PM
Last edit: March 20, 2012, 03:59:30 PM by DeathAndTaxes
 #23

No games use integer ops heavily; that is what the CPU is used for. Smiley

My guess is it will suck at mining.  Nothing in any of the reviews indicated improved integer performance.  I doubt NVidia would mention Bitcoin specifically, but something like "improved encryption for OpenCL/CUDA-accelerated applications like WinZip" would be a good sign.

It is roughly 1.3x to 1.7x as fast as a 580 GTX.  If it has similar relative performance (int ops vs FLOPs), that puts it around 200 MH/s, maybe 250 MH/s.
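
(Back-of-envelope, assuming a stock GTX 580 mines somewhere around 150 MH/s, which is roughly what was being reported at the time: 1.3x to 1.7x of that works out to about 195-255 MH/s, which is where the 200-250 MH/s guess lands.)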

I agree with the post above.  The good news is that because it "beats" the 7970, AMD will need to undercut on price.

On edit:
http://www.fudzilla.com/home/item/26437-nvidia-gtx-680-price-now-set-at-us-$499

Looks like NVidia is striking back.  If the GTX 680 is $499 and it outperforms the 7970, then the 7970 will need a steep cut to be competitive.  Most gamers have little loyalty.  They just want max fps/$. Smiley

Obviously launch-day prices will be higher, but if they can keep the supply up I would imagine AMD needs to look at <$480 to avoid losing share.
Mousepotato
Hero Member
*****
Offline Offline

Activity: 896
Merit: 1000


Seal Cub Clubbing Club


View Profile
March 20, 2012, 05:00:01 PM
 #24

If I listen closely, I can almost hear the price of 7970s dropping.

Mousepotato
bulanula
Hero Member
*****
Offline Offline

Activity: 518
Merit: 500



View Profile
March 20, 2012, 11:54:49 PM
 #25

Indeed. This is good: with the 7970 getting destroyed, those crappy AMD prices should come down now that gamers are buying the green stuff.

Too bad we have to stick with needing an X server running and the hardcoded 8-GPU limit Cry

I really think somebody should try it out for mining first before calling it a crap mining card Huh
racerguy
Sr. Member
****
Offline Offline

Activity: 270
Merit: 250


View Profile
March 21, 2012, 12:52:53 AM
 #26

I don't understand the people buying brand new 7970s thinking the resale value will be better; it's going to lose a few hundred dollars in value in the first few months, while a 5970 still maintains the majority, if not all, of its value if you bought one when the 7970 came out.  Anyway, more for me I guess.
The-Real-Link
Hero Member
*****
Offline Offline

Activity: 533
Merit: 500


View Profile
March 21, 2012, 01:51:46 AM
 #27

  Sorry for bumping my own thread; I hadn't seen the info edited into this post when I checked it before posting mine.

  Since I was reconfiguring my main GPUs anyway, I sold my 580s and should be picking up one (or, if prices are that darned good, maybe two) of the 680s.  I can gladly give you guys mining results once I have time to mess around with the cards.

Oh Loaded, who art up in Mt. Gox, hallowed be thy name!  Thy dollars rain, thy will be done, on BTCUSD.  Give us this day our daily 10% 30%, and forgive the bears, as we have bought their bitcoins.  And lead us into quadruple digits
DeathAndTaxes
Donator
Legendary
*
Offline Offline

Activity: 1218
Merit: 1079


Gerald Davis


View Profile
March 21, 2012, 02:35:10 AM
 #28

Although I don't have much hope, I would be interested to see the results, TRL.
goxed
Legendary
*
Offline Offline

Activity: 1946
Merit: 1006


Bitcoin / Crypto mining Hardware.


View Profile
March 21, 2012, 04:00:22 AM
 #29

Although I don't have much hope, I would be interested to see the results, TRL.

Here are results by proxy
http://www.theinquirer.net/inquirer/review/2162193/nvidias-gtx680-thrashed-amds-mid-range-radeon-hd-7870-gpu-compute

Reviewing Bitcoin / Crypto mining Hardware.
rjk
Sr. Member
****
Offline Offline

Activity: 448
Merit: 250


1ngldh


View Profile
March 21, 2012, 03:06:07 PM
 #30

So according to that article, it is horrible at FP ops, and the 7970 beats the pants off of it in that benchmark. Wonder if they compensated by giving better integer ops? One can only hope....

Mining Rig Extraordinaire - the Trenton BPX6806 18-slot PCIe backplane [PICS] Dead project is dead, all hail the coming of the mighty ASIC!
MrTeal
Legendary
*
Offline Offline

Activity: 1274
Merit: 1004


View Profile
March 21, 2012, 04:22:38 PM
 #31

If you look at the second to last slide I posted, Sandra actually has a crypto benchmark that uses SHA256. I'm not sure how it's implemented since the numbers between the 6990 and the 7970 don't correspond to bitcoin hashing performance, but it should still be more useful than FP ops.
DeathAndTaxes
Donator
Legendary
*
Offline Offline

Activity: 1218
Merit: 1079


Gerald Davis


View Profile
March 21, 2012, 04:29:42 PM
 #32

If you look at the second to last slide I posted, Sandra actually has a crypto benchmark that uses SHA256. I'm not sure how it's implemented since the numbers between the 6990 and the 7970 don't correspond to bitcoin hashing performance, but it should still be more useful than FP ops.

I noticed that was int performance but didn't notice it was SHA-256.  

Yeah, it is hashing a large amount of data, whereas Bitcoin hashes a small amount of data a lot of times, so the units aren't going to be directly comparable; still, the relative performance should give us some idea.  Ballpark, it looks like less than half the performance of a 7970.  Exactly how many hashes it pulls will depend on how much it can be tweaked or overclocked (or gained from using pure CUDA).  Still, it isn't even close to matching a 7970 in OpenCL integer performance.  It likely won't even match a 7950.
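
For anyone trying to relate a bulk-throughput figure like Sandra's to mining numbers, here is a rough conversion under the assumption that one Bitcoin hash costs three 64-byte compression calls (double SHA-256 of an 80-byte header); the 10 GB/s input is purely illustrative, not a measured Sandra result:

Code:
#include <stdio.h>

/* Rough conversion from a bulk SHA-256 throughput figure (GB/s) to an
   equivalent Bitcoin hash rate. One Bitcoin hash is SHA-256(SHA-256(80-byte
   header)): 2 compression calls for the padded header plus 1 for the 32-byte
   intermediate digest = 3 calls of the 64-byte compression function.
   This ignores the midstate trick miners use to skip the first call. */
static double gbps_to_mhash(double gbps)
{
    double compressions_per_sec = gbps * 1e9 / 64.0;
    return compressions_per_sec / 3.0 / 1e6;   /* MH/s */
}

int main(void)
{
    /* illustrative input only, not a measured benchmark number */
    printf("10 GB/s bulk SHA-256 ~= %.0f MH/s equivalent\n", gbps_to_mhash(10.0));
    return 0;
}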
gat3way
Sr. Member
****
Offline Offline

Activity: 256
Merit: 250


View Profile
March 21, 2012, 05:48:42 PM
 #33

SHA-256 is based on a Merkle-Damgård construction. It doesn't matter much whether you hash a lot of data once or a small amount of data multiple times; in both cases you are computing the same compression function over and over. Of course, there are some differences, like hashing a small amount of data being more cache-friendly (with GPUs that could mean fewer __global reads), and there are quite a lot of optimizations that can be done when the input is fixed and small enough, as is the case with Bitcoin. Anyway, I believe the ratios would be more or less the same, provided the graphs are correct.

However, this does not take into account how well the code is optimized for a specific platform, the quality of the drivers, or the OpenCL stack (which does not perform as well as CUDA on NVidia).
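
A minimal sketch of that Merkle-Damgård point, with sha256_compress assumed to be defined elsewhere: the driver loop is identical whether the input is one huge buffer or many tiny ones, and a fixed small input (as in mining) just means the first block's result can be cached.

Code:
#include <stddef.h>
#include <stdint.h>
#include <string.h>

/* Compression function over one 64-byte block; assumed defined elsewhere. */
void sha256_compress(uint32_t state[8], const uint8_t block[64]);

/* Merkle-Damgard driver loop (padding omitted): hashing one big buffer or
   many small ones both reduce to repeated calls of the same compression
   function, so relative GPU performance should be similar either way. */
void sha256_md(uint32_t state[8], const uint8_t *msg, size_t nblocks)
{
    static const uint32_t IV[8] = {
        0x6a09e667, 0xbb67ae85, 0x3c6ef372, 0xa54ff53a,
        0x510e527f, 0x9b05688c, 0x1f83d9ab, 0x5be0cd19
    };
    memcpy(state, IV, sizeof IV);
    for (size_t i = 0; i < nblocks; i++)
        sha256_compress(state, msg + 64 * i);
    /* Mining twist: the first 64 bytes of the 80-byte header don't change
       per nonce, so miners cache the state after block 0 (the "midstate"). */
}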
tacotime (OP)
Legendary
*
Offline Offline

Activity: 1484
Merit: 1005



View Profile
March 21, 2012, 06:15:08 PM
 #34

So according to that article, it is horrible at FP ops, and the 7970 beats the pants off of it in that benchmark. Wonder if they compensated by giving better integer ops? One can only hope....

Not surprising; they nerfed the FPUs in GK104 to about 1/8 of those in the GTX 570, with the intent of making it a fast gaming card without much in the way of FP compute.  GK100 will be the one with very high FP/DPFP compute ability.

Code:
XMR: 44GBHzv6ZyQdJkjqZje6KLZ3xSyN1hBSFAnLP6EAqJtCRVzMzZmeXTC2AHKDS9aEDTRKmo6a6o9r9j86pYfhCWDkKjbtcns
MrTeal
Legendary
*
Offline Offline

Activity: 1274
Merit: 1004


View Profile
March 22, 2012, 02:03:53 AM
 #35

Well, after the Tom's leak it looks like Newegg had their cards listed early. It's been pulled now, but it was saved.
[attached image: saved copy of the pulled Newegg GTX 680 listing]
We'll know more once the NDA lifts (tonight?), but if gaming performance really is 10%+ better than Tahiti with lower power draw, I can't see how AMD could continue to sell many 7970s above $500.
MrTeal
Legendary
*
Offline Offline

Activity: 1274
Merit: 1004


View Profile
March 22, 2012, 02:10:42 AM
 #36

Looks like the NDA lifts tomorrow morning. Not sure which Canadian time zone.
http://www.hardwarecanucks.com/forum/video-cards/52615-12-more-hours.html
mrb
Legendary
*
Offline Offline

Activity: 1512
Merit: 1027


View Profile WWW
March 22, 2012, 08:31:02 AM
 #37

Newegg listed "Shader clock: 2012MHz".

Interesting... 1536 ALUs at 2012MHz would give it more processing power than an HD 6990. Either the GTX 680 is going to be the fastest GPU for mining, or it is a blatant mistake from Newegg, or the microarchitecture has undisclosed limitations that would prevent exploiting all this apparent power.
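
(Taking that listing at face value, and assuming the HD 6990's usual spec of 3072 ALUs at 830 MHz: 1536 × 2012 MHz is roughly 3.09 trillion ALU cycles per second versus about 2.55 trillion for the 6990, so on paper the Newegg number would indeed put GK104 ahead, before accounting for any per-instruction differences.)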
Vbs
Hero Member
*****
Offline Offline

Activity: 504
Merit: 500


View Profile
March 22, 2012, 11:06:53 AM
 #38

Newegg listed "Shader clock: 2012MHz".

Interesting... 1536 ALUs at 2012MHz would give it more processing power than an HD 6990. Either the GTX 680 is going to be the fastest GPU for mining, or it is a blatant mistake from Newegg, or the microarchitecture has undisclosed limitations that would prevent exploiting all this apparent power.

Shaders are clocked at 1411MHz by default on the highest profile, although there seems to be plenty of overclocking headroom. Need some serious reviews to come out! Tongue
amazingrando
Hero Member
*****
Offline Offline

Activity: 546
Merit: 500



View Profile
March 22, 2012, 02:27:33 PM
 #39

Tom's Hardware has its review posted.

http://www.tomshardware.com/reviews/geforce-gtx-680-review-benchmark,3161-2.html

Quote
Kepler’s shaders run at the processor’s frequency (1:1)

Bitbond - 105% PPS mining bond - mining payouts without buying hardware
DeathAndTaxes
Donator
Legendary
*
Offline Offline

Activity: 1218
Merit: 1079


Gerald Davis


View Profile
March 22, 2012, 02:36:46 PM
 #40

The Newegg listing is obviously a misquote.

http://www.tomshardware.com/reviews/geforce-gtx-680-review-benchmark,3161-4.html

Quote
First, we launch a single run of the Central Park level at 1920x1080 in DirectX 11 mode, without anti-aliasing. We get a 72.3 FPS result, and we observe GPU Boost pushing the GeForce GTX 680 between 1071 and 1124 MHz during the run (up from the 1006 MHz base).

The top chart shows that we’re bouncing around the upper end of GK104’s power ceiling. So, we increase the target board power by 15%. The result is a small jump to 74.2 FPS, along with clocks that vacillate between 1145 and 1197 MHz.

Figuring the power target boost likely freed up some thermal headroom, we then increase the offset by 100 MHz, which enables even better performance—76.1 FPS. This time, however, we get a constant 1215 MHz. Nvidia says this is basically as fast as the card will go given our workload and the power limit.

So why not up the target power again? At 130% (basically, the interface’s 225 W specification), performance actually drops to 75.6 FPS, and the graph over time shows a constant 1202 MHz. We expected more performance, not less. What gives? This is where folks are going to find a problem with GPU Boost. Because outcome is dependent on factors continually being monitored, performance does change over time. As a GPU heats up, current leakage increases. And as that happens, variables like frequency and voltage are brought down to counter a vicious cycle.

The effect is similar to heat soak in an engine. If you’re on a dynamometer doing back to back pulls, you expect to see a drop in horsepower if you don’t wait long enough between runs. Similarly, it’s easy to get consistently-high numbers after a few minute-long benchmarks. But if you’re gaming for hours, GPU Boost cannot be as effective.

Our attempt to push a 200 MHz offset demonstrates that, even though this technology tries to keep you at the highest frequency under a given power ceiling, increasing both limits still makes it easy to exceed the board’s potential and seize up.

Sliding back a bit to a 150 MHz offset gives us stability, but performance isn’t any better than the 100 MHz setting. No doubt, it’ll take more tinkering to find the right overclock with GPU Boost in the mix and always on.