Author Topic: supercomputer GPUs vs consumer GPUs?  (Read 1862 times)
keten (OP)
Newbie
Activity: 4
Merit: 0
June 11, 2011, 07:38:56 PM
 #1

So take a look at these two cards:

http://www.nvidia.com/docs/IO/105880/DS_Tesla-M2090_LR.pdf
M2050:
1030 single precision GFLOPs
448 CUDA cores
3 GB GDDR5
148 GB/s memory bandwidth
cost: $2400

vs.

http://www.geeks3d.com/20110324/nvidia-geforce-gtx-590-officially-launched-specifications-and-reviews-hd-6990-is-still-faster/
GeForce GTX 590
2486 single precision GFLOPs
1024 CUDA cores
3 GB GDDR5
327 GB/s memory bandwidth
cost: $699

So why do people use supercomputer GPUs? Spec for spec, they seem worse in every way. Is there any way the Tesla would be better than the consumer card for bitcoin mining?
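For reference, here's the raw price/performance from those two spec sheets (a quick back-of-the-envelope sketch in Python, using only the numbers above; street prices obviously move around):

Code:
# Rough price/performance comparison from the two spec sheets above.
cards = {
    # name: (single-precision GFLOPS, memory bandwidth in GB/s, price in USD)
    "Tesla M2050":     (1030, 148, 2400),
    "GeForce GTX 590": (2486, 327,  699),
}

for name, (gflops, bandwidth, price) in cards.items():
    print(f"{name}: {gflops / price:.2f} GFLOPS per dollar, "
          f"{bandwidth / price:.3f} GB/s per dollar")

# Tesla M2050:     0.43 GFLOPS per dollar, 0.062 GB/s per dollar
# GeForce GTX 590: 3.56 GFLOPS per dollar, 0.468 GB/s per dollar

The consumer card wins by roughly 8x on compute per dollar, which is the whole puzzle.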


Basiley
Newbie
Activity: 42
Merit: 0
June 11, 2011, 07:53:13 PM
 #2

Single-precision shaders == fail.
Scalar architecture == fail.
So, in short, there's nothing interesting in NVIDIA except the easy-to-start SDK.
But that revolves around a proprietary, slowly advancing API called CUDA, which isn't worth investing development effort in.
Rob P.
Member
Activity: 84
Merit: 10
June 11, 2011, 07:53:30 PM
 #3

No one serious is using an NVIDIA card to mine.  There's no point; NVIDIA cards blow at mining compared to ATI cards of the same cost.

SHA-256 is built on integer operations, which ATI cards excel at.
NVIDIA cards are better at floating-point operations, which mapping polygons onto a screen requires lots of...

Basically, neither of those cards is great at mining.
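To make the integer-op point concrete: the SHA-256 compression function is nothing but 32-bit adds, XORs, ANDs, and rotates. Here's a rough Python sketch of its inner bit functions (illustration only, not a miner; the one-instruction-rotate note in the comment is the explanation usually given for the ATI edge):

Code:
# The core bit operations of SHA-256 -- all pure 32-bit integer work.
MASK32 = 0xFFFFFFFF

def rotr(x, n):
    # 32-bit right rotate. ATI's VLIW GPUs of this era can reportedly do
    # this in one BIT_ALIGN_INT instruction; Fermi needs shift+shift+or.
    return ((x >> n) | (x << (32 - n))) & MASK32

def ch(x, y, z):
    # "Choose": take bits from y where x is 1, from z where x is 0.
    return (x & y) ^ (~x & z & MASK32)

def maj(x, y, z):
    # "Majority" of the three input bits at each position.
    return (x & y) ^ (x & z) ^ (y & z)

def big_sigma0(x):
    return rotr(x, 2) ^ rotr(x, 13) ^ rotr(x, 22)

def big_sigma1(x):
    return rotr(x, 6) ^ rotr(x, 11) ^ rotr(x, 25)

A miner grinds through millions of rounds of these per second, so integer ALU count and rotate throughput are what matter; floating-point muscle is irrelevant.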

--

If you like what I've written here, consider tipping the messenger:
1GZu4CtHa6ai8iWoWiVFxV5VVoNte4SkoG

If you don't like what I've written, send me a Tip and I'll stop talking.
bcpokey
Hero Member
Activity: 602
Merit: 500
June 11, 2011, 07:56:29 PM
 #4

Is the OP asking about using these cards for mining or for general use? No one mines on NVIDIA cards, regardless of their specs.

In general, though, people use the supercomputer GPUs for calculations that need more than single precision.
Basiley
Newbie
Activity: 42
Merit: 0
June 11, 2011, 08:00:20 PM
 #5

Using supercomputers for single-precision computation is a joke, sure. Look at typical LINPACK/Livermore library usage, for example. ;)
Even double precision isn't enough anymore, and a new quad-precision floating-point format has been introduced both for Fortran (in 2003) and other languages.
keten (OP)
Newbie
Activity: 4
Merit: 0
June 11, 2011, 08:07:02 PM
 #6

Hmmm, are NVIDIAs really that bad?

I have the opportunity to use a setup with 22 GB RAM and 2 x NVIDIA Tesla “Fermi” M2050 GPUs. Is there really no way to get mining performance out of that comparable to just any old ATI card?
Basiley
Newbie
Activity: 42
Merit: 0
June 11, 2011, 08:10:04 PM
 #7

Quote from: keten on June 11, 2011, 08:07:02 PM
Hmmm, are NVIDIAs really that bad?

I have the opportunity to use a setup with 22 GB RAM and 2 x NVIDIA Tesla “Fermi” M2050 GPUs. Is there really no way to get mining performance out of that comparable to just any old ATI card?

They're good for rendering, i.e., in games, especially complex scenes with cool lighting and huge view distances.
But for GPGPU use they suck, and they stay in this business only because ATI/AMD is doing it wrong.
bcpokey
Hero Member
Activity: 602
Merit: 500
June 11, 2011, 08:11:53 PM
 #8

You might be looking at about 100 MHash/sec per card? Not sure. But that's about the equivalent hashrate of an old 4850 (a $50 AMD card).
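In MH/s-per-dollar terms (a rough sketch using only the figures in this thread; real hashrates depend on the miner software and clocks):

Code:
# MH/s per dollar, using only the numbers quoted in this thread.
setups = {
    "Tesla M2050":    (100, 2400),  # ~100 MH/s guess above, $2400 list price
    "Radeon HD 4850": (100,   50),  # "about the equivalent" hashrate, ~$50 card
}

for name, (mhash, price) in setups.items():
    print(f"{name}: {mhash / price:.3f} MH/s per dollar")

# Tesla M2050:    0.042 MH/s per dollar
# Radeon HD 4850: 2.000 MH/s per dollar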