Bitcoin Forum
May 12, 2024, 08:22:46 AM
News: Latest Bitcoin Core release: 27.0 [Torrent]
 
Author Topic: Cray Unveils Its First GPU Supercomputer  (Read 2919 times)
Gameover (OP)
Member
Activity: 92
Merit: 10

May 25, 2011, 01:26:17 AM
#1

http://www.hpcwire.com/hpcwire/2011-05-24/cray_unveils_its_first_gpu_supercomputer.html

so what does the hashrate of 70 teraflops turn out to be?

NEURAL.CLUB - FIRST SOCIAL ARTIFICIAL INTELLIGENCE
Txyru
Member
Activity: 61
Merit: 10

May 25, 2011, 02:26:54 AM
#2

A 5850 does 2 teraflops (or more), so assuming 300 Mhash/s for each 5850, I estimate about 10.5 Ghash/s for the 70 teraflops...
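That back-of-envelope estimate can be written out explicitly. A minimal sketch in Python; the 2-teraflop and 300 Mhash/s per-card figures are the poster's assumptions, not measured specs:

```python
# Scale the Cray's headline FLOPS figure by a known card (Radeon HD 5850).
# Assumptions (from the post): one 5850 ~ 2 TFLOPS and ~300 Mhash/s.
CRAY_TFLOPS = 70.0        # headline figure from the HPCwire article
TFLOPS_PER_5850 = 2.0     # poster's per-card estimate
MHASH_PER_5850 = 300.0    # poster's per-card SHA-256 rate

equivalent_cards = CRAY_TFLOPS / TFLOPS_PER_5850            # 35 card-equivalents
estimated_ghash = equivalent_cards * MHASH_PER_5850 / 1000  # Mhash/s -> Ghash/s

print(f"{equivalent_cards:.0f} cards -> {estimated_ghash:.1f} Ghash/s")
# -> 35 cards -> 10.5 Ghash/s
```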
commlinx
Full Member
Activity: 294
Merit: 100

May 25, 2011, 03:12:06 AM
#3

I noticed in the article that the GPUs are Teslas. While I'm not familiar with that model, presumably they are better at floating point than integer work, the same as the consumer Nvidia cards.
Basiley
Newbie
Activity: 42
Merit: 0

May 25, 2011, 03:16:18 AM
Last edit: May 25, 2011, 03:40:01 AM by Basiley
#4

Nvidia Tesla inside?
No double-precision math support?
What kind of "super" would that be? :/
A home super? :| Like an Octane 3?
Meanwhile the new Fortran 2003-adapted libs enjoy quad-precision FP math :-/

p.s.
Cray finally jumps on the hybrid-supercomputer bandwagon :/ noting the hybrid nature of the five most powerful supers.

p.p.s.
They'll probably scrap their FPGA/vector module development soon.
w128
Newbie
Activity: 14
Merit: 0

May 25, 2011, 03:20:41 AM
#5

Previously, the only difference between certain Tesla models and certain consumer GeForce cards was the lack of video connectors and a (presumably) more stringent binning process to ensure Tesla parts were stable within specific tolerances for 24/7 use.

Syke
Legendary
Activity: 3878
Merit: 1193

May 25, 2011, 05:11:21 AM
#6

ROFL. Cray jumped the shark when they decided to use Windows on their HPCs.

Buy & Hold
IlbiStarz
Full Member
Activity: 336
Merit: 100

May 25, 2011, 05:39:03 AM
#7

I thought their "non-GPU" supercomputers were already hitting petaflops? So wouldn't those be faster?
allinvain
Legendary
Activity: 3080
Merit: 1080

May 25, 2011, 06:31:51 AM
#8

Bah, why didn't they partner up with AMD? Cray is already a big purchaser of AMD Opteron chips, so why not poke AMD to develop a solution similar to the Teslas? Multiple Cayman cores on a PCB, anyone? Hmm, or is it because Nvidia's CUDA language is easier to code with?


commlinx
Full Member
Activity: 294
Merit: 100

May 25, 2011, 10:19:44 AM
#9

Quote from: allinvain
Hmm, or is it because Nvidia's CUDA language is easier to code with?

My guess is that it's because floating point is more useful for most simulations. I'm sure someone will be able to point out exceptions, but the only two mainstream research applications I've got that support GPUs are a GIS application and an RF (radio frequency) simulation package, where floating point is a natural fit. I'd guess most physics and "real world" simulations would fall into the same category.

The only two GPU applications I've got where integer is better are Bitcoin mining and a password cracker. Outside a few government agencies and security researchers, that's probably not the sort of thing many people throw a lot of money at.
Gameover (OP)
Member
Activity: 92
Merit: 10

May 25, 2011, 02:01:23 PM
#10

Quote from: commlinx
My guess is that it's because floating point is more useful for most simulations. I'm sure someone will be able to point out exceptions, but the only two mainstream research applications I've got that support GPUs are a GIS application and an RF (radio frequency) simulation package, where floating point is a natural fit. I'd guess most physics and "real world" simulations would fall into the same category.

The only two GPU applications I've got where integer is better are Bitcoin mining and a password cracker. Outside a few government agencies and security researchers, that's probably not the sort of thing many people throw a lot of money at.

Lol, good point. For both of the integer applications, if you have significant funds or influence you can simply use the much cheaper solution of interrogation to get the password, or hire a criminal and steal someone's bitcoins. For floating point, it's impossible to influence a weather pattern or cell tower via political power Grin

Basiley
Newbie
Activity: 42
Merit: 0

May 25, 2011, 03:30:53 PM
#11

Basically because AMD has provided virtually NO support for GPGPU developers (both software and hardware) for years!
Even finding datasheets is troublesome (and many things are still undisclosed :/).
Nvidia, by contrast, actually cooperates with developers and even assists in development itself Tongue
I wish AMD would invest a little more in building a healthy ecosystem around its GPGPU developers/enthusiasts :-|
starting with improving the OpenCL SDK, expanding the developer-support division, and helping people take their first GPGPU steps.
bulanula
Hero Member
Activity: 518
Merit: 500

May 25, 2011, 03:34:24 PM
#12

This has 50 petaflops of power.

More than the 40 petaflops of the whole Bitcoin network.

This could kill the network in a matter of seconds!

Nobody is invincible, including Bitcoin!
Basiley
Newbie
Activity: 42
Merit: 0

May 25, 2011, 03:39:26 PM
#13

Quote from: bulanula
This has 50 petaflops of power. More than the 40 petaflops of the whole Bitcoin network. This could kill the network in a matter of seconds! Nobody is invincible, including Bitcoin!

The trick/lie lies in HOW those "petaflops" are measured Wink
Such a machine hardly reaches that rate on general-purpose math, and it's even less likely with double precision.
Both points make this "super" pointless compared to the Bitcoin network.
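The objection can be illustrated numerically. Even taking the peak figure at face value and converting FLOPS to hashes with the per-card ratio from the 5850 estimate earlier in the thread (an assumption; SHA-256 mining is integer work, so actual throughput would be lower), the result is an upper bound, not a prediction:

```python
# Optimistic upper bound: convert a peak-FLOPS headline to a hashrate using
# the ratio implied by the earlier 5850 estimate (~300 Mhash/s per 2 TFLOPS).
# Real SHA-256 throughput doesn't track floating-point throughput.
PEAK_PFLOPS = 50.0
MHASH_PER_TFLOP = 300.0 / 2.0   # assumption borrowed from the 5850 figures

peak_tflops = PEAK_PFLOPS * 1000                         # petaflops -> teraflops
optimistic_ghash = peak_tflops * MHASH_PER_TFLOP / 1000  # Mhash/s -> Ghash/s

print(f"{optimistic_ghash:.0f} Ghash/s at best")
# -> 7500 Ghash/s at best
```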
commlinx
Full Member
Activity: 294
Merit: 100

May 25, 2011, 03:56:52 PM
#14

Quote from: bulanula
This has 50 petaflops of power.

Considering they say it may scale up to that, you should ask for a quote on that configuration. I see a card at $3K for a teraflop, which would make it $150M. That was a one-off price for a single PC card, but I'd still assume the full system would be in the tens of millions, so if you had that much cash it would be better spent on GPUs or an ASIC for mining.
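The $150M figure checks out arithmetically. A quick sketch; the $3K-per-teraflop price is the poster's one-off data point, not vendor pricing:

```python
# Naive cost of a 50-petaflop machine priced at $3K per teraflop.
TARGET_PFLOPS = 50.0
USD_PER_TFLOP = 3_000.0

total_tflops = TARGET_PFLOPS * 1000        # 1 petaflop = 1000 teraflops
total_cost_usd = total_tflops * USD_PER_TFLOP

print(f"${total_cost_usd / 1e6:.0f}M")
# -> $150M
```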
Gameover (OP)
Member
Activity: 92
Merit: 10

May 25, 2011, 05:06:11 PM
#15

Quote from: commlinx
Considering they say it may scale up to that, you should ask for a quote on that configuration. I see a card at $3K for a teraflop, which would make it $150M. That was a one-off price for a single PC card, but I'd still assume the full system would be in the tens of millions, so if you had that much cash it would be better spent on GPUs or an ASIC for mining.

Again, it would be much cheaper to simply hire a team to find the people with the most bitcoins and extort them.