Bitcoin Forum
December 03, 2016, 09:53:27 AM
News: Latest stable version of Bitcoin Core: 0.13.1  [Torrent].
 
Pages: [1]
  Print  
Author Topic: Bitcoin Rig FUTURE....is it possible?  (Read 1202 times)
tatsuchan
Full Member
***
Offline

Activity: 184



June 10, 2012, 12:26:23 PM
 #1

I'm more of an investor/miner than a computer scientist, so I'm asking the enlightened on here: would it be possible to rent out rigs pulling, say, 10+ GH/s of power to businesses, scientific research, poor governments, or any other service that requires large amounts of computing power?  There might be a business model in this if we had pool websites set up to do bulk work for third-party businesses.  I've heard that GPUs are typically bad at scientific research because of their small cache size.  Is there a way around this if a system were developed from the ground up?

Is there a reason why we can't rent our large Bitcoin rigs' power to something other than Bitcoin?
etotheipi
Legendary
*
expert
Offline

Activity: 1428


Core Armory Developer


June 10, 2012, 03:17:40 PM
 #2

Quote from: tatsuchan on June 10, 2012, 12:26:23 PM

Wherever you heard that "GPUs are bad at scientific research," it's wholly incorrect.  They're absolutely fantastic for tons of different kinds of scientific and computational work, though there are plenty of applications where they don't fit.  I should know: I work as a physicist, and we have dozens of projects within the organization that are adopting GP-GPU (general-purpose GPU) programming and outfitting all of their systems with GPUs.  I do a bit of GPGPU development for "scientific research" myself.

On the other hand, I will say from personal experience that CUDA -- NVIDIA's proprietary GPGPU platform -- is much more pleasant to develop with than OpenCL, even though CUDA won't work on ATI cards.  That's why all the computers in the labs where I work have GTX 480/580/680s.  ATI cards will generally perform better with algorithms equivalently optimized for the target architecture, but development ease is the issue, not speed.  If code currently takes 10 minutes to run on a CPU, then perhaps we can get it down to 15 seconds on NVIDIA cards with CUDA, or 9 seconds on ATI cards with OpenCL.  In that case we pick whichever architecture is easiest to develop for, because the difference between 15s and 9s is like paying $0.15 for a sandwich instead of $0.09... we'll pay the extra $0.06 for the more pleasant experience.

(P.S. -- Don't compare the raw power of ATI vs. NVIDIA based on mining speeds: ATI has some big advantages over NVIDIA that are specific to Bitcoin mining.  The gap is not nearly as large for most types of computation, especially double-precision operations, which are common in scientific programming and which NVIDIA has spent a lot of time optimizing.)

That's not to say that OpenCL isn't used.  Because of the speed of ATI cards, there are plenty of organizations that develop in OpenCL and prefer ATI cards for their overall performance.  I think there is plenty of market for GPU-cluster resources; the big question is how you're going to arrange for remote users to access the cluster and charge them for their use of it.
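One way to frame the access-and-billing question above is pay-per-GPU-second metering. Here is a minimal sketch of that idea; the class name, rate, and user names are all hypothetical illustrations, not any real service's API.

```python
# Minimal sketch of metering remote GPU-cluster use, assuming a
# pay-per-GPU-second model. All names and rates are hypothetical.
class GpuMeter:
    """Tracks GPU-seconds consumed per user and converts them to a charge."""

    def __init__(self, usd_per_gpu_second):
        self.rate = usd_per_gpu_second
        self.usage = {}  # user -> accumulated GPU-seconds

    def record(self, user, gpus, seconds):
        # A job on 4 GPUs for 10 seconds consumes 40 GPU-seconds.
        self.usage[user] = self.usage.get(user, 0.0) + gpus * seconds

    def bill(self, user):
        return round(self.usage.get(user, 0.0) * self.rate, 2)

meter = GpuMeter(usd_per_gpu_second=0.0001)   # hypothetical rate
meter.record("lab_a", gpus=8, seconds=3600)   # 8 GPUs for an hour
meter.record("lab_a", gpus=2, seconds=1800)   # 2 GPUs for half an hour
print(meter.bill("lab_a"))  # 28800 + 3600 = 32400 GPU-s -> 3.24
```

The open problems the post raises -- authenticating remote users and actually scheduling their kernels onto the cards -- sit on top of a ledger like this one.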


Founder and CEO of Armory Technologies, Inc.
Armory Bitcoin Wallet: Bringing cold storage to the average user!
Only use Armory software signed by the Armory Offline Signing Key (0x98832223)

Please donate to the Armory project by clicking here!    (or donate directly via 1QBDLYTDFHHZAABYSKGKPWKLSXZWCCJQBX -- yes, it's a real address!)
crosby
Sr. Member
****
Offline

Activity: 366


#RIP freemoney


June 10, 2012, 03:27:29 PM
 #3


http://gpuhash.com/

tatsuchan
Full Member
***
Offline

Activity: 184



June 10, 2012, 07:12:32 PM
 #4

Quote from: etotheipi on June 10, 2012, 03:17:40 PM



etotheipi, THANK YOU!  Seriously, that gives this idea a good direction.  It would make sense for the Bitcoin community to target ATI-specific platforms; after all, most of us are running 5800s, 6800s, etc.  It's great to know the idea is plausible, though.  I know there are various for-rent supercomputing services out there, and this might make a few of us some profit.  What I want to see put together is a pool that gives you options to put your GPU power to work and makes it cost-efficient (for most people, anyway).  I guess to even begin we need a way to consolidate and distribute a specific language/distribution protocol, as well as some client-side software that lets customers send their "problems" to a pool/network.

Do you think a pool could accept several different jobs at once and distribute them across hundreds of GPU nodes without the users ever having to know which companies they're doing the work for?  Just have a pool that pays, say, $0.01/share, where the shares belong to multiple contract jobs juggled by the pool.  Could it work the same way Namecoin and Bitcoin do?  In theory, could I set up a pool that lets users mine Bitcoin/Namecoin/for-profit computation all at once?  That might really bump the profit for all of us who own expensive rigs.
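The multi-job, pay-per-share idea above can be sketched as a toy work queue: the pool holds work units from several contract jobs, hands them out to miners who never see which client a unit serves, and credits a fixed payout per completed share. The job names, payout rate, and miner names are made up for illustration.

```python
# Toy sketch of a pool juggling multiple contract jobs and paying
# a flat rate per share. Everything here is hypothetical.
from collections import deque

SHARE_PAYOUT_USD = 0.01  # the $0.01/share figure from the post

class MultiJobPool:
    def __init__(self, jobs):
        # jobs: {job_id: number_of_work_units}
        self.queue = deque(
            (job_id, unit) for job_id, n in jobs.items() for unit in range(n)
        )
        self.credits = {}  # miner -> shares completed

    def get_work(self):
        # The miner just gets an opaque work unit; it never has to know
        # which company the job belongs to.
        return self.queue.popleft() if self.queue else None

    def submit_share(self, miner):
        self.credits[miner] = self.credits.get(miner, 0) + 1

    def payout(self, miner):
        return self.credits.get(miner, 0) * SHARE_PAYOUT_USD

pool = MultiJobPool({"protein_folding": 3, "render_farm": 2})
while pool.get_work() is not None:
    pool.submit_share("rig_01")  # pretend rig_01 completes every unit
print(round(pool.payout("rig_01"), 2))  # 5 shares -> 0.05
```

A real pool would also need per-job share validation (the way mining pools check proof-of-work shares), which is the hard part: arbitrary computation has no cheap, universal way to verify a result.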