Bitcoin Forum
December 10, 2016, 01:06:53 PM
Author Topic: Thread about GPU-mining and Litecoin  (Read 31865 times)
iddo | Sr. Member | Activity: 360
February 17, 2012, 07:43:26 AM | #21

Here's what I'd like to accomplish:
1) Figure out whether GPU mining litecoins is indeed more efficient, and if so, how much better it is.
2) Decide whether we want to switch to a new hashing algorithm that is more GPU-hostile.
3) If we do want to switch, there are a ton of other questions. Can we modify the scrypt params, or do we need something totally different? How far in the future do we schedule the algorithm switch? How do we get miners/pools/clients ready for the switch so that there's no downtime?

1)
If I understand the rumors correctly, a single high-end GPU would be about 25x faster than a single CPU core, as opposed to the 5x speedup that ArtForz initially predicted? Still a lot less than the ~400x speedup of bitcoin with a 5870 GPU vs. a single CPU core?

2)
I suppose that increasing the memory size parameter of scrypt to a very large amount (megabytes...), so that it doesn't fit in the cache, would make it infeasible to do hash attempts in parallel on a GPU (and maybe even on several CPU cores). But it also most likely means that people couldn't use their computer for other tasks while mining litecoins, due to system responsiveness issues. Therefore it's possible that the current scrypt parameters as chosen by ArtForz and Lolcust are the best, especially if bitcoin GPU mining remains more profitable than litecoin GPU mining.
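For concreteness, scrypt's scratchpad is 128·N·r bytes (per Percival's scrypt paper), so "megabytes" here means raising N well beyond the N=1024, r=1 parameters Litecoin is reported to use. A quick sketch (the cache-busting N below is an illustrative assumption, not a proposal):

```python
def scrypt_pad_bytes(n: int, r: int) -> int:
    """Scratchpad size of scrypt's SMix step: N blocks of 128*r bytes each."""
    return 128 * n * r

# Reported Litecoin parameters: a pad small enough to sit in L2 cache.
print(scrypt_pad_bytes(1024, 1))   # 131072 bytes = 128 KiB
# A hypothetical cache-busting variant as discussed: N = 2**20.
print(scrypt_pad_bytes(2**20, 1))  # 134217728 bytes = 128 MiB
```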


3)
Switching to another hashing algorithm in this case is relatively simple: we just need a protocol change saying that starting from some block #x in the future, the PoW will be checked according to the new hashing algorithm. By contrast, if there were a full pre-image attack on scrypt (which ain't gonna happen while there's no pre-image attack on sha256), then we would also need to add a new data field containing the new hash to all the old blocks in the chain; otherwise an attacker could use the pre-image attack to replace an old block with his own malicious block (see https://bitcointalk.org/index.php?topic=191.msg1585#msg1585).
There would be no "downtime": you can let the miners vote (similarly to the bitcoin p2sh vote, see https://github.com/bitcoin/bitcoin/pull/804) and see whether they agree to the switch before block #x is reached; if there's no majority, then the minority should withdraw their support before block #x so that the proposal is cancelled in a smooth way. From the point of view of the fork being a beneficial rehearsal for a similar bitcoin fork that could be useful in the future, I think it would be interesting if the litecoin p2pool has relatively high computing power.
ssvb | Jr. Member | Activity: 39
February 17, 2012, 09:35:28 AM | #22

If you want to believe me, I can vouch for mtrlt's GPU miner being significantly more efficient than any current CPU miner for scrypt.

From what I know of the GPU miner, option 3 of modifying the scrypt parameters will have minimal impact. The pad size did not seem to matter much, and it can be "compressed", for lack of a better word, with on-the-fly value reconstruction. So any increase in pad size will have a roughly equal impact on CPU miners until you exceed their cache size, at which point GPUs may become even more efficient.
Right now salsa20/8 is used as a core part of scrypt (8 rounds in the inner loop). Replacing it with something like salsa20/2 (just two rounds in the inner loop) would significantly improve performance on CPUs, because 4x fewer calculations would be involved. The memory access pattern would remain the same, resulting in almost no improvement for the miners that depend on memory performance (GPU miners, and to some extent the Cell/BE miner). So what's the problem? The variants of salsa20 with a lower number of rounds are supposedly significantly less secure, at least in some cases:
http://fse2008.epfl.ch/docs/papers/day_3_sess_3/29_Lausanne_FSE08_camera_ready.pdf
http://www.ecrypt.eu.org/stream/papersdir/2007/010.pdf

But I don't know exactly how all this applies to scrypt, because cryptography is definitely not my forte. That's why I think bringing the issue up with the scrypt author could make sense, once we get a better idea of realistic GPU performance.

Quote
I think you will be stuck with option 2, finding a completely different hashing algorithm.
Not in an attempt to troll the thread, but if you look at solidcoin's hash code, you will see it has random reads and writes of varying size, spread out over a large memory range and randomly aligned. These are key techniques for creating havoc with a GPU's memory access methods. I would suggest looking for code with similar traits if you really want to defeat GPUs, or at least keep them on a level playing field with CPUs.
This is all nice. But can we be sure that these convoluted hash calculations can't be algorithmically optimized and reduced to something that runs orders of magnitude faster?
Schwede65 | Sr. Member | Activity: 309
February 17, 2012, 12:01:55 PM | #23


According to the litecoin wiki mining hardware comparison, an AMD Phenom X4 955 at 3.6 GHz gets 24 kH/s @ 125 W. This translates to 0.192 kH/s per watt.
A GPU rig consisting of 69xx-series GPUs can produce 998 kH/s @ 920 W at the wall. This translates to 1.08 kH/s per watt.

So does at least a 5.6x increase in *efficiency* qualify as "significantly more"?

Consider the litecoin wiki entry for the Intel Core i7 860, which produces 25 kH/s at 153 W (a believable wattage for the entire system). That gives a system score of only 0.163 kH/s per watt, making the GPU example 6.6x more efficient.


To have a fair comparison:

These are measured on my BTC mining rigs. The point is that I only count the extra CPU wattage of the rigs, because they are mining BTC anyway and a CPU coin is mined on top when there is one; when there is no coin to mine, I save the wattage.

Core i3 / 2.4 GHz / 3 threads / ~8 W extra / 10 kH/s => 1.25 kH/W
CPU vs. GPU: the CPU is 1.157x more efficient Grin

Core i3 / 3.1 GHz / 3 threads / ~15 W extra / 12.5 kH/s => 0.83 kH/W
CPU vs. GPU: the CPU is 0.768x as efficient

Core i7 / 3.6 GHz / 7 threads / ~46 W extra / 32 kH/s => 0.695 kH/W
CPU vs. GPU: the CPU is 0.644x as efficient

For me, I can't see the 5x or 6x higher efficiency factor of the GPU...
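For reference, the kH-per-watt arithmetic in this exchange can be checked with a few lines of Python. All numbers are taken from the two posts above; note the i3 figure counts only the marginal CPU wattage, while the GPU figure is whole-system at the wall, which is why the conclusions differ:

```python
def kh_per_watt(kh_s: float, watts: float) -> float:
    """Hashing efficiency: kilohashes per second per watt consumed."""
    return kh_s / watts

gpu_69xx   = kh_per_watt(998, 920)   # ~1.08, whole rig at the wall
cpu_phenom = kh_per_watt(24, 125)    # ~0.192, whole system
cpu_i7_860 = kh_per_watt(25, 153)    # ~0.163, whole system
cpu_i3_24g = kh_per_watt(10, 8)      # 1.25, marginal CPU wattage only

print(round(gpu_69xx / cpu_phenom, 1))  # 5.6  (system vs. system)
print(round(gpu_69xx / cpu_i7_860, 1))  # 6.6
print(round(cpu_i3_24g / gpu_69xx, 2))  # ~1.15 (Schwede65's 1.157 used the rounded 1.08)
```

The apparent contradiction between the two posts is just a bookkeeping choice: marginal wattage versus whole-system wattage.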


shakti | Jr. Member | Activity: 57
February 17, 2012, 01:43:12 PM | #24

Well, my theoretical research gives some answers to the problem.
Is a GPU miner possible? Yes; depending on the implementation, it can be made somewhat better than a CPU miner.
The performance of a GPU miner is capped by memory bandwidth: you can't store all the data in the device's local/shared memory and have to use global memory.
Global memory has heavy access latencies, but an implementation that "hides" the latencies by computing during that time can work. You will still be capped by bandwidth. The fastest bandwidth we have on common GPUs (as far as I know; reading http://www.geforce.com/Hardware/GPUs/geforce-gtx-590/specifications, I used an Nvidia GPU because they have better bandwidth than AMD) is 327.7 GB/s. Sure, that's the best possible performance for purely sequential reads/writes, and it's impossible to reach it in scrypt calculation on a GPU, but we can calculate a cap. One hash needs 128 KB written to and 128 KB read from global device memory, which means we can't calculate more hashes per second than

327.7 * 10^9 / (2 * 128 * 10^3) = 1.28 * 10^6 hashes.

This means the performance cap is 1.28 MH/s.

A real implementation would be worse by a factor of 2 or 3...

Is this a problem? Well, it could be, but I don't think so.
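The bandwidth cap above can be written out explicitly. This is a sketch using shakti's own figures and assumptions: the GTX 590's quoted 327.7 GB/s, and 128 KB of scratchpad traffic in each direction per hash:

```python
# Upper bound on scrypt hash rate when every hash must stream the whole
# scratchpad through global memory once each way.
bandwidth_bps  = 327.7e9      # bytes/s, vendor spec-sheet figure
bytes_per_hash = 2 * 128e3    # 128 KB written + 128 KB read per hash

cap_hps = bandwidth_bps / bytes_per_hash
print(round(cap_hps / 1e6, 2))               # ~1.28 MH/s upper bound
# A real implementation 2-3x worse than the cap:
print(round(cap_hps / 3e6, 2), round(cap_hps / 2e6, 2))  # ~0.43 to 0.64 MH/s
```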

DeathAndTaxes (Gerald Davis) | Donator | Legendary | Activity: 1218
February 17, 2012, 02:01:57 PM | #25

One thing (somewhat theoretical) I would throw out there is that as GPUs become more "CPU-like", they will devote the necessary resources (transistors and chip yield) to increased L1 cache. GPUs long since outstripped the growth in pixel counts, so they devoted more resources to improved image quality at a fixed number of pixels and/or polygons. GPU resources are growing faster than developers' ability to use them, since devising more complex and realistic "effects" requires more human capital than simply doubling the polygon count or going from 800x600 to 1920x1200. So it will be increases in GPGPU workload that increasingly drive the development of future GPUs.

GPUs traditionally had very little local cache because there is no need for it when performing traditional graphics work. That dynamic likely won't hold in the future. NVidia Tesla cards, for example, can be configured to double the amount of L1 cache, because it is so useful in boosting the performance of some GPGPU workloads. Larger L1 caches will eventually trickle down into consumer-grade products too.

No coin today is "anti-GPU"; rather, they can be described as "large-L1-cache dependent". Even if GPU mining is impossible with today's hardware, the desire for GPGPU performance will drive larger caches per SP in future GPUs, and eventually they will be able to perform with no limitation.

One "hard" alternative is to make the lookup tables so large they can only fit in main memory (say, a 3.2GB lookup table). That obviously makes the code less functional for those with limited system resources, but given that CPU access to memory is always lower latency than a GPU's, it becomes hard to defeat. Eventually GPUs' shared memory will be 4GB, 8GB, 16GB, etc., so possibly some algorithm which adjusts the size of the lookup table based on Moore's law might be needed (i.e. it uses a 3.2GB lookup table at the current block, but that increases by 7% every 40,000 blocks as part of the protocol, maybe on a multiple of difficulty, like every 8th difficulty adjustment also being a memory adjustment).
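The Moore's-law schedule sketched in the last paragraph (a 3.2GB table growing 7% every 40,000 blocks; these are DeathAndTaxes' illustrative numbers, not a real protocol rule) works out as:

```python
def table_size_gb(height: int, base_gb: float = 3.2,
                  growth: float = 1.07, interval: int = 40_000) -> float:
    """Lookup-table size at a given block height under the sketched schedule."""
    return base_gb * growth ** (height // interval)

print(table_size_gb(0))                  # 3.2 GB at launch
print(round(table_size_gb(40_000), 2))   # 3.42 GB after the first step
print(round(table_size_gb(400_000), 2))  # ~6.29 GB after ten steps (~doubled)
```

At 7% per step the table roughly doubles every ten adjustments, which is the point: the memory requirement tracks an assumed hardware growth curve rather than staying fixed.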
shakti | Jr. Member | Activity: 57
February 17, 2012, 02:15:27 PM | #26

Quote
One thing (somewhat theoretical) I would throw out there is that as GPUs become more "CPU like" they will devote the necessary resources (transistors and chip yield) to increased L1 cache. [...] Eventually GPUs shared memory will be 4GB, 8GB, 16GB etc so possibly some algorithm which adjusts size of lookup table based on Moore's law might be needed.
Sure, with new hardware the situation will be different. Quantum computers are possible as well and will be "common" someday, and they will change everything we know about cryptography Smiley

DeathAndTaxes (Gerald Davis) | Donator | Legendary | Activity: 1218
February 17, 2012, 02:28:02 PM | #27

Sure, with new hardware the situation will be different. Quantum computers are possible as well and will be "common" someday, and they will change everything we know about cryptography Smiley

True, but the probability of GPUs reaching L1 cache parity is much higher in the short term. I would expect it within 1 generation (the 8000 series), but if not, it will certainly be there in the generation after that.

So realistically you are talking about a 2-3 year window before GPUs have cache parity with CPUs. Using an algorithm which relies on that inequality is "futile" over even the short term.

Honestly, I don't see the value in trying to limit hardware. At this point GPU mining (which we don't even know exists) is still more valuable on Bitcoin. So if someone were convinced Litecoin will take over the world, it would still be cheaper to mine BTC and use that to buy LTC. If a miner is just interested in fiat profits, all that matters is which coin generates a higher return on a given piece of hardware (or, even more abstractly, on a given amount of capital).
Bitinvestor | Sr. Member | Activity: 465
February 17, 2012, 02:54:21 PM | #28

I agree with DeathAndTaxes on this one. It's not worth trying to limit hardware because it's improving all the time. By the end of this year Intel is expected to release their new Knights Corner 50-core CPU. That will be a good one for mining Litecoins, according to Artforz on btc-e. Are you going to try and stifle that one too?

Do you have any hopes of Litecoin ever becoming mainstream? It won't happen while 50% of the mining is done by botnets! That's why the Litecoin price is so low: nobody is going to put any serious money into a botnet currency.

A GPU miner is the best thing that could happen to LTC because botnets don't have GPUs. Let's leave Litecoin as it is and let's get mtrlt to release an open source version of his miner. I read that he spent only a few hours working on it so the Litecoin community should be able to buy it from him. I'm willing to donate 10 BTC towards this goal.

CoinHunter | Sr. Member | Activity: 252
February 17, 2012, 03:36:11 PM | #29

Quote
I agree with DeathAndTaxes on this one. It's not worth trying to limit hardware because it's improving all the time. [...] A GPU miner is the best thing that could happen to LTC because botnets don't have GPUs. Let's leave Litecoin as it is and let's get mtrlt to release an open source version of his miner.

I think you should find out more about why Coblee created litecoin:

https://github.com/coblee/litecoin/wiki/Comparison-between-Bitcoin-and-Litecoin

Quote
For proof of work, Bitcoin uses the highly parallelizable SHA256 hash function, and therefore Bitcoin mining is a GPU-friendly task. Litecoin uses scrypt instead of SHA256 for proof of work. The scrypt hash function uses SHA256 as a subroutine, but it also depends on fast access to large amounts of memory rather than depending just on fast arithmetic operations, so it is hard to run many instances of scrypt in parallel by using the ALUs of a modern graphics card. This means that currently CPU mining is more efficient than GPU mining for Litecoin. In the future, GPUs or other dedicated hardware might prove useful for Litecoin mining, though the improvement over CPUs is likely to be less significant than it was for Bitcoin mining (e.g. 4x speedup instead of 400x speedup).

Quote
If your computer already mines bitcoins, then the CPU on that computer is probably idle, so you can simultaneously mine litecoins without affecting the speed in which your GPU mines bitcoins.

Either litecoin is a CPU coin, or the major reason for its existence just disappeared?

From the latest stats it appears GPU mining scrypt is 25-35x faster than a CPU, can be stacked in the same PC, and is 6-10x more efficient.

Check out SolidCoin's mining page for info on how a correct implementation of a CPU/GPU-hard algorithm should perform:
http://wiki.solidcoin.info/wiki/Mining_Hardware_Performance

DeathAndTaxes (Gerald Davis) | Donator | Legendary | Activity: 1218
February 17, 2012, 03:56:38 PM | #30

I believe this is due to the accessibility goals of LTC. True, hardware will improve, so algo changes only *need* to be made to keep mining readily accessible to the *common man*. Right now, if GPU mining proves good enough, the common man becomes ostracized from this coin again, and thus it should warrant an algorithm adjustment... 5 years, or however long down the line it takes for the hardware you are talking about to be generally available to the common man, it would then be OK to let the algo slip into that new territory without an update; right now it seems just too soon... imho at least.

Well, the common man isn't mining it and likely will never mine ANY coin. Mining isn't necessary for usage. Mining, like any commodity business, will be limited to those with the highest efficiency. Lots of people buy/sell/trade gold; a much smaller number of people actually mine it. Smiley

Still, even if one believes CPU = greater adoption (and ignores the issue of botnets), just having a GPU miner doesn't mean someone will use it. If one can get x mining LTC and 2x mining BTC on the same hardware, well, it doesn't really matter, does it?

LTC GPU performance isn't just competing against LTC CPU performance but against BTC GPU performance as well. You have been able to mine ShortBusCoins with GPUs for some time, but their hashing power is a tiny fraction (even adjusting for relative difficulty) of Bitcoin's, meaning most miners with the ability to mine ShortBusCoins simply chose not to.

Unless it is more profitable to mine LTC instead of BTC on a given piece of hardware, it would be illogical to make the switch. Everything is fungible.

I can get LTC a variety of ways:
$$$$ -> LTC
$$$$ -> CPUs -> LTC
$$$$ -> GPUs -> BTC -> LTC
$$$$ -> GPUs -> LTC (unproven)

dishwara | Legendary | Activity: 1386
"Truth may get delay, but NEVER fails"
February 17, 2012, 04:36:23 PM | #31

Can anyone explain why I am getting this error?

2012-02-17 22:01:10 Error 6 getting work. See http://curl.haxx.se/libcurl/c/libcurl-errors.html for error code explanations.
2012-02-17 22:01:10 Couldn't connect to server. Trying again in a few seconds...

I tried the website mentioned, but can't find it.
I used reaper 12 from http://wiki.solidcoin.info/wiki/Reaper
DeathAndTaxes (Gerald Davis) | Donator | Legendary | Activity: 1218
February 17, 2012, 04:38:45 PM | #32

Can anyone explain why I am getting this error?

2012-02-17 22:01:10 Error 6 getting work. See http://curl.haxx.se/libcurl/c/libcurl-errors.html for error code explanations.
2012-02-17 22:01:10 Couldn't connect to server. Trying again in a few seconds...

I tried the website mentioned, but can't find it.
I used reaper 12 from http://wiki.solidcoin.info/wiki/Reaper

Because it doesn't support litecoin?
michaelmclees | Hero Member | Activity: 629
February 17, 2012, 04:44:58 PM | #33

In my opinion, as long as a CPU rig can earn coins at a significant percentage of a GPU rig's rate, Litecoin is serving its purpose. Keep in mind that a $1,000 CPU rig mining Bitcoin is going to take 60 years to find a block, while a $1,000 GPU rig will take about 2 months. The CPU rig is about .2% as productive as the GPU rig.

Let's make the same comparison for LTC. A $1,000 CPU rig will make 30 LTC a day, while a $1,000 GPU rig will, according to rumor, make 600 LTC per day. The CPU rig is 5% as productive. That is a difference of 25 times between the CPU-to-GPU ratios for Bitcoin and Litecoin.

Even in a world of GPU miners, with Litecoin the CPU is still very effective, especially when there is no initial outlay of money for the rig itself. With Bitcoin, no matter how good your CPU is, it is utterly worthless for mining; that isn't even close to being the case for Litecoin, if the rumors of GPU miners turn out to be true.

I see no reason for any changes.
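As a check on the arithmetic above: the rounded .2% figure gives the quoted 25x, while computing directly from the "2 months vs. 60 years" figures gives about 18x, the same order of magnitude either way. All inputs are from the post:

```python
# BTC: a $1,000 CPU rig takes ~60 years per block vs ~2 months for a GPU rig.
btc_cpu_vs_gpu = 2 / (60 * 12)   # CPU productivity as a fraction of the GPU's
# LTC: 30 LTC/day (CPU) vs a rumored 600 LTC/day (GPU).
ltc_cpu_vs_gpu = 30 / 600

print(round(btc_cpu_vs_gpu * 100, 2))          # 0.28 (%), which rounds to the post's ".2%"
print(round(ltc_cpu_vs_gpu * 100))             # 5 (%)
print(round(ltc_cpu_vs_gpu / btc_cpu_vs_gpu))  # 18, vs. the quoted 25
```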
Mousepotato | Hero Member | Activity: 896
"Seal Cub Clubbing Club"
February 17, 2012, 05:16:28 PM | #34

Quote
[04:12:10] <@PooL-X> The expected generation output, at 200 KHps, given current difficulty of 1.5926655, is 126.31 LTC per day, 5.26 LTC per hour, Estimated time to find a block is 9 hours 30 minutes 3 seconds
That's the amount of kH/s mtrlt was reporting, or at least an average: that's $1.20 a day.
Now, with my 6870 I get 300 MH/s on bitcoin, which earns me an average of 0.2 BTC a day, which is around $1/day.
So really they're about equal; if the bitcoin price rises any more, it's more worthwhile to mine bitcoins on a GPU.

I believe that quoted 200 KH/s is from a single 6990. So 126.31 LTC per day from a 6990 equals roughly 0.2808 BTC per day. You could just as well mine BTC to the tune of around .60XX BTC per day with the same card. Unless I misread where that 200 KH/s came from, you'd really have to squeeze 400+ KH/s out of a single 6990 before it even begins to equal the net daily output from mining BTC.
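The implied breakeven can be computed from the numbers in this post (all inputs are Mousepotato's estimates, including the ~0.60 BTC/day figure for direct BTC mining):

```python
ltc_per_day = 126.31   # LTC/day at 200 kH/s, from the quoted estimate
btc_value   = 0.2808   # BTC value of that LTC, per the post
khs         = 200.0
btc_direct  = 0.60     # BTC/day mining BTC directly on the same 6990

btc_per_khs   = btc_value / khs          # ~0.0014 BTC/day per kH/s of LTC mining
breakeven_khs = btc_direct / btc_per_khs
print(round(breakeven_khs))              # 427 kH/s, hence the "400 KH/s+" claim
```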

shakti | Jr. Member | Activity: 57
February 17, 2012, 05:22:03 PM | #35

Quote
Check out SolidCoins mining page for info on how a correct implementation of CPU/GPU hard algorithm should figure in performance.
http://wiki.solidcoin.info/wiki/Mining_Hardware_Performance

I see nothing correct in your hashing algorithm. I've implemented a 2.5x faster version of the CPU miner for it, and despite the lack of time and of mining profit, there is even a 3x-8x better GPU miner implementation for SolidCoin Smiley so it's nothing special Smiley

gmaxwell | Staff | Legendary | Activity: 2030
February 17, 2012, 05:28:42 PM | #36

In my opinion, so long as a CPU rig can earn coins at a significant percentage rate of a GPU rig, then Litecoin is serving its purpose.  You have to keep in mind that a $1,000 CPU mining Bitcoin is going to take 60 years to find a block, while a $1,000 GPU rig will take about 2 months.  The CPU rig is .2% as efficient as the GPU rig.

Lots of gibberish on this thread. A typical current-generation quad-core CPU will produce 18 MH/s; a similarly priced (and higher power consumption) GPU will produce perhaps 320 MH/s. ~20x is nowhere near 500x.


michaelmclees | Hero Member | Activity: 629
February 17, 2012, 05:43:22 PM | #37

In my opinion, so long as a CPU rig can earn coins at a significant percentage rate of a GPU rig, then Litecoin is serving its purpose.  You have to keep in mind that a $1,000 CPU mining Bitcoin is going to take 60 years to find a block, while a $1,000 GPU rig will take about 2 months.  The CPU rig is .2% as efficient as the GPU rig.

Lots of gibberish on this thread. A typical current-generation quad-core CPU will produce 18 MH/s; a similarly priced (and higher power consumption) GPU will produce perhaps 320 MH/s. ~20x is nowhere near 500x.


But a $1,000 Bitcoin GPU rig is going to produce over 1 GH/s, not 320 MH/s. So while my memory of what a good CPU will do for Bitcoin might be fuzzy (I haven't done it), the overall point remains the same: GPU mining on BTC has made CPU mining worthless, a money-losing endeavor. Meanwhile, if GPU mining on Litecoin is as good as the rumors claim, CPU mining is still very much worthwhile.
Ahimoth | Member | Activity: 69
February 17, 2012, 05:52:16 PM | #38

Anyone like crunching numbers? What would the price of LTC need to be in order to make GPU mining worthwhile? I don't think we're there, but like I said, I haven't run the numbers at all.

Easy: according to allchains.info, the same rig would make $5.17 daily on bitcoin, whereas on litecoin it would make $6.11 daily.


I guess everyone is ignoring my math that it is currently more profitable to mine LTC on a gpu rig than it is to mine bitcoins on that same rig.

Admittedly this is with the gpu miner kept private. If the gpu miner is released to the public, the ltc difficulty will likely increase, making the profit margin smaller, or putting bitcoin back in the lead.
Schwede65 | Sr. Member | Activity: 309
February 17, 2012, 06:25:26 PM | #39

I guess everyone is ignoring my math that it is currently more profitable to mine LTC on a gpu rig than it is to mine bitcoins on that same rig.

Admittedly this is with the gpu miner kept private. If the gpu miner is released to the public, the ltc difficulty will likely increase, making the profit margin smaller, or putting bitcoin back in the lead.
--- bolding in the quote is mine ---


that may be true...

you can post this all day long...

but who cares about private software and the math done with it...
notme | Legendary | Activity: 1540
February 17, 2012, 06:33:46 PM | #40

Quote
I guess everyone is ignoring my math that it is currently more profitable to mine LTC on a gpu rig than it is to mine bitcoins on that same rig. [...]

Right, we can't verify your numbers without access to said miner.
