Bitcoin Forum

Alternate cryptocurrencies => Altcoin Discussion => Topic started by: knowledgeable on March 24, 2013, 01:09:04 AM



Title: Litecoin Scrypt Parameters
Post by: knowledgeable on March 24, 2013, 01:09:04 AM
Does it have a design flaw?
It isn't so much a flaw as deception. Litecoin used the same scrypt parameters as Tenebrix. Artforz had gamed almost everyone involved in the scrypt()-based coins. He chose a set of parameters that made GPU mining possible, but claimed the design was GPU-resistant. He then proceeded to mine all the scrypt()-based coins (Tenebrix/Fairbrix/etc.) with a GPU farm that was significantly more efficient than the CPU miners.

Is the information in this post correct?  Could the parameters of a scrypt coin be adjusted to make GPU mining less competitive with CPU mining?  How much work would that be?  In the Litecoin announcement post, Coblee indicated that Litecoin was not supposed to compete with Bitcoin for miners, yet it now clearly does, because the chosen scrypt parameters are not inefficient enough on GPUs.  Why did the Litecoin devs adopt Artforz's exact proof-of-work implementation?
https://bitcointalk.org/index.php?topic=47417.0
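For context on what "the same scrypt parameters" means: scrypt takes cost parameters N, r, and p, and its mixing table occupies roughly 128 * r * N bytes. Litecoin (and Tenebrix before it) used N=1024, r=1, p=1, i.e. about 128 KB of state, which is small enough to fit comfortably in GPU memory. A minimal sketch using Python's standard `hashlib.scrypt` (the `header` input is illustrative; Litecoin actually hashes its 80-byte block header as both password and salt):

```python
import hashlib

def scrypt_memory_bytes(n: int, r: int) -> int:
    # scrypt's mixing table V holds n blocks of 128*r bytes each.
    return 128 * r * n

# Hypothetical input standing in for a serialized block header.
header = b"example block header"

# Litecoin-like parameters: N=1024, r=1, p=1, 32-byte output.
litecoin_like = hashlib.scrypt(header, salt=header, n=1024, r=1, p=1, dklen=32)

print(scrypt_memory_bytes(1024, 1))   # 131072 bytes (~128 KB) - fits on a GPU
print(scrypt_memory_bytes(2**14, 8))  # 16 MB - a far more memory-hard setting
print(litecoin_like.hex())
```

Raising N or r makes each hash need more memory, which is the knob the first post is asking about; the catch, discussed below, is that verification cost for every node rises with it.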


Title: Re: Litecoin Scrypt Parameters
Post by: tacotime on March 24, 2013, 02:10:44 AM
This has been addressed many times

If you directly port the scrypt CPU code to a GPU, it runs faster on the CPU, which benefits from its L2 cache; so Artforz thought scrypt was faster on CPU than GPU. With an optimized GPU algorithm that reads memory sequentially and reconstructs the lookup table on the fly, that is no longer true.  Artforz was pretty embarrassed by this lapse in technical prowess and left the scene, which has been rife with conspiracy theories since.
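The "on the fly lookup table reconstruction" mentioned above is scrypt's time-memory tradeoff: instead of storing the whole mixing table V, a miner can keep only every k-th entry and recompute the missing ones when a lookup lands between checkpoints. A toy sketch (simplified ROMix with SHA-256 as the hash, not real scrypt; all names are illustrative) showing that both strategies produce the same result:

```python
import hashlib

def H(b: bytes) -> bytes:
    return hashlib.sha256(b).digest()

def xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

def romix_full(seed: bytes, n: int) -> bytes:
    # Phase 1: build the full table V, where V[i] = H^i(x0). Costs n * 32 bytes.
    x = H(seed)
    v = []
    for _ in range(n):
        v.append(x)
        x = H(x)
    # Phase 2: n data-dependent lookups into V.
    for _ in range(n):
        j = int.from_bytes(x[:4], "little") % n
        x = H(xor(x, v[j]))
    return x

def romix_tmto(seed: bytes, n: int, k: int) -> bytes:
    # Same function, but store only every k-th table entry (n/k checkpoints).
    x = H(seed)
    checkpoints = {}
    for i in range(n):
        if i % k == 0:
            checkpoints[i] = x
        x = H(x)
    for _ in range(n):
        j = int.from_bytes(x[:4], "little") % n
        # Recompute V[j] on the fly from the nearest earlier checkpoint.
        vj = checkpoints[j - (j % k)]
        for _ in range(j % k):
            vj = H(vj)
        x = H(xor(x, vj))
    return x

assert romix_full(b"seed", 1024) == romix_tmto(b"seed", 1024, k=8)
```

With k=8 the miner uses 1/8th of the memory at the cost of extra hashing, and at Litecoin's small N that extra hashing is cheap on hardware with many parallel cores, which is why the GPU implementations won.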

There is no easy way to make a CPU-only coin either; see my "memcoin" thread, where I learned why that was a bad idea with scrypt


Title: Re: Litecoin Scrypt Parameters
Post by: knowledgeable on March 24, 2013, 05:17:51 AM
This has been addressed many times

If you directly port the scrypt CPU code to a GPU, it runs faster on the CPU, which benefits from its L2 cache; so Artforz thought scrypt was faster on CPU than GPU. With an optimized GPU algorithm that reads memory sequentially and reconstructs the lookup table on the fly, that is no longer true.  Artforz was pretty embarrassed by this lapse in technical prowess and left the scene, which has been rife with conspiracy theories since.

There is no easy way to make a CPU-only coin either; see my "memcoin" thread, where I learned why that was a bad idea with scrypt

Thanks.  I found some of your old posts about it.  You had some Intel data in one post showing that a radix sort or a tree-search algorithm ran better on CPUs.  Did ATI/NVIDIA GPUs wind up beating those too?

https://bitcointalk.org/index.php?topic=64239.msg786472#msg786472