Author Topic: Litecoin Scrypt Parameters  (Read 1643 times)
knowledgeable (OP)
March 24, 2013, 01:09:04 AM
 #1

Quote
Does it have a design flaw?
It isn't so much a flaw as deception. Litecoin used the same scrypt parameters as Tenebrix. Artforz had gamed almost everyone involved in the scrypt()-based coins: he chose a set of parameters that made GPU mining possible while claiming the design was GPU-resistant, then proceeded to mine all the scrypt()-based coins (Tenebrix/Fairbrix/etc.) on a GPU farm that was significantly more efficient than the CPU miners.

Is the information in this post correct? Could the parameters of a scrypt coin be adjusted to make GPU mining less competitive with CPU mining? How much work would that be? In the Litecoin announcement post, Coblee indicated that Litecoin was not supposed to compete with Bitcoin for miners, yet it clearly is competing now, because the chosen scrypt parameters are not inefficient enough on GPUs. Why did the Litecoin devs choose to use Artforz's exact proof-of-work implementation?
https://bitcointalk.org/index.php?topic=47417.0
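
For concreteness, here is a minimal sketch of the proof-of-work hash in question, using Python's standard hashlib.scrypt. The parameters are the well-known Tenebrix/Litecoin choice (N=1024, r=1, p=1, with the 80-byte block header serving as both password and salt); the function name is mine, and the memory figures in the comments are just scrypt's 128·N·r formula, not a claim about any particular miner.

Code:
import hashlib

def scrypt_pow_hash(header80: bytes) -> bytes:
    """Proof-of-work hash as used by Tenebrix/Litecoin: scrypt over the
    80-byte block header, with the header doubling as the salt."""
    # N=1024, r=1, p=1  =>  128 * N * r = 128 KiB of state per hash.
    # That footprint fits in a CPU's L2 cache, but it is also small
    # enough for a carefully written GPU kernel -- which is exactly
    # the point of contention in this thread.
    return hashlib.scrypt(header80, salt=header80,
                          n=1024, r=1, p=1, dklen=32)

Raising n (or r) raises the per-hash memory cost (n=2**20 with r=1 would need 128 MiB per hash, and hashlib's maxmem argument would also have to be raised), which is the sort of parameter adjustment being asked about; whether that would actually keep GPUs out is what the reply below addresses.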
tacotime
March 24, 2013, 02:10:44 AM
 #2

This has been addressed many times.

If you directly port the scrypt CPU code to a GPU, it runs faster on the CPU, which keeps the lookup table in L2 cache; so Artforz thought scrypt was faster on CPU than GPU. With an optimized implementation that reads sequentially and reconstructs the lookup table on the fly, that is no longer true. Artforz was pretty embarrassed by this lapse in technical prowess and left the scene, which has been rife with conspiracy theories since.
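
To make that tradeoff concrete, here is a toy sketch of scrypt's ROMix loop next to a version that does the on-the-fly lookup-table reconstruction described above. SHA-256 stands in for scrypt's real BlockMix/Salsa20/8 core and the integerify step is simplified, so this shows only the memory/recompute tradeoff, not real scrypt.

Code:
import hashlib

def H(x: bytes) -> bytes:
    # Stand-in for scrypt's BlockMix (Salsa20/8); any hash shows the idea.
    return hashlib.sha256(x).digest()

def integerify(x: bytes, N: int) -> int:
    # Simplified index derivation (real scrypt uses the last 64-byte block).
    return int.from_bytes(x[:8], "little") % N

def romix(x: bytes, N: int) -> bytes:
    # Textbook ROMix: store all N table entries, then do N random reads.
    V = []
    for _ in range(N):
        V.append(x)
        x = H(x)
    for _ in range(N):
        j = integerify(x, N)
        x = H(bytes(a ^ b for a, b in zip(x, V[j])))
    return x

def romix_tmto(x: bytes, N: int, k: int) -> bytes:
    # Time-memory tradeoff: keep only every k-th entry and rebuild the
    # rest on the fly -- 1/k the memory for about k/2 extra hashes per
    # read. This is the "on the fly lookup table reconstruction" that
    # lets many GPU instances run out of fast local memory instead of
    # each storing the full table.
    V = {}
    for i in range(N):
        if i % k == 0:
            V[i] = x
        x = H(x)
    for _ in range(N):
        j = integerify(x, N)
        v = V[j - j % k]            # nearest stored checkpoint
        for _ in range(j % k):      # roll forward to reconstruct V[j]
            v = H(v)
        x = H(bytes(a ^ b for a, b in zip(x, v)))
    return x

seed = b"\x00" * 32
assert romix(seed, 1024) == romix_tmto(seed, 1024, 8)  # same result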

There is no easy way to make a CPU-only coin either; see my "memcoin" thread, where I learned why it was a bad idea with scrypt.

knowledgeable (OP)
March 24, 2013, 05:17:51 AM
 #3

Quote from: tacotime on March 24, 2013, 02:10:44 AM
This has been addressed many times.

If you directly port the scrypt CPU code to a GPU, it runs faster on the CPU, which keeps the lookup table in L2 cache; so Artforz thought scrypt was faster on CPU than GPU. With an optimized implementation that reads sequentially and reconstructs the lookup table on the fly, that is no longer true. Artforz was pretty embarrassed by this lapse in technical prowess and left the scene, which has been rife with conspiracy theories since.

There is no easy way to make a CPU-only coin either; see my "memcoin" thread, where I learned why it was a bad idea with scrypt.

Thanks.  I found some of your old posts about it.  You had some Intel data in one post that showed a radix sort or a tree search algorithm ran better on CPUs.  Did ATI/NVIDIA wind up beating those too?

https://bitcointalk.org/index.php?topic=64239.msg786472#msg786472