Bitcoin Forum
Author Topic: Litecoin Scrypt Parameters  (Read 1526 times)
knowledgeable
Newbie
Offline

Activity: 18

March 24, 2013, 01:09:04 AM
 #1

Does it have a design flaw?
It isn't so much a flaw as deception. Litecoin used the same scrypt parameters as Tenebrix. Artforz had gamed almost everyone involved in the scrypt()-based coins: he had chosen a set of parameters that made GPU mining possible, but claimed the design was GPU-resistant. He then proceeded to mine all the scrypt()-based coins (Tenebrix/Fairbrix/etc.) on his GPU farm, which was significantly more efficient than the CPU miners.

Is the information in this post correct?  Could the parameters of a scrypt coin be adjusted to make GPU mining less competitive with CPU mining?  How much work would this be?  In the litecoin announcement post, Coblee indicated that litecoin was not supposed to compete with bitcoin for miners, yet now it clearly is competing, because the chosen scrypt parameters do not make GPUs inefficient enough.  Why did the litecoin devs choose to use Artforz's exact proof-of-work implementation?
https://bitcointalk.org/index.php?topic=47417.0
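For reference, Litecoin's proof of work is scrypt with N=1024, r=1, p=1 over the 80-byte block header (used as both password and salt), with a 32-byte output. That needs only 128 * r * N = 128 KiB of state per hash, small enough to keep many instances resident on a GPU. A minimal sketch using Python's hashlib.scrypt, with a zeroed placeholder header rather than a real block:

```python
import hashlib

# Litecoin inherited Tenebrix's scrypt parameters: N=1024, r=1, p=1.
# The 80-byte block header is both the password and the salt; this one
# is a zeroed placeholder, not a real block.
header = bytes(80)

digest = hashlib.scrypt(header, salt=header, n=1024, r=1, p=1, dklen=32)
print(digest.hex())

# Memory required per hash is 128 * r * N bytes:
print(128 * 1 * 1024)  # 131072 bytes, i.e. 128 KiB
```

Raising n (or r) would raise the per-hash memory cost proportionally, which is the kind of parameter adjustment the question is asking about.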
tacotime
Legendary
Offline

Activity: 1484

March 24, 2013, 02:10:44 AM
 #2

This has been addressed many times.

If you directly port the scrypt CPU code to GPU, it's faster on the CPU, which benefits from the lookup table fitting in L2 cache. So Artforz thought it was faster on CPU than GPU. With an optimized algorithm that reads sequentially and reconstructs the lookup table on the fly, that's no longer true.  Artforz was pretty embarrassed by this lapse in technical prowess and left the scene, which has been rife with conspiracy theories since.

There is no easy way to make a CPU-only coin either; see my "memcoin" thread, where I learned why it was a bad idea with scrypt.
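To make the on-the-fly lookup table reconstruction concrete, here is a toy version of scrypt's ROMix loop with a "lookup gap": store only every second V entry and recompute the skipped ones from the previous stored entry. This is an illustrative sketch with a stand-in hash (SHA-256) and made-up names, not real scrypt or miner code; the point is that a GPU miner can halve memory per hash in exchange for some extra hashing, keeping more instances in flight.

```python
import hashlib

N = 1024          # Litecoin's scrypt N (illustrative here)
GAP = 2           # store every 2nd table entry, recompute the rest

def H(x: bytes) -> bytes:
    # Stand-in for scrypt's BlockMix; any hash works for the illustration.
    return hashlib.sha256(x).digest()

def romix_full(x: bytes) -> bytes:
    # Straightforward ROMix: build the full N-entry table, then mix.
    V = []
    for _ in range(N):
        V.append(x)
        x = H(x)
    for _ in range(N):
        j = int.from_bytes(x[:4], 'little') % N   # data-dependent index
        x = H(bytes(a ^ b for a, b in zip(x, V[j])))
    return x

def romix_gap(x: bytes) -> bytes:
    # Time-memory trade: keep only V[0], V[GAP], V[2*GAP], ...
    V = []
    for i in range(N):
        if i % GAP == 0:
            V.append(x)
        x = H(x)
    for _ in range(N):
        j = int.from_bytes(x[:4], 'little') % N
        v = V[j // GAP]
        for _ in range(j % GAP):   # rebuild V[j] from the stored entry
            v = H(v)
        x = H(bytes(a ^ b for a, b in zip(x, v)))
    return x

seed = b'\x00' * 32
assert romix_full(seed) == romix_gap(seed)  # identical output, half the memory
```

Since every V[i+1] is just H(V[i]), any dropped entry can be regenerated by hashing forward from the nearest stored one, which is why small-N scrypt gives GPUs so much room to trade compute for memory.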

Code:
XMR: 44GBHzv6ZyQdJkjqZje6KLZ3xSyN1hBSFAnLP6EAqJtCRVzMzZmeXTC2AHKDS9aEDTRKmo6a6o9r9j86pYfhCWDkKjbtcns
knowledgeable
Newbie
Offline

Activity: 18

March 24, 2013, 05:17:51 AM
 #3

Quote from: tacotime on March 24, 2013, 02:10:44 AM
This has been addressed many times.

If you directly port the scrypt CPU code to GPU, it's faster on the CPU, which benefits from the lookup table fitting in L2 cache. So Artforz thought it was faster on CPU than GPU. With an optimized algorithm that reads sequentially and reconstructs the lookup table on the fly, that's no longer true.  Artforz was pretty embarrassed by this lapse in technical prowess and left the scene, which has been rife with conspiracy theories since.

There is no easy way to make a CPU-only coin either; see my "memcoin" thread, where I learned why it was a bad idea with scrypt.

Thanks.  I found some of your old posts about it.  You had some Intel data in one post showing that a radix sort or a tree-search algorithm ran faster on CPUs.  Did ATI/NVIDIA wind up beating those too?

https://bitcointalk.org/index.php?topic=64239.msg786472#msg786472