Bitcoin Forum
Author Topic: New PoW method using factorization of large numbers.  (Read 818 times)
Taras
Legendary
*
Offline Offline

Activity: 1386
Merit: 1053


Please do not PM me loan requests!


View Profile WWW
December 22, 2017, 05:58:35 PM
 #41

Originally I was worried about primes, but one of the other posters alerted me that everyone can be working on different numbers.  I didn't realize this is how Bitcoin works currently.  Everyone's number is determined by the hash of their public key along with other block details.  So if one person gets dealt a prime (or two, or three, or 100 people do), then the others will be the ones to win that block.
If one miner gets stuck with a prime, then they're not "out" until the next block is solved - they can and will periodically generate a new number by changing some of the block details, and then hope this new number they're factorizing isn't prime.

On a side note it is possible that miners will be biased toward even numbers, since it is more likely that an even number will be composite than an odd number (marginally, but at this scale it could make a difference). As miners can change certain parts of the block details to whatever they want, they can keep trying until they get an even number, and then work on that one.
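
For illustration, here is a rough sketch (Python, with made-up field names, not part of any actual proposal) of how each miner could derive its own candidate number from the block details and keep bumping a nonce until the number comes out even:

Code:
import hashlib

def candidate_number(prev_hash: str, pubkey: str, extra_nonce: int, digits: int = 200) -> int:
    """Expand block details into a large candidate number deterministically."""
    seed = f"{prev_hash}|{pubkey}|{extra_nonce}".encode()
    out, counter = b"", 0
    while len(out) * 2 < digits:   # each byte gives ~2.4 decimal digits, so this overshoots slightly
        out += hashlib.sha256(seed + counter.to_bytes(4, "big")).digest()
        counter += 1
    return int.from_bytes(out, "big") % 10**digits

# A miner biased toward even numbers just re-rolls the nonce until parity cooperates.
extra_nonce = 0
n = candidate_number("00" * 32, "dummy-pubkey", extra_nonce)
while n % 2 == 1:
    extra_nonce += 1
    n = candidate_number("00" * 32, "dummy-pubkey", extra_nonce)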
ir.hn (OP)
Member
**
Offline Offline

Activity: 322
Merit: 54

Consensus is Constitution


View Profile
February 22, 2018, 04:00:27 AM
 #42

Originally I was worried about primes, but one of the other posters alerted me that everyone can be working on different numbers.  I didn't realize this is how Bitcoin works currently.  Everyone's number is determined by the hash of their public key along with other block details.  So if one person gets dealt a prime (or two, or three, or 100 people do), then the others will be the ones to win that block.
If one miner gets stuck with a prime, then they're not "out" until the next block is solved - they can and will periodically generate a new number by changing some of the block details, and then hope this new number they're factorizing isn't prime.

On a side note it is possible that miners will be biased toward even numbers, since it is more likely that an even number will be composite than an odd number (marginally, but at this scale it could make a difference). As miners can change certain parts of the block details to whatever they want, they can keep trying until they get an even number, and then work on that one.

Good catch.  That is a glaring vulnerability, since we want to ultimately test whether numbers are prime or not.  My intention is that people can data-mine the blockchain to know what numbers to rule out in the search for new prime numbers.  If everyone is mining even numbers then this wouldn't work.  Honestly, I think we would have to eliminate numbers that have any factors up to, say, 1000.  This way people can't just divide their number by everything from 2-1000 to quickly show it isn't prime.  The point is to rule out prime candidates by proving they are composite.
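
For what it's worth, a minimal sketch (Python, purely illustrative) of that "no small factors" filter, so that parity or any other cheap small factor can never be the winning move:

Code:
def small_primes(limit: int = 1000):
    """Sieve of Eratosthenes up to `limit`."""
    sieve = bytearray([1]) * limit
    sieve[0:2] = b"\x00\x00"
    for p in range(2, int(limit ** 0.5) + 1):
        if sieve[p]:
            sieve[p * p::p] = bytearray(len(sieve[p * p::p]))
    return [p for p in range(limit) if sieve[p]]

PRIMES_BELOW_1000 = small_primes(1000)

def passes_small_factor_filter(n: int) -> bool:
    """True if n has no prime factor below 1000, so trivial division can't win a block."""
    return all(n % p != 0 for p in PRIMES_BELOW_1000)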

I don't know, any other ideas on this vulnerability?

Taras
Legendary
*
Offline Offline

Activity: 1386
Merit: 1053


Please do not PM me loan requests!


View Profile WWW
February 22, 2018, 04:00:56 PM
Last edit: March 03, 2018, 03:42:59 PM by Taras
Merited by DarkStar_ (1)
 #43

Originally I was worried about primes, but one of the other posters alerted me that everyone can be working on different numbers.  I didn't realize this is how Bitcoin works currently.  Everyone's number is determined by the hash of their public key along with other block details.  So if one person gets dealt a prime (or two, or three, or 100 people do), then the others will be the ones to win that block.
If one miner gets stuck with a prime, then they're not "out" until the next block is solved - they can and will periodically generate a new number by changing some of the block details, and then hope this new number they're factorizing isn't prime.

On a side note it is possible that miners will be biased toward even numbers, since it is more likely that an even number will be composite than an odd number (marginally, but at this scale it could make a difference). As miners can change certain parts of the block details to whatever they want, they can keep trying until they get an even number, and then work on that one.

Good catch.  That is a glaring vulnerability, since we want to ultimately test whether numbers are prime or not.  My intention is that people can data-mine the blockchain to know what numbers to rule out in the search for new prime numbers.  If everyone is mining even numbers then this wouldn't work.  Honestly, I think we would have to eliminate numbers that have any factors up to, say, 1000.  This way people can't just divide their number by everything from 2-1000 to quickly show it isn't prime.  The point is to rule out prime candidates by proving they are composite.

I don't know, any other ideas on this vulnerability?

Let's say that, to solve a block, you need to find a factor of at least 1000 digits of a 2000-digit number. We already know we can skip odd numbers, so keep trying to hash your block until you get an even number. This should happen nearly instantly. Now, to find a 1000+ digit factor of a 2000-digit even number... you just divide by 2. Even on a number this size, that kind of operation would take only milliseconds on a CPU.

More digits in the factor meaning higher difficulty is just not accurate. Finding a 50-digit factor would be harder than finding either a 1-digit or a 1999+ digit factor of a 2000-digit number. But ruling out prime candidates is not the hard part. You can already prove that a number is composite if it is both greater than 2 and even. We already know it's greater than 2, so you just have to look at the last bit: if 0 then composite, if 1 then unknown, try again.

Disallowing numbers with any factors (other than 1) up to 1000 is an interesting idea. They certainly exist (here's one, ~100 digits: 1332040162761665637631613428888885580516735653612226801746764441547065802389532208097762306977281), but I don't know if there's a way to look at a number and calculate whether it meets that criterion before you go brute-forcing factors. If there is (I imagine there is), then we're back to looking for a large enough factor anyway.

What you could do instead is have a window of factor length that has to be met. Rather than >= 1000 digits, say 100 to 110 digits, for a factor of a 2000 digit number.
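
A quick sketch (Python, using the example sizes from this post) of what that window check would look like for a verifier:

Code:
def factor_in_window(n: int, f: int, lo_digits: int = 100, hi_digits: int = 110) -> bool:
    """Accept f only if it is a proper divisor of n whose digit count falls inside the window."""
    if f <= 1 or f >= n or n % f != 0:
        return False
    return lo_digits <= len(str(f)) <= hi_digits

# Dividing an even 2000-digit number by 2 no longer helps: 2 has 1 digit and the
# cofactor has ~2000 digits, so neither lands inside the 100-110 digit window.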

c_demian
Newbie
*
Offline Offline

Activity: 17
Merit: 0


View Profile
March 27, 2018, 10:32:44 PM
 #44

It's a challenging proposal.
Prime numbers and numbers having small factors can be ruled out easily. Primality testing is much faster than factoring, and there are algorithms which are very fast at finding small factors (see elliptic curve factoring). So you could just pick numbers with specified factor sizes.
This could also lead to finding new factoring algorithms.
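
As a concrete illustration of that first point, a probabilistic primality test (Miller-Rabin below, plain Python) decides "prime or composite" far more cheaply than any factoring attempt, so prime targets can simply be discarded up front:

Code:
import random

def is_probable_prime(n: int, rounds: int = 20) -> bool:
    """Miller-Rabin: returns False for composites, True for (very) probable primes."""
    if n < 2:
        return False
    for p in (2, 3, 5, 7, 11, 13, 17, 19, 23, 29, 31, 37):
        if n % p == 0:
            return n == p
    d, r = n - 1, 0
    while d % 2 == 0:                    # write n - 1 as d * 2^r with d odd
        d //= 2
        r += 1
    for _ in range(rounds):
        a = random.randrange(2, n - 1)
        x = pow(a, d, n)
        if x in (1, n - 1):
            continue
        for _ in range(r - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return False                 # witness found: definitely composite
    return True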
Kogs
Member
**
Offline Offline

Activity: 86
Merit: 26


View Profile
March 28, 2018, 04:50:44 AM
 #45

Here is a whitepaper about how to define a new proof of work system that theoretically gives preference to CPUs forever and is immune to vulnerabilities that encryption algorithms are prone to.
(link is to wayback machine so you can trust visiting it)

https://web.archive.org/web/20171212224738/http://www.naturehackerproducts.com/2017/12/new-proof-of-work-pow-method-based-on.html?m=1

The take home message is to find a single factor (of a length defined by the difficulty) of a very large number (over 100 digits).

Since it is such a large number, GPUs become very slow at attempting it.

Factoring big numbers can be highly accelerated with quantum computers and Shor's algorithm.

https://en.wikipedia.org/wiki/Shor%27s_algorithm

In my opinion, new PoW algorithms should not only be ASIC-resistant, they should also be quantum-computing-resistant.

Bitcoin would already suffer if quantum computers with enough qubits someday become reality, as the private/public keys could be cracked with Shor's algorithm.
The SHA-256 hashing of the public key would prevent this attack, but there are already a lot of revealed public keys which still hold balances (very old transactions, reused bitcoin addresses), and from those known public keys the private keys could be calculated.

Also, RSA encryption (used for example in HTTPS connections) could be cracked quite easily with Shor's algorithm, which might be the best thing that could happen to our nice three-letter agencies wanting to spy globally on every online connection.

It might still take a long time until quantum computers with enough qubits are available, but we should try to avoid algorithms that are not quantum-computing-proof when creating new stuff.
bob123
Legendary
*
Offline Offline

Activity: 1624
Merit: 2481



View Profile WWW
March 30, 2018, 08:32:46 PM
 #46

It's a challenging proposal.
Prime numbers and numbers having small factors can be ruled out easily. Primality testing is much faster than factoring, and there are algorithms which are very fast at finding small factors (see elliptic curve factoring).

Factorization of large numbers usually means 1024-bit (obsolete) or 2048-bit numbers (4096 bit would be recommended to be on the safe side).
Those numbers are incredibly big (about 600+ digits long).
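
For reference, those digit counts follow directly from the bit lengths (a k-bit number has roughly k * log10(2) decimal digits); a quick check in Python:

Code:
import math

for bits in (1024, 2048, 4096):
    digits = math.floor(bits * math.log10(2)) + 1
    print(bits, "bits ->", digits, "decimal digits")   # 309, 617, 1234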



This could also lead to finding new factoring algorithms.

Quite a few organisations have an incentive to find factoring algorithms with a better runtime.
So far, no considerable progress has been made in solving this problem.

ir.hn (OP)
Member
**
Offline Offline

Activity: 322
Merit: 54

Consensus is Constitution


View Profile
May 06, 2018, 05:57:59 AM
 #47

Originally I was worried about primes, but one of the other posters alerted me that everyone can be working on different numbers.  I didn't realize this is how Bitcoin works currently.  Everyone's number is determined by the hash of their public key along with other block details.  So if one person gets dealt a prime (or two, or three, or 100 people do), then the others will be the ones to win that block.
If one miner gets stuck with a prime, then they're not "out" until the next block is solved - they can and will periodically generate a new number by changing some of the block details, and then hope this new number they're factorizing isn't prime.

On a side note it is possible that miners will be biased toward even numbers, since it is more likely that an even number will be composite than an odd number (marginally, but at this scale it could make a difference). As miners can change certain parts of the block details to whatever they want, they can keep trying until they get an even number, and then work on that one.

Good catch.  That is a glaring vulnerability, since we want to ultimately test whether numbers are prime or not.  My intention is that people can data-mine the blockchain to know what numbers to rule out in the search for new prime numbers.  If everyone is mining even numbers then this wouldn't work.  Honestly, I think we would have to eliminate numbers that have any factors up to, say, 1000.  This way people can't just divide their number by everything from 2-1000 to quickly show it isn't prime.  The point is to rule out prime candidates by proving they are composite.

I don't know, any other ideas on this vulnerability?

Let's say that, to solve a block, you need to find a factor of at least 1000 digits of a 2000-digit number. We already know we can skip odd numbers, so keep trying to hash your block until you get an even number. This should happen nearly instantly. Now, to find a 1000+ digit factor of a 2000-digit even number... you just divide by 2. Even on a number this size, that kind of operation would take only milliseconds on a CPU.

More digits in the factor meaning higher difficulty is just not accurate. Finding a 50-digit factor would be harder than finding either a 1-digit or a 1999+ digit factor of a 2000-digit number. But ruling out prime candidates is not the hard part. You can already prove that a number is composite if it is both greater than 2 and even. We already know it's greater than 2, so you just have to look at the last bit: if 0 then composite, if 1 then unknown, try again.

Disallowing numbers with any factors (other than 1) up to 1000 is an interesting idea. They certainly exist (here's one, ~100 digits: 1332040162761665637631613428888885580516735653612226801746764441547065802389532208097762306977281), but I don't know if there's a way to look at a number and calculate whether it meets that criterion before you go brute-forcing factors. If there is (I imagine there is), then we're back to looking for a large enough factor anyway.

What you could do instead is have a window of factor length that has to be met. Rather than >= 1000 digits, say 100 to 110 digits, for a factor of a 2000 digit number.


Thanks for your great posts.  But yes, my intention was that it would require a factor with an exact number of digits to win the block; not just a 70+ digit factor but an exactly 70-digit factor, for example.  If we did the "no numbers with any factors up to 1000" thing, then it would only take a tiny bit of brute forcing to verify that the proposed target number is valid.  So you would broadcast to the network: here is my target number, notice that it isn't divisible by anything under 1000, and here is my proposed 70-digit factor, notice that it works and is exactly 70 digits.
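
Here is a minimal sketch (Python) of that verification step, using the 70-digit figure from this post as the example requirement; the parameter names are just for illustration:

Code:
def verify_share(target: int, factor: int, required_digits: int = 70) -> bool:
    """Check a broadcast (target, factor) pair as described above."""
    # 1. target must have no divisor below 1000 (cheap for every verifier to re-check)
    if any(target % d == 0 for d in range(2, 1000)):
        return False
    # 2. the proposed factor must be exactly the required length...
    if len(str(factor)) != required_digits:
        return False
    # 3. ...and must actually divide the target
    return 1 < factor < target and target % factor == 0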

ir.hn (OP)
Member
**
Offline Offline

Activity: 322
Merit: 54

Consensus is Constitution


View Profile
May 06, 2018, 06:05:10 AM
 #48

Here is a whitepaper about how to define a new proof of work system that theoretically gives preference to CPUs forever and is immune to vulnerabilities that encryption algorithms are prone to.
(link is to wayback machine so you can trust visiting it)

https://web.archive.org/web/20171212224738/http://www.naturehackerproducts.com/2017/12/new-proof-of-work-pow-method-based-on.html?m=1

The take home message is to find a single factor (of a length defined by the difficulty) of a very large number (over 100 digits).

Since it is such a large number, GPUs become very slow at attempting it.

Factoring big numbers can be highly accelerated with quantum computers and Shor's algorithm.

https://en.wikipedia.org/wiki/Shor%27s_algorithm

In my opinion, new PoW algorithms should not only be ASIC-resistant, they should also be quantum-computing-resistant.

Bitcoin would already suffer if quantum computers with enough qubits someday become reality, as the private/public keys could be cracked with Shor's algorithm.
The SHA-256 hashing of the public key would prevent this attack, but there are already a lot of revealed public keys which still hold balances (very old transactions, reused bitcoin addresses), and from those known public keys the private keys could be calculated.

Also, RSA encryption (used for example in HTTPS connections) could be cracked quite easily with Shor's algorithm, which might be the best thing that could happen to our nice three-letter agencies wanting to spy globally on every online connection.

It might still take a long time until quantum computers with enough qubits are available, but we should try to avoid algorithms that are not quantum-computing-proof when creating new stuff.

Well, since every other algorithm suffers massively RIGHT NOW from GPU and/or ASIC speed-up, if all we have to worry about is quantum computing then I think we are way, way above par.  Right now we don't even know if quantum computing the way it is envisioned will even work, let alone what an actual application will be capable of.  So I say we cross that bridge when we get there.  For now we need GPU, ASIC, botnet, and server-farm resistance.

ir.hn (OP)
Member
**
Offline Offline

Activity: 322
Merit: 54

Consensus is Constitution


View Profile
May 06, 2018, 06:12:06 AM
 #49

It's a challenging proposal.
Prime numbers and numbers having small factors can be ruled out easily. Primality testing is much faster than factoring, and there are algorithms which are very fast at finding small factors (see elliptic curve factoring).

Factorization of large numbers usually means 1024-bit (obsolete) or 2048-bit numbers (4096 bit would be recommended to be on the safe side).
Those numbers are incredibly big (about 600+ digits long).



This could also lead to finding new factoring algorithms.

Quite a few organisations have an incentive to find factoring algorithms with a better runtime.
So far, no considerable progress has been made in solving this problem.


Yup, right on.  I think for our purposes we will be working primarily in the 100-200 digit range.  Still, the GNFS sieve is the best known way to either look for factors or to determine whether these numbers are prime.  Finding one factor of a given length is much easier to do (or to prove you can't do) than to prove a number is prime.

ir.hn (OP)
Member
**
Offline Offline

Activity: 322
Merit: 54

Consensus is Constitution


View Profile
June 09, 2018, 11:43:57 PM
 #50

Like I have been saying recently, this algorithm would require 1 CPU and 1 GPU for optimum mining.  This would be perfect for commodity devices like phones, laptops, and AI hardware that have "APU" (combined CPU/GPU) processors, and it would help protect against server farms and IoT botnets.

http://www.mersenneforum.org/showthread.php?t=19312

andrew1carlssin
Jr. Member
*
Offline Offline

Activity: 168
Merit: 3

#Please, read:Daniel Ellsberg,-The Doomsday *wk


View Profile WWW
June 10, 2018, 12:30:34 AM
 #51

Recently I have seen more and more interest from the scientific community in this topic... I have spent some of my spare time looking for papers about it... I really find this one interesting...

Quote
Proofs of Work (PoWs) were introduced [DN92] to enforce that a certain amount of energy was expended for doing some task in an easily verifiable way. In most applications, PoWs force malicious users to accrue a large workload, thus guarding against email spam, denial of service attacks, and, most recently, double-spending in cryptocurrencies such as Bitcoin [Nak08]. Unfortunately, existing PoW schemes are often disconnected from the task they are attached to, so that the work expended is not actually useful in accomplishing that task. More importantly, the work and energy expended is generally not useful for anything except proving that work had in fact been done.

To this end, PoWs are wasteful of real resources and energy and, in the massive use case of Bitcoin, have even been called an "environmental disaster" [And13]. Two early attempts to combat this are Primecoin [Kin13] and Permacoin [MJS+14]. The former suggests a Proof of Work system whose outcome enables the search for chains of prime numbers, whereas the latter repurposes Bitcoin mining resources to achieve distributed storage of archival data, based on Proofs of Retrievability (thus requiring clients to invest not just computational resources, but also storage).

Another line of work studies Proofs of Space [ABFG14, DFKP15, KK14], where a user must dedicate a significant amount of disk space, instead of computing power, to perform their task. This accomplishes a similar constraint on malicious activity to PoWs since a group of users cannot over-perform a task without having access to massive amounts of memory. However, as a typical user has free disk space, such schemes will not similarly waste a resource like energy and pollute the environment.

In this work we present an alternative to these: Proofs of Work whose work is actually useful to solving practical problems.

Paper source: https://eprint.iacr.org/2017/203.pdf

And if I am not mistaken, they talk about it in the course about cryptocurrency created by Princeton University on Coursera (week 8, Alternative Mining Puzzles).

Satoshi's book editor; SCIpher - https://pdos.csail.mit.edu/archive/scigen/scipher.html
ir.hn (OP)
Member
**
Offline Offline

Activity: 322
Merit: 54

Consensus is Constitution


View Profile
June 10, 2018, 03:30:06 AM
 #52

Interesting paper, but it appears their chosen problem to solve is the orthogonal vectors problem.  While that is a problem over a small vector field, so it would take some memory, it is probably still very vulnerable to acceleration (GPU/ASIC).  Their main goal is to provide a useful outcome, whereas my goal is to provide parallelization resistance, with useful work as a secondary goal.
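
For readers who haven't met it, the Orthogonal Vectors problem is: given two sets of 0/1 vectors, find a pair with dot product zero. A tiny brute-force sketch (Python, purely illustrative) shows why the pairwise inner loop parallelizes so readily, which is the acceleration concern here:

Code:
def find_orthogonal_pair(A, B):
    """Return the first (a, b) with zero dot product, or None."""
    for a in A:
        for b in B:
            if sum(x * y for x, y in zip(a, b)) == 0:
                return a, b
    return None

A = [(1, 0, 1, 0), (0, 1, 1, 0)]
B = [(0, 1, 0, 1), (1, 0, 0, 1)]
print(find_orthogonal_pair(A, B))   # ((1, 0, 1, 0), (0, 1, 0, 1))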

ir.hn (OP)
Member
**
Offline Offline

Activity: 322
Merit: 54

Consensus is Constitution


View Profile
June 10, 2018, 06:28:06 AM
 #53

Thanks for pointing out those quantum algorithms for further research, but can we really say that Shor's will be faster than Grover's?  But yes, you seem to be right that Shor's algorithm could be an "ASIC" of sorts for GNFS.

But quantum computing is still hypothetical, and we have ASICs for hashing right now, which is a real, pressing, and proven threat.  So I would rather make my bank vault impervious to explosives and torches and not worry so much about people teleporting in at this point.  I think this new PoW would, at an absolute minimum, give us 50 years of real fairness.  Bitcoin had what, a couple of years?  Litecoin a few years?  We are talking drastic improvement here.

Lots of universities and researchers use regular personal computers for doing GNFS, and they have for decades.  There is nothing better.  They can use GPUs for some portions of the work, but the heavy lifting is done on basic commodity CPUs.  Sure, you can design a motherboard to be optimized, but you will only get incremental gains, nothing like an ASIC.  That paper is the only sub-billion-dollar proposed ASIC for large-number factoring.  That just shows how impractical it is, and I would guess, reading between the lines of the paper, that it would give less than an order of magnitude speedup versus the same price and power usage of commodity hardware, which they admit is their competition.

Off topic but using liquid nitrogen for cooling could give pretty substantial efficiency gains to any hardware for any algorithm...
