Bitcoin Forum
Author Topic: I am going to build a true random number generator ...  (Read 7797 times)
DeathAndTaxes (OP)
Donator
Legendary
*
Offline Offline

Activity: 1218
Merit: 1079


Gerald Davis


View Profile
April 07, 2014, 10:43:45 PM
Last edit: April 07, 2014, 11:02:36 PM by DeathAndTaxes
 #41

It will be measuring the time between TWO particle detections (if this interval is longer than the prior interval that is a "1", if it is shorter it is a "0", and if they are equal we throw it out).  Quantum mechanics tells us that while the average rate of decay can be calculated, the time between individual decays cannot.

Is this a good algorithm?

I know that what seems intuitive is often wrong when dealing with things like this, so I may not be thinking this through correctly...

It would seem that while you cannot know how long it will be to the next detection, there will be an oscillating tendency

Anytime you get a "0", it implies that the time was shorter than the previous detection.  While this is not a guarantee that the time is shorter than the average, it certainly is an indicator that the time is more likely to be shorter than the average. (If you average all the intervals when you get a "0", and compare it to an average of all the intervals, the average interval when you get a "0" should be shorter than the average of all intervals, shouldn't it?)

The reverse can be said about any instance where you get a "1".  This would seem to imply that after a "1", there is a higher than average chance that your next interval will be a "0" (and vice versa).

I suppose for these purposes the bias might not be significant enough to be a problem, but I can't help but wonder if there isn't a better solution.

That is a good point.  What I wrote didn't accurately describe the conversion method, and you are right: as described, it probably would lead to some alternating-bit bias.  A better explanation is that each bit will require two independent intervals.

Imagine 3 particle detections a, b, c,
Interval A is the time between a & b
Interval B is the time between b & c

if (A > B)
   then bit = 1
else if (B > A)
   then bit = 0
// if A = B then no bit is recorded

This requires two counts (events) per bit, so the bitrate will be roughly CPM / 120 (CPM = counts per minute).  It will actually be somewhat less than that, due to the portion of the counts which need to be dropped because the intervals match.  The amount of the reduction will depend on the accuracy of the clock relative to the average interval.

Some ballpark numbers:
To produce a throughput of 1 kbps (about a 10MB stream of random digits per day) would require a source and tube combination capable of 120,000 CPM and a real time clock with at least 10 microsecond accuracy.   Initially I am going to try for a much lower ~100 bps using the microcontroller clock with roughly 1ms accuracy.
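
For anyone who wants to play with the comparison scheme before building hardware, here is a minimal simulation sketch (Python, assuming an ideal Poisson source; the function name and parameters are made up for illustration):

Code:
import random

def interval_compare_bits(rate_hz, n_intervals, tick_s=1e-5):
    # Simulate the two-interval comparison on an ideal Poisson source.
    # rate_hz: average detection rate (2000 counts/s is roughly 120,000 CPM)
    # tick_s:  clock resolution; pairs that quantize to equal ticks are dropped
    gaps = [int(random.expovariate(rate_hz) / tick_s) for _ in range(n_intervals)]
    bits = []
    # Consume the gaps in non-overlapping pairs: interval A, then interval B.
    for a, b in zip(gaps[0::2], gaps[1::2]):
        if a > b:
            bits.append(1)
        elif b > a:
            bits.append(0)
        # a == b: no bit recorded
    return bits

bits = interval_compare_bits(rate_hz=2000, n_intervals=240000)
print(len(bits), sum(bits) / len(bits))  # a bit under n_intervals/2 bits, mean near 0.5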





DannyHamilton
Legendary
*
Online Online

Activity: 3388
Merit: 4622



View Profile
April 07, 2014, 10:51:10 PM
 #42

A better explanation is that each bit will require two independent intervals.

Imagine 3 particle detections a, b, c,

Interval A is the time between a & b
Interval B is the time between b & c

if (A > B)
   then bit = 1
else if (B > A)
   then bit = 0
// if A = B then no bit is recorded
… and a clock with at least 10 microsecond accuracy.  With a clock accuracy of only 1 millisecond the upper limit would be around 100 bps.

Ah. Ok.  This makes more sense.  Thanks for the clarification.
Tirapon
Hero Member
*****
Offline Offline

Activity: 898
Merit: 1000



View Profile
April 07, 2014, 10:52:01 PM
 #43

Is it possible to use the hash of each new block as a source of entropy?
SgtSpike
Legendary
*
Offline Offline

Activity: 1400
Merit: 1005



View Profile
April 07, 2014, 10:53:38 PM
 #44

I always thought that a microphone could work just as effectively for randomness.  Put a mic outside, record for 10 seconds, take the hash of that, voilà!  Or just a straight sampling of it, like 10 bits, although the effective randomness would be fewer bits than that.
that's a neat idea. I'd assume the codec and/or file extension might not make it too random though
I don't see how the codec or file extension would affect the effectiveness of the hash... I mean, sure, the headers might be known, but random audio would still be made up of random bits, regardless of what codec is used.

Why not use the feed from all those public webcams? Pixel and hue variations should be random enough as data.
It's public data, so by definition, not really safe to use.  Someone else could replicate what you are doing.  The whole goal with a good RNG is that no one else can replicate it even if they see the code you are using.

Is it possible to use the hash of each new block as a source of entropy?
Not really - see above.
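
For what it's worth, a minimal sketch of the microphone idea (Python; "ambient_noise.wav" is just a placeholder for a locally captured recording). Keep in mind that hashing only condenses whatever unpredictability the recording already has; it doesn't add any:

Code:
import hashlib
import wave

# "ambient_noise.wav" is a placeholder for a short recording captured locally.
with wave.open("ambient_noise.wav", "rb") as w:
    samples = w.readframes(w.getnframes())

# SHA-256 condenses the recording into 256 bits; the result is only as
# unpredictable as the audio itself (codec headers contribute nothing useful).
print(hashlib.sha256(samples).hexdigest())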
Tirapon
Hero Member
*****
Offline Offline

Activity: 898
Merit: 1000



View Profile
April 07, 2014, 10:56:37 PM
 #45

Is it possible to use the hash of each new block as a source of entropy?
Not really - see above.


Ah okay, yeah that makes sense. So a pretty useless suggestion then...

Well, at least I learned something.  Smiley
marcus_of_augustus
Legendary
*
Offline Offline

Activity: 3920
Merit: 2348


Eadem mutata resurgo


View Profile
April 07, 2014, 11:06:53 PM
 #46

http://phys.org/news202456660.html

... be careful, there maybe periodic signals in there ...  Cheesy

DeathAndTaxes (OP)
Donator
Legendary
*
Offline Offline

Activity: 1218
Merit: 1079


Gerald Davis


View Profile
April 07, 2014, 11:20:30 PM
Last edit: April 07, 2014, 11:31:09 PM by DeathAndTaxes
 #47

Is it possible to use the hash of each new block as a source of entropy?
Not really - see above.


Ah okay, yeah that makes sense. So a pretty useless suggestion then...

Well, at least I learned something.  Smiley

It isn't useless, it just isn't a TRNG.  You could take block hashes and use the trailing 160 bits (to avoid bias due to leading zeros).  That would give you 160 bits per block, or about 48 Mb since the genesis block.  You would now have a uniformly distributed sequence of random bits (or at least you could test it to verify that assumption).  If it is uniformly distributed (and what we know of SHA suggests it is, but that could be tested) it has some use; however, as SgtSpike pointed out, for cryptographic purposes you want uniformly distributed random values not known or reproducible by anyone else.

However, by applying a little crypto you could mutate that stream into something that nobody else can reproduce.  One example would be to take an HMAC-SHA256 of the sequence (in 256-bit "chunks") using the raw truncated block data and a private key you know.  It still isn't a TRNG, because if someone obtains your private key they could generate your private stream.  However, this did get me thinking: using a programmable smart card capable of HMAC, the key could remain inside the smart card.

Code:
256bit_random_output = HMAC-SHA256(smartcard_private_key, 256-bit raw input || nonce)

The point of using a nonce would be to prevent someone who has access to the cards from producing previous values (i.e. the same raw block hashes would produce a different output each time they are input to the smartcard).

Still not a TRNG, but a pretty cool concept regardless.  Private keys can in theory be extracted from smart cards, but it is a difficult process and requires physical access to the card.  If the card is stolen then move your coins.  Don't anyone rush out and use this; I am just thinking out loud.  Still, in general it is an interesting concept: the blockchain is a large source of KNOWN entropy, and a system which mutates that into unknown entropy is easier than one which attempts to produce unknown entropy natively.
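
A rough software sketch of that idea (Python; the in-memory key is a hypothetical stand-in for the key that would really live inside the smart card, and the input chunk is a placeholder rather than real block data):

Code:
import hashlib
import hmac
import os

# In the actual proposal this key would never leave the smart card; here it is
# only a hypothetical in-memory stand-in.
smartcard_private_key = os.urandom(32)

def private_chunk(raw_block_bits, nonce):
    # Mutate 32 bytes of public block-hash entropy into a private 256-bit value.
    msg = raw_block_bits + nonce.to_bytes(8, "big")
    return hmac.new(smartcard_private_key, msg, hashlib.sha256).digest()

raw = bytes(32)  # placeholder for a 256-bit chunk of truncated block hashes
print(private_chunk(raw, nonce=1).hex())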
chiguireitor
Legendary
*
Offline Offline

Activity: 872
Merit: 1010


Coins, Games & Miners


View Profile WWW
April 08, 2014, 12:06:06 AM
 #48

I've always pondered if there's any use in adapting the random behavior of certain natural systems as entropy sources. For example, you could use a USB microscope to record E. coli movements on a petri dish and use the frames as an entropy source, or you could even go and use a string vibrating with an electromagnet under a bridge (I know this source has harmonic behaviour, but you could use the noise in the sine wave to generate additional entropy, or even use the FM as the entropy and not the amplitude). This could be recorded with a smallish system like the Electric Imp and sent to a server to seed different PRNGs regularly, resulting in a TRNG-seeded PRNG. You could use the TRNGs directly, but real-life sources have low bandwidth (this could be desirable for certain systems).

MaxwellsDemon
Full Member
***
Offline Offline

Activity: 187
Merit: 109

Converting information into power since 1867


View Profile
April 08, 2014, 12:24:43 AM
 #49

It will be measuring the time between TWO particle detections (if this interval is longer than the prior interval that is a "1", if it is shorter it is a "0", and if they are equal we throw it out).  Quantum mechanics tells us that while the average rate of decay can be calculated, the time between individual decays cannot.

Is this a good algorithm?

I know that what seems intuitive is often wrong when dealing with things like this, so I may not be thinking this through correctly...

It would seem that while you cannot know how long it will be to the next detection, there will be an oscillating tendency

Anytime you get a "0", it implies that the time was shorter than the previous detection.  While this is not a guarantee that the time is shorter than the average, it certainly is an indicator that the time is more likely to be shorter than the average. (If you average all the intervals when you get a "0", and compare it to an average of all the intervals, the average interval when you get a "0" should be shorter than the average of all intervals, shouldn't it?)

The reverse can be said about any instance where you get a "1".  This would seem to imply that after a "1", there is a higher than average chance that your next interval will be a "0" (and vice versa).

I suppose for these purposes the bias might not be significant enough to be a problem, but I can't help but wonder if there isn't a better solution.

I think it doesn't work that way. Radioactive decay is a Poisson process, and the time interval between decay events follows the exponential distribution, which is "Memoryless". I'm pretty sure that means that the length of one interval has absolutely no predictive value regarding the length of the next interval.



if (A > B)
   then bit = 1
else if (B > A)
   then bit = 0
// if A = B then no bit is recorded

This requires two counts (events) per bit so the bitrate will be roughly CPM / 120 (CPM = counts per minute).

Following what I said above, I think it should be possible to use only one event per bit. Just check whether an interval is shorter or longer than the median of the exponential distribution, which is ln2 divided by the rate parameter (which can be estimated given the half-life).
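
A quick simulation of that single-interval scheme (Python sketch; the rate is a made-up figure, and see the objection a few posts down: if the rate estimate is off, the bits pick up a bias):

Code:
import math
import random

def median_threshold_bits(rate_hz, n_intervals):
    # One bit per interval: longer than the exponential median ln(2)/rate -> 1.
    median = math.log(2) / rate_hz
    return [1 if random.expovariate(rate_hz) > median else 0
            for _ in range(n_intervals)]

bits = median_threshold_bits(rate_hz=2000, n_intervals=100000)
print(sum(bits) / len(bits))  # near 0.5 only if rate_hz matches the true rate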

We're hunting for Leviathan, and Bitcoin is our harpoon.
precrime3
Member
**
Offline Offline

Activity: 84
Merit: 10

PM for journalist,typing,and data entry services.


View Profile WWW
April 08, 2014, 12:31:28 AM
 #50

What are the benefits of a random number generator? Does this have any scientific or mathematical benefits?

MaxwellsDemon
Full Member
***
Offline Offline

Activity: 187
Merit: 109

Converting information into power since 1867


View Profile
April 08, 2014, 12:36:56 AM
 #51

I've always pondered if there's any use in adapting the random behavior of certain natural systems as entropy sources. For example, you could use a USB microscope to record E. coli movements on a petri dish and use the frames as an entropy source, or you could even go and use a string vibrating with an electromagnet under a bridge (I know this source has harmonic behaviour, but you could use the noise in the sine wave to generate additional entropy, or even use the FM as the entropy and not the amplitude). This could be recorded with a smallish system like the Electric Imp and sent to a server to seed different PRNGs regularly, resulting in a TRNG-seeded PRNG. You could use the TRNGs directly, but real-life sources have low bandwidth (this could be desirable for certain systems).

This is all very interesting, but just like the chicken idea above, none of this is truly random. Theoretically, the movements of E. coli are predictable. Every gust of wind in the atmosphere is predictable. Highly complex or chaotic systems may be unpredictable in practice, but they are still predictable in theory.
The only things which are theoretically unpredictable are quantum phenomena.

If you had perfect knowledge of the physical universe, plus an infinitely powerful computer on which to run your perfect simulations, you could predict which direction each E. coli will turn, but you would still have no idea when the next radioactive atom will decay. Or so the theory goes...

We're hunting for Leviathan, and Bitcoin is our harpoon.
chiguireitor
Legendary
*
Offline Offline

Activity: 872
Merit: 1010


Coins, Games & Miners


View Profile WWW
April 08, 2014, 12:50:46 AM
 #52

I've always pondered if there's any use in adapting the random behavior of certain natural systems as entropy sources. For example, you could use a USB microscope to record E. coli movements on a petri dish and use the frames as an entropy source, or you could even go and use a string vibrating with an electromagnet under a bridge (I know this source has harmonic behaviour, but you could use the noise in the sine wave to generate additional entropy, or even use the FM as the entropy and not the amplitude). This could be recorded with a smallish system like the Electric Imp and sent to a server to seed different PRNGs regularly, resulting in a TRNG-seeded PRNG. You could use the TRNGs directly, but real-life sources have low bandwidth (this could be desirable for certain systems).

This is all very interesting, but just like the chicken idea above, none of this is truly random. Theoretically, the movements of E. coli are predictable. Every gust of wind in the atmosphere is predictable. Highly complex or chaotic systems may be unpredictable in practice, but they are still predictable in theory.
The only things which are theoretically unpredictable are quantum phenomena.

If you had perfect knowledge of the physical universe, plus an infinitely powerful computer on which to run your perfect simulations, you could predict which direction each E. coli will turn, but you would still have no idea when the next radioactive atom will decay. Or so the theory goes...

Although the "predictability" of E.Coli is provable, the computational power needed to predict every and each movement of an E.Coli on a petri dish would be very high, and even then, the computational power needed to predict that TRNG would probably have a lot more of the power needed to break ECDSA.

You could also use Brownian motion [1], which is a random phenomenon that also exhibits the qualities desirable in a TRNG, and has much more bandwidth than decaying atoms. (Again, low bandwidth is desirable in some TRNGs, so it is a trade-off.)

[1] http://en.wikipedia.org/wiki/Brownian_motion

jonald_fyookball
Legendary
*
Offline Offline

Activity: 1302
Merit: 1004


Core dev leaves me neg feedback #abuse #political


View Profile
April 08, 2014, 01:26:41 AM
 #53

Sounds like a fun project.  Very cool.  But why do you need to prove randomness?
It is unlikely to provide any business advantage in my opinion.

kjj
Legendary
*
Offline Offline

Activity: 1302
Merit: 1024



View Profile
April 08, 2014, 01:33:36 AM
 #54

Looks like you are doing things right so far.  Personally, I use a different tube and a chunk of thoriated welding rod, but that's because it is what I had sitting around.  After your A/B filter, you need to feed it through von Neumann's filter (1,0 -> 1; 0,1 -> 0; 0,0 ->discard; 1,1 -> discard).
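
For reference, that filter in code (a minimal Python sketch using the mapping given above):

Code:
def von_neumann(bits):
    # (1,0) -> 1, (0,1) -> 0, (0,0) and (1,1) -> discard
    out = []
    for a, b in zip(bits[0::2], bits[1::2]):
        if a != b:
            out.append(a)
    return out

print(von_neumann([1, 0, 0, 0, 1, 1, 0, 1]))  # -> [1, 0]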

Next up for your project is monitoring.  Bias creeps in.  You need to keep careful track or you'll end up with garbage.

Finally, you need security.  Ideal would be an old line printer hooked up to a totally offline box.  Every time you get enough bits, encrypt the privkey and print that along with the pubkey (or address).

17Np17BSrpnHCZ2pgtiMNnhjnsWJ2TMqq8
I routinely ignore posters with paid advertising in their sigs.  You should too.
MaxwellsDemon
Full Member
***
Offline Offline

Activity: 187
Merit: 109

Converting information into power since 1867


View Profile
April 08, 2014, 01:42:25 AM
 #55

Although the "predictability" of E.Coli is provable, the computational power needed to predict every and each movement of an E.Coli on a petri dish would be very high, and even then, the computational power needed to predict that TRNG would probably have a lot more of the power needed to break ECDSA.

You could also use Brownian motion [1], which is a random phenomenon that also exhibits the qualities desirable in a TRNG, and has much more bandwidth than decaying atoms. (Again, low bandwidth is desirable in some TRNGs, so it is a trade-off.)

[1] http://en.wikipedia.org/wiki/Brownian_motion


I agree of course. For any practical purpose, you don't even need E. coli - any classical method of inputting external entropy (like moving your mouse around) will probably be more than enough. I was just pointing out why OP's use of radioactive decay is really cool  Smiley

As for Brownian motion, it is random insofar as statistical mechanics is affected by quantum phenomena. You could say that in principle, Brownian motion should be truly random, but in practice the motion of particles is affected by multiple environmental factors that could completely overwhelm the true Brownian component. When we let undergrads peek through a microscope and show them lots of little things moving around, we tell them this is called Brownian motion, but in fact much of this motion could be caused by all sorts of external forces. I suppose you'd need some kind of entirely isolated and adiabatic system to observe true Brownian motion...

We're hunting for Leviathan, and Bitcoin is our harpoon.
b¡tco¡n
Member
**
Offline Offline

Activity: 84
Merit: 10

Correct Horse Battery Staple


View Profile
April 08, 2014, 01:44:05 AM
 #56

Sounds like a fun project.  Very cool.  But why do you need to prove randomness?
It is unlikely to provide any business advantage in my opinion.

Security. If your private key ain't truly random, I may be able to guess it and steal your bitcoins.


1GiB1jQnqjwmNW4U4i8autnnVb1fG8HTYM

This would be my avitar; http://s9.postimg.org/m2pzsiy57/avi.png
chiguireitor
Legendary
*
Offline Offline

Activity: 872
Merit: 1010


Coins, Games & Miners


View Profile WWW
April 08, 2014, 02:13:14 AM
 #57

I agree of course. For any practical purpose, you don't even need E. coli - any classical method of inputting external entropy (like moving your mouse around) will probably be more than enough. I was just pointing out why OP's use of radioactive decay is really cool  Smiley

Indeed! it is very cool!

As for Brownian motion, it is random insofar as statistical mechanics is affected by quantum phenomena. You could say that in principle, Brownian motion should be truly random, but in practice the motion of particles is affected by multiple environmental factors that could completely overwhelm the true Brownian component. When we let undergrads peek through a microscope and show them lots of little things moving around, we tell them this is called Brownian motion, but in fact much of this motion could be caused by all sorts of external forces. I suppose you'd need some kind of entirely isolated and adiabatic system to observe true Brownian motion...

Also, I'm aware that Brownian motion is completely correlated to the physical properties of the material/specimen/sample being observed; I was expanding on the fact that to obtain a "good enough" TRNG you don't have to go full quantum (for now; what the future holds, I don't know). Also, it would be nice to have a multisource TRNG that is publicly available like Random.org, but for cryptocurrencies at large.

hello_good_sir
Hero Member
*****
Offline Offline

Activity: 1008
Merit: 531



View Profile
April 08, 2014, 02:19:54 AM
 #58

Following what I said above, I think it should be possible to use only one event per bit. Just check whether an interval is shorter or longer than the median of the exponential distribution, which is ln2 divided by the rate parameter (which can be estimated given the half-life).

This will create independent bits but there will be a bias towards 1 or 0, depending on the details of your particular setup.  You need to compare two intervals created by the same process within the same system, instead of replacing one of them with an external constant.
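
For instance (Python sketch, made-up numbers): if the true rate is 2000 counts per second but the threshold is computed from an estimate of 1900, the bits come out biased:

Code:
import math
import random

true_rate = 2000.0
estimated_rate = 1900.0                  # slightly wrong rate / half-life estimate
threshold = math.log(2) / estimated_rate

bits = [1 if random.expovariate(true_rate) > threshold else 0
        for _ in range(200000)]
print(sum(bits) / len(bits))             # noticeably below 0.5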

Nagle
Legendary
*
Offline Offline

Activity: 1204
Merit: 1000


View Profile WWW
April 08, 2014, 02:23:21 AM
 #59

using radiation is risky.. it has a known half-life which a mathematician could possibly abuse to work out the base number used to then create randomness..
It won't be measuring anything against a base number.  It will be measuring the time between TWO particle detections (if this interval is longer than the prior interval that is a "1", if it is shorter it is a "0", and if they are equal we throw it out).
That's the right way to do it. Von Neumann figured that out around 1950.
DeathAndTaxes (OP)
Donator
Legendary
*
Offline Offline

Activity: 1218
Merit: 1079


Gerald Davis


View Profile
April 08, 2014, 02:36:06 AM
Last edit: April 08, 2014, 02:49:15 AM by DeathAndTaxes
 #60

Following what I said above, I think it should be possible to use only one event per bit. Just check whether an interval is shorter or longer than the median of the exponential distribution, which is ln2 divided by the rate parameter (which can be estimated given the half-life).

This will create independent bits but there will be a bias towards 1 or 0, depending on the details of your particular setup.  You need to compare two intervals created by the same process within the same system, instead of replacing one of them with an external constant.

I agree the latter is a better solution, but using a von Neumann filter the bias of independent bits can be removed.  For example, in the setup proposed, say the system was biased toward producing 0s over 1s.  Since a 00 sequence (or 11 sequence) is discarded and the 01 and 10 sequences are equally likely, the bias can be easily removed (01 = 1 and 10 = 0).  Still, you end up using at least 2 counts per bit after filtering.  The actual number of counts required will depend on the amount of bias: the more biased the source, the more counts it will take to produce the "rare" bit needed to complete a mixed pair.  For example, if a system was biased 70%/30% in favor of zeroes, then it will require on average 2.38 candidate bit pairs for each bit that passes out of the filter.
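
To see where that figure comes from: a pair of independent biased bits is mixed (and so survives the filter) with probability 2pq, so the expected number of pairs per output bit is 1/(2pq); with p = 0.7 and q = 0.3 that is about 2.38. A quick check by simulation (Python sketch, made-up sample size; it uses kjj's mapping, but the statistics are the same either way):

Code:
import random

def von_neumann(bits):
    # kjj's mapping: (1,0) -> 1, (0,1) -> 0, equal pairs discarded
    out = []
    for a, b in zip(bits[0::2], bits[1::2]):
        if a != b:
            out.append(a)
    return out

raw = [0 if random.random() < 0.7 else 1 for _ in range(200000)]  # 70/30 bias
out = von_neumann(raw)
print((len(raw) / 2) / len(out))   # pairs consumed per output bit, about 2.38
print(sum(out) / len(out))         # output mean near 0.5 after debiasing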