Bitcoin Forum
November 21, 2017, 01:16:26 AM
Pages: « 1 [2]  All
Author Topic: How to create an N bit ECDSA compatible private key from dice rolls in python  (Read 2042 times)
bitsec731
Jr. Member
Activity: 32
January 23, 2017, 10:08:39 AM
 #21

I partially disagree, because I don't think they are the same language.

Yes.  They are.  They are both just numbers.


Wow, amazing.

A very interesting view on information.

So you are saying, basically, that every number or text that physically exists in the world as an object has at minimum as much entropy as the computational representation of it?

So suppose we have the number 2384923482983, and this number is painted on a wall in Chinese characters. If we take a picture of that wall, will that picture have at minimum as much entropy as the number itself? Or if I speak the number 2384923482983 aloud in German and record it as an MP3 file, will that MP3 file also have at minimum as much entropy as the number itself?


A very interesting theory. Do you know of any scientific papers on it? I would like to read more.
ArcCsch
Full Member
Activity: 224
January 23, 2017, 01:15:16 PM
 #22

So suppose we have the number 2384923482983, and this number is painted on a wall in Chinese characters. If we take a picture of that wall, will that picture have at minimum as much entropy as the number itself? Or if I speak the number 2384923482983 aloud in German and record it as an MP3 file, will that MP3 file also have at minimum as much entropy as the number itself?
Yes, provided the Chinese characters and the MP3 file have sufficient quality to make it possible to recover the number.

If you don't have sole and complete control over the private keys, you don't have any bitcoin!  Signature campaigns are OK, zero tolerance for spam!
1JGYXhfhPrkiHcpYkiuCoKpdycPhGCuswa
DannyHamilton
Legendary
Activity: 1974
January 23, 2017, 03:12:49 PM
 #23

So you are saying, basically, that every number or text that physically exists in the world as an object has at minimum as much entropy as the computational representation of it?

No.  You have that backwards.

I'm saying that any representation of a number has at minimum as much entropy as the choice of the number itself.  Entropy isn't lost from the selection method when the selection is represented, regardless of whether that representation is a die, pencil on paper, a computer file, or a wall painting.

So suppose we have the number 2384923482983, and this number is painted on a wall in Chinese characters. If we take a picture of that wall, will that picture have at minimum as much entropy as the number itself? Or if I speak the number 2384923482983 aloud in German and record it as an MP3 file, will that MP3 file also have at minimum as much entropy as the number itself?

Note that "a number" (such as 2384923482983) doesn't have ANY entropy at all.  In order for that number on that wall to have some amount of informational entropy, there needs to be some randomness to what that value could be.  If that number on that wall was a choice between 2384923482981 and 2384923482985, then it has very little entropy (at most one bit, no matter how it was chosen).  If it was a completely random selection between 0 and 1000^1000 where every value had equal probability of being selected, then it has significantly more entropy.

https://en.wikipedia.org/wiki/Entropy_(information_theory)
Quote
Entropy is zero when one outcome is certain. Shannon entropy quantifies all these considerations exactly when a probability distribution of the source is known. The meaning of the events observed (the meaning of messages) does not matter in the definition of entropy. Entropy only takes into account the probability of observing a specific event, so the information it encapsulates is information about the underlying probability distribution, not the meaning of the events themselves.

Generally, entropy refers to disorder or uncertainty.
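To make the point above quantitative, here is a small sketch (the uniform-selection assumption and the 100-roll example are mine, not from the post): for a uniform random choice among N outcomes, the Shannon entropy is simply log2(N) bits.

```python
import math

def uniform_entropy_bits(n_outcomes):
    """Shannon entropy, in bits, of a uniform choice among n_outcomes."""
    return math.log2(n_outcomes)

# A choice between only two values carries one bit of entropy,
# no matter how large the values themselves are.
print(uniform_entropy_bits(2))    # 1.0

# One roll of a fair six-sided die carries log2(6) ≈ 2.585 bits,
# so 100 rolls carry ≈ 258.5 bits -- more than the 256 bits
# needed for an ECDSA private key.
print(uniform_entropy_bits(6))
print(100 * uniform_entropy_bits(6))
```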

That being said, I think it should be self-evident that any representation of a number, such that all representations in the selection set are clearly distinguishable from each other, will have at minimum as much entropy as the number selection method itself.  Note that in some languages (or representations) a number may NOT be sufficiently distinguishable from another number.  In that case, entropy CAN be lost.  For example, if you write down a randomly generated number in English, and your handwriting is so sloppy that all your 7's look like your 1's, then there will be less entropy in your written representation than in the number generated, since your written number will effectively never contain any 7's (they will always read as 1's).
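The sloppy-handwriting example can be sketched numerically (the digit distribution here is an illustrative assumption): merging two distinguishable symbols into one reduces the Shannon entropy of the written form.

```python
import math
from collections import Counter

def entropy_bits(probabilities):
    """Shannon entropy in bits of a discrete probability distribution."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# A uniformly random decimal digit carries log2(10) ≈ 3.32 bits.
print(entropy_bits([1 / 10] * 10))

# Handwriting that renders every 7 as a 1 merges two symbols: the
# written form has only 9 distinguishable symbols, with "1" now
# twice as likely as the others.
written = Counter("1" if d == 7 else str(d) for d in range(10))
merged = [count / 10 for count in written.values()]
print(entropy_bits(merged))   # lower: entropy was lost in the representation
```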

A very interesting theory. Do you know of any scientific papers on it? I would like to read more.

Entropy is a measure of randomness in information.  If the information being represented is a number, and each representation of that information is distinguishable from the other representations, then the randomness of the representation will never be less than the randomness of the information itself, since the actual information hasn't changed (only the representation has).

Think about this for a moment...

Let's say I have a machine that gives me truly random numbers between 0 and 2^256 (perhaps it uses radioactive decay as a source of randomness).  Is the entropy of the representation any less (or more) if I represent that number in binary than if I represent it in decimal, base 58, or hexadecimal?  The "information" (the concept of the number itself) hasn't been lost with any of those representations, has it?
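This can be checked directly: here is a minimal sketch showing that decimal, hexadecimal, and binary representations of the same 256-bit number all round-trip to exactly the same value, so no information is lost by changing the representation.

```python
import secrets

# A random 256-bit number (the "information" itself).
n = secrets.randbelow(2**256)

# Three different representations of the same information.
as_decimal = str(n)
as_hex = hex(n)
as_binary = bin(n)

# Each representation recovers exactly the same number, so the
# randomness of the underlying selection is unchanged.
assert int(as_decimal) == n
assert int(as_hex, 16) == n
assert int(as_binary, 2) == n
```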


theymos
Administrator
Legendary
Activity: 2842
January 23, 2017, 09:02:15 PM
 #24

I wrote a page on the wiki about this a while ago: https://en.bitcoin.it/wiki/Passphrase_generation#Generating_keys.2C_seeds.2C_and_random_numbers_.28Advanced.29

My recommended method is to just hash a string containing the dice rolls. The rolls will contain enough entropy in themselves (if you're using enough dice), and the hash should not degrade this in any significant way.
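A minimal sketch of this hash-the-rolls method in Python (the roll string below is a stand-in for your own physical dice rolls, and the secp256k1 range check is my addition):

```python
import hashlib

# secp256k1 group order: a valid ECDSA private key lies in [1, N-1].
SECP256K1_N = 0xFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFEBAAEDCE6AF48A03BBFD25E8CD0364141

def dice_rolls_to_private_key(rolls: str) -> int:
    """Hash a string of dice rolls into a candidate private key.

    100 rolls of a fair six-sided die carry ~258 bits of entropy;
    SHA-256 compresses them to 256 bits without significant loss.
    """
    digest = hashlib.sha256(rolls.encode("ascii")).digest()
    key = int.from_bytes(digest, "big")
    if not 1 <= key < SECP256K1_N:
        # Astronomically unlikely (probability ~2^-128): just roll again.
        raise ValueError("hash out of range; roll again")
    return key

# Illustrative only -- use a string of your own real dice rolls.
rolls = "162534" * 17   # stand-in for 102 actual rolls
print(hex(dice_rolls_to_private_key(rolls)))
```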

1NXYoJ5xU91Jp83XfVMHwwTUyZFK64BoAD