Author Topic: Improved Measurement of Proof-of-Work using entropy  (Read 298 times)
mechanikalk (OP)
Member
Activity: 99
Merit: 91
March 10, 2023, 11:08:28 PM
Merited by odolvlobo (10), Halab (2), ABCbits (1)
 #1

This paper, POEM: Proof-of-Entropy-Minima (https://arxiv.org/abs/2303.04305), was just published on arXiv.  It seems like a much better way to measure the heaviest chain tip as well as to minimize the time needed to resolve orphans.  It can instantaneously resolve 67% of orphans rather than having to wait for the next block.  Additionally, it seems to offer better finalization-time guarantees for a given hash.  It also has an equation relating to finalization that would create an objective, measurable preference between different hash functions.
larry_vw_1955
Sr. Member
Activity: 1036
Merit: 351
March 11, 2023, 01:57:56 AM
 #2

Can you give a simple example? It seems kind of abstract.
IShishkin
Member
Activity: 76
Merit: 28
March 11, 2023, 06:15:35 AM
 #3

Quote from: mechanikalk on March 10, 2023, 11:08:28 PM
--snip--

No, it's not. The author has made a very common mistake. This mistake is a consequence of a superficial understanding of probability theory.

Satoshi did a good job, and his calculation of the chain weight is correct.

If you disagree with me, we can dive into details.
IShishkin
Member
Activity: 76
Merit: 28
March 11, 2023, 01:19:01 PM
 #4

Quote
I spent 15 minutes reading this paper, but didn't understand a thing. It's a shame they don't spend some time to write an example or a simpler explanation on their website.

--snip--

What do you think about the takeaway/explanation by one of the authors? https://twitter.com/mechanikalk/status/1633694566200094723

Quote them here, please.
mechanikalk (OP)
Member
Activity: 99
Merit: 91
March 11, 2023, 06:10:48 PM
 #5

Quote from: IShishkin
--snip--

Please tell me what you think the mistake in probability theory is.

Thanks in advance.
IShishkin
Member
Activity: 76
Merit: 28
March 11, 2023, 06:30:56 PM
 #6

Quote from: mechanikalk
--snip--

Please tell me what you think the mistake in probability theory is.

Thanks in advance.

There is no "mistake in probability theory". Don't distort my message, please.

What is your personal opinion about this paper? Do you think it is correct?
mechanikalk (OP)
Member
Activity: 99
Merit: 91
March 11, 2023, 06:36:03 PM
 #7

Quote from: IShishkin
--snip--


There is no "mistake in probability theory". Don't distort my message, please.

What is your personal opinion about this paper? Do you think it is correct?

I think the paper is correct.
IShishkin
Member
Activity: 76
Merit: 28
March 11, 2023, 06:44:23 PM
 #8

Quote from: mechanikalk
--snip--

I think the paper is correct.

In your opinion, what definition of "entropy" does the author use throughout the paper?
garlonicon
Hero Member
Activity: 800
Merit: 1932
March 11, 2023, 06:56:35 PM
 #9

Quote
Current difficulty based weighting systems do not take the intrinsic block weight into account.
And it is good that they don't, because this change will lower the requirements for chain reorganization. If I understand it correctly, this proposal is about using chainwork based on the current block instead of the difficulty. Those kinds of changes were also discussed in other topics.
mechanikalk (OP)
Member
Activity: 99
Merit: 91
March 11, 2023, 07:15:50 PM
 #10

Quote from: garlonicon
--snip--

If you look at the paper, it is not just proposing that you change the chainwork, but that you simultaneously change the calculation of tip weight from, very roughly, Td_new = Td_old + Td_threshold to Td_new = Td_old * Td_chainwork.  That would make both chainwork and chain weight geometric, which would actually improve finalization time by minimizing conflicts and maximizing recorded work.
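As a rough, minimal sketch of the difference between the two update rules (the td_* names mirror the Td_* notation above, and the per-block numbers are invented for illustration, not taken from the paper):

Code:
import math

def additive_tip_weight(per_block_threshold_work):
    # Bitcoin-style accumulation: Td_new = Td_old + Td_threshold for every block.
    return sum(per_block_threshold_work)

def geometric_tip_weight(per_block_chainwork):
    # Rule described in the post: Td_new = Td_old * Td_chainwork.
    # Accumulate in log space so the product stays representable.
    return sum(math.log2(w) for w in per_block_chainwork)

fork_a = [100, 100, 100]   # three blocks with equal recorded work
fork_b = [100, 100, 105]   # same length, slightly more work in the tip block

print(additive_tip_weight(fork_a), additive_tip_weight(fork_b))    # 300 vs 305
print(geometric_tip_weight(fork_a), geometric_tip_weight(fork_b))  # ~19.93 vs ~20.00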
mechanikalk (OP)
Member
Activity: 99
Merit: 91
March 11, 2023, 08:15:25 PM
 #11

Quote from: IShishkin
--snip--
In your opinion, what definition of "entropy" does the author use throughout the paper?

delta_S = 1/2^n, where n is the number of leading zeros.  This makes sense from a Shannon entropy perspective, where entropy is a reduction in divergence, and it also makes sense from a system entropy standpoint, where miners exert work to lower the entropy and increase the system's order.
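For concreteness, a minimal sketch of that quantity (the double-SHA256 and the placeholder header bytes below are illustrative assumptions, not details taken from the paper):

Code:
import hashlib

def leading_zero_bits(h: bytes, width_bits: int = 256) -> int:
    # Number of leading zero bits of the hash, read as a big-endian integer.
    value = int.from_bytes(h, "big")
    return width_bits if value == 0 else width_bits - value.bit_length()

def delta_s(h: bytes) -> float:
    # delta_S = 1/2^n, where n is the number of leading zero bits.
    return 1.0 / (1 << leading_zero_bits(h))

header = b"placeholder block header"   # stand-in for a real 80-byte header
h = hashlib.sha256(hashlib.sha256(header).digest()).digest()
print(leading_zero_bits(h), delta_s(h))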
IShishkin
Member
Activity: 76
Merit: 28
March 11, 2023, 08:25:27 PM
Merited by vapourminer (2)
 #12

Hold on.

The first lesson in Probability Theory class is the following.

Whenever you approach a problem, the first thing you should do is define a probability space. If the probability space is not defined, then none of these concepts makes sense: there is no "probability", no "expectation", no "divergence", and no "entropy" or "Shannon entropy".

The probability space has not been defined in the paper. If the choice of probability space is obvious, then you can easily fill this gap and define the probability space in the author's place.

mechanikalk (OP)
Member
Activity: 99
Merit: 91
March 11, 2023, 08:27:09 PM
 #13

Quote from: IShishkin
--snip--



They do define the probability space, 2^l, which in the case of Bitcoin is 2^256.
IShishkin
Member
Activity: 76
Merit: 28
March 11, 2023, 10:26:34 PM
Merited by odolvlobo (10), vapourminer (2)
 #14

Quote from: mechanikalk
--snip--
They do define the probability space, 2^l, which in the case of Bitcoin is 2^256.

l is finite, so it's a discrete probability space. The elementary event in this probability space is "hash h has been generated", where h is one of the 2^l possible hash values. The probability of every elementary event is 2^-l, regardless of which hash h corresponds to this elementary event.

Now let's consider the event "a block with target T has been mined". It means a miner has found a hash h <= T. Any two hashes h1 and h2 that satisfy this inequality have the same probability, regardless of what they look like and how many leading zeros appear in their binary representation. They carry the same "information".

The probability P of mining a block in this setting is (T+1)/2^l. The average number of attempts before success is 1/P, that is, 2^l/(T+1). This number is called the "block difficulty" or "block weight". It's an unbiased estimate of the work executed by miners. That's what we need, and the story usually ends here.
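A quick numeric sketch of that calculation, using Bitcoin's difficulty-1 target only as a familiar example value:

Code:
l = 256                        # hash length in bits
T = 0xFFFF * 2**208            # Bitcoin's difficulty-1 target, as an example

p_block = (T + 1) / 2**l       # probability that a single hash attempt succeeds
expected_attempts = 2**l / (T + 1)

print(f"P(success per hash) ~ {p_block:.3e}")            # ~2.3e-10
print(f"expected attempts   ~ {expected_attempts:.3e}")  # ~4.3e+09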

In the paper, an extra step is made. There is an assumption that every hash carries its own "information" or "weight". There is an attempt "to reduce entropy" or, in other words, as I understand it, an attempt to reduce the standard deviation of the estimate from the estimated value. The fact is that when every hash has the same weight, the deviation is minimal.

I think that at some point in the paper, when entropy was introduced, an unintentional switch to another probability space occurred. When you start treating hashes by their number of leading zeros, the fractional "information" within the hash gets lost. You get a new probability space, a new problem, and a new optimal unbiased estimate. However, this new solution might not be a solution to the original problem in the original probability space.
n0nce
Hero Member
Activity: 882
Merit: 5814
not your keys, not your coins!
March 12, 2023, 12:30:08 AM
 #15

Quote
~
What do you think about the takeaway/explanation by one of the authors? https://twitter.com/mechanikalk/status/1633694566200094723
Quote them here, please.

New paper on novel mechanism used in @QuaiNetwork to scale blockchain. Proof-of-Entropy-Minima (PoEM). TL;DR blockchains are time invariant static sequence of commitments. PoW is massively inefficient bc it doesn’t do the math correctly. #btc #ETH #Crypto

Some more takeaways:
1) entropy reduction is the goal of work
2) deltaS=1/2^n defines blockchain
3) All PoW vulnerabilities arise bc #2 is not used
4) 67% of all orphans are completely avoidable with PoEM.
5) The best hash algos reduce entropy most efficiently while preserving one-way collision resistance w uniform distribution.
6) Using time to describe a blockchain is completely incorrect
7) Withholding and reordering attacks only exist bc difficulty is computed incorrectly
8) Correct computation of head entropy makes perfect head choices, even with withholding and stales, without any secondary measurement or consideration
9) Satoshi added a friendly finalization mechanism to #BTC. Coercion of intrinsic bits to threshold difficulty. Without it #BTC would not have any finite guarantee of finalization.

The point about 'PoW is massively inefficient' is immediately wrong, in the sense that, as a general concept, if you make it easier / more efficient to calculate valid block hashes, the difficulty would adjust upwards to keep the ASICs running for 10 minutes on average to find a new block. Therefore, the energy consumption will not be reduced.
This holds true for any system or proposal that aims to make mining more efficient. Even reusing heat is just an efficiency improvement that leads to more money / value extracted from the same amount of electricity, driving up the difficulty to account for it.

larry_vw_1955
Sr. Member
Activity: 1036
Merit: 351
March 12, 2023, 02:12:02 AM
 #16


Quote from: IShishkin
Satoshi did a good job, and his calculation of the chain weight is correct.

If you disagree with me, we can dive into details.

Yeah, I mean, Satoshi did things using the KISS principle, but amazingly he got things right. People keep trying to come up with "improvements" to certain things, but they are invariably way overcomplicated and have issues, so they don't turn out to be an improvement in any real sense. Just keep it simple.

If you can't explain something with a simple example, then it's probably not something worth trying to figure out.
odolvlobo
Legendary
Activity: 4298
Merit: 3209
March 12, 2023, 11:34:11 PM
Last edit: March 12, 2023, 11:45:50 PM by odolvlobo
Merited by vapourminer (1), ABCbits (1)
 #17

Quote from: IShishkin
l is finite, so it's a discrete probability space. The elementary event in this probability space is "hash h has been generated", where h is one of the 2^l possible hash values. The probability of every elementary event is 2^-l, regardless of which hash h corresponds to this elementary event.

I think that is the key point. The amount of work needed to produce a block depends on the target value and not on the value of its hash. Therefore, using the hash value to determine the longest chain could be considered incorrect, since the criterion is the greatest amount of work.

However, I don't see a problem with the distortion in practice, and it does provide the benefits of a reduction in orphaned blocks and a deterrence of selfish mining.

BTW, using the hash values rather than the target values to determine the longest chain is not a new idea, though this is the first time I have seen the idea explored in detail. My main criticism is the misapplication of "entropy".

mechanikalk (OP)
Member
Activity: 99
Merit: 91
March 02, 2024, 04:46:46 AM
Merited by vapourminer (1)
 #18

There is now a formal security proof showing the safety and liveness of Proof-of-Entropy-Minima (PoEM): https://eprint.iacr.org/2024/200

This relatively simple adjustment to the measurement of block weight has been empirically shown to prevent selfish mining, achieve faster finalization, increase throughput, and decrease block times, while maintaining the safety and liveness of the chain.  Modifications to the current heaviest-chain rule are very straightforward.

The TL;DR is:

Presently, the heaviest-chain rule does not treat blocks as dependent events, even though they are dependent because of the hash link to the parent.

PoEM uses combinatorics to weigh the dependent block events appropriately. This generates a guaranteed unique weight for each block, which is the expected number of hashes that would be required to produce a chain of equal weight.

The guaranteed unique weight prevents the network from mining on competing forks, because the likelihood of two blocks having equal weight is (1/2^256) ~= 0.

Additionally, the process naturally introduces randomization on each sample (i.e., each block), which prevents profitable withholding attacks.

Finally, it shows that the threshold for a Sybil attack can be improved from 33% to 40%.
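For intuition, here is a hedged sketch of a fork choice driven by a hash-derived, practically tie-free block weight. The specific formula (2^256/(hash+1), accumulated in log space) is only an illustration of the idea described above; it is not claimed to be the exact weight definition from the PoEM paper or the security proof.

Code:
import math

HASH_SPACE_BITS = 256

def block_weight_log2(block_hash: int) -> float:
    # Lower hashes imply more apparent work; because the full 256-bit value
    # enters the weight, two distinct blocks essentially never tie.
    return HASH_SPACE_BITS - math.log2(block_hash + 1)

def chain_weight_log2(block_hashes: list) -> float:
    # Multiplicative accumulation of per-block weights == sum of their logs.
    return sum(block_weight_log2(h) for h in block_hashes)

def heavier_tip(chain_a: list, chain_b: list) -> str:
    return "A" if chain_weight_log2(chain_a) > chain_weight_log2(chain_b) else "B"

# Two competing single-block forks whose hashes differ only slightly still get
# distinct weights, so the choice never has to fall back to "first seen".
print(heavier_tip([0x12345678], [0x12345679]))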
gmaxwell
Moderator
Legendary
expert
Activity: 4158
Merit: 8382
March 06, 2024, 07:04:22 PM
Merited by vapourminer (1)
 #19

Use of the apparent difficulty has been proposed many times before -- it and other schemes that result in a unique value for each block, calculable by the block finder, immediately lead to a withholding attack:  If you know your block is unusually good, such that it is likely to win even if announced late, you are better off not announcing it.  The withholding profits are proportional to your share of hash power (as that's how often your advantage gets you a second block), so it's also a centralization pressure.

Bitcoin's criterion has always been that each node adopts the first block it sees that meets the target, which creates an incentive to announce (one that breaks down if your share of the hashpower is over a large threshold, but if any party controls that much hashpower, the system has other, worse problems).
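As a minimal simulation of why an "unusually good" block is safe to withhold under a hash-derived weight (the target value and the 10% cutoff below are arbitrary assumptions, not parameters from any of the proposals discussed here):

Code:
import random

T = 1 << 240                 # illustrative target, not Bitcoin's actual target
TRIALS = 100_000

# Suppose the block you just found has a hash in the best 10% of the valid
# range [0, T). Under a "lowest hash wins" rule, how often would a later
# competing block (hash uniform over [0, T)) still beat it?
my_hash = T // 10
competitor_wins = sum(random.randrange(T) < my_hash for _ in range(TRIALS))
print(f"competing block wins ~{competitor_wins / TRIALS:.1%} of races")  # ~10%

So roughly 90% of the time the withheld block still wins the race, while the withholder has had a head start on the next block; that head start is where the share-proportional profit described above comes from.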
tromp
Legendary
Activity: 978
Merit: 1077
March 07, 2024, 08:16:29 AM
 #20

Quote from: gmaxwell
Use of the apparent difficulty has been proposed many times before -- it and other schemes that result in a unique value for each block, calculable by the block finder, immediately lead to a withholding attack.
The withholding attack also reduces bitcoin's stochastic finality: a tx 6 blocks deep still has about a 1% chance of being reorged. One has to wait much longer before a tx can be considered finalized.

The scheme has other downsides as well:

Anytime two blocks are found in short succession, with the later one having a lower hash, the earlier one gets reorged even though it has already been relayed across most of the network, wasting bandwidth and causing unnecessarily many small reorgs.
