Bitcoin Forum
Poll
Question: Will you support Gavin's new block size limit hard fork of 8 MB by January 1, 2016, then doubling every 2 years?
1.  yes
2.  no

Author Topic: Gold collapsing. Bitcoin UP.  (Read 1804778 times)
justusranvier (Legendary, Activity: 1400)
July 04, 2015, 11:35:25 PM  #28161

Thoughts on how fraud proofs could make it possible for SPV clients to reject an invalid chain, even if the invalid chain contains the most PoW:

https://gist.github.com/justusranvier/451616fa4697b5f25f60

(some modifications to the Bitcoin protocol required)
solex (Legendary, Activity: 1078) | "100 satoshis -> ISO code"
July 05, 2015, 12:02:59 AM  #28162

Quote from: Peter R
Some numbers:

Assume it takes on average 30 seconds to verify 1 MB of typical transactional data (k = 0.5 min/MB).  Since T = 10 min, this means the maximum average blocksize (network capacity) is limited to:

    Seffective  =  T / (4 k)  =  (10 min) / (4 x 0.5 min/MB)  =  5 MB.

QED. We've shown that there exists a limit on the maximum value of the average blocksize, due to the time it takes to verify a block, irrespective of any protocol-enforced limits.

Great work Peter, but do we have any empirical evidence for the 30 seconds? It seems surprisingly high; I would have guessed just a few seconds.
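
For what it's worth, the algebra itself checks out. A minimal SymPy sketch, assuming Peter's underlying model Seffective = S' (T - τ) / T with τ = k S' (the factor of 4 falls out of maximizing over S'):

Code:
# Minimal check of the T/(4k) ceiling: S_eff = S'*(T - tau)/T with tau = k*S',
# maximized over the miner's choice of S'.
from sympy import symbols, diff, solve, simplify

Sp, T, k = symbols("Sp T k", positive=True)   # Sp stands for S'

S_eff = Sp * (T - k * Sp) / T                 # assumed model

Sp_opt = solve(diff(S_eff, Sp), Sp)[0]        # revenue-optimal blocksize: T/(2*k)
S_max = simplify(S_eff.subs(Sp, Sp_opt))      # network capacity: T/(4*k)

print(Sp_opt, S_max)                          # T/(2*k)  T/(4*k)
print(S_max.subs({T: 10, k: 0.5}))            # 5 (MB, with T in min and k in min/MB)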

Peter R (Legendary, Activity: 938)
July 05, 2015, 12:07:18 AM  #28163

Quote from: solex
Quote from: Peter R
Some numbers:

Assume it takes on average 30 seconds to verify 1 MB of typical transactional data (k = 0.5 min/MB).  Since T = 10 min, this means the maximum average blocksize (network capacity) is limited to:

    Seffective  =  T / (4 k)  =  (10 min) / (4 x 0.5 min/MB)  =  5 MB.

QED. We've shown that there exists a limit on the maximum value of the average blocksize, due to the time it takes to verify a block, irrespective of any protocol-enforced limits.

Great work Peter, but do we have any empirical evidence for the 30 seconds? It seems surprisingly high; I would have guessed just a few seconds.

No, I just made it up.  I think I'll change it to 15 seconds, as I agree it's probably too high.  I guess we could estimate it by looking at the ratio of empty blocks to non-empty blocks produced by F2Pool.

If someone wants to tabulate that data, I'll update my post.  

Run Bitcoin Unlimited (www.bitcoinunlimited.info)
TheRealSteve (Hero Member, Activity: 686) | "FUN > ROI"
July 05, 2015, 12:23:37 AM  #28164

Quote from: Peter R
I guess we could estimate it by looking at the ratio of empty blocks to non-empty blocks produced by F2Pool.

If someone wants to tabulate that data, I'll update my post.

If by "empty" you mean coinbase-only, then in the last 27,027 blocks (basically since Jan 1st, 2015), F2Pool-attributed blocks: 5241, of which coinbase-only: 139.
For AntPool, this is 4506 / 246.  Not sure if that's all the info you'd need, though.
See also: Empty blocks [bitcointalk.org]

thezerg (Legendary, Activity: 1246)
July 05, 2015, 12:31:33 AM  #28165

Quote from: justusranvier
Thoughts on how fraud proofs could make it possible for SPV clients to reject an invalid chain, even if the invalid chain contains the most PoW:

https://gist.github.com/justusranvier/451616fa4697b5f25f60

(some modifications to the Bitcoin protocol required)

Your modification requiring each input to state which block it comes from is a clever way to shrink the "transaction does not exist" proof.  But I don't understand the subsequent complexity.  If the txn input states that block B contains the UTXO, then the invalidity proof is simply to supply B, right?
justusranvier (Legendary, Activity: 1400)
July 05, 2015, 12:41:24 AM  #28166

Quote from: thezerg
If the txn input states that block B contains the UTXO, then the invalidity proof is simply to supply B, right?
That's one way to do it; however, even this can be shortened.

Right now, with all blocks < 1 MB, it's not really a big deal to supply the entire block to prove that the referenced transaction doesn't exist, but it'd be nice not to require the entire block, especially once blocks are larger.

By adding a rule requiring all transactions in new blocks to be ordered by their hash, you no longer need to supply the entire block to prove that a transaction doesn't exist: two adjacent transactions whose hashes straddle the target are enough.

It would be good to have that ordering requirement in place before blocks are allowed to grow to make sure that fraud proof size is bounded.
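
Roughly what I have in mind, as a toy sketch (hypothetical names, and the Merkle branches a real proof would attach are omitted):

Code:
# Toy sketch: with txids sorted inside a block, absence of a txid is shown
# by the two adjacent txids that straddle it (a real proof would also carry
# each boundary txid's Merkle branch up to the block's merkle root).
import hashlib
from bisect import bisect_left

def txid(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def absence_witness(sorted_txids, target):
    """Return the adjacent (left, right) txids straddling `target`,
    or None if `target` is actually present in the block."""
    i = bisect_left(sorted_txids, target)
    if i < len(sorted_txids) and sorted_txids[i] == target:
        return None                                  # present: no absence proof
    left = sorted_txids[i - 1] if i > 0 else None    # None = "before first txid"
    right = sorted_txids[i] if i < len(sorted_txids) else None
    return (left, right)

block = sorted(txid(bytes([n])) for n in range(8))   # stand-in for a sorted block
print(absence_witness(block, txid(b"missing tx")))

The proof is then two leaves plus their branches, O(log n) instead of the whole block, which is what keeps it bounded as blocks grow.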
Peter R (Legendary, Activity: 938)
July 05, 2015, 12:41:29 AM  #28167

Quote from: TheRealSteve
Quote from: Peter R
I guess we could estimate it by looking at the ratio of empty blocks to non-empty blocks produced by F2Pool.

If someone wants to tabulate that data, I'll update my post.
If by "empty" you mean coinbase-only, then in the last 27,027 blocks (basically since Jan 1st, 2015), F2Pool-attributed blocks: 5241, of which coinbase-only: 139.
For AntPool, this is 4506 / 246.  Not sure if that's all the info you'd need, though.
See also: Empty blocks [bitcointalk.org]

Awesome!  Thanks!!

We can then estimate the average effective time it takes to process a block as

    τ ~= T (Nempty / Nnotempty)
      ~= T (Nempty / (Ntotal - Nempty))

F2Pool:

      ~= (10 min) x [139 / (5241 - 139)] = 16.3 seconds

AntPool:

      ~= (10 min) x [246 / (4506 - 246)] = 34.6 seconds
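
Or in code, as a back-of-the-envelope sketch (it assumes every coinbase-only block was a "defensive" block mined only while the previous block was still being verified):

Code:
# tau ~= T * N_empty / (N_total - N_empty), using TheRealSteve's counts
T = 10 * 60  # average block interval, in seconds

def tau_estimate(n_empty, n_total):
    return T * n_empty / (n_total - n_empty)

print("F2Pool:  %.1f s" % tau_estimate(139, 5241))   # 16.3 s
print("AntPool: %.1f s" % tau_estimate(246, 4506))   # 34.6 s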
  

Run Bitcoin Unlimited (www.bitcoinunlimited.info)
cypherdoc (Legendary, Activity: 1764)
July 05, 2015, 01:03:38 AM  #28168

Quote from: Peter R
I guess we could estimate it by looking at the ratio of empty blocks to non-empty blocks produced by F2Pool.

If someone wants to tabulate that data, I'll update my post.
Quote from: TheRealSteve
If by "empty" you mean coinbase-only, then in the last 27,027 blocks (basically since Jan 1st, 2015), F2Pool-attributed blocks: 5241, of which coinbase-only: 139.
For AntPool, this is 4506 / 246.  Not sure if that's all the info you'd need, though.
See also: Empty blocks [bitcointalk.org]

Is there a way for you to tell what % of blocks have been full over the last 3 weeks, and compare that to the prior period going back to, say, Jan 1?

I'd count those in the 900+ kB and 720-750 kB ranges as full.
thezerg (Legendary, Activity: 1246)
July 05, 2015, 01:31:40 AM  #28169

Quote from: justusranvier
Quote from: thezerg
If the txn input states that block B contains the UTXO, then the invalidity proof is simply to supply B, right?
That's one way to do it; however, even this can be shortened.

Right now, with all blocks < 1 MB, it's not really a big deal to supply the entire block to prove that the referenced transaction doesn't exist, but it'd be nice not to require the entire block, especially once blocks are larger.

By adding a rule requiring all transactions in new blocks to be ordered by their hash, you no longer need to supply the entire block to prove that a transaction doesn't exist: two adjacent transactions whose hashes straddle the target are enough.

It would be good to have that ordering requirement in place before blocks are allowed to grow to make sure that fraud proof size is bounded.

Makes sense... I'd recommend a quick line or two in your blog to explain that:

"In order to reduce the size of the fraud proof needed to show that a transaction input does not exist, additional information must be added to Bitcoin blocks to indicate the source block of each outpoint used by every transaction in the block.

A node can provide the source block to the SPV client to prove or disprove the existence of this transaction.  But with a few more changes we can provide only a subset of the source block.  This may become very important if block sizes increase."

TPTB_need_war (Sr. Member, Activity: 420)
July 05, 2015, 01:35:33 AM  #28170

The block size debate is iatrogenesis — any 'cure' is worse than the illness.


Quote
Freemarket, in 2010 everyone could buy thousands of bitcoin for almost nothing. What hindered it, besides Bitcoin being relatively unknown at that point in time, was that few people actually believed cryptocurrencies could be a thing. With Monero it's almost the same, the difference being that it's swimming in a sea of shitcoins and not many can see its potential. It's the second CryptoNote coin, the first being heavily premined, and it was launched under an MIT licence, so there is absolutely no merit to claims that Monero stole anything; that's like saying Ubuntu stole code from Debian, or that Apple stole from FreeBSD. So even though Monero's market cap is low, few people will actually bother buying a large stack, because it is not a 100% certain bet. But it's clear there is nothing close to Monero, as Zerocash/Zerocoin is vaporware and Bitcoin sidechains are like dragons.

My point, when I employed the word "clusterfuck" to describe the hundreds of CryptoNote clones and noted that Monero's marketing (on these forums) to some extent had to vilify other CN clones in order to assert its dominance over them, is that I would instead have preferred to add features to CN that would naturally assert dominance over the other clones. It felt to me like Monero used strong-armed community tactics to gain more critical mass than the other CN clones, rather than capabilities innovation (though there was a lot of refinement, which I assume includes many fine-grained performance improvements). And I am nearly certain this lack of outstanding capabilities, other than the on-chain rings, is why Monero is not more widely adopted and will mean stunted growth for Monero (and I say this with specific knowledge of capabilities that I think will subsume Monero very soon). And that is precisely why I would not prematurely release those features in a whitepaper for thousands of clones to implement simultaneously. And yet people criticize me for not spilling the beans before the software is cooked.

The marketing battle is not against the other "shitcoins", i.e. differentiating Monero from shit. Rather, the battle is against Bitcoin Core over who is going to own the chain that most of the BTC migrates to.

Also, most of the interest in altcoins is not ideological, but rather speculative. We are in a down market until BTC bottoms this October, so Monero is mostly getting ideological investment, not speculative fever. This will turn after October, but it might be too late for Monero, depending on the competition that might arise in the interim. However, I tend to think Monero will get a big boost after October in spite of any new competition, because it is a more stable codebase. As smooth pointed out, the greatest threat of breakage is implementation error. It would behoove Monero to be the first CN coin to apply my suggested fix to ensure that combinatorial analysis of partially overlapping rings can't occur.

P.S. CN is very important.

Peter R (Legendary, Activity: 938)
July 05, 2015, 01:46:56 AM  #28171

Am I being sensitive, or is this an unnecessarily spiteful reply from Greg Maxwell?

Quote
...
You've shown that you can throw a bunch of symbolic markup and mix in a lack of understanding and measurement and make a pseudo-scientific argument that will mislead a lot of people, and that you're willing to do so or too ignorant to even realize what you're doing.

Run Bitcoin Unlimited (www.bitcoinunlimited.info)
ssmc2 (Hero Member, Activity: 840)
July 05, 2015, 01:50:41 AM  #28172

Quote from: Peter R
Am I being sensitive, or is this an unnecessarily spiteful reply from Greg Maxwell?

Quote
...
You've shown that you can throw a bunch of symbolic markup and mix in a lack of understanding and measurement and make a pseudo-scientific argument that will mislead a lot of people, and that you're willing to do so or too ignorant to even realize what you're doing.

He sounds bitter.
laurentmt (Sr. Member, Activity: 379)
July 05, 2015, 01:53:15 AM  #28173

Quote from: Peter R
What this shows is that since the subtracted term, τ (1 - Pvalid), is strictly positive, the miner's expectation of revenue, <V>, is maximized if the time to verify the previous block is minimized (i.e., if τ is as small as possible).
Actually, <V> is also maximized if Pvalid == 1 (or Pvalid is as close to 1 as possible).
How to reach this result? My humble proposal: make a deal with a few mining pools. Participants never push invalid blocks to other participants, and blocks received from the cartel aren't checked before hashing a new block.

Conclusion: As the average blocksize gets larger, the time to verify the previous block also gets larger. This means that miners will be motivated to improve how quickly their nodes can perform the ECDSA operations needed to verify blocks, or that they will be more motivated to trick the system.

EDIT:
Quote from: Peter R
Am I being sensitive, or is this an unnecessarily spiteful reply from Greg Maxwell?
Well, he seems a bit upset for now ;) but I think his message is close to what I've tried to suggest with my comment.
We must analyze all the possibilities before jumping to a conclusion that backs our initial hypothesis. The point is valid for all of us, whatever our opinion on this blocksize issue.
thezerg (Legendary, Activity: 1246)
July 05, 2015, 02:06:05 AM  #28174

Quote from: Peter R
Am I being sensitive, or is this an unnecessarily spiteful reply from Greg Maxwell?

Quote
...
You've shown that you can throw a bunch of symbolic markup and mix in a lack of understanding and measurement and make a pseudo-scientific argument that will mislead a lot of people, and that you're willing to do so or too ignorant to even realize what you're doing.

It's a strategy (implemented unconsciously by many) to limit participation to a select few.  Unfortunately it tends to create a situation where only similar personalities contribute, which is where we are today with the core devs, Gavin excepted.


I read his 21 ms validation number, but it's weird: just weeks ago I was wondering why it was taking so long to sync a measly week of blockchain data, and I came to the conclusion that either the P2P code is complete garbage (compared to BitTorrent, for example) OR the validation cost is high (given my fan speed, I assumed it was validation).  And if validation is so fast, why would these pools have custom code to skip it?

It will be interesting to look at the stats-gathering mode he mentions.
Peter R (Legendary, Activity: 938)
July 05, 2015, 02:17:54 AM  #28175

Quote from: thezerg
Quote from: Peter R
Am I being sensitive, or is this an unnecessarily spiteful reply from Greg Maxwell?

Quote
...
You've shown that you can throw a bunch of symbolic markup and mix in a lack of understanding and measurement and make a pseudo-scientific argument that will mislead a lot of people, and that you're willing to do so or too ignorant to even realize what you're doing.
It's a strategy (implemented unconsciously by many) to limit participation to a select few.  Unfortunately it tends to create a situation where only similar personalities contribute, which is where we are today with the core devs, Gavin excepted.

Thanks.  That makes me feel better.

Quote from: thezerg
I read his 21 ms validation number, but it's weird: just weeks ago I was wondering why it was taking so long to sync a measly week of blockchain data, and I came to the conclusion that either the P2P code is complete garbage (compared to BitTorrent, for example) OR the validation cost is high (given my fan speed, I assumed it was validation).  And if validation is so fast, why would these pools have custom code to skip it?

This is the empirical fact that motivated me to perform the analysis. I was surprised how many "defensive blocks" we were seeing.  Furthermore, based on these numbers:

Quote from: TheRealSteve
...in the last 27,027 blocks (basically since Jan 1st, 2015), F2Pool-attributed blocks: 5241, of which coinbase-only: 139.
For AntPool, this is 4506 / 246.

it looks like the average time these pools spend mining empty blocks is 16 seconds (F2Pool) and 35 seconds (AntPool) before switching to non-empty blocks.  Like you said, why are these numbers so big if processing the blocks is so fast?

Run Bitcoin Unlimited (www.bitcoinunlimited.info)
TPTB_need_war (Sr. Member, Activity: 420)
July 05, 2015, 02:38:54 AM  #28176

Quote
actually, there's 2 names for what he's done.

one, from a moral standpoint, and one from a legal standpoint.  i'll let you figure out what those names are.

Unethical and extortion? (My pops is an attorney; perhaps I inherited some of it ... and now you know why I won't go on Skype with you ...)

cloverme (Legendary, Activity: 896)
July 05, 2015, 02:57:50 AM  #28177

Quote from: Peter R
Am I being sensitive, or is this an unnecessarily spiteful reply from Greg Maxwell?

Quote
...
You've shown that you can throw a bunch of symbolic markup and mix in a lack of understanding and measurement and make a pseudo-scientific argument that will mislead a lot of people, and that you're willing to do so or too ignorant to even realize what you're doing.

I agree with part of the quote. Also, the blocksize debate needs to go away; people should stop throwing their two cents into a problem Gavin already has a good solution for.

What we really need solutions for:
- Greater adoption with ease of use (Generation Xers and earlier are still using PayPal because it's easier)
- Chinese mining pools that are greed-based and harm Bitcoin and the blockchain
- The assumption, when people think of bitcoins, that it's all about drugs and other shady activities

Enough... Sorry, no idea who you are, so no offense, but I'm tired of seeing these blocksize posts show up on Reddit.
TPTB_need_war (Sr. Member, Activity: 420)
July 05, 2015, 03:03:28 AM  #28178

Quote from: Peter R
Let τ be the time it takes to verify a typical block and let T be the average block time (10 min).  The fraction of time the miner does not know whether the most recent block was valid is clearly τ / T; the fraction of the time the miner does know is 1 - τ / T = (T - τ) / T.  We will assume that every miner applies the same policy of producing empty SPV blocks before they've verified, and blocks of size S' after they've verified.

Under these conditions, the expectation value of the blocksize is equal to the expectation value of the blocksize during the time a miner doesn't know, plus the expectation value of the blocksize during the time he does know:

    Seffective  =  (~0) (τ / T)  +  S' (T - τ) / T
                =  S' (T - τ) / T                          (Eq. 1)

The time, τ, it takes to process a block is not constant, but rather depends linearly** on the size of the block.  Approximating the size of the previous block as S', we get:

    τ = k S'

....

QED. We've shown that there exists a limit on the maximum value of the average blocksize, due to the time it takes to verify a block, irrespective of any protocol-enforced limits.

Your egregious mathematical error (myopia), of course, is that you assume k is the same for all miners. And this is why you totally miss the centralization that your non-proof causes.

TPTB_need_war (Sr. Member, Activity: 420)
July 05, 2015, 03:14:58 AM  #28179

Quote from: Peter R
The fraction of time the miner does not know whether the most recent block was valid is clearly τ / T, which means the fraction of the time the miner does know is 1 - τ / T = (T - τ) / T.

You are attempting to develop an equation for the orphan rate relative to the propagation delay (which includes the verification delay), but this can't be done without the context of how computation is distributed across the network, which some argue is modeled by a Poisson distribution.

Quote from: Peter R
... miners will be motivated to improve how quickly their nodes can perform the ECDSA operations needed to verify blocks.

I had argued upthread that bandwidth limitation (propagation delay) is the only justification for not expending more resources on verification instead of forming 0-txn blocks (when blocks are mostly funded by txn fees, which may not be the case now):

Quote from: TPTB_need_war
This delay is a form of propagation delay and thus drives up the orphan rate for miners with fewer resources. Afaik, proportional increases in orphan rate are more costly than proportional decreases in hashrate, because the math is compounded (but diminishing) on each subsequent block of the orphaned chain. Thus this action doesn't appear to make economic sense unless it is explained by a lack of bandwidth rather than a lack of desire to apply more of their resources to processing txns than to hashrate. If bandwidth is the culprit, then it argues against larger block sizes.
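
To put rough numbers on the orphan cost: if one does accept the Poisson model of block discovery, a delay of τ exposes a miner to roughly 1 - e^(-τ/T) of added orphan risk. A sketch only (it ignores the distribution of computation I mentioned above):

Code:
# Added orphan risk from a delay tau, under a Poisson block-arrival model
# with mean interval T (a modeling assumption, as noted above).
from math import exp

T = 600.0  # mean block interval, seconds

for tau in (16.3, 34.6):  # the F2Pool / AntPool estimates from upthread
    risk = 1 - exp(-tau / T)
    print("tau = %4.1f s -> added orphan risk ~ %.2f%%" % (tau, 100 * risk))
# prints ~2.68% for 16.3 s and ~5.60% for 34.6 s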

This poll is inaccurate because voters can't change their vote!! Peter R posts nonsense, then the Yes votes go bonkers and they can't change their vote after Peter R has been thoroughly refuted.

Peter R (Legendary, Activity: 938)
July 05, 2015, 03:23:52 AM  #28180

Quote from: TPTB_need_war
Your egregious mathematical error (myopia), of course, is that you assume k is the same for all miners. And this is why you totally miss the centralization that your non-proof causes.

I agree that, in reality, k will not be constant across miners. What I posted was a simplified model so that the effect can be easily isolated and understood. It would be interesting to try to model a distribution of verification times, among other details. In any case, I suspect the important result will be the same.

I'd love to see you show mathematically that, under certain assumptions, mining will centralize. Why don't you give it a try?
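
As a starting point, here is the most naive sketch. It is purely illustrative and rests on my own assumption that each miner i simply has their own ki, hence their own ceiling T/(4 ki):

Code:
# Naive heterogeneous-k extension (an assumption of mine, not part of the
# original model): each miner gets their own capacity ceiling T/(4*k_i).
T = 10.0  # minutes

miners = {"fast miner (k = 0.25 min/MB)": 0.25,
          "slow miner (k = 1.00 min/MB)": 1.00}

for name, k in miners.items():
    print("%s: optimal S' = %.1f MB, ceiling = %.1f MB"
          % (name, T / (2 * k), T / (4 * k)))
# fast miner: S' = 20.0 MB, ceiling 10.0 MB
# slow miner: S' =  5.0 MB, ceiling  2.5 MB

If average blocks grow past the slow miner's ceiling, Eq. 1 says his revenue falls relative to the fast miner's, which is the centralization pressure in question.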

Run Bitcoin Unlimited (www.bitcoinunlimited.info)
 