As and when the blocks fill up with real demand, the empty blocks will start getting used, because the fees will incentivise miners to develop more efficient software that queues profitable waiting transactions to be mined immediately into the next block. You need to leave them in for your analysis ...
Nope. The reason for mining empty blocks is that the miner does not, at that moment, have enough information to mine a valid block with transactions from the mempool. It's fiendishly clever. So the miner is mining purely for the block reward, because that is all that is available during that short window. That incentive only goes away when the block reward does.
|
|
|
Ideally I'd like to see both versions plus a snapshot of the mempool size. Or a version without empty blocks, number of empty blocks for that period, and a snapshot of the mempool size... but that sounds like a lot of work.
Actually, surprisingly little. I'll see if I can do it and work out JJG's size increase. Also, there's a bug which nobody seems to have noticed yet.
|
|
|
So I see no issue with including them.
The issue is that they cause the situation to be represented inaccurately. Say we had a backlog large enough to fill a hundred blocks. Each block could legitimately be mined up to close to the full block limit of 1,000,000 bytes, so the 5-7 blocks mined in an hour's sample should honestly produce an aggregate average block fill of 99-100%. Now consider that if 3 empty blocks get thrown out there on top of those 5-7 (empty blocks tend to get mined very quickly, since miners will usually start mining in transactions ASAP), that pulls the percentage down to maybe 70%. The question is: is that a fair representation of the situation? My inclination is to regard empty blocks as NOOPs and exclude them from the calculation, though I would probably indicate that empty blocks were found.
|
|
|
It iterates over all blocks since the last check, totals the block space used and the block space available (adding 1,000,000 for each block), and calculates the percentage from that. In theory, empty blocks *could* be valid, but in practice I think it's fair to say that 100% of them come from miners mining empty blocks early (a valid miner activity according to the rules, but one that skews the calculations and pushes the difficulty higher than it should be, resulting in lower tps availability).
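The calculation described above can be sketched roughly like this. A minimal sketch, assuming a fixed 1,000,000-byte limit; the block sizes, the 1,000-byte "empty" cutoff, and the `fill_percentage` name are illustrative assumptions, not code from the actual tool:

```python
# Hypothetical sketch of the aggregate fill calculation discussed here;
# the size values and the near-empty cutoff are assumptions.
MAX_BLOCK_SIZE = 1_000_000

def fill_percentage(block_sizes, exclude_empty=False):
    """Aggregate fill = total bytes used / total bytes available."""
    if exclude_empty:
        # Treat near-empty (coinbase-only) blocks as NOOPs.
        block_sizes = [s for s in block_sizes if s > 1_000]
    if not block_sizes:
        return 0.0
    used = sum(block_sizes)
    available = MAX_BLOCK_SIZE * len(block_sizes)
    return 100.0 * used / available

# Seven nearly full blocks plus three empty blocks from early mining:
sizes = [990_000] * 7 + [250] * 3
print(round(fill_percentage(sizes), 1))                      # 69.3
print(round(fill_percentage(sizes, exclude_empty=True), 1))  # 99.0
```

The example numbers reproduce the dilution complained about earlier in the thread: three empty blocks drag an honest ~99% fill down to under 70%.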
|
|
|
Shill? The desperation is showing. I'm going to stop including 1-transaction blocks in the full-block calculations, though. There were three out of six (all F2Pool) when I looked yesterday. That's skewing things way down.
|
|
|
Which only means that it must really suck for them that the master himself thought of max blocksize as a trivially easily changeable spam prevention measure, not some grand economic variable that must be protected by the spilling of blood of free Randian Übermenschen once in a while.
Yes. This is really my point. I'm not saying that we should increase it because Satoshi said we should; I'm saying that it is an arbitrary limit with no meaning, chosen to be so high that it would not interfere with regular transactions (which will no longer be the case shortly). Something like doubling the limit would have zero effect in the immediate term but would avoid us running head-first into a wall in 3 months or so.
|
|
|
In a stunning display of courage, and with an unwavering commitment to the forging of the truth in the crucible of the dialectical method... Our friend, and highly qualified mentor, has graciously offered to continue the debate of these important ideas rather than retreat like a coward to the wizard's irc clubhouse.
Like those who have not #ragequit under adversity before... may his example continue to inspire us in our efforts to fully realize the potential of this bottom-up, community driven effort towards changing the world through a fully decentralized, dialup and raspberryPi compatible, layer 1 settlement network.
Not fair! I'm still recovering from new years and now I have to open a bottle of Champagne this soon?
|
|
|
Trying to fit more data into the same 1MB is work. Trying to offload txs until they self-cancel, so that the network can scale better is work. Just upping a number is lame.
TIL Rube Goldberg has descendants. Why do something the easy way when you can complicate things and introduce risks of failure? That it *is* lame and it *is* trivial is what makes it so exasperating that the opposition to it works so hard against it. It can be phased in, like: if (blocknumber > 115000) maxblocksize = largerlimit
It can be included in versions well in advance, so that by the time the chain reaches that block number and the change goes into effect, the older versions that don't have it are already obsolete.
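The phase-in idea from the pseudocode above can be sketched as follows. This is a toy illustration only: the activation height 115000 comes from the post, while `LARGER_LIMIT` and the function name are assumed values, not any real proposal's parameters:

```python
# Sketch of a height-gated consensus rule change, per the thread's
# "if (blocknumber > 115000) maxblocksize = largerlimit" pseudocode.
OLD_LIMIT = 1_000_000
LARGER_LIMIT = 2_000_000      # assumed value for "largerlimit"
ACTIVATION_HEIGHT = 115_000   # height taken from the post above

def max_block_size(block_number):
    # Old rule applies up to and including the flag height; new limit after.
    if block_number > ACTIVATION_HEIGHT:
        return LARGER_LIMIT
    return OLD_LIMIT

print(max_block_size(115_000))  # 1000000
print(max_block_size(115_001))  # 2000000
```

Since every node evaluates the same deterministic height check, nodes upgraded ahead of time all switch rules at the same block, which is what makes the advance-deployment argument work.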
|
|
|
The WAP2 links should be blocked from indexing by search engines. That hasn't been done: even the Google result that found my post below returned the wap2 page. theymos can make a robots.txt with these lines added:
User-agent: *
...(other stuff)
Disallow: /index.php?action=printpage
Disallow: /index.php?wap2
Yep. I don't see any reason not to make this small change. Well, with the new forum software coming real soon now... Oh wait, this was over a year ago and it's still doing it.
|
|
|
I do not feel like I have much of a stake in either direction, but it is possible that I am siding a bit with the concept that there is some legitimacy to maintaining the status quo, and accordingly the burden is on those who want to change it to convince the rest to change and how to change, etc etc....
The status quo so far is that (apart from certain occasions) there has been plenty of space for all transactions that needed to get into a block, without needing excessive fees. Full blocks are something the bitcoin ecosystem is not familiar with, and they would be a big change in operational status.
|
|
|
For completeness, I would add that 'Big Blocker' Prime, Gavin, made some strategically unwise decisions early on -- basically, being rather uncompromising wrt his 'exponential block size growth' model, which likely led to further hardening of the fronts and personally cost him a lot of sympathy and trust.
That already was a big compromise. Sometimes the problem is that the only compromise the other side will accept is you acceding to their demands utterly.
|
|
|
In that regard, there is no problem with having varying opinions about the direction forward. But when several large-block proponents argue like spoiled children about having to do x "right now" or everything is going to go to hell in a handbasket, it comes off as short-sighted at best and disingenuous at worst: bitcoin is not broken at the moment, even though plans and measures do need to be put in place to scale and prepare for the seemingly inevitable future increases in transaction volume.
Meh, the big-blockers are mostly putting forward arguments and trying to actually do something about things. The small blockers are the ones engaging in DDOS attacks, censorship (mostly led by Theymos) and other childish tricks. It's one thing to have disagreements, it's another to try and control the narrative.
|
|
|
He's a bright kid. If only he'd use his powers for good.
|
|
|
By the way, there is no excuse for the cost to be quadratic. That is one of the many crocks in the BitcoinCore implementation, that will take more crocks to work around. Like the Segregated Witnesses proposal, malleability and its partial patches, blockchain voting to increase the limit, etc..
Jorge, do you have a link to how this issue arises and, ideally, how it might be solved? I have to agree that it seems like the core devs are working on the wrong things currently. I know open source is supposed to be about scratching an itch but there are some quite serious issues that have consistently failed to be addressed with core (some of which are seeing progress in some of the alternate versions).
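The quadratic cost being referred to is the legacy signature-hashing behaviour: validating each input re-hashes a serialization of roughly the whole transaction, so total hashing work grows with (number of inputs) x (transaction size). A toy model of that scaling, with a made-up per-input size; this is an illustration of the growth rate, not Bitcoin's actual serialization:

```python
import hashlib

# Toy model of quadratic sighash cost: each input's signature hash covers
# ~the whole transaction, so hashed bytes grow as n_inputs * tx_size.
# INPUT_BYTES is an assumed rough per-input serialized size.
INPUT_BYTES = 150

def bytes_hashed(n_inputs):
    tx_size = n_inputs * INPUT_BYTES          # tx grows linearly with inputs
    total = 0
    for _ in range(n_inputs):                 # one sighash per input...
        total += tx_size                      # ...each covering ~the whole tx
        hashlib.sha256(b"\x00" * tx_size).digest()  # simulate the work
    return total

print(bytes_hashed(100))    # 1500000
print(bytes_hashed(1000))   # 150000000  -- 100x the work for 10x the inputs
```

That 100x-work-for-10x-inputs ratio is the O(n^2) blow-up: a deliberately constructed large transaction can make validation disproportionately expensive.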
|
|
|
Some miners might process junk for free* or near-zero cost. Others won't even process higher fee txs. As it stands, the answer is "it depends".
* I recently consolidated some of my dust into one address for free.
Those zero-cost transactions are heavily subsidized by the block reward currently, and that subsidy is going away in time. Fees are an afterthought at the moment.
|
|
|
WTF is that punk yellow? It is hard on the eyes. Try a royal blue approach. Don't listen to me, I am in a pre-drunk phase for the approaching new year. Just orange. It seems to be in fashion right now.
|
|
|
With my feelings of aging, 10-15% larger would be a little better.
I will see what I can do. I was just picking rough values when I did it.
|
|
|
I'd like to see a business start to pay people for running nodes. Maybe they can get advertisers to pay them money to cover the cost. I end up running a node for a few weeks and then stop, since I get nothing.
They could already pay Amazon to run a node. What would be interesting would be to have a method for rewarding people for running nodes. Possibly by putting a Bitcoin address in the comment or something?
|
|
|
I take it that means the last block was 860KB. Getting closer.
It's actually the average of the blocks in the previous hour. Some were more. Empty blocks bring the average down too even though they don't contribute to processing transactions.
|
|
|
It's tough to get an accurate picture with a single number. Usually the sub-10-minute blocks are going to be smaller, and coinbase-only blocks are going to be tiny. If you plan for max capacity using average capacity... you're going to have issues. Yeah, that's why I'm thinking 85%. Not every block needs to be full to cause problems, and empty blocks will probably still be mined, bringing the "average" down significantly. Actual transactions / average max transactions might be a better measure. Empty blocks are probably best just ignored completely in the calculations.
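The "alarm at 85%, ignoring empty blocks" idea above can be sketched as a simple check. The 85% threshold comes from the post; the 1,000,000-byte limit, the 1,000-byte empty-block cutoff, and the function name are assumptions for illustration:

```python
# Sketch of a capacity alarm that ignores coinbase-only blocks, per the
# discussion above. Thresholds and cutoffs are assumptions.
MAX_BLOCK_SIZE = 1_000_000
ALARM_THRESHOLD = 85.0   # percent, suggested in the thread
EMPTY_CUTOFF = 1_000     # bytes; below this, treat the block as empty

def capacity_alarm(block_sizes):
    """True when non-empty blocks average >= 85% full."""
    nonempty = [s for s in block_sizes if s > EMPTY_CUTOFF]
    if not nonempty:
        return False
    fill = 100.0 * sum(nonempty) / (MAX_BLOCK_SIZE * len(nonempty))
    return fill >= ALARM_THRESHOLD

# Three busy blocks plus one empty one: ~86.3% fill excluding the empty block.
print(capacity_alarm([900_000, 870_000, 250, 820_000]))  # True
```

Including the 250-byte block in the denominator would drop the average to about 64.7% and mask the congestion, which is exactly the distortion being discussed.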
|
|
|
|