Author Topic: Block size limit automatic adjustment  (Read 13054 times)
FreeMoney (Legendary, Activity: 1246)
November 21, 2010, 10:53:57 PM  #21

Wait, so what stops an attacker from generating a block with one million or more spam transactions?

Nothing, but it stops attackers from drowning legitimate transactions in junk inside the normal blocks. Are gargantuan, entirely phony blocks really a problem? They would be expensive to produce and wouldn't be long-lived, as they are extremely hard not to spot, and no generator in their right mind would continue building the chain from one of them. They would lose the income from any subsequent blocks when everyone else ditches the offending block. So that should take care of itself through generator self-interest.


Hmm, okay, outrageous blocks containing only junk would be easy to spot. But what about a badly oversized block, say 100 kB when the average is 10 kB, that contains mostly junk but also the legitimate transactions that had been received? Maybe now some will reject it and some will not? If there is no uniform rule there will be splits all over the place. Even normal users will be affected: if their transaction is in a badly oversized block, do they just hope it will stay? Or hope it will not be accepted and send again? I think there must be a max block size to avoid this.

Also, the "public good of difficulty" comment made me realize that block size may need to be "artificially" limited in some way. But I think updating it along with difficulty, to slightly more than the average size of the previous 2016 blocks but never decreasing, is a reasonable way to do it.
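A minimal sketch of that adjustment rule (the function name, the 10% headroom, and the example sizes are illustrative assumptions, not something specified in the thread):

```python
# Sketch of the proposed rule: every 2016 blocks, set the max size to
# slightly more than the average of the previous window, never decreasing.

RETARGET_INTERVAL = 2016   # blocks, same window as difficulty
HEADROOM = 1.1             # "slightly more" than the average (assumed 10%)

def adjust_max_block_size(current_max, recent_block_sizes):
    """Return the new max block size after one retarget window."""
    assert len(recent_block_sizes) == RETARGET_INTERVAL
    average = sum(recent_block_sizes) / RETARGET_INTERVAL
    proposed = int(average * HEADROOM)
    # "never decreasing": keep the old limit if the proposal is smaller
    return max(current_max, proposed)

# Example: average ~10 kB blocks leave a 50 kB limit alone; sustained
# 60 kB blocks would raise it to 66 kB.
print(adjust_max_block_size(50_000, [10_000] * 2016))  # 50000
print(adjust_max_block_size(50_000, [60_000] * 2016))  # 66000
```

Note that the "never decreasing" clause is exactly what the later posts in this thread attack: any one-time spike ratchets the limit up permanently.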

db (Sr. Member, Activity: 279)
November 21, 2010, 11:25:28 PM  #22

Hmm, okay, outrageous blocks containing only junk would be easy to spot. But what about a badly oversized block, say 100 kB when the average is 10 kB, that contains mostly junk but also the legitimate transactions that had been received? Maybe now some will reject it and some will not? If there is no uniform rule there will be splits all over the place. Even normal users will be affected: if their transaction is in a badly oversized block, do they just hope it will stay? Or hope it will not be accepted and send again? I think there must be a max block size to avoid this.
Too expensive. But it doesn't matter. This made me realize the whole idea won't work anyway. Generators could just pad their blocks with transactions to themselves with fees set so that they can include as many transactions as they want, i.e. all of them.

Also, the "public good of difficulty" comment made me realize that block size may need to be "artificially" limited in some way. But I think updating it along with difficulty, to slightly more than the average size of the previous 2016 blocks but never decreasing, is a reasonable way to do it.
The blocks would quickly grow too large.
db (Sr. Member, Activity: 279)
November 21, 2010, 11:32:48 PM  #23

Too expensive. But it doesn't matter. This made me realize the whole idea won't work anyway. Generators could just pad their blocks with transactions to themselves with fees set so that they can include as many transactions as they want, i.e. all of them.

Which could be prevented if other generators ignore new blocks with lots of unpublished transactions. But that feels a little messy.
MoonShadow (Legendary, Activity: 1666)
November 21, 2010, 11:52:36 PM  #24

Wait, so what stops an attacker from generating a block with one million or more spam transactions?

Nothing, but it stops attackers from drowning legitimate transactions in junk inside the normal blocks. Are gargantuan, entirely phony blocks really a problem? They would be expensive to produce and wouldn't be long-lived, as they are extremely hard not to spot, and no generator in their right mind would continue building the chain from one of them. They would lose the income from any subsequent blocks when everyone else ditches the offending block. So that should take care of itself through generator self-interest.


There is no 'dropping' a valid block, spamming or not.

db (Sr. Member, Activity: 279)
November 22, 2010, 12:08:33 AM  #25

There is no 'dropping' a valid block, spamming or not.

Sure there is. Just ignore it and continue building the chain from the previous block.
MoonShadow (Legendary, Activity: 1666)
November 22, 2010, 12:13:42 AM  #26

There is no 'dropping' a valid block, spamming or not.

Sure there is. Just ignore it and continue building the chain from the previous block.


Then you have created a new rule that will split the network. Part of the point of agreeing in advance on a common set of network rules is to avoid regularly splitting the chain.

db (Sr. Member, Activity: 279)
November 22, 2010, 12:20:26 AM  #27

Then you have created a new rule that will split the network. Part of the point of agreeing in advance on a common set of network rules is to avoid regularly splitting the chain.

Yes, but these particular splits would be very small and unnoticeable for the normal user.
RHorning (Full Member, Activity: 210)
November 22, 2010, 04:08:34 AM  #28

Then you have created a new rule that will split the network. Part of the point of agreeing in advance on a common set of network rules is to avoid regularly splitting the chain.

Yes, but these particular splits would be very small and unnoticeable for the normal user.


If some "normal user" happened to get a transaction adopted into one of these forks, they'd sure notice.

What decides which side of a chain split is accepted is the 51% of the CPU power. This isn't even theoretical speculation, as there have been similar chain splits in the network already, most notably when the clients upgraded from 0.3.9 to 0.3.10. The "bad" transactions were thrown into "good" blocks, and those blocks were rejected as falling outside the rules by some of the generators but accepted by others. Yes, it created a mess, but what I'm saying is that the network has already dealt with this situation, and it passed with flying colors.

Of course, warning messages had to be passed around for everybody to "know" which chain was more likely to be permanently accepted by the network, as happened with the upgrade of the clients and generators. This is also why there is still a warning not to use clients prior to 0.3.10: they are missing some of the rules that stopped what appears to have been an attack on the network.

BTW, otherwise-"valid" blocks were dropped because they were included in the "wrong" chain, and unfortunately this did include a few legitimate transactions. Not many transactions were lost, as warning messages were sent out while it was a problem.

There is no 'dropping' a valid block, spamming or not.

Sure there is. Just ignore it and continue building the chain from the previous block.


Then you have created a new rule that will split the network. Part of the point of agreeing in advance on a common set of network rules is to avoid regularly splitting the chain.

Agreed, but that doesn't imply that the rules of the network must always stay the same either. The main point is that most of the network must agree on the same rules, and if the rules change it must be for something seen as genuinely necessary to keep the network running... stopping spamming or some other attack on the network would be the most logical reasons for adding rules. This is similar to other networking protocols, which change from time to time, sometimes because of malicious attacks on the network.

The reason to deal with this issue now, rather than later, is that we can still talk objectively about what solutions or algorithms we might want to implement to resolve it. If there is huge pressure because transactions are piling up and transaction fees are escalating as a result, any changes to the algorithm and network protocols will be seen as a huge advantage to one group or another, and it will become a political process instead.

Politics and computer programming don't mix very well.

theymos (Administrator, Legendary, Activity: 2422)
November 22, 2010, 05:02:37 AM  #29

Not many transactions were lost, as the warning messages were sent out that it was a problem at the time.

IIRC, the legitimate chain overtook the "contaminated" chain within the 100-block maturation time, so all transactions were ported to the new chain (except for the illegal ones).

Chain forks are not inherently bad. If the network disagrees about a policy, then a split is good. The better policy will win. If block forks start happening a lot, it would be simple to consider a transaction unconfirmed if it relies on a generation that isn't 500 blocks deep or whatever.
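That stricter confirmation rule could look something like this sketch (the function and its names are my own illustration; only the 500-block depth figure, hedged as "or whatever", comes from the post):

```python
# Treat a transaction as unconfirmed if any coin it spends traces back to
# a coinbase ("generation") that is not yet deep enough below the tip.

GENERATION_DEPTH = 500  # the "500 blocks deep or whatever" from the post

def is_confirmed(tx_generation_heights, tip_height, depth=GENERATION_DEPTH):
    """tx_generation_heights: heights of the coinbases this tx descends from."""
    return all(tip_height - h >= depth for h in tx_generation_heights)

print(is_confirmed([100, 200], tip_height=1000))  # True: both 500+ blocks deep
print(is_confirmed([600], tip_height=1000))       # False: only 400 blocks deep
```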

caveden (Legendary, Activity: 1106)
November 22, 2010, 08:48:17 AM  #30

This made me realize the whole idea won't work anyway. Generators could just pad their blocks with transactions to themselves with fees set so that they can include as many transactions as they want, i.e. all of them.

There is no economic incentive to flood. Actually, you can only do it in the blocks you create; otherwise you have to pay fees for it.
So flooding would only be done by silly people trying to attack the system. They would hardly be numerous enough to cause what you say here:

The blocks would quickly grow too large.

That would only happen if flooders were numerous, which I doubt. Not to mention that, if the max block size is "just right", there will always be quite a good number of paying transactions to add; maybe enough to fill the block with. Seen this way, there is an incentive not to flood.

caveden (Legendary, Activity: 1106)
November 22, 2010, 08:53:13 AM  #31

Not to mention that the larger the block, the longer it takes to propagate through the network, which I suppose slightly increases the chance that a block generated by somebody else propagates faster. A really tiny chance, but anyway, it's another counter-incentive to flooding...
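A back-of-the-envelope way to see this counter-incentive (every number here, the bandwidth figure included, is an assumption for illustration only): model block arrivals as a Poisson process at one per 600 seconds, so a propagation delay of t seconds adds roughly 1 - e^(-t/600) of extra orphan risk.

```python
import math

BLOCK_INTERVAL = 600.0  # seconds, average time between blocks

def orphan_risk(block_bytes, bandwidth_bps=1_000_000):
    """Rough extra orphan probability from propagation delay alone."""
    delay = block_bytes * 8 / bandwidth_bps   # seconds to transmit once
    return 1 - math.exp(-delay / BLOCK_INTERVAL)

# At an assumed ~1 Mbit/s, a 10 kB block risks roughly 0.013% and a
# 1 MB block roughly 1.3%: tiny, but a real counter-incentive to padding.
for size in (10_000, 100_000, 1_000_000):
    print(size, round(orphan_risk(size), 5))
```

The single-hop transmit time understates real gossip delay across many hops, so the true penalty for oversized blocks is somewhat larger than this sketch suggests.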

db (Sr. Member, Activity: 279)
November 22, 2010, 10:07:53 AM  #32

This made me realize the whole idea won't work anyway. Generators could just pad their blocks with transactions to themselves with fees set so that they can include as many transactions as they want, i.e. all of them.

There is no economic incentive to flood. Actually, you can only do it in the blocks you create; otherwise you have to pay fees for it.
So flooding would only be done by silly people trying to attack the system.

Definitely; the worry wasn't flooding but circumventing the artificial scarcity keeping transaction fees above zero.

They would hardly be numerous enough to cause what you say here:

The blocks would quickly grow too large.

That would only happen if flooders were numerous, which I doubt. Not to mention that, if the max block size is "just right", there will always be quite a good number of paying transactions to add; maybe enough to fill the block with. Seen this way, there is an incentive not to flood.

Again, the worry wasn't flooding but keeping the block size small enough to support transaction fees. But anyway, under that scheme, wouldn't it take just one person who fills every block with free transactions to make the block size grow exponentially?
caveden (Legendary, Activity: 1106)
November 22, 2010, 12:45:30 PM  #33

I see: you think people could push the limit up to be sure it is always big enough to fit every transaction, thereby collecting more fees. In the long run that would be bad for the generators themselves, though, as fee values would fall.
I'm not even sure this is attractive to the generator in the short run either. He wouldn't collect more fees in the "flooded" blocks he generates, and he has no real guarantee of being able to do so in future blocks either.

caveden (Legendary, Activity: 1106)
November 22, 2010, 01:04:29 PM  #34

But anyway, under that scheme, wouldn't it take just one person who fills every block with free transactions to make the block size grow exponentially?

Regarding free transactions, I don't see why somebody would accept them, as long as the client gives users the option to add fees to their transactions.
A generator could add free or dummy transactions to his own blocks with the intent of pushing the limit up, but a single person wouldn't be very effective at increasing it, as s/he wouldn't be able to generate enough blocks. If the adjustment periods are short, a single person might not even generate one block per period, so in the end s/he would be harmless, as the limit would fall back.
Only an attacker with strong computing power could push the limit up considerably. And I don't see much incentive to use that much computing power this way... do you?
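A toy simulation of this argument (every parameter is an assumption, and it uses a variant of the adjustment rule that can fall back, unlike the never-decreasing version proposed earlier): one miner with a small share of the hashrate pads every block it finds to the current limit, while everyone else produces ~10 kB blocks, and the limit barely moves.

```python
import random

def simulate(periods=50, blocks_per_period=144, attacker_share=0.05,
             normal_size=10_000, start_limit=20_000, headroom=1.1, seed=1):
    """Limit after `periods` adjustments with one padding miner present."""
    random.seed(seed)
    limit = start_limit
    for _ in range(periods):
        # each block is the attacker's (padded to the limit) with
        # probability equal to the attacker's hashrate share
        sizes = [limit if random.random() < attacker_share else normal_size
                 for _ in range(blocks_per_period)]
        limit = int(headroom * sum(sizes) / blocks_per_period)
    return limit

print(simulate())  # settles near 1.1x the honest 10 kB average
```

The fixed point works out to roughly 11 kB here: with only a 5% share, the attacker's padded blocks barely move the window average, so the limit cannot ratchet away from what honest traffic justifies.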

db (Sr. Member, Activity: 279)
November 22, 2010, 01:12:24 PM  #35

I see: you think people could push the limit up to be sure it is always big enough to fit every transaction, thereby collecting more fees. In the long run that would be bad for the generators themselves, though, as fee values would fall.
I'm not even sure this is attractive to the generator in the short run either. He wouldn't collect more fees in the "flooded" blocks he generates, and he has no real guarantee of being able to do so in future blocks either.

Not quite. A max block size that grows to accommodate all transactions won't impose scarcity even without flooding. And, unrelated: a single attacker could cheaply mount a massive flooding attack, not for profit but out of malice.

In the rule scheme without a max block size, the problem is that a generator could game the rule to fit more transactions into each individual block it generates, if the generator is allowed to include unpublished transactions to itself.
db (Sr. Member, Activity: 279)
November 22, 2010, 01:16:48 PM  #36

Regarding free transactions, I don't see why somebody would accept them, as long as the client gives users the option to add fees to their transactions.

Possibly not. But what about very low fee transactions?
caveden (Legendary, Activity: 1106)
November 22, 2010, 02:04:03 PM  #37

Well, I do think there should be some blocks with enough free space to fit all the almost-free transactions once in a while... the adjustment shouldn't leave every block filled up.
I suppose transfers don't happen uniformly over the 24 hours of a day, 7 days a week. So if the limit is "just right", there will be full blocks, and there will be blocks with free space where even free transactions could get in if the generator doesn't mind.

ShadowOfHarbringer (Legendary, Activity: 1470)
November 22, 2010, 02:13:14 PM  #38

I think I agree with caveden: having such an important constant hardcoded in Bitcoin may be devastating at some point, when the network changes significantly or grows much larger than it is now.
Generally, almost every important value in the core of Bitcoin's algorithms should be a non-constant, elastic variable that can adapt to changes.

Anyway, I would still really like to see Satoshi's and Gavin's opinions on this.

ByteCoin (Sr. Member, Activity: 416)
November 26, 2010, 02:13:48 AM  #39

To clarify, the block size limit is the size beyond which a received block will be rejected by a client just because it's too big.

I agree with caveden that having a fixed block size limit could cause problems in future.

Let's consider the scenario in which Bitcoin becomes popular and the non-spam transaction rate starts to rise. The current fees and priority scheme is fine until the size of the fees required becomes a disincentive for new users to start using Bitcoin. The miners must choose between taking a smaller fee from a given transaction or maintaining their fee schedule and effectively turning away lots of new users perhaps to other competing cryptographic currency schemes.
I think it's reasonable to imagine that everyone will decide to drop fees to a level that encourages the widest possible adoption of Bitcoin until other limiting factors (such as network bandwidth) come into play.
So with the reduced fees, block sizes increase until blocks get rejected by old clients with lower hard block size limits. Those clients can't relay the new blocks, so new clients would have to connect only to other new clients. Miners that reject the large blocks would continue to build a block chain of "normal"-sized blocks. As soon as transactions start to refer to coins in new large blocks, the old clients would reject those transactions, and those coins could be double-spent on the "old" client network. I don't think this would be pretty.

The ostensible reason for hard block limits is to prevent spam. As ribuck mentions, current spam attacks have two effects, one which you can see and one that you can't. You can see block sizes rising, but this counteracts the less visible problem of your transaction cache filling up with spam transactions. I believe memory exhaustion from the transaction cache filling up will be the main problem with spam attacks, so large blocks that remove lots of transactions from it will mitigate it. The real solution to spam is "shunning", which I will outline in another post. I believe having any block limits is likely to exacerbate the adverse effects of spam transactions.

As FreeMoney observes, in the absence of block limits there's nothing to stop a miner from including arbitrary amounts of its own spam transactions in a block. This is true. However, it's certainly not in a non-generating client's interest to reject the block, even if it only removes a few transactions from the cache. Rather, the onus is on the other miners to notice that the new block does not remove enough transactions from the cache and to reject it. They will then build the longer chain while ignoring that block, which will become an orphan. Hence the spamming miner is punished.
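That miner policy could be sketched like this (purely illustrative: ByteCoin gives no numbers, so the 50% threshold and all names are assumptions):

```python
# Accept a new block only if it drains "enough" of the transactions we
# have cached; a block stuffed with the miner's own unpublished
# transactions removes almost nothing from our cache and gets orphaned.

MIN_CLEARED_FRACTION = 0.5  # assumed threshold

def block_removes_enough(block_txids, cached_txids,
                         threshold=MIN_CLEARED_FRACTION):
    """Is building on this block worthwhile, under this policy?"""
    if not cached_txids:
        return True  # nothing pending, nothing to complain about
    cleared = len(cached_txids & set(block_txids))
    return cleared / len(cached_txids) >= threshold

cache = {"a", "b", "c", "d"}
print(block_removes_enough(["a", "b", "c", "x", "y"], cache))  # True
print(block_removes_enough(["x", "y", "z"], cache))            # False
```

One weakness of any such local rule is the split risk discussed earlier in the thread: miners with different caches, or different thresholds, can disagree about the same block.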

The moral of this story is that the non-generating clients operate on the network at the pleasure of the miners. The miners are effectively in control of the "health" of the network and the current block size limits reflect that. So for example block http://blockexplorer.com/b/92037 is about 200455 bytes long and mostly contains spam. Normal blocks max out at 50k. This shows that at least one generator has chosen to waive the current fees scheme. I think that letting miners effectively decide their own fees scheme will be seen to be the least bad option.

ByteCoin
asdf (Hero Member, Activity: 527)
November 26, 2010, 07:50:38 AM  #40

The moral of this story is that the non-generating clients operate on the network at the pleasure of the miners. The miners are effectively in control of the "health" of the network and the current block size limits reflect that. So for example block http://blockexplorer.com/b/92037 is about 200455 bytes long and mostly contains spam. Normal blocks max out at 50k. This shows that at least one generator has chosen to waive the current fees scheme. I think that letting miners effectively decide their own fees scheme will be seen to be the least bad option.

We came to a similar conclusion in this thread:
http://bitcointalk.org/index.php?topic=1847.0;all
My concern is I don't see any inherent force that will stabilize transaction fees.

Generators have the ability to accept any transactions they see fit, as well as to reject any block that doesn't adhere to their "ethics". The question is: will this game result in an oligopoly of price-gouging generators, will it result in a dead market where no one generates, or will competing forces reach a common ground of a fair, stable fee structure?

I'm not smart enough to figure this out. I wish Satoshi would weigh in on this issue. I suspect he may have already envisioned the outcome.