Bitcoin Forum
Author Topic: Scaling bitcoin: the elephant in the room  (Read 3362 times)
monsterer2 (OP)
Full Member | Activity: 351 | Merit: 134
July 21, 2017, 06:44:38 AM  #21

Good thought.

Now explain it for 10000 blocks ready for the position of the 101st block.

Sorry, I don't understand the question?

I think he means that there are 10000 blocks (think transactions) and every one of them wants to get onto the main chain. How long would it take? I mean, if the main block time were 1 sec that could be possible, as in your example, after adding one block to the main chain, every transaction would have to solve the puzzle again with a new header. That would take forever to include a single block. Or did I misunderstand your proposition?

Ok, so in the pathological case that 10000 users all begin sending a transaction at the same time, and by some miracle all their PoW are solved at exactly the same time, and no other transactions arrive during this period, then yes, you would temporarily have a 10000-wide set of single-block forks. But unless this set of circumstances continues, resolution will happen naturally as miners with differently chosen difficulties solve blocks, making one branch the leader in terms of cumulative difficulty.

The way to visualise this is as a tree which expands in width as transaction throughput increases; further back in time, the trunk is much narrower, as a single best path of blocks back to the genesis will be visible, with the forks preserved by uncle references (other branches of the tree).
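The cumulative-difficulty selection described above can be sketched in a few lines. The block names, difficulty values, and tree shape below are illustrative assumptions, not part of the proposal:

```python
# Illustrative sketch of the fork-choice rule described above: among all
# branch tips, the leader is the one with the largest cumulative difficulty.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Block:
    name: str
    difficulty: int                      # work solved by this block
    parent: Optional["Block"] = None

def cumulative_difficulty(tip: Block) -> int:
    """Total work on the path from this tip back to genesis."""
    total, b = 0, tip
    while b is not None:
        total += b.difficulty
        b = b.parent
    return total

def best_tip(tips: list) -> Block:
    """Pick the tip whose branch carries the most cumulative work."""
    return max(tips, key=cumulative_difficulty)

genesis = Block("genesis", 0)
# Many low-difficulty user blocks fork off genesis at the same time...
forks = [Block(f"user{i}", difficulty=1, parent=genesis) for i in range(5)]
# ...then a dedicated miner extends one of them with a high-difficulty block.
miner = Block("miner", difficulty=100, parent=forks[2])

print(best_tip(forks + [miner]).name)  # prints "miner"
```

Once the heavy block appears, every tip comparison favours its branch, which is why the tree narrows back toward a single trunk.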

Cheers, Paul.
sonulrk
Full Member | Activity: 266 | Merit: 100
July 21, 2017, 07:04:11 AM  #22

Good thought.

Now explain it for 10000 blocks ready for the position of the 101st block.

Sorry, I don't understand the question?

I think he means that there are 10000 blocks (think transactions) and every one of them wants to get onto the main chain. How long would it take? I mean, if the main block time were 1 sec that could be possible, as in your example, after adding one block to the main chain, every transaction would have to solve the puzzle again with a new header. That would take forever to include a single block. Or did I misunderstand your proposition?

Ok, so in the pathological case that 10000 users all begin sending a transaction at the same time, and by some miracle all their PoW are solved at exactly the same time, and no other transactions arrive during this period, then yes, you would temporarily have a 10000-wide set of single-block forks. But unless this set of circumstances continues, resolution will happen naturally as miners with differently chosen difficulties solve blocks, making one branch the leader in terms of cumulative difficulty.

The way to visualise this is as a tree which expands in width as transaction throughput increases; further back in time, the trunk is much narrower, as a single best path of blocks back to the genesis will be visible, with the forks preserved by uncle references (other branches of the tree).

Cheers, Paul.

You are getting me, but what if there is a continuous stream of thousands of newly solved blocks: how would the blockchain add new blocks then? I am not saying there are 10000 blocks at any one time, but since users can set their difficulty very low to solve blocks, they will propagate their new blocks as soon as possible (yes, they get a little reward, but a swift transaction; just think about a normal user). Isn't it the case that network speed, latency, etc. play a very big role then?
monsterer2 (OP)
Full Member | Activity: 351 | Merit: 134
July 21, 2017, 07:11:24 AM  #23

You are getting me, but what if there is a continuous stream of thousands of newly solved blocks: how would the blockchain add new blocks then? I am not saying there are 10000 blocks at any one time, but since users can set their difficulty very low to solve blocks, they will propagate their new blocks as soon as possible (yes, they get a little reward, but a swift transaction; just think about a normal user). Isn't it the case that network speed, latency, etc. play a very big role then?

Network latency plays a very big part. When transaction throughput is high, the width of the top of the tree grows, but it'll shrink back down in periods where throughput is lower.

As miners who are in it for the money contribute their blocks, they greatly shrink the width of the tree, because their blocks have a lot of 'weight' in terms of the difficulty solved; when one of these blocks appears, everyone will tend to extend from it.

edit: think of the width of the tree as being similar to a dynamically adjusting block size in normal bitcoin.
dfebrero
Newbie | Activity: 49 | Merit: 0
July 21, 2017, 09:37:06 AM  #24

Did you get hacked through email recovery or a password leak?
I think a moderator changed the email for my account via a "fake" request
https://bitcointalk.org/index.php?topic=2030689.msg20221556#msg20221556
Sorry for the offtopic.

Not very reassuring that a moderator can do that Shocked Thank you for the explanation
hv_
Legendary | Activity: 2506 | Merit: 1055
Clean Code and Scale
July 24, 2017, 07:33:07 PM  #25

How does the wallet=miner get all the required data, like the last block, the UTXO set, ...? Only SPV style?

Carpe diem  -  understand the White Paper and mine honest.
Fix real world issues: Check out b-vote.com
The simple way is the genius way - Satoshi's Rules: humana veris _
monsterer2 (OP)
Full Member | Activity: 351 | Merit: 134
July 25, 2017, 06:47:22 AM  #26

How does the wallet=miner get all the required data, like the last block, the UTXO set, ...? Only SPV style?

SPV clients will continue to work as normal. They can still mine blocks; it won't matter if they don't place their blocks right at the latest head block, because their blocks will get referenced as uncles by other, more dedicated miners.
hv_
Legendary | Activity: 2506 | Merit: 1055
July 25, 2017, 07:24:57 AM  #27

How does the wallet=miner get all the required data, like the last block, the UTXO set, ...? Only SPV style?

SPV clients will continue to work as normal. They can still mine blocks; it won't matter if they don't place their blocks right at the latest head block, because their blocks will get referenced as uncles by other, more dedicated miners.

So the more dedicated miners are the common miners we know today?

What is the difference to a 0-conf TX, then?

monsterer2 (OP)
Full Member | Activity: 351 | Merit: 134
July 25, 2017, 08:33:14 AM  #28

So the more dedicated miners are the common miners we know today?

What is the difference to a 0-conf TX, then?

Full nodes will likely be mining for profit, because they need up-to-date info on the latest blocks to make sure they get paid. The difference is that you won't need a mining farm to earn bitcoins anymore; even SPV clients can potentially earn a small block reward if they place their blocks on the longest chain and mine with sufficient difficulty.
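The proposal's "block reward proportional to chosen difficulty (with a spam-preventing minimum and a capped maximum)" can be sketched roughly as follows; the three constants are made-up placeholders, not specified values:

```python
# Hedged sketch of difficulty-proportional rewards: SPV users pick a low
# difficulty for a fast transaction and a tiny reward, dedicated miners
# pick the maximum. All constants here are illustrative assumptions.
MIN_DIFFICULTY = 10          # below this, a block is rejected as spam
MAX_DIFFICULTY = 1_000_000   # cap on reward-earning difficulty
REWARD_PER_UNIT = 0.0001     # coins per unit of difficulty solved

def block_reward(chosen_difficulty: int) -> float:
    if chosen_difficulty < MIN_DIFFICULTY:
        raise ValueError("difficulty below the spam-prevention minimum")
    return min(chosen_difficulty, MAX_DIFFICULTY) * REWARD_PER_UNIT

print(block_reward(MIN_DIFFICULTY))   # SPV user: tiny reward, fast tx
print(block_reward(MAX_DIFFICULTY))   # dedicated miner: full reward
```

The cap means pushing difficulty beyond the maximum earns nothing extra, and the floor keeps trivially cheap blocks from flooding the tree.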
hv_
Legendary | Activity: 2506 | Merit: 1055
September 12, 2017, 02:07:03 PM  #29

I was just thinking about SPV nodes / mining a bit more, and was reading about / extending an idea about enriching SPVs with a minimum random part of other TXs / blockchain parts.

If we had enough such SPV + random nodes, the sum of their parts would do the job of many 'full' nodes.

Further: if mining could also be done in such an SPV + random style, like:

You can only send (+ pre-mine) your own TX if you 'atomically' mine a couple of other TXs (at least 2). Then mining could be fully decentralized as well, and scaling / storage would be no issue, at least if there are also enough merchants / miners with huge SPV + random nodes, where the random part might be a very large one for some bigger and riskier entities.

Any comments?

chiller
Full Member | Activity: 129 | Merit: 101
September 22, 2017, 08:08:59 AM  #30

Well, it looks like there's a plan; now it's time to actually execute it. Maybe someone will jump on board as a contributor if there's a proof of concept to develop further.

Borilla
Jr. Member | Activity: 83 | Merit: 1
September 22, 2017, 12:20:26 PM  #31

Could you write it down, instead of having us go through questions and answers?
Fuserleer
Legendary | Activity: 1050 | Merit: 1016
September 23, 2017, 01:14:46 AM (last edit: September 23, 2017, 02:05:40 AM by Fuserleer)  #32
Merited by ABCbits (11), DooMAD (5)

You just explained exactly a Block-Tree, which I developed back in 2012 and which the original eMunie project used.

It doesn't provide a sufficient scalability improvement over a blockchain, and it was subsequently dumped in 2014.

The overhead required to maintain the global state, so that everyone knows there are n child blocks to a parent block, becomes quite extreme at scale. This kills performance past that point.

If you don't keep all nodes consistent to the very latest information (CAP theorem gets in the way a bit here) then two problems arise:

1) Nodes may not know what the main branch is, because they don't have all the state information, and may reference parents in weaker branches by mistake. If that happens, your main branch becomes weaker, because hash power is inadvertently distributed across many branches and lots of miners don't get rewarded.

2) New blocks that are children of old parents will be created and won't be included in the uncle list of the next real block of that parent, due to that block having been created already.

Your diagram shows exactly this (by coincidence, I assume):

[diagram image not preserved]

Block(X) is not referenced by other blocks; maybe it came in late. There is also no guarantee if or when it will be referenced by any future blocks, and it therefore poses a double-spend security risk.

You also have the problem that a dishonest miner can throw in a block on a recent parent with more PoW than the other blocks referencing that parent and those which come after it. That then becomes the branch with the most work. Therefore, any inputs in a block higher up can be re-presented (spent again) in a block that is a child of that new block. If the input that is seen first is considered the legit spend, and those after it are considered double spends, well... double spends can happen.

Radix - DLT x.0

Web - http://radix.global  Forums - http://forum.radix.global Twitter - @radixdlt
monsterer2 (OP)
Full Member | Activity: 351 | Merit: 134
October 02, 2017, 01:23:26 PM  #33

Hi Fuserleer,

I largely agree with your assessment, but I think the only real problem here is making sure that nodes with old data don't end up posting transactions which are never included in the main ordering. There needs to be some incentive for miners to reference previously unreferenced blocks; perhaps something like Ethereum's uncle reward.

The other issues you've outlined are minor, I think?

Cheers, Paul.
monsterer2 (OP)
Full Member | Activity: 351 | Merit: 134
January 25, 2018, 10:56:13 AM  #34

I've been thinking about this more recently, because no-one has come up with a real replacement or workable improvement for bitcoin yet...

The primary problem in my proposed design is the incentive for miners to reference orphaned blocks or branches: in the initial proposal there is no incentive, and in fact it would be more profitable for miners to ignore uncles and just concentrate on generating high-difficulty PoW blocks.

I propose as a solution to this problem that miners get rewarded for 'information gain', defined as the sum of the block rewards of previously unreferenced, or orphaned, branches which a newly mined block includes in the main ordering via an uncle reference.

By using the block rewards of orphaned blocks as the reward, it isn't advantageous for a miner to purposefully generate a bunch of orphaned blocks, because the reward he will get from including them later is <= the reward he would get by just increasing the difficulty of his newly mined block.

In this way, it is advantageous for miners to add information to the main ordering, which will not only prevent orphaned, or stuck transactions, but also decrease the time it takes for a transaction to become 'confirmed'.
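A minimal sketch of the 'information gain' bookkeeping, assuming branches carry per-block rewards and are identified by arbitrary ids (all names and values illustrative):

```python
# Sketch of the 'information gain' reward: a newly mined block's bonus is
# the sum of the block rewards in the previously unreferenced (orphaned)
# branches it brings into the main ordering via uncle references.

def information_gain(referenced_branches: dict, already_included: set) -> float:
    """Sum the rewards of branches newly joined to the main ordering."""
    gain = 0.0
    for branch_id, block_rewards in referenced_branches.items():
        if branch_id not in already_included:
            gain += sum(block_rewards)
    return gain

# Two orphaned single-block branches (rewards 0.5 and 0.25) plus one branch
# that an earlier block already referenced, so it earns nothing again.
orphans = {"A": [0.5], "B": [0.25], "C": [1.0]}
print(information_gain(orphans, already_included={"C"}))  # prints 0.75
```

Because this bonus is at most what the same miner could earn by simply raising his own block's difficulty, deliberately manufacturing orphans to harvest later is never the more profitable strategy.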

Comments?
Anti-Cen
Member | Activity: 210 | Merit: 26
High fees = low BTC price
January 25, 2018, 06:45:17 PM  #35

Scaling of the blockchain should have been in from day one, never mind using it later as an excuse to pump up mining fees.

Global money! You must be joking, unless you think that an Intel i3 trying to read and decode a 200 GB blockchain has the processing power of an AS/400 machine.

It's like trying to get Microsoft Access, using local MDB files, to feed the world without linked tables!

Mining is CPU-wars, and Intel and AMD like it nearly as much as big oil likes miners wasting electricity. Is this what mankind has come to?
Anti-Cen
Member | Activity: 210 | Merit: 26
January 25, 2018, 09:55:42 PM  #36

I've been thinking about this more recently, because no-one has come up with a real replacement or workable improvement for bitcoin yet...

Ripple, NEO, and IOTA would fit the bill, but I exclude ETH and the clones because they are much too slow.

A single transaction leaves the wallet, and it sure seems complicated how that simple transaction gets written into the blockchain. It's also difficult to walk with one leg and a blindfold on a system that's been designed to eat up CPUs.

monsterer2 (OP)
Full Member | Activity: 351 | Merit: 134
January 26, 2018, 09:37:44 AM  #37

You basically described uncle blocks, which are used by Ethereum.

While it could reward miners and avoid wasting the resources used to mine blocks, there are some flaws, such as:
1. Increasing bitcoin production, which could raise the maximum bitcoin supply and cause inflation.
2. It could be abused by miners to earn more coins, and my 1st point would get even worse.
3. While I don't have much info, I'm sure adding uncle blocks will require more computing power to run full nodes, which could risk decentralization.

Source:
https://bitslog.wordpress.com/2016/04/28/uncle-mining-an-ethereum-consensus-protocol-flaw/

In Ethereum, uncle blocks contain redundant data. In my proposal there is never any redundant data, because every transaction sent mines itself, so the analysis in that link doesn't apply.
dinofelis
Hero Member | Activity: 770 | Merit: 629
January 26, 2018, 03:50:17 PM  #38

*) Block reward is proportional to chosen difficulty (up to a Moore's-law-based maximum and with a spam-preventing minimum)

This is a very good idea, but it shouldn't be used as a reward for consensus. I think consensus should be done freely, and then PoS is a good method (see the other thread). However, I think PoW is a good technique for coin creation (independent of the consensus decision). If coin creation is proportional to a weighted form of proof of work, where the weight is the economic cost of the proof of work (technology dependent), then we have an automatic value-control mechanism, and we are finally making a CURRENCY, not a speculative get-rich-by-hodling asset. The currency would be created more when its market price rose (it is then more interesting to spend PoW to make it than to buy it in the market). Coin issuance would stop if the price went below the cost of the PoW. In other words, PoW would work as the central bank regulating supply, and there is no problem of seigniorage, because it is wasted on PoW.

If a coin is set to cost, say, about $10 in PoW, then that coin will hover steadily around $10. If it rises, more coins are created, and this extra supply pushes the price down. If it drops, no more coins are created, and its scarcity can make the price rise again. Moreover, the expectation of a constant price will discourage speculation and promote currency usage.

But again, one should not reward consensus. 
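The supply-regulation argument above can be illustrated with a toy model; the linear price-impact rule and the constants are arbitrary assumptions made only for this sketch, not part of the claim:

```python
# Toy model of PoW-anchored issuance: if minting one coin costs a fixed
# amount of PoW (say $10), issuance turns on when the market price exceeds
# that cost and off when it falls below, nudging the price toward the cost.
POW_COST = 10.0   # dollars of work required to mint one coin
IMPACT = 0.1      # assumed price drop per coin of new supply

def step(price: float) -> float:
    if price > POW_COST:            # profitable to mint: supply expands
        minted = price - POW_COST   # assume minting scales with the margin
        price -= IMPACT * minted    # extra supply pushes the price down
    return price                    # at or below cost: no issuance

price = 15.0
for _ in range(50):
    price = step(price)
print(round(price, 2))  # prints 10.03, hovering just above the $10 cost
```

Note the asymmetry dinofelis describes: the mechanism caps upside drift directly, while the downside relies on scarcity and demand rather than any burn mechanism.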
monsterer2 (OP)
Full Member | Activity: 351 | Merit: 134
January 27, 2018, 07:15:51 AM  #39

*) Block reward is proportional to chosen difficulty (up to a Moore's-law-based maximum and with a spam-preventing minimum)

This is a very good idea, but it shouldn't be used as a reward for consensus. I think consensus should be done freely, and then PoS is a good method (see the other thread). However, I think PoW is a good technique for coin creation (independent of the consensus decision). If coin creation is proportional to a weighted form of proof of work, where the weight is the economic cost of the proof of work (technology dependent), then we have an automatic value-control mechanism, and we are finally making a CURRENCY, not a speculative get-rich-by-hodling asset. The currency would be created more when its market price rose (it is then more interesting to spend PoW to make it than to buy it in the market). Coin issuance would stop if the price went below the cost of the PoW. In other words, PoW would work as the central bank regulating supply, and there is no problem of seigniorage, because it is wasted on PoW.

If a coin is set to cost, say, about $10 in PoW, then that coin will hover steadily around $10. If it rises, more coins are created, and this extra supply pushes the price down. If it drops, no more coins are created, and its scarcity can make the price rise again. Moreover, the expectation of a constant price will discourage speculation and promote currency usage.

But again, one should not reward consensus. 

Rewarding consensus is completely and utterly key to the security and usability of a PoW cryptocurrency; without it, there is no way to bound transaction acceptability.

In addition, you cannot control the value of a currency by changing the PoW difficulty. Value is derived from supply and demand; changing the difficulty only affects the supply side.
Kakmakr
Legendary | Activity: 3430 | Merit: 1957
January 27, 2018, 07:35:52 AM  #40

1) There is still an LCR rule for selecting the branch with the largest cumulative difficulty.

And how will you send your transaction if there are another 100500 users who have 100500x more hashrate than you?
It will be a very complicated quest to send coins in your network Smiley
"Run your PC for a week or buy an ASIC device to purchase a cup of coffee"  Grin

Because you can choose your own difficulty. Miners who are in it for the money, as they are in bitcoin, can mine with maximum difficulty. End users like you and I can choose a much, much lower difficulty, because we don't expect to earn anything; we just want to send a transaction.

The difficulty was brought in to prevent one "bad" actor with massive amounts of hashing power from mining all the blocks for himself, and also to adapt to technological improvements in processing power. So as soon as there is a massive spike in the total hashing power, the difficulty will adjust to balance things out.

The difficulty is one of the core principles of the protocol. ^smile^
