johnyj
Legendary
Offline
Activity: 1988
Merit: 1012
Beyond Imagination
|
|
January 05, 2016, 05:06:05 PM |
|
You have an exaggerated opinion that you can substitute altcoins that don't have this problem, or have already solved it, for bitcoin. Eventually bitcoin's role as a platform for consensus-building and for testing scenarios will be justified, though I think bitcoin has run its course, and at this stage it is necessary to concentrate on the more advanced features in this area, and on active implementation at all levels.
After those who complain about the high fees leave, the bitcoin network will be as good as before. So eventually all the poor people leave for altcoins and the rich people stay with bitcoin; which coin will be more valuable? The one with more utility. That's exactly what 21 Inc's computer is about: trying to have as much utility as possible, doing all the things in one package.
|
|
|
|
lottery248
Legendary
Offline
Activity: 1582
Merit: 1006
beware of your keys.
|
|
January 06, 2016, 05:15:13 AM |
|
i was wishing the blockchain size would remain as it is; however, we will need to transfer a lot of stuff, assets, and bitcoin-based markets. moreover, as more people want verification without trust, the blockchain could be increased, but not by that much.
|
out of ability to use the signature, i want a new ban strike policy that will fade the strike after 90~120 days of the ban and not be traced back, like google | email me for anything urgent; messages will possibly not be instantly responded to, as i am not really active for some reason
|
|
|
Cryptology (OP)
Legendary
Offline
Activity: 1008
Merit: 1001
In Cryptography We Trust
|
|
January 09, 2016, 06:32:35 AM |
|
|
|
|
|
Yanidas
Member
Offline
Activity: 60
Merit: 10
The swarm is headed towards us.
|
|
January 11, 2016, 02:27:54 PM |
|
How long will it be on the testnet before it is implemented on the mainnet? A softfork is quite good for the block size increase.
|
|
|
|
indstove
Newbie
Offline
Activity: 33
Merit: 0
|
|
January 12, 2016, 01:00:49 PM |
|
How long will it be on the testnet before it is implemented on the mainnet? A softfork is quite good for the block size increase. Yes, I do agree on that.
|
|
|
|
Quantus
Legendary
Offline
Activity: 883
Merit: 1005
|
|
January 14, 2016, 01:34:39 AM |
|
I hope we reach the 1MB limit soon; I'm sick of all this fear mongering. Nothing will happen, other than that through natural selection the micro transactions will be pushed out of the blockchain.
|
(I am a 1MB block supporter who thinks all users should be using Full-Node clients) Avoid the XT shills, they only want to destroy bitcoin, their hubris and greed will destroy us. Know your adversary https://www.youtube.com/watch?v=BKorP55Aqvg
|
|
|
Cconvert2G36
|
|
January 14, 2016, 03:17:59 AM Last edit: January 14, 2016, 04:00:37 AM by Cconvert2G36 |
|
I hope we reach the 1MB limit soon; I'm sick of all this fear mongering. Nothing will happen, other than that through natural selection the micro transactions will be pushed out of the blockchain.
Why not just soft fork it down to 0.5MB to clear up some of this cruft sooner than later? Bitcoin is a value storage and transfer tool of the wealthy elite, poors may use doge and litecoin.
|
|
|
|
Quantus
Legendary
Offline
Activity: 883
Merit: 1005
|
|
January 14, 2016, 05:37:43 AM |
|
I hope we reach the 1MB limit soon; I'm sick of all this fear mongering. Nothing will happen, other than that through natural selection the micro transactions will be pushed out of the blockchain.
Why not just soft fork it down to 0.5MB to clear up some of this cruft sooner than later? Bitcoin is a value storage and transfer tool of the wealthy elite, poors may use doge and litecoin. We should never have gone above 0.5MB.
|
(I am a 1MB block supporter who thinks all users should be using Full-Node clients) Avoid the XT shills, they only want to destroy bitcoin, their hubris and greed will destroy us. Know your adversary https://www.youtube.com/watch?v=BKorP55Aqvg
|
|
|
rubygon
Newbie
Offline
Activity: 40
Merit: 0
|
|
January 14, 2016, 07:12:31 PM |
|
A blocksize increase is absolutely necessary now, and it should be done immediately. There have been occasions where we have seen unconfirmed transactions sit for days, even a week or two. I strongly support the debate over the blocksize increase. However, we need more nodes and higher-performing machines to handle the increase in size.
|
|
|
|
ZephramC
|
|
January 15, 2016, 03:20:25 PM |
|
I hope we reach the 1MB limit soon; I'm sick of all this fear mongering. Nothing will happen, other than that through natural selection the micro transactions will be pushed out of the blockchain.
Why not just soft fork it down to 0.5MB to clear up some of this cruft sooner than later? Bitcoin is a value storage and transfer tool of the wealthy elite, poors may use doge and litecoin. Because once you allow the max block size to change, there will be endless arguments about which exact size is the right one (0.5 MB, 2 MB, 100 kB, 16 MB, 1 kB, ...). For every size you can find a group that would benefit from it. The only neutral, interest-group-free solution is to keep the constants fixed: the status quo. Change is possible, but with much, much higher consensus. 75% is not enough; only miners, only large businesses, only small merchants, only startups, only investors, only full-noders ... this is not consensus at all. All of them should agree, and none of them should have to veto (every one of them should have the right to veto). Until we find a broadly acceptable solution, the status quo is the best one.
|
|
|
|
eternalgloom
Legendary
Offline
Activity: 1792
Merit: 1283
|
|
January 15, 2016, 03:27:51 PM |
|
I hope we reach the 1MB limit soon; I'm sick of all this fear mongering. Nothing will happen, other than that through natural selection the micro transactions will be pushed out of the blockchain.
Why not just soft fork it down to 0.5MB to clear up some of this cruft sooner than later? Bitcoin is a value storage and transfer tool of the wealthy elite, poors may use doge and litecoin. We should never have gone above 0.5MB. The problem with that, though, is that quite a lot of people are currently depending on its use as a currency or payment method. Not increasing the block size would just be a slap in the face to those people :-/
|
|
|
|
xdrpx
|
|
January 15, 2016, 03:31:46 PM |
|
I myself am very confused about what would be the best blocksize and how it should be incremented. A lot of people have suggested initially increasing it to at least 2MB and incrementing it over the years, but I don't really understand what the aftereffects of this would be. Would Bitcoin nodes be downloading and uploading a lot more? Would we lose a lot of full nodes because of the higher bandwidth and memory requirements? Also, because of this, would there be more control over hash-power, which could lead to centralization - a need for attaining equilibrium?
|
|
|
|
achow101
Moderator
Legendary
Offline
Activity: 3542
Merit: 6885
Just writing some code
|
|
January 15, 2016, 05:05:40 PM |
|
Would Bitcoin nodes be downloading and uploading a lot more?
If the new maximum block size was reached, then yes, nodes would be downloading and uploading more. The blockchain's size would grow faster and it would require even more disk space. Would we lose a lot of full nodes because of the higher BW and memory requirements?
Probably. Also because of this would there be more control over hash-power which could lead to centralization - need for attaining equilibrium?
Probably not. Since most miners already mine at one of 10 or so mining pools, only the pools have to maintain nodes that can handle the larger blockchain and bandwidth requirements. I'm fairly certain that the pool operators have the money to be able to upgrade, although if the block size limit grows even larger, there are concerns that it would take a while for blocks to reach pools in China due to the Great Firewall of China.
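The disk-space point above can be put in rough numbers. This is a back-of-the-envelope sketch I'm adding for illustration, not code from any client; it assumes every block is full and uses the protocol's 10-minute target interval, which real block times only approximate.

```python
# Illustrative arithmetic for worst-case blockchain growth at a given
# block size limit, assuming every block is completely full.

BLOCKS_PER_DAY = 24 * 6  # one block roughly every 10 minutes (target)

def chain_growth_gb_per_year(block_size_mb: float) -> float:
    """Worst-case blockchain growth in GB per year at a given block size."""
    mb_per_year = block_size_mb * BLOCKS_PER_DAY * 365
    return mb_per_year / 1000.0

# At 1 MB blocks this is about 52.6 GB/year; doubling the limit doubles it.
```

So a move from 1 MB to 2 MB roughly doubles the worst-case growth to about 105 GB/year, which is the extra disk and bandwidth burden full nodes would have to absorb.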
|
|
|
|
shorena
Copper Member
Legendary
Offline
Activity: 1498
Merit: 1540
No I dont escrow anymore.
|
|
January 15, 2016, 05:37:39 PM |
|
Would Bitcoin nodes be downloading and uploading a lot more?
If the new maximum block size was reached, then yes, nodes would be downloading and uploading more. The blockchain's size would grow faster and it would require even more disk space. Would we lose a lot of full nodes because of the higher BW and memory requirements?
Probably. After reading the 0.12 patch notes, I see no higher memory requirements. The memory footprint is already small; the biggest part is the transactions, which will be fewer with bigger blocks as more get confirmed. At least for a while. If "memory" was meant as disk space, then yes.
|
Im not really here, its just your imagination.
|
|
|
BitUsher
Legendary
Offline
Activity: 994
Merit: 1035
|
|
January 15, 2016, 06:22:09 PM |
|
Consensus is soon. I just wish there were more details as to what Bitcoin Classic represents, as I cannot give my opinion on the matter until there are. My best guess is that it is a 2-4 plan with SegWit hardforked??? https://bitcoinclassic.consider.it/merge-3-v0112-2mb-4mb?results=true
Miners supporting Classic: Bitmain/Antpool - 27%, BitFury - 13%, BW.COM - 4%, HAOBTC.com, KnCMiner - 5%, Genesis Mining = over 49%
Companies supporting Classic: Coinbase, OKCoin, Bitstamp, Foldapp, Bitcoin.com, Bread Wallet, Snapcard.io, Cubits
Developers: Jonathan Toomim, Gavin Andresen, Ahmed Bodiwala, Jeff Garzik, Peter Rizun
Hopefully there will be some reconciliation between Core and Classic and they can find consensus together, as there is a lot of support and talent within both groups.
|
|
|
|
Queenvio
|
|
January 15, 2016, 06:38:29 PM |
|
What about a dynamic size? I would recommend something like the average of the past 50 blocks, with a maximum of 10 MB to prevent spam or DoS.
I'm not a developer, so I'm sure there are a lot more things to think about.
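The rule proposed above can be sketched in a few lines. This is a hypothetical illustration only: the names and constants are made up for this post, and it is not a real BIP or Bitcoin Core code.

```python
# Hypothetical sketch of the dynamic limit proposed above: the next block
# may be as large as the average of the last 50 blocks, hard-capped at
# 10 MB to bound spam/DoS. Names and constants are illustrative only.

WINDOW = 50                       # blocks to average over
HARD_CAP_BYTES = 10 * 1_000_000   # absolute 10 MB ceiling

def next_max_block_size(recent_sizes: list[int]) -> int:
    """Size limit (bytes) for the next block, per the proposed rule."""
    window = recent_sizes[-WINDOW:]
    average = sum(window) // len(window)
    return min(average, HARD_CAP_BYTES)
```

One wrinkle worth noting: if the limit is exactly the trailing average, no block can ever exceed that average, so the limit can only drift downward. Dynamic-cap proposals that were actually floated typically multiply the trailing average or median by some factor to leave headroom for growth.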
|
|
|
|
BitUsher
Legendary
Offline
Activity: 994
Merit: 1035
|
|
January 15, 2016, 06:57:52 PM |
|
What about a dynamic size? I would recommend something like the average of the past 50 blocks, with a maximum of 10 MB to prevent spam or DoS.
I'm not a developer, so I'm sure there are a lot more things to think about.
I can empathize with Core not wanting to do two hard forks and simply preferring to perfect flexcap... but despite it being a pain in the ass, there needs to be some reconciliation where all parties feel they win and lose equally, to move forward. The big blockers have already gone from 20MB to 8MB to 2MB, and Core has already gone from 1MB to ~1.75MB of capacity with SegWit. I'm sure with some reasoned discussion there could be a reconciliation between Classic and Core where SegWit is soft forked in ASAP and a planned, controlled hard fork to a 2-4 MB BIP is rolled out later this year, to give a little breathing room to properly test flexcap and, most importantly, to stop all the politicking so everyone can focus on code and testing.
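The "~1.75MB with SegWit" figure above comes from the witness discount: witness bytes (signatures) count at a quarter of their size against the 1 MB limit. The sketch below is my back-of-the-envelope model of that, and the witness fraction it takes as input is an assumption about the typical transaction mix, not a protocol constant.

```python
# Back-of-the-envelope model of SegWit's effective capacity. Witness
# bytes are discounted to 1/4 weight against the 1 MB limit; the
# witness_fraction argument is an assumed property of the transaction
# mix, not a protocol constant.

def effective_capacity_mb(witness_fraction: float) -> float:
    """Effective MB of transaction data per block, given the fraction of
    a typical transaction's bytes that are witness (signature) data."""
    base = 1.0 - witness_fraction            # bytes counted at full weight
    discounted = base + witness_fraction / 4.0
    return 1.0 / discounted                  # scale factor vs. a 1 MB block

# With ~60% of bytes being witness data, capacity comes out near 1.8 MB,
# in line with the ~1.75 MB figure discussed in this thread.
```

Note how sensitive the result is to the transaction mix: if few transactions use SegWit outputs, the witness fraction is lower and the effective capacity stays closer to 1 MB.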
|
|
|
|
sAt0sHiFanClub
|
|
January 15, 2016, 09:35:49 PM |
|
I'm sure with some reasoned discussion there could be reconciliation between Classic and Core
I applaud your efforts towards harmony and union. There are discussions going on now, but it's not looking good for your hopes re: segwit. Maybe on a more realistic timeframe.
|
We must make money worse as a commodity if we wish to make it better as a medium of exchange
|
|
|
BitUsher
Legendary
Offline
Activity: 994
Merit: 1035
|
|
January 15, 2016, 09:43:52 PM |
|
I'm sure with some reasoned discussion there could be reconciliation between Classic and Core
I applaud your efforts towards harmony and union. There are discussions going on now, but it's not looking good for your hopes re: segwit. Maybe on a more realistic timeframe. Source? From the Classic developers' statements on Slack and elsewhere, I see them favorable to SegWit (but perhaps more in favor of a hardfork rollout instead, which is fine to do in conjunction with a blocksize increase, because most of the problem with a hardfork is coordinating everyone). Core's main problem with a hardfork is the difficulty of coordinating it responsibly. If one is going to perform a hardfork, you may as well do 2MB + SegWit, with the appropriate sigop protections adjusted.
|
|
|
|
sAt0sHiFanClub
|
|
January 15, 2016, 10:55:11 PM |
|
I'm sure with some reasoned discussion there could be reconciliation between Classic and Core
I applaud your efforts towards harmony and union. There are discussions going on now, but it's not looking good for your hopes re: segwit. Maybe on a more realistic timeframe. Source? From the Classic developers' statements on Slack and elsewhere, I see them favorable to SegWit (but perhaps more in favor of a hardfork rollout instead, which is fine to do in conjunction with a blocksize increase, because most of the problem with a hardfork is coordinating everyone). Core's main problem with a hardfork is the difficulty of coordinating it responsibly. If one is going to perform a hardfork, you may as well do 2MB + SegWit, with the appropriate sigop protections adjusted. Source? Re-read the Slack and you will see that SegWit is an also-ran for now. What will be developed will not necessarily reflect what's in Core right now.
|
We must make money worse as a commodity if we wish to make it better as a medium of exchange
|
|
|
|