Author Topic: ToominCoin aka "Bitcoin_Classic" #R3KT  (Read 157066 times)
DieJohnny
Legendary
*
Offline Offline

Activity: 1639
Merit: 1004


View Profile
January 22, 2016, 04:38:32 AM
 #161

Core is full of shit. There is absolutely no issue with raising the block size. How about by 10%, nope, they won't raise it at all.....

Core is forcing bitcoin to be something different than what 99% of the community expected it to be... they want the blockchain to be the backbone. A reasonable position, however, Core had this agenda long ago and only now it has become completely obvious. Unfortunately, most of the community wasn't along for the Core ride... now we want off this ship.  
I would not be surprised if someone said that you were 'full of shit' after such a statement. There is a huge issue with raising the block size to 2 MB right now: transactions that take too long to validate. How about you get proper and relevant education (enabling you to understand and run your own tests) before making baseless conclusions and accusations? There is nothing wrong with transacting mostly by the use of off-chain solutions such as LN. The best way to scale would be a combination of everything.

Core has stated many times that everyone wants to raise the max_block_size; that is not contentious. It is how and when it is done that is being debated in a civilised and calculated manner.
I'll wait for the validation time to get fixed and propagation time improved. Eventually the block size will be raised.


Funny that they created a centralized, easily manipulated voting system to prove why their ideas for decentralization should be implemented. A bad joke, to say the least. Ignore the static.
I think I've read somewhere that Toomin has been working hard on promoting that platform. Seems like something is not right.

you are just buying into the lies and bullshit. everything is stall and stall and stall, buying more time for LN and off-chain.....

once that is in place, core hopes their plans will be realized and nobody will care about block size anymore....

wake up, you have swallowed the wrong pill

Those who hold and those who are without property have ever formed distinct interests in society
Lauda
Legendary
*
Offline Offline

Activity: 2674
Merit: 2965


Terminated.


View Profile WWW
January 22, 2016, 06:46:37 AM
 #162

re: "huge issue with raising it" -- pretty unconvincing. 
isn't there already a fix for the 'big transactions'? 
Last I heard there was and core isn't including it... 
why am I not surprised.
There is a commit by Gavin which is not a fix, but rather a limitation per transaction (IRC). This is 'not a fix' but a bad workaround that limits the system. There is a proposal by Wuille that aims to scale down the validation time. This is a real (potential) fix.
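To make concrete why validation time, rather than raw size, is the sticking point here, a back-of-the-envelope sketch of the pre-SegWit 'quadratic sighash' behaviour follows; the byte counts are illustrative approximations, not figures from this thread.

Code:
# Rough model of pre-SegWit signature hashing: verifying each input hashes
# (roughly) the whole serialized transaction, so total work grows about
# quadratically with transaction size. Constants are illustrative assumptions.
BYTES_PER_INPUT = 148    # typical P2PKH input size (approx.)
BYTES_PER_OUTPUT = 34    # typical P2PKH output size (approx.)

def legacy_sighash_bytes(n_inputs, n_outputs=1):
    """Approximate total bytes hashed to verify every input's signature."""
    tx_size = 10 + n_inputs * BYTES_PER_INPUT + n_outputs * BYTES_PER_OUTPUT
    return n_inputs * tx_size    # one near-full-transaction hash per input

for n in (100, 1000, 10000):
    print("%6d inputs -> ~%.1f MB hashed" % (n, legacy_sighash_bytes(n) / 1e6))
# Doubling the number of inputs roughly quadruples the hashing work, which is
# why a larger maximum transaction makes worst-case validation so much slower.

A per-transaction cap (the Gavin workaround above) merely bounds this worst case, while changing how the signature hash is computed (the direction of Wuille's proposal) attacks the quadratic growth itself.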

you are just buying into the lies and bullshit. everything is stall and stall and stall, buying more time for LN and off-chain.....

once that is in place, core hopes their plans will be realized and nobody will care about block size anymore....
You have no knowledge and are either making up propaganda yourself or falling for the charlatans. There is nothing wrong with transacting on LN or off-chain in general. In many use cases it will be equally secure, yet faster and cheaper, than doing so on the main chain.

"The Times 03/Jan/2009 Chancellor on brink of second bailout for banks"
😼 Bitcoin Core (onion)
da2ce7
Legendary
*
Offline Offline

Activity: 1222
Merit: 1016


Live and Let Live


View Profile
January 22, 2016, 07:32:06 AM
 #163

and we thought that the XT train was lulzy...  Grin

One off NP-Hard.
watashi-kokoto
Sr. Member
****
Offline Offline

Activity: 682
Merit: 268



View Profile
January 22, 2016, 12:43:48 PM
 #164

and we thought that the XT train was lulzy...  Grin

the ride never ends Grin
sgbett
Legendary
*
Offline Offline

Activity: 2576
Merit: 1087



View Profile
January 22, 2016, 12:48:29 PM
 #165

There is a commit by Gavin which is not a fix, but rather a limitation per transaction (IRC). This is 'not a fix' but a bad workaround that limits the system.

Imagine if you applied that logic to the blocksize! Wink

"A purely peer-to-peer version of electronic cash would allow online payments to be sent directly from one party to another without going through a financial institution" - Satoshi Nakamoto
*my posts are not investment advice*
Lauda
Legendary
*
Offline Offline

Activity: 2674
Merit: 2965


Terminated.


View Profile WWW
January 22, 2016, 12:51:20 PM
 #166

This is 'not a fix' but a bad workaround that limits the system.
Imagine if you applied that logic to the blocksize! Wink
I can. The block size limit does indeed limit the system. Just raising it (bad workaround) and hoping everything will be alright would be a bad move. This is why the infrastructure around it needs to be improved after which the block size limit can be safely increased.

"The Times 03/Jan/2009 Chancellor on brink of second bailout for banks"
😼 Bitcoin Core (onion)
watashi-kokoto
Sr. Member
****
Offline Offline

Activity: 682
Merit: 268



View Profile
January 22, 2016, 12:53:38 PM
 #167

They think they can raise the limit and crash the system using blocks full of dust.

LOL
sgbett
Legendary
*
Offline Offline

Activity: 2576
Merit: 1087



View Profile
January 22, 2016, 01:11:37 PM
 #168

This is 'not a fix' but a bad workaround that limits the system.
Imagine if you applied that logic to the blocksize! Wink
I can. The block size limit does indeed limit the system. Just raising it (bad workaround) and hoping everything will be alright would be a bad move. This is why the infrastructure around it needs to be improved after which the block size limit can be safely increased.

Sorry it was a cheap shot Wink

I totally accept that raising the blocksize limit is a workaround; I don't agree that it's a case of blind hope. Nobody knows the outcome for sure, but I think it would be difficult to argue against the most likely scenario being that nothing drastic will happen. I can see how the risk profile might not be acceptable for some, though; I am a bit more risk tolerant, so that will colour my opinion. Though interestingly I wonder how you profile the risk of full transactions - is it that the 'fee market' mitigates that? Are you pro RBF? Genuinely interested.

When you say infrastructure, do you mean hardware or protocol improvements? If the latter, I would 100% agree that other solutions need to be developed. My favourite is IBLT; I'm just a guy who does a bit of web dev, but the elegance of that solution really stands out to me. I like segwit (believe it or not, I'd dreamed up a similar solution in my head about partitioning off data, but anyone can *imagine* a hoverboard!). Some of franky's recent posts about the specific implementation are interesting though; I note that Wuille's transaction fix also looked to depend on segwit.

If it's hardware then I'm not sure how we can measure that - I think there will always be a natural resistance to growth based on the economics of running a full node and what resources that requires. I accept that doubling the blocksize limit now does push up against the current boundary, and as a result (the BitFury paper even models this) there would be some node attrition. Too much? I think this is subjective. Dr Back already expresses great concern about mining centralisation and feels that as a result we have to be super careful about node centralisation, and now more than ever I've felt more in accord with that.

So yes, 2MB is not a great workaround, but imho it's a necessary one for now to buy some time. I think Core's resistance is no longer about technical merit, but about having become entrenched in a position and not wanting to set a new precedent. That's just opinion though, and I can't possibly prove it; nevertheless I think it's worth considering as a possibility.


"A purely peer-to-peer version of electronic cash would allow online payments to be sent directly from one party to another without going through a financial institution" - Satoshi Nakamoto
*my posts are not investment advice*
RealBitcoin
Hero Member
*****
Offline Offline

Activity: 854
Merit: 1007


JAYCE DESIGNS - http://bit.ly/1tmgIwK


View Profile
January 22, 2016, 01:17:23 PM
 #169


To be honest, I think Roger Ver agrees with Classic so that when he is asked in interviews why he agrees with Classic, he can subtly drop an advertisement for his website: "By the way, Classic is censored on the official reddit, that's why we created Bitcoin.com..." It's not that subtle, actually.

Yes, the censorship sucks, and they will use that against us to gain credibility. I think censorship should not be done even if we are right; it is just a bad thing to do.

it was in no way censorship, you've bought into the double-speak of the sock-puppets and shills who were over-running the legitimate bitcoin forums before better moderation showed up (notice how they still jealously covet getting a voice on these "hated censored forums") ... it was totally out of control, there was no debate before, just a loud cacophony of propaganda ... once they go to their "own" forums they had nothing to say and just started tearing strips off each other.

Wow, that's amusing. There are certainly dozens of sockpuppets on this forum, given how easily you can buy 20-30 legendary accounts for a price.

So what if it costs 10-15 BTC for the enemy to sow discontent with legendary or hero accounts? Clever and cheap. 15 BTC is really nothing if you want to sabotage BTC.

A few legendary accounts leading the propaganda, and a few more member/jr. member sockpuppets to confirm their position, and done: you have changed public opinion.

Now I see why the moderators have to clean up this mess Cheesy

Carlton Banks
Legendary
*
Offline Offline

Activity: 3430
Merit: 3071



View Profile
January 22, 2016, 01:32:56 PM
 #170

Some of franky's recent posts about the specific implementation are interesting though; I note that Wuille's transaction fix also looked to depend on segwit.



HI FRANKY YOU ARE VERY INTERESTING AND LEVEL HEADED

WHY THANK YOU BETT, YOU ARE WELL ROUNDED AND GREAT TO TALK WITH

Vires in numeris
Lauda
Legendary
*
Offline Offline

Activity: 2674
Merit: 2965


Terminated.


View Profile WWW
January 22, 2016, 01:36:07 PM
 #171

I totally accept that raising the blocksize limit is a workaround; I don't agree that it's a case of blind hope. Nobody knows the outcome for sure, but I think it would be difficult to argue against the most likely scenario being that nothing drastic will happen. I can see how the risk profile might not be acceptable for some, though; I am a bit more risk tolerant, so that will colour my opinion. Though interestingly I wonder how you profile the risk of full transactions - is it that the 'fee market' mitigates that? Are you pro RBF? Genuinely interested.
What do you mean by 'profile the risk of full transactions'? I don't have a particular opinion on RBF; I'm still in the 'grey zone' when it comes to that. However, I can see the use cases for it, and making it opt-in seems acceptable.

When you say infrastructure, do you mean hardware or protocol improvements? If the latter, I would 100% agree that other solutions need to be developed. My favourite is IBLT; I'm just a guy who does a bit of web dev, but the elegance of that solution really stands out to me. I like segwit (believe it or not, I'd dreamed up a similar solution in my head about partitioning off data, but anyone can *imagine* a hoverboard!). Some of franky's recent posts about the specific implementation are interesting though; I note that Wuille's transaction fix also looked to depend on segwit.
I'm talking about improvements to the protocol. Almost all of franky's posts in regards to SegWit are at least partially wrong though, and some are completely wrong.

So yes, 2MB is not a great workaround, but imho it's a necessary one for now to buy some time. I think Core's resistance is no longer about technical merit, but about having become entrenched in a position and not wanting to set a new precedent. That's just opinion though, and I can't possibly prove it; nevertheless I think it's worth considering as a possibility.
2 MB is problematic because of the validation time ATM. Besides, trying to push a hard fork so quickly is even more dangerous. You're essentially cutting off everyone who does not update. Because of this, the update/consensus window needs to be greater (i.e. longer). SegWit gives enough capacity 'short term'.

"The Times 03/Jan/2009 Chancellor on brink of second bailout for banks"
😼 Bitcoin Core (onion)
franky1
Legendary
*
Online Online

Activity: 4214
Merit: 4470



View Profile
January 22, 2016, 02:02:49 PM
 #172

This is 'not a fix' but a bad workaround that limits the system.
Imagine if you applied that logic to the blocksize! Wink
I can. The block size limit does indeed limit the system. Just raising it (bad workaround) and hoping everything will be alright would be a bad move. This is why the infrastructure around it needs to be improved after which the block size limit can be safely increased.

your doomsday scaremongering about nefarious people creating 2mb blocks filled with a single transaction of 1.99mb of data won't happen..
for 2 reasons

1. as you said yourself, it takes 10 minutes+ just to validate the transaction before even working on hashing the block itself.. so that's 25btc lost while not even mining.. then if they start mining and create a block, because it's 1.99mb of data it will take longer to hash than a block of 1.025mb or 0.99mb would. so it won't get solved the fastest either

so.. with that said.. imagine blockheight was 400,000... the nefarious miner would need to start validating the transaction, taking 10 minutes..
simultaneously, other miners validate AND hash out height 400,001.

it's been 10 minutes and now the nefarious miner finally gets around to hashing the block, but because 1.99mb of data is more than 1.025mb
it takes longer.. so again the non-nefarious miners hash out 400,002 while the nefarious miner is still working..

and when the nefarious miner finally gets a solution.. its blockheight will be 400,001 because its merkle data is linked to 400,000 while the rest of the network is at 400,003.. making the nefarious block instantly out of sync and rejected.. not due to blocksize rules.. but due to being behind in the chain.

2. because miners know that being nefarious risks them losing out on 3 blocks due to the time it takes.. that's a potential 75btc loss ($30,000), so they would need to be bribed with $30k just to attempt it, with still no guarantee it would work.

yes, $30k is a small bribe, but it's not like they would successfully get a large tx into the chain on the first attempt, so the $30k payments will mount up..
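Spelling out the arithmetic behind that figure (the 3-block and 25 BTC numbers are from this post; the ~$400/BTC price is an assumed early-2016 level):

Code:
# Opportunity-cost arithmetic for "reason 2" above. The block count and
# subsidy come from the post; the BTC price is an assumption, not a quote.
blocks_forfeited = 3
subsidy_btc = 25.0
btc_price_usd = 400.0    # rough January 2016 price, assumed for illustration

loss_btc = blocks_forfeited * subsidy_btc
loss_usd = loss_btc * btc_price_usd
print("forfeited: %.0f BTC (~$%.0f)" % (loss_btc, loss_usd))  # 75 BTC, ~$30,000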
and there is nothing stopping the devs from adding other rules that
a. reject blocks with fewer than 200 transactions.. to force miners to actually put multiple transactions into a block instead of creating near-empty blocks, which not only solves the doomsday problem but also helps ensure transactions are not held in the mempool for hours while blocks are being solved without txs
b. or reject and not relay transactions where a single tx has more than 500k of data, to teach people how to create transactions properly and more lean
EG
instead of
single TX {
1originfunds
2originfunds
3originfunds
4originfunds
5originfunds
6originfunds  -> 1destination
7originfunds
8originfunds
9originfunds
10originfunds
}

single TX {
1originfunds
2originfunds  -> 1destination
}
single TX {
3originfunds
4originfunds  -> 1destination
}
single TX {
5originfunds
6originfunds  -> 1destination
}
single TX {
7originfunds
8originfunds  -> 1destination
}
single TX {
9originfunds
10originfunds  -> 1destination
}
single TX {
1originfunds
1originfunds  -> 1destination
}
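A runnable sketch of the splitting idea illustrated above: spend the inputs a couple at a time to the same destination instead of in one oversized transaction. The dictionaries here are hypothetical stand-ins, not a real wallet or node API, and fees/change are ignored.

Code:
# Hypothetical illustration of splitting one many-input spend into several
# small transactions, as sketched above. Not a real wallet API; fees and
# change are ignored for clarity.
def split_spend(inputs, destination, inputs_per_tx=2):
    """Group the inputs into several small transactions instead of one big one."""
    txs = []
    for i in range(0, len(inputs), inputs_per_tx):
        txs.append({"inputs": inputs[i:i + inputs_per_tx],
                    "outputs": [destination]})
    return txs

origin_funds = ["%doriginfunds" % n for n in range(1, 11)]
for tx in split_spend(origin_funds, "1destination"):
    print(tx)
# Each resulting transaction stays small, so no single transaction carries a
# pathological validation cost.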

oh and by the way, segwit doesn't prevent the doomsday 1tx with 4000 dust inputs, because although segwit separates the signatures it still needs to check the signatures.. so it would still be 10 minutes of validation time for nefarious segwit miners or normal miners.. before even getting to the hashing part

in fact, segwit makes it easier at the block-hashing stage for such a doomsday block to become part of the chain faster. because after the 10-minute validation.. the actual block data won't be 1.99mb, it would be 1mb-1.2mb because it's not holding the signatures.. so the time to actually hash the block can result in a solution sooner than for non-segwit miners..

I DO NOT TRADE OR ACT AS ESCROW ON THIS FORUM EVER.
Please do your own research & respect what is written here as both opinion & information gleaned from experience. many people replying with insults but no on-topic content substance, automatically are 'facepalmed' and yawned at
iCEBREAKER (OP)
Legendary
*
Offline Offline

Activity: 2156
Merit: 1072


Crypto is the separation of Power and State.


View Profile WWW
January 22, 2016, 03:21:37 PM
Last edit: January 23, 2016, 11:55:10 AM by iCEBREAKER
 #173

This is 'not a fix' but a bad workaround that limits the system.
Imagine if you applied that logic to the blocksize! Wink
I can. The block size limit does indeed limit the system. Just raising it (bad workaround) and hoping everything will be alright would be a bad move. This is why the infrastructure around it needs to be improved after which the block size limit can be safely increased.

your doomsday scaremongering about nefarious people creating 2mb blocks filled with a single transaction of 1.99mb of data won't happen..
for 2 reasons

1. as you said yourself, it takes 10 minutes+ just to validate the transaction before even working on hashing the block itself.. so that's 25btc lost while not even mining.. then if they start mining and create a block, because it's 1.99mb of data it will take longer to hash than a block of 1.025mb or 0.99mb would. so it won't get solved the fastest either


2. because miners know that being nefarious risks them losing out on 3 blocks due to the time it takes.. that's a potential 75btc loss ($30,000), so they would need to be bribed with $30k just to attempt it, with still no guarantee it would work.

The construed troll txs don't have to be all in a single 0.99MB (BTC) or 1.99MB (Classic SW, or whatever they are calling that shade of bikeshed today) transaction.

That's just the worst-case scenario, for maximum quadratically superlinear sigop trolling fun.  The less-than-worst-case outcomes aren't OK either.
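Under the same illustrative quadratic model sketched earlier in the thread (hashing work per transaction roughly proportional to inputs times transaction size; every number here is an assumption for illustration), the intermediate cases look like this:

Code:
# Compare the hashing work for 2 MB of transaction data split into
# transactions of different sizes. Assumes transactions that are almost all
# ~148-byte inputs and the pre-SegWit "hash the whole tx per input" behaviour.
def work_mb(tx_size_bytes):
    n_inputs = tx_size_bytes // 148          # inputs if the tx is mostly inputs
    return n_inputs * tx_size_bytes / 1e6    # MB hashed to verify that tx

block_bytes = 2000000
for tx_size in (2000000, 1000000, 250000, 500):
    total = (block_bytes // tx_size) * work_mb(tx_size)
    print("%9d-byte txs -> ~%10.1f MB hashed per block" % (tx_size, total))
# One maximal transaction is the worst case, but even quarter-megabyte
# transactions hash orders of magnitude more data than normal-sized ones.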


Your textbook hand-waving "won't happen" ignores 4 key facts:

1. "10minutes+" varies depending on the CPU power used to construe it (I'll just assume you are right about construal taking time equal to validation given equal CPU/IO).

2.  Only one miner needs to defect and mine a block for it to be included.  If there are huge >75 BTC or whatever (in-band) tx fees and/or huge (out-of-band) bribes at stake, you can bet it will happen.  Also, renting hashpower is a thing.  And it's easy to set up a pool with subsidized 101% or 200% payouts to summon any amount of ASICs as easily as turning on a faucet firehose.

3.  MP and other small block militia types have staggering sums of BTC in their war chests; the coffers of La Serenissima are second to none but The Creator.

4.  Unlike the attacking greedy VC ratfucks, the defenders in 3. are willing to spend every last BTC to defend the walls of La Serenissima, as they consider the BTC worthless if the walls fall.



Monero
"The difference between bad and well-developed digital cash will determine
whether we have a dictatorship or a real democracy." 
David Chaum 1996
"Fungibility provides privacy as a side effect."  Adam Back 2014
Buy and sell XMR near you
P2P Exchange Network
Buy XMR with fiat
Is Dash a scam?
sgbett
Legendary
*
Offline Offline

Activity: 2576
Merit: 1087



View Profile
January 22, 2016, 04:05:12 PM
 #174

I totally accept that raising the blocksize limit is a workaround; I don't agree that it's a case of blind hope. Nobody knows the outcome for sure, but I think it would be difficult to argue against the most likely scenario being that nothing drastic will happen. I can see how the risk profile might not be acceptable for some, though; I am a bit more risk tolerant, so that will colour my opinion. Though interestingly I wonder how you profile the risk of full transactions - is it that the 'fee market' mitigates that? Are you pro RBF? Genuinely interested.
What do you mean by 'profile the risk of full transactions'? I don't have a particular opinion on RBF, I'm still in the 'grey zone' when it comes that that. However, I can see the use cases for it and making it opt in seems acceptable.

sorry, i meant full blocks. i.e. do you think RBF and/or the fee market mitigates any risk there? do you even think that full blocks are a problem?

the reason I said franky's posts were interesting was because I haven't verified them Wink he might have good points, he might not!

"A purely peer-to-peer version of electronic cash would allow online payments to be sent directly from one party to another without going through a financial institution" - Satoshi Nakamoto
*my posts are not investment advice*
watashi-kokoto
Sr. Member
****
Offline Offline

Activity: 682
Merit: 268



View Profile
January 22, 2016, 04:08:43 PM
 #175

I think it is the Americans who should be worrying. It would be a real shame if some power lines failed in the storm and you were not able to mine some smaller-than-1MB blocks.
Lauda
Legendary
*
Offline Offline

Activity: 2674
Merit: 2965


Terminated.


View Profile WWW
January 22, 2016, 05:55:49 PM
 #176

-snip-
I won't waste my time going through the basics again. There's a security risk, i.e. new attack vector. Whether you believe that someone is going to try abusing it or not does not change the fact of its existence.

sorry, i meant full blocks. i.e. do you think RBF and/or the fee market mitigates any risk there? do you even think that full blocks are a problem?
I don't think that there's a problem when some blocks are full (e.g. right now). However, I think that if the percentage of full blocks (full blocks/total blocks) is high, a problem might occur. The fees might become too high for Bitcoin to make sense, especially for people transacting smaller amounts. This could also be worsened if we do not have off-chain solutions by then. I'm in favor of raising the block size limit once the needed improvements are implemented; I'm also in favor of a dynamic block size.

the reason I said franky's posts were interesting was because I haven't verified them Wink he might have good points, he might not!
He's starting to look like a hopeless case though.

"The Times 03/Jan/2009 Chancellor on brink of second bailout for banks"
😼 Bitcoin Core (onion)
RealBitcoin
Hero Member
*****
Offline Offline

Activity: 854
Merit: 1007


JAYCE DESIGNS - http://bit.ly/1tmgIwK


View Profile
January 22, 2016, 06:32:20 PM
 #177


I won't waste my time going through the basics again. There's a security risk, i.e. new attack vector. Whether you believe that someone is going to try abusing it or not does not change the fact of its existence.

What is the difference between a 1 MB block size and a 1.001 MB block size? Is there any logical reason why 1 MB is the block limit, or is it arbitrary?

If it is arbitrary, would raising the block size from 1 MB to 1.001 MB cause any harm? Or is the mere fact of raising the block size the issue here (due to panic, dividing the userbase, and creating division among users)?


I'm just trying to understand things here.

franky1
Legendary
*
Online Online

Activity: 4214
Merit: 4470



View Profile
January 22, 2016, 06:35:24 PM
 #178


Almost all of franky's posts in regards to SegWit are at least partially wrong though, and some are completely wrong.


-snip-
I won't waste my time going through the basics again. There's a security risk, i.e. new attack vector. Whether you believe that someone is going to try abusing it or not does not change the fact of its existence.

the reason I said franky's posts were interesting was because I haven't verified them Wink he might have good points, he might not!
He's starting to look like a hopeless case though.


so my ongoing rant has been that segwit will cut off other implementations from fully validating unless they too upgrade to be segwit supporters.
your rant is that everything will be fine...
you even said that the devs said everything would be fine.. but................

Quote
[01:03] <Lauda> sipa what about a client that does not support segwit?
[01:03] <maaku> Lauda: why would you care to?
[01:03] <Lauda> Just out of curiousity.
[01:04] <sipa> they won't see the witness data
[01:04] <sipa> but they also don't care about it

[01:04] <Lauda> Someone mentioned it. So it is not possible for a client that does not support Segwit to see the witness data?
[01:04] <maaku> Lauda: it is certainly possible
[01:04] <maaku> Lauda: but it's meaningless to do.
[01:05] <sipa> of course it is "possible"... but that "possible" just means supporting segwit

[01:05] <Chiwawa_> imagine people wanted to stick with bitcoin-core 0.11 and not upgrade, will they be cut off from getting witness data, by defalt if segwit gets consensus?
[01:06] <maaku> Chiwawa_: they could certainly code up their wallet to get it, but again what's the point? are they going to check the witness themselves?

so unless other implementations add more code just to be able to fully validate again, they are going to be cut off, just passing along a parcel of data they don't understand.. which in itself is a risk if a non-segwit miner adds data it can't check into a block.

basically
bitcoin-core v0.1
bitcoin-core v0.11
bitcoin-core v0.12
bitcoin classic
bitcoin unlimited
bitcoin xt
bitcoin .. whatever the other dozen implementations are
will be cut off from seeing signatures if segwit gets consensus..
and that makes bitcoincore v0.13SW the dictator
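For readers following this claim, here is a toy model of the point being argued: under SegWit the signatures live in a separate witness structure, and a peer that has not upgraded is served the transaction without that witness, so it relays data it cannot itself verify. This is an illustrative sketch, not Core's actual serialization code.

Code:
# Toy model of witness serialization: legacy peers receive the transaction
# with the witness (signature) data stripped, so they cannot check those
# signatures themselves. Purely illustrative; not real Bitcoin Core code.
def serialize_for_peer(tx, peer_supports_segwit):
    """Return what a peer would receive for this (simplified) transaction."""
    msg = {"inputs": tx["inputs"], "outputs": tx["outputs"]}
    if peer_supports_segwit:
        msg["witnesses"] = tx["witnesses"]   # signatures travel separately
    return msg

tx = {"inputs": ["utxo1"], "outputs": ["1destination"], "witnesses": ["sig1"]}
print(serialize_for_peer(tx, peer_supports_segwit=True))    # can fully validate
print(serialize_for_peer(tx, peer_supports_segwit=False))   # signatures absent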

have a nice day.. as you are becoming a hopeless case

I DO NOT TRADE OR ACT AS ESCROW ON THIS FORUM EVER.
Please do your own research & respect what is written here as both opinion & information gleaned from experience. many people replying with insults but no on-topic content substance, automatically are 'facepalmed' and yawned at
BitUsher
Legendary
*
Offline Offline

Activity: 994
Merit: 1034


View Profile
January 22, 2016, 06:38:39 PM
 #179


I won't waste my time going through the basics again. There's a security risk, i.e. new attack vector. Whether you believe that someone is going to try abusing it or not does not change the fact of its existence.

What is the difference between a 1 MB block size and a 1.001 MB block size? Is there any logical reason why 1 MB is the block limit, or is it arbitrary?

If it is arbitrary, would raising the block size from 1 MB to 1.001 MB cause any harm? Or is the mere fact of raising the block size the issue here (due to panic, dividing the userbase, and creating division among users)?


I'm just trying to understand things here.

Well yes, rushed HFs in general, even to 1.001 MB, can cause problems. This is why Wang Chun of F2Pool agreed with the devs that we need at least 3 months for soft forks and 1 year for hard forks, and the next planned HF is Feb/March '17 to 2MB. As Pieter suggests, I think that we can safely do one within 6 months with a high degree of consensus, but 1 year would be better.
RealBitcoin
Hero Member
*****
Offline Offline

Activity: 854
Merit: 1007


JAYCE DESIGNS - http://bit.ly/1tmgIwK


View Profile
January 22, 2016, 07:06:35 PM
 #180


Well yes, rushed HFs in general, even to 1.001 MB, can cause problems. This is why Wang Chun of F2Pool agreed with the devs that we need at least 3 months for soft forks and 1 year for hard forks, and the next planned HF is Feb/March '17 to 2MB. As Pieter suggests, I think that we can safely do one within 6 months with a high degree of consensus, but 1 year would be better.

Alright then, see, I'm fair enough to admit that I'm not a technically experienced person who should call the shots on this.

However, many trolls who also don't have any idea about the issue now seem to be trying to dictate policy. It's very arrogant.

So yes, people really need to look in the mirror and think before making ridiculous proposals.
