Bitcoin Forum
Author Topic: Satoshi Nakamoto: "Bitcoin can scale larger than the Visa Network"  (Read 18277 times)
BittBurger (OP)
March 08, 2016, 08:10:07 AM
Last edit: March 10, 2016, 12:50:13 AM by BittBurger
 #1

Can someone explain to me why there is any debate when Nakamoto himself said:

---------------------
Quote from Mike Hearn:

https://bitcointalk.org/index.php?topic=149668.msg1596879#msg1596879
https://duckduckgo.com/?q=%22Bitcoin+can+already+scale+much+larger+than+that+with+existing+hardware+for+a+fraction+of+the+cost.%22

  • Satoshi did plan for Bitcoin to compete with PayPal/Visa in traffic volumes.
  • The block size limit was a quick safety hack that was always meant to be removed.
  • In fact, in the very first email he sent me back in April 2009, he said this:

--------------------------------------------------
Email from Satoshi Nakamoto to Mike Hearn:

"Hi Mike,
I'm glad to answer any questions you have. If I get time, I ought to write a FAQ to supplement the paper.
There is only one global chain.

The existing Visa credit card network processes about 15 million Internet purchases per day worldwide. Bitcoin can already scale much larger than that with existing hardware for a fraction of the cost. It never really hits a scale ceiling. If you're interested, I can go over the ways it would cope with extreme size.  By Moore's Law, we can expect hardware speed to be 10 times faster in 5 years and 100 times faster in 10. Even if Bitcoin grows at crazy adoption rates, I think computer speeds will stay ahead of the number of transactions.

I don't anticipate that fees will be needed anytime soon, but if it becomes too burdensome to run a node, it is possible to run a node that only processes transactions that include a transaction fee. The owner of the node would decide the minimum fee they'll accept. Right now, such a node would get nothing, because nobody includes a fee, but if enough nodes did that, then users would get faster acceptance if they include a fee, or slower if they don't. The fee the market would settle on should be minimal. If a node requires a higher fee, that node would be passing up all transactions with lower fees.
It could do more volume and probably make more money by processing as many paying transactions as it can. The transition is not controlled by some human in charge of the system though, just individuals reacting on their own to market forces.

Eventually, most nodes may be run by specialists with multiple GPU cards. For now, it's nice that anyone with a PC can play without worrying about what video card they have, and hopefully it'll stay that way for a while. More computers are shipping with fairly decent GPUs these days, so maybe later we'll transition to that."


~ Satoshi Nakamoto
---------------------------------------
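As an aside, the arithmetic in the email is easy to check: 15 million purchases per day is roughly 174 tx/s, and "10 times faster in 5 years and 100 times faster in 10" corresponds to a 10^(years/5) capacity multiplier. A quick sketch of those figures (all numbers come from the email itself; this is illustrative, not a capacity claim):

```python
# Back-of-envelope check of the figures quoted in the email.

VISA_TX_PER_DAY = 15_000_000              # figure quoted by Satoshi
tx_per_second = VISA_TX_PER_DAY / 86_400  # seconds in a day

# Moore's law as Satoshi states it: 10x in 5 years, 100x in 10,
# i.e. a capacity multiplier of 10^(years / 5).
def capacity_multiplier(years):
    return 10 ** (years / 5)

print(f"Visa load: {tx_per_second:.0f} tx/s")   # ~174 tx/s
for years in (5, 10):
    print(f"after {years} years: x{capacity_multiplier(years):.0f} hardware speed")
```

Whether hardware actually kept that pace is exactly what the replies below dispute.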
Quote:

"Satoshi said back in 2010 that he intended larger block sizes to be phased in with some simple if (height > flag_day) type logic, theymos has linked to the thread before. I think he would be really amazed at how much debate this thing has become. He never attributed much weight to it, it just didn't seem important to him. And yes, obviously, given the massive forum dramas that have resulted it'd have been nice if he had made the size limit floating from the start like he did with difficulty. However, he didn't and now we have to manage the transition."

~ Mike Hearn, on bitcointalk.org, March 07, 2013, 06:15:30 PM

https://bitcointalk.org/index.php?topic=1347.msg15366#msg15366

----------------------------------------
Quote from Satoshi:

It can be phased in, like:

if (blocknumber > 115000)
    maxblocksize = largerlimit

It can start being in versions way ahead, so by the time it reaches that block number and goes into effect, the older versions that don't have it are already obsolete.  When we're near the cutoff block number, I can put an alert to old versions to make sure they know they have to upgrade.


~ Satoshi Nakamoto, on bitcointalk.org, October 04, 2010, 07:48:40 PM
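Satoshi's snippet amounts to a pure function of block height: a constant limit gated by a flag-day comparison. A minimal runnable sketch of the same idea (the 115000 flag day is from his post; the 8 MB figure is a hypothetical stand-in for his `largerlimit`):

```python
# Height-gated block size limit, in the spirit of Satoshi's snippet.
# The 115000 flag day is his example; the larger limit is hypothetical.
ONE_MB = 1_000_000

def max_block_size(block_height):
    if block_height > 115_000:    # Satoshi's example flag day
        return 8 * ONE_MB         # hypothetical "largerlimit"
    return ONE_MB                 # original 1 MB cap

print(max_block_size(100_000))    # 1000000
print(max_block_size(120_000))    # 8000000
```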

----------------------------------------
----------------------------------------
----------------------------------------

So now,

If Satoshi himself "never really gave the block size limit much weight" (he assumed scaling was an obvious need that would be met quickly and easily), why is a group of developers refusing to scale the protocol... while simultaneously creating a tool that will generate massive income by moving transactions off the blockchain and into their exclusive transaction processing system (the Lightning Network)?  Is it any wonder they were given nearly $50 million in VC funding once VCs realized they had just taken over Bitcoin transaction processing?

Is this not blatantly changing the design and purpose Satoshi gave to Bitcoin (to scale freely to massive sizes, supporting on-chain transaction needs)?  This seems to be of grave concern, no?

-B-


----------------------------------------
----------------------------------------
----------------------------------------

Owner: "The Times 03/Jan/2009 Chancellor on brink of second bailout for banks"
View it on the Blockchain | Genesis Block Newspaper Copies
"This isn't the kind of software where we can leave so many unresolved bugs that we need a tracker for them." -- Satoshi
Lauda
March 08, 2016, 08:15:31 AM
 #2

Quote
By Moore's Law, we can expect hardware speed to be 10 times faster in 5 years and 100 times faster in 10. Even if Bitcoin grows at crazy adoption rates, I think computer speeds will stay ahead of the number of transactions.
Did it increase tenfold in 5 years? Not even close. Satoshi did not have adequate data here.
Quote
Satoshi certainly didn’t do much (if any) analysis of the scaling limitations of Bitcoin.
We need not appeal to authority. Just because Satoshi invented it, that does not mean that he knows the answers to everything.

"The Times 03/Jan/2009 Chancellor on brink of second bailout for banks"
😼 Bitcoin Core (onion)
sgbett
March 08, 2016, 08:24:59 AM
 #3

I can't explain it.

The people who have decided to change Bitcoin by not doing these things keep trying to explain it.

The elephant in the room just will not go away though. The limit was a temporary fix.

"A purely peer-to-peer version of electronic cash would allow online payments to be sent directly from one party to another without going through a financial institution" - Satoshi Nakamoto
*my posts are not investment advice*
CIYAM
March 08, 2016, 08:29:00 AM
Last edit: March 08, 2016, 08:42:29 AM by CIYAM
 #4

The elephant in the room just will not go away though. The limit was a temporary fix.

You do realise that a block as big as 64 MB will take a lot longer than 10 minutes to verify on anything but the most costly hardware?

(so if Satoshi had "got it right" with the original block size then Bitcoin wouldn't be able to even produce blocks as quickly as 10 minutes and no-one but corporations would be able to afford to even run full nodes - but hey who cares about inconvenient truths such as those)

With CIYAM anyone can create 100% generated C++ web applications in literally minutes.

GPG Public Key | 1ciyam3htJit1feGa26p2wQ4aw6KFTejU
Quantus
March 08, 2016, 08:38:06 AM
 #5

Satoshi was wrong about so many things.

(I am a 1MB block supporter who thinks all users should be using Full-Node clients)
Avoid the XT shills, they only want to destroy bitcoin, their hubris and greed will destroy us.
Know your adversary https://www.youtube.com/watch?v=BKorP55Aqvg
Maslate
March 08, 2016, 08:45:37 AM
 #6

Satoshi proposed a fork by adding one line; why can't Core do the same?

----------------------------------------
Quote from satoshi:
It can be phased in, like:

if (blocknumber > 115000)
    maxblocksize = largerlimit

It can start being in versions way ahead, so by the time it reaches that block number and goes into effect, the older versions that don't have it are already obsolete.  When we're near the cutoff block number, I can put an alert to old versions to make sure they know they have to upgrade.

~ Satoshi Nakamoto, on bitcointalk.org, October 04, 2010, 07:48:40 PM



CIYAM
March 08, 2016, 08:47:27 AM
 #7

Satoshi proposed a fork by adding one line; why can't Core do the same?

Sure - you could change it to say 100 MB or say 10 GB by a very simple coding change.

Result - no-one can verify all the txs in a block in 10 minutes (maybe not even in an hour) - oops - blockchain stops working. But at least it was a "quick fix". Cheesy

Lesson to be learned:

"Just because something can be changed easily doesn't mean that such a change is actually a good idea."

Amph
March 08, 2016, 08:49:29 AM
 #8

Satoshi was wrong about so many things.

Like what? He said "expect", not that it would be exactly 10 times faster, so it's still a legitimate prediction.
Zarathustra
March 08, 2016, 11:10:43 AM
 #9

Satoshi was wrong about so many things.

Like what? He said "expect", not that it would be exactly 10 times faster, so it's still a legitimate prediction.

Yes, it doesn't matter if he was exactly right or not about it. The message is clear.
To the shills of the segwit trojan horse even 4 MB is - surprise! - no problem:

https://www.reddit.com/r/Bitcoin/comments/492tnm/if_according_to_core_roadmap_segwit_will_be/d0p0jes?context=3
sgbett
March 08, 2016, 11:16:30 AM
 #10

The elephant in the room just will not go away though. The limit was a temporary fix.

You do realise that a block as big as 64 MB will take a lot longer than 10 minutes to verify on anything but the most costly hardware?

(so if Satoshi had "got it right" with the original block size then Bitcoin wouldn't be able to even produce blocks as quickly as 10 minutes and no-one but corporations would be able to afford to even run full nodes - but hey who cares about inconvenient truths such as those)


Premise:
- 64MB blocks are allowed
- 64MB blocks take over 10 minutes to verify

Question:
What financial incentive exists for a miner to produce such a block?



Lethn
March 08, 2016, 11:17:11 AM
 #11

That's very interesting, so what he's saying is that the network should have no problems handling an increase in the amount of transactions then. Was this limit put in to prevent people from spamming transactions until there was enough network hashing power to cope with an increase if I'm understanding this correctly?
CIYAM
March 08, 2016, 11:20:30 AM
 #12

Premise:
- 64MB blocks are allowed
- 64MB blocks take over 10 minutes to verify

Question:
What financial incentive exists for a miner to produce such a block?

Am very glad you asked this question.

Let's say you are a rich corporation with money to waste on the most expensive mining rigs, supporting computers and the best bandwidth available on the planet.

You now expend all of those resources to fill up your blocks with 64 MB of txs (create your own txs if you can't find enough from others in the mempool) which basically no-one else can do (and will struggle to even verify in 10 minutes).

You now *own* the blockchain and get all the fees and block rewards - of course that means that Bitcoin is no longer decentralised but of course you'd try and pretend that you were more than one identity wouldn't you. Wink

The people that think getting Bitcoin to compete with the likes of VISA is just a question of increasing block size are naive at best (or at worst are actually being paid by whoever is wanting to gain control).

pedrog
March 08, 2016, 11:28:18 AM
 #13

The elephant in the room just will not go away though. The limit was a temporary fix.

You do realise that a block as big as 64 MB will take a lot longer than 10 minutes to verify on anything but the most costly hardware?

(so if Satoshi had "got it right" with the original block size then Bitcoin wouldn't be able to even produce blocks as quickly as 10 minutes and no-one but corporations would be able to afford to even run full nodes - but hey who cares about inconvenient truths such as those)


Premise:
- 64MB blocks are allowed
- 64MB blocks take over 10 minutes to verify

Question:
What financial incentive exists for a miner to produce such a block?




A shitload of money...

When (if ever) we can fill a 64 MB block, the block subsidy will probably be gone, so miners will make their money from transaction fees. To collect those fees they need to include transactions in blocks, and the more the merrier: big blocks are how they pay their bills. There's your incentive.
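The incentive argument here is straightforward arithmetic: with the subsidy gone, block revenue is the number of included transactions times the average fee, so a fuller block earns more. A sketch with purely hypothetical fee and size numbers:

```python
# Post-subsidy miner revenue vs. block size (all figures hypothetical).
AVG_TX_SIZE_BYTES = 500   # assumed average transaction size
AVG_FEE_BTC = 0.0001      # assumed average fee per transaction

def block_revenue_btc(block_size_mb, subsidy_btc=0.0):
    tx_count = int(block_size_mb * 1_000_000 / AVG_TX_SIZE_BYTES)
    return subsidy_btc + tx_count * AVG_FEE_BTC

print(block_revenue_btc(1))    # 1 MB block:  roughly 0.2 BTC in fees
print(block_revenue_btc(64))   # 64 MB block: roughly 12.8 BTC in fees
```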

sgbett
March 08, 2016, 11:29:17 AM
 #14

Premise:
- 64MB blocks are allowed
- 64MB blocks take over 10 minutes to verify

Question:
What financial incentive exists for a miner to produce such a block?

Am very glad you asked this question.

Let's say you are a rich corporation with money to waste on the most expensive mining rigs, supporting computers and the best bandwidth available on the planet.

You now expend all of those resources to fill up your blocks with 64 MB of txs (create your own if you can't find enough in the mempool) which basically no-one else can do (and will struggle to even verify in 10 minutes).

You now *own* the blockchain and get all the fees and block rewards - of course that means that Bitcoin is no longer decentralised but of course you'd try and pretend that you were more than one identity wouldn't you. Wink

The people that think getting Bitcoin to compete with the likes of VISA is just a question of increasing block size are naive at best (or perhaps more likely are being paid by whoever is wanting to gain control).


As a bad actor how do you mitigate SPV mining?

How quickly do you think other miners will ignore your 'malicious' blocks?

If such a malicious actor exists how does a 1MB block size limit stop them from attacking/destroying bitcoin?

sgbett
March 08, 2016, 11:31:14 AM
 #15

The elephant in the room just will not go away though. The limit was a temporary fix.

You do realise that a block as big as 64 MB will take a lot longer than 10 minutes to verify on anything but the most costly hardware?

(so if Satoshi had "got it right" with the original block size then Bitcoin wouldn't be able to even produce blocks as quickly as 10 minutes and no-one but corporations would be able to afford to even run full nodes - but hey who cares about inconvenient truths such as those)


Premise:
- 64MB blocks are allowed
- 64MB blocks take over 10 minutes to verify

Question:
What financial incentive exists for a miner to produce such a block?




A shitload of money...

When (if ever) we can fill a 64 MB block, the block subsidy will probably be gone, so miners will make their money from transaction fees. To collect those fees they need to include transactions in blocks, and the more the merrier: big blocks are how they pay their bills. There's your incentive.

correct!

So as a rational actor you include as many as you think you can get away with without somebody else beating you to the punch.

CIYAM
March 08, 2016, 11:31:39 AM
 #16

How quickly do you think other miners will ignore your 'malicious' blocks?

If you've changed the rules to say that 64MB blocks are legit then they are not "malicious" by definition (they would be in accordance with the consensus rules) so they can't be ignored.

If such a malicious actor exists how does a 1MB block size limit stop them from attacking/destroying bitcoin?

Because everyone with even fairly average bandwidth and supporting hardware can compete to create the next block (which is what happens now).

The reason that SPV mining is used currently is to try and get a slight advantage that poor bandwidth would otherwise prevent.

sgbett
March 08, 2016, 11:33:39 AM
 #17

How quickly do you think other miners will ignore your 'malicious' blocks?

If you've changed the rules to say that 64MB blocks are legit then they are not "malicious" by definition (they would be in accordance with the consensus rules).

If such a malicious actor exists how does a 1MB block size limit stop them from attacking/destroying bitcoin?

Because everyone with even fairly average bandwidth and supporting hardware can compete to create the next block (which is what happens now).


Your theoretical bad actor mounts a 51% attack => bitcoin is destroyed.


Lauda
March 08, 2016, 11:34:10 AM
Last edit: March 08, 2016, 12:02:14 PM by Lauda
 #18

You do realise that a block as big as 64 MB will take a lot longer than 10 minutes to verify on anything but the most costly hardware?
Actually this can happen even at a 2 MB block size limit, due to the problem of quadratic sighash scaling. Segwit addresses this by making the signature-hashing cost scale linearly. Gavin acknowledges the problem too: his own proposal (BIP 109) contains a workaround (a sigops limitation) that prevents it.

Was this limit put in to prevent people from spamming transactions until there was enough network hashing power to cope with an increase if I'm understanding this correctly?
No. The problem is not directly related to the hashing power. You can look at three factors: 1) Storage; 2) Bandwidth; 3) Processing power (for validation).


Update: Corrections (typo at 3rd point).
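The quadratic-scaling problem mentioned above comes from legacy signature hashing: each input signs a hash of roughly the whole transaction, so the total bytes hashed grow with inputs times transaction size. A toy model of that growth (sizes are assumptions, not real Bitcoin serialization):

```python
# Toy model of legacy (pre-segwit) sighash cost: every input re-hashes
# roughly the whole transaction, so total bytes hashed grow quadratically
# with the number of inputs. Segwit-style hashing is linear.
INPUT_SIZE_BYTES = 150  # assumed per-input serialized size

def legacy_sighash_bytes(n_inputs):
    tx_size = n_inputs * INPUT_SIZE_BYTES
    return n_inputs * tx_size           # each input hashes ~the whole tx

def segwit_style_bytes(n_inputs):
    return n_inputs * INPUT_SIZE_BYTES  # each input hashed once, linear

for n in (1_000, 10_000):
    print(n, legacy_sighash_bytes(n), segwit_style_bytes(n))
```

Multiplying the input count by 10 multiplies the legacy hashing work by 100, which is why a single pathological transaction can stall validation.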

psybits
March 08, 2016, 11:34:34 AM
 #19

If Satoshi came back and helped solve this debate now that would be something.
CIYAM
March 08, 2016, 11:35:23 AM
 #20

Your theoretical bad actor mounts a 51% attack => bitcoin is destroyed.

That is of course up to them (they might happily continue to reap the benefits of the block rewards and fees).

Of course if they did end up with >50% (you don't actually need 51% but for some reason people are fixated on that number) they might be tempted to take advantage of a huge double-spend opportunity.

sgbett
March 08, 2016, 11:35:31 AM
 #21

How quickly do you think other miners will ignore your 'malicious' blocks?

If you've changed the rules to say that 64MB blocks are legit then they are not "malicious" by definition (they would be in accordance with the consensus rules) so they can't be ignored.

If such a malicious actor exists how does a 1MB block size limit stop them from attacking/destroying bitcoin?

Because everyone with even fairly average bandwidth and supporting hardware can compete to create the next block (which is what happens now).

The reason that SPV mining is used currently is to try and get a slight advantage that poor bandwidth would otherwise prevent.


You stated that a 64MB block would give control of the blockchain to the bad actor. This is wrong because as soon as the bad actor broadcasts the header every other miner is on an even footing.

CIYAM
March 08, 2016, 11:37:33 AM
 #22

You stated that a 64MB block would give control of the blockchain to the bad actor. This is wrong because as soon as the bad actor broadcasts the header every other miner is on an even footing.

You are forgetting that it takes time to broadcast 64MB (so they could have already mined the next block in advance) and if others are not going to bother validating (which as stated would likely take them more than 10 minutes) then they could easily be tricked into mining on a fork.

It should be noted that the larger that you make the block size the easier such an attack becomes.
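The head-start argument can be roughed out numerically: while peers download and validate a large block, its producer alone mines on top of it. A sketch with illustrative numbers (the bandwidth and validation rates are assumptions, not measurements):

```python
# Rough head-start a large-block producer gains while peers download
# and validate the block (illustrative numbers, not measurements).
RELAY_MBIT_PER_S = 10    # assumed effective peer bandwidth
VALIDATE_S_PER_MB = 2    # assumed validation cost per MB

def head_start_seconds(block_size_mb):
    transfer = block_size_mb * 8 / RELAY_MBIT_PER_S  # MB -> megabits
    validate = block_size_mb * VALIDATE_S_PER_MB
    return transfer + validate

print(head_start_seconds(1))    # ~2.8 s: negligible vs. the 600 s interval
print(head_start_seconds(64))   # ~179 s: a large fraction of the interval
```

Under these assumptions the attack does get easier as blocks grow, which is the point being made; faster relay or cheaper validation would shrink the window.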

eternalgloom
March 08, 2016, 11:42:59 AM
 #23

Satoshi was wrong about so many things.

Like what? He said "expect", not that it would be exactly 10 times faster, so it's still a legitimate prediction.
If only it were possible for him to come back and give his opinion on the recent debate Smiley
Yeah, likely never going to happen sadly..

sgbett
March 08, 2016, 11:45:39 AM
 #24

You stated that a 64MB block would give control of the blockchain to the bad actor. This is wrong because as soon as the bad actor broadcasts the header every other miner is on an even footing.

You are forgetting that it takes time to broadcast 64MB (and they will have already mined the next block in advance).


How big is the header? How long does it take to broadcast *the header*?

CIYAM
March 08, 2016, 11:45:57 AM
 #25

You should accept the science of software engineering rather than hoping that the "words of a prophet" will solve the scaling issue.

CIYAM
March 08, 2016, 11:46:46 AM
 #26

How big is the header? How long does it take to broadcast *the header*?

You only broadcast that to your miners (if you are a pool) not to everyone else (you do know how this stuff works or don't you?).

And as stated before if you don't validate the txs in a block then you are going to be easily tricked into mining on an invalid fork.

Lethn
March 08, 2016, 11:53:43 AM
 #27

You do realise that a block as big as 64 MB will take a lot longer than 10 minutes to verify on anything but the most costly hardware?
Actually this can happen even at a 2 MB block size limit, due to the problem of quadratic sighash scaling. Segwit addresses this by making the signature-hashing cost scale linearly. Gavin acknowledges the problem too: his own proposal (BIP 109) contains a workaround (a sigops limitation) that prevents it.

Was this limit put in to prevent people from spamming transactions until there was enough network hashing power to cope with an increase if I'm understanding this correctly?
No. The problem is not directly related to the hashing power. You can look at three factors: 1) Storage; 2) Bandwidth; 3) Processing power (for validation).

Thanks for clearing it up I won't pretend to be an expert on this, it's not my area at all.
Gleb Gamow
March 08, 2016, 12:00:21 PM
 #28

Interesting email!  Cool Cool Cool

Not sure if anybody else caught it, but I believe I uncovered an interesting clue within as to who SN is, with apologies for not disclosing while I conduct further research with hopes of being the first to disclose. That said, feel free to beat me to the punch if you, too, spotted the obvious (at least to me) clue.
sgbett
March 08, 2016, 12:08:20 PM
 #29

How big is the header? How long does it take to broadcast *the header*?

You only broadcast that to your miners (if you are a pool) not to everyone else (you do know how this stuff works or don't you?).

And as stated before if you don't validate the txs in a block then you are going to be easily tricked into mining on an invalid fork.


Block 401302: https://blockchain.info/block-index/1085183/000000000000000003ab3fee4bedfd4c8afa723454e322b44b69dd42262ecf9b
This block was mined by Antpool on 2016-03-05 at 17:59:38. It was a "full" block, and as you have told me repeatedly, big blocks take longer to propagate.

Block 401303: https://blockchain.info/block-index/1085184/000000000000000006910040284ea1769d482a4ab7ea65b30af14aabc2b69749
This block was mined by BW.COM on 2016-03-05 at 18:00:39. It was an empty block, because block 401302 had not yet propagated to them. They did, however, have its header.

So, in answer to your question, I don't know exactly how this stuff works, but I know enough.

CIYAM
March 08, 2016, 12:10:07 PM
 #30

This is because they are doing what is being termed as SPV mining (not normal or should I say "proper" mining).

When you mine in this manner you risk ending up on a fork (as happened last year to the Chinese mining pools after a soft-fork).

So if you had huge blocks and incredibly good hardware (and bandwidth within your pool of miners) then you'd just play games with those that do SPV mining and make them all end up on forks (after a while they would simply give up trying).

I hope you are starting to understand that SPV mining is *not good enough* (and the more you scale things up the worse it gets).

SpiryGolden
March 08, 2016, 12:17:27 PM
 #31

So let's keep Satoshi's vision real. He gave Bitcoin to the community. We wouldn't be having this discussion now if it weren't for the attacks back then. So again: Blockstream and the Lightning Network crowd are trying to take over, and that's why they keep Core developers on their payroll. They want results: forcing their solution on users while keeping the network attacked and flooded.

Let it be a free market. With a 2 MB upgrade, side chains can still exist without any issue, but this time people can choose rather than be forced. Core will never keep their word on the 2 MB upgrade.
Gleb Gamow
March 08, 2016, 12:21:18 PM
Last edit: March 08, 2016, 12:41:20 PM by Gleb Gamow
 #32

Upon a second read-through, I'm fairly certain that SN didn't pen that email; perhaps MH penned it himself to advance an agenda. The prose simply doesn't feel Nakamoto-esque, and besides, why would Satoshi Nakamoto opt for no double space after a full stop?  Shocked At first I thought it was an anomaly, perhaps because the email service used to send the message disallowed the practice, but that doesn't seem to be the case, since there is one example within of a double space after a full stop.



EDIT: Call me crazy, but upon reading some of Mike's early posts, it feels to me like SN's email could very well have been penned by Mike. Try it yourself: Read the email again, then read a few posts by SN and few posts by MH and see what you come up with.

EDIT 2: Speaking of a double space after a full stop, it looks like Gavin mostly ended the practice of such shortly after it was revealed that SN employed the practice.
Lauda
March 08, 2016, 12:23:39 PM
 #33

So let's keep Satoshi's vision real. He gave Bitcoin to the community. We wouldn't be having this discussion now if it weren't for the attacks back then. So again: Blockstream and the Lightning Network crowd are trying to take over, and that's why they keep Core developers on their payroll. They want results: forcing their solution on users while keeping the network attacked and flooded.
Pure FUD. The mistakes in your argument:
1) Only a small portion of Core contributors are paid by Blockstream
2) The Lightning Network is not a product of Blockstream (they only have 1 developer working on it). There are other independent teams that are working on their own implementations of this.
3) The network gets attacked by third parties who either want to push their agenda (recently possibly done by the 'forkers') or harm the network overall.

Let it be a free market. With the 2MB upgrade, sidechains will still exist without any issue, but this time people can choose rather than be forced. Core will never keep their word on the 2MB upgrade.
They never promised an upgrade to 2 MB block size limit.

"The Times 03/Jan/2009 Chancellor on brink of second bailout for banks"
😼 Bitcoin Core (onion)
SpiryGolden
Hero Member
*****
Offline Offline

Activity: 812
Merit: 500



View Profile
March 08, 2016, 12:46:27 PM
 #34

So let's keep Satoshi's vision real. He gave us Bitcoin. To the community. We wouldn't have this discussion now if it wasn't for attacks back then. So again: Blockstream and the Lightning Network are trying to take over, and that's why they keep Core developers on their payroll. They want results: forcing their solution upon users while keeping the network attacked and flooded.
Pure FUD. The mistakes in your argument:
1) Only a small portion of Core contributors are paid by Blockstream
2) The Lightning Network is not a product of Blockstream (they only have 1 developer working on it). There are other independent teams that are working on their own implementations of this.
3) The network gets attacked by third parties who either want to push their agenda (recently possibly done by the 'forkers') or harm the network overall.

Let it be a free market. With the 2MB upgrade, sidechains will still exist without any issue, but this time people can choose rather than be forced. Core will never keep their word on the 2MB upgrade.
They never promised an upgrade to 2 MB block size limit.

Well, even if you didn't want to, you still proved my point. It would have been too obvious to go full retard. They will take Bitcoin piece by piece, slowly but surely. Do I have to teach you the tactics of takeovers? Do you have any idea how a big system or corporation gets taken over without triggering any big alarms? Bitcoin will be owned by Blockstream, a VC- and bank-sponsored company.

We need to move on from the corrupted hands of Bitcoin Core as soon as possible for a free payment system and not their play toy.

P.S.: So a roadmap is not a promise? It is in their roadmap. But somewhere in 2017... 2018, or maybe never; enough time for their plans to fall into place and gain momentum. How else will they push their sidechains? Everyone will ignore them as long as the Bitcoin chain is free and fast and cheap to use. It's so obvious. It's simple, basic economics.
franky1
Legendary
*
Offline Offline

Activity: 4214
Merit: 4458



View Profile
March 08, 2016, 01:01:12 PM
 #35

We need not appeal to authority. Just because Satoshi invented it, that does not mean that he knows the answers to everything.

just because blockstream hold the commit keys of one of the 12 implementations does not mean we should bow down to only their wishes.
or are you happy that blockstream is an authority and they have god-like knowledge. by the way, have you read a line of code yet to satisfy your own mind that bitcoin is not written in java? or do you have to go run and ask a few blockstreamers to spoonfeed you the info


I DO NOT TRADE OR ACT AS ESCROW ON THIS FORUM EVER.
Please do your own research & respect what is written here as both opinion & information gleaned from experience. many people replying with insults but no on-topic content substance, automatically are 'facepalmed' and yawned at
SpiryGolden
Hero Member
*****
Offline Offline

Activity: 812
Merit: 500



View Profile
March 08, 2016, 01:11:51 PM
 #36

We need not appeal to authority. Just because Satoshi invented it, that does not mean that he knows the answers to everything.

just because blockstream hold the commit keys of one of the 12 implementations does not mean we should bow down to only their wishes.
or are you happy that blockstream is an authority and they have god-like knowledge. by the way, have you read a line of code yet to satisfy your own mind that bitcoin is not written in java? or do you have to go run and ask a few blockstreamers to spoonfeed you the info



A+++. Is it me, or are Bitcoin Core and Core supporters starting to be somehow in Blockstream's pockets? It is so obvious. Exactly like an elephant in a small, small room.
Lauda
Legendary
*
Offline Offline

Activity: 2674
Merit: 2965


Terminated.


View Profile WWW
March 08, 2016, 01:46:46 PM
 #37

Well, even if you didn't want to, you still proved my point. It would have been too obvious to go full retard. They will take Bitcoin piece by piece, slowly but surely. Do I have to teach you the tactics of takeovers? Do you have any idea how a big system or corporation gets taken over without triggering any big alarms? Bitcoin will be owned by Blockstream, a VC- and bank-sponsored company.
I've proved nothing. Bitcoin can't be owned by anyone. Your post makes no sense if you are in support of Classic which has strong support from another nice company (Coinbase). I wonder why.

We need to move on from the corrupted hands of Bitcoin Core as soon as possible for a free payment system and not their play toy.
No. Nothing is free in life.

P.S: So a roadmap is not a promise? It is in their roadmap. But somewhere in 2017 ...2018 or maybe never, enough time so their plans to get in place and gain momentum. How else they will push up their sidechains? Everyone will ignore them as long as Bitcoin chain is free and fast and cheap to use. It so obvious. It's simple and basic economics.
In what roadmap does it say that there will be a 2 MB block size limit? They already have their commercial sidechain and the elements used in it are open source.

"The Times 03/Jan/2009 Chancellor on brink of second bailout for banks"
😼 Bitcoin Core (onion)
sgbett
Legendary
*
Offline Offline

Activity: 2576
Merit: 1087



View Profile
March 08, 2016, 01:49:48 PM
 #38

This is because they are doing what is being termed as SPV mining (not normal or should I say "proper" mining).

When you mine in this manner you risk ending up on a fork (as happened last year to the Chinese mining pools after a soft-fork).

So if you had huge blocks and incredibly good hardware (and bandwidth within your pool of miners) then you'd just play games with those that do SPV mining and make them all end up on forks (after a while they would simply give up trying).

I hope you are starting to understand that SPV mining is *not good enough* (and the more you scale things up the worse it gets).


A couple posts back I had to correct your flawed understanding of SPV mining. If there is something I have said that is factually incorrect please refute it with evidence and not opinion.

SPV mining is perfectly good at mitigating the scenario you described, in which a malicious actor creates a poison block in order to "own" the blockchain and collect all the rewards. Now that you have the correct understanding of how SPV works, I think it should be clear to you.

Your premise that this bad actor would play games with SPV miners, was covered by my second question:

How quickly do you think other miners will ignore your 'malicious' blocks?

You said they had to accept the blocks because they were valid. However, in the situation where the bad actor is gaming SPV nodes, it quickly becomes obvious that although the block may be valid, the miner is malicious. So whilst the block may be valid, it is in the miners' own best economic interest to reject all blocks from the bad actor. It is the bad actor who is forked off the network.

The reality of whether SPV mining is good enough is complicated and involves very many variables, so neither you nor I can make definitive judgements about its overall efficacy.

I would invite you to consider what happens with transaction fees in a 64MB full-block scenario, and whether the risk/reward for SPV mining still favours mining empty blocks.

In doing so consider also these fairly reasonable assumptions:

64MB is a long way off even by the most optimistic big-block proposal (e.g. under BIP101 we wouldn't hit it for another decade).

Work has already been done on reducing the bandwidth necessary for block propagation, and thus the time.

For those 64MB blocks to be full, it would have to be viable for miners to fill them. The size of blocks is self-limiting according to the risk/reward of including transactions vs being orphaned.

In 2026 the block reward is predicted to be 3.125BTC

If a 64MB block could be mined in a reasonable timeframe such that the chance of orphan was low enough to be acceptable to miners.

If the fee density remained consistent at 10,000 satoshis/kB, then the total fees on a 64MB block would be 6.4 BTC.

FWIW, I think the incentive to mine an empty block is somewhat reduced as a result of scaling.
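The fee arithmetic in that scenario can be sanity-checked with a small sketch (the 3.125 BTC subsidy and the 10,000 sat/kB fee density are the post's own assumptions, and `total_fees_btc` is a hypothetical helper, not anything from Bitcoin's codebase):

```python
SATOSHIS_PER_BTC = 100_000_000

def total_fees_btc(block_size_kb: int, fee_density_sat_per_kb: int) -> float:
    """Total fees collected if every kB of block space pays the given fee density."""
    return block_size_kb * fee_density_sat_per_kb / SATOSHIS_PER_BTC

block_subsidy = 3.125                  # projected subsidy around 2026, per the post
fees = total_fees_btc(64_000, 10_000)  # a full 64 MB block at 10k sat/kB

print(fees)                  # 6.4 BTC in fees
print(fees > block_subsidy)  # True: fees alone would exceed the subsidy
```

Under these assumptions the fees on one full block already exceed the block subsidy, which is the sense in which scaling weakens the incentive to mine empty.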

"A purely peer-to-peer version of electronic cash would allow online payments to be sent directly from one party to another without going through a financial institution" - Satoshi Nakamoto
*my posts are not investment advice*
CIYAM
Legendary
*
Offline Offline

Activity: 1890
Merit: 1075


Ian Knowles - CIYAM Lead Developer


View Profile WWW
March 08, 2016, 01:51:50 PM
 #39

A couple posts back I had to correct your flawed understanding of SPV mining. If there is something I have said that is factually incorrect please refute it with evidence and not opinion.

Sorry - I had to ignore the rest of your rubbish wall of text to ask you to repeat your correction of *my flawed understanding* (of anything to do with Bitcoin - let alone SPV mining).

Please explain (I know - it is hard for you to even put more than one sentence of any logical value together but please try)?

(you are starting to act like that idiot @franky1 - maybe you are an alt of his - as his habit is to post walls of text starting with fallacies)

With CIYAM anyone can create 100% generated C++ web applications in literally minutes.

GPG Public Key | 1ciyam3htJit1feGa26p2wQ4aw6KFTejU
CIYAM
Legendary
*
Offline Offline

Activity: 1890
Merit: 1075


Ian Knowles - CIYAM Lead Developer


View Profile WWW
March 08, 2016, 02:07:44 PM
 #40

Hmm.. having trouble coming up with a response (or are you too busy getting your ass whipped by CarltonBanks in the other topic)? Cheesy

With CIYAM anyone can create 100% generated C++ web applications in literally minutes.

GPG Public Key | 1ciyam3htJit1feGa26p2wQ4aw6KFTejU
calkob
Hero Member
*****
Offline Offline

Activity: 1092
Merit: 520


View Profile
March 08, 2016, 02:27:46 PM
 #41

It is prophetic when we realize that he spelled "Bitcoin" with a capital "B"

It's like he could see the future or something


He could; he was from 2120.  That's the year the Node Wars start, as humanity battles it out over node fees being introduced.... Grin
sgbett
Legendary
*
Offline Offline

Activity: 2576
Merit: 1087



View Profile
March 08, 2016, 02:40:30 PM
 #42

A couple posts back I had to correct your flawed understanding of SPV mining. If there is something I have said that is factually incorrect please refute it with evidence and not opinion.

Sorry - I had to ignore the rest of your rubbish wall of text to ask you to repeat your correction of *my flawed understanding* (of anything to do with Bitcoin - let alone SPV mining).

Please explain (I know - it is hard for you to even put more than one sentence of any logical value together but please try)?

(you are starting to act like that idiot @franky1 - maybe you are an alt of his - as his habit is to post walls of text starting with fallacies)


In one sentence Smiley

You only broadcast that to your miners (if you are a pool) not to everyone else (you do know how this stuff works or don't you?).


"A purely peer-to-peer version of electronic cash would allow online payments to be sent directly from one party to another without going through a financial institution" - Satoshi Nakamoto
*my posts are not investment advice*
CIYAM
Legendary
*
Offline Offline

Activity: 1890
Merit: 1075


Ian Knowles - CIYAM Lead Developer


View Profile WWW
March 08, 2016, 02:43:43 PM
 #43

In one sentence Smiley

You only broadcast that to your miners (if you are a pool) not to everyone else (you do know how this stuff works or don't you?).

And?

Let me try and enlighten you: https://en.bitcoin.it/wiki/Network

Quote
Anyone who is generating will collect valid received transactions and work on including them in a block. When someone does find a block, they send an inv containing it to all of their peers, as above. It works the same as transactions.

And then this:

Quote
Thin SPV Clients

BIP 0037 introduced support for thin or lite clients by way of Simple Payment Verification. SPV clients do not need to download the full block contents to verify the existence of funds in the blockchain, but rely on the chain of block headers and bloom filters to obtain the data they need from other nodes. This method of client communication allows high-security trustless communication with full nodes, but at the expense of some privacy, as the peers can deduce which addresses the SPV client is seeking information about.

MultiBit and Bitcoin Wallet work in this fashion using the library bitcoinj as their foundation.

Perhaps that might help in particular (if you need any further "spoon feeding" then feel free to ask for it).
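To make the header-chain idea from the quote above concrete, here is a toy sketch of the header-only linkage check an SPV client performs. The byte layout is invented for illustration (real Bitcoin headers are 80 bytes with a specific serialization, and clients also verify the proof-of-work target); only the double SHA-256 and the commit-to-predecessor structure match the real protocol:

```python
import hashlib

def header_hash(header: bytes) -> bytes:
    """Bitcoin identifies block headers by double SHA-256."""
    return hashlib.sha256(hashlib.sha256(header).digest()).digest()

def spv_check_chain(headers: list) -> bool:
    """Header-only check an SPV client can do: each header must commit to
    the hash of its predecessor. (Real clients also verify the PoW target;
    the full 80-byte serialization is elided here for brevity.)"""
    for prev, cur in zip(headers, headers[1:]):
        # In this toy encoding, a header is: 32-byte prev-hash || payload.
        if cur[:32] != header_hash(prev):
            return False
    return True

# Build a tiny three-header chain and verify it.
h0 = b"\x00" * 32 + b"genesis"
h1 = header_hash(h0) + b"block 1"
h2 = header_hash(h1) + b"block 2"
print(spv_check_chain([h0, h1, h2]))  # True
print(spv_check_chain([h0, h2]))      # False: h2 does not commit to h0
```

The point is that this check needs only the headers, not the block bodies, which is why thin clients can follow the chain cheaply.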

So not only have you demonstrated that you don't understand how the Bitcoin protocol works but you have also made a complete idiot of yourself trying to discredit me - still want to pursue this line of attack?

With CIYAM anyone can create 100% generated C++ web applications in literally minutes.

GPG Public Key | 1ciyam3htJit1feGa26p2wQ4aw6KFTejU
sgbett
Legendary
*
Offline Offline

Activity: 2576
Merit: 1087



View Profile
March 08, 2016, 03:04:08 PM
 #44

I am always ready to learn, and quite happy to concede when I am wrong about something.

There are two things I would like specifically to learn, that you might be able to explain.

1. Where in the quoted text you provided is your assertion supported?

2. How and/or why BW.com would mine an empty block (401303) if they were not mining on just the block header from block 401302 which was mined by Antpool.

"A purely peer-to-peer version of electronic cash would allow online payments to be sent directly from one party to another without going through a financial institution" - Satoshi Nakamoto
*my posts are not investment advice*
CIYAM
Legendary
*
Offline Offline

Activity: 1890
Merit: 1075


Ian Knowles - CIYAM Lead Developer


View Profile WWW
March 08, 2016, 03:06:50 PM
 #45

I am always ready to learn, and quite happy to concede when I am wrong about something.

Strange, but although you are ignorant and keep posting rubbish, you haven't conceded a single thing yet keep trying to attack me with your stupid nonsensical posts.

If you aren't @franky1's alt then you ought to team up with him as between the two of you perhaps you might almost have a quarter of a brain. Cheesy

Unfortunately it is pointless to try and explain SPV mining to someone as dense as you as you simply can't grasp it (no matter how many times it is explained to you).

Perhaps try hiring a "brain coach" or the like?
(some people might even coach you out of pity for free - but that ain't me as I have a rather low tolerance of morons)

With CIYAM anyone can create 100% generated C++ web applications in literally minutes.

GPG Public Key | 1ciyam3htJit1feGa26p2wQ4aw6KFTejU
sgbett
Legendary
*
Offline Offline

Activity: 2576
Merit: 1087



View Profile
March 08, 2016, 03:35:03 PM
 #46

I am always ready to learn, and quite happy to concede when I am wrong about something.

Strange - but although you are ignorant and keep posting rubbish you haven't conceded a single thing - yet keep trying to attack me with your stupid nonsensical posts.

If you aren't @franky1's alt then you ought to team up with him as between the two of you perhaps you might almost have a quarter of a brain. Cheesy

Unfortunately it is pointless to try and explain SPV mining to someone as dense as you as you simply can't grasp it (no matter how many times it is explained to you).

Perhaps try hiring a "brain coach" or the like?
(some people might even coach you out of pity for free - but that ain't me as I have a rather low tolerance of morons)


Yes, I haven't conceded that I am wrong because you said:

You only broadcast that to your miners (if you are a pool) not to everyone else (you do know how this stuff works or don't you?).

Which is contradicted by evidence that I have posted (rubbish you called it) that suggests this is not true.

I am open minded enough to accept that I have misunderstood something. That every source I have was equally mistaken. That the evidence I have provided I have misinterpreted. (There is a chance BW.COM actually had the full block 401302, but just decided to mine an empty block, for example)

I'm wide open to be proven wrong, and I am more than happy to concede that you were correct about block headers only being sent to miners within your pool (though this does sound a bit like some kind of block withholding attack).

Given that the whole premise of your argument about 64MB blocks is based on what *appears to be* a flawed understanding, I think it's fairly important to establish exactly how SPV mining works in this regard, so that others who may read this thread don't make the same mistakes I may have done.

"A purely peer-to-peer version of electronic cash would allow online payments to be sent directly from one party to another without going through a financial institution" - Satoshi Nakamoto
*my posts are not investment advice*
CIYAM
Legendary
*
Offline Offline

Activity: 1890
Merit: 1075


Ian Knowles - CIYAM Lead Developer


View Profile WWW
March 08, 2016, 03:40:16 PM
 #47

SPV mining *is not proper mining*.

Is that not clear to you?

It is irrelevant whether or not they use the SPV protocol to do that (and in fact they don't, as they use the "relay network", which existed long before SPV support was added to Bitcoin), so your desperate attempts at trying to discredit me are failing and just making you look even more stupid.

If you mine that way then you end up on a fork (as has already happened) and as I have already tried to explain (but seemingly you are incapable of understanding) the larger you make the blocks then the easier it would be to trick other miners into mining on such forks.

And btw - the only *flawed understanding* here is your own.

With CIYAM anyone can create 100% generated C++ web applications in literally minutes.

GPG Public Key | 1ciyam3htJit1feGa26p2wQ4aw6KFTejU
Cuidler
Sr. Member
****
Offline Offline

Activity: 294
Merit: 250


View Profile
March 08, 2016, 03:41:05 PM
 #48

The only way to find out whether Bitcoin can scale larger than the Visa network is to remove the block size limit and let the free market discover what block sizes miners are able to handle. The ones supporting an artificial cap are regulators (tyrants) who don't believe the free market can find the optimal solution by itself, or protectionists who believe basically every piece of hardware, like Raspberry Pis, should be capable of running a full node - ignoring the reality that every compute-intensive piece of software trying to be competitive on the market must set its minimum requirements at about an average home PC's spec, or it becomes unused over time.

Liqui Exchange - Trade and earn 24% / year on BTC, LTC, ETH. Brand NEW - Payouts every 24h. Learn more at official thread
QuestionAuthority
Legendary
*
Offline Offline

Activity: 2156
Merit: 1393


You lead and I'll watch you walk away.


View Profile
March 08, 2016, 03:47:18 PM
 #49

I can't believe how many people are blaspheming our lord and savior Satoshi in this thread. You can't call him a visionary genius in one breath and a dumbass in another.

Just raise the blocksize already and see what happens. If it fails then it was always a stupid idea and fuck Bitcoin. If it succeeds then cheer in the streets about how brilliant you all are for investing in this wonderful new techonology. What ever you bitches do, don't spend the next 2 years pissing and moaning about the problem while taking no action.

CIYAM
Legendary
*
Offline Offline

Activity: 1890
Merit: 1075


Ian Knowles - CIYAM Lead Developer


View Profile WWW
March 08, 2016, 03:48:37 PM
 #50

Just raise the blocksize already and see what happens.

Why - because you say so?

Hey - let's just raise it to 64MB then see what happens. Cheesy

(who cares if billions of dollars are destroyed - what's important is that we have fun)

With CIYAM anyone can create 100% generated C++ web applications in literally minutes.

GPG Public Key | 1ciyam3htJit1feGa26p2wQ4aw6KFTejU
QuestionAuthority
Legendary
*
Offline Offline

Activity: 2156
Merit: 1393


You lead and I'll watch you walk away.


View Profile
March 08, 2016, 03:51:22 PM
 #51

Just raise the blocksize already and see what happens.

Why - because you say so?

Hey - let's just raise it to 64MB then see what happens. Cheesy


Well fuck dude. No one is doing shit except bitching at each other. Make some god damn decision already.

CIYAM
Legendary
*
Offline Offline

Activity: 1890
Merit: 1075


Ian Knowles - CIYAM Lead Developer


View Profile WWW
March 08, 2016, 03:52:30 PM
 #52

Well fuck dude. No one is doing shit except bitching at each other. Make some god damn decision already.

There is plenty of stuff going on - it is just the idiots on this forum (who are mostly seemingly paid by Gavin's supporters) who need to have a block size increase *this week*.

In the "real world" no-one is even using Bitcoin for much at all (do all your family and friends use it for daily txs?).

Supposedly, according to Mike Hearn, it should already have died, yet strangely enough the price is still over 400 USD (higher than when he declared it dead in the first place).

With CIYAM anyone can create 100% generated C++ web applications in literally minutes.

GPG Public Key | 1ciyam3htJit1feGa26p2wQ4aw6KFTejU
road to morocco
Newbie
*
Offline Offline

Activity: 56
Merit: 0


View Profile
March 08, 2016, 03:54:19 PM
 #53

Just raise the blocksize already and see what happens.

Why - because you say so?

Hey - let's just raise it to 64MB then see what happens. Cheesy

What do you think would happen? People who didn't spam 1MB blocks would suddenly want to spend 64 times as much money to spam the 64MB blocks?
QuestionAuthority
Legendary
*
Offline Offline

Activity: 2156
Merit: 1393


You lead and I'll watch you walk away.


View Profile
March 08, 2016, 03:54:33 PM
 #54

Well fuck dude. No one is doing shit except bitching at each other. Make some god damn decision already.

There is plenty of stuff going on - it is just the idiots on this forum (who are mostly seemingly paid by Gavin's supporters) who need to have a block size increase *this week*.


So what's going to happen? What can we expect? More rage quits? More secret meetings? You yourself have called for the miners to take action. Is that going to happen?

CIYAM
Legendary
*
Offline Offline

Activity: 1890
Merit: 1075


Ian Knowles - CIYAM Lead Developer


View Profile WWW
March 08, 2016, 03:56:08 PM
 #55

So what's going to happen? What can we expect? More rage quits? More secret meetings? You yourself have called for the miners to take action. Is that going to happen?

It seems that the miners are mostly backing Bitcoin Core but I am not acting on behalf of Bitcoin Core (as I'm sure many want to accuse me of).

At the end of the day it is up to them as to whether a fork happens or not.

One thing I do know about is the Chinese (as I live in China), and that is that they do not co-operate with each other (something that many westerners who come to do business here find initially rather surprising, considering China has the world's largest population for a single country).

With CIYAM anyone can create 100% generated C++ web applications in literally minutes.

GPG Public Key | 1ciyam3htJit1feGa26p2wQ4aw6KFTejU
QuestionAuthority
Legendary
*
Offline Offline

Activity: 2156
Merit: 1393


You lead and I'll watch you walk away.


View Profile
March 08, 2016, 03:58:23 PM
 #56

So what's going to happen? What can we expect? More rage quits? More secret meetings? You yourself have called for the miners to take action. Is that going to happen?

It seems that the miners are mostly backing Bitcoin Core but I am not acting on behalf of Bitcoin Core (as I'm sure many want to accuse me of).

At the end of the day it is up to them as to whether a fork happens or not.

One thing I do know about is Chinese (as I live in China) - they do not co-operate with each other.


Do you know how much longer this two year old discussion/decision is going to take to implement?

CIYAM
Legendary
*
Offline Offline

Activity: 1890
Merit: 1075


Ian Knowles - CIYAM Lead Developer


View Profile WWW
March 08, 2016, 04:00:01 PM
 #57

Do you know how much longer this two year old discussion/decision is going to take to implement?

From what I gather the SegWit stuff should be able to be done in a couple of months and then we'd expect further improvements later this year (and maybe a hard fork early to mid next year).

If this is handled properly (and not with the "kicking the can" approach that Gavin and others keep trying to push) then this should become a non-issue within a year or so.

With CIYAM anyone can create 100% generated C++ web applications in literally minutes.

GPG Public Key | 1ciyam3htJit1feGa26p2wQ4aw6KFTejU
YarkoL
Legendary
*
Offline Offline

Activity: 996
Merit: 1012


View Profile
March 08, 2016, 04:03:41 PM
 #58

That's very interesting, so what he's saying is that the network should have no problems handling an increase in the amount of transactions then. Was this limit put in to prevent people from spamming transactions until there was enough network hashing power to cope with an increase if I'm understanding this correctly?

Here's some more interesting early history

For what it's worth: 

I'm the guy who went over the blockchain stuff in Satoshi's first cut of the bitcoin code.  Satoshi didn't have a 1MB limit in it. The limit was originally Hal Finney's idea.  Both Satoshi and I objected that it wouldn't scale at 1MB.  Hal was concerned about a potential DoS attack though, and after discussion, Satoshi agreed.  The 1MB limit was there by the time Bitcoin launched.  But all 3 of us agreed that 1MB had to be temporary because it would never scale.


Unlike the current Core devs, Satoshi was not obsessed with adversaries.
He thought Moore's law would keep up with the scalability demands, and
he was ok with validation restricted to specialized nodes "with multiple GPU cards" as
he puts it in the message to Hearn.

“God does not play dice"
CIYAM
Legendary
*
Offline Offline

Activity: 1890
Merit: 1075


Ian Knowles - CIYAM Lead Developer


View Profile WWW
March 08, 2016, 04:06:58 PM
 #59

He thought Moore's law would keep up with the scalability demands...

Moore's law has never applied to bandwidth (so if Satoshi really thought that, then clearly he wasn't a genius, was he?).

Or perhaps you think that because Satoshi could never be wrong then if he said that "Moore's Law" applies to bandwidth it actually does?

With CIYAM anyone can create 100% generated C++ web applications in literally minutes.

GPG Public Key | 1ciyam3htJit1feGa26p2wQ4aw6KFTejU
QuestionAuthority
Legendary
*
Offline Offline

Activity: 2156
Merit: 1393


You lead and I'll watch you walk away.


View Profile
March 08, 2016, 04:08:10 PM
 #60

Do you know how much longer this two year old discussion/decision is going to take to implement?

From what I gather the SegWit stuff should be able to be done in a couple of months and then we'd expect further improvements later this year (and maybe a hard fork early to mid next year).

If this is handled properly (and not with the "kicking the can" approach that Gavin and others keep trying to push) then this should become a non-issue within a year or so.


If that schedule actually happens, that's great. Maybe I just have too much faith in Bitcoin. I don't think it can be broken or destroyed that easily, even by someone as self-important and manipulative as Andresen.

CIYAM
Legendary
*
Offline Offline

Activity: 1890
Merit: 1075


Ian Knowles - CIYAM Lead Developer


View Profile WWW
March 08, 2016, 04:09:57 PM
 #61

I don't think it can be broken or destroyed that easily. Even by someone as self important and manipulative as Andresen.

I hope you are right (and do think that it is more likely that Bitcoin will get through this tough time and end up even more resilient because of that).

With CIYAM anyone can create 100% generated C++ web applications in literally minutes.

GPG Public Key | 1ciyam3htJit1feGa26p2wQ4aw6KFTejU
Cuidler
Sr. Member
****
Offline Offline

Activity: 294
Merit: 250


View Profile
March 08, 2016, 04:14:15 PM
 #62

At the end of the day it is up to them as to whether a fork happens or not.

Miners need to be sure the market will accept the fork change - the tens to hundreds of thousands of individual Bitcoin miners act rationally overall, after all, which is why no rushed change is likely unless something needs to be fixed immediately.


From what I gather the SegWit stuff should be able to be done in a couple of months and then we'd expect further improvements later this year (and maybe a hard fork early to mid next year).

The changes to the code/structures with SegWit are so big that much more testing is necessary. Fortunately, Bitcoin Classic has no plans for SegWit in the near future, so the likelihood of SegWit activation this year is small - but that is a good thing - enough time for proper testing and a bug-free implementation.

Liqui Exchange - Trade and earn 24% / year on BTC, LTC, ETH. Brand NEW - Payouts every 24h. Learn more at official thread
CIYAM
Legendary
*
Offline Offline

Activity: 1890
Merit: 1075


Ian Knowles - CIYAM Lead Developer


View Profile WWW
March 08, 2016, 04:16:29 PM
 #63

The changes to the code/structures with SegWit are so big that much more testing is necessary. Fortunately, Bitcoin Classic has no plans for SegWit in the near future, so the likelihood of SegWit activation this year is small - but that is a good thing - enough time for proper testing and a bug-free implementation.

Actually that is rubbish - the amount of code for SegWit is around 500 lines (which is not very much as I often write that much in a day) and it has been in testing for months.

Stop trying to shill for Bitcoin Classic - it is just another attempt by Gavin to take over control of Bitcoin.

Gavin is hoping that everyone is going to want to see "The Return Of The King" but I think that most of us actually would prefer to listen to a https://en.wikipedia.org/wiki/A_Farewell_to_Kings.

With CIYAM anyone can create 100% generated C++ web applications in literally minutes.

GPG Public Key | 1ciyam3htJit1feGa26p2wQ4aw6KFTejU
YarkoL
Legendary
*
Offline Offline

Activity: 996
Merit: 1012


View Profile
March 08, 2016, 04:20:35 PM
 #64

He thought Moore's law would keep up with the scalability demands...

Moore's law has never applied to bandwidth (so if Satoshi really thought that, then clearly he wasn't a genius, was he?).

Or perhaps you think that because Satoshi could never be wrong then if he said that "Moore's Law" applies to bandwidth it actually does?


You can read his thoughts in the opening post, although they are
subject to different interpretations.

Bandwidth is just one factor in validation.

Satoshi had profound insight into systemic dynamics, and I wouldn't
hesitate to call him a genius. But I don't think he was omniscient.

“God does not play dice"
CIYAM
Legendary
*
Offline Offline

Activity: 1890
Merit: 1075


Ian Knowles - CIYAM Lead Developer


View Profile WWW
March 08, 2016, 04:24:18 PM
 #65

Bandwidth is just one factor in validation.

Moore's law has actually begun to break down in a number of ways (with "storage" perhaps being the only area where it still holds, even though initially it referred only to computing power).

This is in part due to the ~3GHz limit at which CPUs can operate (without requiring seriously expensive cooling), and the fact that bandwidth has never followed such a trend (instead it tends to jump by quite a bit every few years and then stagnate for quite a while after that).

Most improvements in processing speed these days are made through multi-threading (which can be applied to block validation) but that doesn't help when you are doing serialisation of information.
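The multi-threading point can be illustrated with a toy sketch: independent per-transaction checks fan out across workers, while a chained (serialised) computation cannot. This uses hashlib as a stand-in for real signature verification - purely illustrative, not Bitcoin's actual validation code:

```python
# Sketch: block validation parallelises across independent checks, but a
# serial step (each hash depending on the previous one) gains nothing from
# extra cores. hashlib stands in for expensive ECDSA verification here.
from concurrent.futures import ThreadPoolExecutor
import hashlib

def verify_tx(tx: bytes, expected_digest: bytes) -> bool:
    """Stand-in for an expensive, independent per-transaction check."""
    return hashlib.sha256(tx).digest() == expected_digest

def validate_block(txs):
    """Independent checks: safe to fan out across worker threads."""
    expected = [hashlib.sha256(tx).digest() for tx in txs]
    with ThreadPoolExecutor() as pool:
        return list(pool.map(verify_tx, txs, expected))

def chain_headers(headers):
    """Serial step: each hash depends on the previous result - no speedup."""
    h = b"\x00" * 32
    for header in headers:
        h = hashlib.sha256(h + header).digest()
    return h

txs = [b"tx%d" % i for i in range(100)]
assert all(validate_block(txs))                         # parallelisable part
tip = chain_headers([b"hdr%d" % i for i in range(10)])  # inherently serial part
assert len(tip) == 32
```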

Cuidler
Sr. Member
****
Offline Offline

Activity: 294
Merit: 250


View Profile
March 08, 2016, 04:32:07 PM
 #66

The changes to the code/structures with SegWit are so big that much more testing is necessary. Fortunately Bitcoin Classic has no plans for SegWit in the near future, so the likelihood of SegWit activation this year is small - but that is a good thing - enough time for proper testing and a bug-free implementation.

Actually that is rubbish - SegWit amounts to around 500 lines of code (which is not very much, as I often write that much in a day) and it has been in testing for months.

What about the recent unexpected fork on the SegWit testnet then?
When you find an unexpected bug in your code, my experience says to test everything twice over, to reduce the chance that any part of the code contains a bug under unusual circumstances you did not foresee but that are possible.

PS: Not all 500 lines are equally complicated to test Wink

.Liqui Exchange.Trade and earn 24% / year on BTC, LTC, ETH
....Brand NEW..........................................Payouts every 24h. Learn more at official thread
valiz
Sr. Member
****
Offline Offline

Activity: 471
Merit: 250


BTC trader


View Profile
March 08, 2016, 04:32:49 PM
 #67

Can someone explain to me why there is any debate when Nakamoto himself said:

Quote
The existing Visa credit card network processes about 15 million Internet purchases per day worldwide. Bitcoin can already scale much larger than that with existing hardware for a fraction of the cost. It never really hits a scale ceiling. If you're interested, I can go over the ways it would cope with extreme size.  By Moore's Law, we can expect hardware speed to be 10 times faster in 5 years and 100 times faster in 10. Even if Bitcoin grows at crazy adoption rates, I think computer speeds will stay ahead of the number of transactions.
Assumption #1 from 2009 : By Moore's Law, we can expect hardware speed to be 10 times faster in 5 years and 100 times faster in 10.
If this assumption is on track, then, by now, hardware speed should be 25 times faster than in 2009. Is Satoshi correct on this one?
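For reference, the "25 times" figure follows from compounding Satoshi's stated rate - 10x every 5 years - over the roughly 7 years from 2009 to 2016:

```python
# Back-of-the-envelope check of the "25 times faster by now" figure:
# "10x in 5 years" compounds as 10 ** (years / 5).
def moores_law_factor(years: float) -> float:
    return 10 ** (years / 5)

assert round(moores_law_factor(5)) == 10     # 10x in 5 years
assert round(moores_law_factor(10)) == 100   # 100x in 10 years
assert round(moores_law_factor(7)) == 25     # 2009 -> 2016, as claimed above
```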

Quote
I don't anticipate that fees will be needed anytime soon, but if it becomes too burdensome to run a node, it is possible to run a node that only processes transactions that include a transaction fee. The owner of the node would decide the minimum fee they'll accept. Right now, such a node would get nothing, because nobody includes a fee, but if enough nodes did that, then users would get faster acceptance if they include a fee, or slower if they don't. The fee the market would settle on should be minimal. If a node requires a higher fee, that node would be passing up all transactions with lower fees. It could do more volume and probably make more money by processing as many paying transactions as it can. The transition is not controlled by some human in charge of the system though, just individuals reacting on their own to market forces.
Assumption #2 from 2009: 1 MINER = 1 NODE (all miners run full nodes and all full nodes are mining)
Is Satoshi correct on this one?

Quote
Eventually, most nodes may be run by specialists with multiple GPU cards. For now, it's nice that anyone with a PC can play without worrying about what video card they have, and hopefully it'll stay that way for a while. More computers are shipping with fairly decent GPUs these days, so maybe later we'll transition to that.
Assumption #3 from 2009: The best mining technology will be GPUs.
Is Satoshi correct on this one?

If all 3 assumptions made in 2009 by Satoshi are correct, then Bitcoin may scale like VISA. Otherwise, Satoshi is wrong and it may not scale the way it was originally intended to.

Let's elaborate on the scenarios:
#1 correct, #2 correct, #3 correct: Bitcoin scales like VISA, yay!!
#1 correct, #2 correct, #3 incorrect: Bitcoin scales like VISA, but only miners with ASICs have nodes => few nodes = centralization = bad
#1 correct, #2 incorrect, #3 correct: Bitcoin scales like VISA, but nodes are run only by volunteers, miners don't exactly need them => few nodes = centralization = bad
#1 correct, #2 incorrect, #3 incorrect: Bitcoin scales like VISA, but a few mining pools and ASIC farms remain, nodes are run mainly by volunteers => few nodes = centralization = bad
#1 incorrect, #2 correct, #3 correct: The miners with the fastest CPUs and bandwidth get an increasing advantage as Bitcoin scales, eventually few nodes = centralization = bad
#1 incorrect, #2 correct, #3 incorrect: The miners with the best ASICs and bandwidth get an increasing advantage as Bitcoin scales, eventually few nodes = centralization = bad
#1 incorrect, #2 incorrect, #3 correct: As Bitcoin scales, nodes run by volunteers disappear, only a few nodes run by some of the miners remain = centralization = bad
#1 incorrect, #2 incorrect, #3 incorrect: As Bitcoin scales, nodes run by volunteers disappear, only a few nodes run by some big ASIC farms or mining pools remain = centralization = bad

So, if Satoshi is wrong on one of the 3 assumptions, then we have to trade "scaling" with "centralization". VISA is already scaled and centralized. So, if we scale Bitcoin like this, then Bitcoin becomes just like VISA, or an inefficient competitor to VISA.

12c3DnfNrfgnnJ3RovFpaCDGDeS6LMkfTN "who lives by QE dies by QE"
CIYAM
Legendary
*
Offline Offline

Activity: 1890
Merit: 1075


Ian Knowles - CIYAM Lead Developer


View Profile WWW
March 08, 2016, 04:36:15 PM
 #68

Bitcoin cannot scale to work like VISA with the way it works because VISA isn't doing POW nor are its nodes designed to not trust one another.

If you want to use VISA - why not use VISA?
(it's already there and it works perfectly right now)

Kprawn
Legendary
*
Offline Offline

Activity: 1904
Merit: 1073


View Profile
March 08, 2016, 04:37:01 PM
 #69

Whatever his mistakes in what he said back in the day, he is still a brilliant entity {He / She / It / They}. If he had made no mistakes, I would have classed him as an alien or a god, but he is human. The scaled block sizes are there for a reason: to grow with adoption and to be adapted as the need arises. Classic proponents want us to believe that need is NOW, and Core followers seem to think kicking the can down the road is the best option to follow. I think the doubt about scaling solutions is a bigger threat than the actual change that needs to be made to address the current need.  Sad

THE FIRST DECENTRALIZED & PLAYER-OWNED CASINO
.EARNBET..EARN BITCOIN: DIVIDENDS
FOR-LIFETIME & MUCH MORE.
. BET WITH: BTCETHEOSLTCBCHWAXXRPBNB
.JOIN US: GITLABTWITTERTELEGRAM
valiz
Sr. Member
****
Offline Offline

Activity: 471
Merit: 250


BTC trader


View Profile
March 08, 2016, 04:46:48 PM
 #70

We need to increase the block size NOW or fees will rise dramatically and transactions will become unreliable; this will hinder adoption.

Issue #1: The transaction fees don't appear to be rising dramatically.
Issue #2: The transactions that have a decent fee are reliable.
Issue #3: Fees are very small compared to the amounts transacted; how would they hinder adoption?
Issue #4: Why does Bitcoin need "adoption" if it damages its qualities? If you want "adoption", you want to get rich quick. Greed is bad.
Issue #5: They have been saying this for many months; it hasn't happened.

CIYAM
Legendary
*
Offline Offline

Activity: 1890
Merit: 1075


Ian Knowles - CIYAM Lead Developer


View Profile WWW
March 08, 2016, 04:49:09 PM
 #71

We need to increase the block size NOW or fees will rise dramatically and transactions will become unreliable; this will hinder adoption.

Issue #1: The transaction fees don't appear to be rising dramatically.

Let's just look at that and think about how much we should actually consider reading anything else that you might have to post.

Hmm... nope - not worth reading anything further IMO (unable to make any sense in only two lines of text as you contradicted yourself that quickly).

hv_
Legendary
*
Offline Offline

Activity: 2506
Merit: 1055

Clean Code and Scale


View Profile WWW
March 08, 2016, 04:49:54 PM
 #72

We need to increase the block size NOW or fees will rise dramatically and transactions will become unreliable; this will hinder adoption.

Issue #1: The transaction fees don't appear to be rising dramatically.
Issue #2: The transactions that have a decent fee are reliable.
Issue #3: Fees are very small compared to the amounts transacted; how would they hinder adoption?
Issue #4: Why does Bitcoin need "adoption" if it damages its qualities? If you want "adoption", you want to get rich quick. Greed is bad.
Issue #5: They have been saying this for many months; it hasn't happened.


Doing nothing would just be anti-agile and backwards....

Carpe diem  -  understand the White Paper and mine honest.
Fix real world issues: Check out b-vote.com
The simple way is the genius way - Satoshi's Rules: humana veris _
YarkoL
Legendary
*
Offline Offline

Activity: 996
Merit: 1012


View Profile
March 08, 2016, 05:07:49 PM
 #73



If you want to use VISA - why not use VISA?
(it's already there and it works perfectly right now)


Quote
While the system works well enough for most transactions, it still suffers from the inherent
weaknesses of the trust based model.
Completely non-reversible transactions are not really possible, since financial institutions cannot
avoid mediating disputes. The cost of mediation increases transaction costs, limiting the
minimum practical transaction size and cutting off the possibility for small casual transactions,
and there is a broader cost in the loss of ability to make non-reversible payments for
non-reversible services....
What is needed is an electronic payment system based on cryptographic proof instead of trust,
allowing any two willing parties to transact directly with each other without the need for a trusted
third party...

I leave the source of the quote as an exercise.

CIYAM
Legendary
*
Offline Offline

Activity: 1890
Merit: 1075


Ian Knowles - CIYAM Lead Developer


View Profile WWW
March 08, 2016, 05:09:47 PM
 #74

I leave the source of the quote as an exercise.

That wasn't my point (and you know it).

YarkoL
Legendary
*
Offline Offline

Activity: 996
Merit: 1012


View Profile
March 08, 2016, 05:14:44 PM
 #75


You said Visa works perfectly, so let's use that.

rienelber
Full Member
***
Offline Offline

Activity: 154
Merit: 100


View Profile
March 08, 2016, 05:20:01 PM
 #76

this is probably the most important discussion in the bitcoin world.


scalability is crucial if we want bitcoin to survive!
Cuidler
Sr. Member
****
Offline Offline

Activity: 294
Merit: 250


View Profile
March 08, 2016, 05:36:11 PM
 #77

Judging by the number of users who want to scale Bitcoin - keeping it a peer-to-peer decentralized system with affordable on-chain transactions, versus Bitcoin as just a settlement layer for more or less centralized off-chain services - I'm sure the dark side cannot win and the original Satoshi vision for the Bitcoin project will be followed.

If we raise the on-chain limit, people can still choose off-chain centralized services if those really are better than on-chain transactions, so why cap the block size limit to force people onto those off-chain centralized services Huh Are you afraid, Blockstream, that there is no real demand for such off-chain centralized services among Bitcoin users Wink

Meuh6879
Legendary
*
Offline Offline

Activity: 1512
Merit: 1011



View Profile
March 08, 2016, 06:08:04 PM
 #78

Satoshi did plan for Bitcoin to compete with PayPal/Visa in traffic volumes.

With an initial block limit of 33.5 MB: Yes.
CIYAM
Legendary
*
Offline Offline

Activity: 1890
Merit: 1075


Ian Knowles - CIYAM Lead Developer


View Profile WWW
March 08, 2016, 06:09:20 PM
 #79

Satoshi did plan for Bitcoin to compete with PayPal/Visa in traffic volumes.

With an initial block limit of 33.5 MB: Yes.

And you do realise that if someone created such a block, no-one else could verify it within the 10 minutes before the next block is created, right?

(or - you don't care about the fact that this just wouldn't work?)
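Whether that is true depends entirely on assumed hardware. A back-of-the-envelope sketch (every number here - average transaction size, signatures per transaction, verification rate - is an illustrative assumption, not a measurement):

```python
# Rough estimate of whether a 33.5 MB block could be verified inside one
# 10-minute block interval. All parameters below are illustrative guesses.
def verify_seconds(block_mb, avg_tx_bytes=500, sigs_per_tx=2, sigs_per_sec=4000):
    n_tx = block_mb * 1_000_000 / avg_tx_bytes      # transactions in the block
    return n_tx * sigs_per_tx / sigs_per_sec        # total signature-check time

# 33.5 MB -> ~67,000 txs -> ~134,000 sig checks -> ~33 s at 4,000 sigs/s
assert verify_seconds(33.5) < 600                   # fits, on fast hardware...
assert verify_seconds(33.5, sigs_per_sec=200) > 600 # ...but not on slow nodes
```

The conclusion flips entirely with the assumed signatures-per-second rate, which is the crux of the disagreement in this thread.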

Zarathustra
Legendary
*
Offline Offline

Activity: 1162
Merit: 1004



View Profile
March 08, 2016, 06:17:15 PM
 #80

Satoshi did plan for Bitcoin to compete with PayPal/Visa in traffic volumes.

With an initial block limit of 33.5 MB: Yes.

And you do realise that if someone created such a block, no-one else could verify it within the 10 minutes before the next block is created, right?

(or - you don't care about the fact that this just wouldn't work?)

That's why nobody creates such blocks, you fool.
Lauda
Legendary
*
Offline Offline

Activity: 2674
Merit: 2965


Terminated.


View Profile WWW
March 08, 2016, 06:17:53 PM
 #81

With an initial block limit of 33.5 MB: Yes.
Wrong. Let's quickly analyze this:
Visa does 2,000 TPS (transactions per second) on average, with a peak capacity of around 50,000 TPS. Bitcoin has a theoretical maximum of 7 TPS at a 1 MB block size limit; realistically, the TPS is currently around 3. Multiply that by ~33 and you end up with 99 TPS. It might be able to compete with snails at a 33 MB block size limit, but certainly not with Visa.
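The arithmetic above, made explicit. The ~7 TPS ceiling follows from one ~1 MB block every 600 seconds with transactions assumed to average ~250 bytes:

```python
# Lauda's figures, spelled out. The 250-byte average tx size is the common
# assumption behind the "7 TPS" number, not a measured constant.
BLOCK_INTERVAL_S = 600                         # one block per ~10 minutes

def max_tps(block_mb, avg_tx_bytes=250):
    return block_mb * 1_000_000 / avg_tx_bytes / BLOCK_INTERVAL_S

assert round(max_tps(1)) == 7                  # ~7 TPS at a 1 MB limit
realistic_tps = 3                              # observed rate, per the post
assert realistic_tps * 33 == 99                # ~99 TPS at a 33 MB limit
assert realistic_tps * 33 < 2000               # still below Visa's *average*
```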

And you do realise that if someone created such a block, no-one else could verify it within the 10 minutes before the next block is created, right?
This problem can happen at a 2 MB block size limit.

scalability is crucial if we want bitcoin to survive!
Increasing the block size limit does not improve scalability.

"The Times 03/Jan/2009 Chancellor on brink of second bailout for banks"
😼 Bitcoin Core (onion)
CIYAM
Legendary
*
Offline Offline

Activity: 1890
Merit: 1075


Ian Knowles - CIYAM Lead Developer


View Profile WWW
March 08, 2016, 06:18:51 PM
 #82

That's why nobody creates such blocks, you fool.

I'm sorry "you fool" but why on earth would you want such a high limit then?

Seemingly you are such a "fool" you don't even understand the implications of what that might mean do you?

Cheesy

Amph
Legendary
*
Offline Offline

Activity: 3206
Merit: 1069



View Profile
March 08, 2016, 06:21:42 PM
 #83

Satoshi did plan for Bitcoin to compete with PayPal/Visa in traffic volumes.

With an initial block limit of 33.5 MB: Yes.

And you do realise that if someone created such a block, no-one else could verify it within the 10 minutes before the next block is created, right?

(or - you don't care about the fact that this just wouldn't work?)

Then how was it possible to verify such blocks at the beginning, if that limit was there when Bitcoin was born?
Zarathustra
Legendary
*
Offline Offline

Activity: 1162
Merit: 1004



View Profile
March 08, 2016, 06:22:09 PM
 #84


Increasing the block size limit does not improve scalability.

Segwit reduces scalability.
CIYAM
Legendary
*
Offline Offline

Activity: 1890
Merit: 1075


Ian Knowles - CIYAM Lead Developer


View Profile WWW
March 08, 2016, 06:23:35 PM
 #85

Then how was it possible to verify such blocks at the beginning, if that limit was there when Bitcoin was born?

A post from the "mega-retard" - how refreshing.

If you bother to look at the entire history of Bitcoin blocks just see how many were bigger than 1MB before 2010.

(come back to me after you've done that research although I'm guessing your lazy ad-sigging ass won't bother to do so)

Zarathustra
Legendary
*
Offline Offline

Activity: 1162
Merit: 1004



View Profile
March 08, 2016, 06:25:28 PM
 #86

That's why nobody creates such blocks, you fool.

I'm sorry "you fool" but why on earth would you want such a high limit then?

Seemingly you are such a "fool" you don't even understand the implications of what that might mean do you?

Cheesy


A limit is for you fools only, to calm you down. Otherwise you would shit in your pants.
Bitcoin Unlimited is for men.
Amph
Legendary
*
Offline Offline

Activity: 3206
Merit: 1069



View Profile
March 08, 2016, 06:26:09 PM
 #87

Then how was it possible to verify such blocks at the beginning, if that limit was there when Bitcoin was born?

A post from the "mega-retard" - how refreshing.

If you bother to look at the entire history of Bitcoin blocks just see how many were bigger than 1MB before 2010.

(come back to me after you've done that research although I'm guessing your lazy ad-sigging ass won't bother to do so)


Well then your point is moot - right now only a few blocks even come close to 1 MB - so again, why would 32 MB not work for the time being?
CIYAM
Legendary
*
Offline Offline

Activity: 1890
Merit: 1075


Ian Knowles - CIYAM Lead Developer


View Profile WWW
March 08, 2016, 06:26:22 PM
 #88

A limit is for you fools only, to calm you down. Otherwise you would shit in your pants.
Bitcoin Unlimited is for men.

I see - those with particularly big appendages I would assume then?

CIYAM
Legendary
*
Offline Offline

Activity: 1890
Merit: 1075


Ian Knowles - CIYAM Lead Developer


View Profile WWW
March 08, 2016, 06:27:31 PM
 #89

Well then your point is moot - right now only a few blocks even come close to 1 MB - so again, why would 32 MB not work for the time being?

Because *idiot* practically no nodes could verify all the sigs that would be in a 32MB block in 10 minutes.

Can you comprehend that?

(or do we need to translate it into "baby language" that you can understand?)

franky1
Legendary
*
Offline Offline

Activity: 4214
Merit: 4458



View Profile
March 08, 2016, 06:34:47 PM
Last edit: March 08, 2016, 07:03:24 PM by franky1
 #90

ciyams answer to everything: no real data, no real stats, no real scenarios. just insulting anyone whose mindset is not in blockstreams corporate pockets.

guess what. ciyam actually debated 4mb before as a doomsday scenario.. but as soon as it was realised that segwit (his gods plan) allowed for up to 4mb of data per block.. he changed his ways.

technology didnt change overnight. but the words of his gods bible did.

its funny, any idea that is not blockstreams must be insulted. not using real knowledge or data, but purely emotion.

i really wish he would take 5 minutes out to relax and have a coffee, cool down his emotion and reply using his knowledge. because until recently, he used to actually make logical sense.

for instance, in the last 20 minutes he has made a couple of replies that are basically vapour arguments with a lot of insults.

if he instead took a 5 minute break to gather his thoughts into an idea, and spent the remaining 15 minutes writing a proper reply that had context, then that reply would (maybe) have been a winning reply.
yet even after asking him many times on different topics to take a break and think up a genuine rebuttal, he does not.

now for my opinion.
bitcoin does not need to reach the scale of Visa by 2017.. bitcoin can achieve this over a 10-15 year period, without issue.
and once you realise that an X blocklimit growing over the years + segwit + sidechains can easily surpass visa worldwide in 15 years, where it took visa 60 years.

that said, if we just concentrate on one ledger of visa and one ledger of bitcoin (not sidechains), then bitcoin with just 2mb+segwit:
(to avoid manually repeating myself)
if we apply that average of 40 transactions a year* to the 104 million US visa card holders**, that is 4.16 billion transactions a year,

while 2mb+segwit (8000 tx per ten minutes) gives 420 million transactions a year.

so bitcoin with 2mb+segwit can handle 10% of visa americas ledger.

*http://www.statista.com/statistics/279275/average-credit-card-transaction-volume-per-card-worldwide/
**http://budgeting.thenest.com/percentage-americans-credit-cards-30856.html

so imagine in 15 years bitcoin was 20mb+segwit. that would be on par with visa USA.
then imagine the sidechains had similar capacity each:

sidechainA comparable to Visa Europe
sidechainB comparable to Visa Asia
sidechainC comparable to Visa australia

but sticking with 1mb+segwit and trying to push hard for sidechains is pushing people onto a less secure network,
especially if blockstream only wants to double the blocklimit every 8 years -
meaning 4mb+segwit+sidechains in 2026 and 8mb+segwit+sidechains in 2032.. meaning bitcoin wont be on par with visa america until well into 2040 (24 years).
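franky1's figures above can be checked in a few lines; the 8,000 tx per ten minutes capacity for 2 MB + SegWit is his assumption, and the card-holder numbers come from his linked sources:

```python
# Checking the annual-throughput comparison in the post above.
tx_per_10min = 8000                            # assumed 2 MB + SegWit capacity
blocks_per_year = 6 * 24 * 365                 # one block every 10 minutes
btc_tx_per_year = tx_per_10min * blocks_per_year

visa_us_tx_per_year = 104_000_000 * 40         # 104M holders x 40 tx/year

assert btc_tx_per_year == 420_480_000          # ~420 million, as stated
assert visa_us_tx_per_year == 4_160_000_000    # 4.16 billion, as stated
assert round(100 * btc_tx_per_year / visa_us_tx_per_year) == 10  # ~10%
```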

I DO NOT TRADE OR ACT AS ESCROW ON THIS FORUM EVER.
Please do your own research & respect what is written here as both opinion & information gleaned from experience. many people replying with insults but no on-topic content substance, automatically are 'facepalmed' and yawned at
sgbett
Legendary
*
Offline Offline

Activity: 2576
Merit: 1087



View Profile
March 08, 2016, 06:48:06 PM
 #91

SPV mining *is not proper mining*.

Is that not clear to you?

It is irrelevant whether or not they use the SPV protocol to do that (and in fact they don't, as they use the "relay network", which existed a long time before SPV support was added to Bitcoin), so your desperate attempts at trying to discredit me are failing and just making you look even more stupid.

If you mine that way then you end up on a fork (as has already happened) and as I have already tried to explain (but seemingly you are incapable of understanding) the larger you make the blocks then the easier it would be to trick other miners into mining on such forks.

And btw - the only *flawed understanding* here is your own.


You are way off topic. Lets zoom back out again.

At the start of this discussion, you made an assertion that a 64MB block takes a long time to validate, and that such a block could be crafted by a miner in order to "own" the bitcoin network. Specifically that this miner could "get all the fees and block rewards".

I asked three questions, the first was the most important. The answer you gave implied that you thought block validation time was the key to the malicious actor owning the network.

You had not stated this explicitly so to clarify, I asserted that the validation time of a block has no bearing on whether other miners can start mining the next block, as they can mine knowing just the block header.

I labelled this as being "SPV mining" (but as you rightly point out in your previous post could have been obtained through the Relay Network). The salient point was that any miner has the potential to mine the next block, because they have the block header and did not need to have (yet) validated the full block.

However you refuted this, stating that "everyone else" (other miners) would not know the header, because the header was only broadcast to "your miners (if you are a pool)":

You only broadcast that to your miners (if you are a pool) not to everyone else (you do know how this stuff works or don't you?).

Your original premise - that the blockchain can be "owned" by crafting an expensive-to-validate block - relies on miners having to validate a block before they can start mining on top of it, and therefore relies on your refutation.

There is circumstantial evidence in the blockchain (blocks 401302/401303) that your statement is incorrect, which you have not offered any explanation for.

You have not provided any factual evidence for your claim above, and so I have to assume that it is wrong.

If the statement is wrong then you demonstrably have a flawed understanding.

If your original assertion relies on a flawed understanding then the original assertion cannot be held true.

"A purely peer-to-peer version of electronic cash would allow online payments to be sent directly from one party to another without going through a financial institution" - Satoshi Nakamoto
*my posts are not investment advice*
Amph
Legendary
*
Offline Offline

Activity: 3206
Merit: 1069



View Profile
March 08, 2016, 06:48:57 PM
 #92

Well then your point is moot - right now only a few blocks even come close to 1 MB - so again, why would 32 MB not work for the time being?

Because *idiot* practically no nodes could verify all the sigs that would be in a 32MB block in 10 minutes.

Can you comprehend that?

(or do we need to translate it into "baby language" that you can understand?)

lol are you following me or what, then again, how was it possible to verify such blocks back then?

you're not understanding, apparently - having 32 mb right now will not change anything, because we are almost in the same condition as before: we only need to verify less than 1 mb, because blocks are not even full at 1 mb, let alone 32

also, about your signature concern: https://en.bitcoin.it/wiki/Scalability
CIYAM
Legendary
*
Offline Offline

Activity: 1890
Merit: 1075


Ian Knowles - CIYAM Lead Developer


View Profile WWW
March 08, 2016, 06:50:55 PM
 #93

You are way off topic. Lets zoom back out again.

You are an idiot (and most likely @franky1 who I've already put on ignore).

I am putting you on ignore as well so feel free to post your rubbish (I don't think that anyone is actually taking you seriously anyway but I don't actually care).

CIYAM
Legendary
*
Offline Offline

Activity: 1890
Merit: 1075


Ian Knowles - CIYAM Lead Developer


View Profile WWW
March 08, 2016, 06:53:43 PM
 #94

you're not understanding, apparently - having 32 mb right now will not change anything, because we are almost in the same condition as before: we only need to verify less than 1 mb, because blocks are not even full at 1 mb, let alone 32

Amph - you obviously are unable to reason (I think that is why you are known as the "useless poster" is it not?) - if we made the block size so big you don't think that some group could decide to create huge blocks (with their own txs even) to fuck up everyone else (that would be struggling even to verify such blocks in 10 minutes)?

Don't compare what we have seen up until now, when the limit of 1MB has been in place since before any 1MB block was ever created (of course this is probably all too hard for you to understand, so perhaps just post somewhere else to keep up your ad-sig posting count).

Amph
Legendary
*
Offline Offline

Activity: 3206
Merit: 1069



View Profile
March 08, 2016, 06:57:34 PM
 #95

the signature concern is not even a concern; the wiki link that i've posted explains how, with some optimization, it is possible to increase the number of signatures verified per second

but well, i don't like to argue with someone who only knows how to insult people
CIYAM
Legendary
*
Offline Offline

Activity: 1890
Merit: 1075


Ian Knowles - CIYAM Lead Developer


View Profile WWW
March 08, 2016, 06:58:40 PM
 #96

the signature concern is not even a concern, in the link wiki that i've posted it explain how with some optimization it is possible to increment the number of signature per second

Aha - and that optimisation can do that 10x or 100x can it?

(I am only rude to people who are rude to me btw and if you want to be taken seriously then please dump your ad sig for a start)

Lauda
Legendary
*
Offline Offline

Activity: 2674
Merit: 2965


Terminated.


View Profile WWW
March 08, 2016, 06:59:31 PM
 #97

Amph - you obviously are unable to reason (I think that is why you are known as the "useless poster" is it not?) - if we made the block size so big you don't think that some group could decide to create huge blocks (with their own txs even) to fuck up everyone else (that would be struggling even to verify such blocks in 10 minutes)?
You're wasting your time with him. Obviously there is a good reason why a 2 MB block size limit is dangerous (validation time); proposing 32 MB right now would be a bad joke at best.

Aha - and that optimisation can do that 10x or 100x can it?
No. He read something on the Wiki that he doesn't even understand.

"The Times 03/Jan/2009 Chancellor on brink of second bailout for banks"
😼 Bitcoin Core (onion)
Zarathustra
Legendary
*
Offline Offline

Activity: 1162
Merit: 1004



View Profile
March 08, 2016, 07:02:01 PM
 #98

ciyams answer to everything. no real data, no real stats no real scenarios. just insulting anyone whos mindset is not in blockstreams corporate pockets.

guess what. ciyam actually debated 4mb before as a doomsdays scenario.. but as soon as it was realised that segwit(his gods plan) allowed for upto 4mb of data per block.. he changed his ways.

technology didnt change overnight. but the words of his gods bible did.

its funny, any idea that is not blockstreams must be insulted. not using real knowledge or data, but purely emotion.

i really wish he would take 5 minutes out to relax and have a coffee, cool down his emotion and reply using his knowledge. because until recently, he use to actually make logical sense.

for instance. in the last 20 minutes he has made a couple replies that are basically vapour arguments with alot of insults.

if he instead took a 5 minute break to gather his thoughts into an idea. and spent the remaining 15 minutes writing a proper reply that had context. then that reply would (maybe) have been a winning reply.
yet even after asking him many times on different topics to take a break and think up a genuine rebuttle. he does not.

now for my opinion.
bitcoin does not need to reach the scales of Visa by 2017.. bitcoin can achieve this over a 10-15 period. without issue.
and once you realise that a Xblocklimit growing over the years +segwit + sidechains can easily surpass visa worldwide in 15 years, where it took visa 60 years.

that said if we just concentrate on one ledger of visa and one ledger of bitcoin (not sidechain). then bitcoin with just 2mb+segwit
(to avoid manually repeating myself)
if we apply that average 40 transactions a year* to the 104million US visa card holders**, then that is an average of 4.160billion transactions a year

which if we take 2mb+segwit (8000tx per ten minutes) is 420million transactions a year.

so bitcoin with 2mb+segwit can handle 10% of visa americas ledger.




*http://www.statista.com/statistics/279275/average-credit-card-transaction-volume-per-card-worldwide/
**http://budgeting.thenest.com/percentage-americans-credit-cards-30856.html
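a quick sanity check of those numbers (the 40 tx/year, 104 million cardholders, and ~8000 tx per block figures are taken from the post above; the per-block capacity is an estimate, not a measured figure):

```python
# Rough capacity comparison using the assumed figures from the post above:
# ~40 card transactions per holder per year, ~104M US Visa holders,
# ~8000 tx per ten-minute block with 2MB+segwit.
visa_tx_per_year = 40 * 104_000_000      # 4.16 billion
blocks_per_year = 6 * 24 * 365           # one block every ten minutes
btc_tx_per_year = 8000 * blocks_per_year # ~420 million

print(f"Visa (US):          {visa_tx_per_year:,}")
print(f"Bitcoin 2MB+segwit: {btc_tx_per_year:,}")
print(f"Share: {btc_tx_per_year / visa_tx_per_year:.0%}")
```

running this gives 4,160,000,000 vs 420,480,000, i.e. roughly 10%, matching the post.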

Zarathustra
Legendary
March 08, 2016, 07:03:56 PM
 #99

Amph - you obviously are unable to reason (I think that is why you are known as the "useless poster" is it not?) - if we made the block size so big you don't think that some group could decide to create huge blocks (with their own txs even) to fuck up everyone else (that would be struggling even to verify such blocks in 10 minutes)?
You're wasting your time with him. Obviously there is a good reason for which a 2 MB block size limit is dangerous; proposing 32 MB right now would be a bad joke at best (now).


LOL, 2MB dangerous but the 4MB of your idols not.
Dumb - dumber - small blockers.
franky1
Legendary
March 08, 2016, 07:04:54 PM
 #100

You're wasting your time with him. Obviously there is a good reason for which a 2 MB block size limit is dangerous;

and there is the hat trick.
2mb dangerous???

yet strangely segwit's 4mb is not dangerous??

you can't have it both ways

I DO NOT TRADE OR ACT AS ESCROW ON THIS FORUM EVER.
Please do your own research & respect what is written here as both opinion & information gleaned from experience. many people replying with insults but no on-topic content substance, automatically are 'facepalmed' and yawned at
David Rabahy
Hero Member
March 08, 2016, 07:28:11 PM
 #101

I would be willing to run a full node on a testnet to see if my system could handle larger blocks, i.e. verify a large block in less than the average time between blocks.

I have a question:  The total amount of work to verify N 1MB blocks is about the same as a single N-MB block, right?  For example, 32 1MB blocks take about the same amount of work to verify as a single 32MB block, right?  Just please ignore the live delivery of blocks for the moment.  Or is there some advantage to large blocks in that fewer headers have to be processed?  Imagine a full node was off the air for a day or two and is just trying to catch up as fast as possible.  What block size facilitates that best?

To me it seems fees tend to be inversely proportional to block size, i.e. with smaller blocks fees rise as folks compete to get into blocks, with larger blocks fees get smaller with less competition to get into blocks.  What does it cost a bad actor (if there is truly such a thing in this realm) to clog up the works?  I suppose we are looking for the right size of block to cause them to expend their resources most quickly.  Make the block size very small and the fee competition would rise high enough to deplete the bad actor very fast; everyone suffers higher fees until they are run out of town (so to speak).  Hmm, but if the block size is very small then even when there aren't any bad actors on the scene, regular legit users would be forced to compete.  At the other end of the spectrum; make the block size very large and with such low competition fees would diminish.  The real question here is what happens to the fees/MB across the spectrum of block sizes.

Is there *anyone* preferring a smaller than 1MB block size right now?  I haven't heard of any but you never know.  I do think some miners do artificially constrain the block size they produce to like 900KB or so (I'm not sure of their motivation).  Even if the block size were increased then such miners could still constrain the ones they produce, right?

A transaction cannot span multiple blocks, right?  I suppose the block size creates a functional limit on transaction sizes.  Or is the size of a transaction constrained some other way?
Amph
Legendary
March 08, 2016, 07:31:27 PM
 #102

the signature concern is not even a concern; in the wiki link that i've posted it explains how, with some optimization, it is possible to increase the number of signatures verified per second

Aha - and that optimisation can do that 10x or 100x can it?

(I am only rude to people who are rude to me btw and if you want to be taken seriously then please dump your ad sig for a start)


what does my sig have to do with our discussion? there are trolls, even worse ones, among those without sigs... and anyway i'm not paid for more than 100 posts per week, and guess what, this is the 200th+ post....

and anyway yes, those optimizations can reach a 10x increase and even higher



obviously lauda is joking, because in 2017 they are going to implement the 2MB hard fork anyway

there is no real valid concern against 2MB; all i see is nonsense fud
David Rabahy
Hero Member
March 08, 2016, 07:34:00 PM
 #103

One thing that seems apparent to me is the lack of willingness to compromise.  Appeasement is a powerful marketing tool.  Could we reasonably raise the block size limit to 1.1MB without wrecking Bitcoin?  Wouldn't the good will generated be worth it?  Along the way we might learn something important.  I fully realize the 2MB being bandied about is already a giant compromise down from the 32MB or 8MB sizes being proposed before.  Is there something special about doubling?  It can be set to 1.1MB easily, right?
Lauda
Legendary
March 08, 2016, 07:39:40 PM
 #104

I have a question:  The total amount work to verify N 1MB blocks is about the same as single N-MB block, right?  For example, 32 1MB blocks take about the same amount of work to verify as a single 32MB block, right?  
No. The scaling of validation time is quadratic (look up quadratic growth if unsure what this means). In other words, 32 1 MB blocks != a single 32 MB block. Segwit aims to scale down the validation time and make it linear. Classic (BIP109) adds a sigops limitation to prevent this from happening (so not a solution, but a limitation on transaction size, IIRC). If anyone claims that this is false, then they are saying that all the people who signed the Core roadmap are wrong/lying (the 2 MB block size limit is mentioned there, IIRC).

It can be set to 1.1MB easily, right?
That would work.

"The Times 03/Jan/2009 Chancellor on brink of second bailout for banks"
😼 Bitcoin Core (onion)
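Lauda's quadratic-validation point can be sketched with a toy cost model (this is not actual Bitcoin code, and the byte counts are illustrative assumptions): under the legacy sighash scheme, each signature check re-serializes and hashes roughly the whole transaction, so total bytes hashed grow quadratically when both size and signature count double, while a segwit-style scheme hashes each byte roughly once.

```python
# Toy model: legacy sighash re-hashes ~the whole tx once per signature,
# so bytes hashed ~ n_sigs * tx_size; segwit-style hashing is ~linear.
def legacy_bytes_hashed(tx_size: int, n_sigs: int) -> int:
    return n_sigs * tx_size          # each sig re-serializes the full tx

def segwit_bytes_hashed(tx_size: int, n_sigs: int) -> int:
    return tx_size + 32 * n_sigs     # hash the tx once, plus per-sig digests

# doubling a tx doubles both size and sig count -> 4x legacy work
small = legacy_bytes_hashed(1_000_000, 5_000)
big = legacy_bytes_hashed(2_000_000, 10_000)
print(big // small)  # -> 4
```

under the same doubling, the segwit-style model only doubles its work, which is the "make it linear" claim in rough form.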
franky1
Legendary
March 08, 2016, 07:44:10 PM
 #105

I have a question:  The total amount work to verify N 1MB blocks is about the same as single N-MB block, right?  For example, 32 1MB blocks take about the same amount of work to verify as a single 32MB block, right?  Just please ignore the live delivery of blocks for the moment.  Or is there some advantage to large blocks where less headers have to be processed.  Imagine a full node was off the air for a day or two and is just trying to catch up as fast as possible.  What block size facilitates that best?

a much easier way to get an initial baseline number to compare is to start from scratch, and time how long it takes to resync from 0 to the latest block
..then do the maths

EG
someone pointed out to lauda that at a 1mbit internet connection http://bitcoinstats.com/irc/bitcoin-dev/logs/2016/01/17#l1453064029.0
it would take 12 days to resync and validate 400,000 blocks

so basing it on a very slow connection is a good basis of capability
which is 400,000 / 12 / 24 / 60 = roughly 23 blocks a minute, so let's call that 1 block in under 3 seconds.

that is the basic total propagation time including download time using a 1mbit connection speed.

though it would be useful to work out how long it takes to validate the data without the connection speed hindering it.
and also to know the total propagation time at varying internet speeds too

so i wish you luck with your investigations and i hope they give some conclusive results
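the division is easy to check (blocks and days are the figures from the linked IRC log; whether 12 days is representative is its own question):

```python
# 400,000 blocks resynced in 12 days on a 1 Mbit connection (claimed above)
blocks, days = 400_000, 12
per_minute = blocks / (days * 24 * 60)
print(round(per_minute, 1))       # ~23.1 blocks validated per minute
print(round(60 / per_minute, 1))  # ~2.6 seconds per block on average
```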

franky1
Legendary
March 08, 2016, 07:49:22 PM
 #106

One thing that seems apparent to me is the lack of willingness to compromise.  Appeasement is a powerful marketing tool.  Could we reasonably raise the block size limit to 1.1MB without wrecking Bitcoin?  Wouldn't the good will generated be worth it?  Along the way we might learn something important.  I fully realize the 2MB being bandied about is already a giant compromise down from the 32MB or 8MB sizes being proposed before.  Is there something special about doubling?  It can be set to 1.1MB easily, right?

1.1mb is an empty gesture, and solves nothing long term. meaning it's just a poke in the gut, knowing that more growth would be needed soon.

the 2mb is not forcing miners to make 2mb blocks. it's a BUFFER to allow for growth without having to demand that core keep changing the rules every month.

just like in 2013, when miners were fully able to use the 1mb buffer, they did not jump straight to 0.95mb; they grew slowly over months and months, without having to ask core to change it endlessly from 0.5 to 0.55 to 0.6 to 0.65.

meaning even with a 2mb buffer, miners can set a preferential limit of 1.1mb and still have 45% of growth potential (up to the 2mb hard limit) not even tapped into, without needing to beg core to alter anything for months or years; whereas a 1.1mb hard limit requires endless debates.

imagine it: a 2mb buffer, and miners grow slowly month after month, growing by 0.1mb when they are happy to, without hindrance or demands.

and while you are investigating validation times.. please validate a 450kb block vs a 900kb block and see if the whole quadratic buzzword holds weight.
as it would be an interesting answer (as a comparison between 900kb vs 1800kb, which is not measurable on the bitcoin network yet)
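the headroom arithmetic behind that 45% figure (the soft and hard limit values are the ones proposed in the post):

```python
# the "buffer" idea: miners' own soft limit grows under a fixed hard limit
hard_limit_mb = 2.0   # consensus rule
soft_limit_mb = 1.1   # miner preference
headroom = (hard_limit_mb - soft_limit_mb) / hard_limit_mb
print(f"{headroom:.0%} of the hard limit still untapped")  # -> 45%
```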


David Rabahy
Hero Member
March 08, 2016, 07:49:29 PM
 #107

I have a question:  The total amount work to verify N 1MB blocks is about the same as single N-MB block, right?  For example, 32 1MB blocks take about the same amount of work to verify as a single 32MB block, right?  
No. The scaling of validation time is quadratic (look up quadratic growth if unsure what this means). In other words, 32 1 MB blocks != a single 32 MB block. Segwit aims to scale down the validation time and make it linear. Classic (BIP109) adds a sigops limitation to prevent this from happening (so not a solution, but limitation to size of TX IIRC). If anyone claims that this is false or whatever, that means they are saying that all the people who signed the Core roadmap are wrong/lying (2 MB block size limit is mentioned there IIRC).

It can be set to 1.1MB easily, right?
That would work.
Thank you Lauda:  https://en.wikipedia.org/wiki/Quadratic_growth ... ouchie; so if a 1MB block with y transactions in it takes x seconds to validate, then 32 similar 1MB blocks will take about 32x seconds, but a 32MB block can be expected to take about 32²·x ≈ 1024x seconds.  Or is the quadratic growth on something other than transaction count?
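in other words (a toy comparison, taking x as the 1MB validation time and assuming purely quadratic scaling in block size):

```python
# 32 separate 1MB blocks vs one 32MB block, if validation time ~ size^2
def n_small_blocks(n: int, x: float = 1.0) -> float:
    return n * x        # linear: n blocks at x seconds each

def one_big_block(n: int, x: float = 1.0) -> float:
    return n ** 2 * x   # quadratic: one n-MB block

print(n_small_blocks(32), one_big_block(32))  # -> 32.0 1024.0
```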
sgbett
Legendary
March 08, 2016, 07:52:21 PM
 #108

You are way off topic. Lets zoom back out again.

You are an idiot (and most likely @franky1 who I've already put on ignore).

I am putting you on ignore as well so feel free to post your rubbish (I don't think that anyone is actually taking you seriously anyway but I don't actually care).


As you are not refuting anything I have said, I will assume that is because you cannot.

The only logical conclusion is that you were in fact wrong, and so your comments about my lack of understanding are demonstrably false.

As I have now cleared up any misconceptions about what I do or do not know, I don't think there is anything left to discuss.

If you do come up with any factual evidence I'll be happy to reopen a dialogue.

"A purely peer-to-peer version of electronic cash would allow online payments to be sent directly from one party to another without going through a financial institution" - Satoshi Nakamoto
*my posts are not investment advice*
David Rabahy
Hero Member
March 08, 2016, 07:54:30 PM
 #109

One thing that seems apparent to me is the lack of willingness to compromise.  Appeasement is a powerful marketing tool.  Could we reasonably raise the block size limit to 1.1MB without wrecking Bitcoin?  Wouldn't the good will generated be worth it?  Along the way we might learn something important.  I fully realize the 2MB being bandied about is already a giant compromise down from the 32MB or 8MB sizes being proposed before.  Is there something special about doubling?  It can be set to 1.1MB easily, right?
1.1mb is an empty gesture. and solves nothing long term. meaning its just a poke in the gut knowing that more growth would be needed soon.

the 2mb is not forcing miners to make 2mb blocks. its a BUFFER to allow for growth without having to demand that core keep chaing the rules every month.

meaning even with 2mb buffer. miners can set a preferential limit of 1.1mb and still have 45% of growth potential(the 2mb hard limit) not even tapped into and not needing to beg core to alter for months-years

imagine it. 2mb buffer and miners grow slowly month after month growing by 0.1mb when they are happy to. without hindrance or demands

just like in 2013 when miners were fully able to use the 1mb buffer the miners did not jump to 0.95mb, they grew slowly over months and months. without having to ask core to change..
Thank you franky1:  I understand but a 10% jump is at least something and then if all goes well the stage would be set to jump to something more, e.g. 1.2MB.  Breaking the standoff seems more important to me at this time; the world is watching us.
Lauda
Legendary
March 08, 2016, 07:57:21 PM
 #110

Thank you Lauda:  https://en.wikipedia.org/wiki/Quadratic_growth ... ouchie; so if a 1MB block with y transactions in it takes x seconds to validate then 32 similar 1MB blocks will take about 32x seconds but a 32MB block can be expected to take about (32y)²x seconds.  Or is the quadratic growth on something other than transaction count?
It can be a bit tough to understand (I feel you). The number of transactions is irrelevant, from what I understand: it is possible to construct a single transaction that fills up a block. From the Core roadmap:
Quote
In 2MB blocks, a 2MB transaction can be constructed that may take over 10 minutes to validate which opens up dangerous denial-of-service attack vectors.
The validation time of the block itself is what has quadratic scaling. From one of the workshops last year (might help understand the problem):
Quote
So how bad is it? There was a block consisting of one transaction in the coinbase, it was broadcasted about one month ago. It was f2pool that was clearing up a bunch of spam on the blockchain. The problem is that this block takes 30 seconds to validate. It's a 990 kilobyte transaction. It contains about 500,000 signature operations, which each time serializes the entire 1 MB transaction out, moves buffers around, then serializes it, then hashes it, it creates 1.25 GB. The bitcoin serializer is not fast, it's about where 60% of the validation time was. So 30 seconds to validate and that's on a beefy computer, I don't know what that is on a raspberrypi, it's not linear it's quadratic scaling... If you are doing 8 MB, then it's 2 hours and 8 minutes. There are some ways that gavinandresen has proposed that this can be fixed. This transaction would be considered non-standard on the network now, but miners can still generate these transactions.

hv_
Legendary
March 08, 2016, 07:59:18 PM
 #111

One thing that seems apparent to me is the lack of willingness to compromise.  Appeasement is a powerful marketing tool.  Could we reasonably raise the block size limit to 1.1MB without wrecking Bitcoin?  Wouldn't the good will generated be worth it?  Along the way we might learn something important.  I fully realize the 2MB being bandied about is already a giant compromise down from the 32MB or 8MB sizes being proposed before.  Is there something special about doubling?  It can be set to 1.1MB easily, right?
1.1mb is an empty gesture. and solves nothing long term. meaning its just a poke in the gut knowing that more growth would be needed soon.

the 2mb is not forcing miners to make 2mb blocks. its a BUFFER to allow for growth without having to demand that core keep chaing the rules every month.

meaning even with 2mb buffer. miners can set a preferential limit of 1.1mb and still have 45% of growth potential(the 2mb hard limit) not even tapped into and not needing to beg core to alter for months-years

imagine it. 2mb buffer and miners grow slowly month after month growing by 0.1mb when they are happy to. without hindrance or demands

just like in 2013 when miners were fully able to use the 1mb buffer the miners did not jump to 0.95mb, they grew slowly over months and months. without having to ask core to change..
Thank you franky1:  I understand but a 10% jump is at least something and then if all goes well the stage would be set to jump to something more, e.g. 1.2MB.  Breaking the standoff seems more important to me at this time; the world is watching us.

Great, watching...

Carpe diem  -  understand the White Paper and mine honest.
Fix real world issues: Check out b-vote.com
The simple way is the genius way - Satoshi's Rules: humana veris _
David Rabahy
Hero Member
March 08, 2016, 07:59:27 PM
 #112

As hard as it might be to see, I really believe the crisis in front of us is one of perception as opposed to anything technical.  Perceptions are manageable while the real work of sorting through the technical issues is taken out of the limelight.
franky1
Legendary
March 08, 2016, 08:01:02 PM
 #113

Thank you franky1:  I understand but a 10% jump is at least something and then if all goes well the stage would be set to jump to something more, e.g. 1.2MB.  Breaking the standoff seems more important to me at this time; the world is watching us.

since summer 2015 people have been asking for a buffer increase. the roadmap plans one for 2 years (summer 2017) after a year of grace(starting summer 2016, we hope).
that means when miners get to 1.1 they then have to beg for a year and then wait a grace period of a year before getting to 1.2.

its much easier to have 2mb buffer and let the miners themselves slowly increase 1.1 then 1.2 when they are ready. as a soft rule within their own code below the hard rule of consensus
after all they are not going to risk pushing too fast as their rewards would be at risk due to not only competition but also orphans, so even with a 2mb buffer we wont see miners pushing to 1.950 anytime soon, just like in 2013.. when they realised they had 50% of growth potential they could fill.. they didnt straight away

David Rabahy
Hero Member
March 08, 2016, 08:07:05 PM
 #114

Thank you Lauda:  https://en.wikipedia.org/wiki/Quadratic_growth ... ouchie; so if a 1MB block with y transactions in it takes x seconds to validate then 32 similar 1MB blocks will take about 32x seconds but a 32MB block can be expected to take about (32y)²x seconds.  Or is the quadratic growth on something other than transaction count?
It can be a bit tough to understand (I feel you). The number of transactions is irrelevant from what I understand. it is possible to construct a single transaction that would fill up a block. From the Core roadmap:
Quote
In 2MB blocks, a 2MB transaction can be constructed that may take over 10 minutes to validate which opens up dangerous denial-of-service attack vectors.
The validation time of the block itself is what has quadratic scaling. From one of the workshops last year (might help understand the problem):
Quote
So how bad is it? There was a block consisting of one transaction in the coinbase, it was broadcasted about one month ago. It was f2pool that was clearing up a bunch of spam on the blockchain. The problem is that this block takes 30 seconds to validate. It's a 990 kilobyte transaction. It contains about 500,000 signature operations, which each time serializes the entire 1 MB transaction out, moves buffers around, then serializes it, then hashes it, it creates 1.25 GB. The bitcoin serializer is not fast, it's about where 60% of the validation time was. So 30 seconds to validate and that's on a beefy computer, I don't know what that is on a raspberrypi, it's not linear it's quadratic scaling... If you are doing 8 MB, then it's 2 hours and 8 minutes. There are some ways that gavinandresen has proposed that this can be fixed. This transaction would be considered non-standard on the network now, but miners can still generate these transactions.
Sincerely, thank you Lauda:  I really do try hard to understand and your patience is much appreciated.  It did seem unreasonable for the scaling to hinge on transaction count.  Serialization is a classic scaling killer.  So, one very large transaction with numerous sigops leads to the quadratic growth.  Hmm, so blindly increasing the block size is asking for trouble.  Have many of these troublesome transactions been launched against us recently?  Or is it an unexploited vulnerability?

Perhaps we could increase the block size *and* constrain sigops/transaction until we get SegWit out the door?
adamstgBit
Legendary
March 08, 2016, 08:13:44 PM
 #115

but miners can still generate these transactions.
but the block would get orphaned?

Cuidler
Sr. Member
March 08, 2016, 08:30:40 PM
 #116

Satoshi did plan for Bitcoin to compete with PayPal/Visa in traffic volumes.

With an initial block size limit of 33.5MB: yes.

And you do realise that if someone created such a block that no-one else could verify it in the 10 minutes required to create the next block right?

(or - you don't care about the fact that this just wouldn't work?)


Not true; with BIP 109 activated there cannot be transactions with extremely long validation times. Actually this restriction on sigops (to 1.3 GB of hashed data) is recommended even for the 1 MB limit. Don't worry, leave solving problems to the experts, not to those saying it's impossible to scale on-chain, because all the evidence points to on-chain scaling being possible for many years, keeping up with demand and the necessary decentralization with full nodes running on home computers (though not necessarily every obsolete low-spec home computer!)

adamstgBit
Legendary
March 08, 2016, 08:31:33 PM
Last edit: March 08, 2016, 08:42:36 PM by adamstgBit
 #117

As hard as it might be to see, I really believe the crisis in front of us is one of perception as opposed to anything technical.  Perceptions are manageable while the real work of sorting through the technical issues is taken out of the limelight.

The network is working and it's still relatively cheap (5 cents per TX).
But we have threads titled "why is my TX not confirming??" or something to that effect.
Newbies are using bitcoin for the first time and are having a hard time, with their BTC tied up, seemingly never to confirm, and they conclude that bitcoin is not all that it's cracked up to be....
is this a problem?

I remember when i was a newbie, i would check over and over waiting for the first 6 confirmations. everything went smoothly, but i didn't fully trust that it would; i was afraid my money would get lost or something. slowly my confidence in the system grew as i used it more and understood it more.

not sure i would have been able to build any confidence had i started using bitcoin today....

SpiryGolden
Hero Member
March 08, 2016, 08:40:50 PM
 #118

He thought Moore's law would keep up with the scalability demands...

Moore's law has never applied to bandwidth (so if Satoshi really thought that then clearly he wasn't a genius was he).

Or perhaps you think that because Satoshi could never be wrong then if he said that "Moore's Law" applies to bandwidth it actually does?


Bandwidth is not a technological issue; it is a matter of centralized market control by the big players in the Internet market. I pay $10 per month for an in-house 1gbit fiber optic connection without any issue. Except that the fiber optic is kinda sensitive; i got it broken a few times *lol*. No bandwidth issue. On the same computer that runs a Bitcoin node, I also stream full HD games via Steam. So again, what did you say about bandwidth? I downloaded the blockchain at 25mbit/sec. It finished in exactly three and a half hours.

My phone could hold the Bitcoin blockchain right now, and my internet connection can certainly handle it. If bandwidth is an issue for miners, then they should be out of business. Just as they don't want power outages, they don't want a bad internet connection. That's their business.
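the download claim can be cross-checked; a 25 Mbit/s link sustaining ~3.1 MB/s for 3.5 hours implies roughly a 39 GB chain (that chain size is back-calculated here for illustration, not a measured figure):

```python
# time to fetch a chain of a given size over a given link (ideal throughput)
def download_hours(chain_gb: float, mbit_per_s: float) -> float:
    bytes_total = chain_gb * 1e9
    bytes_per_s = mbit_per_s * 1e6 / 8  # 8 bits per byte
    return bytes_total / bytes_per_s / 3600

print(round(download_hours(39, 25), 1))  # -> 3.5 hours at 25 Mbit/s
```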
hv_
Legendary
March 08, 2016, 08:47:44 PM
 #119

As hard as it might be to see, I really believe the crisis in front of us is one of perception as opposed to anything technical.  Perceptions are manageable while the real work of sorting through the technical issues is taken out of the limelight.

The network is working and it's still relatively cheap (5cents pre TX)
But we have threads that are titled " why is my TX not confirming ?? " or somthing to that effect
Newbies are using bitcoin for the first time and are having a hard time, with their BTC tie up seemly never to confrim, and they conclude that bitcoin is not all that it's cracked up to be....
is this a problem?

I remember when i was a newbie, i would check over and over waiting for the first 6 confirmations, everything went smoothly, but I didnt fully trust that it would go smoothly, i was afraid my money would get lost or somthing. slowly my confidence in the system grew as i used it more and understood it more.

not sure i would have be able to build any confidence had i started using bitcoin today....

Well spoken! The newbies are the masses.

Lauda
Legendary
March 08, 2016, 08:56:39 PM
 #120

So, one very large transaction with numerous sigops leads to the quadratic growth.  Hmm, so blindly increasing the block size is asking for trouble.  How many of these troublesome transactions are launched against us recently?  Or is it an unexploited vulnerability?
Perhaps we could increase the block size *and* constrain sigops/transaction until we get SegWit out the door?
Well, the problem is only minor at a 1 MB block size limit. Segwit should scale it down and make it linear. It should be released in April (per the initial estimates), so I don't see how you plan to deploy a block size limit increase plus a sigops limitation in <2 months.

but the block would get orphaned?
No, why would they?

adamstgBit
Legendary
March 08, 2016, 08:59:33 PM
 #121

but the block would get orphaned?
No, why would they?
because the 1 TX the miner included is obviously an attack, trying to get the other miners to run in circles validating for minutes while he starts on the next block?

let's ignore the fact that miners have a strong incentive to keep the network running smoothly... and think about this WILD speculation that with 2MB blocks a miner can stall all mining on the network by creating an extremely complex transaction that takes a long time to validate.

why can't the other miners simply orphan these blocks?

franky1
Legendary
March 08, 2016, 09:09:07 PM
 #122

So, one very large transaction with numerous sigops leads to the quadratic growth.  Hmm, so blindly increasing the block size is asking for trouble.  How many of these troublesome transactions are launched against us recently?  Or is it an unexploited vulnerability?
Perhaps we could increase the block size *and* constrain sigops/transaction until we get SegWit out the door?
Well the problem is only minor at a 1 MB block size limit. Segwit should scale it down and make it linear. It should be released in April (from the initial estimates), so I don't see how you plan to deploy the block size limit and a sigops limitation in <2 months.

and now you see why 2mb+segwit released in april with a 6 month grace period is not a problem

i love it when lauda debunks his own doomsday scenario

having the 2mb block limit included in april's release is easy. plus it incentivises more people to download the april version, ensuring a real chance of no contention, instead of having upgrades every couple of months, e.g. march, april, july, which just messes with the community too much

David Rabahy
Hero Member
March 08, 2016, 09:14:26 PM
 #123

So, one very large transaction with numerous sigops leads to the quadratic growth.  Hmm, so blindly increasing the block size is asking for trouble.  How many of these troublesome transactions are launched against us recently?  Or is it an unexploited vulnerability?
Perhaps we could increase the block size *and* constrain sigops/transaction until we get SegWit out the door?
Well the problem is only minor at a 1 MB block size limit. Segwit should scale it down and make it linear. It should be released in April (from the initial estimates), so I don't see how you plan to deploy the block size limit and a sigops limitation in <2 months.
Another thank you Lauda:  If only more people took the time to make clear their positions like you do.  Yep, I see it; with SegWit so close it would be disruptive to overlap it with a block size limit increase combined with a sigops limitation.  I do feel it was a missed opportunity not to provide a modest block increase (even without a sigops limitation) many months ago but there's no going back now.  *Hopefully* SegWit will come out on time, functioning well, and be embraced quickly without a lot of angst.
Lauda
Legendary
*
Offline Offline

Activity: 2674
Merit: 2965


Terminated.


View Profile WWW
March 08, 2016, 09:23:45 PM
 #124

Another thank you Lauda:  If only more people took the time to make clear their positions like you do.  Yep, I see it; with SegWit so close it would be disruptive to overlap it with a block size limit increase combined with a sigops limitation.  I do feel it was a missed opportunity not to provide a modest block increase (even without a sigops limitation) many months ago but there's no going back now.  *Hopefully* SegWit will come out on time, functioning well, and be embraced quickly without a lot of angst.
You're very welcome. I do hope that there will be a hard fork proposal after Segwit which will gain consensus. However, if Segwit is adopted quickly by both the miners and users we should see a good increase in transaction capacity that should hopefully get us through 2016. The problem caused in this conflict is that there are a lot of people who aren't willing to listen to reason and facts. Time should not be wasted on them.

why can't the other miners simply orphan these blocks?
Exactly how do you plan on "simply" detecting these blocks and why would somebody orphan them? How do you classify this as an attack; was the TX that F2Pool did to clear the spam also an attack?

"The Times 03/Jan/2009 Chancellor on brink of second bailout for banks"
😼 Bitcoin Core (onion)
David Rabahy
Hero Member
*****
Offline Offline

Activity: 709
Merit: 501



View Profile
March 08, 2016, 09:23:49 PM
 #125

So, one very large transaction with numerous sigops leads to the quadratic growth.  Hmm, so blindly increasing the block size is asking for trouble.  How many of these troublesome transactions have been launched against us recently?  Or is it an unexploited vulnerability?
Perhaps we could increase the block size *and* constrain sigops per transaction until we get SegWit out the door?
Well, the problem is only minor at a 1 MB block size limit. Segwit should scale the validation cost down and make it linear. It should be released in April (per the initial estimates), so I don't see how you plan to deploy a block size increase and a sigops limitation in <2 months.
and now you see why 2mb+segwit released in april with a 6-month grace is not a problem

i love it when lauda debunks his own doomsday scenario

having the 2mb block limit included in april's release is easy. plus it incentivises more people to download the april version, ensuring a real chance of no contention, instead of having upgrades every couple of months (e.g. march, april, july), which just messes with the community too much
Well, franky1, that's really interesting (although it could have been said just as well without the provocative words); 2MB + SegWit.  We do want to be careful -- releasing multiple things at the same time can lead to confusion if things don't go perfectly well.  I'd hate to see something go wrong and each side blame the other.  Also, if there were a need to retract a feature/function, that would likely cause a media stir; something Bitcoin doesn't need any more of at this time.
adamstgBit
Legendary
*
Offline Offline

Activity: 1904
Merit: 1037


Trusted Bitcoiner


View Profile WWW
March 08, 2016, 09:30:38 PM
Last edit: March 08, 2016, 09:45:44 PM by adamstgBit
 #126

why can't the other miners simply orphan these blocks?
Exactly how do you plan on "simply" detecting these blocks and why would somebody orphan them? How do you classify this as an attack; was the TX that F2Pool did to clear the spam also an attack?

something like: if a TX has >10,000 inputs, it's no good?

why not!? I wouldn't mind if F2Pool's tiny blocks were orphaned by other miners. maybe they'll stop making these tiny blocks if the other miners punish them for it?

oh wait, i read that wrong....

again, WHY NOT!? the TX in question WAS an attack, so why include it in the blockchain?

if you can't be sure whether it's spam or not (a <$1 TX with a 1-cent fee), fine, include it in the block, but if you are sure it is spam (>10K inputs), why include it?
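That heuristic could be sketched as a relay-policy check. Everything here is hypothetical: the threshold is adamstgBit's number, and the function and names are mine, not actual Core or Classic policy.

```python
# Hypothetical relay-policy filter in the spirit of the suggestion above.
# The 10,000-input threshold is from the post, not real node policy.

MAX_STANDARD_INPUTS = 10_000

def looks_like_spam(n_inputs, value_btc, fee_btc):
    """Flag a transaction matching the heuristic discussed above."""
    if n_inputs > MAX_STANDARD_INPUTS:
        return True   # "sure it is spam (>10K inputs)"
    # "can't be sure" cases (e.g. <$1 TX with a 1-cent fee) are let through
    return False

print(looks_like_spam(10_001, 0.5, 0.0001))  # True
print(looks_like_spam(2, 0.5, 0.0001))       # False
```

As the replies note, any fixed threshold risks false positives, and a sender could split one large transaction into several below-threshold ones, so a rule like this is easy to route around.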

David Rabahy
Hero Member
*****
Offline Offline

Activity: 709
Merit: 501



View Profile
March 08, 2016, 09:50:18 PM
 #127

As hard as it might be to see, I really believe the crisis in front of us is one of perception as opposed to anything technical.  Perceptions are manageable while the real work of sorting through the technical issues is taken out of the limelight.

The network is working and it's still relatively cheap (5 cents per TX)
But we have threads titled "why is my TX not confirming??" or something to that effect
Newbies are using bitcoin for the first time and are having a hard time, with their BTC tied up, seemingly never to confirm, and they conclude that bitcoin is not all that it's cracked up to be....
is this a problem?

I remember when i was a newbie, i would check over and over waiting for the first 6 confirmations. everything went smoothly, but I didn't fully trust that it would go smoothly; i was afraid my money would get lost or something. slowly my confidence in the system grew as i used it more and understood it more.

not sure i would have been able to build any confidence had i started using bitcoin today....
adamstgBit, you've hit it right on the head.  We must do everything we can to help folks adopt Bitcoin without pain/anxiety.  Our marketing messages need to set expectations appropriately; gone are the days of leading with the misleading "no fees".  It's ok to indicate lower fees than the competition, e.g. wire transfers, VISA, *if* it is indeed true, but don't touch the topic if it is not -- if 5¢/transaction is the going fee, then transactions of less than, say, $5 (i.e. a 1% fee) make less sense.  Instead, lead with our indisputable strengths, e.g. ~1 hour (6 blocks) to guarantee transfers of even large amounts anywhere in the world.  It is just unbelievable that there are wallets out there that don't set an appropriate fee automatically and by default such that transactions get through quickly despite the dynamic environment.  Once a new user grows accustomed, they could dig in and find the overrides to try transactions with low/zero fees and see the natural consequences of long delays.
David Rabahy
Hero Member
*****
Offline Offline

Activity: 709
Merit: 501



View Profile
March 08, 2016, 09:55:37 PM
 #128

why can't the other miners simply orphan these blocks?
Exactly how do you plan on "simply" detecting these blocks and why would somebody orphan them? How do you classify this as an attack; was the TX that F2Pool did to clear the spam also an attack?

something like: if a TX has >10,000 inputs, it's no good?

why not!? I wouldn't mind if F2Pool's tiny blocks were orphaned by other miners. maybe they'll stop making these tiny blocks if the other miners punish them for it?

oh wait, i read that wrong....

again, WHY NOT!? the TX in question WAS an attack, so why include it in the blockchain?

if you can't be sure whether it's spam or not (a <$1 TX with a 1-cent fee), fine, include it in the block, but if you are sure it is spam (>10K inputs), why include it?
That's interesting adamstgBit.  *Is* it *always* the case that >10K inputs is indeed spam?  *Is* there *ever* a case where it is not?  Can the same movements be accomplished by splitting it up into multiple transactions to avoid triggering the spam rejection?
adamstgBit
Legendary
*
Offline Offline

Activity: 1904
Merit: 1037


Trusted Bitcoiner


View Profile WWW
March 08, 2016, 09:59:34 PM
 #129

As hard as it might be to see, I really believe the crisis in front of us is one of perception as opposed to anything technical.  Perceptions are manageable while the real work of sorting through the technical issues is taken out of the limelight.

The network is working and it's still relatively cheap (5 cents per TX)
But we have threads titled "why is my TX not confirming??" or something to that effect
Newbies are using bitcoin for the first time and are having a hard time, with their BTC tied up, seemingly never to confirm, and they conclude that bitcoin is not all that it's cracked up to be....
is this a problem?

I remember when i was a newbie, i would check over and over waiting for the first 6 confirmations. everything went smoothly, but I didn't fully trust that it would go smoothly; i was afraid my money would get lost or something. slowly my confidence in the system grew as i used it more and understood it more.

not sure i would have been able to build any confidence had i started using bitcoin today....
adamstgBit, you've hit it right on the head.  We must do everything we can to help folks adopt Bitcoin without pain/anxiety.  Our marketing messages need to set expectations appropriately; gone are the days of leading with the misleading "no fees".  It's ok to indicate lower fees than the competition, e.g. wire transfers, VISA, *if* it is indeed true, but don't touch the topic if it is not -- if 5¢/transaction is the going fee, then transactions of less than, say, $5 (i.e. a 1% fee) make less sense.  Instead, lead with our indisputable strengths, e.g. ~1 hour (6 blocks) to guarantee transfers of even large amounts anywhere in the world.  It is just unbelievable that there are wallets out there that don't set an appropriate fee automatically and by default such that transactions get through quickly despite the dynamic environment.  Once a new user grows accustomed, they could dig in and find the overrides to try transactions with low/zero fees and see the natural consequences of long delays.
I feel the core dev team is not willing to make the appropriate trade-offs.
I do believe we could have had the 2MB limit in place months ago and all this pain/anxiety avoided.
but the Core dev team doesn't seem as concerned with end users' pain/anxiety as they are with theoretically more elegant scaling.
I feel they are programmers not qualified to manage and direct the project, and they are making bad decisions. maybe the decisions they make are technically more elegant, but they do not create a more elegant user experience. and they are convinced that the users don't matter because it's Bitcoin's birthright to replace central banking.

David Rabahy
Hero Member
*****
Offline Offline

Activity: 709
Merit: 501



View Profile
March 08, 2016, 10:02:06 PM
Last edit: March 08, 2016, 10:22:50 PM by David Rabahy
 #130

Listen everyone: *** Bitcoin is a fantastic thing!!! ***  Our internal debates are seen; they are taken as a measure of our ability to provide governance or not.  There are folks lurking in here; picking up on the mood and reporting on it.  Every posting should take this into account.

Championing Bitcoin first before espousing one technical aspect or another will help.

Avoiding the practice of nitpicking or worse will help.

Compromise and appeasement are valuable tools.
franky1
Legendary
*
Offline Offline

Activity: 4214
Merit: 4458



View Profile
March 08, 2016, 10:04:08 PM
Last edit: March 08, 2016, 10:18:31 PM by franky1
 #131

Well, franky1, that's really interesting (although it could have been said just as well without the provocative words); 2MB + SegWit.  We do want to be careful -- releasing multiple things at the same time can lead to confusion if things don't go perfectly well.  I'd hate to see something go wrong and each side blame the other.  Also, if there were a need to retract a feature/function, that would likely cause a media stir; something Bitcoin doesn't need any more of at this time.

then we move onto the other debate: blockstream devs not only want segwit in april but some other changes within the same release.

one of them actually debunks the "hard fork is doomsday" line, because it itself is a hardfork.
by which i mean Luke-Jr's proposal for code to be added in april with just a 3-month grace (activating at block 420,000) in an attempt to unnaturally drop the difficulty so miners can solve blocks more easily: basically forcing blocks to be made in 5 minutes instead of 10, to allow an extra 2 weeks of similar income before the natural biweekly difficulty adjustments raise the difficulty.

which, apart from being a feature with no long-term purpose, means a hardfork can be added in april with just a 3-month grace. that totally contradicts the same devs who say that a hardfork which is not on the roadmap needs 12 months' grace..
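The arithmetic behind that difficulty-drop idea can be sketched roughly as follows; the parameters here are illustrative, not the actual proposal's.

```python
# Rough sketch: expected block interval scales with difficulty.

TARGET_INTERVAL = 600    # seconds; normal ~10-minute spacing
RETARGET_BLOCKS = 2016   # blocks per difficulty period

def expected_interval(difficulty_factor, hashrate_factor=1.0):
    # halve the difficulty at constant hashrate -> blocks come twice as fast
    return TARGET_INTERVAL * difficulty_factor / hashrate_factor

print(expected_interval(0.5))   # 300.0 s, i.e. ~5-minute blocks
# The effect only lasts until the next retarget (~2016 blocks, which at
# 5-minute spacing arrive in about a week) pushes difficulty back up.
```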

but the ultimate thing that defies logic is the solid faith that all code done by blockstream is perfect and any proposal outside of blockstream must be ruled out, veto'd, debated to cause contention and delay.

like i said, a 2mb buffer is like the 1mb buffer in 2013. it won't cause a doubling of blocksize overnight, as the miners will have preferential settings to grow slowly within the hard limit.

the code itself is simple to implement. and the only thing that can cause harm would be contention by those foolhardily refusing to upgrade when the majority have actually already upgraded.

that's why it's better to have the code available to all and then let the community decide if they want it..
if no one wants it, it doesn't activate. it's that simple, rather than avoiding and delaying the code and causing the very contention they doomsday-speak about by never letting the community have access to it (a self-fulfilling prophecy they created)

now that that has been said, going back to the other stuff you have said:
i would like to see the results of the tests you intend to do, as real results always outweigh opinion and guesswork. so your tests can actually help out a lot of people

David Rabahy
Hero Member
*****
Offline Offline

Activity: 709
Merit: 501



View Profile
March 08, 2016, 10:12:07 PM
 #132

I feel the core dev team is not willing to make the appropriate trade-offs.
I do believe we could have had the 2MB limit in place months ago and all this pain/anxiety avoided.
but the Core dev team doesn't seem as concerned with end users' pain/anxiety as they are with theoretically more elegant scaling.
I feel they are programmers not qualified to manage and direct the project, and they are making bad decisions. maybe the decisions they make are technically more elegant, but they do not create a more elegant user experience. and they are convinced that the users don't matter because it's Bitcoin's birthright to replace central banking.
Bitcoin is up and running well (assuming one uses reasonable fees, ~5¢/transaction) despite an onslaught of ill-intentioned persons.  Shame on all of us for not adjusting our marketing messages earlier to set expectations better.  Find me another system that has withstood as much and moves millions of dollars a day.

Who pays the Core Dev Team?  Are they doing all their work for Bitcoin for free?

My sincerest heartfelt praise and admiration go out to Core Dev Team members!  They are giving us something I couldn't do myself.  That said, I do think there's room for improvement on the handling of perceptions front.  I sincerely believe they have the overall good of Bitcoin *and* the users of it foremost in their minds.

Lauda has done a yeoman's job representing the positions; thank goodness someone has the patience.
David Rabahy
Hero Member
*****
Offline Offline

Activity: 709
Merit: 501



View Profile
March 08, 2016, 10:17:33 PM
 #133

i would like to see the results of the tests you intend to do, as real results always outweigh opinion and guesswork. so your tests can actually help out a lot of people
Although I am willing to participate in a test to measure the SigOp quadratic growth, I am not able without help.  If someone will build the code then I will install, run, and report back on it.
adamstgBit
Legendary
*
Offline Offline

Activity: 1904
Merit: 1037


Trusted Bitcoiner


View Profile WWW
March 08, 2016, 10:41:37 PM
 #134

I feel the core dev team is not willing to make the appropriate trade-offs.
I do believe we could have had the 2MB limit in place months ago and all this pain/anxiety avoided.
but the Core dev team doesn't seem as concerned with end users' pain/anxiety as they are with theoretically more elegant scaling.
I feel they are programmers not qualified to manage and direct the project, and they are making bad decisions. maybe the decisions they make are technically more elegant, but they do not create a more elegant user experience. and they are convinced that the users don't matter because it's Bitcoin's birthright to replace central banking.
Bitcoin is up and running well (assuming one uses reasonable fees, ~5¢/transaction) despite an onslaught of ill-intentioned persons.  Shame on all of us for not adjusting our marketing messages earlier to set expectations better.  Find me another system that has withstood as much and moves millions of dollars a day.

Who pays the Core Dev Team?  Are they doing all their work for Bitcoin for free?

My sincerest heartfelt praise and admiration go out to Core Dev Team members!  They are giving us something I couldn't do myself.  That said, I do think there's room for improvement on the handling of perceptions front.  I sincerely believe they have the overall good of Bitcoin *and* the users of it foremost in their minds.

Lauda has done a yeoman's job representing the positions; thank goodness someone has the patience.

give it a few more days of debate, you'll see your optimism and praise will turn to anger and disgust. LOL  Grin

you're a really nice guy, quite refreshing!

LEAVE THIS PLACE IMMEDIATELY, this is for your own good.

Lauda
Legendary
*
Offline Offline

Activity: 2674
Merit: 2965


Terminated.


View Profile WWW
March 08, 2016, 10:48:32 PM
 #135

That's interesting adamstgBit.  *Is* it *always* the case that >10K inputs is indeed spam?  *Is* there *ever* a case where it is not?  Can the same movements be accomplished by splitting it up into multiple transactions to avoid triggering the spam rejection?
These limitations are very bad. Wasn't Bitcoin supposed to be censorship free? Who gets to decide what kind of transactions we are going to limit? As an example, the sigops limitation that Gavin implemented in Classic is not a solution of any kind. For example, if we had confidential transactions today they would not work due to this. There are so many potential use cases that it is nearly impossible for us to consider everything.

Who pays the Core Dev Team?  Are they doing all their work for Bitcoin for free? My sincerest heartfelt praise and admiration go out to Core Dev Team members!  They are giving us something I couldn't do myself.  That said, I do think there's room for improvement on the handling of perceptions front.  I sincerely believe they have the overall good of Bitcoin *and* the users of it foremost in their minds.
They're a group of volunteers. Some are employed by MIT (Wladimir IIRC), and some are employed by Blockstream (Maxwell, Wuille) and such. However, most of them are just volunteers from what I know.

Lauda has done a yeoman's job representing the positions; thank goodness someone has the patience.
I try my best, as long as the other person (especially when lacking knowledge) is willing to listen to reason, facts, data and such.

"The Times 03/Jan/2009 Chancellor on brink of second bailout for banks"
😼 Bitcoin Core (onion)
adamstgBit
Legendary
*
Offline Offline

Activity: 1904
Merit: 1037


Trusted Bitcoiner


View Profile WWW
March 08, 2016, 10:51:36 PM
 #136

why can't the other miners simply orphen these blocks?
Exactly how do you plan on "simply" detecting these blocks and why would somebody orphan them? How do you classify this as an attack; was the TX that F2Pool did to clear the spam also an attack?

somthing like  if TX has >10,000 inputs it's no good?

why not!? I wouldn't mind if F2Pool's tiny blocks were orphaned by other miners. maybe he'll stop making these tiny blocks if they do punish him for it?

oh wait i read that wrong....

again, WHY NOT!? the TX in question WAS an attack, why include in the blockchain?

if you can't be sure if its spam or not ( <1$ TX with 1cent fee ) fine included in the block, but if you are sure it is spam ( >10K inputs ) why include it?
That's interesting adamstgBit.  *Is* it *always* the case that >10K inputs is indeed spam?  *Is* there *ever* a case where it is not?  Can the same movements be accomplished by splitting it up into multiple transactions to avoid triggering the spam rejection?

someone can probably come up with some highly speculative scenario where such a transaction could potentially be useful
but is that a good reason not to protect the network against such an attack?

SpiryGolden
Hero Member
*****
Offline Offline

Activity: 812
Merit: 500



View Profile
March 08, 2016, 10:59:06 PM
 #137

Who pays the Core Dev Team?  Are they doing all their work for Bitcoin for free? My sincerest heartfelt praise and admiration go out to Core Dev Team members!  They are giving us something I couldn't do myself.  That said, I do think there's room for improvement on the handling of perceptions front.  I sincerely believe they have the overall good of Bitcoin *and* the users of it foremost in their minds.
They're a group of volunteers. Some are employed by MIT (Wladimir IIRC), and some are employed by Blockstream (Maxwell, Wuille) and such. However, most of them are just volunteers from what I know.



They are mostly Blockstream members, on the payroll of VC bankers. Before somebody calls me a liar: you can check out https://www.blockstream.com/team/ and then compare with https://bitcoin.org/en/development#bitcoin-core-contributors . I never come into a conversation without proof; I hope that's good enough. There will never be a 2MB hardfork; they will just give us breadcrumbs to quiet us down a bit while they build solutions so that people go in waves to their solutions rather than stay on the main chain, transforming it into a settlement layer, making the P2P payment system obsolete, and taking it off-chain to their solution.

That's the last thing I had to say. No more involvement in this subject; it drains me of energy, it drains me of patience. All I can do is keep my Bitcoin Classic nodes up and hope that slowly others will join as more become aware of the breadcrumbs we were given. Dramatizing the whole block size increase as dangerous, without any real statistical answer, while they themselves forked when testing SegWit, says a lot.

Also, there are a lot of Bitcoin Core supporters here, while the whole bitcointalk board is 100% Bitcoin Core: not offering people alternative solutions, and advertising every update of Bitcoin Core while not presenting alternatives with proper explanation, proves again the control of the masses and keeps them blind to recent Bitcoin discussion. It undermines the whole idea of a community around a free & open source project made by people for people. So Bitcoin Classic must somehow pay for advertising to inform people about the news and their right to choose, because other channels are trying as hard as possible to hide that right to choose.

Thanks,
I wish both sides good luck and may the best chain win.

adamstgBit
Legendary
*
Offline Offline

Activity: 1904
Merit: 1037


Trusted Bitcoiner


View Profile WWW
March 08, 2016, 11:09:21 PM
 #138


These limitations are very bad. Wasn't Bitcoin supposed to be censorship free? Who gets to decide what kind of transactions we are going to limit?


this is a prime example of ivory tower thinking getting in the way of real solutions that matter to real people.


As an example, the sigops limitation that Gavin implemented in Classic is not a solution of any kind. For example, if we had confidential transactions today they would not work due to this. There are so many potential use cases that it is nearly impossible for us to consider everything.

this is a prime example of bad project management; we can't achieve any kind of consensus without first agreeing on what the main goals of the project are.

of course Gavin is doing things differently in classic; his primary goal is to scale the blockchain as much as possible.

this is NOT the main goal of Core, which is fine. their second-layer solution is fine, but it's simply not what the majority want...

Gleb Gamow
In memoriam
VIP
Legendary
*
Offline Offline

Activity: 1428
Merit: 1145



View Profile
March 08, 2016, 11:11:50 PM
 #139

I would be willing to run a full node on a testnet to see if my system could handle larger blocks, i.e. verify a large block in less than the average time between blocks.

I have a question:  The total amount of work to verify N 1MB blocks is about the same as a single N-MB block, right?  For example, 32 1MB blocks take about the same amount of work to verify as a single 32MB block, right?  Just please ignore the live delivery of blocks for the moment.  Or is there some advantage to large blocks where fewer headers have to be processed?  Imagine a full node was off the air for a day or two and is just trying to catch up as fast as possible.  What block size facilitates that best?

To me it seems fees tend to be inversely proportional to block size, i.e. with smaller blocks fees rise as folks compete to get into blocks, with larger blocks fees get smaller with less competition to get into blocks.  What does it cost a bad actor (if there is truly such a thing in this realm) to clog up the works?  I suppose we are looking for the right size of block to cause them to expend their resources most quickly.  Make the block size very small and the fee competition would rise high enough to deplete the bad actor very fast; everyone suffers higher fees until they are run out of town (so to speak).  Hmm, but if the block size is very small then even when there aren't any bad actors on the scene, regular legit users would be forced to compete.  At the other end of the spectrum; make the block size very large and with such low competition fees would diminish.  The real question here is what happens to the fees/MB across the spectrum of block sizes.

Is there *anyone* preferring a smaller than 1MB block size right now?  I haven't heard of any but you never know.  I do think some miners do artificially constrain the block size they produce to like 900KB or so (I'm not sure of their motivation).  Even if the block size were increased then such miners could still constrain the ones they produce, right?

A transaction cannot span multiple blocks, right?  I suppose the block size creates a functional limit on transaction sizes.  Or is the size of a transaction constrained some other way?
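The verification question in that post can be sketched numerically, under the simplifying assumption that ordinary script checking is linear in bytes while the worst-case legacy sighash work for a single block-filling transaction grows quadratically. All constants here are made up for illustration.

```python
MB = 1_000_000
HEADER_COST = 1.0     # arbitrary units: one header check is cheap
BYTE_COST = 0.001     # per-byte cost for ordinary transactions

def verify_cost(total_bytes, block_size):
    # ordinary case: work is ~linear in data, headers add almost nothing
    n_blocks = total_bytes // block_size
    return n_blocks * HEADER_COST + total_bytes * BYTE_COST

def worst_case_sighash(block_size):
    # one tx can fill a block; legacy sighash work ~ (tx size)^2
    return (block_size / MB) ** 2   # relative units

# 32 x 1MB vs 1 x 32MB: about the same for ordinary traffic...
print(verify_cost(32 * MB, MB), verify_cost(32 * MB, 32 * MB))
# ...but the adversarial worst case is ~32x worse for the single big block:
print(worst_case_sighash(32 * MB) / (32 * worst_case_sighash(MB)))
```

So for catching up on ordinary history the block size barely matters in this model; the asymmetry shows up only in the adversarial worst case, which is why the sigops discussion earlier in the thread keeps coming back.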

Odd! I find this post more similar to Satoshi Nakamoto's writings than the letter in the OP supposedly sent to Mike Hearn by SN, which I contend was written by MH. (see my earlier post in this thread)
adamstgBit
Legendary
*
Offline Offline

Activity: 1904
Merit: 1037


Trusted Bitcoiner


View Profile WWW
March 08, 2016, 11:29:39 PM
 #140

Who pays the Core Dev Team?  Are they doing all their work for Bitcoin for free? My sincerest heartfelt praise and admiration go out to Core Dev Team members!  They are giving us something I couldn't do myself.  That said, I do think there's room for improvement on the handling of perceptions front.  I sincerely believe they have the overall good of Bitcoin *and* the users of it foremost in their minds.
They're a group of volunteers. Some are employed by MIT (Wladimir IIRC), and some are employed by Blockstream (Maxwell, Wuille) and such. However, most of them are just volunteers from what I know.

They are mostly Blockstream members, on the payroll of VC bankers. Before somebody calls me a liar: you can check out https://www.blockstream.com/team/ and then compare with https://bitcoin.org/en/development#bitcoin-core-contributors . I never come into a conversation without proof; I hope that's good enough. There will never be a 2MB hardfork; they will just give us breadcrumbs to quiet us down a bit while they build solutions so that people go in waves to their solutions rather than stay on the main chain, transforming it into a settlement layer, making the P2P payment system obsolete, and taking it off-chain to their solution.

That's the last thing I had to say. No more involvement in this subject; it drains me of energy, it drains me of patience. All I can do is keep my Bitcoin Classic nodes up and hope that slowly others will join as more become aware of the breadcrumbs we were given. Dramatizing the whole block size increase as dangerous, without any real statistical answer, while they themselves forked when testing SegWit, says a lot.

Also, there are a lot of Bitcoin Core supporters here, while the whole bitcointalk board is 100% Bitcoin Core: not offering people alternative solutions, and advertising every update of Bitcoin Core while not presenting alternatives with proper explanation, proves again the control of the masses and keeps them blind to recent Bitcoin discussion. It undermines the whole idea of a community around a free & open source project made by people for people. So Bitcoin Classic must somehow pay for advertising to inform people about the news and their right to choose, because other channels are trying as hard as possible to hide that right to choose.

Thanks,
I wish both sides good luck and may the best chain win. 

+1

@David Rabahy, this is all true, read it and understand why we are out of patience.

David Rabahy
Hero Member
*****
Offline Offline

Activity: 709
Merit: 501



View Profile
March 08, 2016, 11:59:18 PM
 #141

They are mostly Blockstream members, on the payroll of VC bankers. Before somebody calls me a liar: you can check out https://www.blockstream.com/team/ and then compare with https://bitcoin.org/en/development#bitcoin-core-contributors. I never come into a conversation without proof.
From the former;

Adam Back, Ph.D.
Matt Corallo
Johnny Dilley
Alex Fowler
Mark Friedenbach
Ben Gorlick
Francesca Hall
Austin Hill
Gregory Maxwell *195
James Murdock
Jonas Nick
Rusty Russell
Gregory Sanders
Patrick Strateman *37
Erik Svenson
Warren Togami *12
Jorge Timón *112
Jonathan Wilkins
Glenn Willen *1
Pieter Wuille, Ph.D. *966

* found in the latter; (# of commits).
David Rabahy
Hero Member
*****
Offline Offline

Activity: 709
Merit: 501



View Profile
March 09, 2016, 12:13:46 AM
 #142

Bitcoin Core contributors (Ordered by number of commits);

Wladimir J. van der Laan (3398)
Gavin Andresen (1100)
Pieter Wuille (966)
Cory Fields (332)
TheBlueMatt (288)
jonasschnelli (246)
Luke-Jr (239)
Gregory Maxwell (195)
MarcoFalke (128)
fanquake (118)
jtimon (112)
Peter Todd (92)
cozz (70)
sdaftuar (69)
morcos (59)
paveljanik (52)
pstratem (37)
muggenhor (34)
Eric Lombrozo (32)
rebroad (28)
domob1812 (25)
Michagogo (24)
dooglus (22)
dexX7 (20)
dgenr8 (18)
super3 (16)
kdomanski (15)
casey (15)
xanatos (14)
ENikS (13)
wtogami (12)
codler (10)
btcdrak (10)
wizeman (10)
maaku (10)
rnicoll (9)
ajweiss (8)
roques (8)
jamesob (8)
jmcorgan (8)
jordanlewis (8)
21E14 (8)
devrandom (8)
joshtriplett (8)
Nils Schneider (7)
forrestv (7)
freewil (7)
rat4 (7)
sinetek (7)
dcousens (7)
sje397 (7)
celil-kj (7)
sandakersmann (7)
runeksvendsen (7)
mrbandrews (7)
OttoAllmendinger (6)
mgiuca (6)
Matoking (6)
vegard (6)
zw (6)
p2k (6)
JoelKatz (6)
jrmithdobbs (6)
ashleyholman (6)
dertin (6)
Andreas Schildbach (6)
mndrix (5)
r000n (5)
roybadami (5)
vinniefalco (5)
Whit Jack (5)
fcicq (5)
ptschip (5)
maraoz (5)
federicobond (5)
alexanderkjeldaas (5)
robbak (5)
rdponticelli (5)
...
sorry I ran out of gas/time.  So, pretty clearly "most" don't appear on the link from Blockstream.
Crazygreek
Sr. Member
****
Offline Offline

Activity: 378
Merit: 251


View Profile
March 09, 2016, 01:36:39 AM
 #143

In order to scale larger than the Visa network, Bitcoin needs to offer something more competitive than what Visa offers. Right now even the block size problem has no agreed path to resolution, and Visa handles an incomparably larger number of transactions per hour. Until this and many other problems are resolved, Bitcoin has nothing to offer against Visa.

chopstick
Legendary
*
Offline Offline

Activity: 992
Merit: 1000


View Profile
March 09, 2016, 01:52:18 AM
 #144

In order to scale larger than the Visa network, Bitcoin needs to offer throughput at least comparable to Visa's. Right now even the block size problem has no agreed path to resolution, and Visa handles an incomparably larger number of transactions per hour. Until this problem and many others are resolved, Bitcoin has nothing to offer against Visa.

From the man satoshi himself:

"The existing Visa credit card network processes about 15 million Internet purchases per day worldwide. Bitcoin can already scale much larger than that with existing hardware for a fraction of the cost. It never really hits a scale ceiling." - Satoshi

"The eventual solution will be to not care how big the blockchain gets." - Satoshi

https://forum.bitcoin.com/bitcoin-discussion/the-eventual-solution-will-be-to-not-care-how-big-it-the-bitcoin-blockchain-gets-t6196.html

Satoshi intended for everything to be on the blockchain. He knew node operators would eventually need to be datacenter-sized, but that is a problem many, many years off. There is no reason not to increase the block size to 2 MB right now, since it won't hurt node operators at that level. Unless, of course, your whole business model is to profit eternally off sidechains *cough Borgstream cough*.




chopstick
Legendary
*
Offline Offline

Activity: 992
Merit: 1000


View Profile
March 09, 2016, 01:53:56 AM
 #145

By the way, the satoshi quotes posted in this thread have now been deleted numerous times off of r/bitcoin

Censorship in action.

All hail our new Blockstream overlords... resistance is futile

I think that is what they are trying to say, anyway.



johnyj
Legendary
*
Offline Offline

Activity: 1988
Merit: 1012


Beyond Imagination


View Profile
March 09, 2016, 02:11:08 AM
 #146

Everyone that is against core's road map is wrong, including Satoshi

adamstgBit
Legendary
*
Offline Offline

Activity: 1904
Merit: 1037


Trusted Bitcoiner


View Profile WWW
March 09, 2016, 02:11:54 AM
 #147

Bitcoin Core contributors (Ordered by number of commits);

Wladimir J. van der Laan (3398)
Gavin Andresen (1100)
Pieter Wuille (966)
Cory Fields (332)
TheBlueMatt (288)
jonasschnelli (246)
Luke-Jr (239)
Gregory Maxwell (195)
MarcoFalke (128)
fanquake (118)
jtimon (112)
...
...
sorry I ran out of gas/time.  So, pretty clearly "most" don't appear on the link from Blockstream.

Notice how Gavin Andresen appears in your list but is no longer allowed to commit anything to Blockstream... I mean Bitcoin Core.

adamstgBit
Legendary
*
Offline Offline

Activity: 1904
Merit: 1037


Trusted Bitcoiner


View Profile WWW
March 09, 2016, 02:15:46 AM
 #148

I sincerely believe they have the overall good of Bitcoin *and* the users of it foremost in their minds.

I believe the good in people gets ripped away when there's less than $50 on the table.

franky1
Legendary
*
Offline Offline

Activity: 4214
Merit: 4458



View Profile
March 09, 2016, 02:32:56 AM
 #149


Notice how Gavin Andresen appears in your list but is no longer allowed to commit anything to Blockstream... I mean Bitcoin Core.


Notice how Adam Back (Blockstream) has not been shown to be involved in Bitcoin... yet... well, shhh, let's not say.

There are other names too, but shhh, let's not say; let's instead pretend that Blockstream doesn't have veto power.

I DO NOT TRADE OR ACT AS ESCROW ON THIS FORUM EVER.
Please do your own research & respect what is written here as both opinion & information gleaned from experience. many people replying with insults but no on-topic content substance, automatically are 'facepalmed' and yawned at
AliceWonderMiscreations
Full Member
***
Offline Offline

Activity: 182
Merit: 107


View Profile WWW
March 09, 2016, 03:01:27 AM
 #150

Gavin is hoping that everyone is going to want to see "The Return of the King", but I think most of us would actually prefer to listen to A Farewell to Kings: https://en.wikipedia.org/wiki/A_Farewell_to_Kings

Indeed, one of their best albums.

I hereby reserve the right to sometimes be wrong
AliceWonderMiscreations
Full Member
***
Offline Offline

Activity: 182
Merit: 107


View Profile WWW
March 09, 2016, 03:14:49 AM
 #151

So, one very large transaction with numerous sigops leads to quadratic growth. Hmm, so blindly increasing the block size is asking for trouble. How many of these troublesome transactions have been launched against us recently? Or is it an unexploited vulnerability?
Perhaps we could increase the block size *and* constrain sigops per transaction until we get SegWit out the door?
Well, the problem is only minor at a 1 MB block size limit. Segwit should scale it down and make it linear. It should be released in April (per the initial estimates), so I don't see how you plan to deploy a block size limit change and a sigops limitation in <2 months.

And now you see why 2 MB + segwit released in April with a 6-month grace period is not a problem.

I love it when Lauda debunks his own doomsday scenario.

Having the 2 MB block limit included in April's release is easy, plus it incentivises more people to download the April version, ensuring a real chance of no contention, instead of having upgrades every couple of months (e.g. March, April, July), which just messes with the community too much.

That is a valid point. There is no reason why the code can't be in the April client, with miners only triggered to make the big blocks if there is both consensus (95%) *and* nearly full blocks.

That way, if segwit does what we believe it will, the 2 MB blocks never happen. But if the number of transactions making it into blocks is large enough that blocks are full even with segwit, the code is already in the clients.
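The quadratic growth being discussed comes from legacy (pre-segwit) signature hashing: each input's signature hash covers roughly the whole transaction, so total hashing work grows with inputs × transaction size. A toy Python sketch of the scaling only (the 150-byte per-input figure and flat per-input cost are illustrative assumptions, not real serialization):

```python
def legacy_sighash_bytes(n_inputs, bytes_per_input=150):
    # Pre-segwit signing: the signature hash for each of the n inputs
    # covers a serialization of roughly the entire transaction.
    tx_size = n_inputs * bytes_per_input
    return n_inputs * tx_size  # total bytes hashed grows as O(n^2)

def segwit_sighash_bytes(n_inputs, bytes_per_input=150):
    # BIP143-style hashing caches intermediate hashes, so the work
    # per input is roughly constant: O(n) overall.
    return n_inputs * bytes_per_input
```

Doubling the inputs quadruples the legacy hashing work but only doubles the segwit-style work, which is why segwit is said to make validation cost linear.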

AliceWonderMiscreations
Full Member
***
Offline Offline

Activity: 182
Merit: 107


View Profile WWW
March 09, 2016, 03:18:59 AM
 #152

For example, trigger the 2MB fork iff the last 1000 blocks used > 920 MB. That's about a week's worth of blocks.

EDIT
Specify the check to happen every 1000 blocks starting from block X.

Once it happens, the next version of the client can just hard-code 2 MB, since checking every 1k blocks from X will always evaluate to true.
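As a rough sketch, the trigger described above could look like this in Python (the function name and the list-of-sizes representation are hypothetical, not Bitcoin Core code):

```python
INTERVAL = 1000      # blocks between checks (roughly a week's worth)
THRESHOLD_MB = 920   # total MB of block space used in one interval

def big_blocks_triggered(block_sizes_mb, start_height=0):
    """Return True once any full 1000-block window, aligned to
    start_height, has used more than THRESHOLD_MB in total.

    block_sizes_mb: list of block sizes in MB, indexed by height.
    """
    for end in range(start_height + INTERVAL,
                     len(block_sizes_mb) + 1, INTERVAL):
        if sum(block_sizes_mb[end - INTERVAL:end]) > THRESHOLD_MB:
            return True
    return False
```

Once the condition has been true once, hard-coding the 2 MB limit in the next release is equivalent, as the post notes.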

BTCBinary
Hero Member
*****
Offline Offline

Activity: 504
Merit: 500


View Profile
March 09, 2016, 03:51:46 AM
 #153

It seems to me that this explains a lot, and makes it clear why a block size increase would address both the block size problem and the rising fees.
franky1
Legendary
*
Offline Offline

Activity: 4214
Merit: 4458



View Profile
March 09, 2016, 04:22:06 AM
 #154

For example, trigger the 2MB fork iff the last 1000 blocks used > 920 MB. That's about a week's worth of blocks.

EDIT
Specify the check to happen every 1000 blocks starting from block X.

Once it happens, the next version of the client can just hard-code 2 MB, since checking every 1k blocks from X will always evaluate to true.

Easier to code something like: current block minus 1000, so it is checking the last 1000 blocks constantly, rather than a precise group (which can be gamed).

E.g. if it checked blocks 415000 to 416000, miners could play around and make 415000-415500 small, then have 500 large blocks.
Then the following week have 416000-416500 as large blocks and then 500 small blocks, which would not cause the trigger, because
415000-416000 only has 500 large blocks
416000-417000 only has 500 large blocks

But if it is checking the most recent 1000 blocks, then at 416500 it sees there are 1000 large blocks going all the way back to 415500.

And as for the 95%: nowhere is it ever possible to get 95%. Not even core v0.12 combined with v0.11 has 95%, yet they claim they are the consensus.

A fairer measure is 75%, because after all 75% is not the active trigger; it is just the loading of the gun, giving people 3 months (lukejr's estimate), 6 months (my preference) or 12 months (blockstream's preference) to upgrade and push the statistics above 75% before the trigger is pulled.
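The gaming scenario described above can be made concrete: with windows aligned to fixed 1000-block groups, miners can straddle a run of large blocks across a group boundary so that no aligned window ever trips, while a rolling check over the most recent 1000 blocks catches the same run. A rough Python sketch (the helper names and the 920-large-blocks threshold are hypothetical):

```python
WINDOW = 1000   # blocks per window
NEEDED = 920    # hypothetical count of large blocks needed to trip

def aligned_window_trips(blocks, window=WINDOW, needed=NEEDED):
    # Checks only windows aligned to multiples of `window` (gameable).
    return any(sum(blocks[i:i + window]) >= needed
               for i in range(0, len(blocks) - window + 1, window))

def rolling_window_trips(blocks, window=WINDOW, needed=NEEDED):
    # Checks every window, i.e. "the most recent 1000 blocks" at
    # every height (robust to straddling a boundary).
    return any(sum(blocks[i:i + window]) >= needed
               for i in range(len(blocks) - window + 1))

# 1 = large block, 0 = small block; the alternation straddles the
# aligned boundary: 500 small, then 1000 large, then 500 small.
blocks = [0] * 500 + [1] * 1000 + [0] * 500
```

Here `aligned_window_trips(blocks)` is False (each aligned window sees only 500 large blocks) while `rolling_window_trips(blocks)` is True, matching the argument in the post.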

SpiryGolden
Hero Member
*****
Offline Offline

Activity: 812
Merit: 500



View Profile
March 09, 2016, 04:34:26 AM
 #155

Everyone that is against core's road map is wrong, including Satoshi

See? This is the ideology that has poisoned all the Blockstream supporters. I can't call it Bitcoin Core anymore, because it's too obvious.

This kind of ideology requires a change! We need to switch to Bitcoin Classic as soon as possible. SegWit will be there also.
AliceWonderMiscreations
Full Member
***
Offline Offline

Activity: 182
Merit: 107


View Profile WWW
March 09, 2016, 05:11:59 AM
 #156

And as for the 95%: nowhere is it ever possible to get 95%. Not even core v0.12 combined with v0.11 has 95%, yet they claim they are the consensus.

A fairer measure is 75%, because after all 75% is not the active trigger; it is just the loading of the gun, giving people 3 months (lukejr's estimate), 6 months (my preference) or 12 months (blockstream's preference) to upgrade and push the statistics above 75% before the trigger is pulled.

Clients don't matter all that much; it's 95% of miners that really matters. If only a small percentage of miners reject 2 MB blocks, the fork that rejects > 1 MB blocks will be incredibly slow, causing clients to update.

Lauda
Legendary
*
Offline Offline

Activity: 2674
Merit: 2965


Terminated.


View Profile WWW
March 09, 2016, 06:59:16 AM
 #157

this is a prime example of ivory tower thinking getting in the way of real solutions that matter to real people.
No. You don't get to define what we allow in the system and what we don't, certainly not when it was possible all this time. What Gavin proposed is a hacky workaround, nothing more.

this is a prime example of bad project management; we can't achieve any kind of consensus without first agreeing on what the main goals of the project are.
Stop being greedy and stop wanting everyone to use Bitcoin just because the price would be high. This would help you see things more clearly.

Censorship in action.
If censorship was present, you would have been banned long ago for mentioning anything, ergo it isn't.

This kind of ideology. This requires a change! We need to switch to Bitcoin Classic as soon as possible. SegWit will be there also.
No. Contentious HF's are what got us here in the first place.

"The Times 03/Jan/2009 Chancellor on brink of second bailout for banks"
😼 Bitcoin Core (onion)
ATguy
Sr. Member
****
Offline Offline

Activity: 423
Merit: 250



View Profile
March 09, 2016, 07:48:28 AM
 #158

this is a prime example of bad project management; we can't achieve any kind of consensus without first agreeing on what the main goals of the project are.
Stop being greedy and stop wanting everyone to use Bitcoin just because the price would be high. This would help you see things more clearly.

The problem is that there are people who want to regulate how many people can use Bitcoin by keeping the current block size limit. The most obvious conflict of interest is with altcoins, which can only gain usage at Bitcoin's expense; those who have invested in Bitcoin (speculatively or in infrastructure) are mostly not in favour of an artificially limited block size, because it restricts how many people can conveniently use Bitcoin.

The question is not whether Bitcoin can scale to the Visa network, but whether Bitcoin can scale to keep up with demand, because there is no such high demand for Bitcoin to be used at Visa levels in the near future. And I'm sure the answer is yes.

.Liqui Exchange.Trade and earn 24% / year on BTC, LTC, ETH
....Brand NEW..........................................Payouts every 24h. Learn more at official thread
SpiryGolden
Hero Member
*****
Offline Offline

Activity: 812
Merit: 500



View Profile
March 09, 2016, 08:05:18 AM
Last edit: March 09, 2016, 08:16:45 AM by SpiryGolden
 #159


This kind of ideology. This requires a change! We need to switch to Bitcoin Classic as soon as possible. SegWit will be there also.
No. Contentious HF's are what got us here in the first place.


Oh, so that's Core's plan if Classic gains the majority? Other than that, I do not see any danger if people could just get along and let Bitcoin be Bitcoin. That's it, for real. Let's be human for a moment. This is what Bitcoin was meant to be. You guys just want it to be your play toy, with weak arguments pushing an off-chain solution promoted by... we know who. So yes, let Bitcoin be pure Bitcoin, and let the off-chain solutions be off-chain in their own ecosystem.

What is so hard to understand? This is the Bitcoin experience. When you first found out about Bitcoin, I am sure you were aware of all this, and of the fact that hard forks are a necessity in the evolution of great software.

SegWit forked, that was it! Simple as that; they didn't even know the reason behind it. It won't be ready for April, and the roadmap will remain just dust in the eyes.

We need to see reality, and again to respect Bitcoin's vision since its creation. A block size increase via hard fork has been on the roadmap since Satoshi set the limit itself. Classic has already gained 25% of nodes, plus other nodes, for a share of 31% of the network, while Bitcoin Core loses ground every day. Once it goes 50/50, I can say Bitcoin Core is doomed to be left behind. And I can see it happening in a month at most, probably when Bitcoin Core fails to launch Segwit on time.
Lauda
Legendary
*
Offline Offline

Activity: 2674
Merit: 2965


Terminated.


View Profile WWW
March 09, 2016, 08:19:20 AM
 #160

What is so hard to understand? This is the Bitcoin experience. When you first found out about Bitcoin, I am sure you were aware of all this, and of the fact that hard forks are a necessity in the evolution of great software.
They aren't. I have no idea where you got that from.

SegWit forked, that was it! Simple as that; they didn't even know the reason behind it. It won't be ready for April, and the roadmap will remain just dust in the eyes.
This is a very bad FUD attempt. It happened because some people were running an older version (there were changes between v2 and v3). In other words, it is irrelevant, as this won't and can't happen on mainnet.

The question is not whether Bitcoin can scale to the Visa network, but whether Bitcoin can scale to keep up with demand, because there is no such high demand for Bitcoin to be used at Visa levels in the near future. And I'm sure the answer is yes.
You can't know that, for two reasons (as examples; there are more): 1) You don't know what demand is represented as (e.g. TX volume? Not necessarily, as somebody could be creating a lot of TXs themselves), nor how much demand there is going to be; 2) You don't know how the technology is going to improve over the years. Anyhow, with Segwit around the corner, I don't understand the 'urgency' for a 2 MB block size limit.

ATguy
Sr. Member
****
Offline Offline

Activity: 423
Merit: 250



View Profile
March 09, 2016, 08:33:24 AM
 #161

Of course Gavin is doing things differently in Classic; his primary goal is to scale the blockchain as much as possible.

This is NOT the main goal of Core, which is fine; their second-layer solution is fine, but it's simply not what the majority wants...


I agree the second layer is fine, but people should choose it because it is better than decentralized on-chain transactions, not just because the block size is artificially limited so they have no other choice.



Bitcoin is up and running well (assuming one uses reasonable fees, ~5¢/transaction) despite an onslaught of ill-intentioned persons.  Shame on all of us for not adjusting our marketing messages earlier to set expectations better.  Find me another system that has withstood as much and moves millions of dollars a day.

Who pays the Core Dev Team?  Are they doing all their work for Bitcoin for free?

My sincerest heartfelt praise and admiration go out to Core Dev Team members!  They are giving us something I couldn't do myself.  That said, I do think there's room for improvement on the handling of perceptions front.  I sincerely believe they have the overall good of Bitcoin *and* the users of it foremost in their minds.

Lauda has done a yeoman's job representing the positions; thank goodness someone has the patience.

Give it a few more days of debate, and you'll see your optimism and praise turn to anger and disgust. LOL


This. Unless one knows the big picture, it is easy to say Bitcoin is working well. It is now, but it is supposed to become just a settlement layer for off-chain transactions, with on-chain fees hundreds of times more expensive because of the artificial block size limit. So say goodbye to affordable decentralized on-chain transactions if this vision comes true, and get ready for just more centralized off-chain solutions!

Here is the suggested plan explained by Bitmain's Jihan Wu at 8btc, which he got from Core representatives at the recent Hong Kong meeting:

https://np.reddit.com/r/BitcoinMarkets/comments/48kf18/daily_discussion_wednesday_march_02_2016/d0krl0w

Quote
During the Hong Kong meeting, the answer provided by Core reps is that the future Lightning Network would increase capacity a thousandfold - that up to tens of thousands of transactions can be completed on the lightning network and settled with one on chain transaction. Assuming that current transaction fees are 0.3 RMB, and assuming that 1000 lightning transactions can be settled by one blockchain transaction, then we can raise fees for on chain transactions to 30 RMB (100x increase), while each transaction on the lightning network would only cost a tenth of current fees and increase miner revenue a hundredfold.
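The arithmetic in that quote is internally consistent, assuming 1,000 lightning transactions settle into one on-chain transaction (the variable names below are illustrative):

```python
current_onchain_fee_rmb = 0.3    # today's on-chain fee, per the quote
ln_txs_per_settlement = 1000     # lightning txs settled by one on-chain tx
proposed_onchain_fee_rmb = 30.0  # 100x today's fee

# Fee borne by each lightning transaction after the change:
per_ln_tx_fee = proposed_onchain_fee_rmb / ln_txs_per_settlement

# Each lightning tx then costs a tenth of today's on-chain fee (0.03 vs
# 0.3 RMB), while each on-chain settlement pays miners 100x more.
fee_multiplier = proposed_onchain_fee_rmb / current_onchain_fee_rmb
```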

SpiryGolden
Hero Member
*****
Offline Offline

Activity: 812
Merit: 500



View Profile
March 09, 2016, 08:37:47 AM
 #162

What is so hard to understand? This is the Bitcoin experience. When you first found out about Bitcoin, I am sure you were aware of all this, and of the fact that hard forks are a necessity in the evolution of great software.
They aren't. I have no idea where you got that from.

SegWit forked, that was it! Simple as that; they didn't even know the reason behind it. It won't be ready for April, and the roadmap will remain just dust in the eyes.
This is a very bad FUD attempt. It happened because some people were running an older version (there were changes between v2 and v3). In other words, it is irrelevant, as this won't and can't happen on mainnet.


Resistance is futile, my friend. We don't want to drag Bitcoin down like an old car trying to keep the engine running with a simple rag; that's what SegWit is now, standing in for a solution that must be implemented. You guys are so delusional that you actually dissed Satoshi Nakamoto; it's clear that your vision is not about Bitcoin anymore. If you don't like that, please move to any shillcoins you wish and play with those. But this is Bitcoin. Even more shocking, I have heard Core supporters dismissing Moore's Law. So what's next, you have the solution to E=mc² and Einstein was an idiot?
Lauda
Legendary
*
Offline Offline

Activity: 2674
Merit: 2965


Terminated.


View Profile WWW
March 09, 2016, 08:44:37 AM
 #163

Resistance is futile, my friend. We don't want to drag Bitcoin down like an old car trying to keep the engine running with a simple rag; that's what SegWit is now, standing in for a solution that must be implemented. You guys are so delusional that you actually dissed Satoshi Nakamoto; it's clear that your vision is not about Bitcoin anymore. But this is Bitcoin.
This is among the worst attempts at a rebuttal that I've seen recently. Segwit is an improvement to scalability; a 2 MB block size limit isn't. You can't change the facts regardless of what nonsense you try to feed the majority. The debate has become nonsense: 'big blockists' wanted more capacity -> Core provides (will provide) this capacity with Segwit -> 'big blockists' continue complaining. This does not make much sense.

If you don't like that, please move to any shillcoins you wish and play with those.
Which is exactly what Classic is.

Even more shocking, I have heard Core supporters dismissing Moore's Law. So what's next, you have the solution to E=mc² and Einstein was an idiot?
This has to be a joke, as you obviously have very limited knowledge in this field. Even Moore himself said that it is not a law but rather a self-fulfilling prophecy.

SpiryGolden
Hero Member
*****
Offline Offline

Activity: 812
Merit: 500



View Profile
March 09, 2016, 08:55:06 AM
 #164


If you don't like that, please move to any shillcoins you wish and play with those.
Which is exactly what Classic is.


Your ignorance is amusing me. So that means Bitcoin Core is a shillcoin also? Because it isn't Bitcoin... it is Bitcoin "Core" by Blockstream. So basically... it is? C'mon, say it. You trapped yourself into this one.

Good luck, mate. For real, I am sick of this. Bitcoin Classic will succeed, I am sure of that, and what's more, the whole community will be happy to get rid of them. 18% more nodes converting to Bitcoin Classic and Core is over, losing the majority. That simple and easy.

Kudos, and don't get drunk on plain water; it is not good.
Lauda
Legendary
*
Offline Offline

Activity: 2674
Merit: 2965


Terminated.


View Profile WWW
March 09, 2016, 08:59:07 AM
 #165

Good luck, mate. For real, I am sick of this. Bitcoin Classic will succeed, I am sure of that, and what's more, the whole community will be happy to get rid of them.


Good luck; you will surely need it. Now, since you don't use technical arguments or anything, I'd kindly ask you not to derail this thread further. There is at least one person in it that seems decent and worth talking to.

BlindMayorBitcorn
Legendary
*
Offline Offline

Activity: 1260
Merit: 1115



View Profile
March 09, 2016, 09:07:15 AM
 #166


Even more shocking, I have heard Core supporters dismissing Moore's Law. So what's next, you have the solution to E=mc² and Einstein was an idiot?
This has to be a joke, as you obviously have very limited knowledge in this field. Even Moore himself said that it is not a law but rather a self-fulfilling prophecy.

http://arstechnica.com/information-technology/2016/02/moores-law-really-is-dead-this-time/

First slowing in the mid-’90s to two-year gaps, the rate at which tiny transistors compute isn’t accelerating. Soon, too, they’ll have to be so small they’re just a few molecules, perhaps not even effective. And they’re not getting cheaper. Moore saw the future 50 years ago, but we may soon need a different rubric for predicting progress.

Forgive my petulance and oft-times, I fear, ill-founded criticisms, and forgive me that I have, by this time, made your eyes and head ache with my long letter. But I cannot forgo hastily the pleasure and pride of thus conversing with you.
hv_
Legendary
*
Offline Offline

Activity: 2506
Merit: 1055

Clean Code and Scale


View Profile WWW
March 09, 2016, 09:13:04 AM
 #167


Even more shocking, I have heard Core supporters dismissing Moore's Law. So what's next, you have the solution to E=mc² and Einstein was an idiot?
This has to be a joke, as you obviously have very limited knowledge in this field. Even Moore himself said that it is not a law but rather a self-fulfilling prophecy.

http://arstechnica.com/information-technology/2016/02/moores-law-really-is-dead-this-time/

First slowing in the mid-’90s to two-year gaps, the rate at which tiny transistors compute isn’t accelerating. Soon, too, they’ll have to be so small they’re just a few molecules, perhaps not even effective. And they’re not getting cheaper. Moore saw the future 50 years ago, but we may soon need a different rubric for predicting progress.


.. just wait for the next quantum jump. It's coming ...

          ( Hope for Bitcoin as well ! )

Carpe diem  -  understand the White Paper and mine honest.
Fix real world issues: Check out b-vote.com
The simple way is the genius way - Satoshi's Rules: humana veris _
BlindMayorBitcorn
Legendary
*
Offline Offline

Activity: 1260
Merit: 1115



View Profile
March 09, 2016, 09:16:36 AM
 #168


Zarathustra
Legendary
*
Offline Offline

Activity: 1162
Merit: 1004



View Profile
March 09, 2016, 09:27:56 AM
 #169

Segwit is an improvement to scalability, a 2 MB block size limit isn't.

An improvement to scale offchain (altchain).
sgbett
Legendary
*
Offline Offline

Activity: 2576
Merit: 1087



View Profile
March 09, 2016, 10:41:21 AM
 #170

No. You don't get to define what we allow in the system and what we don't, certainly not when it was possible all this time. What Gavin proposed is a hacky workaround, nothing more

Setting a block size limit of 1MB was, and continues to be, a hacky workaround.

Theory drives development, but in practice sometimes hacky workarounds are needed.

I write code; I'd prefer it was all perfect. I run a business, which means sometimes I have to consider the bottom line. If a risk is identified and a quick fix is available, it makes economic sense to apply the quick fix while working on a more robust long-term solution.

That this has not been done inevitably leads people to question why. It's the answers given to those questions that are causing the most difficulty: the fact that when those answers are challenged, the story changes; the fact that the answers are inconsistent with what seems logical to any reasonably minded impartial observer.

The most important thing is that until about a year ago there was near-unanimous agreement on what the purpose of the block size limit was and how it would be dealt with. Yet here we are today, with this action not having been taken, and a group of people actively trying to convince everyone that centralised enforcement of a block size limit is somehow the natural behaviour of the system, despite it never having been so in its entire history.

The block size limit was a hacky workaround to the expensive-to-validate issue, an issue that is now mitigated by other, much better solutions, not least a well-incentivised distributed mining economy that is smart enough to route around such an attack, making it prohibitively expensive to maintain.

Individual economic self interest is how Bitcoin is supposed to work.

It's time to remove the bandaid.

When the curtain is pulled back you will see how powerful the wizard really isn't.


"A purely peer-to-peer version of electronic cash would allow online payments to be sent directly from one party to another without going through a financial institution" - Satoshi Nakamoto
*my posts are not investment advice*
hv_
Legendary
*
Offline Offline

Activity: 2506
Merit: 1055

Clean Code and Scale


View Profile WWW
March 09, 2016, 12:10:48 PM
 #171

Bitcoin Core contributors (Ordered by number of commits);

Wladimir J. van der Laan (3398)
Gavin Andresen (1100)
Pieter Wuille (966)
Cory Fields (332)
TheBlueMatt (288)
jonasschnelli (246)
Luke-Jr (239)
Gregory Maxwell (195)
MarcoFalke (128)
fanquake (118)
jtimon (112)
Peter Todd (92)
cozz (70)
sdaftuar (69)
morcos (59)
paveljanik (52)
pstratem (37)
muggenhor (34)
Eric Lombrozo (32)
rebroad (28)
domob1812 (25)
Michagogo (24)
dooglus (22)
dexX7 (20)
dgenr8 (18)
super3 (16)
kdomanski (15)
casey (15)
xanatos (14)
ENikS (13)
wtogami (12)
codler (10)
btcdrak (10)
wizeman (10)
maaku (10)
rnicoll (9)
ajweiss (8)
roques (8)
jamesob (8)
jmcorgan (8)
jordanlewis (8)
21E14 (8)
devrandom (8)
joshtriplett (8)
Nils Schneider (7)
forrestv (7)
freewil (7)
rat4 (7)
sinetek (7)
dcousens (7)
sje397 (7)
celil-kj (7)
sandakersmann (7)
runeksvendsen (7)
mrbandrews (7)
OttoAllmendinger (6)
mgiuca (6)
Matoking (6)
vegard (6)
zw (6)
p2k (6)
JoelKatz (6)
jrmithdobbs (6)
ashleyholman (6)
dertin (6)
Andreas Schildbach (6)
mndrix (5)
r000n (5)
roybadami (5)
vinniefalco (5)
Whit Jack (5)
fcicq (5)
ptschip (5)
maraoz (5)
federicobond (5)
alexanderkjeldaas (5)
robbak (5)
rdponticelli (5)
...
sorry I ran out of gas/time.  So, pretty clearly "most" don't appear on the link from Blockstream.


On page 43 you'll find a graph, and this doc is very interesting overall (Visa is covered in it too ...)

https://www.gov.uk/government/uploads/system/uploads/attachment_data/file/492972/gs-16-1-distributed-ledger-technology.pdf

Lauda
Legendary
*
Offline Offline

Activity: 2674
Merit: 2965


Terminated.


View Profile WWW
March 09, 2016, 01:11:29 PM
 #172

Setting a block size limit of 1MB was, and continues to be, a hacky workaround.
It is certainly not a hacky workaround. It is a limit that was needed (and still is, for the time being).

Theory drives development, but in practice sometimes hacky workarounds are needed.
If it can be avoided, not really.

The block size limit was a hacky workaround to the expensive-to-validate issue, an issue that is now mitigated by other, much better solutions, not least a well-incentivised distributed mining economy that is smart enough to route around such an attack, making it prohibitively expensive to maintain.
So what exactly is the plan, to replace one "hacky workaround" with another? Quite a lovely way forward. Segwit is being delivered, and it will ease the validation problem and increase transaction capacity. What exactly is the problem?


franky1
Legendary
*
Offline Offline

Activity: 4214
Merit: 4458



View Profile
March 09, 2016, 02:42:52 PM
 #173

What is so hard to understand? This is the Bitcoin experience. When you first found out about Bitcoin, I am sure you were aware of all this, and of the fact that hard forks are a necessity in the evolution of great software.
They aren't. I have no idea where you got that from.

Hard forks are not needed?
So, the 2013 event caused by the database bug: if we had not hard forked, we would be stuck at 500 kB blocks.
And the earlier event where extra bitcoins were created: if we hadn't hard forked, the 21 million cap would have been broken.

If any bug appears in the future, should we just live with it? By your logic yes; by my logic no.
Use logic, not Blockstream fanboyism.

SegWit forked, that was it! Simple as that; they didn't even know the reason behind it. It won't be ready for April, and the roadmap will remain just dust in the eyes.
This is a very bad FUD attempt. It happened because some people were running an older version (there were changes between v2 and v3). In other words, it is irrelevant, as this won't and can't happen on mainnet.
So hard forks can happen when some upgrade, causing a hard fork by voting for the change, and some hard forks happen because people don't want the change.

So are you advocating that people should upgrade and accept change, or not upgrade and stick with the old rules? In both cases you have proven that a hard fork would still happen.
Again, use logic, not Blockstream fanboyism.

The question is not whether Bitcoin can scale to the Visa network, but whether Bitcoin can scale to keep up with demand, because there is no such high demand for Bitcoin to be used at Visa levels in the near future. And I'm sure the answer is yes.
You can't know that, for two reasons (as examples; there are more): 1) You don't know what demand is represented as (e.g. TX volume? Not necessarily, as somebody could be creating a lot of TXs themselves), nor how much demand there is going to be; 2) You don't know how the technology is going to improve over the years. Anyhow, with Segwit around the corner, I don't understand the 'urgency' for a 2 MB block size limit.
And now you know why we need a BUFFER: to allow for natural growth when it happens, instead of endless begging for minimal growth every two years.

2 MB + segwit offers four times the POTENTIAL, meaning instead of 2,000 tx of potential there can be 8,000 tx of potential. It does not mean blocks need to be filled with 8,000 transactions by summer 2017; it just means blocks can grow slowly and naturally from 2,000 to 8,000 as and when needed, at their own natural pace, without having to cry to Blockstream every few months asking for an upgrade from 2,000 to 2,200 or 2,200 to 2,400, and only getting those small bumps every couple of years.

I DO NOT TRADE OR ACT AS ESCROW ON THIS FORUM EVER.
Please do your own research & respect what is written here as both opinion & information gleaned from experience. many people replying with insults but no on-topic content substance, automatically are 'facepalmed' and yawned at
franky1
Legendary
*
Offline Offline

Activity: 4214
Merit: 4458



View Profile
March 09, 2016, 02:55:31 PM
 #174

This has to be a joke, as you obviously have very limited knowledge in this field. Even Moore himself said that it is not a law but rather a self-fulfilling prophecy.

have you read a single line of bitcoin code yet, or do you need to ask blockstream what language it is written in?


have you even got a full node running, or are you still unsure how long it takes to sync the blockchain?


would you run that future intended full node you want to have on your own computer, or run it remotely on an amazon server?

i think Lauda has drawn too many people into his rhetoric, when in actual fact he knows very little and is just a mouthpiece for blockstream

here is a summary of the debunks of literally every doomsday scenario Lauda has attempted to inflict on the community to sway people away from blockstream's agenda (roadmap)

though classic is one implementation, there is a lot of background drama involved. so what could be done better is to get the programmers of bitcoinj, btcd, and the other main implementations to go for 2mb as well, and find a way to get blockstream to come to their senses and have 2mb as well.

things like debunking the 12-month grace hard fork contention argument, by using luke jr's proposal of a different hard fork (difficulty drop) that he feels can happily become active 3 months after code release. (if luke thinks 3 months is acceptable, then there is no reason to go for 12; if luke wants his code in april, then 2mb can be in april too)

things like debunking validation issues by highlighting that libsecp256k1 offers 5x validation speeds, making a total of 10,000 signatures validate in april 2016 in the same time it took 2000 signatures to validate in january 2016, thus allowing for more than a small bit of growth

things like debunking the hard drive storage bloat, with stats that 1mb has a maximum yearly 100%-filled-blocks rate of 52.5gb. 2mb = 105gb, 4mb = 210gb.
so a 2tb hard drive at $100 can store about 40 years of 1mb, 20 years of 2mb and 10 years of 4mb (2mb+segwit)

things like debunking user upload speeds causing relay delays, by stating that millions of people can happily play an online game while in a voice-over-IP group chat, while livestreaming the game to youtube or twitch, all of which are upload activities. 750 kbps = ~93 kB/s = ~56 MB every 10 minutes
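The storage and upload figures in the post above follow from simple arithmetic (one ~10-minute block, worst-case 100%-full blocks; the 750 kbps upload rate is the post's own assumption). A quick sketch:

```python
# Back-of-envelope chain growth and upload capacity, per the post's assumptions.
BLOCKS_PER_YEAR = 6 * 24 * 365  # one block per ~10 minutes = 52,560/year

def yearly_storage_gb(block_mb):
    """Worst-case blockchain growth per year for a given block size (MB)."""
    return block_mb * BLOCKS_PER_YEAR / 1000  # MB -> decimal GB

print(yearly_storage_gb(1))  # 52.56  -> the post's "52.5gb"
print(yearly_storage_gb(2))  # 105.12 -> "105gb"
print(yearly_storage_gb(4))  # 210.24 -> "210gb"

# Upload: 750 kbit/s sustained over one 10-minute block interval.
kbytes_per_sec = 750 / 8                    # 93.75 kB/s
mb_per_interval = kbytes_per_sec * 600 / 1000
print(round(mb_per_interval, 2))            # 56.25 -> "~56mb every 10 minutes"
```

So a 2 TB drive does indeed hold roughly 38-40 years of full 1 MB blocks under these assumptions.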

Laosai
Sr. Member
****
Offline Offline

Activity: 462
Merit: 250


View Profile
March 09, 2016, 02:57:15 PM
 #175

Satoshi was wrong about so many things.

And yet so many people are keeping to his words like it's some kind of sacred text :/

pedrog
Legendary
*
Offline Offline

Activity: 2786
Merit: 1031



View Profile
March 09, 2016, 03:06:52 PM
 #176

Satoshi was wrong about so many things.

And yet so many people are keeping to his words like it's some kind of sacred text :/

That's because we came to Bitcoin on Satoshi's vision, and we still care for that vision; Bitcoin should fail or succeed by it.

The people currently in charge have a different vision for Bitcoin. They should instead, like so many other people, build their own alternative system and let Bitcoin be what it was supposed to be.

franky1
Legendary
*
Offline Offline

Activity: 4214
Merit: 4458



View Profile
March 09, 2016, 03:14:11 PM
 #177

Satoshi was wrong about so many things.

And yet so many people are keeping to his words like it's some kind of sacred text :/

because bitcoin's vision is about a decentralized currency not run by corporations who intend to screw users over.
the funny thing is that those denouncing satoshi's vision are doing so not because of any logic of satoshi being wrong overall.. but because they are on the corporate/capitalist bandwagon and want to profit from other people's misery

even this very topic, thinking bitcoin needs to be like Visa.. instead of just decentralized cash (cheques, to be more precise), shows that people want to move away from the decentralized zero-control premise

David Rabahy
Hero Member
*****
Offline Offline

Activity: 709
Merit: 501



View Profile
March 09, 2016, 04:04:11 PM
 #178

A transaction will have one or more sigops (signature operations).  Let's denote these as S1, S2, ..., Sn.  Apparently verifying these involves considering pairs, e.g. (S1, S2), (S1, S3), ..., (S1, Sn), (S2, S3), (S2, S4), ..., (S2, Sn), ..., (Sn-1, Sn).  This leads to quadratic growth of the compute time.  To avoid running behind the live stream of blocks, transactions are limited in size; they cannot span multiple blocks.  Furthermore, the block size is limited, currently to 1 MB.

The least compute-intensive block is composed of a single transaction with only one sigop.  The most compute-intensive block is also composed of a single transaction, but with as many sigops as will fit.  A block with two or more transactions will be less computationally intense.  A block full of many transactions, each with just a single sigop, is minimal in terms of the compute power required to verify it.

Let's denote transactions Tk, where k is the number of sigops.  A block is composed of a set of transactions, (Tk1, Tk2, ..., Tkn).  The verification time is then approximated by k1² + k2² + ... + kn².

To scale we must do one of the following:

1) have fast enough hardware
2) improve the algorithm, i.e. verify without pairing
3) verify in the background (although the verification has to happen sometime)
4) constrain the sum of sigops/transaction in a block

Can someone please provide an example of a transaction that requires two or more sigops?
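The quadratic model described above can be sketched numerically. This is a toy illustration of the post's own assumption (a transaction with k sigops costs on the order of k² work), not real Bitcoin validation code; times are in arbitrary units.

```python
def block_verify_cost(sigops_per_tx):
    """Toy model: a transaction with k sigops costs k*k units, so a block
    costs the sum of k_i^2 over its transactions."""
    return sum(k * k for k in sigops_per_tx)

# Worst case vs. best case for the same total of 2000 sigops in a block:
one_big_tx = block_verify_cost([2000])      # 4,000,000 units
many_small = block_verify_cost([1] * 2000)  # 2,000 units
print(one_big_tx // many_small)             # 2000x more work, same sigop total
```

This is why a single pathological transaction dominates the discussion even though ordinary blocks are cheap to verify.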
adamstgBit
Legendary
*
Offline Offline

Activity: 1904
Merit: 1037


Trusted Bitcoiner


View Profile WWW
March 09, 2016, 04:18:03 PM
 #179

A transaction will have one or more sigops (signature operations).  Let's denote these as S1, S2, ..., Sn.  Apparently verifying these involves considering pairs, e.g. (S1, S2), (S1, S3), ..., (S1, Sn), (S2, S3), (S2, S4), ..., (S2, Sn), ..., (Sn-1, Sn).  This leads to quadratic growth of the compute time.  To avoid running behind the live stream of blocks, transactions are limited in size; they cannot span multiple blocks.  Furthermore, the block size is limited, currently to 1 MB.

The least compute-intensive block is composed of a single transaction with only one sigop.  The most compute-intensive block is also composed of a single transaction, but with as many sigops as will fit.  A block with two or more transactions will be less computationally intense.  A block full of many transactions, each with just a single sigop, is minimal in terms of the compute power required to verify it.

Let's denote transactions Tk, where k is the number of sigops.  A block is composed of a set of transactions, (Tk1, Tk2, ..., Tkn).  The verification time is then approximated by k1² + k2² + ... + kn².

To scale we must do one of the following:

1) have fast enough hardware
2) improve the algorithm, i.e. verify without pairing
3) verify in the background (although the verification has to happen sometime)
4) constrain the sum of sigops/transaction in a block

Can someone please provide an example of a transaction that requires two or more sigops?

this is a non-issue
#1 and #2 have already been done

it's not like these crazy computationally-intensive TXs are legit; you only get into trouble if you try to allow crazy spam-like TXs with thousands of inputs.

miners are allowed to orphan a block for ANY reason. I think it's perfectly valid to not allow spam TXs designed to slow down the validation time of a block.

adamstgBit
Legendary
*
Offline Offline

Activity: 1904
Merit: 1037


Trusted Bitcoiner


View Profile WWW
March 09, 2016, 04:32:04 PM
 #180

everything you hear about why we shouldn't increase the blocksize is a bullshit excuse to delay the inevitable. the Lightning Network will require bigger blocks too....
we simply have to do this; the only thing to discuss now is when, that's all.
core says 1 year because they want the fee market to grow, and to get people to come to their second-layer solution sooner rather than later.
they would have us keep bitcoin's first layer always 1 step behind, so that the second-layer solution is attractive from day 1.

if you can't see this reality then you haven't done enough digging.

Lauda
Legendary
*
Offline Offline

Activity: 2674
Merit: 2965


Terminated.


View Profile WWW
March 09, 2016, 04:51:00 PM
Last edit: March 09, 2016, 05:43:33 PM by Lauda
 #181

everything you hear about why we shouldn't increase the blocksize is a bullshit excuse to delay the inevitable.
Nope. There are true and valid concerns regarding it. You can't just play around with the numbers as you wish.

the Lightning Network will require bigger blocks too....
According to the whitepaper it will be able to accommodate 30 million users at a 1 MB block size limit.

we simply have to do this; the only thing to discuss now is when, that's all.
When, how much, and test. Not as simple as you think.

core says 1 year because they want the fee market to grow, and to get people to come to their second-layer solution sooner rather than later.
There is no such thing as "their" second-layer solution. You could develop LN yourself, if you had the necessary skills.


Update: Removed completely false statement.

adamstgBit
Legendary
*
Offline Offline

Activity: 1904
Merit: 1037


Trusted Bitcoiner


View Profile WWW
March 09, 2016, 04:57:29 PM
 #182

the Lightning Network will require bigger blocks too....
According to the whitepaper it will be able to accommodate 30 million users at a 1 MB block size limit.

with the 1 MB blocks currently all full, I seriously doubt that's true.

what happens when a payment channel with 10,000 TXs needs to be settled on the blockchain?

David Rabahy
Hero Member
*****
Offline Offline

Activity: 709
Merit: 501



View Profile
March 09, 2016, 04:57:53 PM
 #183

To scale we must do one of the following:

1) have fast enough hardware
2) improve the algorithm, i.e. verify without pairing
3) verify in the background (although the verification has to happen sometime)
4) constrain the sum of sigops/transaction in a block

Can someone please provide an example of a transaction that requires two or more sigops?
this is a non-issue
#1 and #2 have already been done

it's not like these crazy computationally-intensive TXs are legit; you only get into trouble if you try to allow crazy spam-like TXs with thousands of inputs.

miners are allowed to orphan a block for ANY reason. I think it's perfectly valid to not allow spam TXs designed to slow down the validation time of a block.
1) Oh?  What speed hardware can handle what number of sigops?  With quadratic growth eventually some number of sigops will exceed the capacity of any conceivable hardware.
2) Please help me understand; I am not aware of this development yet.

So, each input comes with a sigop; I see, actually that makes perfect sense.  Hmm, would it make any sense to create a number of smaller transactions that consolidate the many inputs?  Perhaps it would be possible/better to avoid creating addresses with small amounts in the first place.  Just trying my best to understand.  I appreciate your patience and efforts.
adamstgBit
Legendary
*
Offline Offline

Activity: 1904
Merit: 1037


Trusted Bitcoiner


View Profile WWW
March 09, 2016, 05:00:29 PM
 #184

To scale we must do one of the following:

1) have fast enough hardware
2) improve the algorithm, i.e. verify without pairing
3) verify in the background (although the verification has to happen sometime)
4) constrain the sum of sigops/transaction in a block

Can someone please provide an example of a transaction that requires two or more sigops?
this is a non-issue
#1 and #2 have already been done

it's not like these crazy computationally-intensive TXs are legit; you only get into trouble if you try to allow crazy spam-like TXs with thousands of inputs.

miners are allowed to orphan a block for ANY reason. I think it's perfectly valid to not allow spam TXs designed to slow down the validation time of a block.
1) Oh?  What speed hardware can handle what number of sigops?  With quadratic growth eventually some number of sigops will exceed the capacity of any conceivable hardware.
2) Please help me understand; I am not aware of this development yet.

So, each input comes with a sigop; I see, actually that makes perfect sense.  Hmm, would it make any sense to create a number of smaller transactions that consolidate the many inputs?  Perhaps it would be possible/better to avoid creating addresses with small amounts in the first place.  Just trying my best to understand.  I appreciate your patience and efforts.

i don't know the details...

I assume hardware is always getting better because HELLO! and libsecp256k1 offers 5x validation speeds.

adamstgBit
Legendary
*
Offline Offline

Activity: 1904
Merit: 1037


Trusted Bitcoiner


View Profile WWW
March 09, 2016, 05:05:11 PM
 #185

we need to get Gavin and Vladimir in the same room and ask them these questions.
I think you'll see that the reason they disagree isn't technical limitations but rather different visions.
One believes bitcoin full nodes need to run on a Raspberry Pi; the other doesn't think that's necessary.

David Rabahy
Hero Member
*****
Offline Offline

Activity: 709
Merit: 501



View Profile
March 09, 2016, 05:19:51 PM
 #186

I assume hardware is always getting better because HELLO! and libsecp256k1 offers 5x validation speeds.
Ah, I shall research these, but they both sound like software to me.  Hmm, unless "HELLO!" isn't software and you were just indicating that faster hardware is just so obvious.

Certainly faster hardware is coming at us; but a quadratic growth problem will *always* outscale even the greatest conceivable hardware.

A linear improvement in the software is always appreciated, but again it is *only* linear (even if it is a massive 5x) as compared to quadratic growth, i.e. n².  For example, suppose it takes t amount of time to process one sigop.  Then a transaction with n sigops will take approximately n²*t amount of time.  Now we unleash the mighty libsecp256k1 5x improvement.  So, we have n²*(t/5).  When n is small this is great news.  For example, 1²*t vs. 1²*(t/5), or 1t vs. t/5, gets us the full 5x advantage; but 10²*t vs. 10²*(t/5), or 100t vs. 20t, is still taking 20t, not 10t, let alone 2t, to do those 10 sigops.  Moreover, 100 sigops works out to 10,000t vs. 2,000t; and who wants to compute 2,000t for just 100 sigops?  Honestly/sincerely, I am utterly delighted at the 5x offered by libsecp256k1, but against quadratic growth it pales.
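The arithmetic above (a linear 5x speedup divided into a quadratic n²·t cost) can be checked with a few lines, using the same toy model:

```python
def verify_time(n_sigops, t=1.0, speedup=1.0):
    """Toy quadratic sigop cost n^2 * t, divided by a linear software speedup."""
    return n_sigops ** 2 * t / speedup

print(verify_time(10))              # 100.0 t
print(verify_time(10, speedup=5))   # 20.0 t  -- not 10t, let alone 2t
print(verify_time(100, speedup=5))  # 2000.0 t, as computed in the post
```

A constant-factor speedup only divides the parabola by a constant; it never changes its shape.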
sgbett
Legendary
*
Offline Offline

Activity: 2576
Merit: 1087



View Profile
March 09, 2016, 05:22:43 PM
 #187

Satoshi was wrong about so many things.

He was right about things too.

<snip>
Bitcoin users might get increasingly tyrannical about limiting the size of the chain so it's easy for lots of users and small devices.
</snip>


"A purely peer-to-peer version of electronic cash would allow online payments to be sent directly from one party to another without going through a financial institution" - Satoshi Nakamoto
*my posts are not investment advice*
pedrog
Legendary
*
Offline Offline

Activity: 2786
Merit: 1031



View Profile
March 09, 2016, 05:24:04 PM
 #188

According to the whitepaper it will be able to accommodate 30 million users at a 1 MB block size limit.


30 million users?

David Rabahy
Hero Member
*****
Offline Offline

Activity: 709
Merit: 501



View Profile
March 09, 2016, 05:31:54 PM
 #189

we need to get Gavin and Vladimir in the same room and ask them these questions.
I think you'll see that the reason they disagree isn't technical limitations but rather different visions.
One believes bitcoin full nodes need to run on a Raspberry Pi; the other doesn't think that's necessary.
Ah, vision; this is always worthy of lively debate.  As much as I love a deep technical dive into the underpinnings, with the passions let loose such debates are often misunderstood by observers.

My personal hope is that those who are able to do are doing, and so are not available to debate.  They will prove themselves in the only way that matters: delivering working code.

The rest of us who can't (or won't) are left to do our best at picking up the ideas of the doers and representing them as well as we can.

In my humble personal vision, I see Bitcoin being adopted very widely long before the fees climb.  But blindly cranking up the block size without constraining the impact of transactions with many sigops is taking an unwise risk.
franky1
Legendary
*
Offline Offline

Activity: 4214
Merit: 4458



View Profile
March 09, 2016, 05:38:22 PM
 #190

everything you hear about why we shouldn't increase the blocksize is a bullshit excuse to delay the inevitable.
Nope. There are true and valid concerns regarding it. You can't just play around with the numbers as you wish.


the reason they want to wait a year is not to give bitcoin more buffer space to grow and allow more ONCHAIN. they want to delay it so that they can then increase it marginally, only enough to fill the buffer space with the flags and opcode bloat that LN needs, forcing people away from onchain, and allowing the ONCHAIN fee to increase due to blocks being nearly full, as more incentive to divert people away from bitcoin's ONCHAIN.

as for the raspberry pi debate:
RasPi 2: 900 MHz
1 GB RAM

RasPi 3: 1.2 GHz (33% increase)
1 GB RAM

33% increase in processing speed in just 1 year..
say a RasPi 2 could handle 2000 signature verifications every couple of minutes (understatement used for basic maths).
a RasPi 3 can handle 2660 signature verifications in the same time.

now add on libsecp256k1's 5x performance,

and a RasPi 3 after april 2016 can handle 13,300 verifications in the time it took a RasPi 2 to handle 2,000 just a couple of months ago.

and to mr david rabahy, i know you do love your maths. but please don't take Lauda's word for it, because he heard it from someone who heard it from someone, and that's just chinese whispers.

though i myself think the quadratic debate is a moot point, because we are talking about 2mb buffer code to be added in april (not now) which would activate only after the quadratic doomsday scenario is no longer a thing. so to all intents and purposes, by the time 2mb is activated the quadratic debate does not even need to be mentioned.

but i would still like to see some real maths using real bitcoin data to see if the quadratic debate was even a thing during 2009 - april 2016

Lauda
Legendary
*
Offline Offline

Activity: 2674
Merit: 2965


Terminated.


View Profile WWW
March 09, 2016, 05:41:17 PM
 #191

30 million users?
My god, I have made a HUGE mistake (as I've said this in a few places). I've just re-read the whitepaper and my statement is completely wrong:
Quote
If we presume that a decentralized payment network exists and one person will make 3 blockchain transactions per year on average, Bitcoin will be able to support over 35 million users with 1MB blocks in ideal circumstances (assuming 2000 transactions per MB).
I must apologize. If anyone sees this mistake anywhere, please notify me and I'll delete it.

I assume hardware is always getting better because HELLO! and libsecp256k1 offers 5x validation speeds.
libsecp256k1 does not have an effect on the validation problem (Gavin confirmed this somewhere on Reddit, but I can't find it).

with the 1 MB blocks currently all full, I seriously doubt that's true.

what happens when a payment channel with 10,000 TXs needs to be settled on the blockchain?
My statement is wrong (read above). However, the answer to your question is: it needs 1 transaction on-chain (to close the channel).
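The corrected whitepaper figure can be reproduced from its own stated assumptions (2000 transactions per MB, 3 on-chain transactions per user per year, one ~10-minute block):

```python
TX_PER_MB = 2000       # assumption from the quoted whitepaper passage
BLOCKS_PER_DAY = 144   # one block per ~10 minutes

def ln_supported_users(block_mb=1, tx_per_user_per_year=3):
    """Users supportable if each needs a few on-chain TXs per year."""
    yearly_capacity = TX_PER_MB * block_mb * BLOCKS_PER_DAY * 365
    return yearly_capacity // tx_per_user_per_year

print(ln_supported_users())  # 35040000 -- "over 35 million users"
```

Note this is an ideal-circumstances ceiling: it assumes every byte of every block carries channel-management transactions.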

David Rabahy
Hero Member
*****
Offline Offline

Activity: 709
Merit: 501



View Profile
March 09, 2016, 05:42:18 PM
 #192

On page 43 you'll find a graph. The whole document is very interesting (Visa is also covered in it ...)

https://www.gov.uk/government/uploads/system/uploads/attachment_data/file/492972/gs-16-1-distributed-ledger-technology.pdf
Thank you so much for the link.  I am reading through it, but it is long and will take time to digest.  Honestly, I will most likely tire of it and not finish; sorry.  I did find the chart on page 43 showing lines of code by developer.  Please help me understand the conclusion we are meant to draw.
franky1
Legendary
*
Offline Offline

Activity: 4214
Merit: 4458



View Profile
March 09, 2016, 05:45:20 PM
 #193

lauda's evidence is not science or technical analysis.. but an image he is advertising..

when you go to your doctor, do you ask for a glossy leaflet, or do you ask the doctor for specific details and maybe a second opinion..

lauda tried to base his whole segwit beliefs on another image from last year, and got proven wrong multiple times.

i advise everyone to have an open mind and do your own investigations. do not follow lauda, because his information is flawed. "he is a good car salesman but he is no mechanic"

he is a good pharmacist dispensing drugs but he is no doctor

franky1
Legendary
*
Offline Offline

Activity: 4214
Merit: 4458



View Profile
March 09, 2016, 05:48:26 PM
 #194

with the 1 MB blocks currently all full, I seriously doubt that's true.

what happens when a payment channel with 10,000 TXs needs to be settled on the blockchain?
My statement is wrong (read above). However, the answer to your question is: it needs 1 transaction on-chain (to close the channel).

30 million users making a transaction.. each.. ONCHAIN to lock their funds into LN

then settlement ONCHAIN filled with 30-60 million outputs to settle the balances with the destinations, while also returning the 'change' to the originators, meaning at least 2 outputs (destination and origin) for every input.

now do the maths
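Taking "now do the maths" literally, under the same throughput assumptions used elsewhere in the thread (2000 TXs per MB, 144 blocks per day), the one-time channel-opening transactions alone would occupy:

```python
TX_PER_DAY = 2000 * 144   # ~288,000 TXs/day at fully-full 1 MB blocks
users = 30_000_000        # one on-chain channel-open per user

days_of_full_blocks = users / TX_PER_DAY
print(round(days_of_full_blocks, 1))  # ~104.2 days of nothing but channel opens
```

This is only the opening side of the argument; settlement outputs would add on top of it, as the post notes.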

pedrog
Legendary
*
Offline Offline

Activity: 2786
Merit: 1031



View Profile
March 09, 2016, 05:54:22 PM
 #195

LN whitepaper is based on assumptions?

franky1
Legendary
*
Offline Offline

Activity: 4214
Merit: 4458



View Profile
March 09, 2016, 05:56:35 PM
 #196

LN whitepaper is based on assumptions?

yep, LN code for bitcoin is not really active.. otherwise we would be using it.

many coders think of best-case scenarios, but hardly ever attack their own code.

even during 2009-2016 the coders say bitcoin works because other people can't hack it, meaning they leave it for others to find out its weaknesses, and pretend everything is perfect until a weakness is found.

David Rabahy
Hero Member
*****
Offline Offline

Activity: 709
Merit: 501



View Profile
March 09, 2016, 05:58:24 PM
 #197

as for the raspberry pi debate:
RasPi 2: 900 MHz
1 GB RAM

RasPi 3: 1.2 GHz (33% increase)
1 GB RAM

33% increase in processing speed in just 1 year..
say a RasPi 2 could handle 2000 signature verifications every couple of minutes (understatement used for basic maths).
a RasPi 3 can handle 2660 signature verifications in the same time.
Wait; that doesn't take into account the quadratic growth aspect.  The correct calculation is: if a RasPi 2 can do 1 sigop in t and 2000²*t in a couple of minutes, then a RasPi 3 can handle 1 sigop in 0.75t (maybe), and so in those same couple of minutes we have to solve:

X²*(0.75t) = 2000²*t
X = 2309

Certainly a nice increase, but not 2660.  Does sigop verification scale linearly with GHz?
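The 2309 figure follows directly from solving X²·(0.75t) = 2000²·t, i.e. assuming per-sigop time scales inversely with clock speed (900 MHz → 1.2 GHz):

```python
import math

base = 2000            # sigops the RasPi 2 handles in the interval
t_factor = 900 / 1200  # per-sigop time on the RasPi 3: 0.75t

# Solve x^2 * (t_factor * t) = base^2 * t for x:
x = base * math.sqrt(1 / t_factor)
print(round(x))        # 2309 -- quadratic cost eats into the 33% clock gain
```

In general, under a quadratic cost a hardware speedup of s only raises capacity by sqrt(s).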
David Rabahy
Hero Member
*****
Offline Offline

Activity: 709
Merit: 501



View Profile
March 09, 2016, 06:07:16 PM
 #198

lauda's evidence is not science or technical analysis.. but an image he is advertising..
Gosh, I'm pretty sure that's what we are all doing (me included).  I do try hard to get the math right, but I am hand-waving a lot along the way trying to impress everyone; sorry.  I do examine each and every posting with a critical eye; nothing is taken as gospel as-is.
franky1
Legendary
*
Offline Offline

Activity: 4214
Merit: 4458



View Profile
March 09, 2016, 06:08:32 PM
Last edit: March 09, 2016, 06:27:03 PM by franky1
 #199

as for the raspberry pi debate:
RasPi 2: 900 MHz
1 GB RAM

RasPi 3: 1.2 GHz (33% increase)
1 GB RAM

33% increase in processing speed in just 1 year..
say a RasPi 2 could handle 2000 signature verifications every couple of minutes (understatement used for basic maths).
a RasPi 3 can handle 2660 signature verifications in the same time.
Wait; that doesn't take into account the quadratic growth aspect.  The correct calculation is: if a RasPi 2 can do 1 sigop in t and 2000²*t in a couple of minutes, then a RasPi 3 can handle 1 sigop in 0.75t (maybe), and so in those same couple of minutes we have to solve:

X²*(0.75t) = 2000²*t
X = 2309

Certainly a nice increase, but not 2660.  Does sigop verification scale linearly with GHz?

your assumption is that quadratics won't be solved by april.
also, if 1000 with a 33% increase = 1333, then 2000 = 2666.
so i was using single-sig before april (i even bracketed that i understated it for simple maths) and comparing it to scaling after april, using technology available in april and code changes after april, which make the quadratic debate meaningless.

but ultimately my maths was simplified, so i will leave you to do the more detailed maths


Yakamoto
Legendary
*
Offline Offline

Activity: 1218
Merit: 1007


View Profile
March 09, 2016, 06:11:05 PM
 #200

I think that it is entirely possible for the Bitcoin network to be scaled so that it can handle all the transactions that would occur during a normal day, but there has to be an implementation of 2MB blocks for right now.

I'm not sure why this has to be such an argument and why it can't be scaled. As long as Bitcoin remains as decentralized as it currently is, would that not be beneficial to the network?
David Rabahy
Hero Member
*****
Offline Offline

Activity: 709
Merit: 501



View Profile
March 09, 2016, 06:14:29 PM
 #201

your assumption is that quadratics won't be solved by april.
True enough; I did assume the same software running on both the RasPi 2 and 3 (which you did too, to get to your 2660 number, so I guess that was fair, no?).

Do we want to then make an effort to recalculate the rest using my 2309 number, and ignore the question about linear scaling with GHz?
David Rabahy
Hero Member
*****
Offline Offline

Activity: 709
Merit: 501



View Profile
March 09, 2016, 06:18:48 PM
 #202

I think that it is entirely possible for the Bitcoin network to be scaled so that it can handle all the transactions that would occur during a normal day, but there has to be an implementation of 2MB blocks for right now.

I'm not sure why this has to be such an argument and why it can't be scaled. As long as Bitcoin remains as decentralized as it currently is, would that not be beneficial to the network?
Agreed; we just have to watch out for the quadratic-growth sigop verification issue.  Also, there is the issue of releasing multiple changes over a short period of time to be managed.  Also, a fee of 5¢/transaction isn't so burdensome, is it?  Finally, we *really* need *all* wallet software to automatically and by default provide a fee that will get the transaction into a block pretty quickly, and set user expectations accordingly.
adamstgBit
Legendary
*
Offline Offline

Activity: 1904
Merit: 1037


Trusted Bitcoiner


View Profile WWW
March 09, 2016, 06:20:28 PM
 #203

I assume hardware is always getting better because HELLO! and libsecp256k1 offers 5x validation speeds.
Ah, I shall research these, but they both sound like software to me.  Hmm, unless "HELLO!" isn't software and you were just indicating that faster hardware is just so obvious.

Certainly faster hardware is coming at us; but a quadratic growth problem will *always* outscale even the greatest conceivable hardware.

A linear improvement in the software is always appreciated, but again it is *only* linear (even if it is a massive 5x) as compared to quadratic growth, i.e. n².  For example, suppose it takes t amount of time to process one sigop.  Then a transaction with n sigops will take approximately n²*t amount of time.  Now we unleash the mighty libsecp256k1 5x improvement.  So, we have n²*(t/5).  When n is small this is great news.  For example, 1²*t vs. 1²*(t/5), or 1t vs. t/5, gets us the full 5x advantage; but 10²*t vs. 10²*(t/5), or 100t vs. 20t, is still taking 20t, not 10t, let alone 2t, to do those 10 sigops.  Moreover, 100 sigops works out to 10,000t vs. 2,000t; and who wants to compute 2,000t for just 100 sigops?  Honestly/sincerely, I am utterly delighted at the 5x offered by libsecp256k1, but against quadratic growth it pales.

1) how insane would a TX have to be to make validation time a real hindrance?
2) can we simply ignore TXs that are that crazy?
3) is it conceivable that a legit TX would be that crazy?

i have no clue. my guess is:

1) extremely insane
2) yes
3) no

but guessing is not cool; it would be nice if some maths could back me up.

phibay
Sr. Member
****
Offline Offline

Activity: 448
Merit: 250



View Profile
March 09, 2016, 06:22:43 PM
 #204

I don't understand why this wasn't implemented in the first place. Since Satoshi had already predicted this, why didn't he put the necessary code in? Sorry, I am no computer expert, so I hope someone could shed some light on this, thanks.
adamstgBit
Legendary
*
Offline Offline

Activity: 1904
Merit: 1037


Trusted Bitcoiner


View Profile WWW
March 09, 2016, 06:27:31 PM
 #205

a fee of 5¢/transaction isn't so burdensome, is it?
5¢ makes things like mixing coins expensive,
and suddenly any TX less than $5 is not as fun to do.
also, it's not so much the cost that bothers me; it's the fact that it makes newbies' lives harder.
we never used to have "My TX is taking forever to confirm" threads before, and i wish they would go away.
let's not forget that ~90% of TX on the network today are low value... i would assume the 5¢ fee is already prohibiting all kinds of legit TX that would otherwise take place on the blockchain.

franky1
Legendary
*
Offline Offline

Activity: 4214
Merit: 4458



View Profile
March 09, 2016, 06:44:02 PM
Last edit: March 09, 2016, 06:57:26 PM by franky1
 #206

your assumption that quadratic won't be solved in april.
True enough, I did assume the same software running on both rasPi2 and 3 (which you did too to get to your 2660 number, so I guess that was fair, no?).

Do we want to then make an effort to recalculate the rest using my 2309 number and ignore the question about linear scaling with GHz?

we are currently throwing random scenario numbers around.
my assumption was rasp2 using old software, where quadratic scaling was an issue, compared to april rasp3, where it wasn't an issue,
making not only a ghz performance increase (ill yield to your 2309) but then a multiple gain due to the code efficiency increase.

what would be best is to use real bitcoin data as the ultimate goal rather than random scenario speculation

I DO NOT TRADE OR ACT AS ESCROW ON THIS FORUM EVER.
Please do your own research & respect what is written here as both opinion & information gleaned from experience. many people replying with insults but no on-topic content substance, automatically are 'facepalmed' and yawned at
Lauda
Legendary
*
Offline Offline

Activity: 2674
Merit: 2965


Terminated.


View Profile WWW
March 09, 2016, 07:00:17 PM
 #207

LN whitepaper is based on assumptions?
No. That was a part explaining how many users could theoretically use Bitcoin at a 1 MB block size limit under certain circumstances. I mistakenly read it and spread false information (I've corrected it everywhere already, I hope). Please read the white-paper before making assumptions.


and libsecp256k1 offers 5x validation speeds.
Okay, I've asked around on IRC and got this:
Quote
in order to sign or verify the tx, each input has to construct a special version of the tx and hash it. so if there are n inputs there are n×n hashes to be done. hence quadratic scaling.
the TLDR I believe is: ECDSA operations are the most computationally expensive part of verifying normal, small transactions, and they scale linearly with the size (number of inputs); whereas if a transaction in current bitcoin has tons of inputs, the bottleneck moves over to the hashing/preparing of the data to be signed, because that time depends on the *square* of the number of inputs.
so usually it's ultra small, but it blows up for large N inputs.
Why doesn't libsecp256k1 have an effect on this?
Quote
because libsecp256k1 is an ECC library so it's only the "ecdsa" part in the above.
Hopefully this helps, though I doubt that many are going to understand it. It certainly isn't easy.
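A rough numeric sketch of what the IRC quote describes (my own toy model; 150 bytes per input is an illustrative guess, and the real legacy SIGHASH serialization is more involved):

```python
# Why legacy (pre-segwit) signature hashing scales quadratically:
# each of the n inputs hashes its own modified copy of the ENTIRE
# transaction, whose size itself grows with n.

def legacy_sighash_bytes(n_inputs, bytes_per_input=150):
    tx_size = n_inputs * bytes_per_input   # whole-tx serialization ~ n
    return n_inputs * tx_size              # hashed n times -> ~ n^2 total

for n in (2, 100, 5000):
    print(f"{n:>5} inputs -> {legacy_sighash_bytes(n):>15,} bytes hashed")
```

libsecp256k1 speeds up only the ECDSA step, which is linear in the number of inputs; the n² hashing term above is untouched by it.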



"The Times 03/Jan/2009 Chancellor on brink of second bailout for banks"
😼 Bitcoin Core (onion)
David Rabahy
Hero Member
*****
Offline Offline

Activity: 709
Merit: 501



View Profile
March 09, 2016, 07:05:23 PM
 #208

if 1000 has 33% increase=1333
2000 =2666
Yes, of course, you are correct, allowing for rounding, increasing 1000 by 33% is 1000 + 1000 * 33% = 1333 and increasing 2000 by 33% is 2000 + 2000 * 33% = 2666.

Hmm, I'll try to explain;

1200MHz - 900MHz = 300MHz
300MHz / 900MHz = 33% -- so we say we have increased the processor frequency by 33%.

Also,

900MHz / 1200MHz = 75% -- so we say instructions will take only 75% as long to run.

So, for a fixed amount of work, we can calculate how long it will take to run.  If a 2000 sigop transaction takes t amount of time on the 900MHz processor then we expect it to only take 0.75t on the faster one.  There are potential hazards in this assertion.  The software might not be strictly compute bound.  Also, not every instruction will necessarily get the same advantage from the speedup, e.g. branches, etc.

*But* this does not indicate how much faster more work will run unless the software scales linearly with frequency.  If the software does indeed scale linearly, then the amount of time it takes to get a fixed amount of work done will decrease linearly and the amount of work that can get done in the same amount of time will increase linearly.

For some unexplained (to me yet) reason, signature verification does not scale linearly.  Instead it scales as the square;

1 → 1
2 → 4
3 → 9
4 → 16
5 → 25
...
n → n²

So, yes, one signature verification could indeed get done in 0.75t but 2000 will take 2000²*(0.75t).
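Putting the two effects together in one sketch (same toy model as before; the 900 MHz vs. 1200 MHz figures are from the rasPi discussion above):

```python
# A faster clock shrinks the per-operation time by a constant factor
# (900/1200 = 0.75), but the operation count still grows as n^2.

def relative_time(n_sigops, old_mhz=900, new_mhz=1200):
    per_op = old_mhz / new_mhz       # each op takes 0.75x as long
    return n_sigops ** 2 * per_op    # total time, in units of t

print(relative_time(1))      # one sigop: 0.75t
print(relative_time(2000))   # 2000 sigops: 2000^2 * 0.75t = 3,000,000t
```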

Gosh, I am terribly sorry if I haven't explained this well.
David Rabahy
Hero Member
*****
Offline Offline

Activity: 709
Merit: 501



View Profile
March 09, 2016, 07:26:06 PM
 #209

a fee of 5¢/transaction isn't so burdensome, is it?
5¢ makes things like mixing coins expensive,
and suddenly any TX less than $5 is not as fun to do.
also, it's not so much the cost that bothers me; it's the fact that it makes newbies' lives harder.
we never used to have "My TX is taking forever to confirm" threads before, and i wish they would go away.
let's not forget that ~90% of TX on the network today are low value... i would assume the 5¢ fee is already prohibiting all kinds of legit TX that would otherwise take place on the blockchain.
Couldn't agree more.  As long as we are careful about the signature verification scaling issue, careful about releasing multiple features around the same time, and we improve the wallet software to include a big enough fee to get transactions through in a timely fashion, I'm all for it.

My personal proposal would be to check on the progress of SegWit (give us the real story here).  If it is indeed really ready for primetime in April then, fine, roll with it.  If there's any chance of slipping, then delay it until it is soup, and in the meantime put out a very simple block size increase *with* a check to reject transactions with more than some limit of signatures/inputs.  Those needing to push through work with more inputs can split their work into multiple transactions.  Either way, I'll be happy.  Both ways should see the normal fees drop to earlier levels, reducing the pain/anxiety.
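The "reject transactions with more than some limit of signatures/inputs" check could look something like this (a minimal sketch; MAX_INPUTS is an arbitrary illustrative number, not a real consensus rule):

```python
# Hypothetical relay/consensus policy: cap the input count so the
# quadratic verification cost stays bounded.  The limit is made up.

MAX_INPUTS = 1000

def acceptable(tx_inputs):
    """True if the transaction's input count is within the cap."""
    return len(tx_inputs) <= MAX_INPUTS

assert acceptable(range(10))        # a normal transaction passes
assert not acceptable(range(5000))  # the "zillions of inputs" case is rejected
```

Those needing more inputs would split their spend across several transactions, as suggested above.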
David Rabahy
Hero Member
*****
Offline Offline

Activity: 709
Merit: 501



View Profile
March 09, 2016, 07:30:25 PM
 #210

ill yield to your 2309
Outstanding.  Thank you; I could not ask for more.
David Rabahy
Hero Member
*****
Offline Offline

Activity: 709
Merit: 501



View Profile
March 09, 2016, 07:38:06 PM
 #211

and libsecp256k1 offers 5x validation speeds.
Okay, I've asked around on IRC and got this:
Quote
in order to sign or verify the tx, each input has to construct a special version of the tx and hash it. so if there are n inputs there are n×n hashes to be done. hence quadratic scaling.
the TLDR I believe is: ECDSA operations are the most computationally expensive part of verifying normal, small transactions, and they scale linearly with the size (number of inputs); whereas if a transaction in current bitcoin has tons of inputs, the bottleneck moves over to the hashing/preparing of the data to be signed, because that time depends on the *square* of the number of inputs.
so usually it's ultra small, but it blows up for large N inputs.
Why doesn't libsecp256k1 have an effect on this?
Quote
because libsecp256k1 is an ECC library so it's only the "ecdsa" part in the above.
Hopefully this helps, though I doubt that many are going to understand it. It certainly isn't easy.
Lauda, again we owe you a debt of gratitude.  You do the work the rest of us are too lazy to do.  Now I am beginning to understand why the verification process scales quadratically; not that my understanding matters per se, but it is nice to know.
adamstgBit
Legendary
*
Offline Offline

Activity: 1904
Merit: 1037


Trusted Bitcoiner


View Profile WWW
March 09, 2016, 07:44:05 PM
 #212

what are the implications of this "quadratic TX validation" you guys are talking about?

we can't have TX with a huge amount of inputs? or something?

David Rabahy
Hero Member
*****
Offline Offline

Activity: 709
Merit: 501



View Profile
March 09, 2016, 07:48:52 PM
 #213

what are the implications of this "quadratic TX validation" you guys are talking about?

we can't have TX with a huge amount of inputs? or something?
Exactly.  If/when a transaction comes in with zillions of inputs then everyone verifying it will be subjected to a long computation.
Lauda
Legendary
*
Offline Offline

Activity: 2674
Merit: 2965


Terminated.


View Profile WWW
March 09, 2016, 07:50:39 PM
 #214

If/when a transaction comes in with zillions of inputs then everyone verifying it will be subjected to a long computation.
An example of such a transaction can be seen here (from last year).

David Rabahy
Hero Member
*****
Offline Offline

Activity: 709
Merit: 501



View Profile
March 09, 2016, 07:52:41 PM
 #215

Lauda, again we owe you a debt of gratitude.  You do the work the rest of us are too lazy to do.  Now I am beginning to understand why the verification process scales quadratically; not that my understanding matters per se, but it is nice to know.
An example of such a transaction can be seen here (from last year).
Holy cow!
adamstgBit
Legendary
*
Offline Offline

Activity: 1904
Merit: 1037


Trusted Bitcoiner


View Profile WWW
March 09, 2016, 07:53:56 PM
 #216

what are the implications of this "quadratic TX validation" you guys are talking about?

we can't have TX with a huge amount of inputs? or something?
Exactly.  If/when a transaction comes in with zillions of inputs then everyone verifying it will be subjected to a long computation.

zillions of inputs!  Grin this i can understand

adamstgBit
Legendary
*
Offline Offline

Activity: 1904
Merit: 1037


Trusted Bitcoiner


View Profile WWW
March 09, 2016, 07:55:38 PM
 #217

Lauda, again we owe you a debt of gratitude.  You do the work the rest of us are too lazy to do.  Now I am beginning to understand why the verification process scales quadratically; not that my understanding matters per se, but it is nice to know.
An example of such a transaction can be seen here (from last year).
Holy cow!
now does this constitute a legitimate TX?

can't we just ignore this type of TX?

David Rabahy
Hero Member
*****
Offline Offline

Activity: 709
Merit: 501



View Profile
March 09, 2016, 08:28:30 PM
 #218

Lauda, again we owe you a debt of gratitude.  You do the work the rest of us are too lazy to do.  Now I am beginning to understand why the verification process scales quadratically; not that my understanding matters per se, but it is nice to know.
An example of such a transaction can be seen here (from last year).
Holy cow!
now does this constitute a legitimate TX?

can't we just ignore this type of TX?
Depends; sorry.  In a sense it is absolutely legit; it follows all of the rules.  But it easily could have been coded less abusively.

Let's first look at a much more classic transaction https://blockchain.info/tx/3a6a7d2456bfd6816ee1164e7c11307fa1c6855ee3116b6a1f8e6a14a98b04c4;

address (amount)
12kDK8snhBD6waJ2NaMB7QvSf4DzMcE9ad (0.21963496 BTC - Output) {call it what you want; this is an input}

1FmUPrnZTBymMT3ktgMx61hQ3JRyWD7NPY - (Unspent) 0.14193496 BTC
13dhrUuUe2MrZsufYGAZnwnyE7k97FRZ1v - (Unspent) 0.0776 BTC

Nice and easy; one input, two outputs; classic.

Now compare that to the big one;

19MxhZPumMt9ntfszzCTPmWNQeh6j6QqP2 (Brainwallet - dog) (0.00001 BTC - Output)
19MxhZPumMt9ntfszzCTPmWNQeh6j6QqP2 (Brainwallet - dog) (0.00001 BTC - Output)
...
19MxhZPumMt9ntfszzCTPmWNQeh6j6QqP2 (Brainwallet - dog) (0.00001 BTC - Output) {this is the 1598th occurrence}
...
{followed by zillions of more inputs}

This could have been coded as;

19MxhZPumMt9ntfszzCTPmWNQeh6j6QqP2 (Brainwallet - dog) (0.01598 BTC - Output)
...

and saved 1597 inputs in this batch alone.  Ok, sure, we could ignore this one; um, we would need to figure out how to recognize such transactions without rejecting non-abusive ones.
adamstgBit
Legendary
*
Offline Offline

Activity: 1904
Merit: 1037


Trusted Bitcoiner


View Profile WWW
March 09, 2016, 08:31:11 PM
 #219

Lauda, again we owe you a debt of gratitude.  You do the work the rest of us are too lazy to do.  Now I am beginning to understand why the verification process scales quadratically; not that my understanding matters per se, but it is nice to know.
An example of such a transaction can be seen here (from last year).
Holy cow!
now does this constitute a legitimate TX?

can't we just ignore this type of TX?
Depends; sorry.  In a sense it is absolutely legit; it follows all of the rules.  But it easily could have been coded less abusively.

Let's first look at a much more classic transaction https://blockchain.info/tx/3a6a7d2456bfd6816ee1164e7c11307fa1c6855ee3116b6a1f8e6a14a98b04c4;

address (amount)
12kDK8snhBD6waJ2NaMB7QvSf4DzMcE9ad (0.21963496 BTC - Output) {call it what you want; this is an input}

1FmUPrnZTBymMT3ktgMx61hQ3JRyWD7NPY - (Unspent) 0.14193496 BTC
13dhrUuUe2MrZsufYGAZnwnyE7k97FRZ1v - (Unspent) 0.0776 BTC

Nice and easy; one input, two outputs; classic.

Now compare that to the big one;

19MxhZPumMt9ntfszzCTPmWNQeh6j6QqP2 (Brainwallet - dog) (0.00001 BTC - Output)
19MxhZPumMt9ntfszzCTPmWNQeh6j6QqP2 (Brainwallet - dog) (0.00001 BTC - Output)
...
19MxhZPumMt9ntfszzCTPmWNQeh6j6QqP2 (Brainwallet - dog) (0.00001 BTC - Output) {this is the 1598th occurrence}
...
{followed by zillions of more inputs}

This could have been coded as;

19MxhZPumMt9ntfszzCTPmWNQeh6j6QqP2 (Brainwallet - dog) (0.01598 BTC - Output)
...

and saved 1597 inputs in this batch alone.  Ok, sure, we could ignore this one; um, we would need to figure out how to recognize such transactions without rejecting non-abusive ones.

that is the key

i say keep it simple stupid

if it has more than a zillion it's not getting included.

David Rabahy
Hero Member
*****
Offline Offline

Activity: 709
Merit: 501



View Profile
March 09, 2016, 08:36:48 PM
 #220

that is the key

i say keep it simple stupid

if it has more than a zillion it's not getting included.
Smiley  You and I think very much alike.  Lauda, can you point us at a really big but totally legit/non-abusive transaction?
sgbett
Legendary
*
Offline Offline

Activity: 2576
Merit: 1087



View Profile
March 09, 2016, 08:43:34 PM
 #221

Setting a block size limit of 1MB was, and continues to be a hacky workaround.
It is certainly not a hacky workaround. It is a limit that was needed (it still is for the time being).

Theory drives development, but in practice sometimes hacky workarounds are needed.
If it can be avoided, not really.

The block size limit was a hacky workaround to the expensive-to-validate issue. An issue that is now mitigated by other, much better solutions, not least a well-incentivised distributed mining economy that is now smart enough to route around such an attack, making it prohibitively expensive to maintain.
So exactly what is the plan, replace one "hacky workaround" with another? Quite a lovely way forward. Segwit is being delivered and it will ease the validation problem and increase the transaction capacity. What is the problem exactly?

Problem: an attacker can create a block that is so expensive to validate that other miners would get stuck validating the block.
Hack: Set an arbitrary limit which is way above what we need right now, but closes the attack vector.
Solution: 1 transaction blocks.

Problem: the block size limit is causing transactions to get stuck in the mempool
Hack: raise the block size limit to 2MB
Solution: remove the block size limit

Segwit isn't a solution designed to fix the block size limit. It's a solution to another problem that right now is undefined, and it is being sold as a solution to a problem that is being actively curated by those who refuse to remove a prior temporary hack.

What problem is it that requires signatures to be segregated into another data structure and not counted towards the fees? Nobody can give a straight answer to that very simple question. Why is witness data priced differently?

"A purely peer-to-peer version of electronic cash would allow online payments to be sent directly from one party to another without going through a financial institution" - Satoshi Nakamoto
*my posts are not investment advice*
Lauda
Legendary
*
Offline Offline

Activity: 2674
Merit: 2965


Terminated.


View Profile WWW
March 09, 2016, 08:46:13 PM
 #222

You and I think very much alike.  Lauda, can you point us at a really big but totally legit/non-abusive transaction?
I don't think that there are many transactions that are so large in nature (both 'abusive' and non). This is the one that I'm aware of. However, you'd also have to define what you mean by "big". Do you mean something quite unusually big (e.g. 100 kB) or something that fills up the entire block? I'd have to do a lot more analysis to try and find one (depending on the type).

Segwit isn't a solution designed to fix the block size limit. Its a solution to another problem that right now is undefined, that is being sold as a solution to a problem that is being actively curated by those who refuse to remove a prior temporary hack.
TX malleability (e.g.) is 'undefined'? Segwit provides additional transaction capacity while carrying other benefits. How exactly is this bad?

What problem is it that requires signatures to be segregated into another data structure and not counted towards the fees. Nobody can give a straight answer to that very simple question. Why is witness data priced differently?
The question would have to be correct for one to be able to answer it. In this case, I have no idea what you are trying to ask.

David Rabahy
Hero Member
*****
Offline Offline

Activity: 709
Merit: 501



View Profile
March 09, 2016, 08:47:34 PM
 #223

Oh, I was wrong; get over it, I am.  Smiley  We can't just add together inputs.  Here's an example address https://blockchain.info/unspent?active=1Gx8ivf4xSCqNNtUXQxoyBFd4FeGZvwCHT&format=html with multiple outputs, 7 in this case.  To spend the entire lot would involve a transaction with 7 inputs, i.e. not one with just 1 input with the net amount.  Bummer.

So, then the question is what happened to 19MxhZPumMt9ntfszzCTPmWNQeh6j6QqP2 that it had so many tiny outputs in it?

Still the owner could have created multiple smaller transactions instead of one large one.
David Rabahy
Hero Member
*****
Offline Offline

Activity: 709
Merit: 501



View Profile
March 09, 2016, 08:55:49 PM
 #224

Imagine this; you agree to sell something to someone and they will pay you in Bitcoins.  It turns out they have an address with a zillion little outputs in it.  So, they go to launch a send to you and find the fee is going to be huge (to cover the cost of all those inputs in a timely fashion).  The deal falls through; Bitcoin loses.

Now we can wonder how their address ended up so fragmented, but what does it matter?  Maybe they were collecting a zillion little drips from faucets.  Whatever; they can't spend it like a large output.
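A back-of-envelope fee estimate for such a fragmented wallet (my own assumptions: ~148 bytes per legacy input, ~34 per output, 10 bytes of overhead, and a made-up fee rate):

```python
# Estimated fee for spending n fragmented outputs in one transaction.
# Size constants are typical legacy P2PKH figures; the fee rate is illustrative.

def tx_fee_satoshis(n_inputs, n_outputs=2, satoshis_per_byte=50):
    size = n_inputs * 148 + n_outputs * 34 + 10   # rough tx size in bytes
    return size * satoshis_per_byte

print(tx_fee_satoshis(1))      # a classic 1-input, 2-output spend
print(tx_fee_satoshis(10000))  # a wallet full of faucet drips
```

The fee grows linearly with the number of inputs, so a zillion-drip wallet is expensive to empty even before the quadratic verification cost enters the picture.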
adamstgBit
Legendary
*
Offline Offline

Activity: 1904
Merit: 1037


Trusted Bitcoiner


View Profile WWW
March 09, 2016, 09:01:11 PM
 #225

Imagine this; you agree to sell something to someone and they will pay you in Bitcoins.  It turns out they have an address with a zillion little outputs in it.  So, they go to launch a send to you and find the fee is going to be huge (to cover the cost of all those inputs in a timely fashion).  The deal falls through; Bitcoin loses.

Now we can wonder how their address ended up so fragmented, but what does it matter?  Maybe they were collecting a zillion little drips from faucets.  Whatever; they can't spend it like a large output.
the needs of the many outweigh the needs of the few, or the spammer.
the TX in question comes from a "stress test"; someone wanted to see how much SPAM bitcoin could swallow at once.
if the TX size were allowed to be >1MB, what would've happened then?
finding a good limit shouldn't be very hard.


Set an arbitrary limit which is way above what we need right now, but closes the attack vector.
agreed.

David Rabahy
Hero Member
*****
Offline Offline

Activity: 709
Merit: 501



View Profile
March 09, 2016, 09:04:52 PM
 #226

Naively increasing the block size isn't the be-all answer.  Sure, when the workload (mempool) is just a bunch of classic small transactions with few inputs then it's great for low fees.  But when a transaction comes along with a huge number of inputs (innocent or malevolent) it will clog up the works, forcing everyone to perform a long computation to verify it.  One of these monsters can ruin your day if the calculation takes significantly longer than 1 block interval.  Or does it?  So, we're behind for a little while but then we catch up.  Or are we saying there are monsters out there that could take hours or even days to verify?

Is there a tendency over time for transactions to become bushier?  When the exchange rate is much larger, the Bitcoin amounts in transactions will tend to be smaller.  Does this lead to fragmentation?
ATguy
Sr. Member
****
Offline Offline

Activity: 423
Merit: 250



View Profile
March 09, 2016, 09:07:22 PM
Last edit: March 09, 2016, 09:32:25 PM by ATguy
 #227

what are the implications of this "quadratic TX validation" you guys are talking about?

we can't have TX with a huge amount of inputs? or somthing?
Exactly.  If/when a transaction comes in with zillions of inputs then everyone verifying it will be subjected to a long computation.

zillions of inputs!  Grin this i can understand


This is what BIP109 fixes, and why the 2 MB hard fork would be useful to activate as soon as possible. For more info on why the reduction to 1.3 GB of signature hashing in BIP109 (the 2 MB hard fork used by Bitcoin Classic) is necessary, and why SegWit does not help with it:


https://www.reddit.com/r/btc/comments/47f0b0/f2pool_testing_classic/d0deh29

Quote
The worst case block validation costs that I know of for a 2.2 GHz CPU for the status quo, SegWit SF, and the Classic 2 MB HF (BIP109) are as follows (estimated):

1 MB (status quo):  2 minutes 30 seconds (19.1 GB hashed)
1 MB + SegWit:      2 minutes 30 seconds (19.1 GB hashed)
2 MB Classic HF:              10 seconds (1.3 GB hashed)

SegWit makes it possible to create transactions that don't hash a lot of data, but it does not make it impossible to create transactions that do hash a lot of data.
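The quoted times line up with a simple throughput estimate (my own arithmetic, assuming roughly 130 MB/s of SHA-256 hashing on a single 2.2 GHz core; the real figure varies by CPU):

```python
# Back-of-envelope: seconds to hash the quoted amounts of sighash data.

def validation_seconds(gb_hashed, mb_per_second=130.0):
    return gb_hashed * 1024 / mb_per_second

print(round(validation_seconds(19.1)))  # 19.1 GB -> ~150 s, i.e. ~2.5 minutes
print(round(validation_seconds(1.3)))   # 1.3 GB  -> ~10 s
```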

David Rabahy
Hero Member
*****
Offline Offline

Activity: 709
Merit: 501



View Profile
March 09, 2016, 09:08:01 PM
 #228

Anyone know where someone is tracking average transaction size (# of inputs) over time?
David Rabahy
Hero Member
*****
Offline Offline

Activity: 709
Merit: 501



View Profile
March 09, 2016, 09:09:48 PM
 #229

what are the implications of this "quadratic TX validation" you guys are talking about?

we can't have TX with a huge amount of inputs? or something?
Exactly.  If/when a transaction comes in with zillions of inputs then everyone verifying it will be subjected to a long computation.

zillions of inputs!  Grin this i can understand


This is what BIP109 fixes, and why the 2 MB hard fork would be useful to activate as soon as possible. For more info on why the reduction to 1.3 GB of signature hashing in BIP109 (the 2 MB hard fork used by Bitcoin Classic) is necessary:


http://8btc.com/forum.php?mod=viewthread&tid=29511&page=1#pid374998

Quote
The worst case block validation costs that I know of for a 2.2 GHz CPU for the status quo, SegWit SF, and the Classic 2 MB HF (BIP109) are as follows (estimated):

1 MB (status quo):  2 minutes 30 seconds (19.1 GB hashed)
1 MB + SegWit:      2 minutes 30 seconds (19.1 GB hashed)
2 MB Classic HF:              10 seconds (1.3 GB hashed)

SegWit makes it possible to create transactions that don't hash a lot of data, but it does not make it impossible to create transactions that do hash a lot of data.

Whoa.  Hmm, is there a 0.12 version of Classic yet?
adamstgBit
Legendary
*
Offline Offline

Activity: 1904
Merit: 1037


Trusted Bitcoiner


View Profile WWW
March 09, 2016, 09:10:47 PM
 #230

Naively increasing the block size isn't the be all answer.  Sure, when the workload (mempool) is just a bunch of classic small transactions with few inputs then it's great for low fees.  But when a transaction comes along with a huge number of inputs (innocent or malevolent) it will clog up the works forcing everyone to perform a long computation to verify it.  One of these monsters can ruin your day if the calculation takes a significantly longer than 1 block interval.  Or does it?  So, we're behind for a little while but then we catch up.  Or are we saying they are monsters out there that could take hours or even days to verify?
we can't allow any transactions, whether innocent or malevolent, to clog up the network. there's no debating this.

Is there a tendency over time for transactions to become bushier?  When the exchange rate is much larger than the Bitcoin amounts in transaction will tend to be smaller.  Does this lead to fragmentation?
yes i believe this is the case; one day, the coins will be way too fragmented.

some kind of "defragmentation" will need to take place at some point.

i don't believe this is a problem for us to worry about... it's too far in the future. ( i'm guessing , i do a lot of guesswork )

sgbett
Legendary
*
Offline Offline

Activity: 2576
Merit: 1087



View Profile
March 09, 2016, 09:13:39 PM
 #231

You and I think very much alike.  Lauda, can you point us at a really big but totally legit/non-abusive transaction?
I don't think that there are many transactions that are so large in nature (both 'abusive' and non). This is the one that I'm aware of. However, you'd also have to define what you mean by "big". Do you mean something quite unusually big (e.g. 100kB) or something that fills up the entire block? I'd have to a lot more analysis to try and find one (depending on the type).

Segwit isn't a solution designed to fix the block size limit. Its a solution to another problem that right now is undefined, that is being sold as a solution to a problem that is being actively curated by those who refuse to remove a prior temporary hack.
TX malleability (e.g.) is 'undefined'? Segwit provides additional transaction capacity while carrying other benefits. How exactly is this bad?

What problem is it that requires signatures to be segregated into another data structure and not counted towards the fees. Nobody can give a straight answer to that very simple question. Why is witness data priced differently?
The question would have to be correct for one to be able to answer it. In this case, I have no idea what you are trying to ask.

Fixing TX Malleability is beneficial to everyone.

These *other benefits* include giving the ability to introduce consensus changes without hard forking. This is because we are told that a contentious hard fork is a terrible thing. How does anyone know this for sure!?

A hard fork is good. (Note the absence of the word contentious.) A hard fork establishes Nakamoto consensus, which is the only consensus vital to the ongoing successful operation of the bitcoin network. The incentives that drive this consensus mechanism are sound. The fear from those that do not see this is overwhelming. To subvert this is to destroy fundamental parts of bitcoin's architecture.

I thought you would understand what I meant when I asked the question, sorry if I have used the wrong terminology or something. I can make it a broader question, then perhaps we can investigate the specifics.

Why does segregated witness change the tx fee calculation?

David Rabahy
Hero Member
*****
Offline Offline

Activity: 709
Merit: 501



View Profile
March 09, 2016, 09:17:13 PM
 #232

https://github.com/bitcoinclassic/bitcoinclassic/releases/tag/v0.12.0cl1
David Rabahy
Hero Member
*****
Offline Offline

Activity: 709
Merit: 501



View Profile
March 09, 2016, 09:22:53 PM
 #233

This is what BIP109 fixes and why 2 MB hard fork is usefull to be activated as soon as possible. For more info why reducing to 1.3 GB Signature operations in BIP109 2 MB hard fork used by Bitcoin Classic is necessary:

http://8btc.com/forum.php?mod=viewthread&tid=29511&page=1#pid374998

Quote
The worst case block validation costs that I know of for a 2.2 GHz CPU for the status quo, SegWit SF, and the Classic 2 MB HF (BIP109) are as follows (estimated):

1 MB (status quo):  2 minutes 30 seconds (19.1 GB hashed)
1 MB + SegWit:      2 minutes 30 seconds (19.1 GB hashed)
2 MB Classic HF:              10 seconds (1.3 GB hashed)

SegWit makes it possible to create transactions that don't hash a lot of data, but it does not make it impossible to create transactions that do hash a lot of data.
Hmm, the link took me to a whole lot of Asian looking characters; am I meant to use a translator?  As such I couldn't find the quoted material.

Question:  Can the BIP109 magic be applied if we have the 1MB block size limit?  If not, why not?
sgbett
Legendary
*
Offline Offline

Activity: 2576
Merit: 1087



View Profile
March 09, 2016, 09:23:59 PM
 #234

Set an arbitrary limit which is way above what we need right now, but closes the attack vector.
agreed.

and last I heard that's exactly how the attack remains mitigated in classic...

As a BU supporter though, we don't need limits!

IMHO the financial incentives are strong enough that block size (in terms of both bandwidth to transmit and CPU to process) is self-limiting. Propagation time is a combination of the two things, and to (over)simplify, propagation time vs. orphan risk is enough to make sure miners don't do stupid things, unless they want to lose money.

The full math is here - David you would probably be interested in this if you haven't already seen it.

http://www.bitcoinunlimited.info/resources/1txn.pdf

The paper also describes how the sigops attack is mitigated through miners simply mining 1-tx blocks whilst validating, then pushing those out to other miners whilst they are still validating the 'poison' block. Rational miners will validate the smaller block, and they will also be able to mine another block on top of it, orphaning the poison block.

The attacker would get one shot, and would quickly be shut out. If you have enough hash rate to be mining blocks yourself its really much more profitable to behave!

YarkoL
Legendary
*
Offline Offline

Activity: 996
Merit: 1012


View Profile
March 09, 2016, 09:25:06 PM
 #235


Why does segregated witness change the tx fee calculation?

My guess: To incentivize users to upgrade to segwit.
That is the carrot; the rising fees of regular txs are the stick.

“God does not play dice"
David Rabahy
Hero Member
*****
Offline Offline

Activity: 709
Merit: 501



View Profile
March 09, 2016, 09:26:00 PM
 #236

Oh, I see, per https://github.com/bitcoin/bips/blob/master/bip-0109.mediawiki, it is just artificial.  The same sigop and hash limits could, in theory, be used at any block size limit.
David Rabahy
Hero Member
*****
Offline Offline

Activity: 709
Merit: 501



View Profile
March 09, 2016, 09:28:50 PM
 #237

The full math is here - David you would probably be interested in this if you haven't already seen it.

http://www.bitcoinunlimited.info/resources/1txn.pdf

The paper also describes how the sigops attack is mitigated through miners simply mining 1tx blocks whilst validating then pushing that out to other miners whilst they are still validating the 'poison' block. Rational miners will validate the smaller block, and they also be able to mine another block on top of this, orphaning the poison block.

The attacker would get one shot, and would quickly be shut out. If you have enough hash rate to be mining blocks yourself its really much more profitable to behave!
Yummy; thanks.
ATguy
Sr. Member
****
Offline Offline

Activity: 423
Merit: 250



View Profile
March 09, 2016, 09:31:21 PM
 #238


Hmm, the link took me to a whole lot of Asian looking characters; am I meant to use a translator?  As such I couldn't find the quoted material.

Question:  Can the BIP109 magic be applied if we have the 1MB block size limit?  If not, why not?

Sorry try this:
https://www.reddit.com/r/btc/comments/47f0b0/f2pool_testing_classic/d0deh29

It can, but it needs to be in a hardfork, so 2MB is useful anyway.



Yes, it has already been available for a few days. Note the sighash limit is reduced to 1.3 GB only after the 2 MB hard fork is activated and the grace period is over; blocks that hash more will become invalid the same way blocks over 1 MB are invalid now.

Notable changes from Bitcoin Core version 0.12.0:


Quote
Bitcoin Classic 0.12.0 is based on Bitcoin Core version 0.12.0, and is compatible with its blockchain files and wallet.
For a full list of changes in 0.12.0, visit Core’s website here.
Additionally, this release includes all changes and additions made in Bitcoin Classic 0.11.2, most notably the increase of the block size limit from one megabyte to two megabytes.

    Opt-in RBF is set to disabled by default. In the next release, opt-in RBF will be completely removed.
    The RPC command "getblockchaininfo" now displays BIP109's (2MB) status.
    The chainstate obfuscation feature from Bitcoin Core is supported, but not enabled

.Liqui Exchange.Trade and earn 24% / year on BTC, LTC, ETH
....Brand NEW..........................................Payouts every 24h. Learn more at official thread
Lauda
Legendary
*
Offline Offline

Activity: 2674
Merit: 2965


Terminated.


View Profile WWW
March 09, 2016, 09:33:14 PM
 #239

Fixing TX Malleability is beneficial to everyone. These *other benefits* include giving the ability to introduce consensus changes without hard forking. This is because we are told that a contentious hard fork is a terrible thing. How does anyone know this for sure!?
So being able to run multiple soft forks at once is a bad thing for you? Include the ability to introduce consensus changes without a HF? Source please.

Why does segregated witness change the tx fee calculation?
I don't really have an answer to this question. This might do:
My guess: To incentivize users to upgrade into segwit.
That is the carrot, and the raising fees of regular txs, the stick.

The same sigop and hash limits could, in theory, be used at any block size limit.
Replacing one limit with another is anything but a nice way of solving problems.

"The Times 03/Jan/2009 Chancellor on brink of second bailout for banks"
😼 Bitcoin Core (onion)
franky1
Legendary
*
Offline Offline

Activity: 4214
Merit: 4458



View Profile
March 09, 2016, 09:34:43 PM
Last edit: March 09, 2016, 09:47:32 PM by franky1
 #240

Naively increasing the block size isn't the be-all answer.  Sure, when the workload (mempool) is just a bunch of classic small transactions with few inputs then it's great for low fees.  But when a transaction comes along with a huge number of inputs (innocent or malevolent) it will clog up the works, forcing everyone to perform a long computation to verify it.  One of these monsters can ruin your day if the calculation takes significantly longer than one block interval.  Or does it?  So, we're behind for a little while but then we catch up.  Or are we saying there are monsters out there that could take hours or even days to verify?

Is there a tendency over time for transactions to become bushier?  When the exchange rate is much larger, the Bitcoin amounts in transactions will tend to be smaller.  Does this lead to fragmentation?

thats under the assumption that with a 2mb buffer.. miners will allow themselves to jump to 1.995mb of data instantly.

the real assumption however, just like in 2013, is that miners knew they suddenly became able to grow past the 500k bug and utilize the 1mb buffer, but it took a couple of years for them to slowly grow,
and that was the decision of the miners.

we should not leave it to blockstream to set a 1.1mb limit every 2 years knowing that miners will be at the max in maybe 4 months.
instead it should be a 2mb buffer, and then let the miners have their own separate preferential rules to grow slowly and just ignore obvious spam transactions until they drop out of the mempool after 48 hours,
knowing that they can happily grow by 0.1mb every 4 months+ without needing to ask blockstream for permission or receive abuse or insults

analogy
knowing one day you are going to have 19 children in the next xx years(you already have 9 and live in a 10bedroom house).
(1.9mb data in x years time, currently at 900k data with a 1mb buffer)
would you go through the headache of 2 years of mortgages and legal stuff to get an 11 bedroom house then another 2 years of headaches for a 12 bedroom house
or would you:
go through one headache and get a 20 bedroom house and then spend the next 20 years impregnating your wife 10 times, slowly gaining a child once every couple years.

i know segwit tries to say, lets stay with 10 rooms and fit in some bunkbeds.. so more kids can fit into the 10 rooms. but the problem is blockstream's other features, like confidential transaction codes, make all the kids obese with twice the amount of clothing that needs storing too.. so the house becomes overcrowded and slow to get everyone ready in the morning.
which leads blockstream, instead of expanding to a 20 bedroom house, to push some of the kids to get adopted by the neighbours (sidechains), only allowed to visit the real family home if they pay rent

I DO NOT TRADE OR ACT AS ESCROW ON THIS FORUM EVER.
Please do your own research & respect what is written here as both opinion & information gleaned from experience. many people replying with insults but no on-topic content substance, automatically are 'facepalmed' and yawned at
YarkoL
Legendary
*
Offline Offline

Activity: 996
Merit: 1012


View Profile
March 09, 2016, 09:37:00 PM
 #241


Replacing 1 limit with another is anything, but a nice way of solving problems.

Tend to agree.. but there is a difference between a hard limit that wards
off an attacker and a hard limit that restricts normal transacting.

ATguy
Sr. Member
****
Offline Offline

Activity: 423
Merit: 250



View Profile
March 09, 2016, 09:46:40 PM
 #242


Why does segregated witness change the tx fee calculation?

My guess: To incentivize users to upgrade into segwit.
That is the carrot, and the raising fees of regular txs, the stick.

This is called a discount, e.g. for the same work (1KB of transactions) miners are supposed to get less in fees if it is a SegWit transaction within Bitcoin Core.

Fortunately miners are free to set their own fee policy, and hopefully there will be a full node client available that requires, for the same work (1KB of transactions), the same fees whether it is a normal or a SegWit transaction (if the soft fork SegWit gets activated, which is uncertain).

David Rabahy
Hero Member
*****
Offline Offline

Activity: 709
Merit: 501



View Profile
March 09, 2016, 10:11:55 PM
 #243

When I play this out in my mind I see this:

1) eventually SegWit gets out the door and is adopted but if it doesn't reduce verification times then what was the point?
2) even if SegWit does reduce verification times it won't ultimately be enough and a hard limit on sigops will be required
3) the block size limit is adjusted up and up but validation times ultimately dominate the mining process
4) multiple block chain-based applications besides Bitcoin are required to handle the workload in a timely fashion
BlindMayorBitcorn
Legendary
*
Offline Offline

Activity: 1260
Merit: 1115



View Profile
March 09, 2016, 10:24:18 PM
 #244


Why does segregated witness change the tx fee calculation?

My guess: To incentivize users to upgrade into segwit.
That is the carrot, and the raising fees of regular txs, the stick.

This is called discount, eg for the same work (1KB transactions) miners are supposed to get less in fees if it is SegWit transaction within Bitcoin Core.

Fortunatelly miners are free to set their fee policy and hopefully there will be full node client available requiring for the same work (1KB transactions) the same fees whether it is normal or SegWit transaction (if soft fork SegWit gets activated which is uncertain).


Afaik Classic has no plans to adopt SegWit. Is this correct? Just how malleable are transactions, exactly?

Forgive my petulance and oft-times, I fear, ill-founded criticisms, and forgive me that I have, by this time, made your eyes and head ache with my long letter. But I cannot forgo hastily the pleasure and pride of thus conversing with you.
adamstgBit
Legendary
*
Offline Offline

Activity: 1904
Merit: 1037


Trusted Bitcoiner


View Profile WWW
March 09, 2016, 10:29:56 PM
 #245


Why does segregated witness change the tx fee calculation?

My guess: To incentivize users to upgrade into segwit.
That is the carrot, and the raising fees of regular txs, the stick.

This is called discount, eg for the same work (1KB transactions) miners are supposed to get less in fees if it is SegWit transaction within Bitcoin Core.

Fortunatelly miners are free to set their fee policy and hopefully there will be full node client available requiring for the same work (1KB transactions) the same fees whether it is normal or SegWit transaction (if soft fork SegWit gets activated which is uncertain).


Afaik Classic has no plans to adopt SegWit. Is this correct? Just how malleable are transactions, exactly?

it's part of their roadmap

when core has it ready they will adopt it.

thats the plan anyway.

David Rabahy
Hero Member
*****
Offline Offline

Activity: 709
Merit: 501



View Profile
March 09, 2016, 10:31:31 PM
 #246

The full math is here - David you would probably be interested in this if you haven't already seen it.

http://www.bitcoinunlimited.info/resources/1txn.pdf

The paper also describes how the sigops attack is mitigated through miners simply mining 1tx blocks whilst validating then pushing that out to other miners whilst they are still validating the 'poison' block. Rational miners will validate the smaller block, and they also be able to mine another block on top of this, orphaning the poison block.

The attacker would get one shot, and would quickly be shut out. If you have enough hash rate to be mining blocks yourself its really much more profitable to behave!
Thank you so much sgbett; that was a really great read.  I must admit I didn't work through all of the math yet, but at first blush it appears OK until:
Quote
The Bitcoin network is naturally limited by block validation and construction times. This puts an upper limit on the network bandwidth of 60KB/sec to transmit the block data to one other peer.
Hmm, really?  There's no way ever to improve on block validation times?  Quantum computers?  Nothing?  That doesn't ring true.
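For scale, the arithmetic implied by the paper's 60 KB/s figure is easy to check (a back-of-the-envelope on the quoted number only, not an endorsement of it):

```python
rate_kb_s = 60          # the paper's claimed validation-limited rate, KB/s
block_interval_s = 600  # average seconds per block

# Time to stream a 1 MB block to a single peer at that rate:
seconds_per_mb_block = 1000 / rate_kb_s
print(f"1 MB block: ~{seconds_per_mb_block:.0f} s per peer")

# Data that could be streamed to one peer over a whole block interval:
ceiling_mb = rate_kb_s * block_interval_s / 1000
print(f"ceiling per interval: ~{ceiling_mb:.0f} MB")
```

So even taking 60 KB/s at face value, the pipe to one peer could carry roughly 36 MB per interval; the open question raised here is whether the validation speed behind that number is really fixed.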
David Rabahy
Hero Member
*****
Offline Offline

Activity: 709
Merit: 501



View Profile
March 09, 2016, 10:34:16 PM
 #247

Whatever; I've removed Core 0.12 and am now running Classic 0.12.  I am unsettled.
ATguy
Sr. Member
****
Offline Offline

Activity: 423
Merit: 250



View Profile
March 09, 2016, 10:34:58 PM
 #248

Afaik Classic has no plans to adopt SegWit. Is this correct? Just how malleable are transactions, exactly?

Classic Roadmap proposal:
https://github.com/bitcoinclassic/documentation/blob/master/roadmap/roadmap2016.md

Phase 3 (Q3-Q4 2016)
Simplified version of Segregated Witness from Core, when it is available.
Incorporate segregated witness work from Core (assuming it is ready), but no special discount for segwit transactions, to keep fee calculation and economics simple


When I play this out in my mind I see this;

1) eventually SegWit gets out the door and is adopted but if it doesn't reduce verification times then what was the point?
2) even if SegWit does reduce verification times it won't ultimately be enough and a hard limit on sigops will be required
3) the block size limit is adjusted up and up but validation times ultimately dominate the mining process
4) multiple block chain-based applications besides Bitcoin are required to handle the workload in a timely fashion


1) malleability fix + a trick to get a bit more transactions in a block by separating signatures into a special data structure for SegWit txs
2) SegWit does not help against these attacks
3) you can decrease sigop limits further for future bigger block sizes, to be certain no attack is possible for any kind of blocksize increase
4) it is not, unless everyone needs to use a decentralized currency soon. Time + tech advancement allow on-chain scaling for Bitcoin unless we get worldwide demand within 10-20 years (not likely mankind replaces fiat so soon)

Lauda
Legendary
*
Offline Offline

Activity: 2674
Merit: 2965


Terminated.


View Profile WWW
March 09, 2016, 10:37:11 PM
 #249

Quote
The worst case block validation costs that I know of for a 2.2 GHz CPU for the status quo, SegWit SF, and the Classic 2 MB HF (BIP109) are as follows (estimated):

1 MB (status quo):  2 minutes 30 seconds (19.1 GB hashed)
1 MB + SegWit:      2 minutes 30 seconds (19.1 GB hashed)
2 MB Classic HF:              10 seconds (1.3 GB hashed)
Two things:
1) These are apparently estimations, so this is inadequate data. Blocks differ in size and in the types of transactions they contain.
2) The comparison makes little sense as they've added a hard limit in Classic.

When I play this out in my mind I see this;

1) eventually SegWit gets out the door and is adopted but if it doesn't reduce verification times then what was the point?
Segwit:
1) More transaction capacity
2) Fixes TX malleability
3) New mechanism for adding OPcodes
4) More flexible security model (fraud proofs)
5) Potential bandwidth decrease for SPV nodes.

Linear scaling of sighash operations

A major problem with simple approaches to increasing the Bitcoin blocksize is that for certain transactions, signature-hashing scales quadratically rather than linearly.
Quote
Segwit resolves this by changing the calculation of the transaction hash for signatures so that each byte of a transaction only needs to be hashed at most twice. This provides the same functionality more efficiently, so that large transactions can still be generated without running into problems due to signature hashing, even if they are generated maliciously or much larger blocks (and therefore larger transactions) are supported.

Who benefits?

Removing the quadratic scaling of hashed data for verifying signatures makes increasing the block size safer. Doing that without also limiting transaction sizes allows Bitcoin to continue to support payments that go to or come from large groups, such as payments of mining rewards or crowdfunding services.

The modified hash only applies to signature operations initiated from witness data, so signature operations from the base block will continue to require lower limits.
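The quadratic-vs-linear difference quoted above can be illustrated with a toy byte count (a deliberately simplified model: real sighash costs depend on script types, and the helper names and 100-byte input size are my own assumptions):

```python
def legacy_hashed_bytes(n_inputs: int, input_size: int = 100) -> int:
    """Legacy sighash: each input's signature check re-hashes
    roughly the whole transaction, so cost grows as n * tx_size."""
    tx_size = n_inputs * input_size
    return n_inputs * tx_size          # O(n^2) in the input count

def segwit_hashed_bytes(n_inputs: int, input_size: int = 100) -> int:
    """Segwit-style digest: each transaction byte is hashed at most
    twice thanks to precomputed midstates, so cost grows linearly."""
    tx_size = n_inputs * input_size
    return 2 * tx_size                 # O(n)

for n in (100, 1_000, 10_000):
    print(f"{n:>6} inputs: legacy {legacy_hashed_bytes(n):>14,} B,"
          f" segwit {segwit_hashed_bytes(n):>10,} B")
```

At 10,000 hypothetical 100-byte inputs (a ~1 MB transaction), the legacy scheme hashes ~10 GB while the linear scheme hashes ~2 MB, the same order of magnitude as the "19.1 GB hashed" worst case cited earlier in the thread.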


Whatever; I've removed Core 0.12 and am now running Classic 0.12.  I am unsettled.
I thought better of you at times. I guess I was wrong.

"The Times 03/Jan/2009 Chancellor on brink of second bailout for banks"
😼 Bitcoin Core (onion)
ATguy
Sr. Member
****
Offline Offline

Activity: 423
Merit: 250



View Profile
March 09, 2016, 10:48:08 PM
 #250

Quote
The worst case block validation costs that I know of for a 2.2 GHz CPU for the status quo, SegWit SF, and the Classic 2 MB HF (BIP109) are as follows (estimated):

1 MB (status quo):  2 minutes 30 seconds (19.1 GB hashed)
1 MB + SegWit:      2 minutes 30 seconds (19.1 GB hashed)
2 MB Classic HF:              10 seconds (1.3 GB hashed)
Two things:
1) These are apparently estimations; so this is inadequate data. Blocks differ in size, and types of transactions in them.
2) The comparison makes little sense as they've added a hard limit in Classic.

You're wrong; follow the posts in the thread URL. The hard sigop limit is the way to fight such attacks. All times are for the worst-case blocks that can be used for attacks, on the same computer, to make the comparison fair.

Lauda
Legendary
*
Offline Offline

Activity: 2674
Merit: 2965


Terminated.


View Profile WWW
March 09, 2016, 10:53:03 PM
 #251

Your wrong, follow the thread url. The hard sigop limit is the way to fight such attacks. All times are for worst case blocks than can be used for attacks and the same computer to make the comparsion fair.
I don't have time to follow that thread. So this is the worst case scenario? If we assume that this is true, then Segwit seems pretty good. Added capacity without an increase in the amount of hashed data and no additional limitations. Did I understand this correctly?

"The Times 03/Jan/2009 Chancellor on brink of second bailout for banks"
😼 Bitcoin Core (onion)
adamstgBit
Legendary
*
Offline Offline

Activity: 1904
Merit: 1037


Trusted Bitcoiner


View Profile WWW
March 09, 2016, 10:56:46 PM
 #252

Whatever; I've removed Core 0.12 and am now running Classic 0.12.  I am unsettled.
I thought better of you at times. I guess I was wrong.
there is no wrong choice.
he asked questions, thought about things, weighed the pros and cons, and made a choice, Classic.

franky1
Legendary
*
Offline Offline

Activity: 4214
Merit: 4458



View Profile
March 09, 2016, 10:58:30 PM
 #253


1) More transaction capacity
2) Fixes TX mallaeability
3) New mechanism for adding OPcodes
4) More flexible security model (fraud proofs)
5) Potential bandwidth decrease for SPV nodes.


point 2 will be fixed, yet blockstream introduced RBF as the new way to 'con' merchants. ultimately there is still a problem

point 3 actually reduces point 1. its like having a 10 bedroom house each room has a bunk bed. but then you let the neighbours kids take the top bunk. such as the fat kid known as confidential payment codes(250bytes extra bloat per tx). while making your own kids get adopted by the neighbours(sidechains) and you charge them 60cents each time they want to come home for the day, hoping they eventually stay at the neighbours because the family home is too crowded

point 5, along with no-witness mode, along with pruned mode reduces the real fullnode count which makes point 4 more of a headache if there are less real honest fullnodes.


ATguy
Sr. Member
****
Offline Offline

Activity: 423
Merit: 250



View Profile
March 09, 2016, 11:04:04 PM
 #254

Your wrong, follow the thread url. The hard sigop limit is the way to fight such attacks. All times are for worst case blocks than can be used for attacks and the same computer to make the comparsion fair.
I don't have time to follow that thread. So this is the worst case scenario? If we assume that this is true, then then Segwit seems pretty good. Added capacity without an increase in the amount of hashed data and no additional limitations.

Yes, you're right, these are the worst-case blocks known for such attacks, and it can't get worse with SegWit, only better. But because a block can still be created without any SegWit transactions, the attack threat remains the same with future SegWit as it is today.

SegWit is useful, and Bitcoin Core should start cooperating with Bitcoin Classic to get both activated, SegWit and BIP109. Without this, both activations will be much more difficult Wink

AlexGR
Legendary
*
Offline Offline

Activity: 1708
Merit: 1049



View Profile
March 09, 2016, 11:28:17 PM
 #255

Quote
By Moore's Law, we can expect hardware speed to be 10 times faster in 5 years and 100 times faster in 10. Even if Bitcoin grows at crazy adoption rates, I think computer speeds will stay ahead of the number of transactions.
Did it increase by tenfold in 5 years? Not even close. Satoshi did not have the adequate data here.

There is a lot of processing power being untapped right now. This is typically found in GPUs though:

The max Radeon single card (not a 2x) of 2008 was the 4870, doing:

1.2 TFLOPS (single) / 0.24 TFLOPS (double) / 115 GB/s RAM

The max Radeon single card (not a 2x) of 2015 was the R9 Fury X, doing:

8.6 TFLOPS (single) / 0.53 TFLOPS (double) / 512 GB/s RAM

----
Data from: https://en.wikipedia.org/wiki/List_of_AMD_graphics_processing_units
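Taking those Radeon figures at face value, the growth rates are quick to check (same numbers as above; 2008 to 2015 is 7 years):

```python
# Figures quoted above: HD 4870 (2008) vs R9 Fury X (2015)
cards = {
    "single precision (TFLOPS)": (1.2, 8.6),
    "double precision (TFLOPS)": (0.24, 0.53),
    "memory bandwidth (GB/s)":   (115, 512),
}
years = 7

for label, (old, new) in cards.items():
    factor = new / old
    per_year = factor ** (1 / years)
    print(f"{label}: {factor:.1f}x over {years} years (~{per_year:.2f}x/yr)")

# Satoshi's "10x in 5 years" works out to about 1.58x per year:
print(f"Moore's-law-style benchmark: {10 ** (1 / 5):.2f}x/yr")
```

Single precision grew ~7.2x in 7 years (~1.33x/yr), behind the ~1.58x/yr implied by "10x in 5 years", while double precision lagged far more.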

---

And two imgs from NVIDIA (2-3 y. old but anyway)



sgbett
Legendary
*
Offline Offline

Activity: 2576
Merit: 1087



View Profile
March 09, 2016, 11:41:31 PM
 #256

Thank you so much sgbett; that was a really great read.  I must admit I didn't work through all of the math yet but at first blush it appears ok until;
Quote
The Bitcoin network is naturally limited by block validation and construction times. This puts an upper limit on the network bandwidth of 60KB/sec to transmit the block data to one other peer.
Hmm, really?  There's no way ever to improve on block validation times?  Quantum computers?  Nothing?  That doesn't ring true.

I am totally with you. I read 60kb/s more as a theoretical *minimum* rate we can achieve Smiley

"A purely peer-to-peer version of electronic cash would allow online payments to be sent directly from one party to another without going through a financial institution" - Satoshi Nakamoto
*my posts are not investment advice*
sgbett
Legendary
*
Offline Offline

Activity: 2576
Merit: 1087



View Profile
March 09, 2016, 11:48:18 PM
 #257

Your wrong, follow the thread url. The hard sigop limit is the way to fight such attacks. All times are for worst case blocks than can be used for attacks and the same computer to make the comparsion fair.
I don't have time to follow that thread. So this is the worst case scenario? If we assume that this is true, then then Segwit seems pretty good. Added capacity without an increase in the amount of hashed data and no additional limitations. Did I understand this correctly?

You didn't logic it correctly.

the block size limit was used to protect against "poison" blocks.

The block size limit is in the way of natural growth.

A different solution to "poison" blocks exists now so block size limit is redundant.

Segwit has nothing to do with any of this

"A purely peer-to-peer version of electronic cash would allow online payments to be sent directly from one party to another without going through a financial institution" - Satoshi Nakamoto
*my posts are not investment advice*
AliceWonderMiscreations
Full Member
***
Offline Offline

Activity: 182
Merit: 107


View Profile WWW
March 10, 2016, 12:37:58 AM
 #258

Quote
By Moore's Law, we can expect hardware speed to be 10 times faster in 5 years and 100 times faster in 10. Even if Bitcoin grows at crazy adoption rates, I think computer speeds will stay ahead of the number of transactions.
Did it increase by tenfold in 5 years? Not even close. Satoshi did not have the adequate data here.

There is a lot of processing power being untapped right now. This is typically found in GPUs though:

Not in my GPUs.

I typically use Intel HD when my processor / board supports it.

When I build a Xeon box without an Intel HD GPU, then *if* I install a video card at all it is a GeForce 405 w/ 512MB of RAM, clearly not a powerhouse as it gets all the power it needs from the PCIe bus, meaning it pulls at most 75W (that model is intended for OEMs but you can find it if you search. It also comes with 1GB of RAM but 512 MB suffices for me)

I hereby reserve the right to sometimes be wrong
adamstgBit
Legendary
*
Offline Offline

Activity: 1904
Merit: 1037


Trusted Bitcoiner


View Profile WWW
March 10, 2016, 01:31:30 AM
 #259

Quote
By Moore's Law, we can expect hardware speed to be 10 times faster in 5 years and 100 times faster in 10. Even if Bitcoin grows at crazy adoption rates, I think computer speeds will stay ahead of the number of transactions.
Did it increase by tenfold in 5 years? Not even close. Satoshi did not have the adequate data here.

There is a lot of processing power being untapped right now. This is typically found in GPUs though:

Not in my GPUs.

I typically use Intel HD when my processor / board supports it.

When I build a Xeon box without Intel HD GPU, well then *if* I install a video card at all it is Geforce 405 w/ 512MB of RAM, clearly not a powerhouse as it gets all the power it needs from the PCIe bus meaning it at most pulls 512W (that model is intended for OEMs but you can find it you search. Also comes in 1GB of RAM but 512 MB suffices for me)

even that low power GPU is probably better than any CPU no?

SpiryGolden
Hero Member
*****
Offline Offline

Activity: 812
Merit: 500



View Profile
March 10, 2016, 02:33:38 AM
 #260

Whatever; I've removed Core 0.12 and am now running Classic 0.12.  I am unsettled.

You are a great example. You studied, you did your math, you weighed all sides. Even more, you really tried to side with Bitcoin Core and understand them as much as possible. In the end you made your choice. This is a great example of Bitcoin's freedom of choice. You just took part in the amazing democratic system that Bitcoin has had built in since its creation: the right to choose what you think is best for the network. But with your decision come great implications, one of them being wrongfully judged by other sides. I didn't judge you for siding with Bitcoin Core. Yet they will judge you harshly for siding with Classic.


This should be the right of every user: the right to choose the best possible code. At the moment Bitcoin Classic is one, and yes, they will include SegWit once the code is finished. Why is this hidden from people's view? Well, it's in the best interest of Blockstream, and it will spend as much as it can to keep it that way: to hide people's right to choose and to mask this debate as much as possible.

AliceGored
Member
**
Offline Offline

Activity: 117
Merit: 10


View Profile
March 10, 2016, 02:58:31 AM
 #261

Whatever; I've removed Core 0.12 and am now running Classic 0.12.  I am unsettled.

You are a great example. You studied, you did your math, you weighed in all sides. Even more, you really tried to side with Bitcoin Core and understand them as much as possible. In the end you made your choice. This is a great example of Bitcoin freedom of choice. You just took part of the amazing democratic system that Bitcoin has build in since it's creation. The right too choose what you think is best for the network. But with your decision comes great implication one of them is being wrongfully judged by other sides. I didn't judged you for siding with Bitcoin Core. Yet they will judge you harsh for siding with Classic.


This should be the right of every users to do. The right too choose the best possible code. At the moment Bitcoin Classic is one, and yes they will include SegWit once the code is finished. Why is hidden from people view? Well it's in best interest of Blockstream. And it will spend as much as it can to be it like that. To hide people right too chose and to mask this debate as much as possible.

It's almost like he doesn't have $76 million of other peoples' money riding on not understanding. If miners blindly bloat blocks with free tx, the incentives of the system never worked to begin with. No sense in trying to rube goldberg yourself out of that mess, unless you've got pre-ipo stock [and personal pride(!)] riding on it.
Lauda
Legendary
*
Offline Offline

Activity: 2674
Merit: 2965


Terminated.


View Profile WWW
March 10, 2016, 06:26:59 AM
 #262

SegWit is usefull and Bitcoin Core should start cooperate with Bitcoin Classic to get both activated, SegWit and BIP109. Without this, both activations will be much more difficult Wink
Cooperate with a high school science project? Interesting suggestion. A combination of both would be too much and BIP109 is really useless with its added limitations (a 2 MB block size limit without those is better).

Did it increase by tenfold in 5 years? Not even close. Satoshi did not have the adequate data here.
There is a lot of processing power being untapped right now. This is typically found in GPUs though:

The max radeon single card (not a 2x) of 2008 was, the 4870 doing
1.2 TFLOP (single) / 0.24 TFLOP (double) / 115gb/s ram
The max radeon single card (not a 2x) of 2015 was R9 Fury 9 doing:
8.6 TFLOP (single) / 0.53 TFLOP (double) / 512 gb/s ram
Two things: 1) I didn't ask for GPUs; 2) Still not tenfold.

You didn't logic it correctlly. the block size limit was used to protect against "poison" blocks. The block size limit is in the way of natural growth.
There's nothing natural about it.

A different solution to "poison" blocks exists now so block size limit is redundant. Segwit has nothing to do with any of this
It isn't a solution. It is a workaround that prevents certain types of transactions.

You are a great example. You studied, you did your math, you weighed in all sides. Even more, you really tried to side with Bitcoin Core and understand them as much as possible. In the end you made your choice. This is a great example of Bitcoin freedom of choice. .
1) He certainly didn't "study" enough and is unaware of a lot of things (e.g. Segwit benefits).
2) The math is high school level and provides inadequate data.
3) It is freedom of choice as long as one sides with Classic, right? Nonsense. Wake me up when the project gets real developers. Seems switching people over was easier than I thought.

"The Times 03/Jan/2009 Chancellor on brink of second bailout for banks"
😼 Bitcoin Core (onion)
AliceWonderMiscreations
Full Member
***
Offline Offline

Activity: 182
Merit: 107


View Profile WWW
March 10, 2016, 06:57:29 AM
 #263

Quote
By Moore's Law, we can expect hardware speed to be 10 times faster in 5 years and 100 times faster in 10. Even if Bitcoin grows at crazy adoption rates, I think computer speeds will stay ahead of the number of transactions.
Did it increase by tenfold in 5 years? Not even close. Satoshi did not have adequate data here.

There is a lot of processing power being untapped right now. This is typically found in GPUs though:

Not in my GPUs.

I typically use Intel HD when my processor / board supports it.

When I build a Xeon box without an Intel HD GPU, then *if* I install a video card at all it is a GeForce 405 with 512 MB of RAM. Clearly not a powerhouse: it gets all the power it needs from the PCIe bus, meaning it pulls at most 75 W (the slot maximum). That model is intended for OEMs, but you can find it if you search. It also comes with 1 GB of RAM, but 512 MB suffices for me.

Even that low-power GPU is probably better than any CPU, no?

Not sure; it sure as hell isn't better with the crappy open-source nvidia driver. Maybe with the proprietary kernel-tainting driver it is.

I hereby reserve the right to sometimes be wrong
BlindMayorBitcorn
Legendary
*
Offline Offline

Activity: 1260
Merit: 1115



View Profile
March 10, 2016, 06:59:39 AM
 #264

SegWit is useful, and Bitcoin Core should start cooperating with Bitcoin Classic to get both activated, SegWit and BIP109. Without this, both activations will be much more difficult. Wink
Cooperate with a high school science project? Interesting suggestion. A combination of both would be too much and BIP109 is really useless with its added limitations (a 2 MB block size limit without those is better).

Did it increase by tenfold in 5 years? Not even close. Satoshi did not have adequate data here.
There is a lot of processing power being untapped right now. This is typically found in GPUs though:

The fastest single-GPU Radeon (not a 2x) of 2008 was the HD 4870, doing
1.2 TFLOPS (single) / 0.24 TFLOPS (double) / 115 GB/s memory bandwidth.
The fastest single-GPU Radeon (not a 2x) of 2015 was the R9 Fury X, doing:
8.6 TFLOPS (single) / 0.53 TFLOPS (double) / 512 GB/s memory bandwidth.
Two things: 1) I didn't ask for GPUs; 2) Still not tenfold.

You didn't logic it correctly. The block size limit was used to protect against "poison" blocks. The block size limit is in the way of natural growth.
There's nothing natural about it.

A different solution to "poison" blocks exists now, so the block size limit is redundant. Segwit has nothing to do with any of this.
It isn't a solution. It is a workaround that prevents certain types of transactions.

You are a great example. You studied, you did your math, you weighed all sides. Even more, you really tried to side with Bitcoin Core and understand them as much as possible. In the end you made your choice. This is a great example of Bitcoin freedom of choice.
1) He certainly didn't "study" enough and is unaware of a lot of things (e.g. Segwit benefits).
2) The math is high school level and provides inadequate data.
3) It is freedom of choice as long as one sides with Classic, right? Nonsense. Wake me up when the project gets real developers. Seems switching people over was easier than I thought.

Jeff and Gavin are fake developers. Luke Jr. is real. Check.

Forgive my petulance and oft-times, I fear, ill-founded criticisms, and forgive me that I have, by this time, made your eyes and head ache with my long letter. But I cannot forgo hastily the pleasure and pride of thus conversing with you.
Lauda
Legendary
*
Offline Offline

Activity: 2674
Merit: 2965


Terminated.


View Profile WWW
March 10, 2016, 07:05:54 AM
 #265

Jeff and Gavin are fake developers.
You obviously don't understand what was meant by "real". I still stand by my statement. Both have contributed to Bitcoin, but their contributions are not as big as you think (and have been especially lacking lately, since some time prior to Classic). Gavin would be better off being a speaker such as Antonopoulos. Besides, Garzik doesn't even agree with the grace period introduced in BIP109 at all (he said 3-6 months should be the minimum), yet Gavin ignored him completely on that one.

Luke Jr. is real. Check.
Might be a bit arrogant and unusual at times, but still is.

BlindMayorBitcorn
Legendary
*
Offline Offline

Activity: 1260
Merit: 1115



View Profile
March 10, 2016, 07:15:24 AM
 #266

Jeff and Gavin are fake developers. Luke Jr. is real. Check.
You obviously don't understand what was meant by "real". I still stand by my statement. Both have contributed to Bitcoin, but their contributions are not as big as you think (and have been especially lacking lately, since some time prior to Classic). Gavin would be better off being a speaker such as Antonopoulos. Besides, Garzik doesn't even agree with the grace period introduced in BIP109 at all (he said 3-6 months should be the minimum), yet Gavin ignored him completely on that one.

I'm not any happier about the prospect of a contentious hard fork than you are. But "P2P Settlement Layer" just doesn't have the same ring to it. Popular opinion is that Core has lost the plot.

Lauda
Legendary
*
Offline Offline

Activity: 2674
Merit: 2965


Terminated.


View Profile WWW
March 10, 2016, 07:18:52 AM
 #267

Popular Minority opinion is that Core has lost the plot. 
Fixed that for you.


SpiryGolden
Hero Member
*****
Offline Offline

Activity: 812
Merit: 500



View Profile
March 10, 2016, 07:21:41 AM
 #268

Luke Jr. is real. Check.
Might be a bit arrogant and unusual at times, but still is.

Lauda, even if we have our big divergences I think you will laugh at these ones also.

https://www.reddit.com/r/Buttcoin/comments/4936kw/lukejr_is_a_seriously_a_super_crazy_person_quotes/

A bit is an understatement regarding Luke Jr. Sadly, yes, we do have people with mental issues as part of the Bitcoin Core development team. Which is sad; even sadder, Luke Jr. isn't the only one.

Anyway Lauda, I know you really hate me, and yet you have kept your integrity, keeping this discussion on topic and not banning those who contest you. I do respect that. I respect debating, and I do accept and agree with you that SegWit is a good feature, but sadly it isn't enough.

I do respect others' decision to stick with Bitcoin Blockstream Core, and I understand why: because they are satisfied that they somehow do something to increase the capacity of the network a bit. Not much, but at least they have a roadmap and a slight increase. We gained that after almost 2 years of debating and splits and people quitting and a lot of drama.

I want to have Bitcoin in safe hands, and I do consider Jeff & Gavin good leaders. Regarding developers, we both know that some of Core's developers will move to Classic, and also anybody with experience can join; if we don't have superstars, that is not an issue in my opinion.

Bitcoin Core still has a chance to redeem themselves if they put a 2 MB block size increase on the roadmap. At the meeting in Hong Kong they promised that; that's why they got the pools on their side. But it isn't stated in the roadmap.

Also, regarding your correction of "Minority": that minority is increasing, leaving only a majority of 68%, which is losing ground. Let's imagine that on /r/Bitcoin there was a topic "Choose your network" with a very detailed explanation of each version of Bitcoin and its effects, correctly written of course, unbiased.

What do you think would happen?
BlindMayorBitcorn
Legendary
*
Offline Offline

Activity: 1260
Merit: 1115



View Profile
March 10, 2016, 07:22:13 AM
 #269

Popular Minority opinion is that Core has lost the plot.  
Fixed that for you.



I haven't been following the fake node war too closely. I'll take your word for this.




not much, but at least they have a roadmap and a slight increase. We gained that after almost 2 years  

See, this is a popular opinion. People everywhere think maybe our balls dropped off. If we're going to grow the user base we have to take risks.  Cool

topiOleg
Full Member
***
Offline Offline

Activity: 174
Merit: 100



View Profile
March 10, 2016, 07:50:55 AM
 #270

Jeff and Gavin are fake developers. Luke Jr. is real. Check.
You obviously don't understand what was meant by "real". I still stand by my statement. Both have contributed to Bitcoin, but their contributions are not as big as you think (and have been especially lacking lately, since some time prior to Classic).


I find an interesting correlation between Bitcoin price increases and the number of GitHub commits to Bitcoin from Jeff and Gavin. Maybe when both become very active again, the Bitcoin price will stop stagnating. Big hopes for this, so we need competition, because with Blockstream there is not much hope of a Bitcoin price increase when they don't feel the need to keep Bitcoin adoption unrestricted; some of them even feel the block size should be lowered (and thus the number of people using Bitcoin)!

YarkoL
Legendary
*
Offline Offline

Activity: 996
Merit: 1012


View Profile
March 10, 2016, 08:33:08 AM
 #271


A bit is an understatement regarding Luke Jr. Sadly, yes, we do have people with mental issues as part of the Bitcoin Core development team. Which is sad; even sadder, Luke Jr. isn't the only one.

By no means. For example, I have some "mental issues", that is, I entertain some notions that a great deal of people may deem crazy. And since quite a lot of Bitcoiners are of a somewhat individualistic bent, I take it as given that I will encounter a lot of notions here that seem crazy or reprehensible to me.

Thankfully, none of that matters in the least with regard to the issues we are discussing here.

“God does not play dice"
sgbett
Legendary
*
Offline Offline

Activity: 2576
Merit: 1087



View Profile
March 10, 2016, 08:54:04 AM
 #272

You didn't logic it correctly. The block size limit was used to protect against "poison" blocks. The block size limit is in the way of natural growth.
There's nothing natural about it.

A different solution to "poison" blocks exists now, so the block size limit is redundant. Segwit has nothing to do with any of this.
It isn't a solution. It is a workaround that prevents certain types of transactions.

If Bitcoin adoption continues, then natural growth occurs; ergo the block size limit is in the way. This is a reasonable assertion, so what followed still stands. If you dispute that Bitcoin adoption is going to continue, then you have a point. Do you dispute that? Do you think Bitcoin will not grow?

Given that Bitcoin will continue to grow, that there is a known lead time to safely deploy code that removes the limit (75% activation and 28 days grace), that it cannot be reliably known that Bitcoin growth will not exceed current limits, and that exceeding those limits affects the normal operation of the network, that code must be deployed now to mitigate the risk of the network not being able to operate normally.

Certain types of transactions are the problem. Excluding those types of transactions is a solution.

It's a workaround the same way the 1MB block size limit was.

If it causes problems and it can be solved differently then it can be removed.

"A purely peer-to-peer version of electronic cash would allow online payments to be sent directly from one party to another without going through a financial institution" - Satoshi Nakamoto
*my posts are not investment advice*
johnyj
Legendary
*
Offline Offline

Activity: 1988
Merit: 1012


Beyond Imagination


View Profile
March 10, 2016, 09:17:41 AM
 #273

Everything belongs to core!  Cheesy



Lauda
Legendary
*
Offline Offline

Activity: 2674
Merit: 2965


Terminated.


View Profile WWW
March 10, 2016, 01:20:18 PM
 #274

I haven't been following the fake node war too closely. I'll take your word for this.
The number of nodes is an unreliable metric, especially over a short amount of time.

Lauda, even if we have our big divergences I think you will laugh at these ones also. https://www.reddit.com/r/Buttcoin/comments/4936kw/lukejr_is_a_seriously_a_super_crazy_person_quotes/
A bit is an understatement regarding Luke Jr. Sadly, yes, we do have people with mental issues as part of the Bitcoin Core development team. Which is sad; even sadder, Luke Jr. isn't the only one.
No. I'm not going to attack an individual because of their beliefs in this case. You can be a huge nutjob and still be a great developer/engineer. These attacks on individuals help no one.

Anyway Lauda, I know you really hate me, and yet you have kept your integrity, keeping this discussion on topic and not banning those who contest you. I do respect that. I respect debating, and I do accept and agree with you that SegWit is a good feature, but sadly it isn't enough.
AFAIK nobody was ever banned due to a difference in opinions. I would never punish someone because of such. Why isn't Segwit enough?

We gained that after almost 2 years of debating and splits and people quitting and a lot of drama.
So the block size increase was supposed to happen in 2014? This doesn't make sense.

I want to have Bitcoin in safe hands, and I do consider Jeff & Gavin good leaders. Regarding developers, we both know that some of Core's developers will move to Classic, and also anybody with experience can join; if we don't have superstars, that is not an issue in my opinion.
Quote
Trace Mayer: I would like to watch a basketball game with evenly skilled players on both sides, but really we have Jordan's Bulls and then we have a couple of junior high school players on the other side. It is so unevenly matched that it makes you wonder why there's even a debate at all. On the other side (Classic) you have Gavin Andresen, Mike Hearn (note: quit already), Jeff Garzik, and that's pretty much it...
There is near-unanimous agreement among developers, like 23 out of 25 (the exceptions being Garzik and Andresen). Why is it that the two people who disagree haven't really contributed any significant code in years?
You have to admit that he makes a good point here. He's asking for commits and that's what would matter from this viewpoint. You'd also have to admit that Classic is relying on the work of Core/Core contributors helping Classic (if it comes to a fork).

Also, regarding your correction of "Minority": that minority is increasing, leaving only a majority of 68%, which is losing ground. Let's imagine that on /r/Bitcoin there was a topic "Choose your network" with a very detailed explanation of each version of Bitcoin and its effects, correctly written of course, unbiased.
No. That number is randomly picked. There is no way to measure exactly how much of the total ecosystem (miners, developers, users, merchants) are in support of a contentious HF.

n0ne
Hero Member
*****
Offline Offline

Activity: 2562
Merit: 548


8ombard - Pick, Play, Prosper!


View Profile WWW
March 10, 2016, 01:41:12 PM
 #275

It is possible to grow larger than the Visa network, given the current user growth rate relative to Bitcoin's time of existence.
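As a back-of-envelope comparison, using the 15 million Internet purchases per day from Satoshi's email in the OP and an assumed average of ~2,000 transactions per 1 MB block (a figure used elsewhere in this thread, not a measured network rate):

```python
# Rough throughput comparison; both inputs are thread figures.
SECONDS_PER_DAY = 24 * 60 * 60

visa_tps = 15_000_000 / SECONDS_PER_DAY   # ~174 tx/s for Satoshi's Visa figure
bitcoin_tps = 2000 / 600                  # ~3.3 tx/s: ~2,000 tx per 10-minute block

gap = visa_tps / bitcoin_tps              # ~52x more capacity needed to match it
```

So matching even Satoshi's 2009 Visa figure would take roughly fifty times the throughput of 1 MB blocks.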

btcusury
Sr. Member
****
Offline Offline

Activity: 433
Merit: 260


View Profile
March 10, 2016, 02:12:59 PM
 #276

The elephant in the room just will not go away though. The limit was a temporary fix.

You do realise that a block as big as 64 MB will take a lot longer than 10 minutes to verify on anything but the most costly hardware?

(so if Satoshi had "got it right" with the original block size then Bitcoin wouldn't be able to even produce blocks as quickly as 10 minutes and no-one but corporations would be able to afford to even run full nodes - but hey who cares about inconvenient truths such as those)


Premise:
- 64MB blocks are allowed
- 64MB blocks take over 10 minutes to verify

Question:
What financial incentive exists for a miner to produce such a block?

That's not even really the appropriate question. The real question you should be asking is "What would prevent someone who wants to destroy the network from producing such a block?"

But you don't ask that because ... (you tell me).

David Rabahy
Hero Member
*****
Offline Offline

Activity: 709
Merit: 501



View Profile
March 10, 2016, 02:17:50 PM
 #277

I am truly unsettled; I have switched from Core 0.11 to Classic 0.11 to Core 0.12 to Classic 0.12 and it is totally possible/likely I will switch again.  Perhaps I will try using more advanced mathematics, above and beyond high school maths. Smiley

#0: Reducing unproductive contention in the Bitcoin community is a personal goal.

#1: Right now my top interest is in the vulnerability to transactions with lots of inputs that lead to long compute times for verification.  One wonders why we don't see bad actors using this against Bitcoin relentlessly.  Maybe there are fewer malicious folks attacking than I hear about?  Maybe the bad guys don't know how much this can hurt?

#2: Secondarily I would like to see a concerted effort be made to 1) educate users about setting fees well to facilitate quick confirmations, i.e. to avoid long/painful/anxiety-causing commit times and 2) enhancing wallets to make it automatic and default to have high enough fees.

#3: Thirdly; I do highly prefer adoption much more than increasing fees which means to me capacity.  Increasing fees will have to come eventually but in the meantime I would happily sacrifice fees until adoption is really widespread.

#4: Fourthly; Is there something to the Bitcoin Unlimited stuff?

#5: Lastly; all of those other features/functions like transaction malleability, etc.  Perhaps these should be higher on my list but I haven't dug into them yet; sorry.

With those in mind;

block size limit, e.g. 2MB; helps with #3, hurts #1, we should resist doing it just because it seems obvious, obvious is not a reliable attribute

SegWit; helps with #3, hurts #1, introduces other complexities that might have subtle consequences

limit inputs: all by itself this is great for #1, seems simple enough, workaround is trivial, i.e. create multiple small transactions
Lauda
Legendary
*
Offline Offline

Activity: 2674
Merit: 2965


Terminated.


View Profile WWW
March 10, 2016, 02:22:21 PM
 #278

SegWit; helps with #3, hurts #1, introduces other complexities that might have subtle consequences
It doesn't hurt number one: it either 1) leaves #1 unchanged, or 2) improves #1 thanks to linear scaling of signature hashing.

#4: Fourthly; Is there something to the Bitcoin Unlimited stuff?
Not really, no.

Lastly; all of those other features/functions like transaction malleability, etc.  Perhaps these should be higher on my list but I haven't dug into them yet; sorry.
As an example, Mt.Gox claimed that the "hack" occurred due to transaction malleability (I don't want to get into this in depth, as some believe differing theories; that's why I said "claimed").
Quote
3) New mechanism for adding OPcodes
4) More flexible security model (fraud proofs)
5) Potential bandwidth decrease for SPV nodes.
You should look into those. Even Gavin praises Segwit.
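The "linear scaling" point refers to how much data gets hashed during signature checks. A hypothetical illustration (the function names and the ~180-byte input size are made up for the sketch; the shape of the costs reflects that legacy sighash re-serializes roughly the whole transaction once per input, while a BIP143-style scheme reuses precomputed midstates):

```python
# Bytes hashed while verifying all signatures of an n-input transaction.
def legacy_hashed_bytes(n, input_size=180):
    # Legacy: the ~whole transaction (n * input_size bytes) is hashed
    # once per input -> O(n^2) total.
    return n * (n * input_size)

def segwit_hashed_bytes(n, input_size=180):
    # BIP143-style: each input hashes a fixed amount of data -> O(n) total.
    return n * input_size

ratio = legacy_hashed_bytes(1000) / segwit_hashed_bytes(1000)  # 1000x at n=1000
```

This is why pathological many-input transactions get so much cheaper to verify under Segwit, even at the same block size.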


adamstgBit
Legendary
*
Offline Offline

Activity: 1904
Merit: 1037


Trusted Bitcoiner


View Profile WWW
March 10, 2016, 02:33:00 PM
 #279

The elephant in the room just will not go away though. The limit was a temporary fix.

You do realise that a block as big as 64 MB will take a lot longer than 10 minutes to verify on anything but the most costly hardware?

(so if Satoshi had "got it right" with the original block size then Bitcoin wouldn't be able to even produce blocks as quickly as 10 minutes and no-one but corporations would be able to afford to even run full nodes - but hey who cares about inconvenient truths such as those)


Premise:
- 64MB blocks are allowed
- 64MB blocks take over 10 minutes to verify

Question:
What financial incentive exists for a miner to produce such a block?

That's not even really the appropriate question. The real question you should be asking is "What would prevent someone who wants to destroy the network from producing such a block?"

But you don't ask that because ... (you tell me).


Premise:
- 64MB blocks are allowed
- 64MB blocks take over 10 minutes to verify
- Someone wants to destroy the network ( 100 million dollar investment )
- Other miners want to let him destroy the network

Conclusion:
BITCOIN IS DOOOM!


there's nothing stopping the other miners from orphaning the attacker's blocks... if his blocks cause them much inconvenience, for whatever reason, they will orphan them...



nickenburg
Sr. Member
****
Offline Offline

Activity: 434
Merit: 511


View Profile
March 10, 2016, 02:38:55 PM
 #280

It is possible to grow larger than the visa network with the current user rate in accordance to the time of existence.

I would say the Bitcoin network definitely could get bigger than the Visa network, and its time of existence is relatively small, so it still has time to grow as well.
If you look on blockchain.info and see all the transactions already being made,
you realize this is really big already; with the number of people all over the world who are using it, we could form a Bitcoin country already!
franky1
Legendary
*
Offline Offline

Activity: 4214
Merit: 4458



View Profile
March 10, 2016, 02:47:08 PM
 #281

I am truly unsettled; I have switched from Core 0.11 to Classic 0.11 to Core 0.12 to Classic 0.12 and it is totally possible/likely I will switch again.  Perhaps I will try using more advanced mathematics, above and beyond high school maths. Smiley

#0: Reducing unproductive contention in the Bitcoin community is a personal goal.

#1: Right now my top interest is in the vulnerability to transactions with lots of inputs that lead to long compute times for verification.  One wonders why we don't see bad actors using this against Bitcoin relentlessly.  Maybe there are fewer malicious folks attacking than I hear about?  Maybe the bad guys don't know how much this can hurt?

#2: Secondarily I would like to see a concerted effort be made to 1) educate users about setting fees well to facilitate quick confirmations, i.e. to avoid long/painful/anxiety-causing commit times and 2) enhancing wallets to make it automatic and default to have high enough fees.

#3: Thirdly; I do highly prefer adoption much more than increasing fees which means to me capacity.  Increasing fees will have to come eventually but in the meantime I would happily sacrifice fees until adoption is really widespread.

#4: Fourthly; Is there something to the Bitcoin Unlimited stuff?

#5: Lastly; all of those other features/functions like transaction malleability, etc.  Perhaps these should be higher on my list but I haven't dug into them yet; sorry.

With those in mind;

block size limit, e.g. 2MB; helps with #3, hurts #1, we should resist doing it just because it seems obvious, obvious is not a reliable attribute

SegWit; helps with #3, hurts #1, introduces other complexities that might have subtle consequences

limit inputs: all by itself this is great for #1, seems simple enough, workaround is trivial, i.e. create multiple small transactions

#0 This can be solved by Core adding the code, so as not to cause the contention they cry about. Then, with all the different code bases (btcd, bitcoinj, Classic, Core, etc.) having the code, it is just a waiting game to see if miners decide to upgrade too, knowing all the users are ready to accept such changes.

#1 This can be solved by knowing REALISTICALLY the time it actually takes to sort through a transaction with 50 inputs, 100 inputs, 1000 inputs, and then seeing how many times people have made a genuine (not attack) transaction like that, in order to set an arbitrary rule to ignore transactions with that many inputs, just like they ignore transactions with no fee.

#2 1) Also educate people how to code/make transactions with the least bloat/random use of inputs, to help them decrease their own cost of sending a tx as well as help reduce the chance of verification-time problems.
#2 2) It's a little too early to push the transaction fee up. I'd say it should be a slow process over years to decades, not a rush job of trying to make 3000+ transactions every 10 minutes carry high fees, where the fee naturally increases because only 2000 transactions (average) are allowed per block.
Imagine it: 3000 tx pay 4c; 2000 get into block 1 and 1000 into block 2. Then, of the next 3000 tx, only 1000 fit into block 2 and 2000 go into block 3, meaning that if the third batch of transactions has any hope of getting into block 4 or 5, they will likely have to pay a premium to fight off the competition.
So within 5 blocks the price begins to rise.
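The backlog arithmetic in that example can be sketched directly (the 3,000 arrivals and 2,000 confirmations per 10-minute interval are the illustration's own numbers, not measured rates):

```python
# Unconfirmed-transaction backlog when demand exceeds block capacity.
def backlog_after(blocks, arrivals=3000, capacity=2000):
    backlog = 0
    for _ in range(blocks):
        backlog = max(0, backlog + arrivals - capacity)  # 1,000 left over each block
    return backlog

# Five blocks in, 5,000 transactions are waiting: fee competition starts.
```

The backlog grows linearly for as long as demand stays above capacity, which is exactly the dynamic that bids fees up.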

The fee is not an essential part of mining income; it is just a small bonus. It does not need to, nor should it, offset the reward for many years, because the reward is part of the mechanism that supports the speculative, deflationary price rise. Flooding miners with lots of coins will make them spend them just as fast, which not only affects the valuation of bitcoin but also pushes miners to add more ASICs, raising the difficulty and making bitcoin more centralized as the smaller pools lose the competition. It also pushes customers/users away from bitcoin, because a fee does not actually guarantee the very next block (no guarantee of confirmation in 10 minutes).
Pushing the fee has a knock-on effect on many aspects, and there is no logic in pushing it too fast.

#3 Agreed. Allowing twice as much buffer room (2 MB) to grow, or even 4x as much (2 MB + Segwit), allows for NATURAL, SLOW growth without pushing or irritation, and without begging the dev team for just an extra spoonful of buffer every couple of months, knowing it takes 2 years to get the spoon.

#4 I personally am not in the BU camp, XT camp, Classic camp or Core camp. I just want more buffer space without needing to be spoon-fed by developers. BU is premised on there being no need for a hard limit, with miners setting their own preferential soft limits; but it's best to ask someone who has researched it in more detail (hopefully they reply unbiasedly).

#5 I agree it doesn't need to be Classic OR Core. It needs both, in combination.

Edit: your final sentence answered #1 for yourself, and more elegantly than I did.

I DO NOT TRADE OR ACT AS ESCROW ON THIS FORUM EVER.
Please do your own research & respect what is written here as both opinion & information gleaned from experience. many people replying with insults but no on-topic content substance, automatically are 'facepalmed' and yawned at
franky1
Legendary
*
Offline Offline

Activity: 4214
Merit: 4458



View Profile
March 10, 2016, 02:57:12 PM
Last edit: March 10, 2016, 03:25:59 PM by franky1
 #282

Premise:
- 64MB blocks are allowed
- 64MB blocks take over 10 minutes to verify
- Someone wants to destroy the network ( 100 million dollar investment )
- Other miners want to let him destroy the network

Conclusion:
BITCOIN IS DOOOM!


there's nothing stopping the other miners from orphaning the attacker's blocks... if his blocks cause them much inconvenience, for whatever reason, they will orphan them...




EDIT: my now-removed statement was on the basis of a 1 MB attack. Now let's make it 64 MB:
if it takes 10 minutes to validate a malicious 1 MB block, then it takes 640 minutes to validate a 64 MB one. So the malicious miner is 638 minutes behind the competitors before it even begins to hash a block, because neutral miners will ignore such stupid transactions, just like they ignore 0-fee transactions, and make a standard block using more rationally sized transactions.
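Under the same linear-scaling assumption (10 minutes of validation per malicious MB is the premise above, not a benchmark):

```python
# Time an attacker's block spends in validation, assuming cost scales
# linearly with block size at an assumed 10 minutes per malicious MB.
def validation_minutes(block_mb, minutes_per_mb=10):
    return block_mb * minutes_per_mb

attack_64mb = validation_minutes(64)            # 640 minutes before anyone builds on it
normal_interval = 10                            # honest blocks keep arriving every ~10 min
blocks_missed = attack_64mb // normal_interval  # ~64 honest block intervals lost
```

In other words, the attacker's block sits in validation while dozens of honest blocks arrive, so it is orphaned long before it matters.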

sgbett
Legendary
*
Offline Offline

Activity: 2576
Merit: 1087



View Profile
March 10, 2016, 02:57:24 PM
 #283

The elephant in the room just will not go away though. The limit was a temporary fix.

You do realise that a block as big as 64 MB will take a lot longer than 10 minutes to verify on anything but the most costly hardware?

(so if Satoshi had "got it right" with the original block size then Bitcoin wouldn't be able to even produce blocks as quickly as 10 minutes and no-one but corporations would be able to afford to even run full nodes - but hey who cares about inconvenient truths such as those)


Premise:
- 64MB blocks are allowed
- 64MB blocks take over 10 minutes to verify

Question:
What financial incentive exists for a miner to produce such a block?

That's not even really the appropriate question. The real question you should be asking is "What would prevent someone who wants to destroy the network from producing such a block?"

But you don't ask that because ... (you tell me).


I do not ask the question because an attacker intending to destroy the network would not produce such a block, because such a block does not destroy the network.

This transaction monopolization can only happen IF all the other mining pools choose to mine on top of that monopolizing mining pool's blocks. Yet, doing so will result in lower profitability for the other (the majority) of pools since they (having smaller validation capacity) must always mine 1-txn blocks and are therefore unable to reap transaction fees. This is an unstable situation: if a single mining pool chooses to ignore the large block and is able to find a small competing block while other pools are still validating a large block, it is in the other pools' best interest to switch to this new sibling. By switching, the other pools reduce the risk that they are mining on top of an invalid block, and can mine blocks with transactions. But if mining pools know that the majority will switch to a discovered sibling, it is rational for all pools except for the producer of the large block to search for a sibling rather than produce a 1-txn block.

Knowing that an expensive-to-validate block is mitigated as an attack vector, the question remains as to why somebody would produce such a block.

The implication being that if you have the money/hardware to produce blocks, your most profitable course of action is to just mine blocks honestly.

adamstgBit
Legendary
*
Offline Offline

Activity: 1904
Merit: 1037


Trusted Bitcoiner


View Profile WWW
March 10, 2016, 03:20:46 PM
 #284

miners are free to determine if a block is valid or not by whatever standards they choose. they are by no means forced to accept any block. the majority of miners need to accept the block for this block to be valid. therefore any argument that starts with " if a miner is a bad actor " is not valid.  you need to start with " if majority of miners are bad actors ". At best the bad guy miner could cause some disruption during the time it take all the other miners to agree to whatever new rules they need to implement to start ignoring the attackers blocks. these new rules do not need a HF.

Is there anything false in this statement?

franky1
Legendary
*
Offline Offline

Activity: 4214
Merit: 4458



View Profile
March 10, 2016, 03:35:09 PM
 #285

Miners are free to determine whether a block is valid by whatever standards they choose; they are by no means forced to accept any block. The majority of miners need to accept a block for it to be valid. Therefore any argument that starts with "if a miner is a bad actor" is not valid; you need to start with "if the majority of miners are bad actors". At worst, the bad-actor miner could cause some disruption during the time it takes all the other miners to agree on whatever new rules they need to implement to start ignoring the attacker's blocks. These new rules do not need a HF.

Is there anything false in this statement?


Nope.
I agree with your statement; in fact it adds another level on top of mine, because my statement was under the premise that 64MB blocks were acceptable as standard, and that there was little to no code to void a competitor's solved block due to malicious bloat, only code for miners to ignore or accept malicious transactions within their own attempts at making a block.

It's similar to the 0-fee ignore game: some miners ignore transactions without fees, but if a competitor had a block solution containing fee-less txs, the ignoring miner would still accept its competitor's solution.

But you are quite right that miners could also reject a block if they do not like the content of a competitor's block.

I DO NOT TRADE OR ACT AS ESCROW ON THIS FORUM EVER.
Please do your own research & respect what is written here as both opinion & information gleaned from experience. many people replying with insults but no on-topic content substance, automatically are 'facepalmed' and yawned at
btcusury
Sr. Member
****
Offline Offline

Activity: 433
Merit: 260


View Profile
March 10, 2016, 04:16:56 PM
 #286

The elephant in the room just will not go away though. The limit was a temporary fix.

You do realise that a block as big as 64 MB will take a lot longer than 10 minutes to verify on anything but the most costly hardware?

(so if Satoshi had "got it right" with the original block size then Bitcoin wouldn't be able to even produce blocks as quickly as 10 minutes and no-one but corporations would be able to afford to even run full nodes - but hey who cares about inconvenient truths such as those)


Premise:
- 64MB blocks are allowed
- 64MB blocks take over 10 minutes to verify

Question:
What financial incentive exists for a miner to produce such a block?

That's not even really the appropriate question. The real question you should be asking is "What would prevent someone who wants to destroy the network from producing such a block?"

But you don't ask that because ... (you tell me).


I do not ask the question because, an attacker intending to destroy the network would not produce such a block, because such a block does not destroy the network.

This transaction monopolization can only happen IF all the other mining pools choose to mine on top of that monopolizing mining pool's blocks. Yet, doing so will result in lower profitability for the other (the majority) of pools since they (having smaller validation capacity) must always mine 1-txn blocks and are therefore unable to reap transaction fees. This is an unstable situation - if a single mining pool chooses to ignore the large block and is able to find a small competing block while other pools are still validating a large block, it is in the other pools best interest to switch to this new sibling6. By switching, the other pools reduce the risk that they are mining on top of an invalid block, and can mine blocks with transactions. But if mining pools know that the majority will switch to a discovered sibling, it is rational for all pools except for the producer of the large block to search for a sibling rather than produce a 1-txn block.

Knowing that an expensive to validate block is mitigated as an attack vector, the question remains as to why somebody would produce such a block?

The implication being that if you have the money/hardware to produce blocks, your most profitable course of action is to just mine blocks honestly.

You are missing my point because you are assuming that the motivations of all participants is that of monetary profit. You can't just ignore the observation that decentralized cryptocurrency represents the only real threat to the central banksters in a very long time. The point is the same for 64MB as it is for 2MB, only more obvious in that the problems 64MB would create are much greater.



dwma
Sr. Member
****
Offline Offline

Activity: 405
Merit: 250


View Profile
March 10, 2016, 04:35:49 PM
 #287

Quote
By Moore's Law, we can expect hardware speed to be 10 times faster in 5 years and 100 times faster in 10. Even if Bitcoin grows at crazy adoption rates, I think computer speeds will stay ahead of the number of transactions.
Did it increase tenfold in 5 years? Not even close. Satoshi did not have adequate data here.
Quote
Satoshi certainly didn’t do much (if any) analysis of the scaling limitations of Bitcoin.
We need not appeal to authority. Just because Satoshi invented it, that does not mean that he knows the answers to everything.

I suspect Moore's law is dead, and I'm not sure it ever existed for spinning drives or SSDs, which will be the problem, not CPUs.
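For what it's worth, Satoshi's "10 times faster in 5 years and 100 times faster in 10" does follow from the classic Moore's-law cadence of capacity doubling roughly every 18 months; a quick check (the 18-month doubling period is the assumption being tested, not a fact from the thread):

```python
# Exponential growth implied by a fixed doubling period.
def growth(years, doubling_years=1.5):
    """Growth factor after `years`, doubling every `doubling_years`."""
    return 2 ** (years / doubling_years)

print(f"5 years:  {growth(5):.1f}x")   # roughly the 10x Satoshi cited
print(f"10 years: {growth(10):.1f}x")  # roughly the 100x Satoshi cited
```

Whether hardware actually kept that pace after 2009 is exactly what this post disputes.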
AliceWonderMiscreations
Full Member
***
Offline Offline

Activity: 182
Merit: 107


View Profile WWW
March 10, 2016, 06:16:35 PM
 #288

Quote
By Moore's Law, we can expect hardware speed to be 10 times faster in 5 years and 100 times faster in 10. Even if Bitcoin grows at crazy adoption rates, I think computer speeds will stay ahead of the number of transactions.
Did it increase tenfold in 5 years? Not even close. Satoshi did not have adequate data here.
Quote
Satoshi certainly didn’t do much (if any) analysis of the scaling limitations of Bitcoin.
We need not appeal to authority. Just because Satoshi invented it, that does not mean that he knows the answers to everything.

I suspect Moore's law is dead, and I'm not sure it ever existed for spinning drives or SSDs, which will be the problem, not CPUs.

In 1998 I paid $200 for a 2 GB Seagate SCSI drive so I could install MKLinux DR3 on my 233 MHz Beige G3.

Looking at newegg, I can get a 4 TB platter drive for about $200 or a 1 TB SSD (500 GB for a top brand) for $200.

I hereby reserve the right to sometimes be wrong
sgbett
Legendary
*
Offline Offline

Activity: 2576
Merit: 1087



View Profile
March 10, 2016, 06:19:11 PM
 #289

The elephant in the room just will not go away though. The limit was a temporary fix.

You do realise that a block as big as 64 MB will take a lot longer than 10 minutes to verify on anything but the most costly hardware?

(so if Satoshi had "got it right" with the original block size then Bitcoin wouldn't be able to even produce blocks as quickly as 10 minutes and no-one but corporations would be able to afford to even run full nodes - but hey who cares about inconvenient truths such as those)


Premise:
- 64MB blocks are allowed
- 64MB blocks take over 10 minutes to verify

Question:
What financial incentive exists for a miner to produce such a block?

That's not even really the appropriate question. The real question you should be asking is "What would prevent someone who wants to destroy the network from producing such a block?"

But you don't ask that because ... (you tell me).


I do not ask the question because, an attacker intending to destroy the network would not produce such a block, because such a block does not destroy the network.

This transaction monopolization can only happen IF all the other mining pools choose to mine on top of that monopolizing mining pool's blocks. Yet doing so will result in lower profitability for the other pools (the majority), since they (having smaller validation capacity) must always mine 1-txn blocks and are therefore unable to reap transaction fees. This is an unstable situation: if a single mining pool chooses to ignore the large block and is able to find a small competing block while other pools are still validating the large block, it is in the other pools' best interest to switch to this new sibling. By switching, the other pools reduce the risk that they are mining on top of an invalid block, and can mine blocks with transactions. But if mining pools know that the majority will switch to a discovered sibling, it is rational for all pools except the producer of the large block to search for a sibling rather than produce a 1-txn block.

Knowing that an expensive-to-validate block is mitigated as an attack vector, the question remains: why would somebody produce such a block?

The implication is that if you have the money/hardware to produce blocks, your most profitable course of action is to mine blocks honestly.

You are missing my point because you are assuming that the motivations of all participants is that of monetary profit. You can't just ignore the observation that decentralized cryptocurrency represents the only real threat to the central banksters in a very long time. The point is the same for 64MB as it is for 2MB, only more obvious in that the problems 64MB would create are much greater.


No such assumption was made. The quoted text describes a method by which *other* miners, who are motivated by money, act to thwart this kind of attack: by mining a smaller sibling block, building on that, and ultimately orphaning the attacker's block.

There are simpler ways that a theoretical bad actor with no profit motivation and lots of money could destroy Bitcoin. Setting up a mining operation to craft malicious, expensive-to-validate blocks doesn't seem like a good angle to me!

rizzlarolla
Hero Member
*****
Offline Offline

Activity: 812
Merit: 1001


View Profile
March 10, 2016, 07:52:55 PM
 #290


I like CIYAM. He's grumpier than I am. Wow.  Angry

I also agree with his ad sig attitude.
And I like his individualism, his strength of mind.
And I like his intelligence.

I have been trying to post in his support.
I failed. No post.

I just don't agree with his take on the way forward, any more, sorry.

I don't even want to get technical, CIYAM will wipe the floor with me.
I doubt CIYAM will therefore hold me in any regard. Unfortunately.

I didn't want a block size increase. I think I do now.
A limited increase. (1.5) 2mb. If it is "easily" doable.
I understand, in the scheme of things, it is.

Leave segwit for now. (and hopefully forever)
2mb is needed, safe, obvious, and simplistic?
(needed for the steady growth we have watched over the years, as I think David R said)



monsanto
Legendary
*
Offline Offline

Activity: 1241
Merit: 1005


..like bright metal on a sullen ground.


View Profile
March 10, 2016, 08:43:23 PM
 #291

Sure don't sound like Szabo.
QuestionAuthority
Legendary
*
Offline Offline

Activity: 2156
Merit: 1393


You lead and I'll watch you walk away.


View Profile
March 10, 2016, 11:17:06 PM
 #292


I like CIYAM. He's grumpier than I am. Wow.  Angry

I also agree with his ad sig attitude.
And I like his individualism, his strength of mind.
And I like his intelligence.

I have been trying to post in his support.
I failed. No post.

I just don't agree with his take on the way forward, any more, sorry.

I don't even want to get technical, CIYAM will wipe the floor with me.
I doubt CIYAM will therefore hold me in any regard. Unfortunately.

I didn't want a block size increase. I think I do now.
A limited increase. (1.5) 2mb. If it is "easily" doable.
I understand, in the scheme of things, it is.

Leave segwit for now. (and hopefully forever)
2mb is needed, safe, obvious, and simplistic?
(needed for the steady growth we have watched over the years, as I think David R said)


I like CIYAM's point of view as well. I don't think he's grumpy; he never used to be like this. He's just like me: he's just sick of the bullshit.

SpiryGolden
Hero Member
*****
Offline Offline

Activity: 812
Merit: 500



View Profile
March 11, 2016, 05:12:27 AM
 #293

Meanwhile, there is 66% of the network left running Bitcoin Core code.

If it gets to 50%, what does that mean? They've lost the majority?
Lauda
Legendary
*
Offline Offline

Activity: 2674
Merit: 2965


Terminated.


View Profile WWW
March 11, 2016, 07:15:59 AM
 #294

Meanwhile, there is 66% of the network left running Bitcoin Core code.
Do you not read what I write to you, or is your brain unable to comprehend what I'm saying? I'm not certain which is the case. The number of nodes is not a reliable metric, especially not over a short time frame, because people can spin up a lot of nodes in a very small amount of time. Additionally, it is possible to run fake nodes as well.

If it gets to 50%, what does that mean? They've lost the majority?
It means nothing. Exactly nothing, especially when a Sybil-attack is being promoted.

He's just like me, he's just sick of the bullshit.
Welcome to the club.

"The Times 03/Jan/2009 Chancellor on brink of second bailout for banks"
😼 Bitcoin Core (onion)
BittBurger (OP)
Hero Member
*****
Offline Offline

Activity: 924
Merit: 1001


View Profile
March 12, 2016, 12:31:42 AM
 #295

Can someone please explain to me how we have these Satoshi quotes and yet there is still any discussion / debate / disagreement on it?  

I don't understand how this can be viewed as anything other than what it is:  A complete deviation from the expected design of Bitcoin for the financial profit of a company that wants to take transactions off the main chain.  

And how the lack of urgency to scale is anything other than what it is: a means to an end for ensuring the Lightning Network will make money.

This is an honest, genuine question.  Until I found the Satoshi quotes I was on the fence on this issue.  

Now I'm just like "what the actual fuck?"

-B-

Owner: "The Times 03/Jan/2009 Chancellor on brink of second bailout for banks"
View it on the Blockchain | Genesis Block Newspaper Copies
AlexGR
Legendary
*
Offline Offline

Activity: 1708
Merit: 1049



View Profile
March 12, 2016, 01:57:19 AM
 #296

Can someone please explain to me how we have these Satoshi quotes and yet there is still any discussion / debate / disagreement on it?  

The quote says one thing, reality says another.

Reality = right now you can't get to a block size where you can do Visa-level tx capacity.

And this is similarly true not only for Bitcoin, but for Bitcoin-based clones and other blockchain-based systems.
madjules007
Sr. Member
****
Offline Offline

Activity: 400
Merit: 250



View Profile
March 12, 2016, 02:25:57 AM
Last edit: March 12, 2016, 03:00:17 AM by madjules007
 #297

The limit was a temporary fix.

Shortly before disappearing, Satoshi also repeated over and over that there were more ways that Bitcoin could be successfully DOS attacked than he could count.

The limit (in conjunction with transaction fees) was intended as a DOS-attack and spam deterrent. Do you have any evidence that those risks have been mitigated? Do you have any evidence to suggest that 2MB blocks won't be filled to capacity right away? That would leave us where we are today: with no scaling solutions, wondering how much more un-optimized throughput the network can safely handle.

I don't believe a second, compatible implementation of Bitcoin will ever be a good idea.  So much of the design depends on all nodes getting exactly identical results in lockstep that a second implementation would be a menace to the network.

Satoshi's description of how to increase the block size limit was also clearly in the context of a software update. Not an incompatible implementation of bitcoin that essentially attacks the network and intentionally forks from all other client nodes. He even thought compatible alternatives were a bad idea.

One could make the argument that Satoshi thought we could raise the limit as/when needed (as if we needed his authority anyway). In that respect, the question of necessity is subjective and debatable, and it is not immediately clear that an increase to 2MB now, with no attempts to make throughput more scalable is necessary.

One could not make the argument that Satoshi thought we should increase the block size limit through a contentious hard fork.

All of this is moot. We should be talking in terms of "what is best for bitcoin" -- with respect to users, nodes and miners -- not talking past each other with interpretations of things Satoshi said 5-6 years ago. Not only was Satoshi wrong about some things, but the state of bitcoin has changed a lot. Hell, if we left the codebase as he originally wrote it, those 184 billion bitcoins from the August 2010 value overflow incident would still be with us. Better to work towards bitcoin's principles than aimlessly trying to fit some arbitrary interpretation of Satoshi's words.

AliceGored
Member
**
Offline Offline

Activity: 117
Merit: 10


View Profile
March 12, 2016, 03:33:39 AM
 #298

The limit was a temporary fix.

Shortly before disappearing, Satoshi also repeated over and over that there were more ways that Bitcoin could be successfully DOS attacked than he could count.

Well then, as he talked about "other ways" should we disregard this:

It can be phased in, like:

if (blocknumber > 115000)
    maxblocksize = largerlimit

It can start being in versions way ahead, so by the time it reaches that block number and goes into effect, the older versions that don't have it are already obsolete.

When we're near the cutoff block number, I can put an alert to old versions to make sure they know they have to upgrade.

Also, while not conclusive at all, notice the hypothetical block number he used, and possibly why he didn't pull 1 million out of that hat.
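Satoshi's snippet amounts to a height-gated consensus rule. A minimal sketch of that idea in Python, using the hypothetical block number from the quote; `largerlimit` was left unspecified, so the 2 MB value below is my own assumption for illustration:

```python
# Height-gated block size rule, in the spirit of Satoshi's "phased in" quote.
FLAG_HEIGHT = 115_000      # hypothetical activation height from the quote
OLD_LIMIT = 1_000_000      # 1 MB
LARGER_LIMIT = 2_000_000   # assumed value for "largerlimit" (illustrative)

def max_block_size(block_number: int) -> int:
    """Upgraded nodes switch limits once the chain passes the flag height;
    un-upgraded nodes would keep enforcing OLD_LIMIT forever."""
    return LARGER_LIMIT if block_number > FLAG_HEIGHT else OLD_LIMIT

print(max_block_size(115_000))  # old limit still applies at the flag height
print(max_block_size(115_001))  # larger limit applies afterwards
```

The point of phasing it in "versions way ahead" is precisely that un-upgraded nodes are obsolete before the rule ever changes.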

The limit (in conjunction with transaction fees) was intended as a DOS-attack and spam deterrent. Do you have any evidence that those risks have been mitigated? Do you have any evidence to suggest that 2MB blocks won't be filled to capacity right away? That would leave us where we are today: with no scaling solutions, wondering how much more un-optimized throughput the network can safely handle.

It's been somewhat mitigated, in the sense that back then it cost exactly nothing, or a trivial amount, to solve a block. Today it takes massive investment, located in massive farms, which are not incentivized to kill Bitcoin by DoSing it with malicious blocks just for the lels. Blocks that the other miners might as well orphan (stale) anyway. The disagreement is about who determines what is spam. One group thinks it should be the miners, deciding the block sizes they produce and the fees they require for admission. The other group thinks it should be an insular group of friends on IRC, with a blatant COI.

I don't believe a second, compatible implementation of Bitcoin will ever be a good idea.  So much of the design depends on all nodes getting exactly identical results in lockstep that a second implementation would be a menace to the network.

Satoshi's description of how to increase the block size limit was also clearly in the context of a software update. Not an incompatible implementation of bitcoin that essentially attacks the network and intentionally forks from all other client nodes. He even thought compatible alternatives were a bad idea.

One could make the argument that Satoshi thought we could raise the limit as/when needed (as if we needed his authority anyway). In that respect, the question of necessity is subjective and debatable, and it is not immediately clear that an increase to 2MB now, with no attempts to make throughput more scalable is necessary.

One could not make the argument that Satoshi thought we should increase the block size limit through a contentious hard fork.

All of this is moot. We should be talking in terms of "what is best for bitcoin" -- with respect to users, nodes and miners -- not talking past each other with interpretations of things Satoshi said 5-6 years ago. Not only was Satoshi wrong about some things, but the state of bitcoin has changed a lot. Hell, if we left the codebase as he originally wrote it, those 184 billion bitcoins from the August 2010 value overflow incident would still be with us. Better to work towards bitcoin's principles than aimlessly trying to fit some arbitrary interpretation of Satoshi's words.

Which (possibly) makes you wonder if there is some reason it hasn't been released as a software update... Could there be anyone with their fingers on the central planning levers who might have a conflicted interest in a somewhat expensive main chain, via offering the medicine for that disease? Has an extremely well connected group taken $76 million of other people's money... to do exactly that?
AliceWonderMiscreations
Full Member
***
Offline Offline

Activity: 182
Merit: 107


View Profile WWW
March 12, 2016, 06:12:12 AM
 #299

Quote
The disagreement is about who determines what is spam.

Nodes can decide what they relay and do not relay, providing some level of spam control.

For transactions that make it to a miner, really only the miner should decide. It's their block.

Now, maybe we could have consensus rules; I support consensus rules for a maximum TX rate, but a minimum, no: if a miner wants to include some cheap transactions, it should be up to them.

That's my opinion.

The only reason I want rules for a maximum is to protect users from bugs (or mistakes) in clients, and to prevent an easy money-laundering mechanism through TX fees.

n0ne
Hero Member
*****
Offline Offline

Activity: 2562
Merit: 548


8ombard - Pick, Play, Prosper!


View Profile WWW
March 12, 2016, 07:22:05 AM
 #300

It is possible to grow larger than the Visa network at the current user rate, given the network's time of existence.

I would say the Bitcoin network definitely could get bigger than the Visa network, and its time of existence is relatively small, so it still has time to grow as well.
And if you look on blockchain.info, you see all the transactions already being made.
You realize this is really big already; with the amount of people all over the world using it, we could form a Bitcoin country already!

No need to form such a country to establish bitcoin. A lot of transactions are being made on the blockchain, but Visa usage is high due to its universal acceptance. If bitcoin gets universal acceptance, it will be controlled by someone, which would set back its growth.

Lauda
Legendary
*
Offline Offline

Activity: 2674
Merit: 2965


Terminated.


View Profile WWW
March 12, 2016, 09:54:04 AM
 #301

The quote says one thing, reality says another.

Reality = right now you can't get to a blocksize where you can do visa level tx capacity.
In other words, just because Satoshi invented Bitcoin does not mean that he knew it all. The current engineers (or everyone, I guess) have so much data that has been gathered over the years. You can't get anywhere near Visa unless you want to run Bitcoin on only a few data centers (however, I'd argue that it would be near worthless then).

And this is similarly true, not only for bitcoin, but bitcoin-based clones and other blockchain-based systems.
Correct.

No need to form such a country to establish our bitcoin. A lot of transactions are being used with blockchain but visa usage is high due to the universal acceptance. But if bitcoin gets universal acceptance, it will be controlled by someone which causes a backing in growth.
No. Bitcoin can not be controlled by anyone. This is just false.

Slark
Legendary
*
Offline Offline

Activity: 1862
Merit: 1004


View Profile
March 12, 2016, 10:15:27 AM
 #302

No. Bitcoin can not be controlled by anyone. This is just false.
In theory, where everyone 'owns' a little piece of the hashing power and we don't have whales controlling huge amounts of bitcoins, bitcoin is safe and indeed free from manipulation.
But we are reaching the state where bitcoin will be controlled by large centralized mining farms, as they will provide the majority of the hash power.
Lauda
Legendary
*
Offline Offline

Activity: 2674
Merit: 2965


Terminated.


View Profile WWW
March 12, 2016, 10:17:26 AM
 #303

In theory, where everyone 'owns' a little piece of the hashing power and we don't have whales controlling huge amounts of bitcoins, bitcoin is safe and indeed free from manipulation.
But we are reaching the state where bitcoin will be controlled by large centralized mining farms, as they will provide the majority of the hash power.
Miners can't force the rules on the node operators. Miners actually have no power without the users. Their hashrate and coins would plummet in value very quickly if they did something malicious.

ATguy
Sr. Member
****
Offline Offline

Activity: 423
Merit: 250



View Profile
March 12, 2016, 11:00:34 AM
 #304

The disagreement is about who determines what is spam. One group thinks it should be the miners deciding the block sizes they produce and the fees they require for admission. The other group thinks it should be an insular group of friends on irc, with a blatant COI.

Free market deciding, or a group of elitists deciding... it is strange that part of the Bitcoiners still believe the elitists decide better, which is basically the same way the old financial system is governed.

The last remaining excuse is that full nodes might not keep up with bandwidth, CPU, and storage requirements when miners are free to decide block sizes. But miners act rationally to keep the coins+fees worth something in at least the short term; thus the doom scenarios about full-node centralization are a myth, because it is not in miners' interest to have a low Bitcoin price as a result of continued distrust in Bitcoin.

Put faith in the free market, and you will get the most effective system ever possible. No human regulators can come even close.

RealPhotoshoper
Legendary
*
Offline Offline

Activity: 1050
Merit: 1001



View Profile
March 12, 2016, 12:20:37 PM
 #305

Can someone explain to me why there is any debate when Nakamoto himself said:

---------------------
Quote from Mike Hearn:

https://bitcointalk.org/index.php?topic=149668.msg1596879#msg1596879
https://duckduckgo.com/?q=%22Bitcoin+can+already+scale+much+larger+than+that+with+existing+hardware+for+a+fraction+of+the+cost.%22

  • Satoshi did plan for Bitcoin to compete with PayPal/Visa in traffic volumes.
  • The block size limit was a quick safety hack that was always meant to be removed.
  • In fact, in the very first email he sent me back in April 2009, he said this:

--------------------------------------------------
Email from Satoshi Nakamoto to Mike Hearn:

"Hi Mike,
I'm glad to answer any questions you have. If I get time, I ought to write a FAQ to supplement the paper.
There is only one global chain.

The existing Visa credit card network processes about 15 million Internet purchases per day worldwide. Bitcoin can already scale much larger than that with existing hardware for a fraction of the cost. It never really hits a scale ceiling. If you're interested, I can go over the ways it would cope with extreme size.  By Moore's Law, we can expect hardware speed to be 10 times faster in 5 years and 100 times faster in 10. Even if Bitcoin grows at crazy adoption rates, I think computer speeds will stay ahead of the number of transactions.

I don't anticipate that fees will be needed anytime soon, but if it becomes too burdensome to run a node, it is possible to run a node that only processes transactions that include a transaction fee. The owner of the node would decide the minimum fee they'll accept. Right now, such a node would get nothing, because nobody includes a fee, but if enough nodes did that, then users would get faster acceptance if they include a fee, or slower if they don't. The fee the market would settle on should be minimal. If a node requires a higher fee, that node would be passing up all transactions with lower fees.
It could do more volume and probably make more money by processing as many paying transactions as it can. The transition is not controlled by some human in charge of the system though, just individuals reacting on their own to market forces.

Eventually, most nodes may be run by specialists with multiple GPU cards. For now, it's nice that anyone with a PC can play without worrying about what video card they have, and hopefully it'll stay that way for a while. More computers are shipping with fairly decent GPUs these days, so maybe later we'll transition to that."


~ Satoshi Nakamoto
---------------------------------------
Quote:

"Satoshi said back in 2010 that he intended larger block sizes to be phased in with some simple if (height > flag_day) type logic, theymos has linked to the thread before. I think he would be really amazed at how much debate this thing has become. He never attributed much weight to it, it just didn't seem important to him. And yes, obviously, given the massive forum dramas that have resulted it'd have been nice if he had made the size limit floating from the start like he did with difficulty. However, he didn't and now we have to manage the transition."

~ Mike Hearn, on bitcointalk.org, March 07, 2013, 06:15:30 PM

https://bitcointalk.org/index.php?topic=1347.msg15366#msg15366
bit.ly/1YqiV41

----------------------------------------
Quote from Satoshi:

It can be phased in, like:

if (blocknumber > 115000)
    maxblocksize = largerlimit

It can start being in versions way ahead, so by the time it reaches that block number and goes into effect, the older versions that don't have it are already obsolete.  When we're near the cutoff block number, I can put an alert to old versions to make sure they know they have to upgrade.


~ Satoshi Nakamoto, on bitcointalk.org, October 04, 2010, 07:48:40 PM

----------------------------------------
----------------------------------------
----------------------------------------

So now,

If Satoshi himself "never really gave the block size limit much weight" (he assumed scaling was an obvious need that would happen quickly and easily), why is a group of developers refusing to scale the protocol... while simultaneously creating a tool that will generate massive income by moving transactions off the block chain and into their exclusive transaction processing system (Lightning Network)? Is it any wonder they were given nearly $50 million in VC funding once VCs realized they had just taken over Bitcoin transaction processing?

Is this not blatantly changing the design and purpose Satoshi gave to Bitcoin (to freely scale to massive sizes, to support on-chain transaction needs)? This seems to be of grave concern, no?

-B-


----------------------------------------
----------------------------------------
----------------------------------------
i've been reading the thread you mentioned above, and some questions are bugging me.
is that really satoshi nakamoto? did he really post on this forum?
and Mike Hearn communicated with satoshi as well, but why would Mike Hearn be taken for satoshi nakamoto? i'm even more confused after reading this thread  Huh
franky1
Legendary
*
Offline Offline

Activity: 4214
Merit: 4458



View Profile
March 12, 2016, 02:41:11 PM
 #306

also to note. Visa's 42,000-45,000 tx/s is not based on real data.. it's based on a small group of computers in a lab, pushed to their limit with fake data..
 and then days later that number is multiplied by how many computers they have worldwide
http://www.visa.com/blogarchives/us/2013/10/10/stress-test-prepares-visanet-for-the-most-wonderful-time-of-the-year/index.html
Quote
a group of VisaNet engineers packs their bags – which include hardware and software that emulates our network – and goes to Gaithersburg, Maryland where an IBM state-of-the-art testing facility is located.

Once there, our engineers create a mirror image of VisaNet’s authorization systems and the fun begins. For five days, they bombard the systems with simulated transactions. The objective is to validate the hardware and software configuration against expected peak message rates. At the same time, we look into extreme message rate scenarios to uncover any possible bottlenecks that may need future remediation.

After the test is concluded, the VisaNet team determines the best configuration to maintain our higher levels of security and reliability when millions of transactions from all over the world hit the network during the holiday shopping season. As a result, they also come back from Gaithersburg with a number that represents the maximum processing capacity VisaNet can handle, which later is stamped on coffee mugs and distributed to the team involved in the test. This year’s “mug number” – as we like to call it – is 47,000 transaction messages per second, which is a huge step forward from 2012’s peak capacity of 30,000 transaction messages per second!

do you know what the really funny part is..
30,000 (2012) and 47,000 (more recent) are not actually the complete settlement of funds, just the messages that say (yep, they have the cash).
so if bitcoin wanted to compare results, it's not about confirmations. it's just about VALIDATING TRANSACTIONS

I DO NOT TRADE OR ACT AS ESCROW ON THIS FORUM EVER.
Please do your own research & respect what is written here as both opinion & information gleaned from experience. many people replying with insults but no on-topic content substance, automatically are 'facepalmed' and yawned at
franky1
Legendary
*
Offline Offline

Activity: 4214
Merit: 4458



View Profile
March 12, 2016, 02:53:17 PM
 #307

also to note:
visa's REAL USAGE statistics are based on ~900 million customers

doing an average of ~40 tx a year

because we all know the blockstreamers say the network is working perfectly even with lag (mempool growth) due to temporary high use at peak times. so let's play their game, average it down, and ignore peak potential capacity..
which equals (if evenly divided into an average of transactions per second) ~1,141 tx a second
(900mill*40tx = 36bill) (36bill/365/24/60/60 = 1141)
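The division above checks out. A quick sanity check, taking the customer count and per-user rate from the post as given rather than independently verified:

```python
# Averaged Visa throughput from the post's assumptions:
# 900 million customers at ~40 transactions per user per year.
customers = 900_000_000
tx_per_user_per_year = 40
seconds_per_year = 365 * 24 * 60 * 60   # 31,536,000

annual_tx = customers * tx_per_user_per_year   # 36 billion
avg_tps = annual_tx / seconds_per_year
print(int(avg_tps))   # 1141
```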

AlexGR
Legendary
*
Offline Offline

Activity: 1708
Merit: 1049



View Profile
March 12, 2016, 02:55:54 PM
 #308

also to note. Visa's 42,000-45,000 tx/s is not based on real data.. its based on small group of computers in a lab, pushed to their limit with fake data.

I think the target is definitely not in the 40k tx/s range.

https://en.wikipedia.org/wiki/Visa_Inc.

In 2009, Visa’s global network (known as VisaNet) processed 62 billion transactions with a total volume of $4.4 trillion.[6][7]

/365 = 170mn txs per day
/86400 secs per day = 1966 tx/s on avg.

Right now it should be close to 3k on average due to growth in electronic payments.

So the target is around 3,000-4,000 tx/s. That is 1,100 to 1,500 times more than Bitcoin's ~2.7 tx/s.... so we are looking at 1.1 to 1.5 GB blocks just to handle an average annualized load (no peaks).

1.1 to 1.5gb blocks are not feasible right now in a decentralized manner.
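The estimate can be re-derived from the Wikipedia figure cited above, assuming (as the post does) that Bitcoin's observed ~2.7 tx/s corresponds to full 1 MB blocks:

```python
# Re-derive the averages in the post from Wikipedia's 2009 VisaNet figure,
# then the block size implied by a 3,000-4,000 tx/s on-chain target.
annual_tx = 62_000_000_000
per_day = annual_tx / 365            # ~170 million tx/day
avg_tps = per_day / 86_400           # ~1966 tx/s

# ~2.7 tx/s is taken to fill a 1 MB block, so block size scales linearly.
for target_tps in (3_000, 4_000):
    block_mb = target_tps / 2.7      # MB per block at the target rate
    print(f"{target_tps} tx/s -> ~{block_mb / 1000:.1f} GB blocks")
```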
watashi-kokoto
Sr. Member
****
Offline Offline

Activity: 682
Merit: 268



View Profile
March 12, 2016, 03:00:22 PM
 #309

We need to consider the inherent risks of centralization. It's much better if miners represent the central authority than someone else. For example, the economic majority (wallet operators, exchanges) needs to have a way to represent its vote on the issue.
franky1
Legendary
*
Offline Offline

Activity: 4214
Merit: 4458



View Profile
March 12, 2016, 03:00:29 PM
 #310

lol i laugh at all you guys trying to denounce the visa VS bitcoin thing..

based on previous posts about numbers, visa has 900 million customers and took 60 years to get there.
right now bitcoin has maybe 2.5mill users and took 7 years to get there.

stop thinking it must be only camp segwit or camp hardlimit (maxblocksize).. think BOTH!!
stop thinking bitcoin needs to have Visa capacity right now either.. (it can grow over time)

the solution does not need to be found in 16 months to jump from 2.5mill to 900mill users.. it can happen over the next couple of decades

so here are some numbers to chew on
visa averages 40 tx per user per year (36bill tx over the total users)

so here goes
1mb bitcoin= ~105mill tx a year (~2000tx*144*365) (2.5mill users at 40tx a year average)
2mb+segwit is ~420mill tx a year (2 times real hard limit buffer and 2 times bait and switch)
2mb+segwit+LN=6.3bill tx a year (i think i remember lauda spouting off about a 15x LN capacity increase or something)


pretending that we have the blockstream mindset that people will use LN instead of ONCHAIN transactions.

by 2018 we will have the capacity to do the equivalent of 150 million people's visa transactions. FULLY SETTLED! (not just signature verified/authorised)
remember 2mb+segwit = 10mill people at 40 tx a year, and LN (supposedly) is 15x that.

2mb+segwit+LN =150mill people
4mb+segwit+LN =300mill people (lets say a hard fork at the next reward halving 2020)
6mb+segwit+LN =450mill people (lets say a hard fork at the next reward halving 2024)
8mb+segwit+LN =600mill people (lets say a hard fork at the next reward halving 2028)
10mb+segwit+LN =750mill people (lets say a hard fork at the next reward halving 2032)
12mb+segwit+LN =900mill people (lets say a hard fork at the next reward halving 2036)
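The ladder above can be reproduced under the post's own stated assumptions (segwit doubling the base block size, ~2,000 tx per MB, a 15x Lightning multiplier, 40 tx per user per year); the exact figures come out slightly above the post's rounded ones:

```python
# Capacity ladder from the post's assumptions. All constants are taken
# from the post itself, not independently verified.
TX_PER_MB = 2_000
BLOCKS_PER_YEAR = 144 * 365       # 52,560
LN_MULTIPLIER = 15
TX_PER_USER_YEAR = 40

def users_supported(base_mb: int) -> float:
    onchain_tx = base_mb * 2 * TX_PER_MB * BLOCKS_PER_YEAR  # segwit doubles
    return onchain_tx * LN_MULTIPLIER / TX_PER_USER_YEAR

for base_mb in (2, 4, 6, 8, 10, 12):
    print(f"{base_mb}mb+segwit+LN ~= {users_supported(base_mb) / 1e6:.0f} million people")
```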

now you may ask: it's the year 2036 (20 years' time) and we expect a 2 petabyte hard drive for $100, just like a 2 TB hard drive now and a 2 GB hard drive 20 years ago. how much data would that be per year? (IF, BIG IF, all blocks were near 100% full)

12mb+segwit = 24mb real data per block
24MB*144*365 = ~1.26 TB a year, or the equivalent of about $0.06 of hard drive cost per year ($100 for a 2 PB hard drive)
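Checking that storage arithmetic directly (block size and drive price taken from the post's scenario; the cost works out to roughly six cents a year):

```python
# Storage arithmetic for the year-2036 scenario: 12 MB base blocks with
# segwit (~24 MB real data per block), 144 blocks a day, priced against
# a hypothetical $100-per-2-petabyte hard drive.
block_bytes = 24 * 10**6
bytes_per_year = block_bytes * 144 * 365      # ~1.26e12 bytes

tb_per_year = bytes_per_year / 10**12         # ~1.26 TB
dollars_per_tb = 100 / 2000                   # $100 for 2 PB = 2,000 TB
cost_per_year = tb_per_year * dollars_per_tb
print(f"{tb_per_year:.2f} TB/year, ~${cost_per_year:.2f}/year")
```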

now you may cry about bandwidth..

well we are talking about 20 years' time, so expect internet to get faster.
but based on today, a 750 kbit average upload is ~0.094 MByte a second = ~56 MByte every 10 minutes (allowing for 25 leechers grabbing blocks AND getting relayed unconfirmed tx (or 50 block-grabbers if you ignore tx relay))
even someone with basic 256k down/64k up (bottom-line ADSL) = 4.8 MByte every 10 minutes (allowing for 2 leechers grabbing blocks AND getting relayed unconfirmed tx (or 4 block-grabbers if you ignore tx relay))
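The per-blocktime upload budgets can be checked with a one-liner per line speed (decimal units; the post rounds the 750 kbit figure slightly differently):

```python
# Upload budget per 10-minute block interval at the line speeds quoted above.
def mb_per_block_interval(kbit_up: float) -> float:
    bytes_per_sec = kbit_up * 1000 / 8        # kbit/s -> bytes/s
    return bytes_per_sec * 600 / 10**6        # MB per 10 minutes

print(f"{mb_per_block_interval(750):.1f} MB")   # 56.2 MB (~55 in the post)
print(f"{mb_per_block_interval(64):.1f} MB")    # 4.8 MB
```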

so let's imagine the worst case, where upload doesn't jump 100-fold (like the last 20 years) but just 10-fold
that's ~560 MByte average speed every blocktime (allowing for 20 leechers grabbing blocks AND getting relayed unconfirmed tx (or 40 block-grabbers if you ignore tx relay))
that's 48 MByte BASIC bottom-line, every blocktime (allowing for 1 leecher grabbing blocks AND getting relayed unconfirmed tx (or 2 block-grabbers if you ignore tx relay))

but like i said, that is the worst case scenario.

so it is all possible.
it's just not a reason to pretend that the community is demanding 900-million-customer capacity today by using the visa argument over, let's say, 20 years... just to avoid allowing 10-million-customer capacity in the next 16 months.

sgbett
Legendary
*
Offline Offline

Activity: 2576
Merit: 1087



View Profile
March 12, 2016, 03:03:08 PM
 #311

The limit was a temporary fix.

Shortly before disappearing, Satoshi also repeated over and over that there were more ways that Bitcoin could be successfully DOS attacked than he could count.

The limit (in conjunction with transaction fees) were intended as a DOS attack and spam deterrent. Do you have any evidence that those risks have been mitigated? Do you have any evidence that suggests that 2MB blocks won't be filled to capacity right away? That would leave us where we are today -- with no scaling solutions, wondering how much more un-optimized throughput the network can safely handle.  

I don't believe a second, compatible implementation of Bitcoin will ever be a good idea.  So much of the design depends on all nodes getting exactly identical results in lockstep that a second implementation would be a menace to the network.

Satoshi's description of how to increase the block size limit was also clearly in the context of a software update. Not an incompatible implementation of bitcoin that essentially attacks the network and intentionally forks from all other client nodes. He even thought compatible alternatives were a bad idea.

One could make the argument that Satoshi thought we could raise the limit as/when needed (as if we needed his authority anyway). In that respect, the question of necessity is subjective and debatable, and it is not immediately clear that an increase to 2MB now, with no attempts to make throughput more scalable is necessary.

One could not make the argument that Satoshi thought we should increase the block size limit through a contentious hard fork.

All of this is moot. We should be talking in terms of "what is best for bitcoin" -- with respect to users, nodes and miners -- not talking past each other with interpretations of things Satoshi said 5-6 years ago. Not only was Satoshi wrong about some things, but the state of bitcoin has changed a lot. Hell, if we left the codebase as he originally wrote it, those 184 billion bitcoins from the August 2010 value overflow incident would still be with us. Better to work towards bitcoin's principles than aimlessly trying to fit some arbitrary interpretation of Satoshi's words.

As there are already several implementations the menace threat seems overblown. Even if you were to assume that Core was the "one true bitcoin", which version of it is gospel? If miners refuse to upgrade to 0.12 is this an attack? A menace?

It is possible to release an implementation right now with the limit removed. This still isn't an attack. Bitcoin's built in consensus mechanism protects the network.

There is evidence right now that the 1MB limit causes problems. There is only speculation that removing it causes problems.

The reason we can't talk in terms of what is best for bitcoin is because people have different opinions. People prefer speculative drama to mundane truth. The reason we can't work towards Bitcoin's principles is that these are also subject to people's opinion. If you refuse to accept that "What Satoshi said" has any importance all we have is the court of public opinion. In which case your view is as equally valid as mine. How do we resolve that? What are bitcoins principles when it comes to disagreement about what bitcoin is?

With regards to the "appeal to authority" defence. If something is a fact, and somebody happens to state that fact. Any argument that relies on the fact cannot be dismissed by saying "we should not rely on what somebody said". The argument is being made because of the fact, not because of what was said.

The limit was a temporary anti DOS measure, it is now restricting growth of transactions. Those are facts.

IMHO what is best for bitcoin is to allow it to grow unencumbered by artificial limits.

"A purely peer-to-peer version of electronic cash would allow online payments to be sent directly from one party to another without going through a financial institution" - Satoshi Nakamoto
*my posts are not investment advice*
franky1
Legendary
*
Offline Offline

Activity: 4214
Merit: 4458



View Profile
March 12, 2016, 03:06:45 PM
Last edit: March 12, 2016, 03:28:05 PM by franky1
 #312

1.1 to 1.5gb blocks are not feasible right now in a decentralized manner.

no one is asking for 1.5gb blocks today!!

thats a futile argument to meander away from the real debate that bitcoiners are talking about today(2mb+segwit before 2018(preferably under 16 months), and then the POSSIBILITIES over the next TWENTY+ YEARS

AlexGR
Legendary
*
Offline Offline

Activity: 1708
Merit: 1049



View Profile
March 12, 2016, 04:34:18 PM
 #313

1.1 to 1.5gb blocks are not feasible right now in a decentralized manner.

no one is asking for 1.5gb blocks today!!

thats a futile argument to meander away from the real debate that bitcoiners are talking about today(2mb+segwit before 2018(preferably under 16 months), and then the POSSIBILITIES over the next TWENTY+ YEARS

Although it's you I quote, it's for the OP to read.
SpiryGolden
Hero Member
*****
Offline Offline

Activity: 812
Merit: 500



View Profile
March 13, 2016, 01:47:07 AM
Last edit: March 13, 2016, 02:22:00 AM by SpiryGolden
 #314

All this, without any real progress, gets us here: Bitcoin is left behind.  Cry It makes me really sad. These are the sad days of Bitcoin, the moment you realize the trolling and the narrow view and mentality of the Core developers is bringing Bitcoin to the ground. Sadly this is what is happening and what has been happening these past months.

Some will diss me; I still hold a large amount of Bitcoin and haven't chosen other sides YET, but I am slowly migrating to better options with a more capable development team that doesn't stall what is obviously needed and what obviously draws capital to Bitcoin (a fast, cheap, secure method of payment, of moving money, and much more). Bitcoin has been dragged through the dirt, and people are starting to quit step by step, companies to back off.

One piece of news that should worry us is Microsoft stopping acceptance of Bitcoin payments. The whole market goes into regression and Bitcoin slips back into the dark ages.

It is sad, and even sadder that clouded minds are supporting this; they are supporting the real death of Bitcoin, not fake death #90. This is real and not a joke anymore. Every day that passes without an implemented solution makes the market bleed slowly.

We have big players on the market like XAPO, Microsoft, Humble Bundle and other merchants across the globe slowly retreating from this and looking for better & faster alternatives.

Yes, this is a sad day for both sides, Bitcoin Core and Bitcoin Classic. Both sides have lost big time.


The whole process of gaining acceptance across the globe in the early days took a lot of work and a lot of proving that Bitcoin is a good payment method, a lot of work from all holders, from all believers. Now all that work is really gone. How can you convince a giant like Microsoft to accept Bitcoin again? Hard to do, very hard.

And with all that I want to say: "Thanks, Bitcoin Blockstream Core, for ruining this for everyone in order to serve your own financial interest."
voos
Member
**
Offline Offline

Activity: 108
Merit: 10


View Profile
March 13, 2016, 07:39:09 PM
Last edit: March 15, 2016, 12:43:02 AM by voos
 #315

Big blockers are very sad people.
madjules007
Sr. Member
****
Offline Offline

Activity: 400
Merit: 250



View Profile
March 15, 2016, 09:36:20 PM
 #316

As there are already several implementations the menace threat seems overblown. Even if you were to assume that Core was the "one true bitcoin", which version of it is gospel? If miners refuse to upgrade to 0.12 is this an attack? A menace?

Gospel? Satoshi just meant that it could cause compatibility issues that lead to networks forking off from one another. Core is one implementation. Having multiple implementations has nothing to do with updating to the most recent release of a client you already run. It seems like people like to cite Satoshi only when it suits them, and to otherwise brush what he said under the rug. The obvious solution is to approach the questions without citing someone that can't defend against people misconstruing his words.

It is possible to release an implementation right now with the limit removed. This still isn't an attack. Bitcoin's built in consensus mechanism protects the network.

That only works if you retain the consensus mechanism. There can be no question that lowering the threshold for consensus changes to 75% redefines what the "consensus mechanism" is. It not only attempts to redefine the English language definition (general agreement) but it attempts to redefine it in opposition to every intentional fork in bitcoin's history.

Nick Szabo called it "a 51% attack being justified through argument." You can't redefine the very basis of bitcoin -- consensus to achieve trustlessness -- so you can change the rules for everyone else without their agreement, and then deny that it's an attack. It very clearly is, and I'm happy to fork off anyone that tries.

Redefining consensus as "majority rule" was never something I signed up for by running my nodes. I'll just fork you off my network if you attempt to break consensus. And that's the root of the problem of breaking consensus in a controversy -- nodes = the rules. Miners are perpetually forced to act on incomplete information as quickly as possible, and they can only follow users, lest they be mining on an empty network. They can point their hashpower where they want, but that doesn't mean I'm going to install a forked client, nor that the rest of the network will. When the dust settles, we might find that users exist on both chains, and once difficulty adjusts, mining may be profitable on both. In that context, redefining the "consensus mechanism" defeats the purpose of consensus, period. And miner "consensus" -- which  does not define the rules for network nodes -- does nothing to change that.

There is evidence right now that the 1MB limit causes problems. There is only speculation that removing it causes problems.

That's akin to saying that there is evidence that a limit on taxes has led to a budget deficit, so we should remove all limits on taxes. That sort of logic is completely useless and doesn't belong in any serious debate. There can only be speculation on the consequences of increasing the block size limit. That is not evidence that we should do so. There is also only speculation that increasing the block size is safe (and to what extent). To use your logic, should we axe it from discussion then?

The reason we can't talk in terms of what is best for bitcoin is because people have different opinions. People prefer speculative drama to mundane truth. The reason we can't work towards Bitcoin's principles is that these are also subject to people's opinion. If you refuse to accept that "What Satoshi said" has any importance all we have is the court of public opinion. In which case your view is as equally valid as mine. How do we resolve that? What are bitcoins principles when it comes to disagreement about what bitcoin is?

In a term, consensus mechanism. Until consensus is reached on protocol changes, the status quo prevails. Many are upset that they can't get their way, and want to weasel their way into implementing changes with less and less agreement among the community that runs the software. I think that's atrocious. Again, any such attackers will just be forked off my network. You could argue that my nodes will exist only on a dead network -- I vehemently disagree and believe that game theory does little to support that in the context of an extremely contentious debate and a lowered majority threshold for miner agreement.

With regards to the "appeal to authority" defence. If something is a fact, and somebody happens to state that fact. Any argument that relies on the fact cannot be dismissed by saying "we should not rely on what somebody said". The argument is being made because of the fact, not because of what was said.

That's not an appeal to authority then. Saying "Satoshi said x, therefore y" is different than "Because of x, therefore y, which also agrees with Satoshi's findings." Often, though, people make arguments like the former. Like you did here. That fallacious logic is why I responded to you in the first place.

The limit was a temporary anti DOS measure, it is now restricting growth of transactions. Those are facts.

Temporary, until today? Next year? 2020?

What I said (and what you've glossed over) was:

Quote
The limit (in conjunction with transaction fees) were intended as a DOS attack and spam deterrent. Do you have any evidence that those risks have been mitigated? Do you have any evidence that suggests that 2MB blocks won't be filled to capacity right away? That would leave us where we are today -- with no scaling solutions, wondering how much more un-optimized throughput the network can safely handle.

Your facts are meaningless without context, and without addressing the actual issues.

IMHO what is best for bitcoin is to allow it to grow unencumbered by artificial limits.

You've got the touch of a sophist, alright. Cheesy

Is the 21 million coin cap artificial? It is intended to control inflation.

The 1MB cap was intended to mitigate DOS attacks and deter spam. If you want to change the rules, you should begin by making a case that those concerns have been addressed.


BlindMayorBitcorn
Legendary
*
Offline Offline

Activity: 1260
Merit: 1115



View Profile
March 15, 2016, 10:00:30 PM
 #317

In a term, consensus mechanism. Until consensus is reached on protocol changes, the status quo prevails.

But what's the status quo? Certainly not SegWit/CT/LN et al. Is it?

Forgive my petulance and oft-times, I fear, ill-founded criticisms, and forgive me that I have, by this time, made your eyes and head ache with my long letter. But I cannot forgo hastily the pleasure and pride of thus conversing with you.
madjules007
Sr. Member
****
Offline Offline

Activity: 400
Merit: 250



View Profile
March 15, 2016, 10:53:47 PM
 #318

In a term, consensus mechanism. Until consensus is reached on protocol changes, the status quo prevails.

But what's the status quo? Certainly not SegWit/CT/LN et al. Is it?

The status quo is a 1MB block size limit as a hard rule. Segwit, as a soft fork, doesn't remove or change consensus rules. And miners can soft fork whenever they please -- that's not up to users, nodes or developers.

How are CT or LN consensus issues?

BlindMayorBitcorn
Legendary
*
Offline Offline

Activity: 1260
Merit: 1115



View Profile
March 15, 2016, 10:58:36 PM
 #319

In a term, consensus mechanism. Until consensus is reached on protocol changes, the status quo prevails.

But what's the status quo? Certainly not SegWit/CT/LN et al. Is it?

The status quo is a 1MB block size limit as a hard rule. Segwit, as a soft fork, doesn't remove or change consensus rules. And miners can soft fork whenever they please -- that's not up to users, nodes or developers.

How are CT or LN consensus issues?

Good point. Smiley

danda
Full Member
***
Offline Offline

Activity: 201
Merit: 157


View Profile WWW
March 15, 2016, 11:34:48 PM
 #320

It is not strictly a matter of mathematically optimal outcomes.

There is also the fact that we have an existing protocol that thousands (at least) of people are using and have entrusted significant value into.  That protocol is essentially frozen, much like ipv4.   It was and is an agreement, a promise.

I think it is very unlikely that the community at large will ever choose to change that agreement for anything that is not clearly and undeniably an existential threat.   This means that no controversial changes will be made going forward at the consensus level.

This immutability is a good thing.   It means that cryptocurrency can be trusted.

I for one will lose a great deal of trust in cryptocurrency if a controversial change ever does in fact happen because it will mean that a majority can dictate terms and potentially steal from a minority.

This does not mean that technical advances cannot happen.  They simply should happen outside Bitcoin.  Either in higher layers or in alt-coins.

Ethereum and now ZCash are talking about their governance models and how they can flexibly incorporate changes over time.   To me, this mutability is a drawback in a currency.    If humans can simply change it out from beneath me, then why should I trust it?

No, it is better to have many competing currencies, each with their own properties.   I believe that in the end, those that get the fundamentals of privacy, fungibility, immutability and scalability right will be winners, though of course this could take decades or even centuries and many things will happen along the way.



I am truly unsettled; I have switched from Core 0.11 to Classic 0.11 to Core 0.12 to Classic 0.12 and it is totally possible/likely I will switch again.  Perhaps I will try using more advanced mathematics, above and beyond high school maths. Smiley

mybitprices.info - wallet auditing   |  hd-wallet-derive - derive keys locally |  hd-wallet-addrs - find used addrs
lightning-nodes - list of LN nodes  |  coinparams - params for 300+ alts  |  jsonrpc-cli - cli jsonrpc client
subaddress-derive-xmr - monero offline wallet tool
maokoto
Hero Member
*****
Offline Offline

Activity: 770
Merit: 500


✪ NEXCHANGE | BTC, LTC, ETH & DOGE ✪


View Profile WWW
March 15, 2016, 11:45:53 PM
 #321

I think the statement is true: "Bitcoin CAN [not that it necessarily will] scale larger than the Visa Network". It seems that is still far away, and the block size issue would likely be solved one way or another if adoption were increasing at a good rate.

Etalia
Newbie
*
Offline Offline

Activity: 51
Merit: 0


View Profile
March 23, 2016, 10:58:55 AM
 #322

If this is true and not a bluff, this could be very big for Bitcoin. To be able to surpass Visa is quite an accomplishment.
We will just have to wait and see what the future brings.
hv_
Legendary
*
Offline Offline

Activity: 2506
Merit: 1055

Clean Code and Scale


View Profile WWW
July 20, 2020, 11:58:50 AM
 #323

Can someone please explain to me how we have these Satoshi quotes and yet there is still any discussion / debate / disagreement on it?  

The quote says one thing, reality says another.

Reality = right now you can't get to a blocksize where you can do visa level tx capacity.

And this is similarly true, not only for bitcoin, but bitcoin-based clones and other blockchain-based systems.

The Real Reality says you can increase the block size as demand grows, and Bitcoin Cash is the real example of that.

Nope. BCH has a limit on blocks, but not on altering the code.

Carpe diem  -  understand the White Paper and mine honest.
Fix real world issues: Check out b-vote.com
The simple way is the genius way - Satoshi's Rules: humana veris _
hv_
Legendary
*
Offline Offline

Activity: 2506
Merit: 1055

Clean Code and Scale


View Profile WWW
July 21, 2020, 11:47:52 AM
 #324

There is no upper limit for cryptocurrency exchanges on www.changenow.io  You can as well make use of visa or master cards to make purchase of crypto assets on ChangeNow. This is a non custodial exchange that gives more to experience

Sigh

sadly there is no limit (yet) on creating different tokens - only clear ICOs or security tokens got 'limited'

The world needs only 1 open value transfer protocol - and that needs just enough capacity to do the job

that should be Bitcoin Consensus Rule 1

next max 21 Million coins

next interest curve / how to mine those coins

...

Consensus rules on txs volume policy -> rekt ,  cause consensus is: max adoption


job done

Alucard1
Full Member
***
Offline Offline

Activity: 574
Merit: 125


View Profile
July 24, 2020, 01:57:41 PM
 #325

Can someone explain to me why there is any debate when Nakamoto himself said:

---------------------
Quote from Mike Hearn:

https://bitcointalk.org/index.php?topic=149668.msg1596879#msg1596879
https://duckduckgo.com/?q=%22Bitcoin+can+already+scale+much+larger+than+that+with+existing+hardware+for+a+fraction+of+the+cost.%22

  • Satoshi did plan for Bitcoin to compete with PayPal/Visa in traffic volumes.
  • The block size limit was a quick safety hack that was always meant to be removed.
  • In fact, in the very first email he sent me back in April 2009, he said this:

--------------------------------------------------
Email from Satoshi Nakamoto to Mike Hearn:

"Hi Mike,
I'm glad to answer any questions you have. If I get time, I ought to write a FAQ to supplement the paper.
There is only one global chain.

The existing Visa credit card network processes about 15 million Internet purchases per day worldwide. Bitcoin can already scale much larger than that with existing hardware for a fraction of the cost. It never really hits a scale ceiling. If you're interested, I can go over the ways it would cope with extreme size.  By Moore's Law, we can expect hardware speed to be 10 times faster in 5 years and 100 times faster in 10. Even if Bitcoin grows at crazy adoption rates, I think computer speeds will stay ahead of the number of transactions.

I don't anticipate that fees will be needed anytime soon, but if it becomes too burdensome to run a node, it is possible to run a node that only processes transactions that include a transaction fee. The owner of the node would decide the minimum fee they'll accept. Right now, such a node would get nothing, because nobody includes a fee, but if enough nodes did that, then users would get faster acceptance if they include a fee, or slower if they don't. The fee the market would settle on should be minimal. If a node requires a higher fee, that node would be passing up all transactions with lower fees.
It could do more volume and probably make more money by processing as many paying transactions as it can. The transition is not controlled by some human in charge of the system though, just individuals reacting on their own to market forces.

Eventually, most nodes may be run by specialists with multiple GPU cards. For now, it's nice that anyone with a PC can play without worrying about what video card they have, and hopefully it'll stay that way for a while. More computers are shipping with fairly decent GPUs these days, so maybe later we'll transition to that."


~ Satoshi Nakamoto
---------------------------------------
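The fee policy described in the email above (each node operator picks a minimum fee and simply passes up transactions below it) can be sketched in a few lines. This is a toy illustration only; the `Tx` struct and `SelectPaying` are names invented for this example, not code from Bitcoin Core or any real client:

```cpp
#include <cstdint>
#include <vector>

// A transaction as seen by this toy policy: only its offered fee matters.
struct Tx {
    uint64_t fee;  // fee offered by the transaction, in satoshis
};

// A fee-requiring node passes up every transaction below its own minimum;
// users who include at least min_fee get processed by this node, others
// must wait for a node with a lower (or zero) threshold.
std::vector<Tx> SelectPaying(const std::vector<Tx>& mempool, uint64_t min_fee) {
    std::vector<Tx> accepted;
    for (const Tx& tx : mempool)
        if (tx.fee >= min_fee)
            accepted.push_back(tx);
    return accepted;
}
```

As the email says, no one coordinates the thresholds: each node sets its own `min_fee`, and the market fee emerges from individual operators reacting to load.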
Quote:

"Satoshi said back in 2010 that he intended larger block sizes to be phased in with some simple if (height > flag_day) type logic, theymos has linked to the thread before. I think he would be really amazed at how much debate this thing has become. He never attributed much weight to it, it just didn't seem important to him. And yes, obviously, given the massive forum dramas that have resulted it'd have been nice if he had made the size limit floating from the start like he did with difficulty. However, he didn't and now we have to manage the transition."

~ Mike Hearn, on bitcointalk.org, March 07, 2013, 06:15:30 PM

https://bitcointalk.org/index.php?topic=1347.msg15366#msg15366
bit.ly/1YqiV41

----------------------------------------
Quote from Satoshi:

It can be phased in, like:

if (blocknumber > 115000)
    maxblocksize = largerlimit

It can start being in versions way ahead, so by the time it reaches that block number and goes into effect, the older versions that don't have it are already obsolete.  When we're near the cutoff block number, I can put an alert to old versions to make sure they know they have to upgrade.


~ Satoshi Nakamoto, on bitcointalk.org, October 04, 2010, 07:48:40 PM

----------------------------------------
----------------------------------------
----------------------------------------
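Read literally, Satoshi's snippet above is a height-gated consensus rule: ship the new limit in software long before the flag height, and it activates automatically when the chain gets there. A minimal self-contained sketch, where `FORK_HEIGHT`, the size constants, and the function name `GetMaxBlockSize` are all illustrative values invented here, not parameters of any real client:

```cpp
#include <cstdint>

// Hypothetical values for illustration only — not consensus parameters
// from Bitcoin Core or any other real implementation.
constexpr int64_t FORK_HEIGHT = 115000;
constexpr uint64_t OLD_MAX_BLOCK_SIZE = 1'000'000;  // the 1 MB limit
constexpr uint64_t NEW_MAX_BLOCK_SIZE = 8'000'000;  // some larger limit

// Flag-day rule mirroring "if (blocknumber > 115000)": the old limit
// applies up to and including FORK_HEIGHT, the larger one afterwards.
// Nodes carrying this rule before activation stay in consensus across
// the boundary; nodes without it fork off at FORK_HEIGHT + 1.
uint64_t GetMaxBlockSize(int64_t blocknumber) {
    if (blocknumber > FORK_HEIGHT)
        return NEW_MAX_BLOCK_SIZE;
    return OLD_MAX_BLOCK_SIZE;
}
```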

So now,

If Satoshi himself "never really gave block size limit much weight"  (he assumed scaling was an obvious need that would happen quickly and easily), why are a group of developers refusing to scale the protocol... while simultaneously creating a tool that will generate massive income by moving transactions off the block chain, and into their exclusive transaction processing system (Lightening Network)?  Is it any wonder they were given nearly $50 million in VC funding when VC's realized they just took over Bitcoin transaction processing?

Is this not blatantly changing the design and purpose Satoshi gave to Bitcoin (to freely scale to massive sizes, to support on-chain transaction needs).  This seems to be of grave concern, no?

-B-


----------------------------------------
----------------------------------------
----------------------------------------
Satoshi Nakamoto gets debated because crypto users are desperate to uncover his identity. Everyone wants to know who Satoshi Nakamoto really is and to thank him for everything he has done, because without him there would be no cryptocurrency. Even me: if I ever had the chance to meet Satoshi Nakamoto, I would thank him until the end of my days, and even that would not be enough.

cosmicrays
Newbie
*
Offline Offline

Activity: 32
Merit: 0


View Profile
July 24, 2020, 03:54:02 PM
 #326

I think it was obvious from the very beginning. Bitcoin and crypto in general are literally an ocean of possibilities and use cases, including scaling.
Folio
Member
**
Offline Offline

Activity: 76
Merit: 35


View Profile
August 15, 2020, 01:11:45 PM
 #327

In the last week I have read a lot of articles about this block-size issue.
I have read very good arguments from both the yes side and the no side. One hour I thought "good, they saved bitcoin's decentralization" and the next hour I thought "damn, this greedy group has destroyed Satoshi's vision".

There are Satoshi's original words, which clearly indicate that the block size limit was just a temporary fix and that he wanted bitcoin to outperform Visa.

Now I am sick of this topic and I hope I can get it out of my head.

If Satoshi is still alive, as I believe he is, and reads this, I say to him: this issue has created a very big and ugly division in the bitcoin world, and you should address it. Move a few coins from your wallet and sign a message declaring, yourself, what has to be done about the block size. Only that would stop this conflict.
You can do it safely and effectively, so do it.
sgbett
Legendary
*
Offline Offline

Activity: 2576
Merit: 1087



View Profile
August 18, 2020, 09:40:33 AM
Merited by hv_ (2)
 #328

In the last week I have read a lot, a lot of articles regarding this block-size issue.
...snip...
You can do it safely and effectively, so do it.

N-N-N-Necropost!  Grin

The block-size debate is done. If you like 1 MB, you stick with BTC; if you like no limit, you go with BSV. There's nothing left to debate!

Satoshi has explained exactly what should happen, but "proof of satoshi" requires a lot more than moving coins.

All that proves is that someone with the keys for those coins, moved the coins.

"A purely peer-to-peer version of electronic cash would allow online payments to be sent directly from one party to another without going through a financial institution" - Satoshi Nakamoto
*my posts are not investment advice*
hv_
Legendary
*
Offline Offline

Activity: 2506
Merit: 1055

Clean Code and Scale


View Profile WWW
August 20, 2020, 08:46:13 PM
 #329

...snip...

Satoshi has explained exactly what should happen, but "proof of satoshi" requires a lot more than moving coins.

All that proves is that someone with the keys for those coins, moved the coins.

Correct.

Nothing to add

BitcoinFX
Legendary
*
Offline Offline

Activity: 2646
Merit: 1720


https://youtu.be/DsAVx0u9Cw4 ... Dr. WHO < KLF


View Profile WWW
August 20, 2020, 10:43:49 PM
 #330

Lots to add ...

Hal Finney knew it ... December 30, 2010 ...

...snip... Bitcoin itself cannot scale to have every single financial transaction in the world be broadcast to everyone and included in the block chain. There needs to be a secondary level of payment systems which is lighter weight and more efficient. Likewise, the time needed for Bitcoin transactions to finalize will be impractical for medium to large value purchases.

...snip...

Imagine all those BSV folks thinking that unlimited data storage in addition to transactions is going to work out well long-term.  Cheesy

Imagine trying to store the entire internet in every Bitcoin block. Anyone involved with BSV is an absolute complete and utter total plank.

...snip...

...

Signing a Verifiable message with early BTC addresses associated with Satoshi would prove a darn sight more than Craig Wright has to-date.

Also, Satoshi has a PGP Key ...

...snip...

It would be easy for the real Satoshi Nakamoto to prove identity using Bitcoin and/or PGP with signed messages and other evidence.

For future reference, here's my public key.  It's the same one that's been there since the bitcoin.org site first went up in 2008.  Grab it now in case you need it later.

http://www.bitcoin.org/Satoshi_Nakamoto.asc

For some unknown reason the key is not currently hosted on bitcoin.org; however, the original key is hosted on this forum here:

- https://bitcointalk.org/Satoshi_Nakamoto.asc
and
- https://web.archive.org/web/20110228054007/http://www.bitcoin.org/Satoshi_Nakamoto.asc

Satoshi Nakamoto
E-mail: satoshin@gmx.com (the same email address as the bitcoin whitepaper)

Public Key
Key ID: 18C09E865EC948A1

...snip...

BSV is NOT Bitcoin and Craig Wright is NOT Satoshi.

- https://seekingsatoshi.weebly.com/fraud-timeline.html

"Bitcoin OG" 1JXFXUBGs2ZtEDAQMdZ3tkCKo38nT2XSEp | Bitcoin logo™ Enforcer? | Bitcoin is BTC | CSW is NOT Satoshi Nakamoto | I Mine BTC, LTC, ZEC, XMR and GAP | BTC on Tor addnodes Project | Media enquiries : Wu Ming | Enjoy The Money Machine | "You cannot compete with Open Source" and "Cryptography != Banana" | BSV and BCH are COUNTERFEIT.
hv_
Legendary
*
Offline Offline

Activity: 2506
Merit: 1055

Clean Code and Scale


View Profile WWW
August 21, 2020, 12:39:21 PM
 #331

That would only add proof of panic, like a true big blocker.

  Grin

jpnl0006
Jr. Member
*
Offline Offline

Activity: 480
Merit: 4


View Profile
August 21, 2020, 05:03:52 PM
 #332

Can someone explain to me why there is any debate when Nakamoto himself said:

...snip...

-B-


The blockchain network is much more robust than the Visa network. I see the Bitcoin network as the way forward, as people are more confident in transactions involving bitcoin than in Visa, and I expect the Bitcoin network to get further reviews and upgrades that help it scale in the future.
Cryptoreflector_666
Member
**
Offline Offline

Activity: 728
Merit: 24


View Profile
August 21, 2020, 06:08:17 PM
 #333

Can someone explain to me why there is any debate when Nakamoto himself said:

...snip...

-B-


The blockchain network is much more robust than the Visa Network. I see the Blockchain or Bitcoin newtork is the way forward as people are more confident in the transaction that concerns bitcoin rather than Visa and I see  the bitcoin network having more reviews and upgrades to help the network scale further even in the future.

In my opinion, bitcoin is much slower than Visa, but you remain anonymous. It seems to me that both bitcoin and Visa should remain in the world, because they essentially serve different functions, and different functions call for different solutions.

hv_
Legendary
*
Offline Offline

Activity: 2506
Merit: 1055

Clean Code and Scale


View Profile WWW
August 23, 2020, 06:11:11 PM
 #334

...snip...

In my opinion, bitcoin is much slower than a visa, but you remain anonymous. It seems to me that both bitcoin and visa should remain in the world, because they perform different functions in essence. There are different functions and there should be different ways to solve them.

Nope, not anonymous at all: everything is traceable by design, for the sake of trust in a public ledger.

suryogandul
Full Member
***
Offline Offline

Activity: 550
Merit: 100



View Profile
August 23, 2020, 09:22:12 PM
 #335

Bitcoin does have bigger scaling potential than Visa. But remember that bitcoin cannot yet enjoy the freedom Visa has in all things: some countries still prohibit the use of bitcoin, so for now Visa still operates at the larger scale.

Sadlife
Sr. Member
****
Offline Offline

Activity: 1400
Merit: 269



View Profile
August 24, 2020, 02:34:05 AM
 #336

It's possible that Bitcoin could scale beyond what PayPal currently processes, but there's a problem: the halving mechanism. Without sufficient incentives, miners won't run nodes to mine and process transactions,
and the recent drop in hashrate is because people have given up in the face of Bitcoin's mining difficulty and the lower rewards for processing blocks.

0nline
Copper Member
Newbie
*
Offline Offline

Activity: 28
Merit: 0


View Profile
September 08, 2020, 08:15:17 PM
 #337

BTCBTC Good luck, you will surely need it. Since you don't use technical arguments or anything, I'd kindly ask you not to derail this thread further. There is at least one person in it who seems decent and worth talking to.