Author Topic: Satoshi Nakamoto: "Bitcoin can scale larger than the Visa Network"  (Read 18268 times)
adamstgBit
Legendary
Activity: 1904
Merit: 1037
Trusted Bitcoiner
March 08, 2016, 08:59:33 PM
#121

but the block would get orphaned?
No, why would they?
because the one TX the miner included is obviously an attack, trying to get the other miners to run in circles validating for minutes while he starts on the new block?

let's ignore the fact that miners have a strong incentive to keep the network running smoothly... and think about this WILD speculation that with 2MB blocks a miner can block all mining on the network by creating extremely complex transactions that take a long time to validate.

why can't the other miners simply orphan these blocks?

franky1
Legendary
Activity: 4186
Merit: 4406
March 08, 2016, 09:09:07 PM
#122

So, one very large transaction with numerous sigops leads to the quadratic growth.  Hmm, so blindly increasing the block size is asking for trouble.  How many of these troublesome transactions have been launched against us recently?  Or is it an unexploited vulnerability?
Perhaps we could increase the block size *and* constrain sigops/transaction until we get SegWit out the door?
Well the problem is only minor at a 1 MB block size limit. Segwit should scale it down and make it linear. It should be released in April (from the initial estimates), so I don't see how you plan to deploy the block size limit and a sigops limitation in <2 months.

and now you see why 2MB+segwit released in April with a 6-month grace period is not a problem

i love it when lauda debunks his own doomsday scenario

having the 2MB block limit included in April's release is easy. plus it incentivises more people to download the April version, ensuring a real chance of no contention, instead of having upgrades every couple of months (e.g. March, April, July), which just messes with the community too much

I DO NOT TRADE OR ACT AS ESCROW ON THIS FORUM EVER.
Please do your own research & respect what is written here as both opinion & information gleaned from experience. many people replying with insults but no on-topic content substance, automatically are 'facepalmed' and yawned at
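A minimal illustration (not from the thread) of the cost model behind the quadratic-sigops point quoted above: under the legacy, pre-SegWit signature-hashing scheme each input's signature check re-hashes roughly the whole transaction, so a transaction with N inputs costs on the order of N × tx_size ≈ N² bytes hashed, while a BIP143-style scheme is roughly linear. The byte counts below are rough assumptions, not exact serialization sizes.

Code:
# Rough, assumed sizes -- illustrative only, not exact Bitcoin serialization.
BYTES_PER_INPUT = 148
BYTES_PER_OUTPUT = 34

def legacy_sighash_bytes(num_inputs, num_outputs=1):
    """Approximate bytes hashed to check every signature of one transaction
    under the legacy scheme: each input hashes ~the whole transaction."""
    tx_size = 10 + num_inputs * BYTES_PER_INPUT + num_outputs * BYTES_PER_OUTPUT
    return num_inputs * tx_size            # quadratic in num_inputs

def linear_sighash_bytes(num_inputs, num_outputs=1):
    """Under a BIP143-style scheme, shared data is hashed once and each input
    adds only a roughly constant amount (assumed 200 B): linear in num_inputs."""
    tx_size = 10 + num_inputs * BYTES_PER_INPUT + num_outputs * BYTES_PER_OUTPUT
    return tx_size + num_inputs * 200

for n in (100, 1_000, 10_000):
    print(f"{n:>6} inputs: {legacy_sighash_bytes(n)/1e6:>10.1f} MB hashed (legacy) "
          f"vs {linear_sighash_bytes(n)/1e6:.1f} MB (linear)")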
David Rabahy
Hero Member
Activity: 709
Merit: 501
March 08, 2016, 09:14:26 PM
#123

So, one very large transaction with numerous sigops leads to the quadratic growth.  Hmm, so blindly increasing the block size is asking for trouble.  How many of these troublesome transactions have been launched against us recently?  Or is it an unexploited vulnerability?
Perhaps we could increase the block size *and* constrain sigops/transaction until we get SegWit out the door?
Well the problem is only minor at a 1 MB block size limit. Segwit should scale it down and make it linear. It should be released in April (from the initial estimates), so I don't see how you plan to deploy the block size limit and a sigops limitation in <2 months.
Another thank you Lauda:  If only more people took the time to make clear their positions like you do.  Yep, I see it; with SegWit so close it would be disruptive to overlap it with a block size limit increase combined with a sigops limitation.  I do feel it was a missed opportunity not to provide a modest block increase (even without a sigops limitation) many months ago but there's no going back now.  *Hopefully* SegWit will come out on time, functioning well, and be embraced quickly without a lot of angst.
Lauda
Legendary
Activity: 2674
Merit: 2965
Terminated.
March 08, 2016, 09:23:45 PM
#124

Another thank you Lauda:  If only more people took the time to make clear their positions like you do.  Yep, I see it; with SegWit so close it would be disruptive to overlap it with a block size limit increase combined with a sigops limitation.  I do feel it was a missed opportunity not to provide a modest block increase (even without a sigops limitation) many months ago but there's no going back now.  *Hopefully* SegWit will come out on time, functioning well, and be embraced quickly without a lot of angst.
You're very welcome. I do hope that there will be a hard fork proposal after Segwit which will gain consensus. However, if Segwit is adopted quickly by both the miners and users we should see a good increase in transaction capacity that should hopefully get us through 2016. The problem caused in this conflict is that there are a lot of people who aren't willing to listen to reason and facts. Time should not be wasted on them.

why can't the other miners simply orphan these blocks?
Exactly how do you plan on "simply" detecting these blocks and why would somebody orphan them? How do you classify this as an attack; was the TX that F2Pool did to clear the spam also an attack?

"The Times 03/Jan/2009 Chancellor on brink of second bailout for banks"
😼 Bitcoin Core (onion)
David Rabahy
Hero Member
Activity: 709
Merit: 501
March 08, 2016, 09:23:49 PM
#125

So, one very large transaction with numerous sigops leads to the quadratic growth.  Hmm, so blindly increasing the block size is asking for trouble.  How many of these troublesome transactions have been launched against us recently?  Or is it an unexploited vulnerability?
Perhaps we could increase the block size *and* constrain sigops/transaction until we get SegWit out the door?
Well the problem is only minor at a 1 MB block size limit. Segwit should scale it down and make it linear. It should be released in April (from the initial estimates), so I don't see how you plan to deploy the block size limit and a sigops limitation in <2 months.
and now you see why 2MB+segwit released in April with a 6-month grace period is not a problem

i love it when lauda debunks his own doomsday scenario

having the 2MB block limit included in April's release is easy. plus it incentivises more people to download the April version, ensuring a real chance of no contention, instead of having upgrades every couple of months (e.g. March, April, July), which just messes with the community too much
Well, franky1, that's really interesting (although it could have been said just as well without the provocative words); 2MB + SegWit.  We do want to be careful -- releasing multiple things at the same time can lead to confusion if things don't go perfectly well.  I'd hate to see something going wrong and each side blaming the other.  Also, if there were a need to retract a feature/function then that would likely cause a media stir; something Bitcoin doesn't need any more of at this time.
adamstgBit
Legendary
Activity: 1904
Merit: 1037
Trusted Bitcoiner
March 08, 2016, 09:30:38 PM
Last edit: March 08, 2016, 09:45:44 PM by adamstgBit
#126

why can't the other miners simply orphan these blocks?
Exactly how do you plan on "simply" detecting these blocks and why would somebody orphan them? How do you classify this as an attack; was the TX that F2Pool did to clear the spam also an attack?

something like: if a TX has >10,000 inputs, it's no good?

why not!? I wouldn't mind if F2Pool's tiny blocks were orphaned by other miners. maybe he'll stop making these tiny blocks if they do punish him for it?

oh wait i read that wrong....

again, WHY NOT!? the TX in question WAS an attack, why include it in the blockchain?

if you can't be sure whether it's spam or not (a <$1 TX with a 1-cent fee), fine, include it in the block; but if you are sure it is spam (>10K inputs), why include it?
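A hedged sketch of the kind of local rule adamstgBit is suggesting above (the 10,000-input threshold and all names here are illustrative assumptions, not Bitcoin Core or Classic code): a miner or relay node can simply decline to select transactions above an input-count cutoff as a matter of policy, with no consensus change.

Code:
MAX_INPUTS_POLICY = 10_000   # assumed cutoff from the post; purely local policy

def looks_like_validation_attack(tx):
    """Flag transactions whose enormous input count makes them slow to verify."""
    return len(tx["inputs"]) > MAX_INPUTS_POLICY

def select_for_block(mempool_txs):
    """Build a block template that skips the flagged monsters."""
    return [tx for tx in mempool_txs if not looks_like_validation_attack(tx)]

# usage sketch:
mempool = [{"inputs": list(range(3))}, {"inputs": list(range(12_000))}]
print(len(select_for_block(mempool)))    # -> 1 (the 12,000-input tx is skipped)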

David Rabahy
Hero Member
Activity: 709
Merit: 501
March 08, 2016, 09:50:18 PM
#127

As hard as it might be to see, I really believe the crisis in front of us is one of perception as opposed to anything technical.  Perceptions are manageable while the real work of sorting through the technical issues is taken out of the limelight.

The network is working and it's still relatively cheap (5 cents per TX)
But we have threads that are titled "why is my TX not confirming??" or something to that effect
Newbies are using bitcoin for the first time and are having a hard time, with their BTC tied up, seemingly never to confirm, and they conclude that bitcoin is not all that it's cracked up to be....
is this a problem?

I remember when I was a newbie, I would check over and over waiting for the first 6 confirmations. everything went smoothly, but I didn't fully trust that it would go smoothly; I was afraid my money would get lost or something. slowly my confidence in the system grew as I used it more and understood it more.

not sure I would have been able to build any confidence had I started using bitcoin today....
adamstgBit, you've hit it right on the head.  We must do everything we can to help folks adopt Bitcoin without pain/anxiety.  Our marketing messages need to set expectations appropriately; gone are the days of leading with the misleading "no fees".  It's ok to indicate lower fees than the competition, e.g. wire transfers, VISA, *if* it is indeed true but don't touch the topic if it is not -- if 5¢/transaction is the going fee then transactions of less than say $5, i.e. 1%, make less sense.  Instead lead with our indisputable strengths, e.g. ~1 hour (6 blocks) to guarantee transfers of even large amounts anywhere in the world.  It is just unbelievable that there are wallets out there that don't set an appropriate fee automatically and by default such that transactions get through quickly despite the dynamic environment.  Once a new user grows accustomed they can dig in and find the overrides to try transactions with low/zero fees and see the natural consequences of long delays.
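A small sketch of what "set an appropriate fee automatically" could look like in practice: ask a local node for a fee-rate estimate and scale it by the transaction size. The node command shown (bitcoin-cli estimatefee <blocks>, returning BTC/kB or -1) matches the era's RPC, but the fallback rate and sizes are assumptions; real wallets use their own estimators.

Code:
import subprocess

def estimated_fee_btc(tx_bytes: int, target_blocks: int = 6) -> float:
    """Ask a local node for a BTC/kB fee-rate estimate and scale it to tx size.
    Falls back to an assumed conservative default if the node answers -1."""
    out = subprocess.run(["bitcoin-cli", "estimatefee", str(target_blocks)],
                         capture_output=True, text=True, check=True).stdout
    rate_btc_per_kb = float(out)
    if rate_btc_per_kb <= 0:              # node has no estimate yet
        rate_btc_per_kb = 0.0002          # assumed fallback rate
    return rate_btc_per_kb * tx_bytes / 1000.0

# e.g. a typical 1-input, 2-output transaction of ~226 bytes:
# print(estimated_fee_btc(226))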
David Rabahy
Hero Member
Activity: 709
Merit: 501
March 08, 2016, 09:55:37 PM
#128

why can't the other miners simply orphan these blocks?
Exactly how do you plan on "simply" detecting these blocks and why would somebody orphan them? How do you classify this as an attack; was the TX that F2Pool did to clear the spam also an attack?

something like: if a TX has >10,000 inputs, it's no good?

why not!? I wouldn't mind if F2Pool's tiny blocks were orphaned by other miners. maybe he'll stop making these tiny blocks if they do punish him for it?

oh wait i read that wrong....

again, WHY NOT!? the TX in question WAS an attack, why include it in the blockchain?

if you can't be sure whether it's spam or not (a <$1 TX with a 1-cent fee), fine, include it in the block; but if you are sure it is spam (>10K inputs), why include it?
That's interesting adamstgBit.  *Is* it *always* the case that >10K inputs is indeed spam?  *Is* there *ever* a case where it is not?  Can the same movements be accomplished by splitting it up into multiple transactions to avoid triggering the spam rejection?
adamstgBit
Legendary
Activity: 1904
Merit: 1037
Trusted Bitcoiner
March 08, 2016, 09:59:34 PM
#129

As hard as it might be to see, I really believe the crisis in front of us is one of perception as opposed to anything technical.  Perceptions are manageable while the real work of sorting through the technical issues is taken out of the limelight.

The network is working and it's still relatively cheap (5 cents per TX)
But we have threads that are titled "why is my TX not confirming??" or something to that effect
Newbies are using bitcoin for the first time and are having a hard time, with their BTC tied up, seemingly never to confirm, and they conclude that bitcoin is not all that it's cracked up to be....
is this a problem?

I remember when I was a newbie, I would check over and over waiting for the first 6 confirmations. everything went smoothly, but I didn't fully trust that it would go smoothly; I was afraid my money would get lost or something. slowly my confidence in the system grew as I used it more and understood it more.

not sure I would have been able to build any confidence had I started using bitcoin today....
adamstgBit, you've hit it right on the head.  We must do everything we can to help folks adopt Bitcoin without pain/anxiety.  Our marketing messages need to set expectations appropriately; gone are the days of leading with the misleading "no fees".  It's ok to indicate lower fees than the competition, e.g. wire transfers, VISA, *if* it is indeed true but don't touch the topic if it is not -- if 5¢/transaction is the going fee then transactions of less than say $5, i.e. 1%, make less sense.  Instead lead with our indisputable strengths, e.g. ~1 hour (6 blocks) to guarantee transfers of even large amounts anywhere in the world.  It is just unbelievable that there are wallets out there that don't set an appropriate fee automatically and by default such that transactions get through quickly despite the dynamic environment.  Once a new user grows accustomed they can dig in and find the overrides to try transactions with low/zero fees and see the natural consequences of long delays.
I feel the core dev team is not willing to make the appropriate trade-offs.
I do believe we could have had the 2MB limit in place months ago and all this pain/anxiety avoided.
but the Core dev team doesn't seem as concerned with end users' pain/anxiety as they are with theoretically more elegant scaling.
I feel they are programmers not qualified to manage and direct the project, and they are making bad decisions; maybe the decisions they make are technically more elegant, but they do not create a more elegant user experience. they are convinced that the users don't matter because it's Bitcoin's birthright to replace central banking.

David Rabahy
Hero Member
Activity: 709
Merit: 501
March 08, 2016, 10:02:06 PM
Last edit: March 08, 2016, 10:22:50 PM by David Rabahy
#130

Listen everyone: *** Bitcoin is a fantastic thing!!! ***  Our internal debates are seen; they are taken as a measure of our ability to provide governance or not.  There are folks lurking in here; picking up on the mood and reporting on it.  Every posting should take this into account.

Championing Bitcoin first before espousing one technical aspect or another will help.

Avoiding the practice of nitpicking or worse will help.

Compromise and appeasement are valuable tools.
franky1
Legendary
Activity: 4186
Merit: 4406
March 08, 2016, 10:04:08 PM
Last edit: March 08, 2016, 10:18:31 PM by franky1
#131

Well, franky1, that's really interesting (although it could have been said just as well without the provocative words); 2MB + SegWit.  We do want to be careful -- releasing multiple things at the same time can lead to confusion if things don't go perfectly well.  I'd hate to see something going wrong and each side blaming the other.  Also, if there were a need to retract a feature/function then that would likely cause a media stir; something Bitcoin doesn't need any more of at this time.

then we move on to the other debate. blockstream devs not only want segwit in April but some other changes within the same release.

one of them actually debunks the "hard fork is doomsday" line.. because it, itself, is a hard fork.
by which I mean Luke Jr's proposal for code to be added in April with just a 3-month grace period (activating at block 420,000) in an attempt to unnaturally drop the difficulty and let miners solve blocks more easily. basically forcing blocks to be made in 5 minutes instead of 10, to allow an extra 2 weeks of similar income before the natural biweekly difficulty adjustments raise the difficulty.

which, apart from being just a feature with no long-term purpose, is also a suggestion that a hard fork can be added in April with a 3-month grace, totally contradicting the same devs who say that a hard fork which is not on the roadmap needs a 12-month grace period..

but the ultimate thing that defies logic is the solid faith that all code done by blockstream is perfect and any proposal from outside blockstream must be ruled out, veto'd, or debated to cause contention and delay.

like I said, a 2MB buffer is like the 1MB buffer in 2013. it won't cause a doubling of block size overnight, as the miners will have preferential settings to grow slowly within the hard limit.

the code itself is simple to implement, and the only thing that could cause harm would be contention from those foolhardily refusing to upgrade when the majority have actually already upgraded.

that's why it's better to have the code available to all and then let the community decide if they want it..
if no one wants it, it doesn't activate; it's that simple. better that than avoiding and delaying the code and causing the very contention they doomsday-speak about by never letting the community have access to it (a self-fulfilling prophecy they created)

now that has been said, going back to the other stuff you have said:
I would like to see the results of the tests you intend to do, as real results always outweigh opinion and guesswork, so your tests can actually help out a lot of people

I DO NOT TRADE OR ACT AS ESCROW ON THIS FORUM EVER.
Please do your own research & respect what is written here as both opinion & information gleaned from experience. many people replying with insults but no on-topic content substance, automatically are 'facepalmed' and yawned at
David Rabahy
Hero Member
Activity: 709
Merit: 501
March 08, 2016, 10:12:07 PM
#132

I feel the core dev team is not willing to make the appropriate trade-offs.
I do believe we could have had the 2MB limit in place months ago and all this pain/anxiety avoided.
but the Core dev team doesn't seem as concerned with end users' pain/anxiety as they are with theoretically more elegant scaling.
I feel they are programmers not qualified to manage and direct the project, and they are making bad decisions; maybe the decisions they make are technically more elegant, but they do not create a more elegant user experience. they are convinced that the users don't matter because it's Bitcoin's birthright to replace central banking.
Bitcoin is up and running well (assuming one uses reasonable fees, ~5¢/transaction) despite an onslaught of ill-intentioned persons.  Shame on all of us for not adjusting our marketing messages earlier to set expectations better.  Find me another system that has withstood as much and moves millions of dollars a day.

Who pays the Core Dev Team?  Are they doing all their work for Bitcoin for free?

My sincerest heartfelt praise and admiration go out to Core Dev Team members!  They are giving us something I couldn't do myself.  That said, I do think there's room for improvement on the handling of perceptions front.  I sincerely believe they have the overall good of Bitcoin *and* the users of it foremost in their minds.

Lauda has done a yeoman's job representing the positions; thank goodness someone has the patience.
David Rabahy
Hero Member
Activity: 709
Merit: 501
March 08, 2016, 10:17:33 PM
#133

I would like to see the results of the tests you intend to do, as real results always outweigh opinion and guesswork, so your tests can actually help out a lot of people
Although I am willing to participate in a test to measure the SigOp quadratic growth, I am not able to do so without help.  If someone will build the code then I will install it, run it, and report back.
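A rough sketch of the sort of test being volunteered for here (everything in it is an assumption for illustration, not code from any client): synthesize a fake serialized transaction for each input count and time a stand-in for legacy verification that double-SHA256es the whole transaction once per input. Doubling the input count should roughly quadruple the time, which is the quadratic growth in question.

Code:
import hashlib, os, time

BYTES_PER_INPUT = 148                  # assumed rough size of one legacy input

def naive_verify_seconds(num_inputs: int) -> float:
    """Time a stand-in for legacy verification: hash the whole serialized
    transaction once for every input it spends."""
    tx = os.urandom(10 + num_inputs * BYTES_PER_INPUT)    # fake serialized tx
    start = time.perf_counter()
    for _ in range(num_inputs):
        hashlib.sha256(hashlib.sha256(tx).digest()).digest()
    return time.perf_counter() - start

for n in (500, 1_000, 2_000, 4_000):
    print(f"{n:>5} inputs: {naive_verify_seconds(n):.3f} s")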
adamstgBit
Legendary
Activity: 1904
Merit: 1037
Trusted Bitcoiner
March 08, 2016, 10:41:37 PM
#134

I feel the core dev team is not willing to make the appropriate trade-offs.
I do believe we could have had the 2MB limit in place months ago and all this pain/anxiety avoided.
but the Core dev team doesn't seem as concerned with end users' pain/anxiety as they are with theoretically more elegant scaling.
I feel they are programmers not qualified to manage and direct the project, and they are making bad decisions; maybe the decisions they make are technically more elegant, but they do not create a more elegant user experience. they are convinced that the users don't matter because it's Bitcoin's birthright to replace central banking.
Bitcoin is up and running well (assuming one uses reasonable fees, ~5¢/transaction) despite an onslaught of ill-intentioned persons.  Shame on all of us for not adjusting our marketing messages earlier to set expectations better.  Find me another system that has withstood as much and moves millions of dollars a day.

Who pays the Core Dev Team?  Are they doing all their work for Bitcoin for free?

My sincerest heartfelt praise and admiration go out to Core Dev Team members!  They are giving us something I couldn't do myself.  That said, I do think there's room for improvement on the handling of perceptions front.  I sincerely believe they have the overall good of Bitcoin *and* the users of it foremost in their minds.

Lauda has done a yeoman's job representing the positions; thank goodness someone has the patience.

give it a few more days of debate, you'll see your optimism and praise will turn to anger and disgust. LOL  Grin

you're a really nice guy, quite refreshing!

LEAVE THIS PLACE IMMEDIATELY, this is for your own good.

Lauda
Legendary
Activity: 2674
Merit: 2965
Terminated.
March 08, 2016, 10:48:32 PM
#135

That's interesting adamstgBit.  *Is* it *always* the case that >10K inputs is indeed spam?  *Is* there *ever* a case where it is not?  Can the same movements be accomplished by splitting it up into multiple transactions to avoid triggering the spam rejection?
These limitations are very bad. Wasn't Bitcoin supposed to be censorship free? Who gets to decide what kind of transactions we are going to limit? As an example, the sigops limitation that Gavin implemented in Classic is not a solution of any kind. For example, if we had confidential transactions today they would not work due to this. There are so many potential use cases that it is nearly impossible for us to consider everything.

Who pays the Core Dev Team?  Are they doing all their work for Bitcoin for free? My sincerest heartfelt praise and admiration go out to Core Dev Team members!  They are giving us something I couldn't do myself.  That said, I do think there's room for improvement on the handling of perceptions front.  I sincerely believe they have the overall good of Bitcoin *and* the users of it foremost in their minds.
They're a group of volunteers. Some are employed by MIT (Wladimir IIRC), and some are employed by Blockstream (Maxwell, Wuille) and such. However, most of them are just volunteers from what I know.

Lauda has done a yeoman's job representing the positions; thank goodness someone has the patience.
I try my best, as long as the other person (especially when lacking knowledge) is willing to listen to reason, facts, data and such.

"The Times 03/Jan/2009 Chancellor on brink of second bailout for banks"
😼 Bitcoin Core (onion)
adamstgBit
Legendary
Activity: 1904
Merit: 1037
Trusted Bitcoiner
March 08, 2016, 10:51:36 PM
#136

why can't the other miners simply orphan these blocks?
Exactly how do you plan on "simply" detecting these blocks and why would somebody orphan them? How do you classify this as an attack; was the TX that F2Pool did to clear the spam also an attack?

something like: if a TX has >10,000 inputs, it's no good?

why not!? I wouldn't mind if F2Pool's tiny blocks were orphaned by other miners. maybe he'll stop making these tiny blocks if they do punish him for it?

oh wait i read that wrong....

again, WHY NOT!? the TX in question WAS an attack, why include it in the blockchain?

if you can't be sure whether it's spam or not (a <$1 TX with a 1-cent fee), fine, include it in the block; but if you are sure it is spam (>10K inputs), why include it?
That's interesting adamstgBit.  *Is* it *always* the case that >10K inputs is indeed spam?  *Is* there *ever* a case where it is not?  Can the same movements be accomplished by splitting it up into multiple transactions to avoid triggering the spam rejection?

someone can probably come up with some highly speculative scenario where such a transaction could potentially be useful
is that excuse a good reason to not protect the network against such an attack?

SpiryGolden
Hero Member
Activity: 812
Merit: 500
March 08, 2016, 10:59:06 PM
#137

Who pays the Core Dev Team?  Are they doing all their work for Bitcoin for free? My sincerest heartfelt praise and admiration go out to Core Dev Team members!  They are giving us something I couldn't do myself.  That said, I do think there's room for improvement on the handling of perceptions front.  I sincerely believe they have the overall good of Bitcoin *and* the users of it foremost in their minds.
They're a group of volunteers. Some are employed by MIT (Wladimir IIRC), and some are employed by Blockstream (Maxwell, Wuille) and such. However, most of them are just volunteers from what I know.



They are mostly Blockstream members. They are on the payroll of VC bankers. Before somebody calls me a liar: check out https://www.blockstream.com/team/ and then compare with https://bitcoin.org/en/development#bitcoin-core-contributors . I don't come into a conversation without proof; I hope that's good enough. There will never be a 2MB hard fork; they will just give us breadcrumbs to keep us quiet for a bit while they build their own solutions, so people will go in waves to those solutions rather than stay on the main chain, transforming it into a settlement layer, obsoleting the P2P payment system, and taking it off-chain to their solution.

That's the last thing I had to say. No more involvement in this subject; it drains me of energy and of patience. All I can do is keep my Bitcoin Classic nodes up and hope others will slowly join as more become aware of the breadcrumbs we were given. Dramatizing the whole block size increase as dangerous, without any real statistical answer, while they themselves forked when testing SegWit, says a lot.

Also, there are a lot of Bitcoin Core supporters here, and the whole bitcointalk board is 100% Bitcoin Core: not offering people alternative solutions, and advertising every update of Bitcoin Core while never presenting the alternatives with a proper explanation, proves again the control of the masses and keeps them blind to recent Bitcoin discussion. It defeats the whole idea of community in a free & open-source project made by people for people. So Bitcoin Classic must somehow pay for advertising to inform people about the news and their right to choose, because the other channels are trying to hide that right to choose as much as possible.

Thanks,
I wish both sides good luck and may the best chain win.

adamstgBit
Legendary
Activity: 1904
Merit: 1037
Trusted Bitcoiner
March 08, 2016, 11:09:21 PM
#138


These limitations are very bad. Wasn't Bitcoin supposed to be censorship free? Who gets to decide what kind of transactions we are going to limit?


this is a prime example of ivory tower thinking getting in the way of real solutions that matter to real people.


As an example, the sigops limitation that Gavin implemented in Classic is not a solution of any kind. For example, if we had confidential transactions today they would not work due to this. There are so many potential use cases that it is nearly impossible for us to consider everything.

this is a prime example of bad project management; we can't achieve any kind of consensus without first agreeing on what the main goals of the project are.

of course Gavin is doing things differently in Classic; his primary goal is to scale the blockchain as much as possible.

this is NOT the main goal of Core, which is fine, their second-layer solution is fine, but it's simply not what the majority want...

Gleb Gamow
In memoriam
VIP
Legendary
Activity: 1428
Merit: 1145
March 08, 2016, 11:11:50 PM
#139

I would be willing to run a full node on a testnet to see if my system could handle larger blocks, i.e. verify a large block in less than the average time between blocks.

I have a question:  The total amount of work to verify N 1MB blocks is about the same as a single N-MB block, right?  For example, 32 1MB blocks take about the same amount of work to verify as a single 32MB block, right?  Just please ignore the live delivery of blocks for the moment.  Or is there some advantage to large blocks where fewer headers have to be processed?  Imagine a full node was off the air for a day or two and is just trying to catch up as fast as possible.  What block size facilitates that best?

To me it seems fees tend to be inversely proportional to block size, i.e. with smaller blocks fees rise as folks compete to get into blocks, with larger blocks fees get smaller with less competition to get into blocks.  What does it cost a bad actor (if there is truly such a thing in this realm) to clog up the works?  I suppose we are looking for the right size of block to cause them to expend their resources most quickly.  Make the block size very small and the fee competition would rise high enough to deplete the bad actor very fast; everyone suffers higher fees until they are run out of town (so to speak).  Hmm, but if the block size is very small then even when there aren't any bad actors on the scene, regular legit users would be forced to compete.  At the other end of the spectrum; make the block size very large and with such low competition fees would diminish.  The real question here is what happens to the fees/MB across the spectrum of block sizes.

Is there *anyone* preferring a smaller than 1MB block size right now?  I haven't heard of any but you never know.  I do think some miners do artificially constrain the block size they produce to like 900KB or so (I'm not sure of their motivation).  Even if the block size were increased then such miners could still constrain the ones they produce, right?

A transaction cannot span multiple blocks, right?  I suppose the block size creates a functional limit on transaction sizes.  Or is the size of a transaction constrained some other way?

Odd! I find this post more similar to Satoshi Nakamoto's writings than the letter in the OP supposedly sent to Mike Hearn by SN, which I contend was written by MH. (see my earlier post in this thread)
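On the "same total work?" question in the quoted post: for ordinary small transactions the verification work is roughly proportional to total bytes either way, so N 1MB blocks and one N-MB block come out about the same, but under the legacy sighash rules a single pathological transaction filling a big block costs vastly more than the same bytes split into small transactions. A back-of-envelope comparison under assumed sizes:

Code:
BYTES_PER_INPUT = 148          # assumed rough size of a legacy input

def bytes_hashed(tx_size, num_inputs):
    # legacy rule of thumb: each input re-hashes roughly the whole transaction
    return num_inputs * tx_size

ONE_MB, THIRTY_TWO_MB = 1_000_000, 32_000_000

# 32 blocks of 1 MB, each full of ~400-byte, 2-input transactions
small_tx_work = 32 * (ONE_MB // 400) * bytes_hashed(400, 2)

# one 32 MB block dominated by a single transaction spending ~216k inputs
huge_tx_work = bytes_hashed(THIRTY_TWO_MB, THIRTY_TWO_MB // BYTES_PER_INPUT)

print(f"{small_tx_work / 1e9:.2f} GB hashed vs {huge_tx_work / 1e12:.1f} TB hashed")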
adamstgBit
Legendary
Activity: 1904
Merit: 1037
Trusted Bitcoiner
March 08, 2016, 11:29:39 PM
#140

Who pays the Core Dev Team?  Are they doing all their work for Bitcoin for free? My sincerest heartfelt praise and admiration go out to Core Dev Team members!  They are giving us something I couldn't do myself.  That said, I do think there's room for improvement on the handling of perceptions front.  I sincerely believe they have the overall good of Bitcoin *and* the users of it foremost in their minds.
They're a group of volunteers. Some are employed by MIT (Wladimir IIRC), and some are employed by Blockstream (Maxwell, Wuille) and such. However, most of them are just volunteers from what I know.

They are mostly Blockstream members. They are on the payroll of VC bankers. Before somebody calls me a liar: check out https://www.blockstream.com/team/ and then compare with https://bitcoin.org/en/development#bitcoin-core-contributors . I don't come into a conversation without proof; I hope that's good enough. There will never be a 2MB hard fork; they will just give us breadcrumbs to keep us quiet for a bit while they build their own solutions, so people will go in waves to those solutions rather than stay on the main chain, transforming it into a settlement layer, obsoleting the P2P payment system, and taking it off-chain to their solution.

That's the last thing I had to say. No more involvement in this subject; it drains me of energy and of patience. All I can do is keep my Bitcoin Classic nodes up and hope others will slowly join as more become aware of the breadcrumbs we were given. Dramatizing the whole block size increase as dangerous, without any real statistical answer, while they themselves forked when testing SegWit, says a lot.

Also, there are a lot of Bitcoin Core supporters here, and the whole bitcointalk board is 100% Bitcoin Core: not offering people alternative solutions, and advertising every update of Bitcoin Core while never presenting the alternatives with a proper explanation, proves again the control of the masses and keeps them blind to recent Bitcoin discussion. It defeats the whole idea of community in a free & open-source project made by people for people. So Bitcoin Classic must somehow pay for advertising to inform people about the news and their right to choose, because the other channels are trying to hide that right to choose as much as possible.

Thanks,
I wish both sides good luck and may the best chain win. 

+1

@David Rabahy, this is all true, read it and understand why we are out of patience.
