Bitcoin Forum
June 22, 2024, 07:28:30 PM *
News: Latest Bitcoin Core release: 27.0 [Torrent]
 
Pages: « 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 [16] 17 18 »  All
Author Topic: Segregated witness - The solution to Scalability (short term)?  (Read 23096 times)
The00Dustin
Hero Member
*****
Offline Offline

Activity: 807
Merit: 500


View Profile
December 23, 2015, 05:57:10 PM
 #301

Is there any way to stop or block segregated witness? From what I understand, it hits testnet in two days... Sad
No one answered your question, so I will.  The answer is yes, all you have to do is find unfixable problems with it and exploit them on testnet.  Doing the same with fixable problems will delay it at the very least while also ensuring it is more secure if it is ultimately deployed live.  Several potential attack vectors have been discussed in this thread, if any of them truly exist, you can "take advantage" of them on testnet and protect bitcoin at the same time.
BitUsher
Legendary
*
Offline Offline

Activity: 994
Merit: 1034


View Profile
December 23, 2015, 06:12:47 PM
 #302

Is there any way to stop or block segregated witness? From what I understand, it hits testnet in two days... Sad
No one answered your question, so I will.  The answer is yes, all you have to do is find unfixable problems with it and exploit them on testnet.  Doing the same with fixable problems will delay it at the very least while also ensuring it is more secure if it is ultimately deployed live.  Several potential attack vectors have been discussed in this thread, if any of them truly exist, you can "take advantage" of them on testnet and protect bitcoin at the same time.

Yes, that is another solution: potentially either stop SepSig or improve it as solutions are discovered. This should naturally be the first course of action, since all other implementations can learn from and/or use SepSig as well; it is a valuable solution to many problems. I highly recommend everyone test and attack SepSig on testnet for the betterment of our ecosystem. Please post any results in this thread and on the mailing lists.

The second solution I already mentioned can be carried out simultaneously if you wish -
Is there any way to stop or block segregated witness? From what I understand, it hits testnet in two days... Sad
This is rather simple to answer. Gavin is in support of SepSig (although he prefers a hardfork), and Hearn is no longer interested in XT or in working with Bitcoin directly. So your first task is to amass a separate group of developers to maintain a GitHub fork, and then to convince enough nodes and miners to adopt your version, whether it's Bitcoin XT, Bitcoin Unlimited, or something else. I encourage you to do this, as I think a diversity of choices and implementations is a good thing for bitcoin. I also encourage any supporters of alternate implementations to be proud of their work and not to play the victim if their implementation doesn't get adopted en masse.

This discussion has sometimes gotten heated and bitter, but absent evidence to the contrary I will assume good faith, and I welcome other ideas and competing development teams that we can all learn and share from.
johnyj
Legendary
*
Offline Offline

Activity: 1988
Merit: 1012


Beyond Imagination


View Profile
December 24, 2015, 12:11:07 AM
Last edit: December 24, 2015, 12:29:20 AM by johnyj
 #303

I understand it and don't hate the core developers. Seg wit is a thoughtful solution, but not an elegant one. IMHO, it's just more spackle over the cracks and a further delay to increasing block size.  

Good observation

In change management, the first question is not whether a change is good or bad, but why we must make it. The motivation behind a change is always worth examining.

It seems SegWit was invented to temporarily circumvent the hard-coded 1MB block size limit, so that traffic can keep growing without triggering a fee event.

So the next question is: why are we so afraid of a fee event?

Jeff gave some answer here:
Quote
*Key observation:   A Bitcoin Fee Event (see def. at top) is an Economic
Change Event.*

An Economic Change Event is a period of market chaos, where large changes
to prices and sets of economic actors occurs over a short time period.

A Fee Event is a notable Economic Change Event, where a realistic
projection foresees higher fee/KB on average, pricing some economic actors
(bitcoin projects and businesses) out of the system.

I don't think this so-called "chaos" is convincing enough, so the next question is: who are these bitcoin projects and businesses, and is bitcoin's goal to benefit average people or to serve these projects/businesses?

Although institutions have large capital and influence in the industry, I don't think bitcoin's purpose is to become another payment network for banks (banks being the highest form of business; a business large enough will start to do banking)

In fact, businesses can always pass the fee cost on to customers, and those customers are not fee sensitive (statistics show that the majority of users come to bitcoin for long-term value storage and high-value international remittance, neither of which is very sensitive to fees or transaction frequency), so a higher fee will not affect business either. And large businesses can establish clearing channels to dramatically reduce fee costs; this is common practice in the financial world, and they don't need to change bitcoin's architecture to do it

So I think the motivation behind this architectural change to bitcoin is still not convincing enough. Since no one has seen a fee event, it might not be the "chaos" that Jeff predicts; people must see it with their own eyes to be convinced that it is a problem that really needs to be solved. What if it is not a problem at all? Banks are still closed during weekends and holidays; is that a problem for our financial system?

Even if a fee event negatively affects the majority of users' experience, the way to future scaling should still follow Satoshi's vision as much as possible. Anyway, this is his invention; no one except him has the right to change it into something totally different

jbreher
Legendary
*
Offline Offline

Activity: 3038
Merit: 1660


lose: unfind ... loose: untight


View Profile
December 24, 2015, 12:21:58 AM
 #304

   Other changes required: Even a single-line change such as increasing the maximum block size has effects on other parts of the code, some of which are undesirable. For example, right now it’s possible to construct a transaction that takes up almost 1MB of space and which takes 30 seconds or more to validate on a modern computer (blocks containing such transactions have been mined). In 2MB blocks, a 2MB transaction can be constructed that may take over 10 minutes to validate which opens up dangerous denial-of-service attack vectors. Other lines of code would need to be changed to prevent these problems.

The average non-developer likely assumes that 2MB blocks are a safe, conservative change, but a targeted DDoS attack exploiting that 10-minute validation delay (up from 30 seconds at 1MB) would be disastrous.

I must admit that I am guilty of such an assumption. Validation time linear in block size seems rational on the surface, before looking into the matter.

Are you asserting that the worst case for a 1MB block today is less than 30 seconds, on the same hardware that would have a worst case of 10 minutes if the only variable changed is a block size doubled to 2MB?

What are the characteristics of such an aberrant block?

Anyone with a campaign ad in their signature -- for an organization with which they are not otherwise affiliated -- is automatically deducted credibility points.

I've been convicted of heresy. Convicted by a mere known extortionist. Read my Trust for details.
jbreher
Legendary
*
Offline Offline

Activity: 3038
Merit: 1660


lose: unfind ... loose: untight


View Profile
December 24, 2015, 12:27:06 AM
 #305

Is there any way to stop or block segregated witness? From what I understand, it hits testnet in two days... Sad
No one answered your question, so I will.  The answer is yes, all you have to do is find unfixable problems with it and exploit them on testnet.  Doing the same with fixable problems will delay it at the very least while also ensuring it is more secure if it is ultimately deployed live.  Several potential attack vectors have been discussed in this thread, if any of them truly exist, you can "take advantage" of them on testnet and protect bitcoin at the same time.

Great answer. IF there are no killer problems in SegWit, I can be swayed to support it (my current position, as a conservative measure, is to try to uncover faults). If indeed we can throw all possible attacks at it on the testnet and it comes through unscathed, then what would be the downside to adopting it on the main net? (he asked rhetorically)

Anyone with a campaign ad in their signature -- for an organization with which they are not otherwise affiliated -- is automatically deducted credibility points.

I've been convicted of heresy. Convicted by a mere known extortionist. Read my Trust for details.
BitUsher
Legendary
*
Offline Offline

Activity: 994
Merit: 1034


View Profile
December 24, 2015, 12:32:13 AM
 #306

Since no one has seen a fee event, it might not be the "chaos" that Jeff predicts; people must see it with their own eyes to be convinced that it is a problem that really needs to be solved. What if it is not a problem at all? Banks are still closed during weekends and holidays; is that a problem for our financial system?
There is already some evidence that a fee market has existed:

https://rusty.ozlabs.org/?p=564

Even wallets like Mycelium have four fee settings at the point of payment (low priority, economy, normal, priority) to address the fee market that has already occurred and is occurring.
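Those tiers amount to a simple fee-rate schedule. A rough illustration (the sat/byte rates and function name below are invented for the example, not Mycelium's actual values):

```python
# Illustrative only: a fee-tier schedule like the four settings Mycelium
# offers. The sat/byte rates here are made up, not Mycelium's real numbers.

FEE_TIERS_SAT_PER_BYTE = {
    "low priority": 10,
    "economy": 20,
    "normal": 40,
    "priority": 80,
}

def tx_fee_satoshi(tier, tx_size_bytes=250):
    """Total fee = rate * size, for a typical ~250-byte transaction."""
    return FEE_TIERS_SAT_PER_BYTE[tier] * tx_size_bytes

print(tx_fee_satoshi("economy"))   # → 5000
print(tx_fee_satoshi("priority"))  # → 20000
```

The point is only that wallets were already letting users bid for block space; the specific rates float with demand.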


Even if a fee event negatively affects the majority of users' experience, the way to future scaling should still follow Satoshi's vision as much as possible. Anyway, this is his invention; no one except him has the right to change it into something totally different. Anyone not satisfied with his design can just create their own alt

Satoshi did build a lot of extensibility and op codes into the original design so bitcoin could grow, evolve, and use layers like the Lightning Network. While I respect Satoshi, we shouldn't worship him or treat everything he has done as sacrosanct, as he made many mistakes. What is more important is respecting the investment contract we have all agreed to over the years: respecting the core fundamentals that make bitcoin unique. Satoshi can always sign with his PGP key and jump in to make a comment if he has serious concerns.
marcus_of_augustus
Legendary
*
Offline Offline

Activity: 3920
Merit: 2349


Eadem mutata resurgo


View Profile
December 24, 2015, 12:38:12 AM
 #307

I understand it and don't hate the core developers. Seg wit is a thoughtful solution, but not an elegant one. IMHO, it's just more spackle over the cracks and a further delay to increasing block size.  

Good observation

In change management, the first question is not whether a change is good or bad, but why we must make it. The motivation behind a change is always worth examining.

It seems SegWit was invented to temporarily circumvent the hard-coded 1MB block size limit, so that traffic can keep growing without triggering a fee event.

So the next question is: why are we so afraid of a fee event?

Jeff gave some answer here:
Quote
*Key observation:   A Bitcoin Fee Event (see def. at top) is an Economic
Change Event.*

An Economic Change Event is a period of market chaos, where large changes
to prices and sets of economic actors occurs over a short time period.

A Fee Event is a notable Economic Change Event, where a realistic
projection foresees higher fee/KB on average, pricing some economic actors
(bitcoin projects and businesses) out of the system.

I don't think this so-called "chaos" is convincing enough, so the next question is: who are these bitcoin projects and businesses, and is bitcoin's goal to benefit average people or to serve these projects/businesses?

Although institutions have large capital and influence in the industry, I don't think bitcoin's purpose is to become another payment network for banks (banks being the highest form of business; a business large enough will start to do banking)

In fact, businesses can always pass the fee cost on to customers, and those customers are not fee sensitive (statistics show that the majority of users come to bitcoin for long-term value storage and high-value international remittance, neither of which is very sensitive to fees or transaction frequency), so a higher fee will not affect business either. And large businesses can establish clearing channels to dramatically reduce fee costs; this is common practice in the financial world, and they don't need to change bitcoin's architecture to do it

So I think the motivation behind this architectural change to bitcoin is still not convincing enough. Since no one has seen a fee event, it might not be the "chaos" that Jeff predicts; people must see it with their own eyes to be convinced that it is a problem that really needs to be solved. What if it is not a problem at all? Banks are still closed during weekends and holidays; is that a problem for our financial system?

Even if a fee event negatively affects the majority of users' experience, the way to future scaling should still follow Satoshi's vision as much as possible. Anyway, this is his invention; no one except him has the right to change it into something totally different


Bitcoin itself is a huge "Economic Change Event" in the wider context of the existing monetary systems (I think this is where Jeff probably got the idea) ... fees coming online for bitcoin transactions are a storm in a teacup by comparison.

johnyj
Legendary
*
Offline Offline

Activity: 1988
Merit: 1012


Beyond Imagination


View Profile
December 24, 2015, 12:48:50 AM
 #308

   Other changes required: Even a single-line change such as increasing the maximum block size has effects on other parts of the code, some of which are undesirable. For example, right now it’s possible to construct a transaction that takes up almost 1MB of space and which takes 30 seconds or more to validate on a modern computer (blocks containing such transactions have been mined). In 2MB blocks, a 2MB transaction can be constructed that may take over 10 minutes to validate which opens up dangerous denial-of-service attack vectors. Other lines of code would need to be changed to prevent these problems.

The average non-developer likely assumes that 2MB blocks are a safe, conservative change, but a targeted DDoS attack exploiting that 10-minute validation delay (up from 30 seconds at 1MB) would be disastrous.

I must admit that I am guilty of such an assumption. Validation time linear in block size seems rational on the surface, before looking into the matter.

Are you asserting that the worst case for a 1MB block today is less than 30 seconds, on the same hardware that would have a worst case of 10 minutes if the only variable changed is a block size doubled to 2MB?

What are the characteristics of such an aberrant block?



I heard that some new libraries can dramatically increase verification speed; this might not be a large concern by then

BitUsher
Legendary
*
Offline Offline

Activity: 994
Merit: 1034


View Profile
December 24, 2015, 12:51:27 AM
 #309

   Other changes required: Even a single-line change such as increasing the maximum block size has effects on other parts of the code, some of which are undesirable. For example, right now it’s possible to construct a transaction that takes up almost 1MB of space and which takes 30 seconds or more to validate on a modern computer (blocks containing such transactions have been mined). In 2MB blocks, a 2MB transaction can be constructed that may take over 10 minutes to validate which opens up dangerous denial-of-service attack vectors. Other lines of code would need to be changed to prevent these problems.

The average non-developer likely assumes that 2MB blocks are a safe, conservative change, but a targeted DDoS attack exploiting that 10-minute validation delay (up from 30 seconds at 1MB) would be disastrous.

I must admit that I am guilty of such an assumption. Validation time linear in block size seems rational on the surface, before looking into the matter.

Are you asserting that the worst case for a 1MB block today is less than 30 seconds, on the same hardware that would have a worst case of 10 minutes if the only variable changed is a block size doubled to 2MB?

What are the characteristics of such an aberrant block?

This is indeed true; it is included in the FAQ backed by most of the developers, and it was something I was unaware of as well. I haven't done the math, but it appears that a 2MB block heavy with P2SH can extend validation time to those lengths on certain nodes. It is likely a worst-case scenario, but it does give an idea of how even a modest increase can bring down nodes in an already delicate environment where we have too much centralization.

I would like to see the math as well.

---------------------------------------------------------------

https://bitcoinmagazine.com/articles/segregated-witness-part-why-you-should-care-about-a-nitty-gritty-technical-trick-1450827675

Segregated Witness, Part 2: Why You Should Care About a Nitty-Gritty Technical Trick


I heard that some new libraries can dramatically increase verification speed; this might not be a large concern by then

If you review their FAQ, you can see this is precisely why they want to roll out the other libraries first, before increasing the limit with a hardfork. They are well aware that LN won't be very useful at 1MB + 4MB SepSig.
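The quadratic-hashing intuition behind the FAQ's numbers can be sketched in a few lines. This is a deliberately crude model, not Bitcoin Core's actual validation code, and the 200-byte input size is an assumed round figure:

```python
# Back-of-the-envelope sketch (NOT Bitcoin Core's code) of why worst-case
# validation grows faster than block size: legacy SIGHASH_ALL re-serializes
# roughly the whole transaction once per input signature checked, so bytes
# hashed ~ n_inputs * tx_size, i.e. quadratic in transaction size.

def bytes_hashed(tx_size_bytes, input_size_bytes=200):
    """Rough bytes fed through SHA256 to validate one big legacy tx."""
    n_inputs = tx_size_bytes // input_size_bytes
    return n_inputs * tx_size_bytes

one_mb = bytes_hashed(1_000_000)   # 5_000_000_000 (~5 GB hashed)
two_mb = bytes_hashed(2_000_000)   # 20_000_000_000 (~20 GB hashed)
print(two_mb / one_mb)             # → 4.0
```

The constant factors are invented, but the shape matches the FAQ's warning: doubling the transaction size roughly quadruples the hashing work, which is why worst-case validation time can jump from tens of seconds to minutes.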
jbreher
Legendary
*
Offline Offline

Activity: 3038
Merit: 1660


lose: unfind ... loose: untight


View Profile
December 24, 2015, 01:03:14 AM
 #310

   Other changes required: Even a single-line change such as increasing the maximum block size has effects on other parts of the code, some of which are undesirable. For example, right now it’s possible to construct a transaction that takes up almost 1MB of space and which takes 30 seconds or more to validate on a modern computer (blocks containing such transactions have been mined). In 2MB blocks, a 2MB transaction can be constructed that may take over 10 minutes to validate which opens up dangerous denial-of-service attack vectors. Other lines of code would need to be changed to prevent these problems.

The average non-developer likely assumes that 2MB blocks are a safe, conservative change, but a targeted DDoS attack exploiting that 10-minute validation delay (up from 30 seconds at 1MB) would be disastrous.

I must admit that I am guilty of such an assumption. Validation time linear in block size seems rational on the surface, before looking into the matter.

Are you asserting that the worst case for a 1MB block today is less than 30 seconds, on the same hardware that would have a worst case of 10 minutes if the only variable changed is a block size doubled to 2MB?

What are the characteristics of such an aberrant block?



I heard that some new libraries can dramatically increase verification speed; this might not be a large concern by then

Thanks. To be clear, is this re-serialization totaling 1.25 GB something that the _current_ Bitcoin Core does when faced with this aberrant block, or are we comparing apples to oranges?

Got a link to the presentation?

Anyone with a campaign ad in their signature -- for an organization with which they are not otherwise affiliated -- is automatically deducted credibility points.

I've been convicted of heresy. Convicted by a mere known extortionist. Read my Trust for details.
BitUsher
Legendary
*
Offline Offline

Activity: 994
Merit: 1034


View Profile
December 24, 2015, 01:09:58 AM
 #311

https://lists.linuxfoundation.org/pipermail/bitcoin-dev/2015-December/012104.html

Three BIPs for SepSig are being developed:

CONSENSUS BIP: witness structures and how they're committed to blocks,
cost metrics and limits, the scripting system (witness programs), and
the soft fork mechanism.  Draft - https://github.com/bitcoin/bips/pull/265

PEER SERVICES BIP: relay message structures, witnesstx serialization,
and other issues pertaining to the p2p protocol such as IBD,
synchronization, tx and block propagation, etc...

APPLICATIONS BIP: scriptPubKey encoding formats and other wallet
interoperability concerns.

-------------------------------------------------------------

johnyj
Legendary
*
Offline Offline

Activity: 1988
Merit: 1012


Beyond Imagination


View Profile
December 24, 2015, 01:15:25 AM
Last edit: December 24, 2015, 01:37:28 AM by johnyj
 #312


To be clear, is this re-serialization totaling 1.25 GB something that the _current_ Bitcoin Core does when faced with this aberrant block, or are we comparing apples to oranges?

Got a link to the presentation?

F2Pool did this on their node; the video is from the September Scaling Bitcoin conference:

 https://www.youtube.com/watch?v=TgjrS-BPWDQ

https://scalingbitcoin.org/montreal2015/presentations/Day2/11-Friedenbach-scaling-bitcoin.pdf

johnyj
Legendary
*
Offline Offline

Activity: 1988
Merit: 1012


Beyond Imagination


View Profile
December 24, 2015, 01:19:20 AM
 #313

Bitcoin itself is a huge "Economic Change Event" in the wider context of the existing monetary systems (i think this is where Jeff probably got the idea from) ... fees coming online for bitcoin TX is a storm in a teacup by comparison.

During the July and September coinwallet.eu attacks, all the blocks were full for at least a week, but you just needed to raise the fee to 0.0005 BTC to get a confirmation within 10 minutes. How is that a storm in a teacup?

johnyj
Legendary
*
Offline Offline

Activity: 1988
Merit: 1012


Beyond Imagination


View Profile
December 24, 2015, 01:27:16 AM
 #314


This is indeed true; it is included in the FAQ backed by most of the developers, and it was something I was unaware of as well. I haven't done the math, but it appears that a 2MB block heavy with P2SH can extend validation time to those lengths on certain nodes. It is likely a worst-case scenario, but it does give an idea of how even a modest increase can bring down nodes in an already delicate environment where we have too much centralization.


If this is true, then SW is not a good idea, since it increases the effective block size. And when you have signatures and transactions separated, shouldn't verification take longer? If a 3.2MB block takes 10 minutes to verify, then SW will not work at all, since it bumps the size to 4MB; attackers would only need to send out such specifically constructed blocks to stall the network

BitUsher
Legendary
*
Offline Offline

Activity: 994
Merit: 1034


View Profile
December 24, 2015, 01:34:33 AM
 #315


This is indeed true; it is included in the FAQ backed by most of the developers, and it was something I was unaware of as well. I haven't done the math, but it appears that a 2MB block heavy with P2SH can extend validation time to those lengths on certain nodes. It is likely a worst-case scenario, but it does give an idea of how even a modest increase can bring down nodes in an already delicate environment where we have too much centralization.


If this is true, then SW is not a good idea, since it increases the effective block size. And when you have signatures and transactions separated, shouldn't verification take longer? If a 3.2MB block takes 10 minutes to verify, then SW will not work at all, since it bumps the size to 4MB; attackers would only need to send out such specifically constructed blocks to stall the network

The point of the FAQ is that simply increasing the block limit isn't enough, and ...

In 2MB blocks, a 2MB transaction can be constructed that may take over 10 minutes to validate which opens up dangerous denial-of-service attack vectors. Other lines of code would need to be changed to prevent these problems.

The developers are cognizant of these problems, which is why they are proposing something more complicated than simply changing one variable.... Meaning they are going to prevent that attack when they raise capacity by 1.6-4x with SepSig, specifically by changing those other lines of code.
BitUsher
Legendary
*
Offline Offline

Activity: 994
Merit: 1034


View Profile
December 24, 2015, 01:42:17 AM
 #316

Bitcoin itself is a huge "Economic Change Event" in the wider context of the existing monetary systems (i think this is where Jeff probably got the idea from) ... fees coming online for bitcoin TX is a storm in a teacup by comparison.

During the July and September coinwallet.eu attacks, all the blocks were full for at least a week, but you just needed to raise the fee to 0.0005 BTC to get a confirmation within 10 minutes. How is that a storm in a teacup?

Concerns about a fee market event may be valid. The coinwallet.eu attack only briefly filled blocks here and there and never created sustained full blocks for a long period of time. https://www.quandl.com/data/BCHAIN/AVBLS

We have no idea what will happen if there is a sustained fee event --

This is why Garzik defines such a fee event as

FE - "Fee Event", the condition where main chain MSG_BLOCK is 95+% to hard
limit for 7 or more days in row, "blocks generally full"

https://lists.linuxfoundation.org/pipermail/bitcoin-dev/2015-December/011973.html

Bitcoin has never seen this before, and we don't know what will happen.
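Garzik's definition is mechanical enough to sketch as a check over daily average block sizes. This is illustrative only; the function name and input format are made up here:

```python
# Hypothetical sketch of Garzik's "Fee Event" condition: average blocks
# at 95+% of the hard limit for 7 or more consecutive days.

MAX_BLOCK_SIZE = 1_000_000  # the 1 MB hard limit at the time, in bytes

def fee_event(daily_avg_block_sizes, threshold=0.95, run_days=7):
    """True once average block size stays >= threshold * limit
    for run_days consecutive days."""
    streak = 0
    for size in daily_avg_block_sizes:
        streak = streak + 1 if size >= threshold * MAX_BLOCK_SIZE else 0
        if streak >= run_days:
            return True
    return False

print(fee_event([990_000] * 7))              # → True: a full week of full blocks
print(fee_event([990_000] * 6 + [400_000]))  # → False: streak broken on day 7
```

By this test, brief bursts like the coinwallet.eu spam don't qualify; only a sustained run of near-full blocks does.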


johnyj
Legendary
*
Offline Offline

Activity: 1988
Merit: 1012


Beyond Imagination


View Profile
December 24, 2015, 01:51:59 AM
 #317

Bitcoin itself is a huge "Economic Change Event" in the wider context of the existing monetary systems (i think this is where Jeff probably got the idea from) ... fees coming online for bitcoin TX is a storm in a teacup by comparison.

During the July and September coinwallet.eu attacks, all the blocks were full for at least a week, but you just needed to raise the fee to 0.0005 BTC to get a confirmation within 10 minutes. How is that a storm in a teacup?

Concerns about a fee market event may be valid. The coinwallet.eu attack only briefly filled blocks here and there and never created sustained full blocks for a long period of time. We have no idea what will happen if there is a sustained fee event --

This is why Garzik defines such a fee event as

FE - "Fee Event", the condition where main chain MSG_BLOCK is 95+% to hard
limit for 7 or more days in row, "blocks generally full"

https://lists.linuxfoundation.org/pipermail/bitcoin-dev/2015-December/011973.html

Bitcoin has never seen this before, and we don't know what will happen.


It is easy to predict from each individual user's point of view: if I'm currently doing one transaction per day and it costs 0.0001 BTC in fees, then when transaction capacity gets full, I will combine 2 transactions into one every 2 days and pay 0.0002 BTC in fees

If everyone does this, the number of transactions will be cut by half and all the blocks will become half-empty

In the old days, when banks were closed during weekends, people would make their withdrawals early, and withdraw much more, to deal with the stop in transaction capacity; they didn't give up on using banks
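The batching arithmetic above can be sketched directly. The 0.0001 BTC per-tx fee is the poster's figure; the function name is illustrative, and it assumes (as the poster does) that the fee scales with the batch size because the combined transaction is bigger:

```python
# Sketch of the batching arithmetic above: combining payments halves the
# on-chain transaction count while the total fee spend stays flat.

def daily_fee_load(payments_per_day, fee_per_tx=0.0001, batch=1):
    """Return (on-chain transactions per day, total daily fee in BTC)."""
    txs = payments_per_day / batch
    return txs, txs * (fee_per_tx * batch)

print(daily_fee_load(1))           # → (1.0, 0.0001): one tx per day
print(daily_fee_load(1, batch=2))  # → (0.5, 0.0001): one tx every two days
```

Same fee spend per day, half the block space consumed, which is the poster's point: users can adapt by batching rather than abandoning the chain.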





BitUsher
Legendary
*
Offline Offline

Activity: 994
Merit: 1034


View Profile
December 24, 2015, 01:58:09 AM
 #318

It is easy to predict from each individual user's point of view: if I'm currently doing one transaction per day and it costs 0.0001 BTC in fees, then when transaction capacity gets full, I will combine 2 transactions into one every 2 days and pay 0.0002 BTC in fees

If everyone does this, the number of transactions will be cut by half and all the blocks will become half-empty

I am more concerned with automated scripts that verify payments, full mempools crashing nodes, and new adopters being confused when they don't receive a confirmation. Users like us, who pay attention to specifics and know workarounds like paying a higher fee or combining txs, may become the minority in a "fee market event".

All I'm suggesting is that there are many valid viewpoints, and we should prepare for these concerns with tested backup plans. Hopefully a massive swell in adoption can also be directed toward off-chain solutions like Coinbase/Circle to temporarily buffer any negative impact a fee market event creates.

There isn't a single solution that will allow us to grow, but many combined solutions that must be implemented. Bitcoin is fragile and cannot scale well right now until we make many changes.
johnyj
Legendary
*
Offline Offline

Activity: 1988
Merit: 1012


Beyond Imagination


View Profile
December 24, 2015, 02:53:30 AM
 #319


Satoshi did build a lot of extensibility and op codes into the original design so bitcoin could grow, evolve, and use layers like the Lightning Network. While I respect Satoshi, we shouldn't worship him or treat everything he has done as sacrosanct, as he made many mistakes. What is more important is respecting the investment contract we have all agreed to over the years: respecting the core fundamentals that make bitcoin unique. Satoshi can always sign with his PGP key and jump in to make a comment if he has serious concerns.


It is not worship; it is more like respecting the intellectual property of the original designer.

What makes bitcoin valuable? An idea tested by time. You could have a more refined design than bitcoin, like Ethereum, but without the test of time any code is just a piece of open-source software worth almost nothing. Over time, many people have, little by little, built up the trust and value of bitcoin, and its architecture is obviously part of this value.

Imagine that you redesigned an altcoin with the SW architecture: would it get any value? Almost none, of course. But if you put this design in as a soft fork of bitcoin, slowly seeking its way into the bitcoin ecosystem to become part of it, that kind of action is very much like a virus or trojan, highly malicious. If your design is really genius and excellent, you should ask for majority approval and implement it as a hard fork. Doing it as a soft fork just feels shady.

If anything that scales bitcoin can be accepted, I'm sure we will have a Cisco/Ericsson/VISA bitcoin architecture that can scale much better than SW; their engineers deal with traffic from millions of nodes, and bitcoin's petty level of traffic would make them laugh. Their engineering teams would totally replace the bitcoin core devs, right?

jbreher
Legendary
*
Offline Offline

Activity: 3038
Merit: 1660


lose: unfind ... loose: untight


View Profile
December 24, 2015, 04:31:20 AM
 #320


To be clear, is this re-serialization totaling 1.25 GB something that the _current_ Bitcoin Core does when faced with this aberrant block, or are we comparing apples to oranges?

Got a link to the presentation?

F2Pool did this on their node; the video is from the September Scaling Bitcoin conference:

 https://www.youtube.com/watch?v=TgjrS-BPWDQ

https://scalingbitcoin.org/montreal2015/presentations/Day2/11-Friedenbach-scaling-bitcoin.pdf

Found it. https://www.youtube.com/watch?v=TgjrS-BPWDQ&feature=youtu.be&t=3h05m Mark Friedenbach (?) near the end of the morning session on day 2 of the Scaling Bitcoin conference.


eta: ^ did you sneak an edit in with the link while I was off looking for it?

So during the slide you posted a pic of above, he was talking about a non-standard block that aggregated as much dust as F2Pool was able to fit in a single block. Note that the transactions that went into that block are already unrelayable under standard rules.

In his summary remarks he indicated that 'blocksize is a poor indicator of the resource consumption required to process a block'.

Anyone with a campaign ad in their signature -- for an organization with which they are not otherwise affiliated -- is automatically deducted credibility points.

I've been convicted of heresy. Convicted by a mere known extortionist. Read my Trust for details.