Poll
Question: Will you support Gavin's new block size limit hard fork of 8MB by January 1, 2016 then doubling every 2 years?
1.  yes
2.  no

Author Topic: Gold collapsing. Bitcoin UP.  (Read 2032135 times)
Zangelbert Bingledack
Legendary
Activity: 1036 | Merit: 1000
May 10, 2015, 04:54:15 PM  #23781

Eyeballing it, that's about 6 years from now (tenfolding about every four years), but I rather doubt the extrapolation can hold up very well, as the transactional-currency aspect doesn't really come to the fore until later, when the network effects reach critical mass. Perhaps the exponential growth is enough to account for that, but I suspect something faster, unless a lot of the transaction volume moves off chain.

Assuming 20MB means about 100 TPS, ten years from now we'd be at 200MB blocks and 1000 TPS, then by around 2030 we'd be into the 2GB and 10000+ TPS range, which looks like pretty solid global adoption (roughly 4000x current TPS). I suppose that's not unreasonable.

I should note that some say the 1MB limit is already pushing down the increase, which would be another reason to doubt the extrapolation.

While I'm extrapolating: if we consider the price at that level to be in the very general ballpark of $1M per BTC, which is about 4000x the current price, it all fits together somewhat cleanly, though it means the price would also only tenfold roughly every 4 years. Who knows, the target price could be $10M or even $100M for all I know, in which case the price would tenfold about every 3 years or 2.5 years, respectively, on average over the next ~15 years. Of course, if price growth follows an S-curve, most of those gains will be front-loaded into the next several years.

/speculation Grin
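
A quick check of the arithmetic above, as a minimal Python sketch; the 10x-per-4-years rate, the 20MB ~ 100 TPS equivalence, and the price targets are assumptions carried over from the speculation, not data:

Code:
# Back-of-the-envelope check of the extrapolation above. Assumptions
# (taken from the post, not measured): capacity tenfolds every ~4 years,
# and 20 MB blocks correspond to roughly 100 TPS.
import math

def tenfold(value, years, period=4.0):
    """Grow value by 10x every `period` years."""
    return value * 10 ** (years / period)

# let t = 0 be the point where 20 MB / 100 TPS is reached (~6 years out)
for years in (0, 4, 8):
    tps = tenfold(100, years)
    print(f"t+{years}y: ~{tps:,.0f} TPS, ~{20 * tps / 100:,.0f} MB blocks")

# price side: how fast must it tenfold to hit each target in ~15 years?
for mult, target in [(4_000, "$1M"), (40_000, "$10M"), (400_000, "$100M")]:
    steps = math.log10(mult)   # number of 10x moves needed
    print(f"{target} (~{mult:,}x): 10x every {15 / steps:.1f} years")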
cypherdoc (OP)
Legendary
Activity: 1764 | Merit: 1002
May 10, 2015, 04:57:13 PM  #23782

Quote from: Zangelbert Bingledack on May 10, 2015, 04:54:15 PM

Then, as you've said before, we need to get going!
Zangelbert Bingledack
Legendary
Activity: 1036 | Merit: 1000
May 10, 2015, 05:27:07 PM  #23783

Quote from: cypherdoc on May 10, 2015, 04:57:13 PM
Then, as you've said before, we need to get going!

I do detect a trace of inevitability in the air recently.

Even this blocksize debate, with all the hypothetical perils it highlights going forward, has left me with one remarkable sensation I haven't experienced in all my years of online discussion. Although it may look like a lot of squabbling, trolling, and facile arguments, compared with the baseline Internet noise level I've never seen so many people come together in debate so seriously, with the mental discipline that comes from having so much truly on the line, both financially and personally (or ideologically), united in basic vision even while disagreeing on the details of how to get there.

It hints at the formidable power of the world's first purely economic community, and the all-permeating effect of Bitcoin's allure: its potency in aligning people with its agenda regardless of their organizational affiliations.
Peter R
Legendary
Activity: 1162 | Merit: 1007
May 10, 2015, 05:58:07 PM  #23784

Quote from: cypherdoc
Peter, can you extend the chart out to the right so we can see what year the 20MB size would be hit?

[chart: the block size projection, extended to show the 20MB crossing]
Run Bitcoin Unlimited (www.bitcoinunlimited.info)
tvbcof
Legendary
Activity: 4592 | Merit: 1276
May 10, 2015, 06:53:24 PM (last edit: May 10, 2015, 07:09:10 PM)  #23785

Quote from: Peter R on May 10, 2015, 05:58:07 PM

Thx for that.  It shows quite clearly that attempts to outgrow most of the scaling problems that vex Bitcoin through simplistic scaling are pretty futile, a point of view I've held since before I bought my first BTC (on eBay, IIRC).

I'm using the same computer now that I put together around the time Bitcoin was invented.  It's obsolete, but not horribly so: i5 chipset (Nehalem), 4GB RAM, a few TB of mirrored encrypted storage, etc.  Sure, I could build a much better computer now (although not all that much better), but the ONLY reason I would have any need to do so would be to run a simple transfer node.  My network capacity has decreased by orders of magnitude since I moved out of Silicon Valley, so even at 7 TPS I probably would not try, and if I did, I would only activate it during hours when my data was not metered.

Upshot: I could still play a constructive role in infrastructure support if I had good reason to.  One of the main reasons I do not is that, in precisely the stressed situation where my contribution would make a difference, my efforts could be nixed at the flip of a switch by simple injunctive measures (network censorship).  Because Bitcoin dev has not focused on (or even acknowledged) this potential failure mode, I feel little incentive to waste my time and money trying to support the robustness of the solution.

The chart shows that, in roughly the short time I've been involved (since mid-2011), we will be right back where we are now, just at 20MB instead (setting aside the little issue that is supposed to be forgotten: many people's 20MB has an exponential growth factor beyond that).  There was a huge amount of low-hanging fruit in the codebase to harvest in getting 1MB to work to the extent that it does.  That luxury will not be present moving to the 20MB limit, by the nature of how computer science is done.

I made several mis-calculations about Bitcoin at the time I put actual funds into the blockchain:

1) That things would naturally centralize around such entities as blockchain.info, coinbase, etc, and thus alleviate the need to grow the transaction rate.  (A positive mis-calculation.)

2) That it would be so blatantly obvious that Bitcoin's only realistic trajectory would be as a backing store for similar solutions by the time we stressed the transaction rate limits that nobody could argue otherwise.  (A negative mis-calculation.)

edit: slight.  Also:

I would again note that the issue charted is the UTXO set, which is not particularly related to transaction rate.  An attack I thought of years ago would be to formulate UTXOs systematically to chain blocks together in such a way that 1) their count would stress the working database (currently most often held in RAM), and 2) verifying them would touch as many blocks as possible, so that much of the actual blockchain itself must be read for many transactions.  I'm sure there is a name for such an attack; if not, call it the 'tvbcof attack', I suppose.  I've not 'done the math', but it seems somewhat intuitive to me that such an 'attack' would happen organically at reasonable use rates (which we have yet to even approach in the real world) if Bitcoin remains a 'one-size-fits-all' exchange currency solution.


sig spam anywhere and self-moderated threads on the pol&soc board are for losers.
Zangelbert Bingledack
Legendary
Activity: 1036 | Merit: 1000
May 10, 2015, 07:17:41 PM  #23786

Quote from: tvbcof on May 10, 2015, 06:53:24 PM

Something like this was brought up on reddit. Why not have higher fees for these kinds of "tvbcof transactions"? (Higher fees in proportion to how much they scatter the UTXOs, and perhaps lower or zero fees for transactions that consolidate UTXOs.)
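
One concrete shape such a policy could take; a minimal sketch, where the constants and the two-field transaction are illustrative assumptions, not anything actually proposed in the thread:

Code:
# Sketch of a fee policy that prices UTXO-set growth: transactions that
# scatter coins into many new outputs pay a surcharge, and transactions
# that consolidate outputs get a rebate. All constants are illustrative.
from dataclasses import dataclass

BASE_FEE = 1000              # satoshis, flat component
PER_NEW_UTXO = 500           # surcharge per net new output created
CONSOLIDATION_CREDIT = 200   # rebate per net output destroyed

@dataclass
class Tx:
    n_inputs: int    # UTXOs consumed
    n_outputs: int   # UTXOs created

def policy_fee(tx: Tx) -> int:
    delta = tx.n_outputs - tx.n_inputs   # net change to the UTXO set
    if delta > 0:                        # scattering: pay per new UTXO
        return BASE_FEE + delta * PER_NEW_UTXO
    return max(0, BASE_FEE + delta * CONSOLIDATION_CREDIT)  # consolidating

print(policy_fee(Tx(n_inputs=1, n_outputs=100)))   # scatter      -> 50500
print(policy_fee(Tx(n_inputs=100, n_outputs=1)))   # consolidate  -> 0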
tvbcof
Legendary
Activity: 4592 | Merit: 1276
May 10, 2015, 07:31:58 PM  #23787

Quote from: Zangelbert Bingledack on May 10, 2015, 07:17:41 PM

That would be one solution.  Another would be to periodically do what I would call a 're-base'.  Neither could be achieved without either:

 - a major-ish re-design (and more and more changes qualify as 'major-ish' as Bitcoin ages), or

 - some increased control over the solution, which could probably only come with processing centralization.  By the time one can mandate a 'tvbcof transaction tax' through this mechanism, we will be well beyond the point where it would have been possible to implement the 'Qaddafi block' which Hearn hypothesized back around the time I took an active(-ish) interest in Bitcoin.


sig spam anywhere and self-moderated threads on the pol&soc board are for losers.
Chainsaw
Hero Member
Activity: 625 | Merit: 501
May 10, 2015, 07:34:17 PM  #23788

Quote from: tvbcof on May 10, 2015, 06:53:24 PM

This could be a high visibility chart.
We are extrapolating a line. I want to point out a risk.

The drawn line extrapolates by assuming linear growth from some point midway along the function out into the future.
If we drew a straight line from the start of the dataset through the current time, we would hit 20MB at an earlier date.
If we drew the line of best fit as a polynomial function (the data is currently above the line and returning to it), we would hit 20MB at a still earlier date.
If we drew a sigmoid function in which we are approaching a ceiling, it is possible the curve would never reach 20MB.

If it is within anyone's capacity, I think it would be worth throwing these data points into some statistical software and determining line(s) of best fit, with correlations and such.
It would turn something subjectively interpretable into something objective.
I think that's important in this debate, for many of the reasons Zangelbert mentioned above.
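
A rough sketch of that exercise with scipy; the data file, its columns, and the starting parameters are hypothetical:

Code:
# Fit several candidate curves to average block size over time and
# solve each for its 20 MB crossing, per Chainsaw's suggestion.
# "block_sizes.csv" (years-since-genesis, avg block MB) is hypothetical.
import numpy as np
from scipy.optimize import curve_fit

t, size_mb = np.loadtxt("block_sizes.csv", delimiter=",", unpack=True)

def linear(t, a, b):
    return a * t + b

def quadratic(t, a, b, c):
    return a * t**2 + b * t + c

def sigmoid(t, L, k, t0):          # approaches a ceiling L
    return L / (1 + np.exp(-k * (t - t0)))

fits = [("linear", linear, (0.1, 0.0)),
        ("quadratic", quadratic, (0.01, 0.1, 0.0)),
        ("sigmoid", sigmoid, (1.0, 1.0, 5.0))]

for name, f, p0 in fits:
    params, _ = curve_fit(f, t, size_mb, p0=p0, maxfev=10000)
    grid = np.linspace(t[-1], t[-1] + 30, 10000)   # look 30 years ahead
    hit = np.nonzero(f(grid, *params) >= 20.0)[0]
    when = f"+{grid[hit[0]] - t[-1]:.1f} years" if hit.size else "never (<20 MB)"
    print(f"{name:9s} crosses 20 MB: {when}")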

cypherdoc (OP)
Legendary
Activity: 1764 | Merit: 1002
May 10, 2015, 09:56:48 PM  #23789

Quote from: Chainsaw on May 10, 2015, 07:34:17 PM

yeah, i agree.  the rate of tx growth could be highly variable, almost as volatile as the price, or even dependent on it.
shmadz
Legendary
Activity: 1512 | Merit: 1000
@theshmadz
May 10, 2015, 10:34:30 PM  #23790

Regarding transaction growth and the limits of bandwidth, has anyone thought of the possibility of parallel chains?

Split the Blockchain into several chunks and distribute.

Is there any discussion of this at all? Or is it just a stupid idea?

* Edit: never mind, I guess that's essentially sidechains, isn't it?

"You have no moral right to rule us, nor do you possess any methods of enforcement that we have reason to fear." - John Perry Barlow, 1996
cypherdoc (OP)
Legendary
Activity: 1764 | Merit: 1002
May 10, 2015, 11:18:57 PM  #23791

Quote from: shmadz on May 10, 2015, 10:34:30 PM

Yeah, the problem with those ideas is that you want to try to keep as many TXs on the mainchain as possible to pay miners' fees, imo, so as to minimize cannibalizing or even killing off Bitcoin. Any of those offchain alternatives are likely to create friction as well.
shmadz
Legendary
Activity: 1512 | Merit: 1000
@theshmadz
May 10, 2015, 11:46:08 PM  #23792

Quote from: cypherdoc on May 10, 2015, 11:18:57 PM

So... what if you fragment the chain? Like a distributed computing project, so each node only has to do a fraction of the work. But then maybe you'd have problems with forks... unless there were a protocol that was very small and very fast that could help speed up consensus... Gavin talked about something like this once: invertible bloom filters, I think?


"You have no moral right to rule us, nor do you possess any methods of enforcement that we have reason to fear." - John Perry Barlow, 1996
cypherdoc (OP)
Legendary
Activity: 1764 | Merit: 1002
May 11, 2015, 12:16:05 AM  #23793

Quote from: shmadz on May 10, 2015, 11:46:08 PM

he's never talked about that, afaik.  IBLT is a totally different thing: a set-reconciliation strategy that depends on the fact that most full nodes carry a set of unconfirmed tx's in RAM that is already pretty close to identical to everyone else's across the network.  the smaller the differences, the smaller the IBLT needs to be to reconcile them.  the technique is apparently only about 4 years old, invented after Bitcoin.

Gavin has proposed that a miner who solves a block, instead of retransmitting the block with all its tx's across the network (which involves considerable latency), transmits only the header plus an IBLT; other nodes then reconcile their own unconfirmed-tx sets against the IBLT to reconstruct the proposed block and check whether the POW is in fact valid.

that was pretty tortured language so i hope you understand.
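
For the curious, here is a toy version of the data structure, simplified from the published IBLT construction (cells hold a count, a key XOR, and a checksum); real proposals sum fixed-size transaction IDs and data rather than small ints:

Code:
# Toy invertible Bloom lookup table (IBLT): each node builds one over
# its unconfirmed-tx set; subtracting two tables and "peeling" recovers
# the (small) symmetric difference, which is why near-identical mempools
# make this kind of block relay cheap. Items are ints for brevity.
import hashlib

K, M = 3, 64   # hash functions per item, cells per table (toy sizes)

def _h(item, salt):
    data = f"{salt}:{item}".encode()
    return int.from_bytes(hashlib.sha256(data).digest()[:4], "big")

def _cells(item):
    return [_h(item, i) % M for i in range(K)]

def new_table():
    return [[0, 0, 0] for _ in range(M)]   # [count, key_xor, checksum_xor]

def insert(tbl, item, sign=+1):
    for c in _cells(item):
        tbl[c][0] += sign
        tbl[c][1] ^= item
        tbl[c][2] ^= _h(item, "chk")

def subtract(a, b):
    return [[a[i][0] - b[i][0], a[i][1] ^ b[i][1], a[i][2] ^ b[i][2]]
            for i in range(M)]

def peel(tbl):
    """Recover items unique to each side; can fail if M is too small."""
    only_a, only_b = set(), set()
    progress = True
    while progress:
        progress = False
        for i in range(M):
            count, key, chk = tbl[i]
            if count in (1, -1) and chk == _h(key, "chk"):   # "pure" cell
                (only_a if count == 1 else only_b).add(key)
                insert(tbl, key, sign=-count)   # peel it out everywhere
                progress = True
    return only_a, only_b

# two mempools that differ by only four "transactions"
a, b = new_table(), new_table()
for tx in range(1, 30):
    insert(a, tx)
for tx in range(3, 32):
    insert(b, tx)
print(peel(subtract(a, b)))   # expected: ({1, 2}, {30, 31})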
shmadz
Legendary
Activity: 1512 | Merit: 1000
@theshmadz
May 11, 2015, 12:37:02 AM  #23794

Quote from: cypherdoc on May 11, 2015, 12:16:05 AM

Yes, I was imagining using IBLT to maybe provide a mechanism for some kind of parallelization of the nodes, so that each node only needs to hold a small chunk of the UTXO data.

I guess what I'm envisioning is a node structure where it might be possible to split up the work of maintaining the blockchain, similar to how storj fragments and distributes files.

Any proposed solutions I've seen so far to the scaling issue have been incremental at best, so I want to start spitballing some more "outside the box" crazy proposals; I'm not a programmer, though, so I only have a rudimentary idea of what's feasible.

Sorry if it's all just a waste of time.

"You have no moral right to rule us, nor do you possess any methods of enforcement that we have reason to fear." - John Perry Barlow, 1996
thezerg
Legendary
Activity: 1246 | Merit: 1010
May 11, 2015, 12:57:43 AM  #23795

I believe I know how to solve that.  There are 3 innovations that form the core of a microtxn coin, a problem I've been keen to solve because, like I said years ago, I think there's room for 3 coins:
BTC: first mover, for holding and large txns.
AnonCoin: maybe MRO.
MicroCoin: undefined.

The 3 innovations are:
1. UTXO-oriented protocol and storage (the Merkle UTXO trees I discussed previously; a sketch follows below).
2. UTXO size consolidation.  There are various approaches.  The previously mentioned idea of additional fees when outputs > inputs is not quite there.  Better is an additional fee for every not-previously-seen address, with the UTXO merkle changed to an address-with-balance merkle tree.  Other ideas are to make address creation expensive (like vanity addresses) and to give address-consolidation txns a fee rebate.
3. Transformation of the blockchain to scale.

And I think 2 more innovations might as well be thrown in:
4. A distributed trustless way to communicate the price of the coin against USD. (Solved)
5. A way to inject oracle data into the blockchain so scripting has value. (easy)

Anyway, hoping to lay out my microcoin in a doc soon, but I can't wrap my head around how to play nice with bitcoin... sidechain, altcoin, etc... what would you guys do?
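
A minimal sketch of item 1, the Merkle tree over the UTXO set: a block could commit to the root, so the existence of an output is provable with a log-sized branch instead of a chain replay. The leaf serialization and double-SHA256 here are illustrative choices, not thezerg's spec:

Code:
# Merkle tree over a UTXO set: commit to the set in one root hash and
# prove membership of any single output with a log-sized branch.
import hashlib

def H(b):
    return hashlib.sha256(hashlib.sha256(b).digest()).digest()

def leaf(txid, vout, value):
    return H(f"{txid}:{vout}:{value}".encode())   # illustrative encoding

def _next_level(level):
    if len(level) % 2:                 # duplicate last node on odd levels
        level = level + [level[-1]]
    return [H(level[i] + level[i + 1]) for i in range(0, len(level), 2)]

def merkle_root(leaves):
    level = list(leaves) or [H(b"")]
    while len(level) > 1:
        level = _next_level(level)
    return level[0]

def merkle_branch(leaves, index):
    """Sibling hashes proving leaves[index] up to the root."""
    branch, level = [], list(leaves)
    while len(level) > 1:
        if len(level) % 2:
            level = level + [level[-1]]
        branch.append(level[index ^ 1])
        level = _next_level(level)
        index //= 2
    return branch

utxos = [leaf("aa" * 32, 0, 5000),
         leaf("bb" * 32, 1, 1200),
         leaf("cc" * 32, 0, 800)]
root = merkle_root(utxos)

# verify membership of utxos[1] against the committed root
acc, idx = utxos[1], 1
for sib in merkle_branch(utxos, 1):
    acc = H(sib + acc) if idx % 2 else H(acc + sib)
    idx //= 2
assert acc == root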
TPTB_need_war
Sr. Member
Activity: 420 | Merit: 257
May 11, 2015, 01:02:50 AM  #23796

Surely the vitamin D3 is working, given the intense volcano energy I feel right now; I'd welcome one of those detractors joining me in the boxing ring.

But I have numb legs from the knee down today and a mild gut pain. Other than that, I am strong enough today. This is a significant improvement over March, when I couldn't even think or keep myself out of bed most of the time.

Totally OT at this point, but upping your Vitamin D3 intake means you also need to keep your electrolytes in balance. If you start feeling sore muscles or inexplicable fatigue, that's your cue to load up on minerals. After I upped my D3 intake to 5000 IU per day, I also started supplementing 250mg magnesium oxide, 1000mg potassium chloride, and 2000mg sodium chloride.

I can't even afford the time (and expense) of travel to a first world country to get a proper MRI and have my blood work and panels properly monitored and interpreted. As I said, "my back is against the wall". I need to be very careful about minerals because I am dosing 50,000+ IU per day, thus I am at risk of kidney stones and permanent renal damage due to concentration of minerals in the kidneys. So I am drinking 2 - 3L of water a day to flush them out continuously. I am craving foods, so clearly my body wants to replenish. I am in a tropical country and eating natural whole foods too, so hopefully I'll be okay. I feel I've turned the corner. Today I am feeling very strong and want to get to the gym. I am feeling a slight lethargy today in my head, yet I also feel an energy reservoir that is ready to explode at the gym. For me, these are signs of my original self, so I am encouraged. Thanks again for the sentiments. We can stop this OT discussion now.

TPTB_need_war
Sr. Member
Activity: 420 | Merit: 257
May 11, 2015, 01:10:21 AM (last edit: May 11, 2015, 01:31:21 AM)  #23797

So that is one argument that can be made against the pools having control. Note it doesn't impact my other upthread (and very on-topic) point that larger blocks favor centralization, because higher orphan rates do.

larger bloat blocks have a higher chance of being orphaned

I reiterate my upthread point that a higher orphan rate favors larger pools, because they will be better connected (don't forget the NSA has direct taps on major trunk lines, and the high-speed traders on Wall Street have this superior connectivity too) and thus mine on orphaned chains less, thus have higher profits for their miners, thus drawing more miners to them and making the smaller pools go bankrupt. This is a variant of the selfish-mining effect.

Thus I will repeat again that larger blocks = centralization.
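
A toy model of this claim, with made-up propagation delays; it only illustrates the direction of the effect, not its real-world magnitude:

Code:
# Toy model: blocks arrive Poisson at 1 per 600 s, and a block that
# takes d seconds to propagate is orphaned with probability roughly
# 1 - exp(-d/600). Better-connected pools lose fewer blocks, so their
# effective revenue share exceeds their hashpower share.
import math

BLOCK_INTERVAL = 600.0   # seconds

def orphan_prob(delay_s):
    return 1.0 - math.exp(-delay_s / BLOCK_INTERVAL)

def revenue_shares(pools):
    """pools: list of (hashpower share, propagation delay in seconds)."""
    kept = [share * (1.0 - orphan_prob(d)) for share, d in pools]
    total = sum(kept)
    return [k / total for k in kept]

# equal hashpower; the well-connected pool ships a 20 MB block in 1 s,
# the poorly connected one needs 30 s (invented numbers)
for label, pools in [("1 MB ", [(0.5, 0.05), (0.5, 1.5)]),
                     ("20 MB", [(0.5, 1.0), (0.5, 30.0)])]:
    big, small = revenue_shares(pools)
    print(f"{label}: well-connected {big:.4f} vs poorly connected {small:.4f}")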

I hope that is clear now, since apparently you didn't get it the first time and either ignored or didn't understand me?

(if you did ignore it before, covering your ears won't help you)

My radical re-design totally eliminates the issue of orphans and the critical advantage of superior connectivity and low latency. It is a radical paradigm shift that solves the problem that Bitcoin was designed in a way that leads to centralization, and there is nothing you can do within Bitcoin's current design to stop that! (I will elaborate soon)

Add: the key point of distinction is that in Bitcoin, in order for a transaction to get a confirmation, it must be put into a block. In my novel new design, transactions don't have to be put in blocks in order to be confirmed. That is a very strong head-scratching hint for you!

well that'll be a trick b/c the blockchain is Satoshi's fundamental contribution to tx security that was missing for all these decades of digital money formation.  each block cements the tx's into the chain via POW.  

you need to elaborate on your purported innovation to have any meaningful discussion.

Wink it's a puzzle, and you took the expected path into the maze; that is why you did not solve it. Epiphanies are like that. Yes, I will have to elaborate, but I am not going to give away such a valuable insight for free. I'd rather implement and profit. Wouldn't you?

You'd be crazy not to invest.

Yes, it is a trick. The key insight is how to make things orthogonal with a clever twist. I must admit, the more I think about it, the less obvious it is. I have the antithetical weakness of Dunning-Kruger: "Conversely, highly skilled individuals tend to underestimate their relative competence, erroneously assuming that tasks which are easy for them are also easy for others."

For $10,000 investment and promise of secrecy, you can find out now.

Note that my Bitmessage is not functioning on my new ISP, so anyone who wants to talk with me should PM me; then we can go into encrypted webchat (very easy: just load a webpage and chat).

tvbcof
Legendary
Activity: 4592 | Merit: 1276
May 11, 2015, 01:23:57 AM  #23798


Quote from: shmadz on May 11, 2015, 12:37:02 AM

Not a waste of time, but it's true that sharding is about the first idea that pops into anyone's head, since it is a go-to solution for such problems in many domains.  A problem with it as a solution to the block chain's scaling problems is that the UTXO set is real-time (whereas a block chain is batch-mode), and one needs to keep a real-time eye on the stream in order to avoid being scammed (or suck it up and wait for a sufficient number of blocks to be cemented in by others).  And if one is already watching a real-time stream, one may as well retain it, which gives one the whole blockchain anyway.

So-called 'treechains' seem to offer some possibility of doing what I would conceptualize as 'longitudinal sharding'.  Using something like B-tree logic to retain transaction chains in a longitudinal shard ('fiber'?) might be something that could have worked while still allowing distributed system support.  That would have been interesting, but that bird had flown by the time Satoshi made his announcement on the mailing list, as I see it.

Subordinate chains (my word for sidechains when I first got interested in 2011) seem to be a serviceable solution which solves the scaling problems in a very simple and effective manner.  A little crude-ish, and thus not as sexy as treechains or sharding, but that simplicity in and of itself is confidence-inspiring to me.  A very notable side benefit is the isolation from core Bitcoin, which allows many different solutions to be tuned to many different problems.  Relatedly, it allows many more people and entities to become stakeholders in, and potential supporters of, Bitcoin's health.  I like that very much.

---

On a different topic (as cypherdoc says), the first thing I thought of when I read up on IBLTs is that they could be a marvelously efficient way to communicate the info needed to ensure that transaction blocks were 'correctly' formed, in a number of senses.  One biggie: that they contained only 'authorized' transactions and/or did not contain 'unauthorized' ones.  Such a thing would be both necessary and necessarily efficient for implementing whitelisting or blacklisting.  To this day I have seen nobody comment on this (potential) 'feature' of such a development.


sig spam anywhere and self-moderated threads on the pol&soc board are for losers.
solex
Legendary
Activity: 1078 | Merit: 1002
100 satoshis -> ISO code
May 11, 2015, 01:30:13 AM (last edit: May 11, 2015, 01:48:18 AM)  #23799

Quote from: Zangelbert Bingledack on May 10, 2015, 07:17:41 PM

A spotlight has gone onto the UTXO set because it is bloating, e.g.:

FWIW, here are a few observations related to the growth of the utxo set.

The growth rate of the utxo set has increased since late 2013 / early 2014.

More interestingly, a repeated pattern has appeared since early 2014, showing steps every sunday (around 100k utxos added to the set every sunday).

[chart: daily utxo set size, with visible steps each Sunday]
My bold emphasis.
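
The weekly-step claim is easy to check against any daily series of UTXO counts; a sketch, where the CSV of (date, utxo_count) is hypothetical:

Code:
# Group day-over-day UTXO-set growth by weekday to test the quoted
# "steps every sunday" pattern. "utxo_daily.csv" is a hypothetical
# two-column file: YYYY-MM-DD, total utxo count.
import csv
from collections import defaultdict
from datetime import datetime

rows = []
with open("utxo_daily.csv") as f:
    for date_s, count_s in csv.reader(f):
        rows.append((datetime.strptime(date_s, "%Y-%m-%d"), int(count_s)))
rows.sort()

growth = defaultdict(list)
for (d0, c0), (d1, c1) in zip(rows, rows[1:]):
    growth[d1.strftime("%A")].append(c1 - c0)

for day in ("Monday", "Tuesday", "Wednesday", "Thursday",
            "Friday", "Saturday", "Sunday"):
    if growth[day]:
        mean = sum(growth[day]) / len(growth[day])
        print(f"{day:9s} mean daily utxo growth: {mean:10.0f}")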

Gavin has kicked over the card-table by writing his 20MB patch and going public with it. There is now a phenomenal amount of constructive debate going on among the devs, particularly around ideas like leveraging an increased block limit to maintain a cleaner utxo set and to improve the fee market by charging for extra block space.
Interesting that the market has looked more bullish since Gavin's announcement than it did for a long while previously.

Quote from: Chainsaw on May 10, 2015, 07:34:17 PM

Agreed, but the projection Peter has drawn is probably as good as any other fit. Some considerations:

The volatility in the 7-day average block size has collapsed since 2010, which would imply a maturing ecosystem. Arguably, the phase up to mid-2012 was primarily usage by "innovators", followed since then by "early adopters", which continues today. Projecting forward from mid-2012 data would seem more realistic for predicting the near future.

The "bump" from April 2012 to March 2013 is basically the effect of SatoshiDice which doubled the transaction loading until its owner (Eric Voorhees) modified it, and Core Dev changed dust thresholds.

Transaction size has doubled since 2012, from about 250 to 500 bytes (hence the oft-quoted max throughput of 7 tps is more like 3 tps), largely because of scripting and multisig. Without this, the average block-size data would look more sigmoid. Adam Back suggests that off-chain tx volume is already 100x on-chain (e.g. including exchange volume etc), and he is probably right. The Lightning Network, Coinbase, and the like could eventually take so much volume off-chain that, indeed, 20MB may not be reached until well after 2021.
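
The throughput arithmetic behind that 7-to-3 tps remark:

Code:
# max TPS = block size / (average tx size * block interval)
def max_tps(block_bytes, avg_tx_bytes, interval_s=600):
    return block_bytes / avg_tx_bytes / interval_s

print(max_tps(1_000_000, 250))    # ~6.7 tps: the oft-quoted "7 tps"
print(max_tps(1_000_000, 500))    # ~3.3 tps with today's ~500-byte txs
print(max_tps(20_000_000, 500))   # ~66.7 tps for 20 MB blocks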

TPTB_need_war
Sr. Member
Activity: 420 | Merit: 257
May 11, 2015, 02:01:16 AM (last edit: May 11, 2015, 02:23:53 AM)  #23800

I think we should also re-summarize whether pool control can be defeated by the new pool RPC (was it "getblockdata"?) that allows the miner to add or remove transactions. I must admit it has been a while since I looked at it, and I may not have completely digested it (in a rush or whatever).

Perhaps someone can chime in so I don't have to go google it.

https://bitcoin.org/en/developer-guide#getblocktemplate-rpc

it's getblocktemplate, and i've already pointed out that it gives miners the flexibility to construct their own blocks.  as a former miner, moving to a new pool is one click away, and all of us were watching carefully for any pool operators acting suspicious

I will reintroduce my point from 2013 that you all won't be mining in the future, because the cartels can (in the future) make mining unprofitable for you with the Transactions Withholding Attack. Many of you can't seem to grasp that attack properly, so I am not relying on it for my argument below. Yet I maintain that the Transactions Withholding Attack is another reason Bitcoin mining will eventually be entirely centralized by the banksters. Note the obvious: in the future, all mining income will come from tx fees.

However, does it really give the miner control? I don't think so. Users still need to forward transactions into the network, and eventually the volume of transactions will be too great for miners to listen to and compare against what the pool is sending them. At some point they will be forced to delegate transaction compilation to the pools.

you'll need to give a citation on this.  i'm not aware of any problems with loading the unconfirmed-tx data set into RAM at all on startup.  in fact, the set is fairly uniform across all nodes b/c of the speed of the network, which is what gives proposals like Gavin's IBLT a chance to be implemented.  i've never heard of any concerns going forward on this.

At Visa scale (e.g. 8000 txs/sec) this is no problem. But at micropayment scale, i.e. billions of txs per second, individual miners won't want to duplicate the connectivity and processing-power infrastructure necessary to handle that volume of txs.

If micropayment txs end up handled by an off-chain layer, then perhaps that aspect of my point does not apply. I will study the off-chain proposals more and get back to you on this point.

I don't think IBLT can work (AFAIR I explained my stance in my refutation of Peter Todd in the Git-hosted IBLT paper), but that is OT from the main point of discussion here.

In any case, I have another argument against the intended efficacy of getblocktemplate. Again, remember my concern is about a Digital Kill Switch, wherein, rarely, an individual's number is shut off and they are not allowed to transact (for political persecution or whatever); these will be rare occurrences. So the pools can simply discard and ignore block solutions that insert such excluded transactions. Miners will have a difficult time mustering political will over such rare instances. For example, if the miners move to another pool with an established reputation of never doing this, the banksters can attack that pool by numerous means (e.g. pulling its regulatory license, Meni's Share Withholding Attack, out-of-band social, political, and economic attacks, etc). Eventually the banksters can force you onto their Sybil-attacking pools (you never know which one is legit, as they will change as soon as you try to Whack-A-Mole them).

I am sorry, but you can't win this political battle, because the banksters will have the masses on their side. The masses are complacent, and they will stick with whatever miners and pools Amazon.com directs their txs to.

You cannot win this way. It is so obvious; think it out. Even Satoshi said mining would be controlled by corporations in the future.
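
For reference, the RPC under discussion is getblocktemplate (BIP 22). A sketch of a miner pulling a template and applying its own transaction policy; the endpoint, credentials, and the filter at the end are placeholders, not working mining code:

Code:
# Fetch a block template from a local node and show that the miner,
# not the pool, can decide the final transaction set. Credentials and
# the local endpoint are placeholders.
import requests

RPC_URL = "http://127.0.0.1:8332"     # local bitcoind (placeholder)
AUTH = ("rpcuser", "rpcpassword")     # placeholder credentials

def rpc(method, params=None):
    resp = requests.post(RPC_URL, auth=AUTH, json={
        "jsonrpc": "1.0", "id": "miner",
        "method": method, "params": params or []})
    resp.raise_for_status()
    return resp.json()["result"]

template = rpc("getblocktemplate", [{}])
print("height:", template["height"])
print("txs offered:", len(template["transactions"]))
print("coinbase value:", template["coinbasevalue"])

# A miner could inspect, reorder, drop, or add transactions here, e.g.
# re-adding txs a pool quietly omitted (the censorship concern above),
# before building the merkle root and grinding on the header.
chosen = [tx for tx in template["transactions"] if tx.get("fee", 0) >= 0]
print("txs kept by local policy:", len(chosen))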
