1721  Bitcoin / Bitcoin Discussion / Re: Jeff Garzik chose BIP100 block size voting because Blockstream recommended it on: August 28, 2015, 09:08:28 PM
Perhaps not. But I'd say it is quite naive to think that technological advancements will continue to exponentially grow to infinity, as many seem to believe around here. In fact, Moore's Law seems to be unraveling as we speak. Just ask Intel what just happened to their tick-tock model.....

When I expressed the same concern to Gavin,  his response to me was:

Quote
I really, really don't understand this attitude-- I woulda thunk you were old enough to be confident that technology DOES improve. In fits and starts, but over the long term it definitely gets better.

My response was:

Quote
I'm confident that technology will improve. But it can improve nicely and still be _utterly_ blown away by any particular pre-prescribed growth formula (especially an exponential one). If the limit is some multiple of what nodes can comfortably handle or of real demand on the network, then it's more or less equivalent to unlimited; so it doesn't even have to be off by much.  I'm also experienced enough with large scale systems to know that massive increases in scale expose new behaviors and effects that were not visible in other regimes.  So, for example, in 1990 one might have correctly predicted the growth in raw ALU throughput we've seen, but if you also assumed memory bandwidth would have kept up you would be massively mistaken (a factor of 100x+ off by now), and many problems become more memory-bandwidth limited as they scale.

This is especially a factor when you're talking about improvement of, say, the 10th percentile rather than the best available technology-- which is very much a consideration.

There are also serious confounding trends. Computers are getting better, but more computing is moving onto battery-powered tablet devices, which move backwards a decade in technology, and onto cloud-hosted infrastructure, which adds little decentralization to the ecosystem. Sometimes the improvement in technology is rolled into power, space, and cost savings and doesn't become readily available in the form of throughput. Sometimes the improvements show up in specialized processors which may not be useful to us (e.g. most of the multiply throughput and memory bandwidth in a high-end desktop is in the GPU right now). It wouldn't be an unreasonable guess that computing power available at low cost to the general public may increase more slowly even if raw technology improvement speeds up, because the improvements are going into other areas, especially if applications which need massive increases in local computing power don't materialize.
1722  Bitcoin / Bitcoin Discussion / Re: Jeff Garzik chose BIP100 block size voting because Blockstream recommended it on: August 28, 2015, 09:05:22 PM
A misunderstanding can travel halfway around the world while the truth is putting on its shoes.


Here is the response I wrote on reddit:
Quote
Limitation of the twitter medium I assume there. I don't know why he's invoking "blockstream" there.

I asked Jeff on IRC what specifically the 75% that was in his document meant:

13:44 <gmaxwell> Technical question, it's unclear to the writeup how exactly you suggest miners signal their new size preference?  are you thinking that miners express a preferred size in their blocks, and then the 75-th percentile size is used? (subject to the 2x limit) ?
 
...

13:50 <gmaxwell> What made you go for 75%?

...

13:52 <jgarzik> 75% = political science.  3/4 majority.

...

14:09 <gmaxwell> as far as the actual mechanism (which I don't see in the BIP); what I'd guess is that each block would express a preferred limit in the coinbase, next to the height,  and then at the update period those limits would get processed? e.g. taking the n-th percentile?

14:10 <gmaxwell> (if that detail is in the bip, I'm missing it)  

...

14:14 <jgarzik> Anyway, on the BIP 100, basically the floor (least) of the range of most popular miner suggested sizes  

14:14 <jgarzik> "we all agree on this floor"  

14:15 <gmaxwell> jgarzik yea, if it took all the sizes, sorted, and then discarded 25% of the smallest ones, then took the smallest remaining size, it would accomplish that.

14:16 <jgarzik> size can decrease too, with 1MB absolute minimum.  

14:16 <gmaxwell> though maybe a number different than 75/25 would be good, I should talk to petertodd and see if his position would be changed if, say, it were 90%/10%.   Basically if you're below 5% hashpower you're not even getting a block per day on average, and so you couldn't even be doing much to prevent censorship.  

14:18 <gmaxwell> But e.g. should we be changing how the limit works slightly, e.g. making the 'size' include the change in the utxo size as I'd proposed before? so that the limit actually reflects the real costs in the network; .. absent something like that fees will never penalize costly behavior there.

14:18 <jgarzik> nod - IMV tweaking those constants are easy while getting us past the 1MB hard fork

14:18 <jgarzik> & addressing governance are the hard parts

14:19 <jgarzik> agree RE UTXO - economically signaling those costs is important


I believe this is the entirety of my conversation with Jeff on the subject.  (I've removed many interspersed comments because IRC usually bifurcates into multiple threads of discussion due to latency; but I believe they are unrelated to this topic).  AFAICT, I never suggested 20% but rather suggested a sensible meaning for what a 75% threshold might mean in his scheme, which to my eyes was under-defined but might have already meant that.
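For illustration, here is a minimal sketch (in Python, my own framing rather than anything from the BIP100 text) of the interpretation I described to Jeff above: sort the sizes miners expressed, discard the smallest 25%, and take the smallest remaining value, bounded by the 2x-per-period change and the 1MB/32MB limits mentioned in the conversation.

Code:
def bip100_floor(expressed_sizes, current_limit,
                 discard_fraction=0.25,
                 abs_min=1_000_000, abs_max=32_000_000):
    """Pick the block size limit that roughly 75% of expressed
    preferences support (one preferred limit, in bytes, per block
    in the voting period)."""
    sizes = sorted(expressed_sizes)
    # Discard the smallest 25% of the votes, then take the smallest
    # remaining preference: every kept vote is >= the result, so
    # about 3/4 of the voters "agree on this floor".
    kept = sizes[int(len(sizes) * discard_fraction):]
    # Bound the change to at most 2x per period and clamp to the
    # 1MB minimum / 32MB maximum.
    new_limit = min(kept[0], current_limit * 2, abs_max)
    return max(new_limit, abs_min)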

I don't think there is anything wrong with 20% as things go; though the main limitation in BIP100 (assuming any of the under-specified parts are filled in with things I find unobjectionable) is that it addresses only miner-vs-miner incentives issues, and even then only under assumptions about hashrate distribution which are currently untrue (e.g. right now less than one percent of the miners have >>50% of the hashrate, so BIP100 is basically a blank check to those few parties up to the 32MB limit).

Jeff's stated view is that users can be protected by exiting Bitcoin if miners are not acting perfectly, but there is incredible friction around markets (e.g. a solution via exodus guarantees losers)-- "sell all your bitcoins and go use something else" is a really weak threat, in general. Because of the weakness of the threat I don't think this is a reasonable first-resort mechanism for assuring the security of the system-- I think we've already seen it disproved in practice, and under that argument we could, e.g., just give miners the ability to print infinite bitcoin, steal arbitrary coins, etc.
1723  Alternate cryptocurrencies / Altcoin Discussion / Re: Fundamental flaw in consensus algorithms? on: August 24, 2015, 07:58:51 AM
Section 4.4 of  https://download.wpsoftware.net/bitcoin/pos.pdf

Separately, selfish mining is basically orthogonal. And as you note, it only works when a miner has a very high proportion of the hashrate. A system where any participant has anywhere near one quarter of the authority isn't all that decentralized.
1724  Bitcoin / Development & Technical Discussion / Re: How will XT be good with regards to the packet frame? on: August 24, 2015, 07:53:27 AM
A block never fit into a single packet.
The relay network protocol fits a significant fraction of all blocks into a single packet (about a third, ... about 60% fit in two packets), in fact.
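The reason this works is that peers have usually already seen nearly all of a block's transactions through ordinary relay, so a block can be sent mostly as short references. A toy sketch of that general idea (illustrative only-- the structures and names here are mine, not the actual relay network wire format):

Code:
def compact_block(block_txids, shared_index, full_txs):
    """Toy illustration of block compaction, not the real protocol.

    block_txids:  txids in the block, in order.
    shared_index: txid -> small integer, built up identically on both
                  peers as transactions were relayed earlier.
    full_txs:     txid -> full serialized transaction bytes.
    """
    message = []
    for txid in block_txids:
        if txid in shared_index:
            # Both sides already have this transaction, so a couple of
            # bytes of index replace the full transaction.
            message.append(("ref", shared_index[txid]))
        else:
            # The peer may not have it; include it in full.
            message.append(("tx", full_txs[txid]))
    return message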

1725  Bitcoin / Bitcoin Discussion / Re: Szabo just tweeted this on: August 21, 2015, 10:32:13 AM
what's interesting is Peercoin achieves the second, now by design.
If by "by design" you mean requiring the blockchain to be periodically signed by a trusted authority... you have a weird notion of resistance to state attacks.
1726  Bitcoin / Bitcoin Discussion / Re: Szabo just tweeted this on: August 21, 2015, 07:44:04 AM
That being so, would it make sense for Core to make a pre-emptive strike by releasing a fork that switches to use another IP port
Someone would just bridge it. Switching proof of work algorithms might be much more effective and would have a potentially beneficial effect of resetting the currently highly centralized mining climate... if one really did want to go the total war route. I don't currently believe it will be an issue.

Quote
The end result being that we would have a semi-centralized payment network for "ordinary folks" and a plethora of more anonymous alt
Freedom that is only available to people who dedicate their lives to it isn't really freedom. To be free you have to be free to do as you will, not forced to be nothing but a freedom fighter.

With few exceptions there has seldom in history been a kind of freedom that wasn't available to those few who really worked to achieve it at any cost. What fighting for people's rights is fundamentally about is fighting for rights to be recognized by default, or at least at a reasonable cost to people-- so that they are practically available rather than merely theoretically available.

Quote
Only the myopic fixation on One True Coin prevents people realizing that this is the most likely outcome for crypto in the long run.
Money gains its value from network effect. For a pure money, like Bitcoin, virtually all its value comes from network effect. Money is probably the world's greatest natural monopoly. "Oh, just have separate things" is not a cheap and easy choice.
1727  Bitcoin / Bitcoin Discussion / Re: Szabo just tweeted this on: August 21, 2015, 06:22:11 AM
ethereum ??!?!
Even that sloppily designed mess of bugs has effective blocksize and resource usage controls. Try something more like "Liquidcoin" if you want to see what a system looks like with no thought given to the incentives and stability of the system.
1728  Bitcoin / Bitcoin Discussion / Re: Szabo just tweeted this on: August 21, 2015, 06:20:10 AM
Quote
If it seems like the limit is getting hit persistently, and confirmation times are becoming a problem, an emergency limit increase is something that the core devs can patch very simply and quickly.  They can execute such an emergency block size “QE” if you will, at a moments notice.  They have demonstratively done this kind of deployment before, during the one previous hard fork, and with the F2Pool bug.  So what is the rush?
That's my question: https://bitcointalk.org/index.php?topic=1154636.0
This is correct, and it was a point I made previously and was pretty aggressively attacked for being "against planning" (though what I suggest is precisely the opposite: having a planned contingency that you use when you have a strong consensus that it's the right choice is good planning)-- clearly people promoting this stuff with the massive uncertainty created by XT and the irritation of "stress tests" can't honestly be concerned with there being a bit of turbulence.
1729  Bitcoin / Bitcoin Discussion / Re: Szabo just tweeted this on: August 21, 2015, 06:18:00 AM
Do we want it to handle all world daily transactions,
or do we want protection from current monetary systems and government involvement?
If we achieve the second, we can have both.  But if we only achieve the first, we likely cannot have the second (and wouldn't find it to be a substantial improvement over the fiat systems we have now, even if we did).

The reason for this is that if Bitcoin is secure and trustworthy, trustless decentralized micropayment facilities can be built on top of it and extend Bitcoin's transaction security with arbitrary speed or scale.  But if the system is fragile and underminable by attackers (government or otherwise) then it won't be robust enough to underpin these things.

(and things like micropayment channels weren't my invention, they were recommended by the system's creator-- that's part of why Bitcoin has smart contracts to begin with.)
1730  Bitcoin / Development & Technical Discussion / Re: Why not increase the block size limit by 1MB whenever it needs to be increased? on: August 19, 2015, 03:25:54 PM
Quote
Wouldn't doing it this way increase the block size whenever the market needs it to?

"Needs to be increased" is tricky. The natural and necessary state for blocks is nearly full; defining need is hard.  "Near-universally agreed to be good to increase" would be better, but people are sensibly worried that it would be held back by unreasonable people and so they are unwilling to take that risk.

Quote
Ex, when it reaches 0.5MB, raise it to 2MB. When it reaches 1.5MB, raise it to 3MB.
That sounds like something that would completely undermine the existence of transaction fees-- which are the only long term security argument we have (what will pay for adequate POW security as the subsidy declines). It would also allow the system to slip into arbitrary amounts of centralization if those increases were really guaranteed... because it could get to a point where no one but a few api providers bothered to run nodes. Sad
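For context on the subsidy decline mentioned above, the block subsidy follows a fixed halving schedule; a quick sketch (in Python):

Code:
def block_subsidy(height, halving_interval=210_000,
                  initial_satoshis=50 * 100_000_000):
    """Block subsidy in satoshis: 50 BTC at the start, halving every
    210,000 blocks (roughly every four years) until it reaches zero."""
    halvings = height // halving_interval
    if halvings >= 64:
        return 0
    return initial_satoshis >> halvings

As the subsidy shrinks, fees are what is left to pay for proof of work, which is why undermining the fee market matters here.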
1731  Bitcoin / Development & Technical Discussion / Re: I suspect we need a better incentive for users to run nodes (c) on: August 18, 2015, 07:34:13 PM
Have I misrepresented? He seems to be clearly stating that distributed nodes are a small scale solution with consolidation at scale. That I find depressing and stand by what I said. If I have misinterpreted, then please correct me. The only other interpretation that I can see is he is agreeing with me, that unless clients do the block processing and miners only "generate" coins, then the result will be big server farms as we are seeing now. I think that is a rather wistful reading of the words, however.

Yes, you have, though it's not your fault.  What a "client" in Bitcoin is has been subverted into something that can't enforce the rules at all and is utterly dependent on trusting miners.  That was never the design of the system.
1732  Bitcoin / Development & Technical Discussion / Re: I suspect we need a better incentive for users to run nodes (c) on: August 16, 2015, 01:46:57 AM
I was quite disappointed to read that Satoshi envisioned huge centralised farms and users just being users paying fees. That means the end game is a cartel of infrastructure rich companies - back to square one for the proles.
That's a misrepresentation in any case.

That response was to specific questions about the system being able to work at all.  My understanding of it at the time was simply what it said at face value: proof the system can work, and the users get to choose the trade-offs; which is something classical centralized systems couldn't offer.  ::shrugs::  Keep in mind that anything written in 2009-2011 was written in a very different world, not one where people just take for granted that Bitcoin works _at all_.

It's sad that people feel the need to put words in other people's mouths in any case. The whole appeal to authority hugely undermines the principles of the system.  If you want something that lives and dies on the whim of some authority: centralized systems can have much better performance and security properties.


The node incentives thing doesn't seem technically feasible. Or rather, the system had that built in but it was undermined by pooled mining.  We now know how to avoid any _need_ for pooled mining, but it's always less costly to pool (due to the costs of running a node).

SPV could work in a way that more or less obviates the need for almost anyone to run a node at all; but the existing software never implemented that-- and those working on SPV right now are okay with fairly centralized trust models and so they seem not to care. (Or even view the low security as a virtue.)
1733  Bitcoin / Development & Technical Discussion / Re: How would you prove that you own >= X BTC without disclosing addresses ? (ZKP) on: August 15, 2015, 11:06:13 AM
Using an AOS ring signature only works when you know the pubkeys, which you don't for most coins in the Bitcoin UTXO set.

In any case, the ring signature used to construct the CT range proof is just the AOS scheme when not used with any AND, so that's implemented in secp256k1-zkp.

To avoid the proof size you're off in snark land-- e.g. the statement you prove is "I know a private key for an adequate coin belonging to a utxo set with this hashroot". I previously suggested a scheme that lets you avoid doing 99% of the EC math inside a snark, so this could get a small, under-400-byte proof and be implementable. I think it was in a forum post... I'll have to look. But the basic process is to have the snark instead prove "Pubkey P is a blinding of a pubkey that is a member of this tree", and then you also prove you know P's discrete log externally to the snark. The verification of the blinding can be done with a single point addition in the prover.
1734  Alternate cryptocurrencies / Altcoin Discussion / Re: [VNL] Vanillacoin, a quiet word of warning. on: August 13, 2015, 07:51:39 PM
Everyone should read this article (especially the under the hood part):

http://cointelegraph.com/news/114794/miners-lost-over-50000-from-the-bitcoin-hardfork-last-weekend

John Connor warned the BTC devs about this issue:

https://github.com/bitcoin/bitcoin/pull/5634#issuecomment-69481908

Who are the idiots?

I'm afraid you failed to actually understand the discussion.  First, there has been no Bitcoin hard fork; the article you're linking to is simply flat out wrong.

Secondly, John Connor didn't warn about anything there-- in fact, he copied the code he was complaining about into his own codebase, after reformatting and with incorrect attribution in violation of the very minimal software license, and then lied about the functionality having been in his software all along. (See also: https://bitcointalk.org/index.php?topic=920344.msg10122209#msg10122209).

I find it remarkable that people will continue to use software written by someone who has been caught outright lying about the content of the binaries they distribute.  It reduces my faith in the potential for the success of cryptocurrency at all. What greater warning sign could you ask for?
1735  Bitcoin / Bitcoin Discussion / Re: What is soft folk and hard folk? on: August 13, 2015, 05:40:54 AM
That article is unfortunately pretty misleading.  Based on the commentary on reddit I think it actively harmed some people's understanding; you should read the responses there-- including mine.

https://www.reddit.com/r/Bitcoin/comments/3griiv/on_consensus_and_forks_by_mike_hearn/
1736  Bitcoin / Development & Technical Discussion / Re: Some blocks have earlier block times than their predecessing block, how/why? on: August 11, 2015, 11:20:56 PM
Who says what the altcoins do actually works-- many have just had spontaneous failures due to ill-advised changes like twiddling how retargeting works... and there is no reason to expect that most have ever been analyzed from an adversarial perspective (much less actually attacked). ... but that's a question for another subforum.

As you note, it works fine in Bitcoin as is... and, in fact, requiring the time to be monotonic would open up new attacks.  If you want a monotonic time out of Bitcoin you can extract one by simply running a rolling maximum over the timestamps; forcing miners to conceal more information won't help anything.
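For example, a quick sketch of what I mean by a rolling maximum (my own illustration, not anything in the Bitcoin Core code):

Code:
from itertools import accumulate

def monotonic_times(block_timestamps):
    """Derive a non-decreasing clock from raw block timestamps by
    taking the running maximum over them."""
    return list(accumulate(block_timestamps, max))

# monotonic_times([1000, 995, 1010, 1002]) -> [1000, 1000, 1010, 1010]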
1737  Bitcoin / Development & Technical Discussion / Re: "Tarball" of blocks to speedup first full sync on: August 11, 2015, 11:17:32 PM
We used to; but since 0.10 the blockchain is now downloaded in parallel and verified concurrently. Using the separate download, even via bittorrent, and then loading it is now usually slower than just syncing directly.
1738  Bitcoin / Development & Technical Discussion / Re: An easy way to double or triple the 6000 Bitcoin active full nodes count? on: August 11, 2015, 07:50:37 AM
Adding additional inbound reachable nodes to the network does not solve any problems we have currently. This isn't bittorrent: we're not trying to get more 'seeder capacity' or the like.

Nodes without inbound connectivity still help the network out in terms of partition resistance (more than inbound-reachable ones, to some extent, because their inaccessibility makes some DOS attacks harder to target) and block forwarding and transaction forwarding (which improves privacy somewhat for others too), but the most important thing a node does for the network is what it does for itself: it independently verifies the information that comes in and won't accept invalid data -- no matter what, and users running (and _using_) their own nodes is the exclusive mechanism that directly provides any incentive alignment for miners at all.

Under no condition should you say that a node without inbound is "leeching". It isn't. It means they're not contributing socket capacity, but the total node count has fallen so far that we're nowhere near that limit either. (And if we were, a few people would spin up a couple more high capacity nodes at a few hosting facilities and neatly address that.)

Quote
just sits there with a green tick regardless

It displays orange half-bars when there are <= 8 connections;  IIRC the green tick signifies that it thinks it's vaguely in sync with the network.

It does automatically use UPnP where available, though considering that so many of the resource usage complaints (which result in people not running the software at all) are related to inbound usage-- it might well be that furthering the misconception that one has to set up port forwarding to matter at all would just reduce the userbase further.
1739  Bitcoin / Development & Technical Discussion / Re: A asymmetric key question on ECDSA and Bitcoin on: August 10, 2015, 05:17:32 PM
FWIW that site isn't an "overview of curves", it's a site that exists to promote the authors' solution.  While their curve selection is a fine one, some of the criteria they chose are rather tortured and the things secp256k1 "fails" are not really relevant (at least for our usage).  Meanwhile, there are other criteria which they've not included which they could have, which their own preferred choices would fail-- for example, having a cofactor (which has actually resulted in vulnerabilities in protocols people created; ... which I'm not aware of happening as a result of e.g. a failure to have a unified addition law-- something they've faulted curves for because in theory the extra complexity ~might~ result in a bad implementation).   The page is thoughtful overall and lists a number of interesting considerations, but it's the discussion that's interesting and worthwhile. The binary yes/no on the site does a disservice.
1740  Bitcoin / Development & Technical Discussion / Re: Berkeley Database version 4.8 requirement unsustainable on: August 04, 2015, 06:04:26 AM
But yes, the wallet needs an import/export format that's not database specific.
It has one (importwallet / dumpwallet), and has for two years now.
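For example (the file path here is just a placeholder):

Code:
bitcoin-cli dumpwallet /path/to/wallet-export.txt    # write keys to a plain text dump
bitcoin-cli importwallet /path/to/wallet-export.txt  # load that dump into another wallet

The dump is human-readable text (keys plus metadata), so it isn't tied to any particular database backend.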