Primarily the limit is a safeguard, a backstop. It is not meant to be a constraint on legitimate commerce. It also serves to facilitate adoption and decentralization by keeping the ability to participate affordable. Backstops are beneficial when hit, if they are protecting grandma from getting hit with the ball.
Thank you. Would you agree, if it were possible (although it is not), that the blocksize limit should somehow be automatically tied to "the bandwidth and disk space an average enthusiast can afford"?
|
|
|
NewLiberty, I have a quick question for you which will hopefully clarify your position in my mind.
Excluding DOS flooding or other malicious actors, do you believe it would ever be a beneficial thing to have the blocksize limit hit?
|
|
|
Is there a way to make nodes agree on what is the most appropriate block to mine on top of? For example, the hash of each block could be used to make all nodes of the network agree and choose either (A) or (B).
Any such scheme would lead to exactly the same attacks mentioned above.
|
|
|
Further, with the federated paycode server model, they'd need to overwrite the existing invoice on multiple servers simultaneously in order for it to be effective.
An attacker would only need to DOS the other servers so that the attacker's server is the only working server remaining. (This is similar to what can happen with clients that use Electrum servers without doing full SPV; the Electrum client itself does do full SPV, but there's at least one client I'm aware of that uses Electrum servers and doesn't.)
|
|
|
With only 6 hex digits it would be way too easy for someone to "mine" an equivalent "address" so I don't see how this can work at all.
IMO QR codes are the best approach by far.
My thoughts exactly. My 3-year-old GPU can generate SHA-256 hashes at a rate of about 70M/s. 6 hex digits == 2^24 combinations, so about 2^23 attempts on average to find a colliding value; 2^23 / 70M per second equals about 0.1 seconds to fake a paycode. The signature doesn't seem to help, either: it only proves that an invoice is signed by someone. Of course, if a paycode server were to fake an invoice, it would create a perfectly valid signature signed by itself. Maybe, if the invoice and the paycode both covered the exact same set of data, such an attack would be more difficult, but at first glance this doesn't seem very safe from MITM.
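To make that back-of-the-envelope math concrete, here's a small sketch. The prefix length, hash rate, and input strings are assumptions matching the post's figures, and the live demo uses a 16-bit prefix instead of 24 so the loop finishes in milliseconds:

```python
import hashlib

# Assumed figures from the post: a paycode is identified by only 6 hex
# digits (24 bits) of a SHA-256 hash, and the attacker's GPU manages
# about 70 million hashes per second.
PREFIX_BITS = 24
HASH_RATE = 70e6  # hashes per second

# Roughly 2^23 attempts on average (the post's figure) to hit a 24-bit value.
expected_attempts = 2 ** (PREFIX_BITS - 1)
print(f"~{expected_attempts / HASH_RATE:.2f} s to forge a matching paycode")

# Toy demonstration with a 16-bit (4 hex digit) prefix so it runs instantly.
target = hashlib.sha256(b"victim paycode").hexdigest()[:4]
nonce = 0
while hashlib.sha256(f"attacker {nonce}".encode()).hexdigest()[:4] != target:
    nonce += 1
print(f"16-bit toy collision found after {nonce + 1} attempts")
```

The point of the demo is that shortening the search space by 8 bits shrinks the time by a factor of 256; extrapolating back up to 24 bits gives the fraction-of-a-second figure above.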
|
|
|
This I know, but I tend to think it's not the case. Anyone in this part of the forum, on this thread, probably doesn't have some obscure minority point of view; whatever the basis for their reasoning, it's probably not isolated. I think if we can get some sort of consensus on a thread like this, we can in the wider community too. If we can't, it would be harder, maybe not impossible, but harder depending on how dug in people are to their positions. The longer the wait, the harder. If this were mid-2010 there would likely be zero problem: high-profile devs (like Satoshi/Gavin) would simply explain the necessary way forward and the small community would move along. If we're talking 2019, I don't see that happening so easily, or at all actually.
Except for that last clause perhaps, no arguments here. I'd be in favor of more structure for progress, but you won't convince everybody. There will be purists that cry centralization.
More structure can cut both ways. If done well (big if), it can reduce centralization by better distributing "votes." But you're right that you can't convince everybody. There is always a long tail of technology out in the marketplace. Just because our community is at the cutting edge of technology doesn't mean everyone is. For example, I was surprised to learn of a story in the community I came from (Hacker News) about a very successful business that still ran BASIC. This was used for order fulfillment, accounting, you name it. The business was profitable in the millions if I recall, but completely reliant on their workhorse infrastructure. It wasn't cutting edge, but it worked, and that's all that mattered. A similar story exists in the banking industry. My first assignment after college (20ish years ago) was with a defense contractor maintaining a codebase written in... BASIC. In any case, point taken.
|
|
|
Consensus does not imply unanimous appeal. Although threads such as these help to flesh out different ideas and opinions, they do very little towards determining where consensus lies. This thread could represent a vocal majority just as easily as it could a small but vocal minority (on either side).
I had a complete reply typed out for all your points, but my browser closed before I sent it. Ah well, I'm not re-typing it. The gist is that I'm aware of the above, but don't think that's the case. I tend to think those in this part of the forum on this thread have sentiments which are not isolated. If we can gain consensus here we have a good chance in the wider community; if not, then who knows, but it would become ever harder with passing time. Fair enough. Aside: did you preview your post at any point? If so, it's in your draft history...
|
|
|
I have edited the OP to take advantage of your feedback on this kind of attack. 1-6 confirmation attacks are always possible under normal conditions, although you have to be extremely lucky.
The issue I'm trying to make more difficult is the possibility of long chains replacing the current one.
There's nothing magical about the number 6, as I'm sure you're aware. 7-confirmation reorgs or attacks could happen; they're just less likely than 6 (and more likely than 8 ). I don't think your idea is bad per se, I just don't think that the advantages it confers outweigh the new vulnerabilities it introduces, but that's just my opinion.
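For readers wondering how much less likely: the attacker catch-up probability from section 11 of the Bitcoin whitepaper can be evaluated directly. The formula itself is standard; the 10% attacker hashpower share below is just an assumed example, not a figure from this thread:

```python
from math import exp, factorial

def attacker_success(q: float, z: int) -> float:
    """Probability an attacker with hashpower share q ever catches up from
    z blocks behind (Satoshi's formula, whitepaper section 11)."""
    p = 1.0 - q
    lam = z * (q / p)  # expected attacker progress while honest chain adds z
    caught_short = 0.0
    for k in range(z + 1):
        poisson = lam ** k * exp(-lam) / factorial(k)
        caught_short += poisson * (1 - (q / p) ** (z - k))
    return 1 - caught_short

# Assumed example: attacker with 10% of total hashpower.
for z in (6, 7, 8):
    print(f"z={z}: P(success) = {attacker_success(0.10, z):.6f}")
```

Each extra confirmation cuts the success probability by a multiplicative factor, which is why 7 is safer than 6 but neither is "magical."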
|
|
|
At worst harder, but not impossible.
LOL are you not following this thread? What easy way forward do you see emerging for the block size issue? Consensus does not imply unanimous appeal. Although threads such as these help to flesh out different ideas and opinions, they do very little towards determining where consensus lies. This thread could represent a vocal majority just as easily as it could a small but vocal minority (on either side). This is exactly where a more formal governance model (as I mentioned) could help. It too would surely be imperfect, but just about anything would be better than determining consensus based on who writes the most posts, thoughtful though they may be. A formal governance model could draw distinct conclusions: yes the BIP passed, or no it didn't. If it didn't, it can lead to compromise. If, for example, I knew that there was little support for gavin's version, I for one would be much more willing to compromise. But I simply don't know.... instead, I choose to assume that people who support Bitcoin do so because they support the ideals of a free market, but I could be wrong. If the ISO can finally manage to crank out C++11, despite the contentious issues and compromises that were ultimately required (and C++14 just two months ago too!), pretty much anything is possible IMO.
That's for a programming language not a protocol. Also see Andreas Antonopoulos's comment on ossification considering hardware, which I also agree with. I'm having trouble imagining a use case where embedded hardware with difficult-to-update software would connect to the P2P network, much less having anything to do with handling the blockchain, but my imagination isn't all that great. I also have trouble in general with any device whose purpose is highly security related that isn't software upgradeable. (Such things do exist today, and they're equally ill-advised.)
|
|
|
At the risk of putting words into his mouth (for which I apologize if I'm wrong), gavin sees it as a technical anti-DOS measure: to prevent miners from DOSing voting enthusiasts out of the network.
But that's a very costly attack, yet it doesn't accomplish anything.

It's a free attack for a miner, and it can arbitrarily kick anyone off the network (even if temporarily) who doesn't have sufficient bandwidth or, ultimately, enough disk space.
|
|
|
You're proposing that the total work be based on the value of the individual hashes, instead of the hash targets, is that correct?
Yes, but with a little twist: to compare two chains, they must include blocks up to the same level.

This could incentivize a miner who happens to find a hash with an unusually low value to refrain from immediately broadcasting it. See, for example, this attack proposed a few years ago by casascius, which becomes viable with this proposed change.

Thanks for this particular case. To compare two chains they must include blocks up to the same level. So although our friend could throw in such a block, the rest of the miners (let's say they were 10 blocks ahead) would ask him to catch up by providing the missing 10 blocks, each one with a hash value equal to or smaller than the existing ones. PS. I'm trying to think of what should happen when 2 separated blockchains merge again.

I don't think that this completely eliminates casascius's attack. Evil miner M finds a lucky block, and withholds it. During this period, evil miner M prepares double-spend attacks, and can point his mining power at his new (secret) chain. Once good miner A publishes her new (but less lucky) block, M can take advantage of his double-spends (which now have only one confirmation), wait some short period of time, and then announce his superior block. M can therefore trick anyone who chooses to accept 1-conf transactions (which admittedly isn't all that smart, nor does M have that long to do so), plus he's also been able to mine on a secret chain while the rest of the network was wasting its time on A's shorter chain.
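The incentive problem can be seen in a toy model. This is a sketch of the two scoring rules under discussion, not Bitcoin's actual implementation; the difficulty target and hash values are made-up numbers chosen to make the comparison obvious:

```python
# Toy comparison of two ways to score chain work.
# Bitcoin credits each block with work implied by its *target*:
#   work = 2^256 // (target + 1)
# The proposal discussed here would instead score each block by its
# *actual hash value*, so one unusually lucky block can outweigh many
# ordinary ones.
MAX = 2 ** 256

def work_from_target(target: int) -> int:
    return MAX // (target + 1)

def work_from_hash(hash_value: int) -> int:
    return MAX // (hash_value + 1)

target = 2 ** 240  # assumed toy difficulty: any hash below this is valid

# Chain A: ten ordinary blocks whose hashes sit just under the target.
chain_a = [target - 1] * 10
# Chain B: a single "lucky" block with a hash far below the target.
chain_b = [2 ** 200]

# Bitcoin's rule: every valid block counts the same, so longer chain A wins.
assert sum(work_from_target(target) for _ in chain_a) > \
       sum(work_from_target(target) for _ in chain_b)

# Hash-value rule: B's one lucky block dominates ten ordinary blocks,
# which is exactly the withholding incentive behind the casascius attack.
assert sum(map(work_from_hash, chain_b)) > sum(map(work_from_hash, chain_a))
print("lucky single block outweighs ten ordinary blocks under hash scoring")
```

The "same level" requirement constrains chain length but, as the exchange above notes, it doesn't remove the withholding window itself.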
|
|
|
Please tell me if you agree that ossification of the protocol (the fact that it will become increasingly hard, probably impossible, to make changes as adoption grows) is what we'll likely see.
Not that I was asked, but I'll offer an opinion anyways. At worst harder, but not impossible. Today we have a sort of self-enforced (by the core devs) consensus system, plus of course the ultimate ability to vote with your node and with your mining capacity. I wouldn't expect the latter to ever change (indeed some blocksize limit is required to maintain this goal). For the former, however, I doubt that having this little governance around important changes to Bitcoin will last forever -- 20 years hence I would expect a much more regimented procedure, somewhat more akin to a standards organization than what we have today (perhaps with a combination of academic, miner, and corporate interests represented, but that'd be an argument for a different thread). More governance is both bad and good -- in particular, on the good side, bright lines can be drawn when it comes to voting in a way that doesn't happen so much today. If the ISO can finally manage to crank out C++11, despite the contentious issues and compromises that were ultimately required (and C++14 just two months ago too!), pretty much anything is possible IMO. If you're that worried about ossification, perhaps you'd prefer a dead man's switch: in 20 years, the blocksize reverts to its current 1 MB.
|
|
|
We all agree that the current max block size is too restricted.
What seems obvious to me is that different people have different opinions on the underlying purpose of any blocksize limit.
At the risk of putting words into his mouth (for which I apologize if I'm wrong), Gavin sees it as a technical anti-DOS measure: to prevent miners from DOSing voting enthusiasts out of the network. If this is true, the best solution would be an automatically adjusting limit that tracked the speed of residential connections and of residential hard drive capacities (enthusiast residences). Since that seems impossible, Gavin's limited-exponential growth seems like the best educated guess that can be made today.
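For a sense of what "limited-exponential growth" means in practice, here is a sketch. All numbers below are assumptions chosen for illustration, not the parameters of Gavin's actual proposal: an initial cap that doubles on a fixed schedule over a bounded horizon.

```python
# Assumed parameters for illustration only (not any specific BIP's values).
START_MB = 8.0        # assumed starting cap, in MB
DOUBLING_YEARS = 2.0  # assumed doubling period (~41% growth per year)

def cap_at_year(year: float) -> float:
    """Blocksize cap after `year` years of limited-exponential growth."""
    return START_MB * 2 ** (year / DOUBLING_YEARS)

for year in range(0, 21, 4):
    print(f"year {year:2d}: {cap_at_year(year):8.1f} MB")
```

The appeal of this shape is that it roughly tracks historical trends in consumer bandwidth and storage, which is the closest a fixed formula can get to "what an enthusiast can afford" without an oracle.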
Others see it as an economic issue, and would like to tie the limit to some economic aspect of Bitcoin to solve this perceived economic threat. I'm no economist, and I certainly don't know if they're right. But guess what: I personally don't care if they're right.
Any restriction on blocksize is an artificial one, a regulation determined by some authority who believes (perhaps correctly) they know better than I. I'm OK with technical restrictions, and those that improve the ability to vote, but I am completely against any restrictions whose purpose is to alter an otherwise free-market system.
To put it bluntly, I would rather see a restriction-free Bitcoin crash and burn in the hands of a free-market system than participate in a regulated Bitcoin. To me, Bitcoin should be a free-market experiment (as much as is technically feasible), even if this leads to its eventual failure. Of course, that's just my personal opinion, but it's the basis for my dislike of more-limited blocksizes.
I mean no disrespect to some of the clever alternatives presented in this thread-- but I personally wouldn't want any of these "regulations" in Bitcoin.
Let me ask a question: is there anyone here who both (a) favors additional blocksize restrictions (more strict than Gavin's), and also (b) believes such restrictions are not additional "regulations" that subtract from an otherwise more-free-market system?
|
|
|
You're proposing that the total work be based on the value of the individual hashes, instead of the hash targets, is that correct? This could incentivize a miner who happens to find a hash with an unusually low value to refrain from immediately broadcasting it. See, for example, this attack proposed a few years ago by casascius, which becomes viable with this proposed change.
|
|
|
I've heard that so many times since I joined, lol. But seriously, it would be great to change that skepticism into acceptance for us girls if more and more of us join up and speak up instead of hiding behind a male nickname forever, which is a pretty sad prospect if you think about it.
The skepticism is there for a reason, because usually the only time people announce themselves as female, it's someone pretending to be one. There's no reason to believe you are female either, regardless of your insistence that you are. I think a better way for women to get involved would be to not come out and announce that they're female straight away, and maybe not even mention it at all unless it needs to come up, because gender is irrelevant here.

Usually but not always, not in my case anyway. I wish I could kill that skepticism concept. Well, I've been here for many weeks now and I haven't asked for any loans or scammed anyone, so far so good.

That's only because if you did, you know you'd be called out straight away and get negative feedback faster than you could say there's sand in my vagina. Most 'female' scammers would probably wait a while and try to groom their victims before trying a scam. If I were you I'd forget this thread, let it die, and just carry on posting as you normally would, because like I said your gender is irrelevant here unless you want to use it to your advantage for whatever reason.

Nobody here denies that pretending-to-be-a-female-and-scamming is common. But gender is not irrelevant: if it were, it wouldn't bring with it the hostility we've seen in this thread. Let's consider some IRL analogies. If a woman walks into my church, I don't think anything unusual about it. If a woman walks into my basement and wants to play WoW with me and my male nerdy buddies, that is unusual.* Now, I couldn't care less if the playership of WoW is mostly male, but substitute math club for WoW, and suddenly I do start to care. A hostile environment for young women interested in math is something to be concerned about (and, for the record, back when I was in a math club in high school (yeah, I was that kind of nerd), there were no women in it). Bitcoin compares most closely to that last example.
If it's ever going to be taken seriously, it has to have the same gender balance as, say, the USD. I don't think it's particularly relevant whether or not any single individual (e.g. fabiola) is male or female, but the existing gender imbalance of Bitcoin and the occasional hostility that goes along with it absolutely is relevant.

* Those were just examples: I'm neither religious nor a WoW player, nor do I live in my parents' basement.
|
|
|
Thanks! That fixed it; now I can try to get my coins back.
Thanks again
Greetings
Thank you very much, it worked. (First time 0 balance, but with the -rescan parameter it is now OK; the passphrase works again.)
Great to hear, but don't thank me, thank jackjack!
|
|
|
Hello, I have a question, ... Now in the original wallet I used for the test I had 0.8 coins (not BTC). But when Qt has started I don't see the 0.8 coins back?
You'll need to run bitcoin-qt with the -rescan option, e.g. close Bitcoin Core, then go to Start -> Run, and type "C:\Program Files\Bitcoin\bitcoin-qt.exe -rescan".
|
|
|
I opened this topic https://bitcointalk.org/index.php?topic=497893.0 "Recovered Private Keys with Pywallet but Passphrase is Wrong". I have the same problem. After I recovered my wallet with pywallet in June and successfully loaded it into Bitcoin Core/Qt, I cannot send bitcoins because the passphrase is incorrect. I can't remember whether I tried sending from Bitcoin Core/Qt back in June (i.e. whether the passphrase was correct at that time), probably not (the available balance is the same as earlier, before the salvage error). Here is one of my posts on this issue: https://bitcointalk.org/index.php?topic=34028.msg9271649#msg9271649 I will try to import private keys from the file that pywallet's dumpwallet created, in this way: https://bitcointalk.org/index.php?topic=398155.msg6964230#msg6964230 but I cannot run the Unix cat command (step 4). So how can I import my private keys with another method? (The recovered wallet.dat can be loaded into Bitcoin Core, but the passphrase is incorrect.)

Responded to you in the pywallet thread over here. You should probably take it easy with posting to 4 different threads on the same issue... 2 threads (one in the Armory section, one in tech support) are plenty...
|
|
|