Bitcoin Forum
May 13, 2024, 11:50:34 PM *
News: Latest Bitcoin Core release: 27.0 [Torrent]
 
  Show Posts
Pages: « 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24 25 26 27 28 29 30 31 32 33 34 35 36 37 38 39 40 41 42 43 44 [45] 46 47 48 49 50 51 52 53 54 55 56 57 58 59 60 61 62 »
881  Bitcoin / Press / Re: 2013-03-18 Wired: Ring of Bitcoins: Why Your Digital Wallet Belongs ... on: March 18, 2013, 02:02:52 PM
Great idea, great PR. However, skipping just one digit from the ring won't stop people from brute-forcing the key if the ring is stolen. Maybe it wasn't a great idea for Charlie to identify himself like that; a big part of the benefit of that trick seems to be the obscurity.

Well, for all we know he's lying about what the trick actually was.  Wink
882  Bitcoin / Development & Technical Discussion / Re: How to force a rule change by bloating the UTXO set on: March 18, 2013, 12:57:08 PM
Read my top few messages.
883  Bitcoin / Development & Technical Discussion / Re: How to force a rule change by bloating the UTXO set on: March 18, 2013, 07:45:25 AM
Unfortunately it could be up to 2016 blocks before the difficulty resets.  And even then it can only drop by a factor of 4x every 2016 blocks.  If the dropoff in hashrate is steep enough you could be stuck in one-block-per-week-land for a few years, effectively making the currency unusable (and therefore worthless).

IIRC this happened on testnet as the result of some sort of prank by Artforz, which is why it has its own difficulty-adjustment rules.

While you make a good point, in the scenario where you don't have a full copy of the UTXO set it's irrelevant anyway. Without that UTXO set you just don't know what is or isn't a valid transaction, so mining is pointless and validates nothing.
884  Bitcoin / Development & Technical Discussion / Re: Is the 21 million bitcoin limit unchangeable? on: March 18, 2013, 05:58:32 AM
How was I "not quite true"?  Because I didn't bother to specify details?  Was I wrong about the reasons for the change?  The comment wasn't deliberately misleading, either.  There was a change to prevent direct to IP connections, as that had become viewed as a potential security risk.  If the comment was wrong, that was likely accidental.

It's not quite true because you're implying there wasn't a limit. There was a blocksize limit all along; it's just that Satoshi decided it should be reduced from 32MiB to 1MB.

Look at the early commit history of Bitcoin sometime; Satoshi made really misleading comments all the time hiding major changes.
885  Bitcoin / Development & Technical Discussion / Re: Is the 21 million bitcoin limit unchangeable? on: March 18, 2013, 05:40:33 AM
My view is that the association is silly.  The max_blocksize rule was added by Satoshi after the system was already running, in order to remove an attack vector via spamming of the transaction queue, until such time as a more elegant solution could be found.

That's not quite true. The first versions didn't have a 1MB limit; blocks were only bound by the generic 0x2000000-byte (32MiB) limit applied to any serialized data. Satoshi later added MAX_BLOCK_SIZE so that miners wouldn't create blocks bigger than 1MB, but larger blocks were still accepted.

Finally the hard limit was reduced by Satoshi to 1MB in commit 172f006020965ae8763a0610845c051ed1e3b522. The commit comment is deliberately misleading: "only accept transactions sent by IP address if -allowreceivebyip is specified"
886  Bitcoin / Development & Technical Discussion / Re: How to force a rule change by bloating the UTXO set on: March 18, 2013, 02:57:39 AM
You suggest the current 1M limit will allow the UTXO set to grow by 52GB per year at most, or about 600k per block at most.

So we can raise the 1M limit to a 2M limit while putting a limit on UTXO growth at 600k. That would allow more legitimate use of the blockchain while killing spammy tx like this: http://blockchain.info/address/1PhKfcoFJbgfPcCPYih1RtWJWdYkmE2p4K, and your concern about UTXO bloat is completely solved.

The very idea that we'll just go off and change the limits without a demonstrated problem is why it's political. With the current 1MB hard limit, a 250KB UTXO growth limit makes sense (it's what I would have proposed). Increasing the hard limit in exchange for a UTXO limit still runs into the more important issue of ensuring that miners can operate on low-bandwidth connections.
887  Bitcoin / Development & Technical Discussion / Re: Block #225430 chain fork dataset available on: March 17, 2013, 11:50:56 PM
You are right, I should read the code indeed. A protocol defined in the code rather than in a specification - pros: no maintenance effort, no ambiguity (in theory); cons: difficult to read and understand, difficult to make other implementations (including new versions of the same program). The blockchain fork happened because devs forgot that Berkeley DB was part of the protocol. Without reading the code I find this bit messy.

"no ambiguity" <- that's exactly what failed. In v0.7, db.h, there is the following line:

Code:
class CTxDB : public CDB

That means: create a CTxDB class that extends the CDB class. That class is from an external library. What's CDB? What version? What does it do? All this stuff is ambiguous. Yet just "include every external library" doesn't work either; how far back do you go? While not an issue now, with really large blocks even subtle things like performance differences between hardware implementations can cause forks even with identical software.


Believe me, the developers understand the importance of the problem very well. As an example, Pieter Wuille and others have been working to prevent OpenSSL differences from causing a fork with IsCanonicalSignature() and similar. I don't happen to agree with Gavin on everything, maybe not even on most things, but I can agree that he has been taking his role in pushing testing and stability very seriously since he was hired by the Bitcoin Foundation, and for that matter even further back than that.

There aren't easy solutions to the specification problem, and I really think that writing yet another specification in addition to the imperfect one we already have is currently a waste of limited manpower. It may always be a waste of manpower: Bitcoin is in uncharted computer science territory with its extremely strict requirement for consensus.
888  Bitcoin / Development & Technical Discussion / Re: [ANN] bcflick - using TPM's and Trusted Computing to strengthen Bitcoin wallets on: March 17, 2013, 10:13:52 PM
One quick question: I didn't think TPMs have a secure clock. Did I just miss that feature, or have you found some other way to link the secure world to real time? You mentioned before that using the Date header from an SSL'd google.com request seems good enough.

An implementation of this designed for busy services could simply take advantage of the slowness of flicker itself to keep rough track of time: just assume every flicker call takes time t and increment your current-time variable accordingly.
889  Bitcoin / Development & Technical Discussion / Re: How to force a rule change by bloating the UTXO set on: March 17, 2013, 09:54:36 PM
Don't get me wrong, I think there should be a specific UTXO growth limit just like there is a blocksize limit. However js2012 described it as a solution that would solve the problem totally, which it won't.

If the issue was less politically controversial, not to mention technically risky, I'd propose a patch creating such a limit for the May 15th hardfork. As it stands there's no way it'd happen though.
890  Bitcoin / Development & Technical Discussion / Re: Block #225430 chain fork dataset available on: March 17, 2013, 08:15:09 PM
2. It is beyond belief that validity of a block could be decided by such implementation specific matters like BerkeleyDB record locking.
And, of course, there is no specification of what is a valid block. The code is the specification? So make no changes to the code and we won't have forks?
Please take the code at some point and write the specification of what is a valid block. Then change the code as you like and test if it's ok with the specs.
Also you may find out that the block validity rules are too weird and could refactor them better.
As Mike Hearn says, money could be lost here.


Specifications aren't magic; they're just words on paper. I can put anything into a specification, but it doesn't magically make code actually follow the spec. I can also take the specification and write tests, but again, the tests don't magically make the code follow the specification.

Before commenting further on the topic you need to read the Bitcoin sourcecode yourself. If you can't read it, you have no business commenting on software development anyway. If you can, you'll find that while it isn't perfect and could use some refactorings, all in all understanding the intent of the different parts is fairly easy and thus the code itself acts as a perfectly good specification.



891  Bitcoin / Development & Technical Discussion / Re: How to force a rule change by bloating the UTXO set on: March 17, 2013, 07:56:44 PM
As we increase the max block size, we may have a UTXO index, which is the total number of outputs in a block minus the total number of inputs in a block, and have a hard limit for it

Problem solved.
Some txouts are bigger than others, so maybe the sum of the sizes of the txouts created minus the sum of the sizes of the txouts spent would be more appropriate?

Very good!

Any comments from OP?  Roll Eyes

The idea isn't new. It also doesn't help much, because the UTXO growth limit has to scale with the overall blocksize limit or it becomes difficult to reliably get transactions confirmed. Since it has to be a significant fraction of the blocksize it still doesn't solve the underlying scaling problem; it only buys you time.
892  Bitcoin / Bitcoin Discussion / Re: Dangerous precedents set on March 12 2013 on: March 16, 2013, 06:17:14 PM
Satoshis paper makes no mention of a one megabyte block size limit.

Or a non-inflationary money supply.
893  Bitcoin / Development & Technical Discussion / Re: Is the 21 million bitcoin limit unchangeable? on: March 16, 2013, 05:37:09 PM
Gavin's quote is talking about "pressure" from the amount of transactions these businesses will put on the existing limits of the block chain. He's not talking about direct pressure from the businesses...

Quote
14:49   gavinandresen   Luke-Jr: argument for another day, but I can almost guarantee that the blocksize limit will be raised in less than 2 years, just based on pressure from the big businesses using the chain (and no, NOT satoshidice)
14:50   gmaxwell   gavinandresen: If pressure from startups with business plans come in conflict with the health of the system then thats an issue we'll have to resolve.
14:50   gavinandresen   gmaxwell: not startups with business plans, existing companies like BItPay and Coinbase that are seeing exponential growth

Using the chain != Petitioning the Bitcoin foundation


I believe in context "using the chain" refers to the act of using the chain for transactions directly rather than using an off-chain transaction system. Just following that quote was:

Quote
14:51   Luke-Jr   gavinandresen: are you overlooking the potential of other solutions than just growing the block size?
14:51   gmaxwell   Luke-Jr: probably a good discussion for another day.
14:51   gavinandresen   yes, time for me to take a shower and then process the 100 email messages that piled up while I was asleep
14:51   Luke-Jr   gmaxwell: well, if we're "certain" to increase the block size limit "within 2 years", I'd prefer to just discuss and schedule it now :/

If your business model relies on on-chain transactions and is threatened by high fees you have every reason to petition for change in a variety of ways, including to the foundation.
894  Bitcoin / Development & Technical Discussion / Re: Is the 21 million bitcoin limit unchangeable? on: March 16, 2013, 08:53:19 AM
Of course that assumes that the economic majority do want to backtrack to and/or stay with 0.7. If they did not, then maybe the rumours that [users or entities that economically serve as proxies for users] were pressuring for larger block sizes (that is to say, for versions 0.8 and up) might have been exaggerated.

I seem to recall someone suggesting Gavin was under pressure from big businesses to get larger blocks into play, other than that maybe there just happen to be a whole bunch of what maybe amount in this issue to "forum trolls" (said with some affection and tongue in cheek) loudly pushing for something the actual economic majority does not actually consider more important than first ensuring we can even actually in real life use the max block size we already had specified in capital letters in the source code.

Relevant IRC:

Quote
14:49   gavinandresen   Luke-Jr: argument for another day, but I can almost guarantee that the blocksize limit will be raised in less than 2 years, just based on pressure from the big businesses using the chain (and no, NOT satoshidice)
14:50   gmaxwell   gavinandresen: If pressure from startups with business plans come in conflict with the health of the system then thats an issue we'll have to resolve.
14:50   gavinandresen   gmaxwell: not startups with business plans, existing companies like BItPay and Coinbase that are seeing exponential growth

http://bitcoinstats.com/irc/bitcoin-dev/logs/2013/03/12#l6304349
895  Bitcoin / Mining / Re: Call non-pool miners 'block submitters' instead of 'miners' on: March 15, 2013, 08:21:54 PM
I like the term "hashers" myself, short and simple.
896  Bitcoin / Development & Technical Discussion / Re: How to force a rule change by bloating the UTXO set on: March 15, 2013, 04:46:39 PM
I'll refer the religious capitalist wonks to my earlier post on the nature of Bitcoin:

Suffice it to say that such large, amazingly outperforming oligopolies are extremely difficult to form on completely unregulated markets.

Bitcoin itself is an oligopoly. What are Bitcoins made of anyway? They're just bits, information, and by themselves information is incredibly, ridiculously cheap. Of course the incredibly low price of information is made possible by the free market itself, specifically the amazingly successful computer industry.

Bitcoin is a system by which every participant creates a shared oligopoly on a particular set of information, the blockchain. From day #1 Bitcoin was about taking information that, if subject to free market forces, would be so incredibly cheap that it'd be basically free and artificially making it expensive. This shared oligopoly, achieved through the rules set out by Satoshi, makes this information incredibly expensive, so much so that 32 bytes of information, a private key, can now be worth millions of dollars.

Basically, the decision about how big our shared oligopoly should allow blocks to be is just a decision about what rules we'll follow to make our little bits of otherwise worthless information as valuable as possible. Myself, gmaxwell, and many others happen to think that if we limit blocks to 1MiB each, keeping the regulations as they are, our little oligopoly will maximize the value of that information. Gavin, Mike Hearn, and many others happen to think that if blocks are allowed to be bigger than 1MiB, thus changing the regulations, our little oligopoly will maximize the value of that information.

Don't for a second think any of this discussion is about free market forces. Bitcoin is about artificially subverting free market forces through regulation, for the benefit of everyone participating in the oligopoly that is Bitcoin. It just happens to be that the way to become part of this oligopoly isn't by, say, living in a certain part of the world that's mostly desert, it's by either buying entrance (buying some Bitcoins) or by doing a completely made up activity that has no purpose outside the oligopoly. (mining)
897  Bitcoin / Development & Technical Discussion / Re: Decentralized networks for instant, off-chain payments on: March 15, 2013, 04:36:55 PM
Cool! Also I'm seconding jgarzik's demand to see a demo.  Smiley
898  Bitcoin / Development & Technical Discussion / Re: How to force a rule change by bloating the UTXO set on: March 15, 2013, 05:08:02 AM
And that, kids, is why every hobbyist wants at least one Google-style portable datacentre-in-a-container in his garage, just in case...

Plot for a cyberpunk story: the UTXO set "hoarding" happened, followed by a tech revolution that suddenly made storage space and network bandwidth a lot cheaper and P2P sharing of them feasible again. Given the enormous incentives, remote hackers, physical thieves, and insiders launch an all-out attempt to steal that UTXO set data. Whoever succeeds has done something not dissimilar to stealing the gold out of the national reserve, albeit in this case the "gold" is a constantly perishing commodity and the attempt would also require a co-ordinated launch of enough hashing power to make the new UTXO set actually useful. (Maybe done by remotely hacking water heaters around the world?)
899  Bitcoin / Development & Technical Discussion / Re: How to force a rule change by bloating the UTXO set on: March 15, 2013, 04:47:59 AM
I wonder if even BBQcoin would consider such a fun experiment to be their kind of fun..

Consider the politics too! Whose shill / sockpuppet are you!?!?! How much are the Bit-enders paying you to incite this attack upon the Lite-enders' currency? Don't you know that incitement to riot is a crime?

Well this is why I'm perfect for the job: it's already clear to many (if not all) that I am hell-bent on destroying Bitcoin and likely any other crypto-coin. Tongue

But wait! Maybe we can hoist the bit-enders by their own petard! Suppose LiTeCoin actually did volunteer itself as the "victim" chain for this little "experiment" and ended up scaling up to massively Googillianesque scales, becoming so valuable and so huge that not only does it lure the Fortune 100 into adopting it but also only the most technologically pre-prepared of the Fortune 100 could even begin to consider hosting its gigapetabytes of UTXOs let alone its petapetabytes of actual blockchain? Wouldn't it have "won"? Wouldn't that show the bit-enders a thing or two about who is really the most super, most-dooper, most uber uber-chain of them all?!?!

Maybe the only real question here is, are the bit-enders scaredy-cats, that they would hand over such an opportunity to prove a chain's uberness to some other chain instead of jumping at the opportunity to prove their own uberness by making their own test-net chain more valuable than any altchain could ever hope or dream of becoming?

Ha. I remember a discussion I had with someone on IRC awhile back where we were joking that we should port SatoshiDice to one of the alt-coins to kill it in a flood of spam: "Of course, we couldn't just run bots, we'd have to let anyone make real bets. We could even fund the effort from the profit generated! In fact, it might kickstart the alt-coin's adoption, raise the price, and before long we'll be implementing off-chain transactions to ensure we don't kill our golden goose!"

They'd need to move to bitcoin v0.8 first and then hard fork otherwise it's pointless as it would hit the BDB limit.

Of course. Probably an easy task, given that there aren't many technological differences between Litecoin and Bitcoin.

It isn't the miners who decide.   It is the economic majority who will buy the miner's coins that decides.  

Re-read my post. I'm well aware of the economic majority argument and wrote that post specifically to demonstrate how it doesn't always apply.


While we're at it, consider how database protection laws (i.e. "copyright-like" rights for databases) could apply to blockchains. Keeping a UTXO set will be very expensive, so it's easy to see how full nodes with that data will want to find some way to be paid to maintain it. Directly charging for access is one option; another is for miners to offer free access. In the latter scenario, even if the miners don't ask for exclusivity contracts in return (i.e. you agree to only send your transactions to this mining pool), there are good reasons for a miner to ensure that no one can do a wholesale copy of their UTXO database, as allowing that, even if paid for, lets competitors start up. Violating those agreements can be treated as violating database protection laws in jurisdictions where they exist, in addition to violating the contracts themselves. (Note the very similar concepts outlined in the "red balloons" paper.)

The transactions included with each block form another source of UTXO data, albeit a source that requires one to stay online constantly. Now you have yet another perverse incentive: if a competitor experiences some downtime for whatever reason you have no reason to want to give them the UTXO data that allows them to start mining again. Of course, they have the option of trying to pay for it, but ultimately the mining pool as a business has to ask the question "Do you accept $x from a competitor, when if you don't accept the payment they may be forced out of business?"

Finally, when you create a block you do have an incentive to distribute the transactions associated with that block so as to allow your competitors to build upon it. But your incentive is only for >50% of the hashing power to have that data, because it's that >50% who will inevitably create the longer chain. Of course, accounting for measurement error and variance you'll want to target a higher percentage, but the point is that miners have incentives to ask other miners for proof they control a certain amount of hashing power, and to simply not bother relaying solved blocks to players who only control a tiny amount. Again, when the number of miners is sufficiently small this can all be done as legal agreements backed by contract law and database protection laws.

tl;dr: a rational miner has an incentive to deliberately withhold transaction and UTXO data when the cost of maintaining that data is high and the overall number of miners is low, and that incentive perversely leads to even further centralization.
900  Bitcoin / Development & Technical Discussion / Re: Cancelling unconfirmed transactions on: March 15, 2013, 04:01:06 AM
Re: retep & deepceleron,
Why promote bad usage habits in the first place?

I'm a pragmatist. If I wrote a patch that replaced transactions with any newer one if the total fees were increased, it would never get into the reference client.

On a conceptual level I see no problem with spamming the memory pool with multiple versions of the same transaction. Obviously the first one to get included in a block makes all the others invalid -- but why is that such a problem?

As an aside, you must require a limited resource to be expended to broadcast a transaction on the network for DoS protection. This also implies that the network bandwidth required to run a full node will be an even higher multiple of the "blockchain bandwidth" than it already is. How to price that properly is an open question.