Bitcoin Forum
Show Posts
1  Economy / Economics / Re: About Bitcoin Foundation: Who is Lindsay Holland w/ $160k salary - $13k/month on: April 06, 2015, 10:30:39 PM
Secretary work?

Lindsay WAS the Foundation at the beginning, and did EVERYTHING (including single-handedly organizing the San Jose conference).

She, like me, got a huge salary boost because the Foundation's original policy was to set the exchange rate for paying salaries once per quarter. That policy changed pretty quickly (neither Lindsay nor I had any influence on our own salaries or the policies for how they were paid) to "use the CoinDesk price index on payday."

You have to remember that even simple things like "which exchange rate should be used to pay employees" didn't have simple answers back then (nobody had created a cross-exchange, volume-weighted price index).

I believe she left the Foundation partly because of annoying misogynistic Internet trolls like the ones found in this very thread....
2  Bitcoin / Development & Technical Discussion / Re: Bitcoin server - set minimum transaction fee on: April 03, 2015, 08:52:54 PM
Hello fellows!
I'm currently running a bitcoind server to receive and send transactions through my own software.
The transaction size of my spends is always near 100kb/500kb, so how do I set a minimum/pseudo-fixed transaction fee in my bitcoin.conf/software? If you haven't figured out what I mean yet: I would always pay 0.0001 BTC for each spend because they never exceed 1.000kb.
Hoping for a fast reply!

You can't; the reference implementation wallet always pays some-amount-per-1000-bytes-of-the-transaction.

The rules for the 0.10 release are:

+ By default, you always pay a fee (this is different from previous releases, which would send transactions without a fee if they had high enough priority; run with -sendfreetransactions=1 to get the old behavior).
+ By default, the fee-per-kilobyte is estimated based on recent transaction confirmation history.

To get close to what you want (pay a fixed fee per transaction), run with -paytxfee=0.whatever: that tells the wallet code to pay 0.whatever BTC per 1000 bytes. Most transactions are about 500 bytes big.

See here:  http://core2.bitcoincore.org/smartfee/fee_graph.html  ... for how high to make -paytxfee=0.whatever based on how long you're willing to wait for the first confirmation (that page graphs estimates from the latest&greatest fee estimation code from Alex Morcos that will hopefully be in the 0.11 Bitcoin Core release).
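To make the per-kilobyte arithmetic concrete, here is a toy Python sketch (my illustration; the wallet's actual rounding rules differ slightly, so treat the numbers as approximate):

Code:
# Approximate fee paid by the 0.10 wallet with -paytxfee set
# (illustration only; real wallet rounding differs slightly).
def estimate_fee(tx_size_bytes, paytxfee_btc_per_kb):
    return paytxfee_btc_per_kb * tx_size_bytes / 1000.0

print(estimate_fee(500, 0.0001))   # typical 500-byte spend -> 0.00005 BTC
print(estimate_fee(1000, 0.0001))  # at the poster's 1000-byte ceiling -> 0.0001 BTC

So -paytxfee=0.0001 caps the fee at the 0.0001 BTC the poster has in mind, though smaller spends pay proportionally less.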
3  Bitcoin / Bitcoin Discussion / Re: Bitcoin 20MB Fork on: March 17, 2015, 05:49:42 PM
Finally a reasonable question:

Gavin, could you please explain the reason behind trying to guess and hard-code the optimal limit, instead of doing something adaptive, like "limit = median(size of last N blocks) * 10"?

The problem people are worried about if the maximum block size is too high is that big miners with high-bandwidth, high-CPU machines will drive out either small miners or I-want-to-run-a-full-node-at-home people by producing blocks too large for them to download or verify quickly.

An adaptive limit could be set so that some minority of miners can 'veto' block size increases; that'd be fine with me.
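For illustration, here is a minimal Python sketch of the quoted median rule and why it gives a minority that veto (my sketch, not a concrete proposal):

Code:
# limit = median(size of last N blocks) * 10, as quoted above.
# The median moves only if >50% of miners mine bigger blocks,
# so a majority sitting still effectively vetoes any increase.
def adaptive_limit(recent_block_sizes, multiple=10):
    sizes = sorted(recent_block_sizes)
    mid = len(sizes) // 2
    median = sizes[mid] if len(sizes) % 2 else (sizes[mid - 1] + sizes[mid]) / 2
    return multiple * median

# 49% of miners produce 1MB blocks, 51% stay at 100KB:
sizes = [100_000] * 51 + [1_000_000] * 49
print(adaptive_limit(sizes))  # -> 1,000,000 bytes: the big-block minority didn't move it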

But it wouldn't help with "I want to be able to run a full node from my home computer / network connection." Does anybody actually care about that? Satoshi didn't; his vision was home users running SPV nodes and full nodes hosted in datacenters.

I haven't looked at the numbers, but I'd bet the number of personal computers in homes is declining or will soon be declining-- being replaced by smartphones and tablets. So I'd be happy to drop the "must be able to run at home" requirement and just go with an adaptive algorithm. Doing both is also possible, of course, but I don't like extra complexity if it can be helped.

It is hard to tease out which problem people care about, because most people haven't thought much about the block size and confuse the current pain of downloading the chain initially (pretty easily fixed by getting the current UTXO set from somebody), the current pain of dedicating tens of gigabytes of disk space to the chain (fixed by pruning old, spent blocks and transactions), and slow block propagation times (fixed by improving the code and p2p protocol).


PS: my apologies to davout for misremembering his testnet work.

PPS: I am always open to well-thought-out alternative ideas. If there is a simple, well-thought-out proposal for an adaptive blocksize increase, please point me to it.

4  Bitcoin / Bitcoin Discussion / Re: Bitcoin 20MB Fork on: March 17, 2015, 02:12:59 PM
Gavin gets it wrong. Big deal.
You know, there's a reason in business, why it's usually not the code-monkeys who get to make strategic decisions.
Are you this annoying in person, or just online?

I spent last week talking to some of the largest Bitcoin businesses (much bigger than Paymium/Bitcoin-Central or anything anybody in #bitcoin-assets is involved with), and they all want the maximum block size to increase.

The poll in this thread says people support it by a three-to-one margin.

It is going to happen sooner or later. I want it to happen sooner because Very Bad Things will happen if we get to 100% full 1MB blocks:
Quote
At 100% we're up at a huge 7744 seconds (more than 2 hours)! If the network were ever to reach this 100% level, though, the problems would be much worse as 10% of all transactions would still not have received a confirmation after 22800 seconds (6.3 hours).
http://hashingit.com/analysis/34-bitcoin-traffic-bulletin
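For a feel for where numbers like that come from, here is a toy Python queueing simulation (mine, not the linked analysis): blocks arrive at Poisson-random intervals, and average confirmation waits blow up as blocks approach 100% full.

Code:
import math, random

def avg_wait_seconds(utilization, blocks=200_000, capacity=2000):
    # Toy model: exponential block intervals (mean 600s), each block
    # confirms up to `capacity` txs FIFO, Poisson tx arrivals at
    # utilization*capacity per 600s (normal approximation).
    rate = utilization * capacity / 600.0
    backlog, total_wait, arrived = 0, 0.0, 0
    for _ in range(blocks):
        dt = random.expovariate(1.0 / 600.0)
        n = max(0, round(random.gauss(rate * dt, math.sqrt(rate * dt))))
        total_wait += backlog * dt + n * dt / 2  # queued txs wait all of dt,
        arrived += n                             # new ones about half of it
        backlog = max(0, backlog + n - capacity)
    return total_wait / max(1, arrived)

for u in (0.3, 0.9, 0.99):
    print(f"{u:.0%} full -> ~{avg_wait_seconds(u):,.0f}s average wait")

At low load the average wait is the familiar ~600 seconds; near 100% the backlog left by long inter-block gaps stops clearing and waits climb into the thousands of seconds, which is the shape of the curve in the linked analysis.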

I'm busy writing benchmarks, finding bugs in current code, and generally making sure nothing will break when we increase the maximum block size. If you want to be helpful instead of being an annoying troll, I have a TODO list you could help out with. Although the last time you agreed to help out, Dave, you didn't follow through on your promises (do you remember when you agreed to help with the testnet?).
5  Bitcoin / Bitcoin Discussion / Re: Bitcoin 20MB Fork on: March 11, 2015, 05:50:44 PM
Let's put a bullet in the head of this argument once and for all.
If people are driven away, it frees up space in blocks; it's a self-correcting problem that nicely filters out people who thought they needed Bitcoin-level bulletproof security and finally realized they didn't.

With economic reasoning like that, I see why Paymium has such low transaction volume (if you're making massive profits because your costs are so much lower processing so many fewer transactions than the bigger exchanges, then I'll adjust my priors).

6  Bitcoin / Development & Technical Discussion / Re: Slowing down block propagation on: March 07, 2015, 06:15:01 PM
Miner profit in fiat currency = number of transactions * average transaction fee * btc-to-fiat exchange rate
Should this not be:
Miner revenue in fiat currency = number of transactions * average transaction fee * btc-to-fiat exchange rate?

Yes, I meant revenue, not profit.

I need to stop saying "profit" entirely, even when I mean "profit"-- it has different meanings for economists and ordinary people.
7  Bitcoin / Development & Technical Discussion / Re: Slowing down block propagation on: March 07, 2015, 02:12:39 PM
How do TX relayers get paid?

Why do we need transaction relayers? What vital function do they provide?

Miners need to be connected to each other, and to transaction creators (individual users, exchanges, merchants, online wallets, etc).

And transaction creators need to be able to connect to miners. It seems to me transaction fees should certainly be enough incentive for miners to arrange plenty of opportunities for transaction creators to send them fee-paying transactions (it's cheap to run nodes that have tens of thousands of incoming connection slots).

8  Bitcoin / Development & Technical Discussion / Re: Slowing down block propagation on: March 05, 2015, 09:18:27 PM
I am nearly convinced now that a big block size together with IBLT is a threat to Bitcoin, as miners don't have any incentive to make smaller blocks.

Why do you want miners to have an incentive to make smaller blocks?

Smaller blocks means fewer transactions, so fewer opportunities to collect fees, so less profit.

Miner profit in fiat currency = number of transactions * average transaction fee * btc-to-fiat exchange rate

Experience (and common sense) says that more usage of Bitcoin means a higher btc-to-fiat exchange rate, so if you want to maximize miners' fee revenue, then increasing the number of transactions is the obvious way to do it.

If you think that putting an artificial cap on the number of transactions will increase overall miner profit, then I urge you to find a Real Economist and talk to them about the wisdom of trying to use production quotas to keep prices artificially high.
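Plugging illustrative (made-up) numbers into that identity shows why volume can beat scarcity:

Code:
# fee revenue = number of transactions * average fee * exchange rate
# (all numbers below are made up for illustration)
def fee_revenue_usd(num_txs, avg_fee_btc, usd_per_btc):
    return num_txs * avg_fee_btc * usd_per_btc

print(fee_revenue_usd(4_000, 0.001, 250))    # capped blocks, pricey fees: $1,000
print(fee_revenue_usd(80_000, 0.0001, 500))  # bigger blocks, cheap fees,
                                             # higher exchange rate: $4,000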

9  Bitcoin / Development & Technical Discussion / Re: Number of m-of-n ouputs per transaction on: February 26, 2015, 09:53:11 PM
... I managed to be wrong twice:  I forgot about the AreInputsStandard check for P2SH transactions that makes any transaction with more than 15 signature operations non-standard.

So if you REALLY need an m-of-16-to-20 transaction, use a non-standard raw CHECKMULTISIG; don't bother with Script gymnastics trying to work around the 520-byte push limit.

10  Bitcoin / Development & Technical Discussion / Re: Number of m-of-n ouputs per transaction on: February 25, 2015, 05:20:15 PM
Very nice work, DeathAndTaxes.

The 0.10 release makes almost all P2SH Script forms standard, opening up possibilities for working around the 520-byte-push limit.

Warning: half baked thoughts off the top of my head here, check my work and TEST TEST TEST:

There isn't room in 520 bytes for all the compressed public keys needed for m of 16-20. Can we safely move the public keys out of the serialized P2SH script onto the scriptSig stack?

e.g. go from a scriptSig that looks like:

Code:
0 signature  serialized(1 pubkey1 ... pubkey20 20 CHECKMULTISIG)

to:

Code:
0 signature pubkey1 ... pubkey20 serialized( 1 ... something ... 20 CHECKMULTISIG)

That's easy to do unsafely; ... something ... is just:

Code:
21 ROLL ... repeated 20 times

That's unsafe because anybody can redeem it with any 20 keys.

To be safe, you need a secure digest of the 20 public keys inside the serialized P2SH stuff. We've got HASH160 to create 20-byte digests, so we can get 26-bytes-per-pubkey with:

Code:
21 ROLL DUP HASH160 pubkey1hash EQUALVERIFY

Using PICK instead of ROLL you can probably save a byte per pubkey; if it can be done in 25 bytes then that gets under the 520-byte-push limit.

Aside: It would've been lovely if Script had a "hash the top N items on the stack, and push the result onto the top of the stack" operator.  Ah well.

BUT you're now putting 33+26 = 59 bytes per key into the scriptSig, so the 1650-byte-for-scriptSig-IsStandard limit will bite. If I counted everything correctly (and I almost certainly didn't), you could get 1-of-20 through 6-of-20 as standard (20-of-20 as non-standard but valid).

EDIT:  I already see a mistake:  pushing 21 onto the stack requires two bytes, not one.....
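Here is that back-of-envelope byte count as a Python sketch (same hand-waving assumptions as above, including the 2-byte push from the EDIT; depending on exact signature and push sizes the standard cutoff lands around 5-of-20 or 6-of-20, so check the work and TEST):

Code:
# Rough scriptSig accounting for the scheme above. Assumptions:
# ~72-byte DER signature + 1 push byte, 33-byte compressed pubkey +
# 1 push byte, ~27 bytes per key of "21 ROLL DUP HASH160 <hash>
# EQUALVERIFY" inside the serialized script (2-byte push of 21,
# per the EDIT), plus a few bytes of overhead.
def scriptsig_size(m, n=20):
    return (1                # leading OP_0 (CHECKMULTISIG off-by-one)
            + m * 73         # m signatures
            + n * 34         # n pubkeys pushed on the stack
            + n * 27 + 6)    # serialized script: per-key checks + overhead

for m in (1, 5, 6):
    size = scriptsig_size(m)
    print(f"{m}-of-20: ~{size} bytes, IsStandard: {size <= 1650}")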
11  Bitcoin / Development & Technical Discussion / Re: Are New Bitcoin Releases Going to Take Longer & Longer to Adopt by Majority? on: February 19, 2015, 07:05:59 PM
I think it will depend on the release.

Adoption of 0.10.0 is looking really good: https://getaddr.bitnodes.io/dashboard/?days=90#user-agents
... going from about 3% of nodes to 12% in just the last three days.

Different versions existing on the network isn't a problem until there is some incompatible change in either the consensus code (a hard fork) or the p2p networking protocol (and even that doesn't have to be a problem if there are "bridge" nodes that speak both versions of the protocol and relay blocks/transactions across the incompatible networks).
12  Bitcoin / Development & Technical Discussion / Re: Individual Block Difficulty Based on Block Size on: February 19, 2015, 05:06:47 PM
I think there's a large amount of cargo culting going on regarding the hash rate rather than useful analysis of threat models, attacker capabilities, and exactly what proof of work accomplishes.

I agree.

My guess is that we will end up with a very secure system with a modest amount of hashing in the future, because PoW hashing does three things:

1) Gives a steady 10-minute 'heartbeat' that limits how quickly new coins are produced
2) Makes it expensive to successfully double-spend confirmed transactions
3) Makes it expensive to censor transactions

The first becomes less important over time as the block subsidy halves.

I think we could do a lot to mitigate the second (see https://gist.github.com/gavinandresen/630d4a6c24ac6144482a for a partly-baked idea).

And I think the third might be mitigated naturally as we scale up and optimize the information sent across the network (there will be strong incentives to create "boring" blocks that don't include or exclude transactions everybody else is excluding or including).
13  Bitcoin / Development & Technical Discussion / Re: Individual Block Difficulty Based on Block Size on: February 17, 2015, 08:19:02 PM
Interesting idea, but I'm afraid I disagree with your premise.

There is no tragedy-of-the-commons race to zero transaction fees, because miners do not have infinite bandwidth, memory or CPU to accept and validate transactions.

We used to have a tragedy-of-the-commons situation with zero-fee transactions, but we solved that by rate-limiting them based on priority. And we have a working market for zero-fee transactions (see the graph here).

Assuming network bandwidth is the eventual bottleneck, and assuming there is demand for transactions to fill the available network-wide bandwidth (even if that demand is transaction spammers), nodes will start dropping transactions before they relay them. Prioritizing them based on fee paid and dropping the lowest fee/kb transactions will result naturally in a working market for fee-paying transactions.
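A minimal sketch of that drop-the-cheapest relay policy (my illustration in Python, not actual node code):

Code:
import heapq

# Keep the best-paying transactions within a byte budget;
# evict the lowest fee/kB first (illustration only).
class RelayPool:
    def __init__(self, max_bytes):
        self.max_bytes = max_bytes
        self.used = 0
        self.heap = []  # min-heap of (fee_per_kb, txid, size)

    def offer(self, txid, size_bytes, fee_btc):
        fee_per_kb = fee_btc * 1000.0 / size_bytes
        heapq.heappush(self.heap, (fee_per_kb, txid, size_bytes))
        self.used += size_bytes
        while self.used > self.max_bytes:   # full: drop cheapest first
            _, _, dropped_size = heapq.heappop(self.heap)
            self.used -= dropped_size

When demand exceeds the byte budget, a transaction paying below the pool's going rate simply gets evicted before relay -- that eviction threshold is the market-clearing fee the paragraph describes.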

As justusranvier points out, off-the-blockchain deals between transaction creators and miners don't change that logic, because low-fee transactions that are not broadcast break the O(1) block propagation assumption and have a direct cost to the miner.


I think you are trying to solve a different problem: I think you are trying to ensure that "enough" fees are paid to secure the network as the block subsidy goes away. Yes?
14  Bitcoin / Bitcoin Discussion / Re: Bitcoin 20MB Fork on: February 17, 2015, 06:50:50 PM
This is where my conversation with Gavin fell apart.  He was not able to acknowledge the concept of a too-high limit.  His reasoning was that since the limit was only one-sided (blocks with size above it are prevented) that it couldn't be too high.

Huh what?

I am not proposing infinitely sized blocks, so I obviously acknowledge the concept of a too-high limit as being plausible.

If you want to continue the conversation, please be very explicit about what problem you think needs solving, and how whatever solution you're proposing solves that problem.

We might agree or disagree on both of those points, but we won't have a productive conversation if you can't say what problem you are trying to solve.

To summarize my position: I see one big problem that needs solving:

Supporting lots (millions, eventually billions) of people transacting in Bitcoin.
  Ideally at as low a cost as possible, as secure as possible, and in the most decentralized and censorship-resistant way possible.

It is hard to get consensus on HOW to solve that problem, because no solution is obviously lowest cost, most secure, and most decentralized all at the same time, and different people assign different weights to the importance of those three things.

My bias is to "get big fast" -- I think the only way Bitcoin thrives is for lots of people to use it and to be happy using it. If it is a tiny little niche thing then it is much easier for politicians or banks to smother it, paint it as "criminal money", etc. They probably can't kill it, but they sure could make life miserable enough to slow down adoption by a decade or three.

"Get big fast" has been the strategy for a few years now, ever since the project became too famous to fly under the radar of regulators or the mainstream press.

The simplest path to "get big fast" is allowing the chain to grow. All the other solutions take longer or compromise decentralization (e.g. off-chain transactions require one or more semi-trusted entities to validate those off-chain transactions). I'm listening very carefully to anybody who argues that a bigger chain will compromise security, and those concerns are why I am NOT proposing an infinite maximum block size.

There is rough consensus that the max block size must increase. I don't think there is consensus yet on exactly HOW or WHEN.
15  Bitcoin / Bitcoin Discussion / Re: Bitcoin 20MB Fork on: February 06, 2015, 01:34:07 AM
If I ever go insane and suggest increasing the 21 million coin limit, please put me on your ignore list.

I will remind everybody again of Satoshi's second public post where he talked about scalability:

Quote
Long before the network gets anywhere near as large as that, it would be safe for users to use Simplified Payment Verification (section 8) to check for double spending, which only requires having the chain of block headers, or about 12KB per day. Only people trying to create new coins would need to run network nodes. At first, most users would run network nodes, but as the network grows beyond a certain point, it would be left more and more to specialists with server farms of specialized hardware. A server farm would only need to have one node on the network and the rest of the LAN connects with that one node.

The bandwidth might not be as prohibitive as you think. A typical transaction would be about 400 bytes (ECC is nicely compact). Each transaction has to be broadcast twice, so lets say 1KB per transaction. Visa processed 37 billion transactions in FY2008, or an average of 100 million transactions per day. That many transactions would take 100GB of bandwidth, or the size of 12 DVD or 2 HD quality movies, or about $18 worth of bandwidth at current prices.
If the network were to get that big, it would take several years, and by then, sending 2 HD movies over the Internet would probably not seem like a big deal.

Satoshi Nakamoto

If you didn't do your homework and thought that Bitcoin == 1MB blocks forever, well, that's your fault.

I signed up for a Bitcoin that would scale.
16  Bitcoin / Bitcoin Discussion / Re: Bitcoin 20MB Fork on: February 01, 2015, 11:35:10 PM
What's the problem with paying 10 bucks instead of 10 cents to securely transfer a million dollars?

What's wrong?

Let's say those million-dollar transactions are 250 bytes each. That's 4,000 of them in a 1MB block.

So $40,000 total reward to the miner -- about eight times current block reward.

BUT YOU ARE SECURING TRANSACTIONS SOMETHING LIKE 2,000 TIMES MORE VALUABLE THAN TODAY'S (the estimated USD value of today's average transaction is about $380). And I GUARANTEE that attackers would have a much easier time pulling off a double-spend of one million-dollar transaction than of 1,000 $1,000 transactions.

The math for "large value transactions will generate enough fees to secure the chain" just doesn't work.
The math for "lots of small transactions will generate enough fees to secure the chain" might.

Also:

I still haven't heard a coherent argument on why large value transactions are necessarily also high-fee transactions.

I'd suggest you go research existing high-value-payment networks and see what typical fees are for multi-million dollar transactions. FEDWIRE is running at 6 transactions per second, average transaction value over $6million, with fees per transaction UNDER ONE DOLLAR.

Why? Because if you are giving somebody one million dollars for something, you almost certainly have built up real-world trust, and probably have a longstanding relationship, signed contracts, etc etc.

If you think Bitcoin is different, please explain the scenario where I send a stranger who I don't trust (so have to rely completely on the blockchain) $1million for something.
17  Bitcoin / Bitcoin Discussion / Re: Bitcoin 20MB Fork on: January 31, 2015, 03:12:28 AM
It's not just connection speed that will prevent normal people from running Bitcoin;  the best internet service I can get at my house has a 250gb monthly limit.

Also, I'm currently running full nodes on computers with 500GB hard drives.  Is that the message?  "Buy bigger computers and move somewhere that offers better internet service" ?

I think we should target somebody with a "pretty good" computer and a "pretty good" home internet connection.

And assume that network bandwidth, CPU and storage will continue to grow at about the rates they've been growing at for the last 30 or more years (see the Wikipedia pages on Moore's Law and Nielsen's Law for pointers to discussions of those).

250GB per month is plenty for a 20MB block size
(20MB every ten minutes, times 6 blocks per hour, times 24 hours/day, times 31 days/month == 90GB; we currently transmit all transaction data twice (haven't optimized that yet), so double that and you're still well under 250GB per month).
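The same arithmetic in two lines of Python:

Code:
gb_per_month = 20 * 6 * 24 * 31 / 1000   # MB/block * blocks/hr * hrs/day * days
print(gb_per_month, gb_per_month * 2)    # ~89 GB, ~178 GB with everything sent
                                         # twice -- both comfortably under 250GB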


I believe it is extremely important to maintain the fundamental properties that Satoshi laid out -- because the system he described is the system that all of us who own bitcoin bought in to.

If the collective decision is to change some of those fundamental properties, then there must be extremely good reasons to do so.

On the block size issue, Satoshi said on Sun, 02 Nov 2008 on the metzdowd cryptography mailing list (in reply to a question about scalability):
Quote
Long before the network gets anywhere near as large as that, it would be safe for users to use Simplified Payment Verification (section 8) to check for double spending, which only requires having the chain of block headers, or about 12KB per day. Only people trying to create new coins would need to run network nodes. At first, most users would run network nodes, but as the network grows beyond a certain point, it would be left more and more to specialists with server farms of specialized hardware. A server farm would only need to have one node on the network and the rest of the LAN connects with that one node.

The bandwidth might not be as prohibitive as you think. A typical transaction would be about 400 bytes (ECC is nicely compact). Each transaction has to be broadcast twice, so lets say 1KB per transaction. Visa processed 37 billion transactions in FY2008, or an average of 100 million transactions per day. That many transactions would take 100GB of bandwidth, or the size of 12 DVD or 2 HD quality movies, or about $18 worth of bandwidth at current prices.

If the network were to get that big, it would take several years, and by then, sending 2 HD movies over the Internet would probably not seem like a big deal.

When I first heard about Bitcoin, it was small enough I could read everything, and I did, including all of those mailing list posts. The promise of a system that could scale up to rival Visa is part of the vision that sold me on Bitcoin.

I feel bad suggesting that we limit the block size at all, or that the target be home computers and internet connections -- but I think there are plausible concerns about centralization risk, and I think starting small and scaling up as technology advances is a reasonable compromise.
18  Bitcoin / Development & Technical Discussion / Re: Fake block timestamps can cause a temporary fork? on: January 19, 2015, 11:12:00 PM
Why would a miner want only half the network to build on their block?

That makes no sense... what is the +2hr timestamp miner trying to accomplish?


It would create a temporary fork and perhaps some confusion. Maybe even break some badly coded bitcoin applications if they do not handle forks well.

No, it really wouldn't, any more than a business-as-usual temporary blockchain race/fork creates confusion or breaks applications. Like the one-block fork we had today.
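For reference, the timestamp rules that make this a non-event: a Python paraphrase of the two consensus checks (a sketch, not the actual Bitcoin Core code):

Code:
import statistics

MAX_FUTURE_DRIFT = 2 * 60 * 60  # 2 hours

def timestamp_acceptable(block_time, last_11_block_times, adjusted_now):
    # Must be later than the median of the previous 11 blocks' times...
    if block_time <= statistics.median(last_11_block_times):
        return False
    # ...and no more than 2 hours ahead of this node's network-adjusted
    # clock. Nodes with slightly different clocks disagree near the
    # boundary, which is why a +2hr block can only split the network
    # briefly: everyone accepts it (or a competing block) within hours.
    return block_time <= adjusted_now + MAX_FUTURE_DRIFT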
19  Bitcoin / Development & Technical Discussion / Re: Is provoking a fork on purpose a good thing ? on: January 17, 2015, 10:04:02 PM
Imagine that you discover a fork condition between V1 and V2 of bitcoin core.
Surely enough, this should be reported to github, and test data must be updated.

No, please report security-critical issues (including consensus bugs) to the bitcoin-security mailing list: bitcoin-security@lists.sourceforge.net
20  Bitcoin / Development & Technical Discussion / Re: Fake block timestamps can cause a temporary fork? on: January 17, 2015, 09:59:06 PM
Why would a miner want only half the network to build on their block?

That makes no sense... what is the +2hr timestamp miner trying to accomplish?