Author Topic: WARNING! Bitcoin will soon block small transaction outputs  (Read 56596 times)
jdbtracker
Hero Member
*****
Offline

Activity: 687


Minimum Effort/Maximum effect


May 09, 2013, 08:52:02 PM
 #321

Quote
Is there a web page that gives some statistics on fees in a block?

blockchain.info

Quote
Is there a market for a DVD of the blockchain to bootstrap people with challenging circumstances?  Today, I have fiber optic to the wall of the house, it terminates within 3' of where I am typing.  My mother has a parcel of land next to her farm and it might be good for me to live close to her as she is getting old and lives alone, it is on a dialup.  I don't think I become less savvy or less important if I want to think about her safety and welfare.

Yup, copy the Blocks folder from the Bitcoin data directory. On Windows 7 that's C:\Users\<username>\AppData\Roaming\Bitcoin

And for the last question: yes, you get 0.00000003 in change. But that is a good point: would that change be blocked by a 0.00005430 limit?
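
Rough back-of-envelope in Python (the 1/3-of-value rule and the ~181-byte spend size are my reading of the proposed patch, so treat them as assumptions rather than the client's exact code):

Code:
# Sketch of the proposed dust rule: an output is "dust" if spending it
# would cost more than 1/3 of its value at the minimum relay fee rate.
MIN_RELAY_FEE = 10_000  # satoshis per 1000 bytes (the client default today)
SPEND_SIZE = 181        # assumed bytes: a typical txout plus the txin
                        # that later spends it

def is_dust(value_satoshis):
    fee_to_spend = MIN_RELAY_FEE * SPEND_SIZE // 1000  # = 1810 satoshis
    return value_satoshis < 3 * fee_to_spend           # threshold = 5430

print(is_dust(3))     # True:  0.00000003 change would be refused
print(is_dust(5430))  # False: at the threshold it is still standard

So by that sketch, yes: the 3-satoshi change would be blocked.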

If you think my efforts are worth something; I'll keep on keeping on.
I don't believe in IQ, only in Determination.
bg002h
Donator
Legendary
*
Offline

Activity: 1358


I outlived my lifetime membership:)


May 10, 2013, 01:33:19 AM
 #322

Quote
The more people who run a full node, the greater the decentralization[1][2].

I can't run a full node full time right now because there is no upload throttling. The Bitcoin client slows my Internet down to literally unusable (4 sec+ pings).

Not because of hard drive space; I've got 900 GB free right now.

What kind of Internet connection do you have? I'm running a full node on a refurbished low-end Mac mini and can't tell that I'm running it. I do have a 100 Mbit/s connection, but I think I would do fine with 1/10 of that.

Hardforks aren't that hard.
1GCDzqmX2Cf513E8NeThNHxiYEivU1Chhe
solex
Legendary
*
Offline

Activity: 1078


100 satoshis -> ISO code


May 10, 2013, 01:53:03 AM
 #323


Quote from: jdbtracker on May 09, 2013, 08:52:02 PM
And for the last question: yes, you get 0.00000003 in change. But that is a good point: would that change be blocked by a 0.00005430 limit?


OK. I'll bite. General question:

Why can't it be a standard feature of wallets that, during preparation of a transaction, any UTXO < coin_dust is spent and its value added to the transaction fee? Surely this would help prevent a lot of the new dust spam causing blockchain bloat. Does any wallet do this?
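
Something like this, perhaps (a minimal sketch of the idea; the threshold, field names, and selection order are made up for illustration, not taken from any real wallet):

Code:
DUST_LIMIT = 5430  # satoshis (assumed threshold)

def select_inputs(utxos, amount_needed):
    """Pick inputs; sweep sub-dust coins (and sub-dust change) into the fee."""
    selected = [u for u in utxos if u["value"] < DUST_LIMIT]  # donate dust
    total = sum(u["value"] for u in selected)
    # Then add normal coins, smallest first, until the target is covered.
    for u in sorted((u for u in utxos if u["value"] >= DUST_LIMIT),
                    key=lambda u: u["value"]):
        if total >= amount_needed:
            break
        selected.append(u)
        total += u["value"]
    if total < amount_needed:
        raise ValueError("insufficient funds")
    change = total - amount_needed
    if change < DUST_LIMIT:
        change = 0  # too small to be worth an output; falls through to the fee
    return selected, change

One caveat: each swept dust input adds ~148 bytes to the transaction, so a real wallet would want to cap how many it sweeps per tx.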

astutiumRob
Full Member
***
Offline

Activity: 197



May 10, 2013, 02:24:16 AM
 #324

Quote from: DeathAndTaxes
It increases the costs of that dataset that cannot be pruned

There's no real reason the dataset cannot be pruned - I've been playing with a DB copy of the blockchain, looking at ways of "removing" the records for addresses with a nil balance (amount out = total amount in) where the date is > 30 days ago.

I don't *need*, as a _user_ of bitcoins, the whole blockchain, if I could get "balances at a point in time" and the journal entries after that.

I similarly don't really *need* (although I might _choose_) to have all my addresses maintain a chain of completed transactions. If I could have a way to "move" the balance to another address within my wallet, I could then "discard" that address.

My "real life" wallet doesnt care which ATM each of the notes came out of, I have a "balance" (occasionally) in there I can spend - in fact having that "tracking" decreases anonymity significantly.

A lightweight client is a key to mass adoption (amongst a number of other things).

It's great that my children can empty a moneybox, see a £2 coin, and know from the year "that was the one from Grandma on my 6th birthday". It rapidly becomes irrelevant when it's a pile of coins getting spent on a beachball; getting the sand out of your shoes becomes much more pressing. In the same way, I don't need to know that 4 mBTC came from me testing -QT to a non-QT client. It's just 4 mBTC to be spent, which, due to the size and age of the bitcoins, will probably cost me more in fees to use than it's worth.

Imagine buying a car for £5000 and taking 500 × £10 notes to the dealer, only to find they can't sell it to you because the notes came from 702 different amounts of change from your wages, and some are "worth" less when spending than £10 because they're notes only printed that morning, or were made up of 200 × 5p transactions... In the "real" world, £5k is £5k is £5k, not some variable equivalent that might eventually be £5k.

 Huh

www.astutium.com - domains | hosting | vps | servers | cloud - proud to accept bitcoins. UK colocation for BFL and KNC ASICs in Tier3+ DC
Register Domains with BTC
Want to make some bitcoins ? Miner on ebay | Buy GH/s
marcus_of_augustus
Legendary
*
Offline

Activity: 2380



May 10, 2013, 03:12:15 AM
 #325

Quote
Imagine buying a car for £5000 and taking 500 × £10 notes to the dealer, only to find they can't sell it to you because the notes came from 702 different amounts of change from your wages, and some are "worth" less when spending than £10 because they're notes only printed that morning, or were made up of 200 × 5p transactions... In the "real" world, £5k is £5k is £5k, not some variable equivalent that might eventually be £5k.

Yes, this is another manifestation of Bitcoin's weak-fungibility problem, which also manifests as pseudonymity rather than strong anonymity. I think Satoshi mentioned something about lightweight clients only keeping the previous two TX records deep for each coin in the DB.

DeathAndTaxes
Donator
Legendary
*
Offline

Activity: 1218


Gerald Davis


May 10, 2013, 03:19:22 AM
 #326

Quote from: astutiumRob on May 10, 2013, 02:24:16 AM
Quote
It increases the costs of that dataset that cannot be pruned
There's no real reason the dataset cannot be pruned - I've been playing with a DB copy of the blockchain, looking at ways of "removing" the records for addresses with a nil balance (amount out = total amount in) where the date is > 30 days ago.

I think you misunderstand.  Nobody is saying the blockchain can't be pruned.  IT CAN be pruned; however, the UTXO set (the set of unspent transaction outputs, which can still be inputs for future txs) CAN'T be pruned.  That is fine, because generally the UTXO set grows slower than the blockchain (people tend to spend unspent outputs, creating roughly the same number of new unspent outputs).  There is one exception: UNECONOMICAL outputs.

If you had a 0.0000001 BTC output, but it would cost 100x as much in fees to spend it, would you spend it?  Of course not.  It's kinda like mailing a penny (at a cost of $0.46 in postage) to your bank to apply to your mortgage principal: nobody does that, because it doesn't make economic sense.  So these uneconomical outputs are likely NEVER going to be spent.  Each one that is produced won't be spent, thus won't be pruned, and will remain in the UTXO set forever (or a very long time, on average).  This is causing the UTXO set to bloat, and it will continue to bloat, as there is no reason for anyone to ever spend these outputs (and spending is what allows an output to be pruned).

The UTXO set is the critical resource.  In order to validate txs quickly, the UTXO set needs to be in memory.  So what happens when the UTXO set is 32 GB? 64 GB? 200 GB?  Now, if those are "valid" outputs likely to be used in future txs, that is just the cost of being a full node.  But when 50%, 70%, 95%+ of the outputs are just unspendable garbage, it greatly increases the resource requirements of full nodes without any benefit to anyone.
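
To put very rough numbers on that (the per-entry size here is a guess; the real cost depends on the database format):

Code:
ENTRY_BYTES = 75  # assumed average bytes per UTXO-set entry
for n in (10**8, 10**9):
    print(f"{n:>13,} unspendable outputs ~ {n * ENTRY_BYTES / 1e9:,.1f} GB")
# 100,000,000 outputs ~ 7.5 GB; 1,000,000,000 outputs ~ 75.0 GB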

Quote
I don't *need*, as a _user_ of bitcoins, the whole blockchain, if I could get "balances at a point in time" and the journal entries after that.
Of course you don't, which is the whole point of pruning the blockchain. However, you do need to retain a copy of every unspent output; otherwise, when you receive a tx or block containing that output as an input in a new tx, you can't validate the tx or block. If the input is coming to you, you can't even know whether the tx/block is valid or just some nonsense garbage that an attacker sent to trick you into thinking you got paid.

This unprunable dataset is a subset of the blockchain, and txs below the dust threshold simply bloat it.
justusranvier
Legendary
*
Offline

Activity: 1400



May 10, 2013, 04:17:25 AM
 #327

Quote from: DeathAndTaxes on May 10, 2013, 03:19:22 AM
In order to validate txs quickly, the UTXO set needs to be in memory.  So what happens when the UTXO set is 32 GB? 64 GB? 200 GB?  Now, if those are "valid" outputs likely to be used in future txs, that is just the cost of being a full node.  But when 50%, 70%, 95%+ of the outputs are just unspendable garbage
...they'll get pushed to swap space along with all the other memory pages that haven't been accessed for a while? We expect caching algorithms and virtual memory to still be a thing in the future, right?
gmaxwell
Staff
Legendary
*
Offline

Activity: 2296



May 10, 2013, 04:48:24 AM
 #328

Quote from: DeathAndTaxes on May 10, 2013, 03:19:22 AM
The UTXO set is the critical resource.  In order to validate txs quickly, the UTXO set needs to be in memory.  So what happens when the UTXO set is 32 GB? 64 GB? 200 GB?  Now, if those are "valid" outputs likely to be used in future txs, that is just the cost of being a full node.  But when 50%, 70%, 95%+ of the outputs are just unspendable garbage, it greatly increases the resource requirements of full nodes without any benefit to anyone.
It doesn't need to be in _RAM_; it needs to be in fast, reliable storage (online storage, not nearline or offline, not on a tape jukebox in the basement or on faraway storage across a WAN), and the validation time depends on how fast that storage is. If you put it on storage with a 10 ms random access time and your block has 2000 transactions with 10 inputs each, you're looking at 200 seconds just to fetch the inputs, which is going to utterly wreck network convergence, cause a ton of hashrate loss due to forks, and make people need more confirmations for security.  But in practice it's not quite that bad, since _hopefully_ a lot of spent outputs were recently created.
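
(Spelling out the arithmetic behind the 200-second figure, assuming every input lookup misses the cache and pays a full random access:)

Code:
seek_s = 0.010                 # 10 ms random access per input fetched
txs, inputs_per_tx = 2000, 10
print(txs * inputs_per_tx * seek_s, "seconds")  # 200.0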

The 'memory' stuff is mostly a tangent; the issue is that the UTXO data can't be pruned. All full validators must have access to it, and bloat in this dataset pressures people to run SPV nodes instead of full validators... which risks a loss of decentralization, a loss of motivation for miners to behave honestly, etc.

Bitcoin will not be compromised
Peter Todd
Legendary
*
Offline

Activity: 1106


May 10, 2013, 06:48:32 AM
 #329

Quote from: gmaxwell on May 10, 2013, 04:48:24 AM
It doesn't need to be in _RAM_; it needs to be in fast, reliable storage (online storage, not nearline or offline, not on a tape jukebox in the basement or on faraway storage across a WAN), and the validation time depends on how fast that storage is. If you put it on storage with a 10 ms random access time and your block has 2000 transactions with 10 inputs each, you're looking at 200 seconds just to fetch the inputs, which is going to utterly wreck network convergence, cause a ton of hashrate loss due to forks, and make people need more confirmations for security.  But in practice it's not quite that bad, since _hopefully_ a lot of spent outputs were recently created.

The fact that it usually works because most of the outputs were recently created is incredibly dangerous. If the ratio of best-case to worst-case performance gets bad enough, an attacker just has to come along with a block spending outputs that weren't recently created, or otherwise picked in a way where retrieval happens to be slow, to knock slower miners offline. Even worse, if they can come up with two blocks where each block triggers performance problems on one implementation but not the other, they can split the network. They don't even have to mine those blocks themselves if the transactions in them are standard enough that they can get someone else to mine them.

In Bitcoin, any performance problem can become a serious security problem. We only get away with it now because computers are so fast in comparison to the transaction volume and the 10-minute target, but if we start needing to "optimize" things, including solutions like aggressively passing around transaction hashes rather than the transactions themselves when a new block is propagated, we open ourselves up to serious security problems.

ecliptic
Sr. Member
****
Offline

Activity: 322


May 10, 2013, 07:06:14 AM
 #330

So is this basically because there is highly illegal shit like CP embedded in the blockchain forever via nano-transactions?

Of course this would never be admitted, but it comes right on the heels of rumors of its use for this purpose.
DeathAndTaxes
Donator
Legendary
*
Offline

Activity: 1218


Gerald Davis


May 10, 2013, 07:28:11 AM
 #331

Quote from: ecliptic on May 10, 2013, 07:06:14 AM
So is this basically because there is highly illegal shit like CP embedded in the blockchain forever via nano-transactions?

Of course this would never be admitted, but it comes right on the heels of rumors of its use for this purpose.

No, that has nothing to do with this.  One could still "embed" data in the blockchain just by ensuring the outputs are larger than the dust threshold.
Le Happy Merchant
Hero Member
*****
Offline

Activity: 634



May 10, 2013, 08:38:15 AM
 #332


We are two different kinds of nerd.

kokjo
Legendary
*
Offline

Activity: 1050

You are WRONG!


May 10, 2013, 08:46:18 AM
 #333

Likely to be true, but given the context (Bitcoin, CS, and the internet), "distributed hash table" is what DHT means.

"The whole problem with the world is that fools and fanatics are always so certain of themselves and wiser people so full of doubts." -Bertrand Russell
Gavin Andresen
Legendary
*
Offline

Activity: 1652


Chief Scientist


May 10, 2013, 04:44:15 PM
 #334

Quote from: Peter Todd on May 10, 2013, 06:48:32 AM
The fact that it usually works because most of the outputs were recently created is incredibly dangerous. If the ratio of best-case to worst-case performance gets bad enough, an attacker just has to come along with a block spending outputs that weren't recently created, or otherwise picked in a way where retrieval happens to be slow, to knock slower miners offline.

Who gets to decide how slow is too slow?

Mining these days requires investing in ASIC hardware. Solo mining or running a pool will very soon require investing in a reasonably fast network connection and a machine with at least a few gigabytes of memory.

Knocking the slowest N% of solo miners/pools off the network every year (where N is less than 20 or so) is not a crisis. That is the way free-market competition works.

How often do you get the chance to work on a potentially world-changing project?
jonytk
Member
**
Offline

Activity: 106



May 10, 2013, 10:31:39 PM
 #335

Quote
Now, we will be regulated to only sending transactions of a certain size.  No free market choice here...

Shame on them for limiting the amount to one satoshi! It should be 1/100000 of a satoshi...

What kind of argument is that?  Roll Eyes
You have a point, but Bitcoin started with an understanding that 1 satoshi was the minimum.  Now we're being told that the limit is 5430 satoshis, with no free-market input on the matter.  It's rather disappointing.  Individuals should be able to decide what size of transaction is too small - we shouldn't all be forced to suddenly abide by the same arbitrary rule.

5430 satoshis is negligible, less than a US or euro cent, and a very sensible minimum. This cutoff is a needed arbitrary rule, which mirrors the real world where fiat sub-cent transactions are also unwelcome.  The 5430 will be reduced as BTC value increases.

This whole thread is a fuss about a benefit interpreted wrongly.

The Achilles heel of Bitcoin is being swamped by transactions worth less than a cent because, unlike fiat coinage transactions, Bitcoin transactions are stored on thousands of servers for years or forever.



This...


Quote from: DeathAndTaxes on May 10, 2013, 03:19:22 AM
I think you misunderstand.  Nobody is saying the blockchain can't be pruned.  IT CAN be pruned; however, the UTXO set (the set of unspent transaction outputs, which can still be inputs for future txs) CAN'T be pruned. [...] This unprunable dataset is a subset of the blockchain, and txs below the dust threshold simply bloat it.

Absolutely. The solution could be to send all dust amounts in a transaction to miners as fees, or back as change, IF the destination address IS empty (or holds under 0.1 BTC).

That would help with spam and spam-like transactions. Example: a thousand million new Chinese/Indian users creating new addresses, going to a faucet, and receiving satoshis that will never be used because they will lose their wallet.dat in an HD crash... If they want to use the faucet, they should get 0.1 BTC first.


pjheinz
Member
**
Offline

Activity: 84


MEC - MFLtNSHN6GSxxP3VL8jmL786tVa9yfRS4p


May 11, 2013, 05:40:32 AM
 #336

Is this why a tx of 0.025 BTC that I sent with a 0.001 BTC transaction fee 3 hours ago has still only been seen by 1 peer? Huh

♠♠♠https://(malware)♠♠♠ Most popular BTC-E bot made! ♠♠♠ Free Trial Available!! ♠♠♠
Peter Todd
Legendary
*
Offline

Activity: 1106


May 11, 2013, 08:28:06 AM
 #337

Quote from: Gavin Andresen on May 10, 2013, 04:44:15 PM
Quote from: Peter Todd
The fact that it usually works because most of the outputs were recently created is incredibly dangerous. If the ratio of best-case to worst-case performance gets bad enough, an attacker just has to come along with a block spending outputs that weren't recently created, or otherwise picked in a way where retrieval happens to be slow, to knock slower miners offline.

Who gets to decide how slow is too slow?

Mining these days requires investing in ASIC hardware. Solo mining or running a pool will very soon require investing in a reasonably fast network connection and a machine with at least a few gigabytes of memory.

Knocking the slowest N% of solo miners/pools off the network every year (where N is less than 20 or so) is not a crisis. That is the way free-market competition works.

BTC Guild right now.

Gavin Andresen
Legendary
*
Offline

Activity: 1652


Chief Scientist


May 11, 2013, 06:43:49 PM
 #338

Quote from: Peter Todd on May 11, 2013, 08:28:06 AM
Quote from: Gavin Andresen
Who gets to decide how slow is too slow?
BTC Guild right now.
Okey dokey.

Have you contributed any patches to p2pool to make it more efficient / easier to install / etc? If not, why not if you're so worried about centralization?

(honest question, I don't keep up with p2pool development because I'm personally not terribly worried about mining centralization)

How often do you get the chance to work on a potentially world-changing project?
boonies4u
Hero Member
*****
Offline

Activity: 826



May 11, 2013, 07:37:12 PM
 #339

Quote from: Gavin Andresen on May 11, 2013, 06:43:49 PM
Quote
Who gets to decide how slow is too slow?
BTC Guild right now.
Okey dokey.

Have you contributed any patches to p2pool to make it more efficient / easier to install / etc.? If not, why not, if you're so worried about centralization?

(honest question, I don't keep up with p2pool development because I'm personally not terribly worried about mining centralization)


Bitcoin mining for profit will be like farming profitably IRL: farms get larger and the number of farmers gets smaller.

Key word: for profit.
jgarzik
Legendary
*
Offline

Activity: 1470


May 11, 2013, 08:25:49 PM
 #340

One realistic scenario is that some players mine at a loss, simply because they find other value in mining -- keeping bitcoin decentralized, keeping bitcoin secure, processing non-standard transactions, etc.

Jeff Garzik, bitcoin core dev team and BitPay engineer; opinions are my own, not my employer.
Donations / tip jar: 1BrufViLKnSWtuWGkryPsKsxonV2NQ7Tcj