Bitcoin Forum
Author Topic: Is it time to think about decimal precision ?  (Read 641 times)
spartacusrex
Hero Member
*****
Offline Offline

Activity: 678
Merit: 508



View Profile
December 04, 2017, 02:51:01 PM
 #1

I honestly never thought this day would or even could come so soon, but now I'm not so sure.

With 8 decimal places, if BTC hits 1 million, 1 satoshi is 1 cent.. So we can't do sub-cent micro txns. An area the Lightning Network will utterly transform and finally make possible. You could be paying fractions of a cent every time you turn a light on. Many use cases.

Since a 'fork' takes, well, at least a couple of years to pull off, and since 5 years from now 'may' be too late, I guess this may have to be thought about for the very next fork (dare I mention it) ?

Wow.. Is that a possibility ?

Is there a plan yet - I'm sure there are many options.. Is there a preferred current best guess ? (Just go float->double ? )

Can it be done with a soft fork ? Sounds pretty hard-forkish to me, but they can do amazing things now SegWit's activated.
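For concreteness, the arithmetic behind the "1 satoshi = 1 cent" worry can be sketched in a few lines of Python (prices here are purely illustrative):

```python
# Back-of-the-envelope: USD value of 1 satoshi at various BTC prices.
SATOSHIS_PER_BTC = 100_000_000  # 10^8, i.e. 8 decimal places

for btc_price_usd in (10_000, 100_000, 1_000_000):
    satoshi_usd = btc_price_usd / SATOSHIS_PER_BTC
    print(f"BTC at ${btc_price_usd:,}: 1 satoshi = ${satoshi_usd:.6f}")
# At $1,000,000/BTC the smallest on-chain unit is exactly one cent,
# so anything cheaper than a cent cannot be priced on-chain.
```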

 






Life is Code.
HeRetiK
Hero Member
*****
Offline Offline

Activity: 896
Merit: 764


the forkings will continue until morale improves


View Profile
December 04, 2017, 04:10:32 PM
 #2

I like your bullishness but I think we're still far off from BTC hitting 1 million, ie. I think there are more pressing matters that should be solved first.

Without having a proper scaling solution widely deployed anything below 0.0001 BTC is effectively impossible to transact. In my opinion thinking about increasing decimal precision will only make sense once we're close to making dust transactable again.

Once we're there my educated guess would be that such a fork could be deployed fairly undisputed, as I don't see any reasons for contentious camps about decimal precision arising, making it easier and thus faster to deploy. Then again crypto is a weird place, so who knows what counter arguments against an increase of decimal precision arise. It would be interesting to see how much the effective impact on block size would be, for example.

spartacusrex
Hero Member
*****
Offline Offline

Activity: 678
Merit: 508



View Profile
December 04, 2017, 04:57:58 PM
 #3

I like your bullishness but I think we're still far off from BTC hitting 1 million, ie. I think there are more pressing matters that should be solved first.

6 months ago I would have agreed absolutely, but it's pretty crazy out there. It may take 5 good years to get another fork in place.

Without having a proper scaling solution widely deployed anything below 0.0001 BTC is effectively impossible to transact. In my opinion thinking about increasing decimal precision will only make sense once we're close to making dust transactable again.

Lightning is round the corner. Next year it will start being used properly. I think we're very close now. And there is nothing blocking its implementation. It's definitely coming.

Once we're there my educated guess would be that such a fork could be deployed fairly undisputed, as I don't see any reasons for contentious camps about decimal precision arising, making it easier and thus faster to deploy. Then again crypto is a weird place, so who knows what counter arguments against an increase of decimal precision arise. It would be interesting to see how much the effective impact on block size would be, for example.

I think there will be disagreement about whether to make it a one off 'doubling' of precision (float -> double) or a more permanent variable precision representation.

I can see arguments for both. Although - lol - a 'double' should do it. They may find other things to do with the numbers.

Life is Code.
DannyHamilton
Legendary
*
Offline Offline

Activity: 2198
Merit: 1384



View Profile
December 04, 2017, 06:28:46 PM
 #4

I think there will be disagreement about whether to make it a one off 'doubling' of precision (float -> double) or a more permanent variable precision representation.

The bitcoin blockchain does not currently use floating point numbers to represent transaction values.  It uses integers.
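A minimal sketch of that representation (plain Python mirroring the integer-satoshi convention; Bitcoin Core's signed 64-bit amount type is named CAmount, and `money_range` below is a simplified mirror of its sanity check, not the real implementation):

```python
# Sketch of how amounts are actually stored: integer satoshis, not floats.
COIN = 100_000_000                 # satoshis per BTC
MAX_MONEY = 21_000_000 * COIN      # supply cap, in satoshis

def money_range(n: int) -> bool:
    """Simplified mirror of Bitcoin Core's MoneyRange() sanity check."""
    return 0 <= n <= MAX_MONEY

amount = 5 * COIN + 4_321          # 5.00004321 BTC, represented exactly
assert money_range(amount)
print(amount)                      # 500004321
```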

Spendulus
Legendary
*
Online Online

Activity: 2030
Merit: 1076



View Profile
December 05, 2017, 01:06:06 AM
 #5

I like your bullishness but I think we're still far off from BTC hitting 1 million, ie. I think there are more pressing matters that should be solved first.

Without having a proper scaling solution widely deployed anything below 0.0001 BTC is effectively impossible to transact. In my opinion thinking about increasing decimal precision will only make sense once we're close to making dust transactable again.

Once we're there my educated guess would be that such a fork could be deployed fairly undisputed, as I don't see any reasons for contentious camps about decimal precision arising...

Decimal Point Rising....

A movie title?

haltingprobability
Member
**
Offline Offline

Activity: 98
Merit: 20


View Profile
December 05, 2017, 03:07:36 AM
 #6

I think there will be disagreement about whether to make it a one off 'doubling' of precision (float -> double) or a more permanent variable precision representation.

The bitcoin blockchain does not currently use floating point numbers to represent transaction values.  It uses integers.

I'm actually a little confused about this. According to the dev-guide, TxOut.value is an int64_t. 21M Bitcoins * 100M satoshis = 2.1 quadrillion satoshis. But 2.1 quadrillion only requires 41 bits of precision to represent (0x1E8F1C10800). This means that the upper 23 bits must be zero, no?
doctor-s
Member
**
Offline Offline

Activity: 73
Merit: 10


View Profile
December 05, 2017, 03:30:39 AM
 #7

Once we're there my educated guess would be that such a fork could be deployed fairly undisputed, as I don't see any reasons for contentious camps about decimal precision arising, making it easier and thus faster to deploy. Then again crypto is a weird place, so who knows what counter arguments against an increase of decimal precision arise. It would be interesting to see how much the effective impact on block size would be, for example.

From memory there was a very specific data-related reason why bitcoin was capped at 21 million.

Something along the lines of (bad example): there are 8 bits in a byte, and therefore we get so many digits - increasing the digits requires adding a full extra byte, which increases data requirements / processing requirements / bandwidth / etc.

Pretty sure there may be legitimate opposition to this, but then again, it may be inconsequential at this point in time. Perhaps it only mattered 8 years ago.
peter0093
Newbie
*
Offline Offline

Activity: 56
Merit: 0


View Profile
December 05, 2017, 04:53:53 AM
 #8

Nowadays the question of bitcoin's decimal precision is indeed arising.
DannyHamilton
Legendary
*
Offline Offline

Activity: 2198
Merit: 1384



View Profile
December 05, 2017, 04:57:02 AM
 #9

But 2.1 quadrillion only requires 41 bits of precision to represent (0x1E8F1C10800).

Are you sure about that?

21,000,000 * 100,000,000 = 2,100,000,000,000,000

0x1E8F1C10800 = 2,100,000,000,000

You seem to be off by a  factor of 1000.

This means that the upper 23 bits must be zero, no?

If there ever were a situation where all the possible bitcoins were sent to a single output (that really is impossible, but lets use it for an upper bound), then the resulting value would be:
0x775f059e40090

This would require 51 bits, leaving 12 of an int64_t's 63 magnitude bits unused.
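These bit counts are easy to verify with Python's arbitrary-precision integers (the hex constant above is the commonly cited actual maximum, slightly under 21M BTC because of rounding in the subsidy schedule):

```python
# Checking the figures above.
theoretical_cap = 21_000_000 * 100_000_000   # 2,100,000,000,000,000 satoshis
actual_max = 0x775f059e40090                 # upper bound used above

print(actual_max)                    # 2099999997690000
print(theoretical_cap.bit_length())  # 51
print(actual_max.bit_length())       # 51
```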

Kakmakr
Legendary
*
Offline Offline

Activity: 1442
Merit: 1129

★ ChipMixer | Bitcoin mixing service ★


View Profile
December 05, 2017, 06:16:09 AM
 #10

But 2.1 quadrillion only requires 41 bits of precision to represent (0x1E8F1C10800).

Are you sure about that?

21,000,000 * 100,000,000 = 2,100,000,000,000,000

0x1E8F1C10800 = 2,100,000,000,000

You seem to be off by a  factor of 1000.

This means that the upper 23 bits must be zero, no?

If there ever were a situation where all the possible bitcoins were sent to a single output (that really is impossible, but lets use it for an upper bound), then the resulting value would be:
0x775f059e40090

This would require 51 bits, leaving 12 of an int64_t's 63 magnitude bits unused.

How many of the 2,100,000,000,000,000 satoshis are currently in circulation?

Subtract 1,000,000+ coins from Satoshi's stash
Subtract coins sent to burn addresses
Subtract lost coins <amount unknown>
Subtract coins that are hoarded/in cold storage <not in circulation>
Subtract the coins that are not mined yet.

We should not be looking at the total amount, but rather at how many are in circulation. ^smile^

haltingprobability
Member
**
Offline Offline

Activity: 98
Merit: 20


View Profile
December 05, 2017, 06:33:44 AM
 #11

But 2.1 quadrillion only requires 41 bits of precision to represent (0x1E8F1C10800).

Are you sure about that?

21,000,000 * 100,000,000 = 2,100,000,000,000,000

0x1E8F1C10800 = 2,100,000,000,000

You seem to be off by a  factor of 1000.

This means that the upper 23 bits must be zero, no?

If there ever were a situation where all the possible bitcoins were sent to a single output (that really is impossible, but lets use it for an upper bound), then the resulting value would be:
0x775f059e40090

This would require 51 bits, leaving 12 of an int64_t's 63 magnitude bits unused.

Oops, fat-fingered the calculator... <blush> anyway, it's just a few orders of magnitude, lol. But yeah, this clears it up.

At some point, it might make sense to start talking about moving up to 96-bit integers to allow division down to nano-satoshis. I don't see it requiring any major debate, it's a +4 byte delta on tx sizes and people will want finer divisions when the price gets high enough.
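A quick check of that suggestion (the nano-satoshi unit here is hypothetical, nothing deployed):

```python
# How many bits would the supply cap need at nano-satoshi (10^-9 satoshi)
# resolution? The answer shows why 96 bits would be comfortable.
satoshi_cap = 21_000_000 * 100_000_000
nano_satoshi_cap = satoshi_cap * 10**9   # 2.1e24 nano-satoshis

print(nano_satoshi_cap.bit_length())     # 81 -- fits easily in 96 bits
```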
crabby
Member
**
Offline Offline

Activity: 128
Merit: 10


View Profile
December 05, 2017, 06:34:06 AM
 #12

But 2.1 quadrillion only requires 41 bits of precision to represent (0x1E8F1C10800).

Are you sure about that?

21,000,000 * 100,000,000 = 2,100,000,000,000,000

0x1E8F1C10800 = 2,100,000,000,000

You seem to be off by a  factor of 1000.

This means that the upper 23 bits must be zero, no?

If there ever were a situation where all the possible bitcoins were sent to a single output (that really is impossible, but lets use it for an upper bound), then the resulting value would be:
0x775f059e40090

This would require 51 bits, leaving 12 of an int64_t's 63 magnitude bits unused.

How many of the 2,100,000,000,000,000 satoshis are currently in circulation?

Subtract 1,000,000+ coins from Satoshi's stash
Subtract coins sent to burn addresses
Subtract lost coins <amount unknown>
Subtract coins that are hoarded/in cold storage <not in circulation>
Subtract the coins that are not mined yet.

We should not be looking at the total amount, but rather at how many are in circulation. ^smile^


When you write software, especially something as precise as Bitcoin, you would never be so negligent as to write code that doesn't consider every corner case. So no, removing those not in circulation would never be an option.
cr1776
Legendary
*
Offline Offline

Activity: 2030
Merit: 1009


View Profile
December 05, 2017, 09:57:19 AM
 #13

But 2.1 quadrillion only requires 41 bits of precision to represent (0x1E8F1C10800).

Are you sure about that?

21,000,000 * 100,000,000 = 2,100,000,000,000,000

0x1E8F1C10800 = 2,100,000,000,000

You seem to be off by a  factor of 1000.

This means that the upper 23 bits must be zero, no?

If there ever were a situation where all the possible bitcoins were sent to a single output (that really is impossible, but lets use it for an upper bound), then the resulting value would be:
0x775f059e40090

This would require 51 bits, leaving 12 of an int64_t's 63 magnitude bits unused.

How many of the 2,100,000,000,000,000 satoshis are currently in circulation?

Subtract 1,000,000+ coins from Satoshi's stash
Subtract coins sent to burn addresses
Subtract lost coins <amount unknown>
Subtract coins that are hoarded/in cold storage <not in circulation>
Subtract the coins that are not mined yet.

We should not be looking at the total amount, but rather at how many are in circulation. ^smile^


Adding precision is a completely different thing than adding precision AND stealing people's bitcoin. I can see most people supporting the former, but the only ones supporting the latter would be doing so for nefarious reasons.

spartacusrex
Hero Member
*****
Offline Offline

Activity: 678
Merit: 508



View Profile
December 05, 2017, 10:22:24 AM
 #14

Even if BTC only went to 100k.. haha.. it would still require sub-satoshi payments if you were counting every light switch on/off, or every single byte of data sent from a certain server.

Since I DO see that happening in the next 10 years, and it could take 5 years to pull off a fork (IF we can even pull another one off successfully, since I have a feeling it's only going to get harder the larger Bitcoin gets), then I think the next fork is probably the one to aim for.

Also - I would definitely think over-compensation is the order of the day. IPv6 with its 4 byte addition is way too small. They should have gone 8 and been done with it. They're just going to have to do it all again. Literally DECADES..

I know it takes more power to compute and store.. but I would still add 8 bytes. Orders of magnitude more than 4. Then that would be it. Honest miners could limit the minimum spend. But I'd also be up for a variable precision solution, if someone had a good one.

Quote
I don't see it requiring any major debate, it's a +4 byte delta on tx sizes..

I'll get the popcorn.
 
..

Can it be done as a soft-fork ?




Life is Code.
HeRetiK
Hero Member
*****
Offline Offline

Activity: 896
Merit: 764


the forkings will continue until morale improves


View Profile
December 05, 2017, 11:51:45 AM
 #15

I like your bullishness but I think we're still far off from BTC hitting 1 million, ie. I think there are more pressing matters that should be solved first.

6 months ago I would have agreed absolutely, but it's pretty crazy out there. It may take 5 good years to get another fork in place.

As I mentioned in the latter part of my comment, I'm rather positive that a precision update would be less debatable than a block size increase. On the other hand you're not wrong. It's indeed something that could be worth looking at sooner rather than later. But in my opinion 5 years for getting a hardfork in place is still way too pessimistic. Then again, who knows - there may be some nuances to the problem that are not obvious at first glance.


Without having a proper scaling solution widely deployed anything below 0.0001 BTC is effectively impossible to transact. In my opinion thinking about increasing decimal precision will only make sense once we're close to making dust transactable again.

Lightning is round the corner. Next year it will start being used properly. I think we're very close now. And there is nothing blocking its implementation. It's definitely coming.

I also think that lightning is very close now. But the timeframe from "very close" to production ready, to actual deployment and real-life usage, may still be 1-2 years. I will gladly be proven wrong though.


Once we're there my educated guess would be that such a fork could be deployed fairly undisputed, as I don't see any reasons for contentious camps about decimal precision arising, making it easier and thus faster to deploy. Then again crypto is a weird place, so who knows what counter arguments against an increase of decimal precision arise. It would be interesting to see how much the effective impact on block size would be, for example.

From memory there was a very specific data related reason why bitcoin was capped at 21 million.

[...]

The technical reason is the size of the integer datatype that Bitcoin currently uses; OP is suggesting using a larger integer datatype instead Smiley


Once we're there my educated guess would be that such a fork could be deployed fairly undisputed, as I don't see any reasons for contentious camps about decimal precision arising...

Decimal Point Rising....

A movie title?

nice catch Grin


[...]

Can it be done as a soft-fork ?

I too would love an answer to that question.

torwig
Member
**
Offline Offline

Activity: 112
Merit: 10

C++/Golang Dev


View Profile
December 05, 2017, 12:22:50 PM
 #16

It's a really bad, bad, very bad idea!
Only integer numbers.
haltingprobability
Member
**
Offline Offline

Activity: 98
Merit: 20


View Profile
December 06, 2017, 03:00:35 AM
 #17

Even if BTC only went to 100k.. haha.. it would still require sub-satoshi payments if you were counting every light switch on\off, or every single byte of data sent from a certain server.

Nobody's going to be counting that kind of IoT stuff on the blockchain, there just isn't enough space (bytes are too valuable).

Quote
Since I DO see that happening in the next 10 years, and it could take 5 years to pull off a fork (IF we can even pull another one off successfully, since I have a feeling it's only going to  get harder the larger Bitcoin gets ), then I think the next fork is probably the one to aim for.

*shrug - I'm not saying we'll never need it but I think it won't be difficult.

Quote
I know it takes more power to compute and store.. but I would still add 8 bytes. Orders of magnitude more than 4. Then that would be it. Honest miners could limit the minimum spend. But I'd also be up for a variable precision solution, if someone had a good one.

That kind of sub-division can be done off-chain and makes sense to do off-chain, anyway (i.e. micro-payment channels). Obviously, you can't enforce a sub-satoshi payment on-chain but then a satoshi is very small, so you might be able to set up your micro-payments where you "pre-pay" the next satoshi, and then count down your usage with sub-satoshi resolution (e.g. audio-streaming by the byte or something).
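That pre-pay-and-count-down idea could look something like this accounting sketch (entirely hypothetical names and units - nothing here is an existing protocol):

```python
# Hypothetical metering sketch: the payer commits whole satoshis up front,
# and the service counts usage down at nano-satoshi resolution until a
# top-up is needed. Only whole satoshis ever settle on-chain.
NANO_PER_SATOSHI = 10**9

class PrepaidMeter:
    def __init__(self):
        self.balance_nano = 0

    def prepay_satoshi(self, n=1):
        """Credit n whole satoshis (the only unit enforceable on-chain)."""
        self.balance_nano += n * NANO_PER_SATOSHI

    def charge(self, nano):
        """Meter sub-satoshi usage off-chain."""
        if nano > self.balance_nano:
            raise ValueError("top-up required")
        self.balance_nano -= nano

meter = PrepaidMeter()
meter.prepay_satoshi()        # commit 1 satoshi up front
meter.charge(250)             # e.g. 250 nano-satoshis per byte streamed
print(meter.balance_nano)     # 999999750
```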

Quote
Can it be done as a soft-fork ?

Depends on how you define soft-fork. It can be done without any change to the block itself, so it only requires a client upgrade to process a new tx format. This has already happened many times.
haltingprobability
Member
**
Offline Offline

Activity: 98
Merit: 20


View Profile
December 06, 2017, 03:03:54 AM
 #18

It's really bad bad very bad idea!
Only integer numbers.

Not to worry, Bitcoin is never going to use floating-point arithmetic for amounts, because rounding errors would throw off the total number of bitcoins over time - cumulative error grows with the number of operations. Integer math guarantees that the number of satoshis in circulation always balances exactly. Smiley
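The rounding problem is easy to demonstrate:

```python
# Why amounts are integers: binary floats cannot represent most decimal
# fractions exactly, so repeated arithmetic drifts.
float_total = sum(0.1 for _ in range(10))   # "0.1 BTC" ten times, as floats
print(float_total == 1.0)                   # False

# The same sum in integer satoshis is exact.
sat_total = sum(10_000_000 for _ in range(10))  # 0.1 BTC = 10,000,000 sat
print(sat_total == 100_000_000)                 # True
```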
ir.hn
Member
**
Offline Offline

Activity: 168
Merit: 20

Blockchain is a Digital Constitution


View Profile
December 06, 2017, 03:18:33 AM
 #19

It's almost too late already.  Bitcoin is the global reserve crypto because every altcoin starts out requiring bitcoin to buy it on an exchange.  This is where the majority of sustainable bitcoin demand comes from.  48% of the crypto market is altcoins.

The problem is that the smallest unit of bitcoin is now worth more than the largest unit of many altcoins. This makes trading altcoins with bitcoin very inconvenient. If this continues, a more convenient coin will be used to trade altcoins instead: Ethereum, whose ether is divisible to 18 decimal places.

This process is already underway and unless something is done right away, it will continue.  I know you all love your satoshi, but it is already holding bitcoin back as the other big altcoins take over.
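The granularity gap the poster describes, at illustrative (assumed) prices:

```python
# Smallest-unit granularity of BTC vs ETH (both prices are hypothetical).
btc_price_usd = 1_000_000   # assumed future BTC price
eth_price_usd = 5_000       # assumed future ETH price

satoshi_usd = btc_price_usd / 10**8    # BTC: 8 decimal places
wei_usd = eth_price_usd / 10**18       # ETH: 18 decimal places

print(satoshi_usd)   # 0.01  -- one cent, too coarse for micro-pricing
print(wei_usd)       # 5e-15 -- effectively unlimited granularity
```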

Kakmakr
Legendary
*
Offline Offline

Activity: 1442
Merit: 1129

★ ChipMixer | Bitcoin mixing service ★


View Profile
December 06, 2017, 06:49:42 AM
 #20

But 2.1 quadrillion only requires 41 bits of precision to represent (0x1E8F1C10800).

Are you sure about that?

21,000,000 * 100,000,000 = 2,100,000,000,000,000

0x1E8F1C10800 = 2,100,000,000,000

You seem to be off by a  factor of 1000.

This means that the upper 23 bits must be zero, no?

If there ever were a situation where all the possible bitcoins were sent to a single output (that really is impossible, but lets use it for an upper bound), then the resulting value would be:
0x775f059e40090

This would require 51 bits, leaving 12 of an int64_t's 63 magnitude bits unused.

How many of the 2,100,000,000,000,000 satoshis are currently in circulation?

Subtract 1,000,000+ coins from Satoshi's stash
Subtract coins sent to burn addresses
Subtract lost coins <amount unknown>
Subtract coins that are hoarded/in cold storage <not in circulation>
Subtract the coins that are not mined yet.

We should not be looking at the total amount, but rather at how many are in circulation. ^smile^


Adding precision is a completely different thing than adding precision AND stealing people's bitcoin. I can see most people supporting the former, but the only ones supporting the latter would be doing so for nefarious reasons.



I think we are missing each other with this post. By subtract, I mean that you should subtract those coins from this equation NOT steal people's coins. ^smile^

The coins will still be there, but they are out of circulation, so you cannot count them as available coins. This leaves us with far fewer satoshis to play with.

Or, am I missing your point? ^smile^
