Author Topic: btc  (Read 5652 times)
liondani
Member
**
Offline Offline

Activity: 97
Merit: 10

Inch by Inch,Play by Play


View Profile
June 03, 2015, 11:45:33 AM
 #41

BitShares = centralized scamcoin, how the f*ck can anyone put that in the "top 10".

Can you elaborate further?
Can you prove what you are saying, or are you just repeating what you heard from others?...

Inch by Inch, Play by Play
Bitrated user: liondani.
Fuserleer
Legendary
*
Offline Offline

Activity: 1064
Merit: 1016



View Profile WWW
June 03, 2015, 04:27:55 PM
Last edit: June 03, 2015, 04:58:01 PM by Fuserleer
 #42


I know that 100,000 TPS sounds impossible, but so does an "automatic blockchain that pays for its own development with its profits"


100,000 TPS isn't impossible, it's just impossible to do on a block chain in a true P2P decentralized manner without some form of centralization or "magic trick".  It seems to me BitShares may be doing both, as per these two quotes in the link you provided.

"...assuming that all the witness nodes have internet connections capable..." so it is indeed confined/centralized to a set of nodes

and

"...with an average transaction size of about 100 bytes."

The latter concerns me. The absolute minimum requirements of a secure, verifiable transaction are a sender, a receiver, a value and a signature.  The signature alone with a 256-bit key is on the order of 70 bytes, a sender pubkey is 32, a receiver RIPEMD-160 address is 20, and a value is 8, which is a total of 130 bytes at the bare minimum.  If the key space is reduced to 160 bits, it will just fit in 100 bytes, but with a huge loss of security.
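
For reference, summing the per-field sizes quoted above (a rough sketch; real encodings vary, and ECDSA signatures are typically around 70-72 bytes in DER form):

Code:
# The byte arithmetic from the post, for a bare-minimum verifiable transaction.
fields = {
    "signature (256-bit key)": 70,
    "sender pubkey": 32,
    "receiver RIPEMD-160 address": 20,
    "value": 8,
}
print(sum(fields.values()))   # 130 bytes, already over the quoted 100-byte average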

I'm assuming that to achieve this 100,000 TPS something similar to this happens:

Transactions are filtered through these "witness nodes": I send 100 BTS to A; if within a certain time frame A moves it to B, B to C, and so on, that TX isn't recorded on the block chain until it has sat at X for longer than a specified period.  The 100 BTS movements between A -> B -> C ... X are not recorded in full on the block chain (if at all), only the transaction A -> X.
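
In code, that guess would amount to collapsing a chain of equal-value hops into the single transfer that actually gets recorded (purely speculative, illustrating the hypothesis above, not BitShares' actual mechanism):

Code:
def net_chain(hops):
    """hops: list of (sender, receiver, amount) forming one chain of equal-value moves."""
    if not hops:
        return None
    first_sender, _, amount = hops[0]
    _, last_receiver, _ = hops[-1]
    return (first_sender, last_receiver, amount)

print(net_chain([("me", "A", 100), ("A", "B", 100), ("B", "X", 100)]))
# ('me', 'X', 100) -- only this transfer would hit the chain under the guessed scheme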

Taking that approach could indeed give you very high TX throughput, but if that is the method used (or something similar), it's a total hack in my opinion and may well lead to issues later.

Of course I'm speculating, as I don't have the time to research this properly, so if you could provide some links/docs/anything that details how this works rather than me hunting, I'd appreciate it.  I stand by the claim that it is not possible to do this on a block chain while recording all transactions to said chain and allowing any node to be a full node without special requirements.

Anyway, I'm getting off topic.  180,000 TPS might sound great to some, but Monero destroys BitShares in this arena with the best TPS ceiling of "infinity," so there's that option too of course.  In other words, the bitcoin community has infinitely more options than they are currently looking at, and I am just trying to show them that the state of crypto circa 2015 is "not your dad's crypto"

If Monero really has stated "infinity" as their TPS limit, then someone there really needs a reality check!  Regardless of what is possible on a block chain or not, the laws of thermodynamics will step in and dispatch a tough and thorough spanking waaaaay before "infinity" is even close Smiley

IMO the only way to achieve anything near a sustainable VISA level transaction throughput, stay in keeping with real decentralization (no special node sub-sets that are selected or voted), not perform any "tricks" which may compromise security, AND have all transactions on a public ledger is to scale horizontally, and NOT vertically!

Chain based ledgers can't scale vertically past a certain point, no matter how big your bag of tricks or your processing setup; horizontal is the only way, and by that I mean a distributed and partitioned ledger of the ilk we are doing over here.  No one has even attempted to do this, because it's assumed impossible or too difficult, and if it is, so be it, at least it was attempted.  However, it's not impossible: we've run it in many betas now and it is very close to being ready for use.


I like what you've done with e-munie.  You should come work for the BitShares blockchain.  Just submit a proposal, and the community would certainly vote you into a paid position (that's how BitShares members fund development).

https://bitsharestalk.org/index.php/board,61.0.html?PHPSESSID=2170a8f0b09b8fa2bdc7d35908ab4517


Heh thanks but no thanks, I've ploughed my life and everything I have into eMunie and I'm not jumping ship, ever Smiley


EDIT
----

So I did some more digging and came across this:

"...the idea being that if transactions have their signatures validated as they propagate across the network, a witness can have any number of computers surrounding him that validates all of these signatures, and then he gets a list of transactions and puts them in a block, and he doesn’t have to check those signatures himself, because he has got all these other nodes surrounding him that are dividing up the task."

Can someone clarify this?  Witness nodes, which build the blocks, DO NOT check transaction signatures themselves, but rely on third parties (which may be dishonest) to tell them that the signatures on those transactions validate?  How does a witness node know if a third party is providing false information about a transaction, claiming that it contains a valid signature when it may not?  And if this happens, how does the network resolve it?  Someone, somewhere must be doing a full validation of those 100,000 TPS to ensure that all transactions really are legitimate.

ArticMine
Legendary
*
Offline Offline

Activity: 2282
Merit: 1050


Monero Core Team


View Profile
June 03, 2015, 05:46:11 PM
 #43


...

The critical error made in this post is the assumption that storing, processing or transmitting a given amount of data will take a fixed amount of resources forever. The history of technology over the last 200 years has already proven this assumption to be completely wrong.

Concerned that blockchain bloat will lead to centralization? Storing less than 4 GB of data once required the budget of a superpower and a warehouse full of punched cards. https://upload.wikimedia.org/wikipedia/commons/8/87/IBM_card_storage.NARA.jpg https://en.wikipedia.org/wiki/Punched_card
Fuserleer
Legendary
*
Offline Offline

Activity: 1064
Merit: 1016



View Profile WWW
June 03, 2015, 05:52:19 PM
 #44


The critical error made in this post is the assumption that storing, processing or transmitting a given amount of data will take a fixed amount of resources forever. The history of technology over the last 200 years has already proven this assumption to be completely wrong.


Please elaborate.

100 bytes of data will always be 100 bytes of data, regardless of what technological advancements are made in storing, transmitting it, or processing it.  All you can hope for is that these methods of managing it progress closer to the optimal over time.

The ultimate "fixed amount of resources" required to process that 100 bytes in any manner are governed by the laws of thermodynamics.  So ultimately, there is a fixed amount of resource required to perform an action on that 100 bytes, and it stays in place forever, we just aren't anywhere near it.

ArticMine
Legendary
*
Offline Offline

Activity: 2282
Merit: 1050


Monero Core Team


View Profile
June 03, 2015, 06:05:19 PM
 #45

...

Please elaborate.

100 bytes of data will always be 100 bytes of data, regardless of what technological advancements are made in storing, transmitting it, or processing it.  All you can hope for is that these methods of managing progress closer to the optimal over time.

The ultimate "fixed amount of resources" required to process that 100 bytes in any manner are governed by the laws of thermodynamics.  So ultimately, there is a fixed amount of resource required to perform an action on that 100 bytes, and it stays in place forever, we just aren't anywhere near it.

The critical question is the amount of resources consumed, rather than the amount of data processed. Take for example the first hard drive, developed by IBM in 1956. http://www.extremetech.com/computing/90156-the-history-of-computer-storage-slideshow/6
Quote
The first hard disk drive shipped in 1956 with the IBM 305 RAMAC computer. The computer itself was vast — about 30 feet by 50 feet (9m x 15m) — and the storage device itself, the very first commercial hard disk drive, was a 1.5-meter cube. The drive had 50 24-inch platters and a total capacity of 5 million characters (5MB), or the equivalent of 64,000 punchcards. Just two read/write heads were used to access the entire array of platters. The platters only spun at 1200 RPM, too, which meant the average access time was very slow — around one second.
Now compare this with a modern 1TB SSD. The latter can handle 200,000 times as much data while using a minuscule fraction of the resources.

The math is actually very simple: if the exponential rate of data growth is lower than the exponential rate of decline in the resources required to handle, say, 100 bytes of data, then the total amount of resources required is actually falling at an exponential rate.
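
A toy model of that comparison, with assumed (purely illustrative) growth and decline rates:

Code:
# If per-byte handling cost falls faster than data volume grows,
# the total resource cost of keeping up shrinks every year.
data_growth = 1.30      # assumed: data volume grows 30% per year
cost_decline = 0.60     # assumed: per-byte cost falls 40% per year

data, cost_per_byte = 100.0, 1.0   # arbitrary starting units
for year in range(1, 11):
    data *= data_growth
    cost_per_byte *= cost_decline
    print(year, round(data * cost_per_byte, 2))
# The total falls every year because 1.30 * 0.60 = 0.78 < 1.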

Edit 1: I was born in 1957 so I have experienced this relationship between data and resources for my entire life.

Edit 2: The 5MB hard drive in 1956 was far less sustainable and far more centralizing than the 1TB SSD is today.

Concerned that blockchain bloat will lead to centralization? Storing less than 4 GB of data once required the budget of a superpower and a warehouse full of punched cards. https://upload.wikimedia.org/wikipedia/commons/8/87/IBM_card_storage.NARA.jpg https://en.wikipedia.org/wiki/Punched_card
Fuserleer
Legendary
*
Offline Offline

Activity: 1064
Merit: 1016



View Profile WWW
June 03, 2015, 06:12:22 PM
 #46

...

Please elaborate.

100 bytes of data will always be 100 bytes of data, regardless of what technological advancements are made in storing, transmitting it, or processing it.  All you can hope for is that these methods of managing progress closer to the optimal over time.

The ultimate "fixed amount of resources" required to process that 100 bytes in any manner are governed by the laws of thermodynamics.  So ultimately, there is a fixed amount of resource required to perform an action on that 100 bytes, and it stays in place forever, we just aren't anywhere near it.

The critical question is that amount of resources that are consumed rather than the amount of data that is processed. Take for example the first hard drive developed by IBM in 1956. http://www.extremetech.com/computing/90156-the-history-of-computer-storage-slideshow/6
Quote
The first hard disk drive shipped in 1956 with the IBM 305 RAMAC computer. The computer itself was vast — about 30 feet by 50 feet (9m x 15m) — and the storage device itself, the very first commercial hard disk drive, was a 1.5-meter cube. The drive had 50 24-inch platters and a total capacity of 5 million characters (5MB), or the equivalent of 64,000 punchcards. Just two read/write heads were used to access the entire array of platters. The platters only spun at 1200 RPM, too, which meant the average access time was very slow — around one second.
Now compare this with a modern 1TB SSD drive. The latter can handle  200,000 times as much data while using a minuscule fraction of the resources.

The math is actually very simple if the exponential rate of data growth is less than the exponential decline on the resources required to handle say 100 bytes of data, then the amount of resources is actually falling at an exponential rate.

Ok, that is technological advancement, and the resource requirements at any moment in time may be less than they were before.  I have little doubt that in 50 years' time my iWatch V10 will process 100k TPS with ease, but that is not the issue I was raising, and your response also sidesteps your original statement, which I countered.

My point was that, as of today, it is not possible to achieve 100k TPS on a block chain without using tricks and hacks, yet maintain the true nature of what crypto-currency is meant to be.  The resources required to do that vertically are too great; in 10, 20 or 50 years' time that may not be the case, but everyone wants this now, not in 50 years.

ArticMine
Legendary
*
Offline Offline

Activity: 2282
Merit: 1050


Monero Core Team


View Profile
June 03, 2015, 06:20:17 PM
 #47

...

Ok, that is technological advancement, and the resource requirements at any moment in time may be less than they were before.  I have little doubt that in 50 years time, my iWatch V10 can process 100k TPS with ease, but that is not the issue I was raising and your response also sidesteps away from your original statement which I countered.

My point was that as of today, it is not possible to achieve 100k TPS on a block chain, without using tricks and hacks, yet maintain the true nature of what crypto-currency is meant to be.  The resources required to do that vertically is too great, in 10-20-50 years time it may not be the case, but everyone wants this now, not in 50 years.

Yes, but one cannot justify crippling Bitcoin to the current technology forever. Furthermore, many of us are looking at Bitcoin for its future rather than its present value. Even over the life of Bitcoin we have a significant example: the cost of sending 1 MB of data at the start of 2009 is the same as the cost of sending 21 MB of data in mid 2016. It is called Nielsen's Law: http://www.nngroup.com/articles/law-of-bandwidth/
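
A quick check of that figure, assuming Nielsen's roughly 50% per-year bandwidth growth over the ~7.5 years from the start of 2009 to mid 2016:

Code:
years = 7.5
print(1.5 ** years)   # ~20.9, i.e. about 21 MB for the cost of 1 MB in 2009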

To come to the VISA example: should credit cards not have been launched in the 1940s because it was not possible to scale to the levels of today's VISA network with the punch card and tabulating machine technology of the day?

Edit: On the subject of the iWatch V10, it could easily process 100k TPS or more in 50 years, but only the 100k TPS approved by the censor board at Apple, because of the DRM in the device.

Concerned that blockchain bloat will lead to centralization? Storing less than 4 GB of data once required the budget of a superpower and a warehouse full of punched cards. https://upload.wikimedia.org/wikipedia/commons/8/87/IBM_card_storage.NARA.jpg https://en.wikipedia.org/wiki/Punched_card
Fuserleer
Legendary
*
Offline Offline

Activity: 1064
Merit: 1016



View Profile WWW
June 03, 2015, 06:30:40 PM
 #48

...

Ok, that is technological advancement, and the resource requirements at any moment in time may be less than they were before.  I have little doubt that in 50 years time, my iWatch V10 can process 100k TPS with ease, but that is not the issue I was raising and your response also sidesteps away from your original statement which I countered.

My point was that as of today, it is not possible to achieve 100k TPS on a block chain, without using tricks and hacks, yet maintain the true nature of what crypto-currency is meant to be.  The resources required to do that vertically is too great, in 10-20-50 years time it may not be the case, but everyone wants this now, not in 50 years.

Yes but one cannot justify crippling Bitcoin to the current technology for ever. Furthermore many of us are looking at Bitcoin for its future rather than present value. Even over the life of Bitcoin we have a significant example. The cost of sending 1MB of data at the start of 2009 is the same as sending 21 MB of data in mid 2016. It is called Nielsen's Law http://www.nngroup.com/articles/law-of-bandwidth/

To come to the VISA example should credit cards not have been launched in the 1940's because it was not possible to scale to the levels of the VISA network of today with the punch card and tabulating machine technology of the day?

I'm not suggesting it should be crippled, merely that it can only live within its technological means, due to its architecture at that moment in time.  The architecture imposes its own limits because of the lack of available resources to perform better, or conversely, the resources available (or the lack of them) impose the limits on Bitcoin.  This doesn't cripple anything, because over time, as the available resources increase (or efficiency increases), Bitcoin is able to take advantage of them.

The same goes for any currency, be it Bitcoin, Nxt or others: they are limited in operation by the efficiency of the resources demanded by the implementation at that moment in time.

There is no way to work around the resource limits available at any one moment in time in a vertical manner, which is the problem here with these "high TPS currencies", be it bandwidth, processing or storage. You can only be as efficient as the most efficient available resources will allow; past that, you have to start chopping things out and reducing the workload.

I'm sure that VISA didn't have one person verifying and processing all those punch card transactions; they had many, so it didn't matter that the resources available to do it were extremely inefficient.

Block chains are like that scenario with just one person: no matter how many nodes you have, every one of them is verifying the same data as everyone else, which is exactly the same (in terms of performance) as having just one node verify all the data.
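
A back-of-the-envelope illustration of that point, with made-up numbers:

Code:
# Full replication: every node verifies every transaction, so adding nodes adds
# no throughput. A partitioned ("horizontal") ledger lets each node verify only
# its share.
tps = 100_000
nodes = 1_000

per_node_replicated = tps            # each node checks all 100,000 TPS
per_node_partitioned = tps / nodes   # each node checks ~100 TPS of its shard

print(per_node_replicated, per_node_partitioned)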

mamamae
Legendary
*
Offline Offline

Activity: 1188
Merit: 1001



View Profile
June 03, 2015, 06:34:09 PM
 #49

not if we compress the data at least


The critical error made in this post is the assumption that storing, processing or transmitting a given amount of data will take a fixed amount of resources forever. The history of technology over the last 200 years has already proven this assumption to be completely wrong.


Please elaborate.

100 bytes of data will always be 100 bytes of data, regardless of what technological advancements are made in storing, transmitting it, or processing it.  All you can hope for is that these methods of managing it progress closer to the optimal over time.

The ultimate "fixed amount of resources" required to process that 100 bytes in any manner are governed by the laws of thermodynamics.  So ultimately, there is a fixed amount of resource required to perform an action on that 100 bytes, and it stays in place forever, we just aren't anywhere near it.

Reality? You fell for scammers after buying into an ICO or IPO
(more likely, like any other stock or index in the world, ICO or not, your portfolio went down 25% or 85%).
Now the SEC is helping you get your lost money back, maybe...
Fuserleer
Legendary
*
Offline Offline

Activity: 1064
Merit: 1016



View Profile WWW
June 03, 2015, 06:35:12 PM
 #50

not if we compress the data at least


The critical error made in this post is the assumption that storing, processing or transmitting a given amount of data will take a fixed amount of resources forever. The history of technology over the last 200 years has already proven this assumption to be completely wrong.


Please elaborate.

100 bytes of data will always be 100 bytes of data, regardless of what technological advancements are made in storing, transmitting it, or processing it.  All you can hope for is that these methods of managing it progress closer to the optimal over time.

The ultimate "fixed amount of resources" required to process that 100 bytes in any manner are governed by the laws of thermodynamics.  So ultimately, there is a fixed amount of resource required to perform an action on that 100 bytes, and it stays in place forever, we just aren't anywhere near it.

Block and transaction data is high entropy; if you compress it, it takes up more space.

You can compress some sets of 100 bytes to fewer than 100 bytes, but most sets of 100 bytes will compress to more than 100 bytes.  Thus, taken over all sets of 100 bytes, you always need 100 bytes.
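
The counting argument behind that claim: there are 256^100 possible 100-byte messages but strictly fewer possible outputs shorter than 100 bytes, so no lossless scheme can shrink all of them (a quick check, not from the thread):

Code:
shorter_outputs = sum(256**k for k in range(100))   # all strings of length 0..99
print(shorter_outputs < 256**100)                   # True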

ArticMine
Legendary
*
Offline Offline

Activity: 2282
Merit: 1050


Monero Core Team


View Profile
June 03, 2015, 06:46:54 PM
 #51

...

I'm not suggesting it should be crippled, merely that it can only live within its technological means due to its architecture at that moment in time.  The architecture imposes its own limits because of the lack of available resources to perform better, or conversely, the resources available (or lack of) impose the limits onto Bitcoin.   This doesn't cripple anything, because over time as the resources available increase (or efficiency increases), Bitcoin is able to take advantage of them.

The same is for any currency, be it Bitcoin, Nxt, and others, they are limited in operation by the efficiency of the resources demanded by the implementation, at that moment in time.

There is no way to work around the resource limits available at any one moment in time in a vertical manner, which is the problem here with these "high TPS currencies", be it bandwidth, processing, storage...you can only be as efficient as the most efficient available resources will allow...past that, you have to start chopping things out and reduce the work load.

I'm sure that VISA didn't have 1 person verifying and doing all those punch card transactions, they had many, so it didn't matter that the resources available to do it were extremely inefficient.  Block chains are like that scenario, no matter how many nodes you have everyone of them is verifying the same data as everyone else, which is exactly the same (in terms of performance) as having just one verifying all the data.

Actually, in the 1940s and up to the 1970s, the primary methods of payment were cash and other bearer instruments, so nobody had to verify any ledger, since there was no ledger to verify; it was simply too expensive to have even one entity doing the verification. With the more widespread availability of mainframe computers in the 1970s, data processing costs came down and it became viable to have centralized ledgers, so the widespread use of credit cards, and in the 1990s debit cards, became possible. Now, as data processing costs have come down further, it is becoming viable to have decentralized ledgers such as Bitcoin, Monero, etc.

I really must emphasize that a historical perspective over the last 50 years, and going back 200 years or more, is critical to properly understanding this issue.

Concerned that blockchain bloat will lead to centralization? Storing less than 4 GB of data once required the budget of a superpower and a warehouse full of punched cards. https://upload.wikimedia.org/wikipedia/commons/8/87/IBM_card_storage.NARA.jpg https://en.wikipedia.org/wiki/Punched_card
Fuserleer
Legendary
*
Offline Offline

Activity: 1064
Merit: 1016



View Profile WWW
June 03, 2015, 06:49:29 PM
 #52

...

I'm not suggesting it should be crippled, merely that it can only live within its technological means due to its architecture at that moment in time.  The architecture imposes its own limits because of the lack of available resources to perform better, or conversely, the resources available (or lack of) impose the limits onto Bitcoin.   This doesn't cripple anything, because over time as the resources available increase (or efficiency increases), Bitcoin is able to take advantage of them.

The same is for any currency, be it Bitcoin, Nxt, and others, they are limited in operation by the efficiency of the resources demanded by the implementation, at that moment in time.

There is no way to work around the resource limits available at any one moment in time in a vertical manner, which is the problem here with these "high TPS currencies", be it bandwidth, processing, storage...you can only be as efficient as the most efficient available resources will allow...past that, you have to start chopping things out and reduce the work load.

I'm sure that VISA didn't have 1 person verifying and doing all those punch card transactions, they had many, so it didn't matter that the resources available to do it were extremely inefficient.  Block chains are like that scenario, no matter how many nodes you have everyone of them is verifying the same data as everyone else, which is exactly the same (in terms of performance) as having just one verifying all the data.

Actually in the 1940's and up to the 1970's the primary methods of transactions were cash and other bearer instruments so nobody had to verify any ledger since there was no ledger to verify. It was simply too expensive to have even one entity doing the verification. With the more widespread availability of mainframe computers in the 1970's data processing costs came down and it became viable to have centralized ledgers so the widespread use of credit cards and in the 1990's debit cards became possible. Now as data processing costs have further come down it is becoming viable to have decentralized ledgers such as Bitcoin, Monero etc.

I really must emphasize that a historical perspective over the last 50 years and going back 200 years or more is critical to properly understand this issue.

I think we'll have to agree to disagree, purely because I think one of us is not understanding what the other is trying to convey.

Good talk though Smiley

kazuki49
Sr. Member
****
Offline Offline

Activity: 350
Merit: 250



View Profile
June 03, 2015, 06:53:36 PM
 #53

I think the correct term for Monero is not "infinite TPS" but "virtually infinite TPS": it can scale to infinity, but that doesn't mean you can make an infinite number, or even a ridiculously big number, of transactions right now all of a sudden.
Fuserleer
Legendary
*
Offline Offline

Activity: 1064
Merit: 1016



View Profile WWW
June 03, 2015, 06:57:34 PM
 #54

I think the correct term for Monero is not "infinite TPS" but "virtually infinite TPS", it can scale to infinity but it doesnt mean you can make an infinite number or even a ridiculously big number of transactions right now all of a sudden.

Technically it can't scale to infinity at all, because that would require more energy than there is in an infinite number of universes.  It's very frustrating how the word "infinite" is thrown around without any understanding of what it means and the implications of such.

A more correct term would be "unbounded" or "unlimited", which suggests exactly what it is: there are no limits imposed by the protocol.

kazuki49
Sr. Member
****
Offline Offline

Activity: 350
Merit: 250



View Profile
June 03, 2015, 06:59:28 PM
 #55

I think the correct term for Monero is not "infinite TPS" but "virtually infinite TPS", it can scale to infinity but it doesnt mean you can make an infinite number or even a ridiculously big number of transactions right now all of a sudden.

technically it cant scale to infinity at all, because that would require more energy than there is in an infinite number of universes.  Its very frustrating how the word "infinite" is thrown around without any understanding of what it means and the implications of such.

A more correct term would be "unbounded", or "unlimited", which suggest exactly what it is, there are no limits imposed by the protocol.

technically the term is correct too:

Quote
vir·tu·al·ly
ˈvərCH(əw)əlē/
adverb
adverb: virtually

    1.
    nearly; almost.
    "virtually all those arrested were accused"
    synonyms:   effectively, in effect, all but, more or less, practically, almost, nearly, close to, verging on, just about, as good as, essentially, to all intents and purposes, roughly, approximately; More

and how do you know the universe isn't infinite? are you Satoshi our god?
Fuserleer
Legendary
*
Offline Offline

Activity: 1064
Merit: 1016



View Profile WWW
June 03, 2015, 07:04:21 PM
 #56

I think the correct term for Monero is not "infinite TPS" but "virtually infinite TPS", it can scale to infinity but it doesnt mean you can make an infinite number or even a ridiculously big number of transactions right now all of a sudden.

technically it cant scale to infinity at all, because that would require more energy than there is in an infinite number of universes.  Its very frustrating how the word "infinite" is thrown around without any understanding of what it means and the implications of such.

A more correct term would be "unbounded", or "unlimited", which suggest exactly what it is, there are no limits imposed by the protocol.

Quote
technically the term is correct.

vir·tu·al·ly
ˈvərCH(əw)əlē/
adverb
adverb: virtually

    1.
    nearly; almost.
    "virtually all those arrested were accused"
    synonyms:   effectively, in effect, all but, more or less, practically, almost, nearly, close to, verging on, just about, as good as, essentially, to all intents and purposes, roughly, approximately; More

How do you define "almost" in terms of infinity when infinity itself has no upper boundary?  You don't know where infinity "stops" (because it doesn't), so you can never know if you are almost there or not.

The definition of infinity is abstract, so it's impossible to quantify it in a non-abstract manner, as "virtually" attempts to do Smiley

Fuserleer
Legendary
*
Offline Offline

Activity: 1064
Merit: 1016



View Profile WWW
June 03, 2015, 07:05:32 PM
 #57

are you Satoshi our god?

In one of the infinite universes, yes I am Wink

ArticMine
Legendary
*
Offline Offline

Activity: 2282
Merit: 1050


Monero Core Team


View Profile
June 03, 2015, 07:05:47 PM
 #58

I think the correct term for Monero is not "infinite TPS" but "virtually infinite TPS", it can scale to infinity but it doesnt mean you can make an infinite number or even a ridiculously big number of transactions right now all of a sudden.

technically it cant scale to infinity at all, because that would require more energy than there is in an infinite number of universes.  Its very frustrating how the word "infinite" is thrown around without any understanding of what it means and the implications of such.

A more correct term would be "unbounded", or "unlimited", which suggest exactly what it is, there are no limits imposed by the protocol.

Yes, this is correct. The limits should be imposed by the market, given what is practical using the technology of the day, and not baked into the protocol. That is why adaptive limits that are market driven are the way to go here. In this fashion, as the cost of data processing falls, the TPS can increase while maintaining the same level of decentralization.
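
As a minimal sketch of what such a market-driven limit could look like, loosely inspired by Monero's median-based block size (the function, constants and window here are illustrative, not the actual consensus rules):

Code:
def next_block_limit(recent_block_sizes, floor=300_000, window=100):
    """Allow the next block up to twice the median of recent blocks,
    never dropping below a fixed floor."""
    recent = sorted(recent_block_sizes[-window:])
    median = recent[len(recent) // 2] if recent else floor
    return max(floor, 2 * median)

print(next_block_limit([100_000] * 100))   # quiet network: the 300000 floor applies
print(next_block_limit([400_000] * 100))   # busy network: the limit grows to 800000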

Concerned that blockchain bloat will lead to centralization? Storing less than 4 GB of data once required the budget of a superpower and a warehouse full of punched cards. https://upload.wikimedia.org/wikipedia/commons/8/87/IBM_card_storage.NARA.jpg https://en.wikipedia.org/wiki/Punched_card
kazuki49
Sr. Member
****
Offline Offline

Activity: 350
Merit: 250



View Profile
June 03, 2015, 07:07:32 PM
 #59

I think the correct term for Monero is not "infinite TPS" but "virtually infinite TPS", it can scale to infinity but it doesnt mean you can make an infinite number or even a ridiculously big number of transactions right now all of a sudden.

technically it cant scale to infinity at all, because that would require more energy than there is in an infinite number of universes.  Its very frustrating how the word "infinite" is thrown around without any understanding of what it means and the implications of such.

A more correct term would be "unbounded", or "unlimited", which suggest exactly what it is, there are no limits imposed by the protocol.

Quote
technically the term is correct.

vir·tu·al·ly
ˈvərCH(əw)əlē/
adverb
adverb: virtually

    1.
    nearly; almost.
    "virtually all those arrested were accused"
    synonyms:   effectively, in effect, all but, more or less, practically, almost, nearly, close to, verging on, just about, as good as, essentially, to all intents and purposes, roughly, approximately; More

How do you define "almost" in terms of infinity when infinity itself has no upper boundary?  You don't know where infinity "stops" (because it doesn't), so you can never know if you are almost there or not.

The definition of infinity is abstract, so it's impossible to quantify it in a non-abstract manner, as "virtually" attempts to do Smiley

That is not the point of the word infinity: it means without any limit, and that is exactly what we have in Monero; the addition of "virtually" is just an honest concession to the fact that we are limited in the amount of energy we can extract and store, like you said earlier.
Fuserleer
Legendary
*
Offline Offline

Activity: 1064
Merit: 1016



View Profile WWW
June 03, 2015, 07:14:06 PM
 #60

I think the correct term for Monero is not "infinite TPS" but "virtually infinite TPS", it can scale to infinity but it doesnt mean you can make an infinite number or even a ridiculously big number of transactions right now all of a sudden.

technically it cant scale to infinity at all, because that would require more energy than there is in an infinite number of universes.  Its very frustrating how the word "infinite" is thrown around without any understanding of what it means and the implications of such.

A more correct term would be "unbounded", or "unlimited", which suggest exactly what it is, there are no limits imposed by the protocol.

Quote
technically the term is correct.

vir·tu·al·ly
ˈvərCH(əw)əlē/
adverb
adverb: virtually

    1.
    nearly; almost.
    "virtually all those arrested were accused"
    synonyms:   effectively, in effect, all but, more or less, practically, almost, nearly, close to, verging on, just about, as good as, essentially, to all intents and purposes, roughly, approximately; More

How do you define "almost" in terms of infinity when infinity itself has no upper boundary?  You don't know where infinity "stops" (because it doesn't), so you can never know if you are almost there or not.

The definition of infinity is abstract, so it's impossible to quantify it in a non-abstract manner, as "virtually" attempts to do Smiley

That is not the point of the word infinity: it means without any limit, and that is exactly what we have in Monero; the addition of "virtually" is just an honest concession to the fact that we are limited in the amount of energy we can extract and store, like you said earlier.

Please show me how you are more or less, practically, almost, nearly, close to, verging on, just about, as good as, essentially, to all intents and purposes, roughly, approximately extracting all the energy from the universe to get close to infinite processing capability? Smiley
