Bitcoin Forum

Bitcoin => Development & Technical Discussion => Topic started by: vinodjax on September 09, 2018, 10:04:41 AM



Title: What are the steps to take to publish the solution for blockchain scaling?
Post by: vinodjax on September 09, 2018, 10:04:41 AM
Hey everyone,

I have fixed the blockchain scaling issue. I am creating a network with the necessary fixes in place. I am also writing a paper so everyone can understand the fix. Please advise what are the further steps to take from here.

My vision is to build a Value Network so that everyone can have access to micropayments and to create a society where people are motivated to add value by means of scientific inventions.


Title: Re: What are the steps to take to publish the solution for blockchain scaling?
Post by: aleksej996 on September 09, 2018, 10:55:01 AM
There was never an issue with figuring out how to scale a blockchain; there was just a debate on which solution we would implement.
That debate resulted in a fork as people couldn't agree.

I am glad that you have another solution, but this is a 'problem' that even Satoshi gave a simple solution for.
It is kind of a virtual problem, not a natural one. The block limit was purposely implemented for spam protection.
So it is very simple if you just want to solve it in the simplest and quickest way possible, but an off-chain solution ended up being followed by most.


Title: Re: What are the steps to take to publish the solution for blockchain scaling?
Post by: vinodjax on September 09, 2018, 12:20:31 PM
Hello aleksej996,

Glad to see you post here. I appreciate you sharing your points.

I am talking about scaling payments and data to millions of transactions per second without any "central validator hub". I am not aware of any existing solution that does the same. The Lightning Network works only in theory because it still needs everyone to be in a payment channel with someone. Just writing those "LOCKED" payment channels requires a lot of on-chain space.

I have a solution which I have been working on for quite some time which solves the scalability trilemma.


Title: Re: What are the steps to take to publish the solution for blockchain scaling?
Post by: vinodjax on September 09, 2018, 12:32:40 PM
I am glad that you have another solution, but this is a 'problem' that even Satoshi gave a simple solution for.
It is kind of a virtual problem, not a natural one. The block limit was purposely implemented for spam protection.
So it is very simple if you just want to solve it in the simplest and quickest way possible, but an off-chain solution ended up being followed by most.

Here's my view on your statement. If 10 billion people in the world used a blockchain for payments, that's at least a million transactions a second. Currently, the block size is 1 MB. At a minimum BTC transaction size of ~250 bytes, that's ~4,000 transactions per block, or about 6.66 transactions per second.
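
A quick back-of-the-envelope check of that arithmetic (just an illustration of the numbers above, not code from my proof of concept):

Code:
package main

import "fmt"

func main() {
	const (
		blockSizeBytes = 1000000 // 1 MB block
		txSizeBytes    = 250     // minimum transaction size assumed above
		blockInterval  = 600.0   // seconds between blocks
	)
	txPerBlock := blockSizeBytes / txSizeBytes
	fmt.Println(txPerBlock)                          // 4000 transactions per block
	fmt.Println(float64(txPerBlock) / blockInterval) // ~6.67 transactions per second
}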

I have a solution (proof of concept in GoLang) which scales up to 1M transactions per second with a smaller block size without compromising on decentralization / security / future scalability.


Title: Re: What are the steps to take to publish the solution for blockchain scaling?
Post by: vinodjax on September 09, 2018, 01:33:26 PM
Hey ETFbitcoin,

Thank you for taking the time to post.


Release your whitepaper and make sure someone does a peer review of your paper. Releasing it on its own testnet or releasing the source code also helps.

I am going to do exactly that. I have stress tested the proof of concept. I will be publishing the white paper within a month, releasing the test network within 2 months, and the main network within 3 months.

It would be great if you could do a peer review too and point to the issues I have, so that I can improve upon things. Of course, this system also comes with a Turing-complete VM, and I am planning to run the entire project as a smart contract with tokens issued in a DAO style to anyone who adds value. I will share the business plan and other stuff at a later stage, after the white paper is ready.

You mention LN's disadvantages, not why it only works in theory, since LN is already used on both testnet and mainnet.

Yes, thanks for correcting me. What I meant was that it works only in theory for "unlimited scaling".

Assuming the current block size is 1MB (which it isn't, because we use a block weight limit of 4M weight units), or AFAIK 1,000,000 bytes in reality, then that means one transaction could only be about 1 byte in size, and I can't see how you would fit a transaction (input, output and signature) into that. But I've no idea about GoLang.

ETFbitcoin, it has nothing to do with GoLang. Data is data everywhere. My solution's algorithm reaches sharding consensus with an infinitesimally low failure rate and extremely high fault tolerance.


Title: Re: What are the steps to take to publish the solution for blockchain scaling?
Post by: gmaxwell on September 09, 2018, 02:33:49 PM
The block limit was purposely implemented for spam protection
This claim is just one of the pieces of unsupported nonsense pumped out by paid shills; it's so sad that so many people have just accepted it because of the number of sock accounts that have constantly repeated it. :(

As far as the thread goes, many people have made similar claims before. Their proposals have inevitably turned out to be banal rehashings of more or less obvious ideas that the Bitcoin community considered and rejected as far back as 2010, because they don't respect some critical element of what makes Bitcoin Bitcoin that the proposers themselves weren't aware of, as a result of so much Bitcoin promotional material giving an incomplete view of the system. The most common thing I've seen such proposals do is introduce unfounded third-party trust (usually in miners, sometimes in other random node operators) and in doing so eliminate the system's trustlessness.

Regardless, instead of saying that you have an idea that achieves some grand effect, you should simply publish the idea. Crowing about it just burns your and everyone else's time without making the world a better place or having an opportunity to improve your understanding.



Title: Re: What are the steps to take to publish the solution for blockchain scaling?
Post by: vinodjax on September 09, 2018, 02:46:08 PM

As far as the thread goes, many people have made similar claims before. Their proposals have inevitably turned out to be banal rehashings of more or less obvious ideas that the Bitcoin community considered and rejected as far back as 2010, because they don't respect some critical element of what makes Bitcoin Bitcoin that the proposers themselves weren't aware of, as a result of so much Bitcoin promotional material giving an incomplete view of the system. The most common thing I've seen such proposals do is introduce unfounded third-party trust (usually in miners, sometimes in other random node operators) and in doing so eliminate the system's trustlessness.

Regardless, instead of saying that you have an idea that achieves some grand effect, you should simply publish the idea.  Crowing about it just burns your and everyone else's time without making the world a better place or having an opportunity to improve your understanding.


Agreed. I will be publishing my paper very shortly in this same thread.

Gmaxwell, since you seem to be an old hand, let me convince you super quick as to how the solution works.

Important statement #1 :

Ask yourself, from a 40,000-foot view:

What is a currency?

A currency is something that is pegged to a base value, like gold. All the scaling issues of blockchains start at the coinbase. Since PoW is NOT reusable, one simply has to issue a coinbase equal to the amount of work that has been performed.

For example, let's assume the base target is 100. If someone finds a nonce that creates a hash below a target of 100, they're issued, let's say, 1 coin. If someone spends double the amount of hash power, they have to be issued 2 coins; i.e., the coinbase shouldn't be fixed. What this means is that all chains following this consensus have an equal store of value, hence facilitating inter-chain transfers at face value.
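
A minimal sketch of what I mean (my own illustrative names and constants, not the actual reference client): the coinbase reward scales with how much harder the achieved target is than a hardcoded base target, so halving the target (roughly doubling the work) doubles the coins issued.

Code:
package main

import "fmt"

const (
	baseTarget = 100.0 // consensus constant from the example above
	baseReward = 1.0   // coins issued for exactly meeting the base target
)

// coinbaseReward returns the coins issued for a block whose hash met achievedTarget.
func coinbaseReward(achievedTarget float64) float64 {
	return baseReward * baseTarget / achievedTarget
}

func main() {
	fmt.Println(coinbaseReward(100)) // 1 coin for the base target
	fmt.Println(coinbaseReward(50))  // 2 coins for double the work
}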


Title: Re: What are the steps to take to publish the solution for blockchain scaling?
Post by: vinodjax on September 09, 2018, 02:54:29 PM
Here's some more info that I have quickly typed in for you to look at :

The fix is simple.

Bitcoin has solved most of the puzzle with blockchain.

We simply propose a few fixes :

1) The coin value should be directly pegged to the issuance of the coins, which means the coinbase transaction should be equal to the difficulty factor of the target that's met.

For example, if the target is less than 100,000, then one coin shall be issued.

If the target is less than 50,000, then two coins shall be issued in the coinbase.

This way multiple chains are equal. Given this, people can move accounts and transfer money between different chains.

Any chain with proof of work is valid, and it's easy to estimate the amount of value in it.

How do we prevent double spend?

Remember, proof of work is actually a timestamp and a sure way to measure time. To achieve decentralized consensus, we simply accept as valid the chain which holds, in a section of its headers, the full sequence of the mother chain's timestamps (block hashes). I have a specific algorithm which measures the trustability of a chain. I will be releasing it along with the reference client.


Title: Re: What are the steps to take to publish the solution for blockchain scaling?
Post by: vinodjax on September 09, 2018, 03:02:16 PM

Regardless, instead of saying that you have an idea that achieves some grand effect, you should simply publish the idea. Crowing about it just burns your and everyone else's time without making the world a better place or having an opportunity to improve your understanding.


I am going to try to publish the paper in the coming week, along with a proof of concept test network. I will publish the source code for that so that maybe you can then provide me detailed feedback.


Title: Re: What are the steps to take to publish the solution for blockchain scaling?
Post by: vinodjax on September 09, 2018, 03:26:34 PM
A quick example image that I made with draw.io to explain my views.

https://pasteboard.co/HD6ALBV.png


Title: Re: What are the steps to take to publish the solution for blockchain scaling?
Post by: Anonymous Kid on September 11, 2018, 06:59:57 PM
A quick example image that I made with draw.io to explain my views.

https://pasteboard.co/HD6ALBV.png
Does this not mean that the main chain or "monster chain" as you called it would get insanely big? How does that scale?


Title: Re: What are the steps to take to publish the solution for blockchain scaling?
Post by: vinodjax on September 12, 2018, 07:28:05 AM
It doesn't become big. There are more details involved, which I will publish in my paper. I just gave a quick overview. If you are a developer and are interested, you are welcome to join me in coding the reference client, and I can provide you more information.


Title: Re: What are the steps to take to publish the solution for blockchain scaling?
Post by: aleksej996 on September 12, 2018, 10:06:53 AM
This claim is just one of the pieces of unsupported nonsense pumped out by paid shills; it's so sad that so many people have just accepted it because of the number of sock accounts that have constantly repeated it. :(

Well, this is very surprising to me.
If it wasn't for stopping the blockchain from getting too big, why was it implemented then?


Title: Re: What are the steps to take to publish the solution for blockchain scaling?
Post by: vinodjax on September 12, 2018, 11:25:42 AM
This claim is just one of the pieces of unsupported nonsense pumped out by paid shills; it's so sad that so many people have just accepted it because of the number of sock accounts that have constantly repeated it. :(

Well, this is very surprising to me.
If it wasn't for stopping the blockchain from getting too big, why was it implemented then?

I am not completely sure of the motivation behind that decision.

If I were to guess, Satoshi probably implemented that block size limit to allow guaranteed full propagation to all the nodes, regardless of their geographic location and latency, before the 10-minute interval kicks in, so the nodes have enough time to at least easily verify a block and hence be ready to validate the next one.


Title: Re: What are the steps to take to publish the solution for blockchain scaling?
Post by: Piggy on September 14, 2018, 04:39:37 AM
vinodjax, could you please tell us what your background is and what you do? Is this work you are doing related to your job, a side project, or something else?

In any case, I'm waiting to see what you come up with.  :)


Title: Re: What are the steps to take to publish the solution for blockchain scaling?
Post by: vinodjax on September 14, 2018, 09:20:10 AM
vinodjax, could you please tell us what your background is and what you do? Is this work you are doing related to your job, a side project, or something else?

In any case, I'm waiting to see what you come up with.  :)

I am a software enthusiast / IT security enthusiast turned entrepreneur (gaming/casino industry). I am the founder of a multi-million dollar gaming brand (valued at at least $10m using a 3x revenue valuation formula).

Our casino brands for the Indian market are :

www.khelo365.com
www.deccanrummy.com

Like I said, I have not worked for anyone. Fixing the blockchain scaling problem is what I have been working on for the last 4 months.

Sure, once I publish the paper, you will understand my concept.

Our vision is to build a payment processor that scales for millions of transactions per second using blockchain technology where we issue a currency underwritten by CPU Power.


Title: Re: What are the steps to take to publish the solution for blockchain scaling?
Post by: aliashraf on September 14, 2018, 09:48:05 AM

1) The coin value should be directly pegged to the issuance of the coins, which means the coinbase transaction should be equal to the difficulty factor of the target that's met.

For example, if the target is less than 100,000, then one coin shall be issued.

If the target is less than 50,000, then two coins shall be issued in the coinbase.

This way multiple chains are equal. Given this, people can move accounts and transfer money between different chains.

Any chain with proof of work is valid, and it's easy to estimate the amount of value in it.
It seems to me a kind of chaotic version of my own proposal, PoCW. I say chaotic because it is more deconstructional, and we would have to deal with a whole new data structure which is more complex than what is used in the legacy blockchain. This is definitely a disadvantage: the more sophisticated the model, the less predictable and more error-prone it is.

As for the basic idea, rewarding miners proportionally to their blocks' difficulty, which is relatively similar to what I suggested in PoCW, obviously I admire it.

A few questions, by the way:
1- How is it possible to keep track of difficulty and total coin supply?

2- Full nodes should validate all chains for inter-chain transactions. How is this handled?

3- Immutability of a confirmed transaction on the blockchain is not guaranteed only by the work done on the containing block; it is achieved by the total work of all of the next blocks in the chain. Dividing work between sub-chains escalates rewrite attacks on weak chains. What's your mitigation?



Title: Re: What are the steps to take to publish the solution for blockchain scaling?
Post by: vinodjax on September 14, 2018, 11:17:44 AM

1) The coin value should be directly pegged to the issuance of the coins, which means the coinbase transaction should be equal to the difficulty factor of the target that's met.

For example, if the target is less than 100,000, then one coin shall be issued.

If the target is less than 50,000, then two coins shall be issued in the coinbase.

This way multiple chains are equal. Given this, people can move accounts and transfer money between different chains.

Any chain with proof of work is valid, and it's easy to estimate the amount of value in it.
It seems to me a kind of chaotic version of my own proposal, PoCW. I say chaotic because it is more deconstructional, and we would have to deal with a whole new data structure which is more complex than what is used in the legacy blockchain. This is definitely a disadvantage: the more sophisticated the model, the less predictable and more error-prone it is.

As for the basic idea, rewarding miners proportionally to their blocks' difficulty, which is relatively similar to what I suggested in PoCW, obviously I admire it.

A few questions, by the way:
1- How is it possible to keep track of difficulty and total coin supply?

2- Full nodes should validate all chains for inter-chain transactions. How is this handled?

3- Immutability of a confirmed transaction on the blockchain is not guaranteed only by the work done on the containing block; it is achieved by the total work of all of the next blocks in the chain. Dividing work between sub-chains escalates rewrite attacks on weak chains. What's your mitigation?




It seems to me a kind of chaotic version of my own proposal, PoCW. I say chaotic because it is more deconstructional, and we would have to deal with a whole new data structure which is more complex than what is used in the legacy blockchain. This is definitely a disadvantage: the more sophisticated the model, the less predictable and more error-prone it is.

The purpose of technology is to help mankind, and not the other way around. So, if it has to come down to redoing the entire data structure or coming up with a fresh perspective, as long as it solves the problem of scaling transactions in an immutable and decentralized way, it should be fine. Nothing is chaotic.


As for the basic idea, rewarding miners proportionally to their blocks' difficulty, which is relatively similar to what I suggested in PoCW, obviously I admire it.

Interesting, I would like to look more deeply into your proposal if you can share with me the links to your concept.

1- How is it possible to keep track of difficulty and total coin supply?

You don't need to. You simply set it as a hardcoded value as part of the consensus protocol. Let's say the base target is 1,000,000, which requires 1 TH/s to compute a hash lower than 1,000,000.

Then, let's say that meeting the base target rewards 100 coins in the coinbase.

If someone computes a hash lower than a target of 500,000, which we assume takes 2 TH/s, then you simply reward them 200 coins.

This way, just by looking at the hashes, you can compute the difficulty of the chain and the total coin supply. Also, you can add uncle features like in Ethereum so that you can have shorter block times and more ways of calculating the heaviest chain.

To briefly answer your question, you can keep track of difficulty from the adjusted targets. You can keep track of the coin supply simply by figuring out how difficult a hash was found; if the hash meets a multiple of the base target, issue the corresponding multiple of coins.
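
To make that concrete, here is a rough sketch (illustrative only, not the reference client) of deriving both cumulative work and total coin supply purely from the targets recorded in the block headers:

Code:
package main

import "fmt"

const (
	baseTarget = 1000000.0 // from the example above
	baseReward = 100.0     // coins for exactly meeting the base target
)

type header struct {
	target float64 // the target this block's hash actually satisfied
}

// tally walks the headers and returns total coins issued and total work
// (in base-target units); no external state or trusted index is needed.
func tally(chain []header) (supply, work float64) {
	for _, h := range chain {
		w := baseTarget / h.target // relative work of this block
		work += w
		supply += baseReward * w
	}
	return
}

func main() {
	chain := []header{{1000000}, {500000}, {250000}}
	supply, work := tally(chain)
	fmt.Println(supply, work) // 700 coins issued, 7 base-target units of work
}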

2- Full nodes should validate all chains for inter-chain transactions. How is this handled?

It's not needed.

Let's say we both have to transfer funds. As long as you are getting paid by a transfer agent on your own chain, you don't really need to be worried about how the transfer agent is managing his risk. As long as the transfer agent has a copy of the mother chain and copies of the daughter chains (1 - the one he is transferring payments from, and 2 - the one he's transferring payments to), OR payment channels between the different parties involved, he doesn't need to know any more information about any further chains.

The transfer agent simply accepts the heaviest chain as valid (and the most valuable in case of a tie) in his subnetwork (indicated by a separate network ID on the broadcasted messages).

To briefly answer your question, no one needs to verify all the chains. Every block on a daughter chain simply links to the next block hash of the mother chain in its header. Hence, everyone just needs a copy of the mother chain and their respective daughter chain to find out the truth in a decentralized fashion. The heaviest, most valuable chain which is time valid (validated by checking the sequence of mother-chain block hashes mentioned in the daughter chain) in a subnetwork is assumed to be the true chain.
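
Here is a rough sketch of that linkage (my own naming, just to illustrate the idea): each daughter-chain header commits to a mother-chain block hash, and a daughter chain is "time valid" only if the mother hashes it references appear in order on the mother chain.

Code:
package main

import "fmt"

type daughterHeader struct {
	hash       string
	motherHash string // mother-chain block hash this block anchors to
}

// timeValid checks that the mother hashes referenced by the daughter chain
// occur in non-decreasing order of position on the mother chain.
func timeValid(mother []string, daughter []daughterHeader) bool {
	pos := make(map[string]int)
	for i, h := range mother {
		pos[h] = i
	}
	last := -1
	for _, d := range daughter {
		i, ok := pos[d.motherHash]
		if !ok || i < last {
			return false
		}
		last = i
	}
	return true
}

func main() {
	mother := []string{"m0", "m1", "m2", "m3"}
	daughter := []daughterHeader{{"d0", "m0"}, {"d1", "m2"}, {"d2", "m3"}}
	fmt.Println(timeValid(mother, daughter)) // true
}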

3- Immutability of a confirmed transaction on the blockchain is not guaranteed only by the work done on the containing block; it is achieved by the total work of all of the next blocks in the chain. Dividing work between sub-chains escalates rewrite attacks on weak chains. What's your mitigation?

Proof of Work is not reusable.

What is stopping 51% attacks on Bitcoin? Simply that miners find more value in writing new blocks and creating coins for themselves than in distorting an existing system, and that is what gives the currency its value - not to mention the madness it would take to waste that kind of resources just to rewrite a few blocks with a not-so-certain outcome. It's called a free market.

Based on my above statement, the root of trust of Bitcoin, or anything that deals with linking to the checksums of past blocks, comes down to people acting honestly for profit.

Simply assume that there are a lot of systems out there like Bitcoin Cash, Bitcoin, Ethereum, etc., and imagine if anyone could say, just by looking at their block hashes, how many coins each miner and hence the entire system would have. That would effectively make all the coins of equal value, because they are underwritten by an exact amount of CPU power (given that we also allow uncle blocks). Now, one could make an argument like yours that longer chains and chains with more hash rate are more trusted, and their valuations could be higher than those of the weaker chains. In a free market this simply means that it's a high-risk payment, and the transfer agent may charge higher fees for such a transfer. The main party at risk here is the payment transfer agent; this amounts to business risk in a free market, and the respective entities would resolve it. Maybe the top miners would choose to act as transfer agents.



Note: Give me until the end of this month. I will have a working proof of concept which you can fork and run and stress test the system for yourself.


Title: Re: What are the steps to take to publish the solution for blockchain scaling?
Post by: aliashraf on September 14, 2018, 03:22:19 PM
Ok dude. I'm looking forward to your next 'steps'.

Meanwhile:
Be more careful with the whole thing. A decentralized, permissionless consensus system is a hell of a software project. Hostile environment, sophisticated state machine, heavy game-theory-related issues, multiple socio-politico-economic factors involved, ... just take it more seriously than a usual software project.

You should get acquainted with the literature in the field. Many brilliant people have contributed to the crypto ecosystem and you can always find precious ideas by reading their works.

As for my PoCW proposal, you can find it here (https://bitcointalk.org/index.php?topic=4438334.0).


Title: Re: What are the steps to take to publish the solution for blockchain scaling?
Post by: franky1 on September 14, 2018, 05:38:07 PM
solution to onchain scaling
1. reduce the sigops limit per tx..  no one should get to have 20% control of a block for their tx
benefit. more tx per block if a greedy tx'er tries and less time needed to validate each tx (it's what would have solved the linear validation problem that legacy transactions had that certain paid devs used as an excuse to push segwit through)
(p.s some of the new/reintroduced OPCodes actually make 'malleability problem' possible again.. (facepalm))

2. let the 4mb WEIGHT be 4mb LEGACY space. that way you can take out all the jumbled code of x4 and just get back to normal WEIGHT=MB used and not the 'virtual' space used. 4mb should be 4mb 1 mb should be 1mb its the whole point of the limits

3. if space is the problem then stop adding in new features that increase the average bytes per TX. EG confidential payments will add more bytes to a TX.
if you really love privacy so much then go use LN and be private and leave bitcoins mainnet to be lean and also FULLY AUDITABLE where by people can see funds from block reward to current owner. to prove value is real.. hiding it will make people trust the currency less (oops did i just counter the other plans devs have to ruin bitcoin by screaming it cant scale it aint a payment network and soon it aint auditable(the plans of killing a currency are by taking away its utility))

4. go back to basics
bring back a fee mechanism thats not like the old one where the amount held by UTXO negates the bytes used to get the fee low. yea the old fee formulae made it cheap for rich UTXO and expensive for small holders (facepalm)

but instead a (simplified for conversational purpose)
bytes used * (144/confirms of UTXO)
meaning if the coins only have 1 confirm it will cost them 144x more than normal.. thus spammers (over the ordinary use) pay more after all if they want to transact more than once a day. lock funds into LN and go play
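
rough illustration of that formula with example numbers (a sketch only, not a concrete fee-unit proposal):

Code:
package main

import "fmt"

// feeWeight follows the simplified formula above:
// bytes used * (144 / confirmations of the spent UTXO)
func feeWeight(txBytes, utxoConfirms int) float64 {
	if utxoConfirms < 1 {
		utxoConfirms = 1 // treat unconfirmed inputs as 1 confirm (assumption)
	}
	return float64(txBytes) * 144.0 / float64(utxoConfirms)
}

func main() {
	fmt.Println(feeWeight(250, 1))   // 36000: freshly moved coins pay 144x the base rate
	fmt.Println(feeWeight(250, 144)) // 250: coins that sat for about a day pay the base rate
}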


Title: Re: What are the steps to take to publish the solution for blockchain scaling?
Post by: vinodjax on September 14, 2018, 05:41:58 PM
Ok dude. I'm looking forward to your next 'steps'.

Meanwhile:
Be more careful with the whole thing. A decentralized, permissionless consensus system is a hell of a software project. Hostile environment, sophisticated state machine, heavy game-theory-related issues, multiple socio-politico-economic factors involved, ... just take it more seriously than a usual software project.

You should get acquainted with the literature in the field. Many brilliant people have contributed to the crypto ecosystem and you can always find precious ideas by reading their works.

As for my PoCW proposal, you can find it here (https://bitcointalk.org/index.php?topic=4438334.0).

Thanks for the link. Can you please check your PM? I am interested in knowing more about your vision.


Title: Re: What are the steps to take to publish the solution for blockchain scaling?
Post by: vinodjax on September 14, 2018, 05:44:31 PM
solution to onchain scaling
1. reduce the sigops limit per tx..  no one should get to have 20% control of a block for their tx
benefit. more tx per block if a greedy tx'er tries and less time needed to validate each tx (it's what would have solved the linear validation problem that legacy transactions had that certain paid devs used as an excuse to push segwit through)
(p.s some of the new/reintroduced OPCodes actually make 'malleability problem' possible again.. (facepalm))

2. let the 4mb WEIGHT be 4mb LEGACY space. that way you can take out all the jumbled code of x4 and just get back to normal WEIGHT=MB used and not the 'virtual' space used. 4mb should be 4mb 1 mb should be 1mb its the whole point of the limits

3. if space is the problem then stop adding in new features that increase the average bytes per TX. EG confidential payments will add more bytes to a TX.
if you really love privacy so much then go use LN and be private and leave bitcoins mainnet to be lean and also FULLY AUDITABLE where by people can see funds from block reward to current owner. to prove value is real.. hiding it will make people trust the currency less (oops did i just counter the other plans devs have to ruin bitcoin by screaming it cant scale it aint a payment network and soon it aint auditable(the plans of killing a currency are by taking away its utility))

4. go back to basics
bring back a fee mechanism thats not like the old one where the amount held by UTXO negates the bytes used to get the fee low. yea the old fee formulae made it cheap for rich UTXO and expensive for small holders (facepalm)

but instead a (simplified for conversational purpose)
bytes used * (144/confirms of UTXO)
meaning if the coins only have 1 confirm it will cost them 144x more than normal.. thus spammers (over the ordinary use) pay more after all if they want to transact more than once a day. lock funds into LN and go play

I don't think this will scale to millions of tx / second. Please correct me if I am wrong. LN still needs locked multi-sig on-chain transfers. If 2 million people were to use Bitcoin, in an ideal best-case scenario we are talking about 1M locked transfers, plus only ~4,200 locked transfers (deposits) per 10 minutes. Don't get me wrong, LN may work perfectly well. But I am just not sure how often people will need to replenish the bonded (multi-sig) transactions, when a single chain can only provide so much space for such capacity.

Besides,

1) It also comes down to a lot of availability (in terms of funds), given the number of HOPS it might take to reach your payment destination.
2) If a new account (public key) were to receive a transfer on the system, there's no chance you can do it via LN, because you simply have to create a new state for the account on-chain. So, in a worst-case scenario where a lot of new accounts are being on-boarded, the chain would only be able to onboard about 4,200 transactions per 10 minutes and hence take roughly 5 years to onboard 1 billion users (not to mention the huge disk space that the chain then takes up); see the quick calculation after this list.
3) I personally believe that deflation is more of a rich-get-richer scheme. Money, in my opinion, was created for the sole purpose of getting people motivated to contribute to society by employing their skills, not simply making more with it (without any risk). Deflation simply means that people will end up hoarding more, and it's never going to be adopted mainstream in that scenario because the mindset of the users would be to hoard it. I understand this is a sensitive point and hence I have simply stated that it's my opinion.
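
The quick calculation behind point 2 (illustrative assumptions only: ~4,200 on-chain transactions per 10-minute block, one onboarding transaction per new user):

Code:
package main

import "fmt"

func main() {
	const (
		txPerBlock   = 4200
		blocksPerDay = 144
		users        = 1000000000 // 1 billion new accounts to onboard
	)
	perDay := txPerBlock * blocksPerDay
	days := float64(users) / float64(perDay)
	fmt.Println(perDay)                  // 604800 onboarding transactions per day
	fmt.Printf("%.1f years\n", days/365) // ~4.5 years to onboard everyone
}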


Title: Re: What are the steps to take to publish the solution for blockchain scaling?
Post by: franky1 on September 15, 2018, 03:34:16 PM
I don't think this will scale to millions of tx / second.

we do not need millions a second..
think about it 1mill a second
= 60m a minute
= 3.6b an hour
= 86billion a day.
thats ludicrous

first of all visa stats and mastercard show that people on average do 45 transactions a month. which is 1.5 a day
obviously bitcoin is not as acceptable as visa/mastercard. so for now we should be working on less than visa and scale up as we go.

so with that said. we dont need a 86BILLION tx a day network we dont even need a 8.6 billion tx a day network. because not everyone is going to use bitcoin everyday right now.

so forget 1m tx a second. forget 100k tx a second.

the stupid fools that think what the community is begging for is a move from <7 to >1 million overnight, and who shout out "gigabytes by midnight", are the small minded people that never actually think rationally or logically.
i really find it funny that the onchain scalers are not screaming for millions per second. but instead steady progress over time.. and its actually the offchain people who dont understand, dont trust or have been paid to give up on blockchains that are screaming for millions per second and gigabytes by midnight as the only options.

the true community are begging for progressive onchain rises over logical time. not huge onchain leaps or else needing to abandon blockchain utility for some locked fed reserve network of swapping accounts of promissory notes

what we need to do is not let bitcoin get stifled so much that it ends up as a 0 tx a second and then just 5 mega factory TX's every 10 minutes where the LN factories swap their reserves with other factories onchain.. (bank reserve swaps) yea the devs want bitcoin to be the backend of the LN network. where reserves are swapped.. yep thats right they want bitcoin as a bankers reserve currency and where LN is the bankers payment system for their clients(channel users)

anyway we need to preserve onchain utility for normal people. that involves going back to basics. and then expand as we go too

so lets start by making tx's lean. to get to a possible 4200tx per mb. then with 4mb being allowed by the devs. lets open the legacy limit so the full 4mb can be used by such 16,800tx per 4mb.
16,800 per block = over 2 million a day
(oh look the 2million users a day in YOUR 2million ideal scenario)

so using actual stats from visa of spending per day. 2million users would only need 4mb blocks of normal lean tx's where legacy transactions can utilise the full 4mb

also to note, 2 million LN users is actually an average of each user needing 5 channels so thats 10million tx's to open and 10million to close.
which means people have to preplan and organise funds for about a 10 day lockin.
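
quick check of those numbers (rough example figures only):

Code:
package main

import "fmt"

func main() {
	const (
		txPerBlock   = 16800 // lean tx per 4mb block as above
		blocksPerDay = 144
		users        = 2000000
		channels     = 5 // channels per LN user
	)
	perDay := txPerBlock * blocksPerDay
	opensAndCloses := users * channels * 2 // one open + one close per channel
	fmt.Println(perDay)                                                // 2419200 tx a day onchain
	fmt.Printf("%.1f days\n", float64(opensAndCloses)/float64(perDay)) // ~8.3 days of blocks just for opens/closes
}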

...

and before you rebut that blockchains cant scale.. that was like the deluded script of kodak saying digital photography cant scale because of floppy disk data limits. (look how that turned out)
if you want to doubt the limits of data and the internet. please go tell skype that HD video calls dont work or go complain to EAgames that online gaming cant work. or speak to youtube, twitch, and other livestream sites that supposedly cant work

or do the rational thing, and realise technology expands all the time and things progress as time goes on.
but whatever you do dont even try to say blockchains are broken and the only way forward is locking funds into channels(accounts) that need someone else to sign for a payment and needs every middleman and the destination to be online to accept payment.. as that is the same AND worse system than what bankers already offer


Title: Re: What are the steps to take to publish the solution for blockchain scaling?
Post by: aliashraf on September 15, 2018, 08:47:06 PM
@franky1

Where have you been Frank, by the way?

Nowadays, we are short of true bitcoiners, the kind of people who have not given up on bitcoin and are still committed to the cause: an electronic p2p cash system. Where the hell are the other guys?

I sometimes feel so lonely here; the stupid block size debate has polarized this community in the worst way ever and we are short of true bitcoin advocates. Keep posting more.

As for tx/s throughput, I would say something like 100 tx/s suffices for the near future and is achievable with a few improvements, definitely including
1- a block time decrease
2- using schnorr signatures
for mid term we need decentralization by getting rid of pools to be prepared for sharding.

We should never ever trust  any second layer protocol/solution for scaling or other serious issues which bitcoin is designed for. Putting utilities on top of bitcoin is not bad, actually it would be very helpful as long as such utilities are not designed to compete with bitcoin and alienate people from the core system.


Title: Re: What are the steps to take to publish the solution for blockchain scaling?
Post by: franky1 on September 15, 2018, 11:31:52 PM
@franky1

Where have you been Frank, by the way?

Nowadays, we are short of true bitcoiners, the kind of people who have not given up on bitcoin and are still committed to the cause: an electronic p2p cash system. Where the hell are the other guys?

I sometimes feel so lonely here; the stupid block size debate has polarized this community in the worst way ever and we are short of true bitcoin advocates. Keep posting more.

As for tx/s throughput, I would say something like 100 tx/s suffices for the near future and is achievable with a few improvements, definitely including
1- a block time decrease
2- using schnorr signatures
for mid term we need decentralization by getting rid of pools to be prepared for sharding.

We should never ever trust  any second layer protocol/solution for scaling or other serious issues which bitcoin is designed for. Putting utilities on top of bitcoin is not bad, actually it would be very helpful as long as such utilities are not designed to compete with bitcoin and alienate people from the core system.

1. a blocktime decrease doesnt help. it actually affects many other things and has too many impacts on other things to outweigh a pro:con balance to make it plausible.
its far better to stay at 4mb per 10 min than do 1mb for 2.5min

2. schnorr sigs are useful for multisigs because they compact multiple parties' signatures to show as just 1 signature, hiding who signs it. there are security risks as people can make a 2 of 5 multisig. pretend its a 2 of 2 address and get someone to deposit funds into it (thinking they have absolute 50% control) and then 2 of the 4 conmen sign it without the user who deposited funds knowing who actually signed it. and the transaction then appears like he consented to it. thus no legal route of recovery due to lack of proof that he didnt consent.
in a normal lean legacy payment system, you dont need dual signing or second party authorisation. thus no multisig, thus no compacting.. that was the point of bitcoin. not needing others. like i said going back to simple lean transactions is best

3. other things can be done instead, like opcodes that just need 1 sig even if there are 500 inputs from the same address.
so that if 500 different people paid into one address i wont need to sign all 500 times. but just once because its all in one address.. which is done by certain opcode not a schnorr

4. other things like new rules that if a mempool is more than 1mb but a block is less than 1mb then the pool is doing an obvious 'empty block' and thus gets auto rejected. thus ensuring pools are incentivised to fill blocks or not get a block reward. and thus help prevent pools backlogging transactions or trying to bottleneck the network by delaying confirmations of transactions by avoid adding them.. which is also a solution to one of the '51% attack vectors'. after all no one is silly enough to 51% a network just to reverse their coffee purchase. so the only 'threat' of a 51% attack is to bottleneck the network by not including tx's

5. if you want to decrease the blocktime to make it more user friendly for the 'i dont wanna wait at a cashier desk for 10 minutes' argument.. well sorry but 2.5mins is just as lengthy a wait and doesnt really solve a queue build up at a retailer.
but that can be solved easily by funding an address a retailer owns (bartab analogy) and then letting the retailer deduct balance internally on their system.

6. as for sharding. well thats just 2nd layer again. sharding is just LN factories of many tx's and multiple parties when you rub away a lot of the buzzwording. sharding is from a banking analogy just like regional bank branches .. id say its better than reducing blocktime. but at the same time its then formalising the infrastructure into what banks do by forcing certain peoples activities being routed to a specific small pool of nodes

after all its far easier to deposit $90 into a starbucks legacy address and they just give you an in-app balance of $90 (30 coffees) than to do all the convoluted effort of making 5 LN locked channels funded with $18 per channel for good 'connectivity'/chance of success, but with the risks of channel offline situations (not able to spend $18) .. just for you to be able to buy at most 30 coffees (as long as all routes are available on the days after each $18 channel's funds are spent).
think about it. 10 onchain tx's to set up and eventually close 5 channels. and no guarantee of being able to spend all $90.

whereas just paying starbucks $90 in 1 legacy tx and them giving you an in-app balance is so much easier for everyone and the network and even for onchain fees

and especially with the risks that OTHER LN people will spend those $18 channel values by routing through you.. in short LN doesnt guarantee your $90 gets you 30 coffees.. but just depositing $90 (prepay) into a legacy address does

and thats the real funny part. those 'it cant scale'-ers think the debate is that buying 1 coffee 3 times a day cant scale and so offchain systems need to be forced on everyone. when in fact it just needs regular coffee drinkers to bulk buy (bartab) their coffee, but done simply using a legacy transaction. rather than alternative networks requiring everyone to be online all day every day and where routing is required, thus many people signing and agreeing to be part of your payment..

devs are overcomplicating solutions to issues they stifled and created themselves, all so they can own one of the precious future hubs/shards of nodes to get loads of fees to repay their investors


Title: Re: What are the steps to take to publish the solution for blockchain scaling?
Post by: vinodjax on September 16, 2018, 05:30:07 AM
Hey franky1,

Thanks for taking the time to post here. You seem to be an old hand. I welcome your participation, and I suggest taking the following comments as a friendly reasoning exercise rather than as an attempt to find ways to disprove you. I have tried to use your own points against you. Please find my comments below.


we do not need millions a second..
think about it 1mill a second
= 60m a minute
= 3.6b an hour
= 86billion a day.
thats ludicrous

I believe by "we" you are referring to the people who share your views? Because, any great technology should benefit all of mankind and not just 7-100 people per second.

Remember Kodak?

Kodak: Nobody wants instant photographs. They want to snap them, bring them to our studio and we will deliver them and we have a great capacity! 7 photos per second combining all our international locations.

And today, the number of photos taken is 38,000 per second. 38 thousand per second. And taking a photo is far less essential than our need to store our knowledge and value (including money) in an immutable fashion.

Just think about this : how many times in a day are you concerned about safeguarding your knowledge / value / sharing true knowledge / acquiring wisdom (knowing more about the truth) AS COMPARED TO wanting to take a photograph?

Your statement is analogous to Kodak's in this way:

Why do people need 38 thousand photos per second? We will grow organically and now are supporting 7 photos per second. Very soon, we will reach 100 photos per second. It's ludicrous that people want to take 38 thousand photos per second.

Ironically, Kodak invented the first digital camera. They were just too conservative to let go of their existing cash-cow and take a tangential direction which proved to be the future.

My request to you is this: Consider all options seriously from a value perspective (in terms of value added to society) instead of us taking sides.

first of all visa stats and mastercard show that people on average do 45 transactions a month. which is 1.5 a day
obviously bitcoin is not as acceptable as visa/mastercard. so for now we should be working on less than visa and scale up as we go.

Visa / Master card support about 45,000 tx per second. If you are talking only about payments, maybe 45,000 should be enough as an average. But, in my opinion, immutable data has a lot more applications than simply acting as a mint.

so with that said. we dont need a 86BILLION tx a day network we dont even need a 8.6 billion tx a day network. because not everyone is going to use bitcoin everyday right now.

You are right about Bitcoin; Bitcoin in its present form may do just fine with 100 tx / second. My intention is to build a new network that could provide a different utility for people than bitcoin.

so forget 1m tx a second. forget 100k tx a second.

the stupid fools that think what the community is begging for is a move from <7 to >1 million overnight, and who shout out "gigabytes by midnight", are the small minded people that never actually think rationally or logically.
i really find it funny that the onchain scalers are not screaming for millions per second. but instead steady progress over time.. and its actually the offchain people who dont understand, dont trust or have been paid to give up on blockchains that are screaming for millions per second and gigabytes by midnight as the only options.

I feel that people just find a lot of applications for Satoshi's data structure beyond payments. If it has to support all the possible applications, then millions / second is needed.

the true community are begging for progressive onchain rises over logical time. not huge onchain leaps or else needing to abandon blockchain utility for some locked fed reserve network of swapping accounts of promissory notes

what we need to do is not let bitcoin get stifled so much that it ends up as a 0 tx a second and then just 5 mega factory TX's every 10 minutes where the LN factories swap their reserves with other factories onchain.. (bank reserve swaps) yea the devs want bitcoin to be the backend of the LN network. where reserves are swapped.. yep thats right they want bitcoin as a bankers reserve currency and where LN is the bankers payment system for their clients(channel users)

I am of the understanding that the entire vision of Satoshi is to build a payment system which can't be inflated simply by some government deciding to issue more currency, or by banks / governments making bad decisions with respect to effective governance. LN is an awesome payment transfer network and I don't see how it can go bad, because you still retain complete control over the transfer other than paying the fees. The world has always been value based (NO free lunch). I really don't see how LN, or any payment channel, or an off-chain scaling solution deviates from the vision of Satoshi or of hardcore Bitcoin folks like you.

anyway we need to preserve onchain utility for normal people. that involves going back to basics. and then expand as we go too

so lets start by making tx's lean. to get to a possible 4200tx per mb. then with 4mb being allowed by the devs. lets open the legacy limit so the full 4mb can be used by such 16,800tx per 4mb.
16,800 per block = over 2 million a day
(oh look the 2million users a day in YOUR 2million ideal scenario)

I would like to offer a different perspective to your view. Money was invented to keep track and to motivate people to pursue things based on their self-interest, add value to society and to reward people equivalently based on their effort. People found volatility in money (because of more printing due to government-schemes gone bad) and started investing in items which are a hedge against inflation - like real estate, gold, etc. which adjust to inflation as the market is based on supply and demand.

What do you think about also assisting with decentralization of land registers so that it adds more fuel to the entire vision / ideology? In that scenario, we will need more and more transactions.


so using actual stats from visa of spending per day. 2million users would only need 4mb blocks of normal lean tx's where legacy transactions can utilise the full 4mb

also to note, 2 million LN users is actually an average of each user needing 5 channels so thats 10million tx's to open and 10million to close.
which means people have to preplan and organise funds for about a 10 day lockin.


There are ~7 billion people on this planet. If you want mass adoption, then you need to provide everyone an equal opportunity to be on the system. Each of these people will want to transact to consume food as a bare minimum. That's 7 billion transactions per day needed already, assuming that we wish to realize Satoshi's vision.

I agree with the locked amounts causing an entry barrier. The solution I am proposing doesn't require everyone to have locked transfers.

...

and before you rebut that blockchains cant scale.. that was like the deluded script of kodak saying digital photography cant scale because of floppy disk data limits. (look how that turned out)
if you want to doubt the limits of data and the internet. please go tell skype that HD video calls dont work or go complain to EAgames that online gaming cant work. or speak to youtube, twitch, and other livestream sites that supposedly cant work


Ok, I understand your analogy. And I understand the importance of keeping an open mind. Here are my views on the same.

Let's break down your analogies:

1) the deluded script of Kodak saying digital photography cant scale because of floppy disk data limits. (look how that turned out)

So, the above issue was resolved because compact discs were invented, mobile phone cameras came into the picture, and so on. I agree that technology expands all the time.


2) please go tell skype that HD video calls dont work or go complain to EAgames that online gaming cant work.

These companies were simply present at the right place at the right time. They came in with their products after internet latency was reasonably down. The businesses that came earlier when internet latency was high simply went out of business.

It matters whether the chicken came first OR the egg.

You have to understand, deep down, the masses don't care about the technology, only about its application - otherwise it would be the scientists who would be the wealthiest people in the world. While wealth is not the defining trait of success, in my opinion, I think the real winners in the world are scientists.

Now, let's think more about the time-chain technology.

I am going to focus more on block-chain / time-chain rather than bitcoin itself.

The trust of the entire system comes from people reaching consensus by expending energy on proof of work.

So, the improvements we need for a natural progression are:

1) Internet latency decreases significantly so that propagation of blocks becomes near instant.
2) Storage / computing power increases. (This is relatively easier.)

One can choose to wait for these changes and focus on existing systems (like Kodak did) or one can take a different direction / approach into actually changing things.

Besides:

Remember, money was invented so that people can trade effectively without wasting time on negotiations, and to build an integrative society where people can focus on their strengths instead of having to understand everything in order to trade fairly.

Imagine everyone in the world wasting 10 minutes waiting to transfer money to someone. That's a lot of productivity, and hence value, lost from the ecosystem. I believe that's the end goal of any scaling solution: to make faster transfers available.


or do the rational thing, and realise technology expands all the time and things progress as time goes on.
but whatever you do dont even try to say blockchains are broken and the only way forward is locking funds into channels(accounts) that need someone else to sign for a payment and needs every middleman and the destination to be online to accept payment.. as that is the same AND worse system than what bankers already offer.


I agree that technology expands. But that begins from people like us having conversations like the one we are having here and someone gets an idea, a vision from that and builds upon it. I am happy to be reasoning with you here.

With that being said, I don't agree with your statement that LN is worse than bankers. Satoshi's / decentralization's vision is simply to make things transparent and decentralized. Banks are still a closed system; LN is NOT. And bitcoin, or most cryptocurrencies for that matter, can easily be audited, so nobody really has an edge. If there are LN nodes doing transfers, that's mainly because it's a free market and they don't need to secure a license by going through a lot of compliance and due diligence checks, and it's still in line with the whole vision / idea.


Title: Re: What are the steps to take to publish the solution for blockchain scaling?
Post by: aliashraf on September 16, 2018, 09:53:15 AM

As for tx/s throughput, I would say something like 100 tx/s suffices for the near future and is achievable with a few improvements, definitely including
1- a block time decrease
2- using schnorr signatures
for mid term we need decentralization by getting rid of pools to be prepared for sharding.

We should never ever trust  any second layer protocol/solution for scaling or other serious issues which bitcoin is designed for. Putting utilities on top of bitcoin is not bad, actually it would be very helpful as long as such utilities are not designed to compete with bitcoin and alienate people from the core system.

1. a blocktime decrease doesnt help. it actually affects many other things and has too many impacts on other things to outweigh a pro:con balance to make it plausible.
its far better to stay at 4mb per 10 min than do 1mb for 2.5min
A block time decrease outweighs a block size increase in many aspects, including but not limited to reducing mining variance and helping with decentralization.

Quote
2. schnorr sigs are useful for multisigs because they compact multiple parties' signatures to show as just 1 signature, hiding who signs it.
Schnorr signatures are more compact and easier to verify, plus they have inherent support for multisig. This scheme is mathematically proven safe.

Quote
6. as for sharding. well thats just 2nd layer again. sharding is just LN factories of many tx's and multiple parties when you rub away a lot of the buzzwording. sharding is from a banking analogy just like regional bank branches .. id say its better than reducing blocktime. but at the same time its then formalising the infrastructure into what banks do by forcing certain peoples activities being routed to a specific small pool of nodes
Disagree. Sharding is not second layer. Every shard is a public consensus-based blockchain and is secured by accumulative work. The thing is, we don't have a mature sharding proposal in hand, and we are not ready, because the centralization of mining makes it absolutely insecure for sharding to be operational. Right now, with just one shard (the main chain), we live on the edge of centralization; with multiple shards it would be a nightmare.

Quote
after all its far easier to deposit $90 into a starbucks legacy address and they just give you an in-app balance of $90 (30 coffees) than to do all the convoluted effort of making 5 LN locked channels funded with $18 per channel for good 'connectivity'/chance of success ...
Right to the point. Recurring payments are among the least important challenges for bitcoin to get mass adopted.


-------------------------------------------------------------------------------------------------------------------------------------------------

You are right about Bitcoin; Bitcoin in its present form may do just fine with 100 tx / second. My intention is to build a new network that could provide a different utility for people than bitcoin.

I feel that people just find a lot of applications for Satoshi's data structure beyond payments. If it has to support all the possible applications, then millions / second is needed.
A universal machine has an inherent problem: undecidability. We all remember the disastrous Ethereum DAO hack.

To be more focused, I suggest taking into consideration that we are discussing this in the context of a consensus-based decentralized system. Bitcoin is an instance of such a system, the first instance actually, with a specific application segment: monetary systems.

I have always been a believer in the decentralization of social structures, but after bitcoin I started to understand that freedom does not happen simultaneously in every single aspect of social life; we need to start from somewhere and patiently wait for the consequences and the domino effect. We should start from money.

The greatest achievement of Satoshi was not PoW and bitcoin; identifying monetary systems as the most strategic and geopolitical battlefield was. Technology comes after vision. As a bitcoiner I believe in Satoshi's vision, which implies both the feasibility and the importance of the decentralization of money.

I appreciate your courage and passion for a more general and bigger application domain, but I think money is not just money; we are talking about the focal point of almost every socio-economic process in modern society.