gigabyted
January 21, 2019, 11:17:49 PM
PoS cannot be decentralized and cannot be secure. You will never know whether an opponent has taken control of all or part of the nodes; all nodes can be compromised or hacked. It will provide no greater security than plain PoW; in fact, it will provide less.
I believe it's easier to hack two or three pools in order to control a network over PoW (and often one pool has more than 50% of it, so there's a single target to hit) than to hack all the nodes of a network. Pools also run more code than a node, so the attack surface is greater.

A network cannot be controlled "over PoW" by hacking pools, because the hashing power is in the hands of the miners, not the pools themselves. I don't know how a PoS network can be secure, or how a PoS-based coin can guarantee privacy, because PoS is inherently centralized. Maybe privacy is not intended for Zano?

Not sure how much you have mined, but miners provide power to pools, and the pool runs the node, so it can double spend and do all kinds of things with that node; as long as the PoW algo is the same, the miners don't know much about what the pool is really doing... Not surprised that everyone is worried about pool owners who control a large chunk of the Bitcoin network...

Entusiasta
January 22, 2019, 02:33:55 PM
PoS cannot be decentralized and cannot be secure. You will never know whether an opponent has taken control of all or part of the nodes; all nodes can be compromised or hacked. It will provide no greater security than plain PoW; in fact, it will provide less.
I believe it's easier to hack two or three pools in order to control a network over PoW (and often one pool has more than 50% of it, so there's a single target to hit) than to hack all the nodes of a network. Pools also run more code than a node, so the attack surface is greater.

A network cannot be controlled "over PoW" by hacking pools, because the hashing power is in the hands of the miners, not the pools themselves. I don't know how a PoS network can be secure, or how a PoS-based coin can guarantee privacy, because PoS is inherently centralized. Maybe privacy is not intended for Zano?

Not sure how much you have mined, but miners provide power to pools, and the pool runs the node, so it can double spend and do all kinds of things with that node; as long as the PoW algo is the same, the miners don't know much about what the pool is really doing... Not surprised that everyone is worried about pool owners who control a large chunk of the Bitcoin network...

If a pool double spends, it will lose all its miners, so it cannot "control a network over PoW"; yes, it can do all kinds of things with that node, but not with the network. With PoS you will never know whether a malicious actor has sabotaged all or part of the nodes. There is no solution to a 51% PoW attack, because PoW is the only solution to the Byzantine Generals Problem. The cryptocurrency community must accept the risk of a 51% PoW attack, unless it wishes to renounce decentralization and privacy.

gigabyted
January 23, 2019, 02:24:04 PM
PoS cannot be decentralized and cannot be secure. You will never know whether an opponent has taken control of all or part of the nodes; all nodes can be compromised or hacked. It will provide no greater security than plain PoW; in fact, it will provide less.
I believe it's easier to hack two or three pools in order to control a network over PoW (and often one pool has more than 50% of it, so there's a single target to hit) than to hack all the nodes of a network. Pools also run more code than a node, so the attack surface is greater.

A network cannot be controlled "over PoW" by hacking pools, because the hashing power is in the hands of the miners, not the pools themselves. I don't know how a PoS network can be secure, or how a PoS-based coin can guarantee privacy, because PoS is inherently centralized. Maybe privacy is not intended for Zano?

Not sure how much you have mined, but miners provide power to pools, and the pool runs the node, so it can double spend and do all kinds of things with that node; as long as the PoW algo is the same, the miners don't know much about what the pool is really doing... Not surprised that everyone is worried about pool owners who control a large chunk of the Bitcoin network...

If a pool double spends, it will lose all its miners, so it cannot "control a network over PoW"; yes, it can do all kinds of things with that node, but not with the network. With PoS you will never know whether a malicious actor has sabotaged all or part of the nodes. There is no solution to a 51% PoW attack, because PoW is the only solution to the Byzantine Generals Problem. The cryptocurrency community must accept the risk of a 51% PoW attack, unless it wishes to renounce decentralization and privacy.

With Zano, if I'm not mistaken, you would need 51% of the hashrate and 51% of all coins to attack it, but don't take my word for it; check with the team for confirmation!

hashappliance
January 23, 2019, 06:10:38 PM
PoS can not be decentralized and can not be secure. You will never know if an opponent has taken control of all or part of the nodes, all nodes can be compromised or hacked. There will be no greater security than just the PoW, there will actually be less.
I believe it's easier to hack two or three pools in order to control a network over PoW (and often one pool has more than 50% of it, so there's a single target to hit) than to hack all the nodes of a network. Pools also run more code than a node, so the attack surface is greater.

A network cannot be controlled "over PoW" by hacking pools, because the hashing power is in the hands of the miners, not the pools themselves. I don't know how a PoS network can be secure, or how a PoS-based coin can guarantee privacy, because PoS is inherently centralized. Maybe privacy is not intended for Zano?

From our whitepaper (you can download the full whitepaper here: https://zano.org/):

1.2 Zano's PoS Implementation

Ring signatures allow the transaction creator to provide a set of possible public keys for signature verification, thus keeping their identity indistinguishable from other users. The concept of an anonymous, secure PoS mechanism seemed to be unachievable. The basis of PoS is a so-called kernel, a data structure that depends on the transaction output and includes:
• Current timestamp with 15-second granularity.
• Key image, which corresponds to each transaction output. A key image is comprised of 32 pseudorandom bytes derived from the key in such a way that it is impossible to reconstruct the key given only its image.
• Stake modifier. An additional 64 pseudorandom bytes derived from the last PoS and PoW blocks, which rules out any predictability of the stake modifier in the blocks ahead.
During mining, a user is allowed to sign a block if Hash(kernel) < CoinAmount * PosTarget, where CoinAmount is the amount of coins in a particular output and PosTarget is an adaptive parameter that works to keep the block creation rate constant. The PoS miner iterates the timestamp (within the allowed boundaries) for each of their unspent outputs (UTXOs) and checks whether they possess a UTXO whose key image satisfies the PosTarget formula above.
In the event of a "winning" result, they spend this particular output, anonymously, with a ring signature. Note: the signature includes the key image (used in the kernel), but not the key itself, which is why it does not compromise anonymity. The miner signs both the transaction and the block and broadcasts the new block to the network.
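As a rough sketch of the kernel check described in the excerpt, the following uses SHA-256 as a stand-in hash and an illustrative serialization; the function names and field layout are assumptions, not Zano's actual implementation:

```python
import hashlib
import struct

POS_TIMESTAMP_STEP = 15  # 15-second timestamp granularity, per the whitepaper

def kernel_hash(timestamp, key_image, stake_modifier):
    """Hash the PoS kernel: timestamp (15s steps) + 32-byte key image
    + 64-byte stake modifier. Serialization here is illustrative."""
    assert timestamp % POS_TIMESTAMP_STEP == 0
    data = struct.pack("<Q", timestamp) + key_image + stake_modifier
    return int.from_bytes(hashlib.sha256(data).digest(), "little")

def is_winning_stake(timestamp, key_image, stake_modifier,
                     coin_amount, pos_target):
    """The whitepaper's condition: Hash(kernel) < CoinAmount * PosTarget,
    so larger outputs win proportionally more often."""
    return kernel_hash(timestamp, key_image, stake_modifier) < coin_amount * pos_target
```

A PoS miner would call `is_winning_stake` for each unspent output, iterating the timestamp in 15-second steps within the allowed boundaries.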

cryptobtcnut
January 24, 2019, 11:47:35 AM
What would be the size of the Zano blockchain if it contained all BTC transactions? BTC is around 200 GB at the moment; how much does Zano improve on this aspect?
How many transactions per second is Zano capable of?

crypjunkie
January 24, 2019, 08:52:35 PM
What would be the size of the Zano blockchain if it contained all BTC transactions? BTC is around 200 GB at the moment; how much does Zano improve on this aspect?
How many transactions per second is Zano capable of?
My personal estimate is roughly 80% less than the BTC blockchain size thanks to blockchain pruning, and TPS will be orders of magnitude higher than any other CryptoNote coin at present.

crypto_zoidberg (OP)
January 25, 2019, 10:16:06 PM (last edit: April 23, 2019, 01:30:07 AM by mprep)
Hi folks!
A few days ago we launched the first public testnet; we put the new PoW hash there (WildKeccak2, alpha version) and made a bunch of changes. We still have tons of things to do, but we decided to open it up for those who want to take a look.
https://medium.com/@zano_project/zano-testnet-c92723087d09
Feel free to write bug reports on github: https://github.com/zano-project/zano/issues
PS: This is a testnet, so don't go crazy: please don't do a lot of mining and don't raise the difficulty. This testnet will be killed, and we'll run a bunch of further testnets before we're ready for the production mainnet.
What would be the size of the Zano blockchain if it contained all BTC transactions? BTC is around 200 GB at the moment; how much does Zano improve on this aspect?
How many transactions per second is Zano capable of?
This question is actually very important. As soon as we have a stable testnet and are finished with the urgent improvements, we'll run stress tests on the testnet and see exactly how it behaves under high load. Performance is one of our priorities!
Oh, and I almost forgot: we finally have a brief history of the Boolberry project for those who have not followed it since 2014: https://medium.com/@BoolberryBBR/a-brief-history-of-boolberry-c4048d692272 Enjoy!

PS: Just to clarify, the Zano project is derived from the Boolberry codebase.

cryptobtcnut
January 27, 2019, 02:28:12 PM
What would be the size of the Zano blockchain if it contained all BTC transactions? BTC is around 200 GB at the moment; how much does Zano improve on this aspect?
How many transactions per second is Zano capable of?
This question is actually very important. As soon as we have a stable testnet and are finished with the urgent improvements, we'll run stress tests on the testnet and see exactly how it behaves under high load. Performance is one of our priorities!

Do let us know the results.

onecryptoguy
January 27, 2019, 07:57:34 PM
For a non-technical reader, what are the practical differences between the Mimblewimble projects (Grin, Beam) and what Zano brings to the table, especially regarding the two main aspects: privacy and blockchain size? I mean from a user's point of view...

Bowtiesarecool
January 28, 2019, 10:14:44 AM
For a non-technical reader, what are the practical differences between the Mimblewimble projects (Grin, Beam) and what Zano brings to the table, especially regarding the two main aspects: privacy and blockchain size? I mean from a user's point of view...
Spot on, mate. I've been waiting for a "tell-me-like-I'm-5" rundown of the differences between these MW implementations. Wouldn't mind having it delivered as an infographic or a comparison table; handy for quick consultation and sharing.

gigabyted
January 29, 2019, 02:08:48 AM
For a non-technical reader, what are the practical differences between the Mimblewimble projects (Grin, Beam) and what Zano brings to the table, especially regarding the two main aspects: privacy and blockchain size? I mean from a user's point of view...
Spot on, mate. I've been waiting for a "tell-me-like-I'm-5" rundown of the differences between these MW implementations. Wouldn't mind having it delivered as an infographic or a comparison table; handy for quick consultation and sharing.

I guess the team will come up with a better and more official answer. But from what I've heard, MW is mostly a layer-2 technology that can be added to pretty much any layer-1 tech out there, so it's not really comparable the way two different layer-1 projects would be. I guess the team will come out with a more official statement, because this question has been asked a lot lately (and with good reason).

Newton90
February 01, 2019, 03:26:59 PM
Any updates on the launch month/date?

crypjunkie
February 01, 2019, 10:07:18 PM
Any updates on the launch month/date?
No date yet; it depends on testnet performance. Best estimate is before the end of February.

Bowtiesarecool
February 04, 2019, 07:36:25 PM
For a non-technical reader, what are the practical differences between the Mimblewimble projects (Grin, Beam) and what Zano brings to the table, especially regarding the two main aspects: privacy and blockchain size? I mean from a user's point of view...
Spot on, mate. I've been waiting for a "tell-me-like-I'm-5" rundown of the differences between these MW implementations. Wouldn't mind having it delivered as an infographic or a comparison table; handy for quick consultation and sharing.

I guess the team will come up with a better and more official answer. But from what I've heard, MW is mostly a layer-2 technology that can be added to pretty much any layer-1 tech out there, so it's not really comparable the way two different layer-1 projects would be. I guess the team will come out with a more official statement, because this question has been asked a lot lately (and with good reason).

Thanks, mate! That suddenly puts a lot of things into perspective and completely alters the way I've been looking at some MW blockchain branding. I almost feel like a dupe for falling for some of it. Looking forward to further info from the team, though. Seems like one just can't stop learning in this industry.

crypto_zoidberg (OP)
February 07, 2019, 09:19:00 PM
Hi folks!
I've been doing some research on the difficulty adjustment function and want to share the results and ideas. Analysis and criticism are very welcome.
Motivation. Some miners were using a typical "greedy mining" strategy: abruptly raise hashrate, keep mining until the difficulty adjusts, then drop all hashrate, leaving the network stuck at one block per hour or even worse. For miners/pools this strategy is profitable because it lets them get "cheap" blocks (mined at low difficulty), but it is definitely a problem for the cryptocurrency: in such a situation transaction confirmations take hours, and honest miners quit because mining at high difficulty becomes unprofitable. I want to remind everyone that Boolberry suffered from this in its early days, and I decided to use this historical data as a "real-life" example for testing the new difficulty adjustment.
Analysis. 1. I reviewed Boolberry's block history and picked a representative period where hashrate was raised and dropped in this typical manner. Then, using the difficulty associated with the blocks and their timestamps, I derived an estimated hashrate. Here is what I got:
That was the first week of June 2014, and as seen in the marked areas, it was a hard time for the network: blocks were coming once per hour, and I had to ask loyal miners to mine some blocks even though it was unprofitable. OK, drama over.
Then, using these hashrate numbers over time, it became possible to simulate the estimated block flow with timestamps and the progression of difficulty values, and this simulation was run for a bunch of different variations of the adjustment function. This graph shows the original CryptoNote adjustment function (the blue one), a couple of variations with an "adjustment window", and the currently preferred version (the green one):
It is calculated as a combination of the original adjustment function over two different periods:

D = (D720(historical_data) + D200(historical_data)) / 2

where:
• D720 is the function that calculates difficulty based on a 720-block window with 60/60 cuts
• D200 is the function that calculates difficulty based on a 200-block window with 5/5 cuts
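As a rough illustration of the combined formula, here is a sketch assuming a CryptoNote-style window function; the 120-second block target, the function names, and the exact cut handling are my assumptions, not the actual implementation:

```python
TARGET_SECONDS = 120  # assumed block target time

def cn_difficulty(timestamps, cum_difficulties, window, cut):
    """CryptoNote-style difficulty over the last `window` blocks,
    ignoring `cut` outlier timestamps at each end of the sorted window."""
    ts = sorted(timestamps[-window:])
    cd = cum_difficulties[-window:]
    n = len(ts)
    if n < 2:
        return 1
    if n > 2 * cut + 2:
        begin, end = cut, n - cut  # drop `cut` outliers per side
    else:
        begin, end = 0, n
    timespan = max(ts[end - 1] - ts[begin], 1)
    work = cd[end - 1] - cd[begin]  # cumulative work done over the span
    return max(work * TARGET_SECONDS // timespan, 1)

def combined_difficulty(timestamps, cum_difficulties):
    """D = (D720 + D200) / 2, per the formula above."""
    d720 = cn_difficulty(timestamps, cum_difficulties, 720, 60)
    d200 = cn_difficulty(timestamps, cum_difficulties, 200, 5)
    return (d720 + d200) // 2
```

The D200 term reacts quickly to hashrate swings, while averaging it with D720 damps the effect of any single manipulated timestamp.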
Conclusions. As seen in the curve, this adjustment algorithm adapts definitely better thanks to the correction from D200, while at the same time it remains protected from timestamp manipulation due to the heavy influence of D720.
This function is still a draft; we are going to run more simulations to check for possible fluctuations and performance problems. The simulation is pretty naive and doesn't reproduce fully realistic conditions, but it gives a pretty clear picture of the adjustment function's behaviour.
Would love to see some feedback and criticism.

onecryptoguy
February 07, 2019, 09:38:29 PM
I'm not an expert, just a simple miner, and what I see seems much better... the difficulty adjustment responds faster to hashrate changes up and down the curve, so this is very promising. Good job, CZ!

versprichnix
February 08, 2019, 12:34:15 AM
Small instances should have an advantage: for example, a miner with 50% of the hashrate would have a disadvantage of 50%, a miner with 90% of all hashrate a disadvantage of 90%, and a miner with 1% of all hashpower a disadvantage of only 1%. Very small miners do not have any disadvantage in this case.

crypto_zoidberg (OP)
February 08, 2019, 01:38:35 AM
Small instances should have an advantage: for example, a miner with 50% of the hashrate would have a disadvantage of 50%, a miner with 90% of all hashrate a disadvantage of 90%, and a miner with 1% of all hashpower a disadvantage of only 1%. Very small miners do not have any disadvantage in this case.
Can you clarify what you mean by "advantage" here? What I am trying to do is protect relatively small miners against big miners who can pursue these greedy mining policies. But actually it's not about big or small miners; it's about fast and stable adaptation to conditions, I guess.

MoneroCrusher
February 08, 2019, 05:52:04 AM
I have an idea that I would like to see implemented in Monero, because it has the same problem, just on a smaller percentage scale (10-15% since May 2018).
Try a weighted average instead of a normal average. In the example of 720 blocks:
• take the first 50 blocks and give them a weight of 50%
• take blocks 51-100 and give them a weight of 25%
• take blocks 101-360 and give them a weight of 12.5%
• take blocks 361-720 and give them a weight of 12.5%
720 blocks isn't necessary, though; cut it down to a smaller sample and keep the same ratios, i.e. you could use 360 blocks instead of 720, or even 180.
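The weighted-window idea could be sketched like this; the function name is mine, and I'm assuming "first 50 blocks" means the 50 most recent ones (newest first in the list):

```python
def weighted_window_average(values):
    """Weighted average over the last 720 values, assuming values[0]
    is the newest block. Bucket weights follow the proposal above:
    50% / 25% / 12.5% / 12.5% for progressively older ranges."""
    assert len(values) >= 720
    v = values[:720]
    buckets = [
        (0, 50, 0.50),     # newest 50 blocks: half the total weight
        (50, 100, 0.25),   # blocks 51-100
        (100, 360, 0.125), # blocks 101-360
        (360, 720, 0.125), # blocks 361-720
    ]
    total = 0.0
    for start, end, weight in buckets:
        chunk = v[start:end]
        total += weight * (sum(chunk) / len(chunk))
    return total
```

With equal bucket sizes this would just be a plain mean; the point is that a hashrate spike in the newest 50 blocks moves the result much faster, while the older 620 blocks still anchor it.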