Bitcoin Forum
6021  Alternate cryptocurrencies / Altcoin Discussion / Re: Can Bitcoin maximalism be defeated in the reality of adoption? on: June 10, 2016, 05:47:22 PM
Unlike Armstrong, I don't believe the masses can ever be saved. Never has that been the case; e.g., the French Revolution ended in totalitarianism under Napoleon. Nature purges the masses, as with the Black Death that killed a majority of Europe's population. I believe the powers-that-be will win control over the mainstream monetary system, and they and the masses will decline together in a morass. I hold my hope in the bifurcation of the economy into a dying Industrial Age (which the powers-that-be and the masses control) and a fledgling Knowledge Age controlled by the hackers and knowledge creators.

I understand Moldbug's "Only One Currency Can Win" which Satoshi also apparently validated.

And I have explained how I think a bifurcated economy could violate Gresham's Law and allow for two monetary units to coexist globally analogous to how national currencies coexist due to a Coasian barrier of individuals not trading directly internationally.

The one unit will be controlled by the banksters and the masses, and this will be the 666 slavery system. The other crypto-currency unit will be for those who hold their "unit-of-account" in "units-of-knowledge". You see, as a hacker you don't care about money as a store-of-value, power-law-distribution enslavement paradigm. We care about code and the freedom (power) to build (software, 3D printing designs, etc.). The models of remuneration employed thus far for "open source" have depended mostly on large corporate funding. This has locked us hackers into the Theory of the Firm rigor mortis paradigm, but remember knowledge can't be enslaved nor financed monetarily. What we really need is direct remuneration via micropayments.

Note the banksters need this alternative monetary unit too. For they need a place to hide their wealth from their own morass, and because if they squelch all knowledge formation, they will destroy their cancer's host and perish. Precious metals are no longer even functional in the competition against knowledge.

Do understand that everyone who is creative and likes to build things is a hacker.

I believe I know what needs to be done in terms of an altcoin. But getting it done in a timely manner may be more than one man can do.
6022  Bitcoin / Bitcoin Discussion / Re: Bitcoin IS basically DESTROYED on: June 10, 2016, 05:39:30 PM
6023  Bitcoin / Bitcoin Discussion / Re: A Question for Bitcoin Maximalist on: June 10, 2016, 05:35:42 PM
6024  Alternate cryptocurrencies / Altcoin Discussion / Re: The Ethereum Paradox on: June 10, 2016, 02:15:52 PM
The ASIC can choose to use DRAM instead and amortize the power consumption over the row buffer.

Here's a possible concrete design of what you are hinting at:

The ASIC will have as many (pipelined) siphash circuits as are needed to max out the memory
bandwidth of all the DRAM you're gonna hook up to it.
For each row on each memory bank, the ASIC has a queue of in-row counter-indices.
Each entry in the queue thus represents a "stalled thread".
The siphash circuits fill up these queues at a mostly uniform rate.
Once a queue fills up it is flushed to one of the memory controllers on the ASIC that
will perform the corresponding atomic updates on the memory row.

This looks pretty efficient, but all those queues together actually take up a lot of memory,
which the ASIC needs to access randomly.

So it seems we've just recreated the problem we were trying to solve.
But we can aim for the total queue memory to be maybe 1% of the total DRAM.

There's a lot of parameters here and many different ways to optimize with
combinations of speed, energy cost, and hardware cost...
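As a sanity check, the per-row queue design quoted above can be simulated in a few lines. All parameters below (row count, queue capacity, access count) are illustrative assumptions, not real DRAM geometry; the point is only to see how queueing amortizes row-buffer activations:

```python
import random

# Illustrative parameters (assumptions, not real DRAM geometry)
NUM_ROWS = 2 ** 10      # memory rows
QUEUE_CAP = 32          # counter-indices queued per row before a flush
NUM_ACCESSES = 10 ** 6  # random accesses generated by the siphash circuits

random.seed(1)
queues = [0] * NUM_ROWS
coalesced_activations = 0

for _ in range(NUM_ACCESSES):
    row = random.randrange(NUM_ROWS)
    queues[row] += 1
    if queues[row] == QUEUE_CAP:
        # one row-buffer activation services QUEUE_CAP atomic updates
        coalesced_activations += 1
        queues[row] = 0

# flush whatever remains queued at the end of the run
coalesced_activations += sum(1 for q in queues if q)

naive_activations = NUM_ACCESSES  # worst case: one activation per access
print("naive row activations:    ", naive_activations)
print("coalesced row activations:", coalesced_activations)
print("amortization factor: %.1f" % (naive_activations / coalesced_activations))
```

The amortization factor approaches QUEUE_CAP, minus a small overhead for the partially-filled queues flushed at the end, which is exactly why the queue memory itself becomes the new bottleneck discussed below.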

Right, we can't queue anywhere near 2^14 counter indices per row, else nothing is gained. Good point.

If the compute units are much more efficient and faster than we can fully leverage within the memory bandwidth and optimal counter-index queue limits, we could also consider enumerating multiple nonces and discarding those that don't match a subset of page rows, i.e. processing the cuckoo table in chunks over multiple passes. That decreases the effective memory size for each pass. The edge pruning would need to be done after all passes. Can't the pruning be done without random access?

Basically what I am driving at is that we can't rely on the row-buffer load power cost being greater than the cost of only 1 hash. We'd need wider margins to be very sure of ASIC resistance, such as 1024 row-buffer loads per hash. Wink

tromp you are a smart guy, so please don't go publishing my idea before I get to market. You might be able to deduce what I am up to based on the prior paragraph.

You'll receive proper credit as you deserve it. This is your invention.

Don't ruin your future fame by deducing and publishing what I am up to prematurely, given the liberal hints I have given you, and thereby ruin my ability to leverage first-mover advantage in the market. If some shitcoin takes the idea and runs with it, the impact will be diluted. Please, for the love of God, don't start having delusions that Blockstream is your friend. Do you enjoy being used?
6025  Alternate cryptocurrencies / Altcoin Discussion / Re: The DAO FAIL on: June 10, 2016, 12:50:11 PM
Yet, to keep The DAO from capsizing, Tual said he thinks members need to accomplish three objectives – rounding out its list of curators, amending its governance model to see after the concerns of voting members and investing in sound business ideas.

Bribe the curators.

This is just the corruption of government we are all fighting to remove with block chains, and then we bring it right back again and place this $150+ million UNREGULATED honeypot to attract the flies.

Since it is unregulated, the criminal mindsets will be drawn to this like flies to honey.

It is just a matter of time before The DAO is in political bickering and probably worse with the power vacuum being always a "winner take all" phenomenon.
6026  Alternate cryptocurrencies / Altcoin Discussion / Re: The DAO FAIL on: June 10, 2016, 12:40:34 PM
I don't consider DAO as fail, give it few more months it will surely come back stronger and better after all known issues have been fixed.

The game theory issues appear to be fundamental. Meaning they cannot be fixed.

It is analogous to wanting to build a skyscraper to the moon. We can't do it, no matter how good our technology is.

Some things are truly impossible.
6027  Alternate cryptocurrencies / Altcoin Discussion / Re: The Ethereum Paradox on: June 10, 2016, 12:30:00 PM
You need millions to find 2^12 of them wanting to access a single row.
With 2^15 threads you expect 1 access per row.
Correct on 2^15 but not on the conclusion.

With 2^27 you expect 2^12 accesses per row. That's hundreds of millions.

That would give you 2^12 hashes per row-buffer load. That isn't necessary to break ASIC resistance. 2^18 (256k threads) gives 8 hashes per row-buffer load and might be enough to make it computation bound, up to perhaps a 100-to-1 improvement in power consumption on the ASIC.

Or in any case, 2^20 (a million threads) gives 32 hashes per row-buffer load.

And remember my use case is 2^20 counters, not 2^29, so in my use case the ASIC only needs thousands of threads to make it computation bound up to the limits of the maximum efficiency advantage of the ASIC.
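For reference, here is the arithmetic behind those thread counts, using the parameters assumed in this thread (2^29 two-bit counters, 2^14 counters per DRAM page/row):

```python
COUNTERS = 2 ** 29           # two-bit counters in the large use case
COUNTERS_PER_ROW = 2 ** 14   # counters per DRAM page/row
ROWS = COUNTERS // COUNTERS_PER_ROW  # 2^15 = 32768 rows

for threads in (2 ** 15, 2 ** 16, 2 ** 18, 2 ** 20):
    # expected hashes batched per row-buffer load, assuming
    # uniformly random counter indices across rows
    hashes_per_row_load = threads / ROWS
    print(f"{threads:>8} threads -> {hashes_per_row_load:g} hashes per row-buffer load")
```

This reproduces the figures in the discussion: 2^15 threads gives 1 hash per row-buffer load, 2^18 gives 8, and 2^20 gives 32.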

A CUDA gpu can apparently do 671 million threads

Not in hardware. It can only run a few thousand in hardware (simultaneously).
So the 671 have to run in thousands of batches, one after the other.

Yeah, I also stated in the prior post that the GPU isn't likely optimized enough to gain a major advantage. The ASIC is the big concern.

I don't think it is safe to assume that ASICs can't be designed to support millions of very efficient threads for this very customized computation

Your ASIC would require an insanely huge die and be orders of magnitude more expensive than the DRAM it tries to optimize access to.

Certainly not in my use case of 2^20 counters.

And in the 2^29 counters use case, please remember I said that you don't need millions of compute units. The compute units can be shared by the non-stalled threads, and even these can be pipelined and queued. So you don't need millions of compute-unit circuits; only the logical threads number in the millions.

and again sharing the computation transistors amongst only the threads that aren't stalled, thus not needing millions of instances of the compute units.

How would you know what to stall without doing its computation first??

That is irrelevant. Think it out. It is a statistical phenomenon. Not all the computations are performed monolithically, followed by all the syncs. These stages happen in parallel across 2^8 memory banks (row buffers), so we can have many threads stalled yet some not. And even the computations can be pipelined and queued. Let's not conflate the power consumption goal with the speed goal.

It is very very difficult to defeat the ASIC. I think I know how to do it though.

Also remember that for instant transactions I need a very fast proving time, which means at 2^20 counters it could be trivially parallelized, losing ASIC resistance with only thousands of threads.

For such small instances, latency is not that relevant as it all fits in SRAM cache.

Latency is a speed-bound concern. Our concern herein is being power-consumption bound. That is what I am trying to get you to focus on; these are orthogonal issues.

The ASIC can choose to use DRAM instead and amortize the power consumption over the row buffer. So I am not yet sure that would give the ASIC an advantage. Although I've read that SRAM is 10X faster with 10X the power consumption (thus neutral relative to DRAM), I am wondering, if the ASIC can coalesce (sync) all the reads in a DRAM row buffer, whether the DRAM might have a significant power advantage, thus rendering it computation bound for power consumption.

Do you have any idea? I was planning to study the equations for memory power consumption in the reference I cited upthread.

Note I think on Android it may be possible to turn off the cache.
6028  Alternate cryptocurrencies / Altcoin Discussion / Re: Valid uses cases for Smart Contracts, Dapps, and DAOs? on: June 10, 2016, 12:01:52 PM
I see the DAO as a game of who can compromise it first and cash out the funds to themselves.

Yup, that is what it appears to be. But can you fathom what this is going to do to all the bagholders and to ETH?

I got your point that maybe there is some game theory to how long you stay vested. You like Russian Roulette I guess.

The SEC will be coming... this is the event that will start the SEC's involvement in actively regulating CC. Thanks to Vitalik and Tual.


I don't consider DAO as fail, give it few more months it will surely come back stronger and better after all known issues have been fixed.

The game theory issues appear to be fundamental. Meaning they cannot be fixed.

It is analogous to wanting to build a skyscraper to the moon. We can't do it, no matter how good our technology is.

Some things are truly impossible.

Yet, to keep The DAO from capsizing, Tual said he thinks members need to accomplish three objectives – rounding out its list of curators, amending its governance model to see after the concerns of voting members and investing in sound business ideas.

Bribe the curators.

This is just the corruption of government we are all fighting to remove with block chains, and then we bring it right back again and place this $150+ million UNREGULATED honeypot to attract the flies.

Since it is unregulated, the criminal mindsets will be drawn to this like flies to honey.

It is just a matter of time before The DAO is in political bickering and probably worse with the power vacuum being always a "winner take all" phenomenon.
6029  Alternate cryptocurrencies / Altcoin Discussion / Re: Valid uses cases for Smart Contracts, Dapps, and DAOs? on: June 10, 2016, 11:53:56 AM
it is hype for scam purpose only

In the case of DAO, Slock.it, and Augur, then that seems to be the case. There is no valid use case which isn't game theory broken for those. Expect The DAO to eventually collapse in a massive clusterfuck of theft and waste with most losing their money. Wise people would get the hell out of The DAO as fast as they can, because the DAO is broken in the sense that you can be jammed from exiting.

Decentralized crowd funding might be viable.
6030  Alternate cryptocurrencies / Altcoin Discussion / Re: Will Iota reach Billion Marketcap? That's $1 per token on: June 10, 2016, 11:50:43 AM
Originally I couldn't envision many valid use cases for this new craze building on top of more powerful block chain scripting and the block chain as a generalized state transition database, i.e. the block chain + scripting as a Turing-complete machine.

...

So with those technological ground rules in place, please help me to enumerate valid uses cases for this new craze.

...

3. Internet-Of-Things (IoT). This is like Slock.it, where we want the contract to control some external devices, such as a paywall for a parking meter. This seems to be a very weak use case, because there is really no advantage gained here over simply sending payment to the parking meter API. There are no gains from the oversight of recording the data on a block chain, because there is no way for either party to prove whether the service was delivered, other than each voting on it; thus they cancel each other's vote out, and either they both agree or there is no quorum. IoT is more about block chain performance of instant micropayments, low transaction fees, and cloud databases, but not about decentralized consensus on state transitions.
6031  Alternate cryptocurrencies / Altcoin Discussion / Re: The DAO FAIL on: June 10, 2016, 11:41:59 AM
5. Decentralized Autonomous Organizations (DAO). Technically the idea that investors buy a colored coin which enables them to vote on how the funds (denominated in which every token was exchanged for the colored tokens) are spent, and to sell this colored coin at-will on the market. Issues:

  • Game theory appears to be insolubly broken in that "No" votes are more expensive/risky than selling your vestment. There doesn't appear to be any remedy, because even holding up funds for a grace period doesn't stop the run on the price after a "Yes" on a stupid proposal. However, I thought of a possible mitigation: limit the value of proposals that can be voted on simultaneously, so that no bad outcome can drastically impact the price. But this doesn't mitigate the game theory that there is an incentive for those who want to steal the funds to buy up the colored tokens so they can influence the outcome of the votes, and those who see it has been infiltrated have more incentive to just sell than to fight; thus the infiltrators get to buy the tokens at cheaper and cheaper prices, and yet the funds they control do not diminish in value. The game theory seems insolubly broken.

    Eric S. Raymond wrote about the Iron Law of Political Economics, and it is always a power vacuum. When pooling funds, the game theory is a mess.
  • Organization is the antithesis of decentralization. Business projects require cohesion and continuity with fluidity of decision making. There is no way to make this into a decentralized structure which doesn't destroy the essence of efficiency of production. Production is highly interactive and collaborative. The time lag in the communication overload of the Mythical Man-Month can render a project into gridlock, oscillating between competing options. Top-down voting is top-down governance, which is the antithesis of decentralized production. Decentralized version control (DVCS) open source solves this discord to obtain resonance by allowing every participant to have their own perspective on changesets. The DAO is entirely the wrong model for decentralized production. It fights against everything we learned with decentralized open source development, which is that the individual should be empowered to act independently.
  • Note I could envision tracking investment, decentralizing the trading of the shares (colored tokens), voting on a board, and distribution of dividends. But this voting on each proposal as a flat democracy does not work.


it is hype for scam purpose only

In the case of DAO, Slock.it, and Augur, then that seems to be the case. There is no valid use case which isn't game theory broken for those. Expect The DAO to eventually collapse in a massive clusterfuck of theft and waste with most losing their money. Wise people would get the hell out of The DAO as fast as they can, because the DAO is broken in the sense that you can be jammed from exiting.

Decentralized crowd funding might be viable.


I see the DAO as a game of who can compromise it first and cash out the funds to themselves.

Yup, that is what it appears to be. But can you fathom what this is going to do to all the bagholders and to ETH?

I got your point that maybe there is some game theory to how long you stay vested. You like Russian Roulette I guess.

The SEC will be coming... this is the event that will start the SEC's involvement in actively regulating CC. Thanks to Vitalik and Tual.
6032  Alternate cryptocurrencies / Altcoin Discussion / Re: Hoskinson's interview on Lisk on: June 10, 2016, 11:35:30 AM
Charles please contribute to this thread, "Valid uses cases for Smart Contracts, Dapps, and DAOs?".
6033  Alternate cryptocurrencies / Altcoin Discussion / Valid uses cases for Smart Contracts, Dapps, and DAOs? on: June 10, 2016, 11:32:51 AM
Forum please help me.

Originally I couldn't envision many valid use cases for this new craze building on top of more powerful block chain scripting and the block chain as a generalized state transition database, i.e. the block chain + scripting as a Turing-complete machine.

And that was to a large extent because data feeds on external events break consensus algorithms. But then I realized that external events can be voted on.

Any contract that requires an external data feed breaks the ability to have a valid consensus algorithm. So this limits us to scripting which refers to data that is already on the block chain, i.e. the only thing that a block chain can validate are state transitions from an initial set of data, i.e. the genesis block.

So this basically limits what block chains can do to financial contracts that involve how value is transferred over time, plus voting. Our contract logic can't refer to events that occur external to the block chain, except by voting. So external data feeds (events external to the block chain) can only be accommodated as voters: if one reporter of the event is authorized, then it is a 1-of-1 quorum; if there are 5 reporters of an external event, then say our contract requires 3-of-5 to agree on the report; and of course the contract has to have logic for what to do in the case that a quorum on an external event can't be achieved.
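That reporter-quorum rule can be sketched as plain logic (the function name and example values are hypothetical, not any real smart-contract API):

```python
from collections import Counter

def quorum_outcome(reports, m):
    """Return the reported value agreed on by at least m reporters,
    or None if no quorum is reached (the contract's fallback branch)."""
    if not reports:
        return None
    value, votes = Counter(reports).most_common(1)[0]
    return value if votes >= m else None

# 3-of-5: quorum reached on "rain"
assert quorum_outcome(["rain", "rain", "rain", "sun", "sun"], m=3) == "rain"
# 3-of-5: split 2/2/1 -> no quorum, contract must handle this case
assert quorum_outcome(["rain", "rain", "sun", "sun", "hail"], m=3) is None
# 1-of-1: a single authorized reporter is trivially a quorum
assert quorum_outcome(["rain"], m=1) == "rain"
```

The last branch (returning None) is the point emphasized above: the contract must always carry explicit logic for the no-quorum case.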

So with those technological ground rules in place, please help me to enumerate valid uses cases for this new craze.

1. Decentralized crowd funding. This definitely seems to be a valid use case. The funds can be refunded if minimum threshold is not met in time. The funds can be distributed based on milestones which are voted on by the crowd funders.  All these parameters can be preset when the contract is formed.
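The preset refund/threshold rule can be sketched as follows (names and numbers are hypothetical; a real contract would also gate later milestone tranches behind crowd-funder votes):

```python
def settle(pledges, threshold, deadline, now):
    """Crowd-fund settlement: refund all pledges if the minimum threshold
    was not met by the deadline, otherwise release the pooled funds.
    All parameters are fixed when the contract is formed."""
    total = sum(pledges.values())
    if now >= deadline and total < threshold:
        return ("refund", dict(pledges))  # each backer gets their pledge back
    return ("fund", total)                # funds released; milestones would gate tranches

pledges = {"alice": 40, "bob": 25}
print(settle(pledges, threshold=100, deadline=5, now=10))  # ('refund', {'alice': 40, 'bob': 25})
print(settle(pledges, threshold=50, deadline=5, now=10))   # ('fund', 65)
```

Because the rule refers only to data already on the chain (pledge amounts, block time), it needs no external data feed, which is why this use case survives the ground rules above.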

2. Legal contracts. This only works if the parties can agree on who will vote on the external events that the contract enforces. And the parties could still go to court after the fact if any party felt the contract was not executed faithfully. The restitution from any court decision would come in the form of compelling the parties to do something which the smart contract could no longer enforce. I don't discuss the option of giving courts master keys to override past contract outcomes, as this is a "can of worms" in many facets.

3. Internet-Of-Things (IoT). This is like Slock.it, where we want the contract to control some external devices, such as a paywall for a parking meter. This seems to be a very weak use case, because there is really no advantage gained here over simply sending payment to the parking meter API. There are no gains from the oversight of recording the data on a block chain, because there is no way for either party to prove whether the service was delivered, other than each voting on it; thus they cancel each other's vote out, and either they both agree or there is no quorum. IoT is more about block chain performance of instant micropayments, low transaction fees, and cloud databases, but not about decentralized consensus on state transitions.

4. Prediction Markets. In markets such as Augur and Gnosis, the participants in the bet (or all the participants on the network) vote on the external events. This seems to have some seriously bad game theory concerns. Refer to the discussion of game theory issues for DAOs as a hint.

5. Decentralized Autonomous Organizations (DAO). Technically the idea that investors buy a colored coin which enables them to vote on how the funds (denominated in which every token was exchanged for the colored tokens) are spent, and to sell this colored coin at-will on the market. Issues:

  • Game theory appears to be insolubly broken in that "No" votes are more expensive/risky than selling your vestment. There doesn't appear to be any remedy, because even holding up funds for a grace period doesn't stop the run on the price after a "Yes" on a stupid proposal. However, I thought of a possible mitigation: limit the value of proposals that can be voted on simultaneously, so that no bad outcome can drastically impact the price. But this doesn't mitigate the game theory that there is an incentive for those who want to steal the funds to buy up the colored tokens so they can influence the outcome of the votes, and those who see it has been infiltrated have more incentive to just sell than to fight; thus the infiltrators get to buy the tokens at cheaper and cheaper prices, and yet the funds they control do not diminish in value. The game theory seems insolubly broken.

    Eric S. Raymond wrote about the Iron Law of Political Economics, and it is always a power vacuum. When pooling funds, the game theory is a mess.
  • Organization is the antithesis of decentralization. Business projects require cohesion and continuity with fluidity of decision making. There is no way to make this into a decentralized structure which doesn't destroy the essence of efficiency of production. Production is highly interactive and collaborative. The time lag in the communication overload of the Mythical Man-Month can render a project into gridlock, oscillating between competing options. Top-down voting is top-down governance, which is the antithesis of decentralized production. Decentralized version control (DVCS) open source solves this discord to obtain resonance by allowing every participant to have their own perspective on changesets. The DAO is entirely the wrong model for decentralized production. It fights against everything we learned with decentralized open source development, which is that the individual should be empowered to act independently.
  • Note I could envision tracking investment, decentralizing the trading of the shares (colored tokens), voting on a board, and distribution of dividends. But this voting on each proposal as a flat democracy does not work. Then again, that would model a centralized corporation and thus be subject to investment-securities regulation.
6034  Alternate cryptocurrencies / Altcoin Discussion / Re: The Ethereum Paradox on: June 10, 2016, 10:20:05 AM
I am just arguing why not be safer than sorry when so much is at stake? Why would you recommend adding unnecessary risk even where you think you are omniscient and there is no risk?

I only need good statistical properties of the hash function, not cryptographic security.

Pseudorandomness ("good statistical properties") is tied into the cryptographic security:

Quote from: Blake2 paper
BLAKE2 aims to provide the highest security level, be it in terms of classical notions as (second) preimage or collision resistance, or of theoretical notions as pseudorandomness (a.k.a. indistinguishability) or indifferentiability.

Relevant terms are 'indistinguishability' and 'indifferentiability'.

Siphash is sufficient for the use case it was designed for because the attacker doesn't know the input to the hash function. Siphash has not been cryptanalyzed for your application of it in Cuckoo Cycle, wherein the attacker knows both the input and the output of the hash function.

Note that every submission that Dan Bernstein has made for a cryptographic hash has been rejected due to breakage. He appears to not excel in that one area. His ARX rounds (from the ChaCha symmetric cipher) have been redeployed in Blake2 by those who know how to make them cryptographically secure for the different application of a hash function. I admire Dan a lot. He knows a hell of a lot more than me about everything involving higher maths and cryptography.

In fact siphash-2-4 is already overkill for my purposes, with siphash-4-8 being close to cryptographically secure. Your attack is about as conceivable to me as P being equal to NP, in which case there is no cryptographically secure hash function anyway.

Fact? Based on what published cryptanalysis?

What I am saying is that the edge trimming eliminates most of the nodes from consideration and then you have a hash table representing a sparse array for the remainder of the nodes which aren't the pruned/trimmed leaf edges.

Yes, totally correct. But in my view the term "bucket" denotes a fixed size unordered container, and my current algorithms use no such thing.

Agreed, 'bucket' would not be an ordered array element. That is a more precise use of the term. We programmers who started hacking in the 1970s and 1980s may be a bit looser (employing the general English definition) with our terminology, using 'bucket' to refer to an array element, as compared to someone who came up predominantly in an academic setting, especially someone for whom technical English is not their native language and who would likely be stricter about definitions. I stand corrected on the more precise terminology.

Well my wink is about using maximum nonce counts, i.e. edge fraction M/N > 1/2.

My algorithms break down in that case, since the expected number of cycles becomes non-constant, causing the basic algorithm to overlook an increasing fraction of them, and edge-trimming fails altogether to remove a large fraction of edges.

Exactly. Wink In a few months, you'll know what I was referring to when it is published.

My calculation was 2^15, that isn't a million.

You need millions to find 2^12 of them wanting to access a single row.

With 2^15 threads you expect 1 access per row.

Correct on 2^15 but not on the conclusion. We both agree the number of memory banks is irrelevant for this concern. Memory banks only determine how many row buffers are active simultaneously, which impacts maximum speed and whether we are memory bandwidth or memory latency bound on speed of execution.

So for the power consumption issue, if we have 2^29 counters (2 bits each) and 2^14 counters per memory page/row, then 2^16 (65536) h/w threads means we can stall threads (perhaps not permanently, so as to account for variance outliers) and be statistically assured of roughly 2 accesses per row. That already doubles the hash computation per memory page row.

Thus we don't need millions of h/w threads to convert the algorithm to computation bound on power consumption, thus making it not ASIC resistant.
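That statistical claim is easy to check with a balls-in-bins simulation (illustrative only; uniformly random counter indices are assumed):

```python
import random

ROWS = 2 ** 15     # 2^29 two-bit counters / 2^14 counters per row
THREADS = 2 ** 16  # h/w threads, each targeting one uniformly random counter

random.seed(7)
hits = [0] * ROWS
for _ in range(THREADS):
    hits[random.randrange(ROWS)] += 1

mean = sum(hits) / ROWS                          # exactly THREADS / ROWS = 2.0
batched = sum(1 for h in hits if h >= 2) / ROWS  # rows where accesses can be coalesced
print("mean accesses per row:", mean)
print("fraction of rows with >= 2 batchable accesses: %.2f" % batched)
```

The per-row hit counts are approximately Poisson with mean 2, so roughly 59% of rows receive two or more accesses that a stalling scheme could batch into one row-buffer load.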

At 2^18 (262144) h/w threads we have 8 times more computation per memory page row than the CPU (and the computation will be orders-of-magnitude more power efficient on the ASIC).

A CUDA gpu can apparently do 671 million threads, although these are probably not synced on memory row buffers as we would need here (the statistical spread might be sufficient without syncing). I think if you had the Kill-A-Watt meter, what you would have observed is that as threads increase, the speed remains topped out at memory bandwidth, but the power consumption of the GPU decreases once threads are increased beyond a certain threshold (assuming GPU threads are very efficient, which might not be the case).

I don't think it is safe to assume that ASICs can't be designed to support millions of very efficient threads for this very customized computation, again sharing the computation transistors amongst only the threads that aren't stalled, thus not needing millions of instances of the compute units. The GPU may already be doing this, although maybe not, since it is probably optimized for performance rather than power consumption where there is a choice. Once it has hit the memory bandwidth bound, one might hope the GPU would minimize power consumption by maximizing row-buffer coalescing, but this is perhaps too costly to implement in the general case of a GPU (whereas I am arguing it is not in the case of an ASIC with a specific custom computation circuit). Apparently GPU memory coalescing is not powerful enough to do what we require.

Also remember that for instant transactions I need a very fast proving time, which means at 2^20 counters it could be trivially parallelized, losing ASIC resistance with only thousands of threads.

Actually, how ironic that the 2-bit counters, instead of the original basic algorithm, make the ASIC resistance worse, because more counters fit in a row. Remember I wrote this about "bit array" in my 2013 rough draft paper on memory-hard proof-of-work.
6035  Alternate cryptocurrencies / Announcements (Altcoins) / Re: IOTA - Unmoderated thread on: June 10, 2016, 08:30:11 AM
You ought to be sterilized.

Oh my. Ted Turner, Bill Gates eugenics.

You don't just have slimy ethics; you are a sociopath.
6036  Bitcoin / Bitcoin Discussion / Re: Bitcoin IS basically DESTROYED on: June 10, 2016, 12:46:51 AM
As AnonyMint predicted back in 2013...


The centralized Bitcoin won't be 51% attacked because those controlling it will have the 666 control system they designed Bitcoin to accomplish. In the transitionary phase now, the Chinese miners will be handed lots of wealth as the process of centralizing mining proceeds. We can't say every Chinese miner today knows he is part of the ultimate plan. We can't even say the Blockstream devs know they are part of some diabolical plan. They are just trying to fix a design that can't be fixed without restarting from scratch. Compartmentalization is the modus operandi of the DEEP STATE. This is a process. The DEEP STATE that designed Bitcoin has a plan spanning years.

Our other hope is that the system blows up technically. But that is why Blockstream is receiving so much funding: they probably have the expertise to centralize Bitcoin sufficiently while still giving some illusion of decentralization for long enough that Bitcoin maximalists fall into the trap, of which SegWit is a major step in that direction.

https://bitconnect.co/bitcoin-news/126/what-is-gavin-andresen-telling-china-chinese-ama-details-revealed

The one-time anointed king of Bitcoin by Satoshi Nakamoto himself ... When asked about his association with Blockstream, who some see as a force looking to centralize Bitcoin, he did not seem to be in alignment with their agenda...

In regards to the centralization of Bitcoin, he now works for the Media Lab at MIT, who has just come out with a controversial concept known as ChainAnchor... ChainAnchor would coerce miners to not allow transactions that do not have the identities of the users of Bitcoins tied to their transactions and wallets, defeating the peer-to-peer, identity-protecting foundation of Bitcoin itself. Andresen also is a paid technical advisor to leading Bitcoin companies like Coinbase, BitPay, and Xapo.

His commentary on the upcoming Lightning Network concept was less than glowing...
6037  Alternate cryptocurrencies / Announcements (Altcoins) / Re: IOTA - Unmoderated thread on: June 10, 2016, 12:41:21 AM
altcoinUK, you are a pitiful, lame loser curmudgeon, and now you elect your silver-back ape corrupt Blockstream leader, but sorry...

As for that uber talent, the young gmaxwell destroyed you in 5 sentences.

AnonyMint has destroyed his logic errors numerous times, such as the following one where he can't do basic math, and AnonyMint will continue to:

Doing so would also increase the overhead for the format by 20% or so. As mentioned, accurate indexes are not small-- and many things compromise by just not providing accurate indexes; which then leaves applications linearly scanning or not permitting sample accurate seeking.

I assume the 20% estimate is only for when the optional index is present. So it is presumed someone would use an index only when that 20% was justified by their use case. Again I argue you should not remove degrees-of-freedom and hinder the optimization of use cases which you did not envision because no group or person is omniscient.

And how is not having the index any worse than not allowing an index? I fail to see the logic. You seem to be arguing that the receiving end will expect indexes and not be prepared for the case where indexes are not present. But that is a bug in the receiving end's software then. And in that case, there is no assurance that software would have done the index-less seeking more efficiently under the status quo of not allowing an index. None of this makes sense to me.

Also, I don't understand how you calculate a 20% increase in file size for adding an index. For example, let's take an average 180-second song consuming roughly 5 MB with VBR encoding. Assume my users are satisfied with seeking in 1-second increments, so I need at most 180 indices of 22 bits each, which is only 495 bytes, a mere 0.01% increase! On top of that, I could even compress those 22-bit indices into relative offsets if I want to shrink it by roughly 75%, to 0.0025%.
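The arithmetic above can be checked with a quick sketch. All figures are the example's own assumed values (a 180-second song, a ~5 MB VBR file, 22-bit byte offsets, one seek point per second), not measurements:

```python
# Index-overhead arithmetic from the example above; every constant is the
# post's assumed value, not a measured one.
SONG_SECONDS = 180        # one seek index per second of audio
FILE_BYTES = 5_000_000    # ~5 MB VBR-encoded song
BITS_PER_INDEX = 22       # byte-offset width assumed in the example

index_bytes = SONG_SECONDS * BITS_PER_INDEX / 8   # 3960 bits = 495.0 bytes
overhead_pct = 100 * index_bytes / FILE_BYTES     # ~0.0099%

print(index_bytes, round(overhead_pct, 4))
```

That is roughly the 0.01% figure claimed, three orders of magnitude below the quoted 20%, which suggests the 20% estimate applies to some denser index layout than one seek point per second.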

Seems you are excluding use cases.

What use case is excluded?

I have alluded to scenarios in this post. Permute them. And there are use cases that neither of us are aware of. We are not omniscient. We should not top-down remove degrees-of-freedom. Some of the people designing Web standards these days are doing it wrong and are not of the same pedigree as Vint Cerf. I butted heads with them in the past (including Ian Hickson!), and I am tired of arguing with you guys. Just do whatever you want; I am going to route around the failure.
6038  Alternate cryptocurrencies / Altcoin Discussion / Re: Can Bitcoin maximalism be defeated in the reality of adoption? on: June 10, 2016, 12:31:34 AM
6039  Economy / Economics / Re: Economic Totalitarianism on: June 10, 2016, 12:29:37 AM
6040  Bitcoin / Bitcoin Discussion / Re: A Question for Bitcoin Maximalist on: June 10, 2016, 12:27:20 AM