Bitcoin Forum
  Show Posts
1  Other / Meta / Impeachment: Is Greg Maxwell the best choice for being a mod in bitcointalk? on: November 01, 2019, 07:20:08 AM
I'm not starting this to attack Gregory Maxwell, on the contrary, it is about praising him.

Gregory Maxwell is a techno/political icon and a legend. He has every right to be biased for or against any single topic in the bitcoin ecosystem; actually, he should be biased. Otherwise, who is in charge of taking care of hypes, FUD, and scams?

My point is, such a figure doesn't need any authority in this forum to do his job as a think tank, and this forum doesn't need a biased moderator on the other side.

I understand; bitcoin is money, and money was born with blood on its hands. But believe it or not, bitcoin needs to evolve, and nobody is in charge of its evolution path. It is not Ethereum: there is no Foundation, no Vitalik, nor any stupid roadmap in bitcoin, because it is not a project. It needs space and opportunity for divergent ideas and out-of-the-box thinking.

I don't want to go into the details and put forward how Greg's biased point of view is affecting his job as a forum moderator; it would be absolutely unnecessary. It is not about this or that piece of evidence supporting or refuting my concerns; it is about a general situation we are dealing with: a conflict of interest.

Hereby, I officially ask Gregory Maxwell to step down from his moderation positions in this forum.

Edit: I'm not and won't be campaigning for this to happen; please don't terrorize me or try to make me quit. I have no plan to argue about what I said and won't answer stupid attacks by shills.
2  Bitcoin / Development & Technical Discussion / Bitcoin adoption: A technical challenge on: October 19, 2019, 01:34:26 PM
Hi all,

I'm not a bitcoin whale, nor even someone who has bought a few coins and is praying for the price to skyrocket. On the contrary, I've exhausted all my coins and savings to keep myself focused full time on bitcoin as my research field, instead of continuing what I'd been doing for years as an ordinary software engineer and programmer. Not that ordinary in the latter field, though.

So, from my point of view, bitcoin adoption is not an urgent personal requirement at all. Actually, I'm totally satisfied with what bitcoin has done up to now: a great codebase, excellent discussions, a decade of 24/7 mission-critical operation with almost no interruptions and absolutely no failure, ... bitcoin is amazing from a technical perspective.

But before being a software guy, I'm a human being and an activist. I want peace, justice, equal opportunity, and prosperity for mankind, and health, safety, and preservation of species for the planet. Actually, bitcoin became my main technical concern because of its superiority in ethical aspects. It was the first field of commercial activity I ever found to be coherent with my spare-time occupation as an activist; it was why I quit my job and stuck with bitcoin, a lifetime decision.

So, aliashraf the idealist wants adoption to happen, while aliashraf the dev doesn't care that much about it. Why should he? Technical curiosity? Come on, there are zillions of technical problems out there to tinker with, in cryptocurrency and other IT fields, let alone physics and cosmology.

A bitcoin whale/hodler/investor faces a dilemma: on one hand, he needs bitcoin to be mass-adopted, because that is what can eventually make the price skyrocket; on the other hand, there are a lot of contradictions that discourage him. Most importantly, whales don't like change or putting their assets at the risk of hype and tension. That is how the real world works: people are ready to invest a tiny fraction of their savings in a promising technology, but once it pays off and they get rich, it is time for conservatism. So natural.

People in the bitcoin ecosystem are hybrids of devs, investors, activists, ordinary users, ... When we ask about agendas and objectives, it comes down to the ingredients: how much of each factor is present in the anatomy of each person who is somehow active in bitcoin?

But no matter who you are and what your priorities are, bitcoin mass adoption is a goal you need to respect eventually, because it is not just about expansion but about survival: systems either grow or collapse, and without mass adoption bitcoin will fade out.

Among the many things one could suggest for this to happen, technical challenges are the most important. Bitcoin is technology, after all, isn't it? So, how is it possible to have bitcoin adopted by billions of people when it is faced with centralization and scaling challenges? Surely it is not possible, and let's not take the "world reserve currency" claims as serious rhetoric; they are not, ask a BS economics graduate.

My proposition in this topic is as follows:
Bitcoin mass-adoption is subject to technical developments that should and can happen simultaneously in three critical fields: Decentralization, Scaling and Privacy.

I strongly denounce Buterin's claim about the existence of a so-called trilemma, which implies that it is impossible, or (as he has recently retreated to) very hard, to achieve such a state.

I am aware of the popularity of Buterin's trilemma among some bitcoiners, core devs, and LN believers who are naysayers to ambitious improvement ideas because of the balance of the above-mentioned ingredients in their blood.
All I have to say to these folks is that Buterin himself is retreating from his claim, not only officially but also by advocating in favor of Serenity and Eth 2.0!

During my research I have, for a long time, mainly focused on decentralization, targeting both ASICs and pools as evils and postponing the scaling and privacy problems. To be clear, I don't believe in security as an independent problem; it is rather a spin-off of centralization scenarios. I formulated some ideas and proposals for both ASICs and pools. Specifically, I made a thorough analysis of the pooling-pressure flaw in bitcoin and proposed an alternative to the winner-takes-all approach of bitcoin-like PoW systems: a collaborative Proof of Work method.

Thereafter I began to realize that the centralization and scaling issues in bitcoin are by no means subject to a trade-off, unlike the poisonous ideas behind Vitalik Buterin's trilemma, and, more interestingly, that they are not essentially and radically two different problems: both can be understood as consequences of the same (now, let's say) flaw: winner-takes-all.

My latest work is focused on a comprehensive solution to this problem, and I think it is in a good state, almost ready to publish; I'll share it with the bitcoin community ASAP. For the time being, I'm just wondering:

1) How important do you think such a project is?

2) Who is ready to jump in by dedicating actual resources to support/participate in a project tackling centralization, scaling, and privacy at the same time without any trade-offs, i.e without sacrificing one in favor of the other two?

Please note:
I'm talking about bitcoin; my agenda involves no forks and no altcoins, so please don't waste your valuable time debating how bad a fork is or how disappointing an alt would be.

3  Bitcoin / Development & Technical Discussion / minisketch: txids vs wtxids on: October 07, 2019, 01:45:45 PM
I made a mistake and posted a reply in @CarltonBanks's thread about minisketch. It is self-moderated, and you know Carlton: a troll who fakes being a troll hunter.

Wtxids are not used anywhere (so it shouldn't be pre-computed already) and they are more expensive to compute,
Sure they are, they're required to tell two different transactions apart.
With all due respect, I completely disagree. Two different wtxids do not necessarily represent two different txns, but two different txids definitely do.

When txs are only identified by txids I can take a valid transaction mutate its witness to make it invalid (or just too low a feerate), and it'll have the same txid, so if you fetch by txid you can't avoid fetching the same junk multiple times.
Why should you? Because you are an adversary? Then, as an adversary, couldn't you produce multiple witness datasets for the same tx? Aren't we back to the transaction-malleability era?

My point is that wtxids are vulnerable to txn malleability, and I see no reason to use them in minisketch or any new proposal.

To be more specific: I think that even in the bootstrap process we could have segwit witness data pruned, once the containing block is buried under enough blocks.
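To make the distinction in the quoted exchange concrete, here is a minimal Python sketch of the BIP 141 hashing rule, using toy field bytes that are NOT a consensus-valid transaction: the txid hashes only the non-witness serialization, so mutating the witness changes the wtxid but leaves the txid untouched, which is exactly the malleability surface being debated.

```python
import hashlib

def dsha256(data: bytes) -> bytes:
    return hashlib.sha256(hashlib.sha256(data).digest()).digest()

def txid(version, inputs, outputs, locktime):
    # BIP 141: txid commits only to the non-witness serialization
    return dsha256(version + inputs + outputs + locktime)[::-1].hex()

def wtxid(version, inputs, outputs, witness, locktime):
    # wtxid additionally commits to the segwit marker, flag, and witness data
    return dsha256(version + b"\x00\x01" + inputs + outputs + witness + locktime)[::-1].hex()

# Toy field bytes, for illustration only (not a spendable transaction)
version  = bytes.fromhex("02000000")
inputs   = bytes.fromhex("01" + "11" * 36 + "00" + "ffffffff")
outputs  = bytes.fromhex("01" + "00" * 8 + "00")
locktime = bytes.fromhex("00000000")

witness_a = bytes.fromhex("01016a")   # two different witness stacks
witness_b = bytes.fromhex("01016b")   # attached to the "same" transaction

# The txid is identical regardless of the witness; the wtxids differ
print(wtxid(version, inputs, outputs, witness_a, locktime) ==
      wtxid(version, inputs, outputs, witness_b, locktime))   # False
```

This is why fetching by txid can be tricked into refetching the same junk, while wtxids uniquely identify each witness variant.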

4  Other / Off-topic / GitHub is shitty, why not a decentralized solution? on: July 27, 2019, 12:32:09 AM
I've just received this e-mail from Github:

Due to U.S. trade controls law restrictions, your GitHub account has been restricted.
For individual accounts, you may have limited access to free GitHub public repository
services for personal communications only. Please read about GitHub and Trade Controls at for more information
So, the people of Iran (like me), Cuba, Syria, North Korea, and Crimea (200 million people?) are subject to US Trade Controls as a whole, and accordingly they can't use GitHub.

Any comments?
5  Bitcoin / Development & Technical Discussion / bitcoin transaction: offline relay vs off-chain processing on: February 16, 2019, 12:48:02 PM
The following link is an article including a guide to something the author insists on calling offline bitcoin transactions; such schemes are trending nowadays:

It is a 5-minute read, and if you bother to take a look, you will easily grasp the point:

1- We have a full-node/SPV wallet with no outbound connection, just using the Blockstream Satellite service.

2- Somehow we decide to send a few coins to an address.

3- We generate a signed txn using our wallet; since we have no outbound connections, instead of trying to relay it we store it as raw text or a QR image.

4- We use a non-internet message relay system (like TxTenna or LoRaWAN), or we can simply pass our txn to the receiver herself, and get the txn relayed to the actual p2p network.

5- Once the transaction is included in the blockchain, our one-way satellite link with Blockstream lets our node be informed of the event, because the whole block is broadcast using a push protocol anyway.
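The five steps above can be sketched end to end in a few lines. The raw transaction hex below is a hypothetical placeholder, not a real signed transaction; a real wallet would produce a properly signed serialization in step 3.

```python
import hashlib

def txid_of(raw_tx_hex: str) -> str:
    # txid of a (pre-segwit) raw transaction: double SHA-256, displayed byte-reversed
    digest = hashlib.sha256(hashlib.sha256(bytes.fromhex(raw_tx_hex)).digest()).digest()
    return digest[::-1].hex()

# Step 3: the wallet signs and serializes; we keep only the hex text (or a QR of it)
raw_tx_hex = "0100000000010000000000"   # hypothetical placeholder bytes
pending_txid = txid_of(raw_tx_hex)

# Step 4: raw_tx_hex travels over any medium: TxTenna, LoRaWAN, even paper

# Step 5: each block arriving over the one-way satellite link is scanned for our txid
def is_confirmed(block_txids: set, txid: str) -> bool:
    return txid in block_txids

print(is_confirmed({pending_txid}, pending_txid))   # True
```

The point of the sketch is that the offline node never needs an outbound connection: it precomputes the txid it is waiting for and passively watches the pushed blocks.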

It would be easy to criticize this model for its dependency on the Blockstream Satellite service, but I don't like trivial takes when it comes to such schemes, and in this case it is more than obvious that our node/wallet is exclusively bound to a trusted source. So what?

The main advantage of this model is its censorship/surveillance resistance: we are not present on the internet, so they simply can't track or block our IP. That is good, but is it good enough?

No, it is not:
First of all, I think it is too much to call such a simple model "offline transactions"; it is just confusing. Using a non-IP relay network to send your transactions doesn't make bitcoin or its transaction processing offline. It is just an offline transaction relay model, nothing more, and as I said, that is not enough.

Alternatively, I'm working on a true offline technology that I prefer to call off-chain transaction processing, which I will discuss in more detail in later posts. For a start, I want to know how you guys think about this subject and perceive the term off-chain transaction processing. Heads up: I'm not talking about LN or sidechains; I mean bitcoin transactions being processed offline.

Again, it would be easy to denounce such a concept as paradoxical or ambiguous, but as I said, trivial takes are not helpful.
Saying things like "Hey, as long as we are speaking of bitcoin transactions, they are processed on-chain" wouldn't help; we are all aware of that.

To be a bit more specific: by off-chain bitcoin transaction processing I mean a technology that allows wallets to exchange bitcoin txns, just like paper money/bills, multiple times among multiple parties, without leaving any trace on the blockchain until one of the parties decides to deposit it.
6  Other / Meta / Improve draft feature, seriously! on: January 27, 2019, 07:54:55 AM
I just lost about a hundred lines of text, including a data sheet I was preparing for a new topic in the Development & Technical subforum, because of the naive way the draft feature is implemented.

I don't use other word processors when I'm posting here, no matter how long and sophisticated a post is or how much time and effort it takes to write. That puts me in danger of losing my content, and to mitigate it I regularly use the preview button, because it is supposed to save a draft of my work in progress. But when it comes to a sophisticated post that takes days to be ready, things get riskier, and you need to keep re-previewing your work because of this:
Drafts are saved whenever you preview or post a topic, post, or PM. Up to 100 drafts are kept. Drafts are deleted after 7 days.

Fair enough: 100 drafts is not bad, and automation is a great idea, but the implementation is a joke and put me in huge trouble. It turns out they mean it when they say whenever.

WHENEVER you push the post or preview buttons you get a draft, and if you push them several times on the same subject (a post or a PM) you get one more each time. It is really stupid: you can easily end up with 100 copies of the same thing saved as drafts, and because you can't have more than 100 drafts, you lose the drafts of your other works in progress.

That is what happened to me the other night. After a crash, I tried to recover my work using this feature and found that I had pushed the post/preview buttons too many times on a few other posts (multiple edits): I had a pile of redundant versions of the same posts and no draft of my actual work in progress, because the garbage this feature produced had buried what I actually needed.

I think some tweaks would be very helpful:
1- Don't keep multiple drafts per subject (post/PM); just keep the latest version.
2- Don't generate drafts for the post operation.
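The two tweaks above could look something like this: a hypothetical sketch (not the forum's actual code, and the class and method names are my own invention) of a draft store that keeps one draft per subject, lets the newest save replace the old one, and evicts the oldest subject only when the cap is exceeded.

```python
from collections import OrderedDict

class DraftStore:
    """One draft per subject: a new save replaces the previous draft for that
    subject, so repeated previews can never flood the store with duplicates."""

    def __init__(self, cap: int = 100):
        self.cap = cap
        self.drafts = OrderedDict()          # subject_id -> draft text

    def save_draft(self, subject_id: str, text: str) -> None:
        self.drafts.pop(subject_id, None)    # tweak 1: replace, never duplicate
        self.drafts[subject_id] = text       # re-insert so it becomes the newest
        while len(self.drafts) > self.cap:
            self.drafts.popitem(last=False)  # evict the least recently saved subject

    def recover(self, subject_id: str):
        return self.drafts.get(subject_id)

store = DraftStore(cap=100)
for i in range(150):
    store.save_draft("same-post", f"revision {i}")   # previewing repeatedly
store.save_draft("work-in-progress", "my 100-line data sheet")
print(store.recover("work-in-progress"))   # still there, not buried by duplicates
```

With this behavior, 150 previews of the same post occupy a single slot instead of wiping out every other draft in the store.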

7  Bitcoin / Development & Technical Discussion / Ethereum Anti-ASIC fork, is it the right time for bitcoin too? on: January 09, 2019, 01:08:04 PM
In governance, Ethereum is far more centralized than bitcoin: they have Vitalik as both a celebrity and a spiritual leader, and, believe it or not, they have a roadmap.

IMO, a cryptocurrency with a leader is not reliable in the first place, but when the leader turns out to be a PoS believer in charge of a PoW coin, things get even more confusing. I believe that Eth falling three times harder than bitcoin in 2018 has something to do with this fact.

Still, there is good news as well: Vitalik is growing up and stepping down. Well, not officially and completely, but there are signs. Most importantly, the latest Ethereum core dev meeting, on January 5, ended with a long-awaited, though tentative, admission: implementing ProgPoW as an anti-ASIC algorithm to retire Ethash. ProgPoW is designed to utilize GPU strengths such that it is almost impossible for ASIC manufacturers to build a considerably more efficient mining chip without ending up with a GPU design project.

It is an important event in cryptocurrency, and I think we will witness a new wave of debate and discussion in the bitcoin community regarding the situation with ASICs and the potential for an anti-ASIC fork.

8  Bitcoin / Bitcoin Discussion / USA blacklists 2 bitcoin addresses, threatens with secondary sanctions! on: November 29, 2018, 09:25:33 AM
I made this post in my other topic, The situation with Iran, but it seems to go beyond that scope, so I started this one:

Today, the US federal agency in charge of Iran sanctions, the Office of Foreign Assets Control (OFAC), for the first time in bitcoin history announced two bitcoin addresses associated with two Iranian individuals subject to secondary sanctions!


WASHINGTON – The U.S. Department of the Treasury’s Office of Foreign Assets Control (OFAC) took action today against two Iran-based individuals, Ali Khorashadizadeh and Mohammad Ghorbaniyan, who helped exchange digital currency (bitcoin) ransom payments into Iranian rial on behalf of Iranian malicious cyber actors involved with the SamSam ransomware scheme that targeted over 200 known victims.  Also today, OFAC identified two digital currency addresses associated with these two financial facilitators.  Over 7,000 transactions in bitcoin, worth millions of U.S. dollars, have processed through these two addresses - some of which involved SamSam ransomware derived bitcoin. In a related action, the U.S. Department of Justice today indicted two Iranian criminal actors for infecting numerous data networks with SamSam ransomware in the United States, United Kingdom, and Canada since 2015.  

“Treasury is targeting digital currency exchangers who have enabled Iranian cyber actors to profit from extorting digital ransom payments from their victims.  As Iran becomes increasingly isolated and desperate for access to U.S. dollars, it is vital that virtual currency exchanges, peer-to-peer exchangers, and other providers of digital currency services harden their networks against these illicit schemes,” said Treasury Under Secretary for Terrorism and Financial Intelligence Sigal Mandelker.  “We are publishing digital currency addresses to identify illicit actors operating in the digital currency space. Treasury will aggressively pursue Iran and other rogue regimes attempting to exploit digital currencies and weaknesses in cyber and AML/CFT safeguards to further their nefarious objectives.

Obviously, the two addresses belong to local Iranian bitcoin traders/exchanges that have been transacting bitcoin for years, and the accusation of being "involved" in processing ransomware-related addresses is just an excuse for weakening bitcoin in Iran by threatening people all around the world.

It is a declaration of war against bitcoin too. The dickheads in the Trump administration have no clue what they are talking about! How in the hell are bitcoiners supposed to avoid transacting with these two addresses?

And it is proof of what I've been saying about the shady zone getting narrower:
I guess, it is no more possible for bitcoiners to live in the gray zone, it is just fading out, the gray zone:

Trump needs even more centralization of power to run his version of Fascism; Putin has already centralized everything in Russia (again); and the Chinese have no clue what a non-centralized form of power could ever be.

In monetary systems, the AML/KYC discourse is getting more aggressive on a daily basis. It is about Orwell's 1984 rather than Satoshi Nakamoto's 2009. The true force behind bitcoin's fall is this trend, and the fact that bitcoiners are doing almost nothing about it other than sticking with their few coins and waiting for the dickheads in the regulatory agencies to show mercy and absorb them into the so-called "legal system".

As a smart con artist, Craig Faketoshi Wrong has understood the situation and made a decision: "Don't be shy Craig, you've never been, choose the wrong side as usual and figure out a way to keep talking in public."

The rest of bitcoiners? They are just shy!

Just a few days after the above post, they are blacklisting bitcoin addresses and threatening bitcoiners all around the world with their brutal secondary sanctions!

What?! You think they may be following my posts?

But seriously, isn't it the moment of truth for bitcoin?  
9  Bitcoin / Bitcoin Discussion / The situation with IRAN on: November 22, 2018, 06:19:19 PM
As a resident of Iran, I'm obviously biased toward my nation, the people whom I personally know and, well, mostly love. A great country it is, Iran: thousands of years of history, and a fantastic, sophisticated, and sweet language, FARSI (Persian), which has absorbed a lot of words and concepts from Arabic while retaining its own unique, well-formed grammar and rich vocabulary, and which has given birth to one of the most beautiful and important parts of humanity's literary heritage. I love this country and its nation, and I'm honored to be biased in defending its right to survive and to develop.

Obviously, the Trump administration in the US is a global disaster (ironic, isn't it? US citizens vote on our behalf and determine our fate!), but its worst political behavior is withdrawing from the Iran nuclear deal and reinstating the most brutal sanctions against Iran, with no excuse and no international support: sanctions described as "the strongest ever in history" by US officials, with, at their heart, a row of harsh restrictions against Iran's banking system, including its central bank, Bank Markazi.

Now here is the challenge: how could bitcoin help the people of Iran resist this dirty invasion? Isn't it the right time for bitcoiners to prove themselves true libertarians?

I mean, we have Faketoshi Wright, who represents everything against bitcoin, on one side; but who is representing bitcoin, true bitcoin, on the other side?
And what the hell are other bitcoiners doing?
Most of the prominent figures have disclosed their identities, are vulnerable to SEC/NSA prosecution, and have no choice other than playing coward, I suppose.

I didn't start this to discuss how much of an idiot Trump is or isn't, or to argue in favor of a fucked-up regime like the one we have in Iran.

I'm just asking how faithful and honest we are, and how we could help the people of Iran do fair "non-nuclear" trade: buying food, drugs, weed, civil airplane parts, ...

Believe it or not, the International Court of Justice recently ordered the US to lift the Iran sanctions, which Trump says he will not follow at all. Apparently, it is because America has recently become great again and does not care what the remaining 96% of human beings think or want!

10  Other / Meta / Save Bitcoin Discussion subforum on: November 22, 2018, 11:16:51 AM
Recently I started being more active on the Bitcoin Discussion subforum and became totally disappointed. There is a lot of noise over there: low-quality, redundant topics pop up every minute and bury each other under a pile of garbage. Isn't it the mods' job to take care of this issue?
11  Bitcoin / Bitcoin Discussion / Roger Ver vs Craig Wright, What is splitting the two? on: November 14, 2018, 07:21:08 PM
For various reasons I haven't managed to do enough research on the Faketoshi vs Ver debate, and it would be really appreciated if somebody could brief me on their theoretical divergence.

I'm already aware of parts of Wright's agenda: to make (or at least keep) bcash more government-friendly and persuade his victims to hand over their money without hesitation. What I don't exactly know is Ver's agenda.

Personally, I don't recognize bcash as bitcoin, and I definitely don't consider increasing the block size a serious scaling solution, but I believe there is always something to learn from debates in the crypto ecosystem generally, as it is possible to experience the same situations in bitcoin.

I have also been told that Gregory Maxwell has somehow intervened in this debate. I was just curious: what's going on?

12  Bitcoin / Bitcoin Discussion / SEC targets decentralized exchange developer warns others brutally. Lessons? on: November 13, 2018, 09:01:43 PM
The U.S. SEC has charged EtherDelta smart-contract developer Zachary Coburn and officially threatened the developers of decentralized exchange software.

It is why we have Satoshi missing, isn't it?

U.S. feds never deserved to be considered friends, but the Trump administration appears to be the worst enemy ever. They care about nothing, especially not the law, when it comes to expanding their authority. Seriously, what kind of reasonable government charges a programmer for writing open-source code?

Now, what would be the lessons?
13  Bitcoin / Bitcoin Discussion / Axiom of Resistance (Why Craig Wright is not Satoshi) on: November 10, 2018, 09:12:07 AM
In early May 2016, when Craig Wright claimed to be Satoshi and rejected most of the community members' demands for Satoshi's private keys, I argued somewhat in his favor. I don't believe in keys; keys are not our identities, they are certifications of our rights, nothing more. Losing or having access to a couple of keys won't change anything about who Satoshi is or is not. I like Gavin Andresen (personally) and I followed him; it was not a big deal, after all. Who cares about Satoshi's real identity?

Even in the past couple of years, though informed about Wright's suspicious behavior and moves in the ecosystem, I had not decided whether he was a hoax or Satoshi himself. Actually, I didn't follow the man at all.

Now I have encountered this article, Drugs, fraud, and murder by Craig Wright, and I'm fully convinced he is a hoax. Thank you, Craig; you are absolutely helpful in making an exemplary embarrassment out of your career.

In this article, besides repeatedly denouncing bitcoin and advertising for bcash, Craig Wright is crusading against:
... a group of misguided anarchistic socialists who refuse to work within the bounds of the law wanting to cry at the world and say, we do not want law, we want to say what the world is like. It is unfortunate that many grown men still act this way.

Other than its poor writing, this article shows a radical difference in philosophy and vision between the fake Satoshi and the original one:
>[Lengthy exposition of vulnerability of a systm to use-of-force
>monopolies ellided.]
>You will not find a solution to political problems in cryptography.

Yes, but we can win a major battle in the arms race and gain a new territory of freedom for several years.

Governments are good at cutting off the heads of a centrally controlled networks like Napster, but pure P2P networks like Gnutella and Tor seem to be holding their own.


I, personally, wouldn't care about bitcoin if it were not against state control.
The Libbitcoin guys have formalized this issue as the Axiom of Resistance. The word "axiom" is used intentionally, to prevent any further disputes. They simply ask whether or not you believe in the desirability and feasibility of resisting state control. Yes? You are a bitcoiner. No? You are not! Their words:
One who does not accept the axiom of resistance is contemplating an entirely different system than Bitcoin. If one assumes it is not possible for a system to resist state controls, conclusions do not make sense in the context of Bitcoin; just as conclusions in spherical geometry contradict Euclidean.

I didn't start this to renew an old hoax story. I'm curious how other bitcoiners think about this issue.

14  Bitcoin / Development & Technical Discussion / Is Bitcoin infrastructure too Chinese? What should be done technically? on: October 10, 2018, 07:37:56 AM
I just read this academic paper; the authors suggest that Bitcoin is in danger of being compromised by the Chinese government because of ASICs and pools.

I have been campaigning against ASICs and pools for a while, and in my experience, whenever a serious improvement to bitcoin has to be done via a hard fork, an army of "legendary" shills is ready to make it almost impossible to discuss any further.

But we have a hard-fork wishlist; discussing an issue won't fork the chain, actual forking does! So I politely ask these guys to give us a break and let us have a productive discussion about whether or not we could do anything, any technical improvement obviously, to deal with what the authors are pointing out.

15  Economy / Economics / On Marxism and the bitcoin energy consumption debate on: September 08, 2018, 08:38:44 PM
On Marxism and the bitcoin energy consumption debate

What's the value of bitcoin?

As void and dangerous as his idea about "changing the world instead of interpreting it" is (an idea the Nazis in Germany and the Communists in the USSR shared, to the ruin of their societies, and one recently employed by the Neocons in the USA, apparently for the same purpose), Marx's contribution to political economy is one of the greatest theoretical achievements of humanity:
He was the first to propose a scientific, quantitative measure for the value of a commodity: work.

Marx's labour theory of value asserts that although the price of a commodity is determined by supply and demand, price is nothing more than a concrete presentation of an abstract, essential property inherent in each commodity: its value, which is determined by the average amount of labour necessary for the society to produce it. Value is not volatile, say, because of market fluctuations.
By labour, Marx means both live labour (e.g. man-hours) and dead labour, which is recursively embedded in the resources that must be consumed or depreciated in the process.

Unfortunately, Das Kapital very soon became the bible of Communists (and remained so for more than a century), fueled by the "changing the world" discourse and later completed by a package of other fake revolutionary ideas that fooled an important segment of intellectuals all over the world into acting in the best interests of a corrupted regime in Russia.

On the other side, capitalists and their mercenary "scientists" in the academies counterattacked by forging their own version of political economy: Marginalism.
More precisely: their own version of anti-political economy, or simply anti-Marx economy.

Marginalism is exemplary of the fake human sciences made and supported for purely political purposes in the 20th century. It was based on the most ridiculous interpretation of value: utility.

Common sense is aligned with what utilitarians say: a commodity's value depends on its usefulness, desirability, utility, ... This is wrong, just like many other assertions of common sense:
The earth is NOT flat,
Objects do NOT naturally stop moving,
There is NOT any universal clock,
... and
Bitcoin is NOT wasting electricity (as we will see later).

Historically, the huge investments in Marginalism helped develop mathematical models that filled library shelves and gave birth to a "science" that was somehow applicable in predicting market behavior and how the demand for a commodity would change due to psychological factors, full of excuses for imprecision because of "complexities" in the models and the probabilistic nature of the variables involved.

The academy's primary mission was rather complicated: eliminating political economy from the mainstream and replacing it with more applicable, neutralized "sciences" like micro- and macroeconomics.
Ruled by giants like Marx, Ricardo, and Smith, political economy was not a territory to be conquered by mercenary scientists, after all.

This mission was accomplished by investing in utilitarianism. The trick was presenting and propagating it as an alternative theory of value in political debates while practically using it as an instrument for predicting demand (somewhat useful, sometimes).
This way they managed to convince their students, first, that value is a controversial topic and that the neutralized utilitarian point of view is something as meaningful as Marx's Labour Theory of Value; finally they became so confident that they announced Marx's theory and political economy dead.

And now here we are: bitcoin has emerged and mercenary economists are at a deadly impasse. Their "science" is absolutely void and inefficient for understanding such a revolutionary phenomenon, because it was castrated more than a century ago and doesn't understand what a political-economic revolution looks like.

Recently, debating PoS/PoW with a PoS enthusiast, I asserted that PoS coins are made out of thin air (just like fiat) and that the energy consumption in PoW is not a waste, because it is the source of bitcoin's value. My reasoning was naturally based on the established politico-economic labour theory of value, Marx's theory.

Surprisingly, a few days later I encountered this article. Again, a PoS proponent (I suppose) is questioning whether the value of bitcoin is measurable by the amount of "work" miners do, this time by directly claiming Marx's theory to be a fallacy!

This is why I'm becoming more and more convinced that the PoW/PoS debate is nothing less than a final debate between, on one side, true political economists resurrected after bitcoin and, on the other, fake mercenary economists with their utilitarian interpretation of value, which is incapable of understanding why bitcoin has an inherent value not based on a subjective convention, an artificial demand caused by speculation, or even its usefulness as a medium of exchange and a utility.

From a much wider perspective, I would suggest that the whole cryptocurrency movement will find its theoretical support in political economy, as an original, decent science, rather than in fake anti-Marx discourses that belong to a bitter past period of history named the Cold War.
16  Economy / Scam Accusations / Bittrex first scams $millions and now joins to self-regulatory group on: August 20, 2018, 05:23:35 PM
Just check their tweet

Believe it?  Grin

These shitty scammers destroyed the lives of thousands of people a few months ago by stealing millions of dollars from people in poor Middle Eastern and Eastern European countries, playing their dirty "verification needed" game without any notice. I know noobs who had invested in this scammy exchange all the pennies they had earned so hard over the years, and they lost it overnight because they were Iranian, Ukrainian, Syrian, ..., had no access to US jurisdiction resources, and were easy victims.

I personally lost about 3 ETH in that scam. No worries, I'm fine now, I'll be around for a while, and I have enough time for retaliation; but I know people who seriously suffered from this scam and never healed. I'll retaliate on their behalf as well. I promise.
17  Bitcoin / Development & Technical Discussion / A framework for designing CPU only mining algorithms on: August 19, 2018, 07:42:03 PM
Hi all,

I'm not thinking of replacing bitcoin's SHA256 or releasing a brand new coin, ... and yet I need a proof-of-work algorithm resistant to parallelism, i.e. one where it is not feasible for GPUs to compete with a modern CPU.

I'm already aware of the extensive literature and the mostly failed efforts regarding this issue, but I can't get rid of one simple idea I have for designing such an algorithm. I just don't understand why it shouldn't be considered safe against GPU mining.

It will be highly appreciated if somebody proves me wrong  Cheesy

Two Phase Mining
Suppose we have a memory-hard algorithm, something like Dagger Hashimoto, which utilizes a big chunk of memory. As we already know, GPUs mine such an algorithm way more efficiently and faster than a CPU, because their thousands of cores can share the same memory bank. For Ethash (a Dagger Hashimoto variant) this makes the whole mining process bound by memory bus performance, which resists ASICs but not GPUs: a GPU's many cores can keep its dedicated bus almost fully utilized without hurting mining performance.

Now we change this algorithm such that it goes through 2 phases: estimation phase and targeting phase.

Estimation Phase is a normal run of the algorithm, but instead of looking for a hash less than the network target difficulty dn, we look for a hash with a much lighter difficulty, like 2^16 times easier, i.e. d0 = dn << 16 (actually it is difficulty^-1 that we are talking about). We assume the shift/multiplication operation won't produce any carry, i.e. target difficulty > 2^16.

A typical GPU with enough RAM will substantially outperform any CPU here because of its huge number of cores, obviously. After each hit (which normally happens very frequently), we have a nonce n1 that satisfies H(s|n1) < d0. Until now, everything is in favor of GPUs. But ...

Targeting Phase is supposed to be much harder to run using a shared chunk (like 1 GB) of memory. For this:
1- We primarily set n2 = n1 << 16.

2- Suppose we have a function f(M, n, s, e, flag) that partially changes a chunk of memory M (like 20% of it) using the supplied n, from address s to address e; flag determines whether the function only maps and returns the range, or modifies it in memory as well. This function is supposed to be complicated enough that running it is hundreds of thousands of times harder and more time consuming than fetching a page from memory. We change the memory chunk (the DAG in Ethash) by applying this function with n1 as the second parameter, the start and end addresses of the memory chunk, and flag set to true to modify it. Now we have a dedicated chunk of memory specialized for this nonce.

3- We run the original memory-hard algorithm with a special restriction: only the last 16 bits of n2 (which are zero after the shift) are allowed to be set to generate a new nonce n, i.e. 0 <= n - n2 < 2^16.
4- We need H(s|n) <= 2^16.
5- Rebuild the memory chunk (e.g. use a backup).

Validating the hash includes:
1- Calculate n1 = n >> 16 and d0 = dn << 16.
2- Check that the supplied block header with its nonce yields a hash H(s|n) <= 2^16. For this, in each memory access f(M, n1, address, address, false) should be called instead of a plain memory read.
3- If step 2 is passed, check H(s|n1) < d0.
4- Continue checking other rules.

We note that the targeting phase above is only optimized once we follow the algorithm and apply the 20% change to the memory chunk (the DAG file for Ethash); otherwise we would need to check and recalculate the values we read from memory in every single run of the algorithm, which is supposed to access memory many times (otherwise it is not memory hard at all).

If our algorithm accesses memory N times (20 for Ethash, I suppose), applying the f function in each round of the targeting phase costs N executions of f, and we have almost 2^16 rounds. Obviously it wouldn't be cost effective for a GPU to use the f function in calculate-only mode so many times.

Alternatively, modifying the memory by calling the f function once is a single-thread job, because f should hold a lock on the memory, and the multiple cores of a GPU are useless during this process. If f is defined properly, a CPU would outperform a GPU in this second phase, because setting up multiple cores to search a 32K space simply isn't worth it.


We use a two-phase algorithm: in phase one, an estimate nonce is generated that is useless without a complementary 32K search that is practically single-threaded. Although GPUs keep their advantage in phase 1, the estimates they generate are useless because they are queued behind a single-thread task that is deliberately designed to be a bottleneck.

I expect a 2 core cpu to beat a single gpu with up to 10 thousand cores.
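To make the control flow concrete, here is a toy sketch of the two-phase structure. Everything in it is an assumption for illustration: a made-up header, an 8-bit shift instead of 16 to keep it fast, plain SHA256 instead of a memory-hard hash, and the memory-rewriting function f omitted entirely; it is not an implementation of the proposal.

```python
import hashlib

SHIFT = 8                     # toy stand-in for the proposal's 16-bit shift
D_N = 1 << 239                # toy "network" target: H < D_N, ~2^-17 per nonce
D_0 = D_N << SHIFT            # estimation target, 2^SHIFT times easier

def H(header: bytes, nonce: int) -> int:
    h = hashlib.sha256(header + nonce.to_bytes(8, "big")).digest()
    return int.from_bytes(h, "big")

def estimate(header: bytes, start: int = 0) -> int:
    """Phase 1: find n1 with H(s|n1) < d0 -- the easy, parallel-friendly search."""
    n1 = start
    while H(header, n1) >= D_0:
        n1 += 1
    return n1

def target_phase(header: bytes, n1: int):
    """Phase 2: search only the 2^SHIFT nonces derived from n1. In the real
    proposal the memory chunk is first rewritten with f(M, n1, ...), which
    forces this search to be serial; that part is omitted here."""
    n2 = n1 << SHIFT
    for n in range(n2, n2 + (1 << SHIFT)):
        if H(header, n) < D_N:
            return n
    return None               # this estimate was useless; back to phase 1

header = b"toy-block-header"
n1 = estimate(header)
solution = target_phase(header, n1)   # usually None: most estimates fail
```

The point the sketch makes is structural: estimates are cheap and plentiful, but each one gates a bounded follow-up search that cannot be parallelized across estimates if phase 2 depends on a per-n1 rewrite of the memory chunk.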

18  Bitcoin / Development & Technical Discussion / An analysis of bitcoin blockchain height divergence from its standard behavior on: August 16, 2018, 11:04:05 AM
How far can the bitcoin blockchain height possibly diverge from the ideal one-block-per-10-minutes measure?

During a discussion with @gmaxwell about a hypothetical DoS vulnerability, in response to my suggestion of blacklisting peers that send unreasonably large block locators, he objected to my proposal by asserting that an honest node may be maliciously bootstrapped and fooled into committing to a very long chain with trivial difficulty, so blocking it wouldn't be helpful. I responded with a moderate version, ... but after further assessment, I realized that the whole situation is somewhat abnormal: the possibility of having honest nodes with an unreasonable perception of chain height.

In bitcoin, the 10-minute block interval is imposed neither synchronously nor integrally. The protocol adjusts the difficulty every 2016 blocks to keep the generation pace at the 10-minute target, but a steady growth in network hashpower makes it possible for the chain height to exceed the ideal number based on the 10-minute generation rate. Actually, the maximum block height of the network as of this writing is #535461 while it is supposed to be ~504300, showing a +6% divergence.

Not satisfied with the situation, I asked myself:
Why should we accept a proposed chain with an unreasonable length, like 10 times or 1000 times longer than normal, in the first place?
Why shouldn't we simply reject such proposals and shut down the peers who made them? Putting difficulty requirements aside, is it even possible to have a chain orders of magnitude longer than normal?

In the malicious block locator scenario, a peer node, very commonly an SPV client, sends us a getheaders message with a payload of hundreds of thousands of bogus hashes as the block locator, and instead of being rejected and banned, it has us hopefully and exhaustively trying to locate them one by one.

At first, @gmaxwell didn't believe it to be an attack vector because of the CPU-bound nature of the processing involved, but he finally made a pull request out of that discussion and it was committed to the source. Bitcoin Core now has a MAX_LOCATOR_SZ hardcoded to 101. So the block locator problem is fixed now.

A block locator in bitcoin is a special data structure that implicitly (and partially) represents a node's view of the block headers in its chain. The locator generation algorithm guarantees its length to be O(log(n)), where n is the maximum block height in the chain; it is exactly 10 + log2(n) (ten recent hashes one by one, then exponentially spaced ones). For the current block height of 535461, a legitimate block locator should carry a maximum of 29 hashes, and a node sending 30 hashes is claiming a chain of height ~1,048,576, which is almost twice as long; a chain 2 million times longer would be claimed if the block locator held 50 hashes. Yet we were worried about block locators holding thousands of hashes, which represent chains with astronomical heights.
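The size argument can be checked with a quick back-of-the-envelope calculation. This is a simplified model of the locator construction (it ignores the genesis hash and the exact rounding of the doubling steps), not Bitcoin Core's actual code:

```python
import math

def max_locator_hashes(height: int) -> int:
    # The locator lists the last 10 block hashes one by one, then steps
    # back with doubling gaps, so its length is about 10 + log2(height).
    return 10 + int(math.log2(height))

def implied_height(num_hashes: int) -> int:
    # Inverting the relation: a locator of k hashes implies a chain of
    # roughly 2^(k - 10) blocks.
    return 2 ** (num_hashes - 10)

print(max_locator_hashes(535461))      # -> 29
print(implied_height(30))              # -> 1048576, ~2x the real height
print(implied_height(50) // 535461)    # -> ~2 million times the real height
```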

Although the issue is fixed now, the underlying theoretical question didn't get the attention it deserves: a bound on the divergence of the actual blockchain height from what we could trivially calculate by dividing the minutes elapsed since an accepted checkpoint block's timestamp by 10 (the normal block interval).

At the time of this writing, the divergence is 6+%, but until now we have had no measure by which to consider divergences of 60%, 600%, 6,000,000,000%, ... infeasible.

In  this article I'm trying to establish a mathematical framework for further research on this problem, by introducing a relation between hash power increase requirements for a specific divergence within a relatively large window of time.

I also found it useful to include a background section for interested readers; if you are familiar with these subjects, just skip it.


Difficulty adjustment: How bitcoin keeps the pace
Two rules are applied in bitcoin protocol for regulating the rate by which blocks are generated:
1- Increase/decrease difficulty every 2016 blocks by comparing the actual time elapsed with an expected 2 week period.
2- Don't increase/decrease difficulty by a factor greater than 4.

The latter constraint is commonly overlooked because such a large retargeting is very unlikely to happen with the huge inertia of the currently installed hash power but as we are studying extreme conditions, the 4 times increase factor is of much importance.
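The two rules combine as in the following sketch (using a floating-point difficulty for illustration; real nodes adjust the compact 256-bit target in nBits, not a difficulty number):

```python
def retarget(old_difficulty: float, actual_timespan_s: int) -> float:
    """Sketch of the per-2016-block retarget. Simplified: real nodes adjust
    the compact 256-bit target, not a floating-point difficulty."""
    expected_timespan_s = 2016 * 600          # rule 1: two weeks at 10-min blocks
    factor = expected_timespan_s / actual_timespan_s
    factor = max(0.25, min(4.0, factor))      # rule 2: clamp the factor to [1/4, 4]
    return old_difficulty * factor

# Blocks came in twice as fast as intended -> difficulty doubles:
print(retarget(1.0, 2016 * 300))   # -> 2.0
```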

Longest Chain Rule: How Bitcoin chooses between forks
In bitcoin and its clones, the longest chain rule is not interpreted naively as selecting the chain with the biggest height as the main chain; instead, the main chain is defined as the one requiring the most work to generate.
To calculate the accumulated work done on each chain:
1-work for each block is calculated as:  floor(2^256 / (target + 1))
This is done in chain.cpp of bitcoin source code via GetBlockProof function:
arith_uint256 GetBlockProof(const CBlockIndex& block)
{
    arith_uint256 bnTarget;
    bool fNegative;
    bool fOverflow;
    bnTarget.SetCompact(block.nBits, &fNegative, &fOverflow);
    if (fNegative || fOverflow || bnTarget == 0)
        return 0;
    // We need to compute 2**256 / (bnTarget+1), but we can't represent 2**256
    // as it's too large for an arith_uint256. However, as 2**256 is at least as large
    // as bnTarget+1, it is equal to ((2**256 - bnTarget - 1) / (bnTarget+1)) + 1,
    // or ~bnTarget / (bnTarget+1) + 1.
    return (~bnTarget / (bnTarget + 1)) + 1;
}

2- When loading/adding a block to the block index in memory, not only is the block work computed by calling the above function, but an accumulated chain work is also aggregated in nChainWork with this code:
pindex->nChainWork = (pindex->pprev ? pindex->pprev->nChainWork : 0) + GetBlockProof(*pindex);
which is executed in both LoadBlockIndex and AddToBlockIndex.

The forks are compared based on the nChainWork of the BlockIndex entry of their respective last blocks, and once a chain is found to be heavier than the current active chain, a reorg is performed, mainly by updating the UTXO set and pointing to the new chain as the active fork.

Between two difficulty adjustments, this most-difficult-chain-is-the-best-chain approach is identical to the longest-chain-is-the-best-chain rule originally proposed by Satoshi, but when we have to choose between two forks spanning at least one difficulty adjustment (typically in both forks), there is no guarantee that the results will be identical.

Block timestamp: How bitcoin keeps track of time
In bitcoin, a few rules apply to block time and current time. This is an important topic for the purpose of this article, defining an upper bound for the height of forks, because to impose such a bound a node must eventually have a measure of time, and for obvious reasons it cannot simply be its own system clock.

1- Bitcoin defines a concept named network-adjusted time. It is defined to be the median of times reported by all the peers a node is connected to.

2- In any case, network-adjusted time shouldn't differ from the node's local time by more than 70 minutes.

3- Blocks should be marked with a timestamp greater than the median of previous 11  blocks and less than 2 hours after node's network-adjusted-time.

These constraints make it very hard for an adversary to manipulate block times and forge fake chains with fake difficulty and height, which is our concern.
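A minimal sketch of these three rules, with timestamps as plain Unix seconds and peers/blocks as simple lists (real nodes track per-peer offsets incrementally rather than recomputing a median):

```python
import statistics

MAX_PEER_TIME_OFFSET_S = 70 * 60     # rule 2: 70 minutes
MAX_FUTURE_DRIFT_S = 2 * 60 * 60     # rule 3: 2 hours

def network_adjusted_time(local_time: int, peer_times: list) -> int:
    # Rule 1: offset is the median of peer-reported clocks vs our own;
    # rule 2: never let it push us more than 70 minutes off local time.
    offset = int(statistics.median(t - local_time for t in peer_times))
    offset = max(-MAX_PEER_TIME_OFFSET_S, min(MAX_PEER_TIME_OFFSET_S, offset))
    return local_time + offset

def block_time_acceptable(block_time: int, prev11_times: list, nat: int) -> bool:
    # Rule 3: strictly above the median of the previous 11 blocks, and at
    # most 2 hours ahead of network-adjusted time.
    return (block_time > statistics.median(prev11_times)
            and block_time <= nat + MAX_FUTURE_DRIFT_S)
```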

The short-range difficulty adjustment policy of bitcoin allows the chain height to diverge from the height expected from the ideal 10-minute block time whenever hashpower increases during an epoch. We show that such a divergence is exponentially expensive, by proving a functional dependency between the hashpower growth factor and the divergence ratio: F = (n/N)^n.

To prove this relation, we first show that the maximum divergence caused by introducing any given amount of new hashpower to the network in a specific period of time is achieved when its distribution over the epochs follows a geometric progression.

The 4x maximum threshold imposed by the network's difficulty adjustment algorithm is deliberately ignored to avoid complicating the discussion. It is a justifiable simplification in the context of this article, because a 4x-per-epoch increase in hashpower is not feasible anyway, as we briefly discuss at the end of the article.

After proving the exponential dependency, we briefly illustrate and discuss its relevance to the block locator size problem, and other potential applications to similar problems.

Lemma: Suppose the difficulty adjustment epoch we have chosen as a checkpoint ends at time t0, with the network in equilibrium at an average hashpower C, and a relatively large hashpower Q is then introduced over a period of time t, greater than or equal to one ideal difficulty adjustment epoch time T0. The number of epochs n that occur in the period [t0, t0+t] is at its maximum when Q is distributed such that the network hashrate goes through a geometric progression:

C_i = C * q^i, with Sum(C_i - C_{i-1}) = Q, the increased hashpower.

Proof: For a network in equilibrium with hashpower C at the end of a given epoch, an arbitrary distribution of the new power Q over n epochs yields the sequence

C, C*k_1, (C*k_1)*k_2, ..., (C_{i-1})*k_i, ..., (C_{n-1})*k_n

defined recursively as:

C_i = C_{i-1} * k_i, C_0 = C

and at the end of the application of the power Q, the total hashpower C+Q is equal to the last term of the sequence:

C * k_1 * k_2 * ... * k_n = C + Q

We have to prove that k_1 = k_2 = ... = k_n = q for the n epochs to occur in the least possible actual time. For this, we note that at the end of each epoch, the difficulty adjustment algorithm of bitcoin (the 4x limit temporarily being ignored) resets the block generation pace to T0 for a network running at the hashpower of the epoch just ended; so when the hashpower then rises to C_i, epoch i actually takes:

t_i = T0 * C_{i-1}/C_i = T0/k_i

For the total time t elapsed during the n epochs, we have:

t = T0/k_1 + T0/k_2 + ... + T0/k_n
t/T0 = f(k)

where f(k) is the sum of the sequence a_i = 1/k_i. Now we need this sum to be at its minimum. We first note that the product of the terms is fixed:

prod(a_i) = 1/(k_1 * k_2 * ... * k_n) = ((C+Q)/C)^-1

where (C+Q)/C and n are constants; by the AM-GM inequality, a sum of positive terms with a fixed product is at its minimum when a_1 = a_2 = ... = a_n. Now we have the proof, because for the minimum time to be elapsed the power sequence definitively should be rewritten as:

C_i = C_{i-1} * q = C_0 * q^i

where C_0 = C and 0 < i <= n, which is a geometric progression.

Theorem: The minimum hashpower growth factor F = (Q+C)/C needed for the bitcoin blockchain* to grow by an abnormal ratio n/N is determined by

F = (n/N)^n

where C is the current hashpower of the network, Q is the absolute increase during a period of N*T0, and n is the number of epochs actually elapsed in that period.

*Note: Bitcoin's difficulty adjustment algorithm applies a 4x maximum threshold, deliberately ignored here for practical purposes.

Proof: For a given F = (Q+C)/C, using the geometric progression lemma above we have:

t = T0*n/q  ==>  t/T0 = N = n/q  ==>  n/N = q
Q = C*(q^n - 1)  ==>  (Q+C)/C = F = q^n

Eliminating q:

F = (n/N)^n
The exponential relationship between the hashpower increase ratio and the ratio by which the blockchain height diverges from normal (N = t/T0) is an important key to understanding how impractical it would be for a network like bitcoin, with an inertia (C) of tens of thousands of petahashes, to diverge from its normal behavior in terms of chain height.
Let's have a graphical illustration of this function. We replace n with N+d, where d is the number of abnormally added epochs, so F = ((N+d)/N)^(N+d).
This form of the equation illustrates how F (relative hashpower) should grow to get N+d epochs instead of N normal epochs.
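Plugging numbers into F = ((N+d)/N)^(N+d) makes the steepness concrete (a small helper under the same simplifying assumption as above, the 4x clamp ignored):

```python
def required_hashpower_factor(N: int, d: int) -> float:
    """Minimum factor F = ((N+d)/N)^(N+d) by which total network hashpower
    must grow for N+d epochs to fit in the time of N normal epochs."""
    n = N + d
    return (n / N) ** n

# 6 extra epochs within 48 weeks (N = 24 normal epochs):
print(round(required_hashpower_factor(24, 6)))   # -> 808, the ~800x in the text
# 6 extra epochs within 24 weeks is even further out of reach:
print(required_hashpower_factor(12, 6) > 1000)   # -> True
```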

figure 1: F/d diagram with various N values, for F up to 30 times hashpower increase
Figure 1 illustrates how hard it is to achieve 2-3 epochs of divergence within 3, 6, 12, or 24 standard epoch times (two weeks each); e.g. reaching 3 epochs of divergence within 24 weeks (12 epochs) requires roughly a 28-fold increase in network hashpower.

Figure 2 (below) illustrates the same function on a larger scale (up to 1000 times hashpower increase). To diverge 6 epochs within 48 weeks we need an 800-fold increase in network hashpower, and within 24 weeks even a 1000-fold increase can't bring the divergence close to 6 epochs.
figure 2: F/d diagram with various N values for F up to 1000 times hashpower increase

This exponential behavior is very interesting and can potentially be used as a basis for confronting malicious bootstrap attacks and issues like the bogus block locator problem mentioned at the beginning of this article.

As for the 4x maximum threshold bitcoin uses in difficulty adjustment: a 4x-or-more increase per epoch is obviously infeasible for the network, especially over large windows of time like 1-2 years, which would require exponential network hashpower growth up to infeasible quantities. Hence, ignoring that threshold practically doesn't affect this analysis.
19  Bitcoin / Development & Technical Discussion / How devil competes with its own mined blocks. on: August 09, 2018, 12:06:52 PM
It is really crazy, dudes. Investigating the proximity problem in the bitcoin blockchain, I was doubly surprised:

At first I found that for the latest 65,000 blocks we have just 25 orphan blocks, i.e. a rough 0.00038 ratio, or 0.038%, which is very low and dispels the concerns about the security consequences of reducing block time. Actually, an orphan rate of 1% should be considered safe; these figures suggest we can safely reduce block time without getting even close to the danger zone.

I was enjoying my discovery and planning how to take advantage of this fact in favor of my PoCW proposal,  when I noticed a hilarious point.

Just take a look at this snapshot:

It is how the block explorer represents orphan blocks: the block on the left is the one that has progressed and is added to the main chain, and the right one is the orphan block.

Both the progressed and the orphan blocks are relayed by AntPool  Grin
The blocks are timestamped about 90 seconds apart.

How should this ridiculous situation be interpreted?

Option One: This fat boy is so bloated that it is no longer capable of taking advantage of its own premium. So large that cancelling the work assigned to its workers and initiating a new search is a hell of a job and takes a long time.

Option Two: They have outsourced their operation somehow and the branches are competing with each other.

Option Three: They are just stupid dick heads that have no clue about what they are doing.

Option Four: Both blocks were found almost simultaneously (feasible despite the timestamps being 90s apart), and they relayed the left one to one part of the network and the right one to another (minor) part intentionally, to keep them busy validating and switching workloads while the majority (like Antpool) are mining on the right one.

Option Five: Another unknown devil practice.

Anyhow, this giant is really crazy. Cheesy
20  Bitcoin / Development & Technical Discussion / An analysis of Mining Variance and Proximity Premium flaws in Bitcoin on: July 16, 2018, 03:07:29 PM
An analysis of Mining Variance and Proximity Premium flaws in Bitcoin
The problem of solo mining becoming too risky and impractical for small mining facilities appeared in 2010, less than 2 years after bitcoin had been launched. Although Satoshi Nakamoto commented on bitcointalk about the first pooling proposals, it was among the last posts Satoshi made; he disappeared from this forum a few days later, forever, without making a serious contribution to the subject. Bitcoin was just 2 years old when the pooling age began; pools eventually came to dominate almost all the hashpower of the network, putting it in danger of centralization as an obvious consequence.

Since then the subject has been extensively discussed and has become a classical problem, named Pooling Pressure, in bitcoin and PoW networks; it is mainly driven by two flaws: mining variance and proximity premium.

In this article I'm trying to show that:
1-Mining Variance is an inevitable consequence of bitcoin's winner-takes-all approach to PoW.
2-Proximity Premium is basically an amplified version of Mining Variance, Hence another consequence of that approach.

Readers may already be familiar with my proposal for fixing these flaws and removing the infamous Pooling Pressure from bitcoin, its clones, and the other blockchains that inherited winner-takes-all from it, i.e. all minable coins in the market!
As a matter of fact, I was writing the (almost) final version of this proposal when I found myself examining these two flaws more extensively, and I thought it would be more helpful to publish this part separately.

Mining variance flaw in traditional PoW
The binary nature of the winner-takes-all strategy in bitcoin implies that a miner's chance to win a block, with a relative hash rate of p, follows a Bernoulli distribution.
The variance of a Bernoulli distribution is known to be p(1-p), hence the relative standard deviation (standard deviation divided by the expected share p) over N consecutive blocks is:
σ = sqrt(1/p - 1) / sqrt(N)

For N = 365*24*6 = 52,560 blocks (one year), sqrt(N) ~ 229.26.
Calculating this relative standard deviation for the following series of hashpower ratios:
p:   0.1       0.01      0.001     0.0001    0.00001   0.000001
σ:   0.013     0.043     0.138     0.436     1.380     4.362

This growth in relative standard deviation, proportional to 1/sqrt(p) as the miner's hashpower ratio p decreases, is a direct consequence of the Bernoulli distribution, which in turn results from the binary nature of the winner-takes-all approach (you win/you lose, 1/0).
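The table can be reproduced with a few lines, per the formula above (relative standard deviation of a miner's block count over one year of blocks):

```python
from math import sqrt

N_BLOCKS = 365 * 24 * 6      # one year of blocks, 52,560

def relative_stddev(p: float, n_blocks: int = N_BLOCKS) -> float:
    # Bernoulli variance is p*(1-p); the relative standard deviation
    # (std dev / mean) of a miner's block count over n_blocks trials
    # is sqrt(1/p - 1) / sqrt(n_blocks).
    return sqrt(1 / p - 1) / sqrt(n_blocks)

for p in (0.1, 0.01, 0.001, 0.0001, 0.00001, 0.000001):
    print(f"p = {p:g}\tsigma/mean = {relative_stddev(p):.3f}")
```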

This illustrates how the risks involved in mining with medium to low hashrates in traditional PoW are high enough to make solo mining, for average miners, just like participating in a very high stakes lottery with a single winner each round. This is a kind of gambling that only hobbyists may be interested in; no rational investor would take it as a serious investment opportunity unless he were able to install and run very large facilities.

The direct consequence, as experience has proved, is pressure toward forming pools: typically centralized entities who aggregate their own hashpower with their clients', dramatically reducing the number of people actually in charge of mining in the network.

Proximity Premium flaw
Although Mining Variance is important enough to push miners away from solo mining, it is boosted even further by another flaw, the Proximity Premium:

The propagation delay of announcements (most importantly, new-block-mined information) gives the nodes nearer to the source (primarily the source itself) a premium: they start mining the next block sooner than other participants, who meanwhile lose electricity and opportunity costs. A thorough mathematical evaluation of this flaw is not available as of this writing, and to my knowledge it would be hard to produce one. The model proposed here is rather qualitative and based on simplifying assumptions, used instrumentally to prove my point: the Proximity Premium flaw is an amplified version of Mining Variance (analysed above), hence another consequence of bitcoin's winner-takes-all strategy.

The P2P network of bitcoin can be modeled as a weighted undirected complete graph G_P, with the members of P as the vertices, and each edge weighted by the number of nodes on the shortest path between its two endpoints (the minimum number of hops from one to the other) in the actual P2P network graph.

Assuming that for every specific type of information there is the same propagation delay between adjacent nodes (not necessarily an exact approximation of the real network), the edge weights in G_P can be taken as a measure of the amount of time needed for information to be received (and verified) by each peer.

Every node p_k of such a graph partitions the network into subsets of nodes having the same weight on their edges to p_k (h_{k,i} being the hop distance between p_k and p_i), i.e.:

I_{k,0} = {p_k}
I_{k,1} = {p_i | p_i ∈ P & h_{k,i} = 1}
I_{k,m} = {p_i | p_i ∈ P & h_{k,i} = m}

Where m is the maximum edge weight for node k on the complete graph, and we have I_{k,0} ∪ I_{k,1} ∪ ... ∪ I_{k,m} = P.

Now, one may be interested in how the cardinality of each partition Ik,i depends on i. Obviously, it depends on the P2P network topology, and it differs from node to node and, for each node, from distance to distance. But if we had a choice to impose a strict constraint on the minimum and maximum number of peers, and at the same time forced nodes to pick their peers randomly, we would observe that in the first few hops we have few nodes 'near' the source, while the majority of the nodes are 'far' enough to be considered in danger of losing opportunity costs, because they work on information that is already outdated.
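The partitions Ik,i are just the layers of a breadth-first search of the (unweighted) P2P graph. A small sketch, with a made-up 5-node adjacency list purely for illustration:

```python
from collections import deque

def hop_partitions(adj: dict, source) -> dict:
    """Partition the nodes of a P2P graph into the sets I_{source,i} of
    nodes at hop distance i from `source`, via breadth-first search."""
    dist = {source: 0}
    queue = deque([source])
    while queue:
        u = queue.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                queue.append(v)
    parts = {}
    for node, d in dist.items():
        parts.setdefault(d, set()).add(node)
    return parts

# A made-up 5-node topology: node 0 is the block source.
adj = {0: [1, 2], 1: [0, 3], 2: [0, 3], 3: [1, 2, 4], 4: [3]}
print(hop_partitions(adj, 0))   # {0: {0}, 1: {1, 2}, 2: {3}, 3: {4}}
```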

It should be noted explicitly that the impact of this proximity flaw is not only determined by the distance but also by the nature of the information under consideration.

In Bitcoin network there are 2 types of information continuously produced and distributed by peers: transactions and blocks.

While transactions happen and are relayed constantly, their importance for miners is not critical, and miners don't lose much because of the propagation delay of this type of information. The only risk involved is the impact of such a delay on verifying blocks containing such transactions, and missing the opportunity to include the ones with higher fees.

But for 'new block mined' events, the impact of being 'far' from the source is disruptive and causes miners to lose opportunity costs linearly with the length of time they are kept in the dark. This opportunity cost is proportional to the miner's relative hashpower and could be expressed by means of a scalar variable.

It is theoretically possible to conduct a thorough analysis of the Bitcoin network and estimate the average opportunity cost of each miner when blocks are found in other parts of the network. Such an index would be a function of the network topology, expressed as a ratio of the resources the miner holds.

At first glance it may look somewhat odd: big miners lose more, because the ratio is applied to larger resources. But another factor should be taken into consideration: miners with larger resources have proportionally more hashpower, and the probability of them being in a bad topological position is proportionally less than for small miners.

So everything seems to be in equilibrium: big miners get the premium more frequently, which compensates for the situations in which they lose opportunity costs for topological reasons (because they have not mined the new block). Small miners lose a percentage of their resources which is not too high, but this loss happens more frequently.

In reality, two factors change this picture radically:

Firstly, the topology of the network is not that random and even. Large mining facilities utilize more powerful (sets of) nodes, able to maintain a larger number of connections with better bandwidth and fault tolerance.

Secondly, and most importantly, the mining variance discussed above applies here in an order-of-magnitude-worse scenario:

For a node to take advantage of its premium (in the very short window of time it is in premium), it has to build and mine a new block from scratch.
For a small mining facility this is practically impossible: e.g. for a node that mines a block every month or so (yet that is what you may get from a farm with 20 S9s), it is very unlikely to happen in the fraction of a second it is in premium.

Hence, while small miners lose opportunity costs every single round, they will never have a practical chance to be compensated.

It is the assertion I made at the beginning, and the main motivation for publishing this analysis:
The Proximity Premium flaw is basically an exaggerated version of the Mining Variance flaw, and any fix/improvement to the latter would fix/improve it as well.
