Topic: The MAX_BLOCK_SIZE fork
DeathAndTaxes (Donator, Legendary)
Activity: 1218, Merit: 1079
Gerald Davis
February 06, 2013, 04:02:28 PM (last edit: February 06, 2013, 04:29:46 PM)  #121

Quote
There's a delay regardless of whether or not two different blocks are solved at the same time?

Yes, but if a miner knew that no other block would be found in the next x seconds, they wouldn't care.  Since that can never be known, the longer the propagation delay, the higher the probability that a competing block will be found before propagation completes and potentially wins the race.

Quote
You mean that when two different blocks are solved at the same time, the smaller block will propagate faster and therefore more miners will start building on it versus the larger block?

Yes, although it is more accurate to say the smaller block has a higher probability of winning the race.  A miner can never know whether he will be in a race condition or which races he will lose, but over the long run, everything else being equal, a miner with a longer propagation delay will suffer a higher rate of orphaned blocks.

Quote
Is there a straightforward way to estimate the risk of an orphan?

Not that I know of.  I do know pools have looked into this to improve their orphan rates and remain competitive.  My guess is that any analysis is crude because the problem is difficult to model, so testing needs to be done with real blocks = real earnings. A pool at least wants to ensure its orphan rate isn't significantly higher than its peers' (or the global average).
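
That said, a crude first-order estimate is possible if you assume block discovery is a Poisson process with a 10-minute mean (a simplification, not a measured model):

Code:
import math

def orphan_risk(delay_seconds, mean_interval=600.0):
    # Probability that at least one competing block is found somewhere
    # on the network during our propagation delay, assuming Poisson
    # block arrivals with a 10-minute mean. A rough model only.
    return 1.0 - math.exp(-delay_seconds / mean_interval)

print(round(orphan_risk(6.0), 4))   # 6 s delay  -> ~0.01 (about a 1% orphan risk)
print(round(orphan_risk(30.0), 4))  # 30 s delay -> ~0.0488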


Quote
Even with a separate overlay, two blocks solved at the same time is a problem. And I would imagine that adding a new overlay is an extreme solution to be considered as a last resort only.

True, but if it became a large enough problem, a mining network would allow for direct transmission to other miners.  A block-notification superhighway of sorts.  Blocks could be digitally signed by a miner, and if that miner is trusted by other miners (based on prior submitted work), those miners could start mining the next block immediately.  Think of it as a WoT for miners, but instead of rating financial transactions, miners trust other miners based on prior "good work".

The propagation delay on large blocks is a combination of the relay -> verify -> relay nature of the bitcoin network, relatively slow block verification (a large fraction of a second), and the potential need for multiple hops.  All combined, this can result in a delay of multiple seconds before a majority of miners start work on a new block.
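
As a rough sketch of how those relay -> verify -> relay costs stack up per hop (the bandwidth and verification figures below are illustrative assumptions, not measurements):

Code:
def propagation_delay(block_bytes, hops=4,
                      bandwidth_bps=1_000_000,  # ~1 MB/s per link (assumed)
                      verify_seconds=0.5):      # per-node verification (assumed)
    # Each hop must receive the full block, verify it, then relay it onward,
    # so the transmit and verify costs are paid once per hop.
    transmit_seconds = block_bytes / bandwidth_bps
    return hops * (transmit_seconds + verify_seconds)

print(propagation_delay(1_000_000))   # 1 MB block over 4 hops  -> 6.0 s
print(propagation_delay(10_000_000))  # 10 MB block over 4 hops -> 42.0 s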

A single hop, trusting enough to start work on the next block, and verifying after the fact would make the "cost" of a larger block negligible.   It is just an idea.  I care less about mining these days, so that is something for major miners to work out.    Even if this network never became "official", I would expect some sort of private high-speed data network to emerge.  It would allow participating miners to gain a small but real competitive advantage over other miners.  Fewer orphans plus the ability to include more tx (and thus higher fees) = more net revenue for miners.
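
A minimal sketch of that trust-now-verify-later idea; the function names and the toy signature check are hypothetical, purely to make the flow concrete:

Code:
import hashlib

TRUSTED_MINERS = {"pool_a", "pool_b"}  # trust earned through prior good work

def signature_valid(header: bytes, signature: str, miner_id: str) -> bool:
    # Stand-in for a real signature check against the miner's published key.
    return signature == "signed-by-" + miner_id

def on_block_announcement(header: bytes, signature: str, miner_id: str) -> None:
    if miner_id in TRUSTED_MINERS and signature_valid(header, signature, miner_id):
        block_hash = hashlib.sha256(hashlib.sha256(header).digest()).hexdigest()
        # Start mining the next block immediately; full verification happens
        # after the fact, and a miner that signed an invalid block would be
        # dropped from TRUSTED_MINERS (losing its earned reputation).
        print("mining on top of", block_hash[:16], "before full verification")
    else:
        print("untrusted announcement: wait for the full block and verify first")

on_block_announcement(b"\x00" * 80, "signed-by-pool_a", "pool_a")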

Quote
What are your thoughts on the last scheme I described?

Any system which relies on trivial input can be gamed.  I (and other merchants) could buy/rent enough hashing power to solve 1% of blocks and fill them with tx containing massive fees (which come right back to us) and inflate the average fee per block.

I would point out that a fixed money supply and a static inflation curve are non-optimal. In theory a central bank should be able to do a better job.  By matching the growth of the money supply to economic growth (or contraction), prices never need to rise or fall (in aggregate).  A can of soup which cost $0.05 in 1905 would cost $0.05 today, at least as far as the inflation component is concerned.  The actual price may still vary for non-inflationary reasons such as improved productivity or true scarcity of resources.

The problem with central banks isn't the theory ... it is the humans.  The models of monetary policy rely on flawed humans making perfect decisions, and that is why they are doomed to failure: flawed humans must choose the benefit of the many (the value of price stability) over the increased benefit of the few (direct profit from manipulating the money supply).  Maybe someday, when we create a utopia, such ideas will work, but until then they will be manipulated for personal benefit.

The value of Bitcoin comes from the inability to manipulate the money supply.  Sure, a fixed money supply and static inflation curve are often non-optimal, but they can't be manipulated, and thus this non-optimal system has the potential to outperform systems which are superior in theory but have the flaw of needing perfect humans to run them.

On edit:
On rereading I noticed you proposed using the median block reward, not the average.  That is harder to manipulate.  It probably is better than a fixed block size, but I would expect that 50 BTC per block isn't necessary at a very large tx volume, so it may result in higher-than-needed fees (although still not as bad as a fixed 1MB).  It is worth considering.  Not sure a consensus for a hard fork can ever be reached, though.
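
A quick illustration of why the median resists the 1%-of-blocks fee-stuffing attack while the average does not (numbers invented for the example):

Code:
from statistics import mean, median

# 99 honest blocks with ~0.25 BTC in fees, plus one attacker block
# stuffed with self-paid fees that come right back to the attacker.
fees = [0.25] * 99 + [500.0]

print(round(mean(fees), 2))  # 5.25 -> the average is inflated ~21x by one block
print(median(fees))          # 0.25 -> the median doesn't move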

Quote
Hmm... this seems problematic. If the transaction volume doesn't grow sufficiently, this could kill fees. But if the transaction volume grows too much, fees will become exorbitant. If we accept that the max block size needs to change, I believe it should be done in a way that decreases scarcity in response to a rise in average transaction fees.

Likewise, a fixed subsidy reduction schedule is non-optimal.  What if tx fees don't cover the drop in subsidy value in 2016?  Network security will be reduced.  Should we also make the money supply adjustable? Smiley  (Kidding, but I hope it illustrates the point.)

TL;DR: Fixed but non-optimal vs adjustable but manipulable? Your choice.
The grue lurks in the darkest places of the earth. Its favorite diet is adventurers, but its insatiable appetite is tempered by its fear of light. No grue has ever been seen by the light of day, and few have survived its fearsome jaws to tell the tale.
jl2012 (Legendary)
Activity: 1792, Merit: 1092
February 06, 2013, 04:09:11 PM  #122

Quote
Also, requiring a total reward of 50 BTC means requiring 25 BTC in fees NOW. As the typical total tx fee in a block is about 0.25 BTC, fees would have to increase by 100x, and obviously this would kill the system.

Quote
How and why would the system be "killed"? The max block size would simply not increase.

So you propose that the size will only increase but never decrease? Anyway, the major problem is the choice of the total reward amount, because changes in purchasing power are unpredictable.
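
For clarity, the arithmetic behind that 100x figure (the 0.25 BTC typical fee total is the poster's estimate):

Code:
subsidy = 25.0        # block subsidy in BTC after the 2012 halving
target_total = 50.0   # proposed required total reward per block
typical_fees = 0.25   # poster's estimate of current total fees per block

required_fees = target_total - subsidy  # 25.0 BTC of fees needed per block
print(required_fees / typical_fees)     # 100.0 -> the "100x" increase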

Donation address: 374iXxS4BuqFHsEwwxUuH3nvJ69Y7Hqur3 (Bitcoin ONLY)
LRDGENPLYrcTRssGoZrsCT1hngaH3BVkM4 (LTC)
PGP: D3CC 1772 8600 5BB8 FF67 3294 C524 2A1A B393 6517
misterbigg (Legendary)
Activity: 1064, Merit: 1001
February 06, 2013, 04:22:53 PM  #123

Quote
Any system which relies on trivial input can be easily gamed.  I (and other merchants) could buy/rent enough hashing power to solve 1% of blocks and fill them with massive fees (which come right back to us) and inflate the average fee per block.

Hmm... This was a problem in my first idea, but I fixed it for the last two. Do you see an exploitable problem with the most recent proposal?

Quote
I would point out that a fixed money supply and static inflation curve is non-optimal. In theory a central bank should be able to do a better job.  By matching the growth of the money supply to economic growth (or contraction) prices never need to rise or fall (in aggregate).

I disagree. There's nothing particularly attractive about fixed prices. Especially troublesome is when growth in the money supply is politically directed (versus doled out through proof of work). But this heads us in the direction of the "deflationary currency" debate, so I'll stop here.

Quote
...you propose that the size will only increase but never decrease? Anyway, the major problem is the choice of the total reward amount, because changes in purchasing power are unpredictable.

Yes, size would only increase. If we allowed the size to decrease, it could cause fees to skyrocket. I proposed a new scheme that does not depend on the total reward amount; I believe it addresses your concerns.
mrvision (Sr. Member)
Activity: 527, Merit: 250
February 06, 2013, 06:10:05 PM  #124

Quote
I think that if we reach the 1mb limit and don't upgrade with a solution, then the spontaneous order will create fiat currencies backed with bitcoins, in order to reduce the amount of transactions in the bitcoin network.

Quote
I'm not so sure this is a bad thing. These ad-hoc "fiat" currencies may be created with unique properties that make them better suited to the task at hand than Bitcoin. For example, a private payment network that provides instant confirmation and requires no mining (relying on trust in a central authority).


It is a bad thing because even though you may know the amount of deposits they have (since you can audit the blockchain), you don't actually know the number of notes they will have issued, so eventually this will drive us back to a fractional-reserve system.

That's what I think.
misterbigg (Legendary)
Activity: 1064, Merit: 1001
February 06, 2013, 06:14:01 PM  #125

Quote
eventually this will drive us back to a fractional-reserve system.

There's nothing wrong with a fractional reserve system, just look at systems of self-issued credit (like Ripple). The problem is with legal tender laws that force you to use a particular debt instrument. With voluntary exchange, competition between systems would keep them honest.
mrvision (Sr. Member)
Activity: 527, Merit: 250
February 06, 2013, 06:20:33 PM  #126

Quote
eventually this will drive us back to a fractional-reserve system.

Quote
There's nothing wrong with a fractional reserve system, just look at systems of self-issued credit (like Ripple). The problem is with legal tender laws that force you to use a particular debt instrument. With voluntary exchange, competition between systems would keep them honest.


Well, indeed there is a problem if the only option you have is to use a fractional-reserve system because of the hard limit of 1mb. Especially if the 'bank' cannot give you back your bitcoins because everybody else has withdrawn theirs for whatever reason Cheesy
solex (Legendary)
Activity: 1078, Merit: 1002
100 satoshis -> ISO code
February 06, 2013, 08:31:22 PM (last edit: February 07, 2013, 05:39:34 AM)  #127


Quote
Well, indeed there is a problem if the only option you have is to use a fractional-reserve system because of the hard limit of 1mb. Especially if the 'bank' cannot give you back your bitcoins because everybody else has withdrawn theirs for whatever reason Cheesy


Agreed 100%

If Bitcoin cripples itself at such an early stage then the central bankers will be laughing as they quaff bourbon in their gentlemen's clubs. Bitcoin is a very disruptive technology which has a huge first-mover advantage, potentially returning a lot of power from governments and TBTF banks to the people. If this is thrown away then expect the next major cryptocurrency to be FedCoin or ECBcoin or even IMFcoin which will be designed to integrate somehow with the existing fiat systems. (Oh. And expect attempts to ban community-based alternatives when "official" ones are up and running.)

SimonL (Member)
Activity: 113, Merit: 11
February 09, 2013, 04:00:39 PM  #128

I haven't read the entire thread so this may have been covered. I've been stewing over this problem for a while and would just like to think aloud here....

I very much think the block size should be network-regulated, much as difficulty regulates the propagation window based on the computation cycles spent finding hashes for a particular difficulty target. To clarify: when I say CPU, I mean CPUs, GPUs, and ASICs collectively.

Difficulty is very much focused on the network's collective CPU cycles to control propagation windows (1 block every 10 mins), avoid 51% attacks, and distribute new coins.

However, max_blocksize is not related to the computing resources needed to validate transactions and propagate blocks regularly; it is geared much more to network speed, the storage capacity of miners (and even of non-mining full nodes), and the verification of transactions (which, as I understand it, means hammering the disk). What we need to determine is whether the nodes supporting the network can quickly and easily propagate blocks without this affecting the propagation window.

Interestingly there is a connection between CPU resources, the calculation of the propagation window with difficulty targets, and network propagation health. If we have no max_blocksize limit in place, it leaves the network open to a special type of manipulation of the difficulty.

The propagation window can be manipulated in two ways as I see it. One is creating more blocks, as we classically know: throw more CPUs at block creation and more blocks get produced; more computation power = more blocks. The difficulty ensures the propagation window doesn't get manipulated this way: it is measured from the timestamps in blocks to determine whether more or fewer blocks were created in a certain period, and whether difficulty goes up or down. All taken care of.

The propagation window could also be manipulated in a more subtle way: by transmitting large blocks (huge blocks, in fact). Large blocks take longer to transmit, longer to verify, and longer to write to disk, and this slowdown in block production is unlikely to be noticed until a monster block gets pushed across the network (in a situation where there is no limit on block size, that is). Now, because there is only a 10-minute window, the block can't take longer than that to propagate, I'm guessing. If it does, difficulty will sink and we have a whole new problem: manipulation of the difficulty through massive blocks. Massive blocks could mess with difficulty and push out smaller miners, causing all sorts of undesirable centralisations. In short, it would probably destroy the Bitcoin network.
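
For reference, here is the retarget rule this manipulation would interact with; the 2016-block window, 600-second target spacing, and 4x clamp are Bitcoin's actual parameters, while the scenario numbers are invented:

Code:
def retarget(old_difficulty, actual_seconds, blocks=2016, target_spacing=600):
    # Scale difficulty by how fast the last 2016 blocks arrived versus the
    # two-week target, clamped to a factor of 4 in either direction.
    target_seconds = blocks * target_spacing
    ratio = max(0.25, min(4.0, target_seconds / actual_seconds))
    return old_difficulty * ratio

# If oversized blocks slowed propagation so the period took 25% longer,
# difficulty would drop 20% even though hash power never changed:
print(retarget(1000.0, 2016 * 600 * 1.25))  # -> 800.0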

So we need a maximum block size that is high enough that the vast majority of nodes are comfortable with it, yet not so big that it can be used to manipulate the difficulty by artificially slowing propagation across the network with massive blocks. With the propagation window being maintained through its difficulty, we may be able to determine whether block propagation is slowing and whether max_blocksize should be adjusted down to keep the propagation window stable.

Because the difficulty can potentially be manipulated this way, we may have a means of knowing what the Bitcoin network is comfortable propagating. It could be determined thus:

If the median size of the blocks transmitted in the last difficulty period is bumping up against max_blocksize (median being chosen to avoid situations where one malicious entity, or several, tries to arbitrarily push up the limit), and the difficulty is "stable", increase max_blocksize for the next difficulty period, say by 10% (with "bumping up against" meaning the median is within 20% of max_blocksize). But if the median size of blocks for the last period is much lower, say less than half the current limit, then lower the size by 20% instead.

However, if the median size of the blocks transmitted in the last difficulty period is bumping up against max_blocksize and the difficulty is NOT stable, don't increase max_blocksize, since the network may not currently be healthy and increasing or decreasing the limit would be a bad idea. Alternatively, in those situations, lower max_blocksize by 10% for the next difficulty period anyway (not sure if this is a good idea or not, though).

In either case, the 1mb max_blocksize should be the lowest the block size can go if it continued to shrink.

Checking the stability of the last difficulty period against the next one is what determines whether the network is spitting out blocks at a regular rate. If the median size of blocks transmitted in the last difficulty period is bumping up against the limit and difficulty is going down, it could mean a significant number of nodes can't keep up; if the difficulty needs to move down, blocks aren't getting to all the nodes in time and hashing capacity is getting cut off because nodes are too busy verifying the blocks they received. If the difficulty is going up while the median block size bumps against the limit, then there's a strong indication that all nodes are processing the blocks they receive easily, so raising max_blocksize a little should be OK. The one thing I'm not sure of, though, is determining whether the difficulty is "stable" or not; I'm very much open to suggestions on the best way of doing that. The argument that what is deemed "stable" is arbitrary, and could still allow manipulation of max_blocksize over a longer and more sustained period, seems possible too, so I'm not entirely sure this approach can be made foolproof. How does the calculation of difficulty targets take these things into consideration? A sketch of the rule follows below.
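
Pulling the rule above together as a sketch; the 10%/20% step sizes and the 20%/50% bands come from the proposal above, while the stability test stays the open question it is:

Code:
MIN_LIMIT = 1_000_000  # the 1mb floor

def next_max_blocksize(limit, median_block_size, difficulty_stable):
    # One adjustment per difficulty period, following the rule sketched above.
    near_limit = median_block_size >= 0.8 * limit  # median within 20% of the cap
    if near_limit and difficulty_stable:
        limit = int(limit * 1.10)   # grow 10% when full and healthy
    elif median_block_size < 0.5 * limit:
        limit = int(limit * 0.80)   # shrink 20% when blocks run well under half
    # near the cap but difficulty unstable: hold (or optionally shrink 10%)
    return max(limit, MIN_LIMIT)

print(next_max_blocksize(1_000_000, 900_000, True))  # -> 1100000
print(next_max_blocksize(2_000_000, 600_000, True))  # -> 1600000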

OK, guys, tear it apart. Wink
Ari (Member)
Activity: 75, Merit: 10
February 10, 2013, 02:38:33 PM  #129

We will eventually hit the limit, and the limit will be raised.  People running old versions will get disconnected from the network when that happens.  The only question is how close we will come to a 50% network split.  It would be good to reach some consensus well in advance, so that this is minimally disruptive.
notme (Legendary)
Activity: 1904, Merit: 1002
February 10, 2013, 09:07:09 PM  #130

Quote
We will eventually hit the limit, and the limit will be raised.  People running old versions will reject the new block chain and continue on the old, causing confusion as to which chain is the official "bitcoin" chain when that happens.  The only question is how close we will come to a 50% network split.  It would be good to reach some consensus well in advance, so that this is minimally disruptive.

FTFY

https://www.bitcoin.org/bitcoin.pdf
While no idea is perfect, some ideas are useful.
Ari (Member)
Activity: 75, Merit: 10
February 10, 2013, 09:42:06 PM  #131

Yeah.  If there's a major split, the <1MB blockchain will probably continue for a while.  It will just get slower and slower, with transactions not confirming.  It would be better if there were a clear upgrade path so we don't end up with a lot of people in that situation.
notme (Legendary)
Activity: 1904, Merit: 1002
February 10, 2013, 09:44:48 PM  #132

Quote
Yeah.  If there's a major split, the <1MB blockchain will probably continue for a while.  It will just get slower and slower, with transactions not confirming.  It would be better if there were a clear upgrade path so we don't end up with a lot of people in that situation.

You are assuming miners want to switch.  They have a very strong incentive to keep the limit in place (higher fees, lower storage costs).

https://www.bitcoin.org/bitcoin.pdf
While no idea is perfect, some ideas are useful.
Anon136 (Legendary)
Activity: 1722, Merit: 1217
February 10, 2013, 10:08:37 PM (last edit: February 10, 2013, 10:27:55 PM)  #133

Quote
Opinions differ on the subject. The text on the Wiki largely reflects Mike Hearn's views.

Here are my views:

Without a sharp constraint on the maximum blocksize there is currently _no_ rational reason to believe that Bitcoin would be secure at all once the subsidy goes down.

Bitcoin is valuable because of scarcity. One of the important scarcities is the limited supply of coins, another is the limited supply of block-space: Limited blockspace creates a market for transaction fees, the fees fund the mining needed to make the chain robust against hostile reorganization.  I have not yet seen any suggestion as to how Bitcoin is long term viable without this except ones that argue for cartel or regulatory behavior (both of which I don't consider viable: they moot the decentralized purpose of Bitcoin).

Even going beyond fee funding (as Dan Kaminsky argued so succinctly), with gigabyte blocks bitcoin would not be functionally decentralized in any meaningful way: only a small self-selecting group of some thousands of major banks would have the means and the motive to participate in validation (much less mining), just as some thousands of major banks are the primary drivers of the USD and other major world currencies. An argument that Bitcoin can simply scale directly like that is an argument that the whole decentralization thing is a pretext; some have argued that it's evidence that bitcoin is just destined to become another centralized currency (with some "bonus" wealth redistribution in the process, which they suggest is the real motive, and that the decentralization is a cynical lie).

Obviously decentralization can be preserved at increased scale with technical improvements, and those should be done. But if decentralization doesn't come first, I think we would lose what makes Bitcoin valuable and special... and I think that would be sad. (Though, to be frank, Bitcoin becoming a worldwide centrally controlled currency could quite possibly be the most profitable outcome for me; but I would prefer to profit by seeing the world be a diverse place with many good and personally liberating choices available to people.)

Perhaps the proper maximum size isn't 1MB but some other value which is also modest and still preserves decentralization. I don't have much of an opinion beyond the fact that there is some number of years in the future at which, say, 10MB will be no worse than 1MB today. It's often repeated that Satoshi intended to remove "the limit", but I always understood that to mean the 500k maximum-generation soft limit... quite possibly I misunderstood, but I don't understand why it would be a hardforking protocol rule otherwise. (And why the redundant soft limit? Why not make it a rule for which blocks get extended when mining, instead of a protocol rule? ... And if that protocol rule didn't exist, I would never have become convinced that Bitcoin could survive... so where are the answers to long-term survival?)

(In any case, the worst thing that can possibly happen to a distributed consensus system is that it fails to achieve consensus. A substantial, persistently forked network is the worst possible failure mode for Bitcoin: spend all your own coins twice!  No hardfork can be tolerated that wouldn't result in a thoroughly dominant chain with near-certain probability.)

But before we can even have a discussion about increasing it, I think there must be evidence that the transaction load has gone over the optimum level for creating a market for fees (e.g. we should already be at some multiple of saturation and still see difficulty increasing, or at least holding steady).  This would also have the benefit of further incentivizing external fast payment networks, which I think must exist before any blockchain increase: it would be unwise to argue that an increase is an urgent emergency because we've painted ourselves into a corner by using the system stupidly and not investing in the infrastructure to use it well.

Quote
You can get around it (NAT), and you can fix it (IPv6) but the former is annoying and the latter is taking forever

It's not really analogous at all.  Bitcoin has substantial limits that cannot be fixed within the architecture, unrelated to the artificial* block-size cap. The blockchain is a worldwide broadcast medium and will always scale poorly (even if rocket boosters can be strapped to that pig); the consensus it provides takes time to converge with high probability. You can't have instant confirmations, you can't have reversals for anti-fraud (even when the parties all desire and consent to it), and the privacy is quite weak owing to the purely public nature of all transactions.

(*artificial doesn't mean bad, unless you think that the finite supply of coin or the limitations on counterfeiting, or all of the other explicit rules of the system are also bad...)

It's important to distinguish Bitcoin the currency from Bitcoin the payment network.  The currency is worthwhile because of the highly trustworthy extreme decentralization, which we only know how to create through a highly distributed and decentralized public blockchain.  But the properties of the blockchain that make it a good basis for an ultimately trustworthy worldwide currency do _not_ make it a good payment network.  Bitcoin is only as much of a payment network as it must be in order to be a currency and to integrate other payment networks.

Or, by analogy: gold may be a good store of value, but it's a cruddy payment system (especially online!).  Bitcoin is a better store of value, for one reason because it can better integrate good payment systems.

See retep's post on fidelity-bonded chaum token banks for my personal current favorite way to produce infinitely scalable trustworthy payment networks denominated in Bitcoin.

Cheers,

Thanks gmaxwell. I'm not technically minded and I was wondering the whole time why no one brought this up. Once bitcoin becomes popular enough to use the whole 1mb on 100% serious transactions, higher demand will simply lead to a need to pay higher transaction fees in order to get your transaction processed. If these fees become high enough to be prohibitive, then centrally managed servers can be used for small transactions that will be recorded on their records but not on the blockchain.

Rep Thread: https://bitcointalk.org/index.php?topic=381041
If one can not confer upon another a right which he does not himself first possess, by what means does the state derive the right to engage in behaviors from which the public is prohibited?
d'aniel (Sr. Member)
Activity: 461, Merit: 251
February 10, 2013, 10:14:47 PM  #134

Quote
Yeah.  If there's a major split, the <1MB blockchain will probably continue for a while.  It will just get slower and slower, with transactions not confirming.  It would be better if there were a clear upgrade path so we don't end up with a lot of people in that situation.

Quote
You are assuming miners want to switch.  They have a very strong incentive to keep the limit in place (higher fees, lower storage costs).
Sometimes more customers paying less results in higher profit.  Miners will surely have an incentive to lower their artificially high prices to accommodate new customers, instead of having them all go to much cheaper off-blockchain competitors.
markm (Legendary)
Activity: 2940, Merit: 1090
February 10, 2013, 11:49:41 PM  #135

Don't forget merged mining. Smaller transactions could use one of the merged-mined blockchains; there are several such blockchains already. This kind of pressure might just cause one or more of them to increase in popularity, and miners would still reap the fees.

-MarkM-

Browser-launched Crossfire client now online (select CrossCiv server for Galactic  Milieu)
Free website hosting with PHP, MySQL etc: http://hosting.knotwork.com/
deepceleron (Legendary)
Activity: 1512, Merit: 1025
February 11, 2013, 12:10:09 AM  #136

I'm just going to leave this here:

http://en.wikipedia.org/wiki/Parkinson%27s_law_of_triviality
notme (Legendary)
Activity: 1904, Merit: 1002
February 11, 2013, 06:04:39 AM  #137


 Grin

https://www.bitcoin.org/bitcoin.pdf
While no idea is perfect, some ideas are useful.
fornit (Hero Member)
Activity: 991, Merit: 1008
February 11, 2013, 02:37:12 PM  #138

Very much depends on what you consider trivial. Little more than two years ago, all copies of the blockchain all around the world would have fit on a single hard drive. Right now, if every client were a full node, we would need something around a thousand hard drives.

In a few years we might easily end up needing a few million or even a billion hard drives, assuming everyone runs a full node. So imho how this issue is handled directly determines how many full nodes we will have long term. I have no idea what share of full nodes is acceptable, 1%? 0.1%? But I am pretty sure 0.0000001% won't cut it.
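
Back-of-envelope numbers behind that worry (the chain size and drive size are rough assumptions for early 2013, and the chain only grows from here):

Code:
chain_gb = 5      # approximate blockchain size in early 2013 (assumed)
drive_gb = 2000   # a common consumer hard drive (assumed)

def drives_needed(full_nodes):
    return full_nodes * chain_gb / drive_gb

print(drives_needed(100_000))        # 100k full nodes   -> 250 drives
print(drives_needed(1_000_000_000))  # a billion users   -> 2.5 million drives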
mrvision (Sr. Member)
Activity: 527, Merit: 250
February 11, 2013, 02:43:55 PM  #139

Why don't we set up a 1kb limit? That way the miners will earn a lot more from fees Smiley [/ironic]

Oh yes! Because that way people would start making trades off the chain, and miners wouldn't get those 'big fees' they are coveting. There's an optimum where miners have the biggest earnings and don't push people toward using some other coin (physical or virtual) to trade.

I am all for "let-the-market-decide" elastic algorithms.

Right now I like this.

I think 1mb should be the smallest limit, but maybe I want to accept 4mb of transactions, for whatever reason, and earn a lot more from fees. Maybe I've got a rig of ASICs and I can process a lot more MB in 10 minutes instead of pushing up the difficulty. So maybe a GPU miner can mine 1mb blocks, and an ASIC miner can mine 20mb blocks for the same difficulty, having then the same odds of solving the problem.

I am just thinking aloud Cheesy
misterbigg (Legendary)
Activity: 1064, Merit: 1001
February 11, 2013, 04:47:49 PM  #140

Quote
Maybe I've got a rig of ASICs and I can process a lot more MB in 10 minutes instead of pushing up the difficulty. So maybe a GPU miner can mine 1mb blocks, and an ASIC miner can mine 20mb blocks for the same difficulty

The time required to mine a block is independent of the size of the block.
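
That's because the proof-of-work grinds only the fixed 80-byte block header; the transactions enter through a single 32-byte merkle root, so a 20mb block costs no more to hash on than a 1mb one. A minimal illustration (the field values here are made up):

Code:
import hashlib, struct

def header_pow_hash(version, prev_hash, merkle_root, timestamp, bits, nonce):
    # Bitcoin proof-of-work double-SHA256es this 80-byte header, never the
    # transactions themselves, so block size doesn't change the hashing cost.
    header = struct.pack("<I32s32sIII", version, prev_hash, merkle_root,
                         timestamp, bits, nonce)
    assert len(header) == 80
    return hashlib.sha256(hashlib.sha256(header).digest()).digest()

h = header_pow_hash(2, b"\x00" * 32, b"\x11" * 32, 1360000000, 0x1a44b9f2, 0)
print(h[::-1].hex())  # shown big-endian, the way block explorers display hashes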
