h2odysee (OP)
|
|
August 20, 2013, 05:40:48 PM |
|
Having said that, when a switch happens, the pool should still accept shares for the old chain.
What would happen here?
- miner finds an LTC block
- pool switches to some other chain
- miner submits the LTC block to the pool
Would that count as a rejected block?
Yeah, unfortunately that block will be rejected.
|
|
|
|
Eastwind
|
|
August 20, 2013, 05:43:36 PM Last edit: August 20, 2013, 06:01:08 PM by Eastwind |
|
H2O, what is today's mining rate?
|
|
|
|
Damnsammit
|
|
August 20, 2013, 05:49:01 PM |
|
No. This "loss/waste of hashpower" happens both when we change coins AND when a block changes. In those 5-10 minutes, we might burn through 4-5 blocks, or we might get 1 block, or none, or whatever, depending on the coin.
Okay, I think I see what you are saying, but if you average a share every minute and you are on a coin with a 15s average block time, and the pool mines that coin for 5 minutes, then the difficulty doesn't matter. Over a 5 minute duration:
512 difficulty - 1 minute/avg - 5 shares
256 difficulty - 30 seconds/avg - 10 shares (each worth half of a 512-difficulty share in payout value)
Whether your average share time is greater than the average block time doesn't matter. Both start at zero and have a chance to hit with every second that ticks. Just because the clock ticked 59 times and you didn't get a share doesn't mean the 60th tick has any higher chance to hit than the 1st. Edit: This sounds a lot like the Gambler's Fallacy.
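Damnsammit's 5-minute arithmetic can be checked directly. A minimal sketch, with the share rates and values taken as assumed round numbers from the post:

```python
# Assumed numbers from the post: over a 5-minute window, 512 diff gives
# 1 share/min worth 1.0 each; 256 diff gives 2 shares/min worth 0.5 each.
payout_512 = 5 * 1.0    # 5 shares at full value
payout_256 = 10 * 0.5   # 10 shares at half value

print(payout_512, payout_256)  # equal either way, if shares scale cleanly
```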
|
|
|
|
Liquidfire
Newbie
Offline
Activity: 28
Merit: 0
|
|
August 20, 2013, 05:49:07 PM |
|
The perfect analogy to this problem is cluster size for a file system.
Basically it is a minimum amount of space that must be allocated for data. So if your cluster size is 10 and you want to write a file of size 1, you waste 9. If your cluster size is 10 and you write a file of size 15, you use two clusters: one completely full and one wasting 5.
Check out this explanation from Microsoft on cluster size:
All file systems that are used by Windows organize your hard disk based on cluster size (also known as allocation unit size). Cluster size represents the smallest amount of disk space that can be used to hold a file. When file sizes do not come out to an even multiple of the cluster size, additional space must be used to hold the file (up to the next multiple of the cluster size). On the typical hard disk partition, the average amount of space that is lost in this manner can be calculated by using the equation (cluster size)/2 * (number of files).
It's the same formula I ended up with when figuring out block-change waste vs. hashrate.
(cluster size) / 2 * (number of files) = wasted space
(average time for a miner to find a share) / (2 * average block find time) = fraction of LOST profit/hashpower
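Liquidfire's formula can be sketched as a one-liner. This is his "halfway through a share on average" model, with illustrative timings assumed here, not measured pool figures:

```python
def wasted_fraction(avg_share_time, avg_block_time):
    # Liquidfire's estimate: on average a miner is halfway through a
    # share when the block changes, so each block discards about half a
    # share's worth of work.
    return (avg_share_time / 2) / avg_block_time

# Assumed numbers for illustration:
print(wasted_fraction(15, 300))  # 15 s shares on a 5-minute coin -> 0.025 (2.5%)
print(wasted_fraction(60, 15))   # 60 s shares on a 15 s coin -> 2.0 (the model breaks down)
```

Note the second case: when the share time far exceeds the block time, the formula exceeds 100%, which is one hint at the memorylessness objection raised later in the thread.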
|
|
|
|
Cryptos2go
Member
Offline
Activity: 81
Merit: 10
|
|
August 20, 2013, 05:50:18 PM |
|
It's going to be difficult to get many shares at all, if any, when the block times get too fast, regardless of your difficulty settings. I have seen blocks as fast as 3 seconds, and this is only part of the problem. This screen was posted by Ixion in the DGC thread here: https://bitcointalk.org/index.php?topic=209508.3680
This is what can and does happen when fast, low-diff coins get overworked with way too much hashing power. I think most would agree that in these cases less is more. H20 has mentioned there would be increased profits IF he can find a way to distribute the pool's hash more efficiently in these situations.
Splitting the pool 50/50 could be a solution, but even that would probably be difficult to implement effectively: sometimes both pools could/should be on the same coin (LTC/FTC/NVC), but other times they would need to be on separate coins. Then you have the possible issue of combining the profits from the 2 pools into 1 distributed payout as if they were 1 pool. Not to mention a host of other potential issues that I don't even understand.
H20 is already actively involved with managing the coins and exchanges in the stable, which the pool requires in order to maintain his/our profitability, as the newer coins can often be among the most profitable for a time. Obviously we all want the same thing here, but there are probably limits to what H20 can accomplish without this becoming a full-time job, and even then there will still be limits for whatever reasons.
|
|
|
|
Liquidfire
Newbie
Offline
Activity: 28
Merit: 0
|
|
August 20, 2013, 05:51:33 PM Last edit: August 20, 2013, 06:02:23 PM by Liquidfire |
|
No. This "loss/waste of hashpower" happens both when we change coins AND when a block changes. In those 5-10 minutes, we might burn through 4-5 blocks, or we might get 1 block, or none, or whatever, depending on the coin.
Okay, I think I see what you are saying, but if you average a share every minute and you are on a coin with a 15s average block time, and the pool mines that coin for 5 minutes, then the difficulty doesn't matter. Over a 5 minute duration:
512 difficulty - 1 minute/avg - 5 shares
256 difficulty - 30 seconds/avg - 10 shares (each worth half of a 512-difficulty share in payout value)
Whether your average share time is greater than the average block time doesn't matter. Both start at zero and have a chance to hit with every second that ticks. Just because the clock ticked 59 times and you didn't get a share doesn't mean the 60th tick has any higher chance to hit than the 1st.
No, when the block changes you lose all progress on that block that didn't get included in a share. That's where the waste comes in. You are halfway through a share when the block is found, and that "half" share just gets thrown out. Say you cut the diff in half. Guess what? That half share that got thrown out? That's ONE share now, and it gets submitted, and you get credit for it.
|
|
|
|
ranlo
Legendary
Offline
Activity: 1988
Merit: 1007
|
|
August 20, 2013, 06:01:31 PM |
|
No. This "loss/waste of hashpower" happens both when we change coins AND when a block changes. In those 5-10 minutes, we might burn through 4-5 blocks, or we might get 1 block, or none, or whatever, depending on the coin.
Okay, I think I see what you are saying, but if you average a share every minute and you are on a coin with a 15s average block time, and the pool mines that coin for 5 minutes, then the difficulty doesn't matter. Over a 5 minute duration:
512 difficulty - 1 minute/avg - 5 shares
256 difficulty - 30 seconds/avg - 10 shares (each worth half of a 512-difficulty share in payout value)
Whether your average share time is greater than the average block time doesn't matter. Both start at zero and have a chance to hit with every second that ticks. Just because the clock ticked 59 times and you didn't get a share doesn't mean the 60th tick has any higher chance to hit than the 1st.
No, when the block changes you lose all progress on that block that didn't get included in a share. That's where the waste comes in. You are halfway through a share when the block is found, and that "half" share just gets thrown out. Say you cut the diff in half. Guess what? That half share that got thrown out? That's ONE share now, and it gets submitted, and you get credit for it.
This. I don't see why people keep trying to say difficulty doesn't matter. If it didn't, why is there even a difficulty to begin with? Why is there VarDiff? If all of this were really irrelevant, there would be no need for any of it.
|
|
|
|
Liquidfire
Newbie
Offline
Activity: 28
Merit: 0
|
|
August 20, 2013, 06:03:18 PM |
|
Whether your average share time is greater than the average block time doesn't matter. They both start at zero and have a chance to hit with every second that ticks. Just because the clock ticked 59 times and you didn't get a share, it does not mean that the 60th time has any higher chance to hit than the 1st.
If you couldn't trust averages, Vegas would still be a barren desert. The casinos know that, statistically speaking, their expected return is 1.01, or 1.05, or 1.10, or whatever on a particular game. Yes, it's mathematically possible for every patron to win every bet and bankrupt the casino, but that doesn't happen. Small variations even out over time, and the casinos end up with virtually exactly what the statistics said they would.
|
|
|
|
TierNolan
Legendary
Offline
Activity: 1232
Merit: 1104
|
|
August 20, 2013, 06:09:22 PM |
|
Forgot the /2 part, because you'll be at a random point of progress on the share each time a block changes, averaging to half complete
This is the point of misunderstanding. There is no progress. Each hash is an independent random trial. If you only do one hash per block, that hash has the same chance as a hash done by the miner with the strongest rig. He will do many more hashes, so he will have higher odds overall, but each hash, done by anyone, is equal. GPUs and (probably) ASICs have a "scan time": they can't do just one hash, they have to do lots of hashes all at once, and the scan time is how long an entire run takes. The lost hashing power is something like (network latency + scan time/2) / (block time).
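TierNolan's estimate is easy to evaluate. A minimal sketch, where the latency, scan time, and block time are assumed illustrative values rather than measured ones:

```python
def lost_fraction(network_latency, scan_time, block_time):
    # Work done between a block changing and the miner restarting on the
    # new block is wasted: the latency, plus on average half of the scan
    # run that was in flight when the block changed.
    return (network_latency + scan_time / 2) / block_time

# Assumed numbers: 0.5 s latency, 1 s GPU scan time, 30 s blocks.
print(lost_fraction(0.5, 1.0, 30.0))  # ~0.033, i.e. about 3.3% wasted
```

Unlike the share-based formula earlier in the thread, this one depends only on hardware and network characteristics, not on the pool's share difficulty.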
|
1LxbG5cKXzTwZg9mjL3gaRE835uNQEteWF
|
|
|
Damnsammit
|
|
August 20, 2013, 06:10:12 PM |
|
No, when the block changes you lose all progress on that block that didn't get included in a share. That's where the waste comes in. You are halfway through a share when the block is found, and that "half" share just gets thrown out. Say you cut the diff in half. Guess what? That half share that got thrown out? That's ONE share now, and it gets submitted, and you get credit for it.
But that's not the way it works. You don't build a share like you build a house. You don't pour the foundation in the first second and then start building the frame in the next. By the 30th second you don't have an almost-finished house that just needs some brickwork and plumbing, only for a big block to come down, destroy it, and wash you back to square one. From the first second your GPU is computing thousands of hashes, trying to get one share to submit. Nothing is lost when the block changes... the time since the last block is just reset. At that point you have exactly the same chance in the next second that you have in every other second, with difficulty (coin) constant. If you cut the difficulty in half, then over the span of a few days my hypothesis is that people would see their number of shares double, each share would be worth half, and everyone's profits would be relatively unchanged.
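Damnsammit's hypothesis can be simulated by treating each hash as an independent trial. A sketch with an assumed per-hash share probability (real share targets are vastly smaller; the scale is compressed so it runs quickly):

```python
import random

def count_shares(p_share, n_hashes, rng):
    # Each hash is an independent trial that meets the share target
    # with probability p_share.
    return sum(1 for _ in range(n_hashes) if rng.random() < p_share)

rng = random.Random(42)
n_hashes = 200_000
p = 1 / 1000                                   # assumed per-hash share probability

full_diff = count_shares(p, n_hashes, rng)     # each share worth 1.0
half_diff = count_shares(2 * p, n_hashes, rng) # each share worth 0.5

print(full_diff, half_diff)                    # roughly 200 and 400 shares
print(full_diff * 1.0, half_diff * 0.5)        # payouts come out about equal
```

Halving the difficulty roughly doubles the share count, but since each share is worth half as much, total credited work stays about the same up to statistical noise.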
|
|
|
|
Liquidfire
Newbie
Offline
Activity: 28
Merit: 0
|
|
August 20, 2013, 06:14:53 PM |
|
Forgot the /2 part, because you'll be at a random point of progress on the share each time a block changes, averaging to half complete
This is the point of misunderstanding. There is no progress. Each hash is an independent random trial. If you only do one hash per block, that hash has the same chance as a hash done by the miner with the strongest rig. He will do many more hashes, so he will have higher odds overall, but each hash, done by anyone, is equal. GPUs and (probably) ASICs have a "scan time": they can't do just one hash, they have to do lots of hashes all at once, and the scan time is how long an entire run takes. The lost hashing power is something like (network latency + scan time/2) / (block time).
You are correct; I spoke in an oversimplification, but the effect is the same. Here is the longer (more correct) version, with the same result in the end: forget the /2 part, because you'll randomly complete shares at a rate that statistically evens out to half the time less than the average and half the time more than the average. Better?
|
|
|
|
Liquidfire
Newbie
Offline
Activity: 28
Merit: 0
|
|
August 20, 2013, 06:17:34 PM |
|
No, when the block changes you lose all progress on that block that didn't get included in a share. That's where the waste comes in. You are halfway through a share when the block is found, and that "half" share just gets thrown out. Say you cut the diff in half. Guess what? That half share that got thrown out? That's ONE share now, and it gets submitted, and you get credit for it.
But that's not the way it works. You don't build a share like you build a house. You don't pour the foundation in the first second and then start building the frame in the next. By the 30th second you don't have an almost-finished house that just needs some brickwork and plumbing, only for a big block to come down, destroy it, and wash you back to square one. From the first second your GPU is computing thousands of hashes, trying to get one share to submit. Nothing is lost when the block changes... the time since the last block is just reset. At that point you have exactly the same chance in the next second that you have in every other second, with difficulty (coin) constant. If you cut the difficulty in half, then over the span of a few days my hypothesis is that people would see their number of shares double, each share would be worth half, and everyone's profits would be relatively unchanged.
As I said before, I acknowledge I am speaking in simplifications, but only because the answer is statistically the same as if it did work that way. If 50% of the time my partial brick gets washed away, versus 50% of the time I either put a full brick on or none at all, the results are statistically the same.
|
|
|
|
ranlo
Legendary
Offline
Activity: 1988
Merit: 1007
|
|
August 20, 2013, 06:19:45 PM |
|
Forgot the /2 part, because you'll be at a random point of progress on the share each time a block changes, averaging to half complete
This is the point of misunderstanding. There is no progress. Each hash is an independent random trial. If you only do one hash per block, that hash has the same chance as a hash done by the miner with the strongest rig. He will do many more hashes, so he will have higher odds overall, but each hash, done by anyone, is equal. GPUs and (probably) ASICs have a "scan time": they can't do just one hash, they have to do lots of hashes all at once, and the scan time is how long an entire run takes. The lost hashing power is something like (network latency + scan time/2) / (block time).
There technically IS progress. If you flip a quarter 50 times, you can expect to get heads 25 times and tails 25 times. If you got heads 49 times, you can reasonably expect that over the next hundreds/thousands/millions of flips you will end up getting more tails than heads, because statistically they should be about equal. Same thing here. If you had a lot of 10s blocks when the average should be 30s, you can reasonably expect an equal number of 50s blocks. Otherwise the AVERAGE is lower than the average; you can't possibly have a lower average than the average. That's like saying 2=6. It's just not mathematically possible.
|
|
|
|
Damnsammit
|
|
August 20, 2013, 06:20:55 PM |
|
If you couldn't trust averages, Vegas would still be a barren desert. The casinos know that, statistically speaking, their expected return is 1.01, or 1.05, or 1.10, or whatever on a particular game. Yes, it's mathematically possible for every patron to win every bet and bankrupt the casino, but that doesn't happen. Small variations even out over time, and the casinos end up with virtually exactly what the statistics said they would.
But that's exactly what I am talking about. Let's do the gambling analogy. The Martingale betting strategy succumbs to the same Gambler's Fallacy that you are speaking of now. Say you have a coin, heads and tails, no tricks. The odds are 50/50 that it will land on either side. What are the odds that it will hit tails five times in a row? Not very high; statistically, you know it is a 3.125% chance. Now you observe someone flipping a coin and the results are: Tails, Tails, Tails, Tails. What are the odds that it will hit tails again? Still 50/50. Each coin flip is independent of all previous coin flips, just as each share submitted is independent of all previous shares. They all have the same odds of being submitted and/or solving a block.
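The coin-flip claim is directly checkable by simulation. A sketch that conditions on four tails in a row and measures the fifth flip (run count chosen only to keep the noise small):

```python
import random

rng = random.Random(7)
qualifying = 0   # runs whose first four flips were all tails
fifth_tails = 0  # of those, runs whose fifth flip was also tails

for _ in range(500_000):
    flips = [rng.random() < 0.5 for _ in range(5)]  # True = tails
    if all(flips[:4]):
        qualifying += 1
        fifth_tails += flips[4]

# Five tails up front is rare (~3.1%), but GIVEN four tails, the fifth
# flip is still 50/50 -- the two statements are not in conflict.
print(fifth_tails / qualifying)
```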
|
|
|
|
iGotSpots
Legendary
Offline
Activity: 2548
Merit: 1054
CPU Web Mining 🕸️ on webmining.io
|
|
August 20, 2013, 06:24:41 PM |
|
If you couldn't trust averages, Vegas would still be a barren desert. The casinos know that, statistically speaking, their expected return is 1.01, or 1.05, or 1.10, or whatever on a particular game. Yes, it's mathematically possible for every patron to win every bet and bankrupt the casino, but that doesn't happen. Small variations even out over time, and the casinos end up with virtually exactly what the statistics said they would.
But that's exactly what I am talking about. Let's do the gambling analogy. The Martingale betting strategy succumbs to the same Gambler's Fallacy that you are speaking of now. Say you have a coin, heads and tails, no tricks. The odds are 50/50 that it will land on either side. What are the odds that it will hit tails five times in a row? Not very high; statistically, you know it is a 3.125% chance. Now you observe someone flipping a coin and the results are: Tails, Tails, Tails, Tails. What are the odds that it will hit tails again? Still 50/50. Each coin flip is independent of all previous coin flips, just as each share submitted is independent of all previous shares. They all have the same odds of being submitted and/or solving a block.
Pretty much what this guy said. It's not a cumulative probability; each is its own separate dice roll.
|
|
|
|
Liquidfire
Newbie
Offline
Activity: 28
Merit: 0
|
|
August 20, 2013, 06:29:34 PM |
|
If you couldn't trust averages, Vegas would still be a barren desert. The casinos know that, statistically speaking, their expected return is 1.01, or 1.05, or 1.10, or whatever on a particular game. Yes, it's mathematically possible for every patron to win every bet and bankrupt the casino, but that doesn't happen. Small variations even out over time, and the casinos end up with virtually exactly what the statistics said they would.
But that's exactly what I am talking about. Let's do the gambling analogy. The Martingale betting strategy succumbs to the same Gambler's Fallacy that you are speaking of now. Say you have a coin, heads and tails, no tricks. The odds are 50/50 that it will land on either side. What are the odds that it will hit tails five times in a row? Not very high; statistically, you know it is a 3.125% chance. Now you observe someone flipping a coin and the results are: Tails, Tails, Tails, Tails. What are the odds that it will hit tails again? Still 50/50. Each coin flip is independent of all previous coin flips, just as each share submitted is independent of all previous shares. They all have the same odds of being submitted and/or solving a block.
You are actually saying the same thing as me; you just don't realize it. Each individual hash has a particular chance of hitting, just like each individual coin flip has a particular chance (50%). But we aren't talking about one hash or one coin flip; we are talking about thousands, millions, billions. You are suggesting solving a share is equivalent to one hash, one coin flip if you will. Each share still typically requires thousands/millions of hashes to solve. That is the point of a proof-of-work algorithm. It says: "statistically speaking, if you solve a share of X difficulty, I know you did about Y hashes, and I will reward you based on that assumption".
|
|
|
|
Liquidfire
Newbie
Offline
Activity: 28
Merit: 0
|
|
August 20, 2013, 06:31:01 PM |
|
If you couldn't trust averages, Vegas would still be a barren desert. The casinos know that, statistically speaking, their expected return is 1.01, or 1.05, or 1.10, or whatever on a particular game. Yes, it's mathematically possible for every patron to win every bet and bankrupt the casino, but that doesn't happen. Small variations even out over time, and the casinos end up with virtually exactly what the statistics said they would.
But that's exactly what I am talking about. Let's do the gambling analogy. The Martingale betting strategy succumbs to the same Gambler's Fallacy that you are speaking of now. Say you have a coin, heads and tails, no tricks. The odds are 50/50 that it will land on either side. What are the odds that it will hit tails five times in a row? Not very high; statistically, you know it is a 3.125% chance. Now you observe someone flipping a coin and the results are: Tails, Tails, Tails, Tails. What are the odds that it will hit tails again? Still 50/50. Each coin flip is independent of all previous coin flips, just as each share submitted is independent of all previous shares. They all have the same odds of being submitted and/or solving a block.
Pretty much what this guy said. It's not a cumulative probability; each is its own separate dice roll.
Incorrect. Predicting the time it takes to solve a share/block is exactly that: a cumulative probability.
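Both sides are partly right here, and the geometric distribution reconciles them: each hash is memoryless, yet "how long until a share" has a well-defined cumulative probability. A sketch with an assumed per-hash probability:

```python
# Each hash independently succeeds with probability p, so the chance of
# solving a share within n hashes follows the geometric CDF:
#   P(solved within n hashes) = 1 - (1 - p)**n
p = 1 / 1000  # assumed per-hash chance of meeting the share target

def prob_solved_within(n_hashes, p_share):
    return 1 - (1 - p_share) ** n_hashes

print(prob_solved_within(693, p))   # ~0.50: the median is near 693 hashes
print(prob_solved_within(1000, p))  # ~0.63 at the mean (1/p) hashes
```

The per-hash odds never change (no gambler's fallacy), but the expected time to a share is still a meaningful average that the pool's reward accounting relies on.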
|
|
|
|
Damnsammit
|
|
August 20, 2013, 06:39:26 PM |
|
You are suggesting solving a share is equivalent to one hash, one coin flip if you will. Each share still typically requires thousands/millions of hashes to solve.
Yes. I believe solving a share is equivalent to one independent hash or nonce, if you will. Typically it will take thousands or millions of hashes to solve a share. Sometimes the first nonce tried might be lower than the target and thus a successfully submitted share. Regardless, each hash is independent of the other hashes, whether there is one or a trillion. Pool difficulty does not change that.
|
|
|
|
Liquidfire
Newbie
Offline
Activity: 28
Merit: 0
|
|
August 20, 2013, 06:45:54 PM |
|
You are suggesting solving a share is equivalent to one hash, one coin flip if you will. Each share still typically requires thousands/millions of hashes to solve.
Yes. I believe solving a share is equivalent to one independent hash or nonce, if you will. Typically it will take thousands or millions of hashes to solve a share. Sometimes the first nonce tried might be lower than the target and thus a successfully submitted share. Regardless, each hash is independent of the other hashes, whether there is one or a trillion. Pool difficulty does not change that.
You have a fundamental misunderstanding of statistics that you refuse to budge on, and we can't have a legitimate discussion about the diff rate with that in our way. I will try one more time.
A gambler sits down to flip 5 coins. Say the first 4 are heads. The GAMBLER'S FALLACY is that he thinks, since the first 4 were heads, he is "due" for a tails. This is obviously incorrect, as there is a 50/50 chance. THAT is the gambler's fallacy.
Statistics/probability is when you predict, before any coins are flipped, how many will be heads/tails. The probability of 5 consecutive heads is 3.125%. Probability examines the entire set. The gambler may have gotten 4 heads that time, and be in a position of having a 50% chance, at this point, of getting all 5. But if he tries the whole sequence again, do you think he will once again get 4 heads?
Yes. I believe solving a share is equivalent to one independent hash or nonce, if you will. Typically it will take thousands of hashes or millions to solve a share.
Complete contradiction. That "typically" you talked about? That's the average. That's all it is, man.
|
|
|
|
Damnsammit
|
|
August 20, 2013, 06:50:45 PM |
|
I'll concede to your superior statistical skills.
Now, can you please explain how lowering the difficulty will change the fact that each hash submitted will have the exact same chance of containing a nonce that is below the target for the block?
That's what I am not following. We can stop using the gambling analogy since my statistical skills are subpar.
|
|
|
|
|