Bitcoin Forum
July 20, 2019, 11:00:27 PM *
News: Latest Bitcoin Core release: 0.18.0 [Torrent] (New!)
 
Pages: « 1 ... 61 [62] 63 ... 538 »
Author Topic: [ANN] profit switching auto-exchanging pool - www.middlecoin.com  (Read 809784 times)
Liquidfire
Newbie
*
Offline Offline

Activity: 28
Merit: 0


View Profile
August 20, 2013, 05:40:23 PM
 #1221

Without seeing the secret sauce, the best way to tell when we switch coins is to just listen for your cards throttling down. I don't think it happens too often, maybe every 5-10 minutes sometimes, but I am just guessing... I know I have been in my office for hours and not heard it as well.



So then wouldn't that mean that as long as your average share time was under 5-10 minutes, the difficulty would not affect anything?

No. This "loss/waste of hashpower" happens both when we change coins AND when a block changes. In those 5-10 minutes, we might burn through 4-5 blocks, or we might get 1 block, or none, or whatever, depending on the coin.
h2odysee
Full Member
***
Offline Offline

Activity: 238
Merit: 100


View Profile WWW
August 20, 2013, 05:40:48 PM
 #1222

Having said that, when a switch happens, the pool should still accept shares for the old chain. 

What would happen here?

- miner finds a LTC block
- pool switches to some other chain
- miner submits LTC block to pool

Would that count as a rejected block?

Yeah, unfortunately that block will be rejected.

http://middlecoin.com - profit-switching, auto-exchanging scrypt pool that pays out in BTC
Eastwind
Hero Member
*****
Offline Offline

Activity: 896
Merit: 1000



View Profile
August 20, 2013, 05:43:36 PM
Last edit: August 20, 2013, 06:01:08 PM by Eastwind
 #1223

H2O, what is today's mining rate?
Damnsammit
Sr. Member
****
Offline Offline

Activity: 406
Merit: 250



View Profile
August 20, 2013, 05:49:01 PM
 #1224


No. This "loss/waste of hashpower" happens both when we change coins AND when a block changes. In those 5-10 minutes, we might burn through 4-5 blocks, or we might get 1 block, or none, or whatever, depending on the coin.

Okay, I think I see what you are saying, but if you average a share every minute on a coin with a 15-second average block time, and the pool mines that coin for 5 minutes, then the difficulty doesn't matter.

5 minute duration
512 difficulty - 1 minute avg - 5 shares
256 difficulty - 30 second avg - 10 shares (worth half of the 512 difficulty shares in terms of payout value)

Whether your average share time is greater than the average block time doesn't matter. Both start at zero and have a chance to hit with every second that ticks. Just because the clock ticked 59 times without a share does not mean the 60th tick has any higher chance to hit than the 1st.

Edit:  This sounds a lot like the Gambler's Fallacy. 
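The independence argument above can be sanity-checked with a quick simulation. This is a hypothetical sketch, not pool data: it models share-finding as an independent per-second trial with a one-in-60 chance, and compares the hit rate on a fresh first second against the hit rate on the 60th second after 59 misses.

```python
import random

random.seed(1)
p = 1 / 60            # chance of finding a share in any given second (assumed)
N = 500_000

# Hit rate on a "fresh" first second.
first = sum(random.random() < p for _ in range(N)) / N

# Hit rate on the 60th second, given 59 consecutive misses.
cond_hits = cond_runs = 0
for _ in range(N):
    misses = 0
    while misses < 59 and random.random() >= p:
        misses += 1
    if misses == 59:              # survived 59 seconds with no share
        cond_runs += 1
        cond_hits += random.random() < p
cond = cond_hits / cond_runs

# Both estimates land near p: waiting 59 seconds buys no "progress".
print(round(first, 4), round(cond, 4))
```

Both numbers come out close to 1/60, which is the memorylessness property being described.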
Liquidfire
Newbie
*
Offline Offline

Activity: 28
Merit: 0


View Profile
August 20, 2013, 05:49:07 PM
 #1225

The perfect analogy to this problem is cluster size for a file system.


Basically, it is a minimum amount of space that must be allocated for data. So if your cluster size is 10 and you want to write a file of size 1, you waste 9. If your cluster size is 10 and you write a file of size 15, you use two clusters, one filling up and one wasting 5.



Check out this explanation from Microsoft on cluster size:

All file systems that are used by Windows organize your hard disk based on cluster size (also known as allocation unit size). Cluster size represents the smallest amount of disk space that can be used to hold a file. When file sizes do not come out to an even multiple of the cluster size, additional space must be used to hold the file (up to the next multiple of the cluster size). On the typical hard disk partition, the average amount of space that is lost in this manner can be calculated by using the equation (cluster size)/2 * (number of files).

It's the same formula I ended up with when working out the block-change vs. hashrate question.

(cluster size)/2 * (number of files) = wasted space

(Average time for a miner to find a share) / (2 * Average block find time) = fraction of LOST profit/hashpower
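For illustration, the claimed formula can be evaluated with made-up numbers (note that its validity is disputed in the replies that follow):

```python
def wasted_fraction(avg_share_time_s, avg_block_time_s):
    """Liquidfire's claimed lost-hashpower fraction, mirroring
    (cluster size)/2 * (number of files). Disputed downthread."""
    return avg_share_time_s / (2 * avg_block_time_s)

# Assumed numbers: a share every 60 s on a coin with 30 s blocks comes out
# to 1.0 (100% "lost"), which already hints the model breaks down once
# share time exceeds block time.
print(wasted_fraction(60, 30))   # 1.0
print(wasted_fraction(15, 30))   # 0.25
```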

Cryptos2go
Member
**
Offline Offline

Activity: 81
Merit: 10


View Profile
August 20, 2013, 05:50:18 PM
 #1226

It's going to be difficult to get many shares, if any at all, when the block times get too fast, regardless of your difficulty settings.

I have seen blocks as fast as 3 seconds.

This is only part of the problem.

This screenshot was posted by Ixion in the DGC thread here: https://bitcointalk.org/index.php?topic=209508.3680



This is what can and does happen when fast, low-diff coins get overworked with way too much hashing power.

I think most would agree that in these cases less is more.

H20 has mentioned there would be increased profits IF he can find a way to more efficiently distribute the pool hash in these situations.

Splitting the pool 50/50 could be a solution, but even that would probably be difficult to implement effectively, as sometimes both pools could/should be on the same coin (LTC/FTC/NVC) but other times they would need to be on separate coins.

Then you have the possible issue of combining the profits from the 2 pools into 1 distributed payout as if it were 1 pool.

Not to mention a host of other potential issues that I don't even understand.

H20 is already actively involved with managing the coins and exchanges in the stable, which the pool requires in order to maintain his/our profitability, as the newer coins can often be among the most profitable for a time.

Obviously we all want the same thing here but there are probably limits to what H20 can accomplish without this becoming a full time job and even then there will still be limits for whatever reasons.
Liquidfire
Newbie
*
Offline Offline

Activity: 28
Merit: 0


View Profile
August 20, 2013, 05:51:33 PM
Last edit: August 20, 2013, 06:02:23 PM by Liquidfire
 #1227


No. This "loss/waste of hashpower" happens both when we change coins AND when a block changes. In those 5-10 minutes, we might burn through 4-5 blocks, or we might get 1 block, or none, or whatever, depending on the coin.

Okay, I think I see what you are saying, but if you average a share every minute on a coin with a 15-second average block time, and the pool mines that coin for 5 minutes, then the difficulty doesn't matter.

5 minute duration
512 difficulty - 1 minute avg - 5 shares
256 difficulty - 30 second avg - 10 shares (worth half of the 512 difficulty shares in terms of payout value)

Whether your average share time is greater than the average block time doesn't matter. Both start at zero and have a chance to hit with every second that ticks. Just because the clock ticked 59 times without a share does not mean the 60th tick has any higher chance to hit than the 1st.

No, when the block changes you lose all progress on that block that didn't get included in a share. That's where the waste comes in. You are halfway through a share when the block is found, and the "half" share that you have just gets thrown out. Say you cut the diff in half. Guess what? That half a share that got thrown out? That's ONE share now, and it gets submitted, and you get credit for it.


ranlo
Legendary
*
Offline Offline

Activity: 1638
Merit: 1002



View Profile
August 20, 2013, 06:01:31 PM
 #1228


No. This "loss/waste of hashpower" happens both when we change coins AND when a block changes. In those 5-10 minutes, we might burn through 4-5 blocks, or we might get 1 block, or none, or whatever, depending on the coin.

Okay, I think I see what you are saying, but if you average a share every minute on a coin with a 15-second average block time, and the pool mines that coin for 5 minutes, then the difficulty doesn't matter.

5 minute duration
512 difficulty - 1 minute avg - 5 shares
256 difficulty - 30 second avg - 10 shares (worth half of the 512 difficulty shares in terms of payout value)

Whether your average share time is greater than the average block time doesn't matter. Both start at zero and have a chance to hit with every second that ticks. Just because the clock ticked 59 times without a share does not mean the 60th tick has any higher chance to hit than the 1st.

No, when the block changes you lose all progress on that block that didn't get included in a share. That's where the waste comes in. You are halfway through a share when the block is found, and the "half" share that you have just gets thrown out. Say you cut the diff in half. Guess what? That half a share that got thrown out? That's ONE share now, and it gets submitted, and you get credit for it.

This. I don't see why people keep trying to say difficulty doesn't matter. If it didn't, why would there even be a difficulty to begin with? Why would there be VarDiff? If all of this were really irrelevant, there would be no need for any of it.
Liquidfire
Newbie
*
Offline Offline

Activity: 28
Merit: 0


View Profile
August 20, 2013, 06:03:18 PM
 #1229

Whether your average share time is greater than the average block time doesn't matter. Both start at zero and have a chance to hit with every second that ticks. Just because the clock ticked 59 times without a share does not mean the 60th tick has any higher chance to hit than the 1st.

If you couldn't trust averages, Vegas would still be a barren desert. The casinos know that, statistically speaking, their expected return is 1.01, or 1.05, or 1.10, or whatever on a particular game. Yes, it's mathematically possible that every patron placing every bet would win and bankrupt the casino, but that doesn't happen. Small variations even out over time, and they end up with virtually exactly what the statistics said they would.
TierNolan
Legendary
*
Offline Offline

Activity: 1232
Merit: 1002


View Profile
August 20, 2013, 06:09:22 PM
 #1230

Forgot the /2 part, because you'll be at a random point of progress on the share each time a block changes, averaging to half complete

This is the point of misunderstanding.  There is no progress.  Each hash is an independent random trial.

If you only do one hash per block, that hash has the same chance as a hash done by a miner with the strongest rig.  He will do many more hashes, so he will have higher odds overall, but each hash, done by anyone, is equal.

GPUs and (probably) ASICs have a "scan time".  They can't do one hash; they have to do lots of hashes all at once.  The scan time is how long an entire run takes.

The lost hashing power is something like

(network latency + scan-time/2) / (block time)
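TierNolan's estimate is easy to plug numbers into; the latency and scan-time figures below are assumptions for illustration only:

```python
def lost_fraction(latency_s, scan_time_s, block_time_s):
    # Work done between a block change and the miner restarting on the
    # new block is wasted; on average half a scan run is in flight.
    return (latency_s + scan_time_s / 2) / block_time_s

# Assumed values: 0.2 s network latency, 1 s GPU scan time, 30 s blocks.
print(f"{lost_fraction(0.2, 1.0, 30):.2%}")   # about 2.3% of hashpower lost
```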

1LxbG5cKXzTwZg9mjL3gaRE835uNQEteWF
Damnsammit
Sr. Member
****
Offline Offline

Activity: 406
Merit: 250



View Profile
August 20, 2013, 06:10:12 PM
 #1231

No, when the block changes you lose all progress on that block that didn't get included in a share. That's where the waste comes in. You are halfway through a share when the block is found, and the "half" share that you have just gets thrown out. Say you cut the diff in half. Guess what? That half a share that got thrown out? That's ONE share now, and it gets submitted, and you get credit for it.

But that's not the way it works.  You don't build a share like you build a house.  You don't pour the foundation in the first second and then start framing in the next.  By the 30th second you don't have an almost-finished house that just needs some brickwork and plumbing, only for a big block to come down, destroy it, and wash you back to square one.

From the first second, your GPU is computing thousands of hashes, trying to get one share to submit. Nothing is lost when the block changes... the time since the last block is just reset.  At that point you have exactly the same chance in the next second as in every other second, with difficulty (coin) constant.

If you cut the difficulty in half, then over the span of a few days, my hypothesis would be that people would see their number of shares double, each share worth half as much, and everyone's profits relatively unchanged.
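Under the independent-trial model, that hypothesis can be sketched in a quick simulation (the per-second share probabilities are assumed, not measured):

```python
import random

random.seed(7)
seconds = 1_000_000
p_512 = 1 / 60            # share chance per second at difficulty 512 (assumed)
p_256 = 2 / 60            # twice as likely at difficulty 256

shares_512 = sum(random.random() < p_512 for _ in range(seconds))
shares_256 = sum(random.random() < p_256 for _ in range(seconds))

payout_512 = shares_512 * 1.0    # relative value per share
payout_256 = shares_256 * 0.5    # each share worth half as much

# Roughly twice the shares, half the value each: payouts come out about equal.
print(shares_256 / shares_512, payout_256 / payout_512)
```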
Liquidfire
Newbie
*
Offline Offline

Activity: 28
Merit: 0


View Profile
August 20, 2013, 06:14:53 PM
 #1232

Forgot the /2 part, because you'll be at a random point of progress on the share each time a block changes, averaging to half complete

This is the point of misunderstanding.  There is no progress.  Each hash is an independent random trial.

If you only do one hash per block, that hash has the same chance as a hash done by a miner with the strongest rig.  He will do many more hashes, so he will have higher odds overall, but each hash, done by anyone, is equal.

GPUs and (probably) ASICs have a "scan time".  They can't do one hash; they have to do lots of hashes all at once.  The scan time is how long an entire run takes.

The lost hashing power is something like

(network latency + scan-time/2) / (block time)

You are correct, I spoke in an oversimplification, but the effect is the same. Here is the long (more correct) version, with the same result in the end.

Forgot the /2 part, because you'll randomly complete shares at a rate that statistically evens out to half the time less than the average and half the time more than the average.


Better?

Liquidfire
Newbie
*
Offline Offline

Activity: 28
Merit: 0


View Profile
August 20, 2013, 06:17:34 PM
 #1233

No, when the block changes you lose all progress on that block that didn't get included in a share. That's where the waste comes in. You are halfway through a share when the block is found, and the "half" share that you have just gets thrown out. Say you cut the diff in half. Guess what? That half a share that got thrown out? That's ONE share now, and it gets submitted, and you get credit for it.

But that's not the way it works.  You don't build a share like you build a house.  You don't pour the foundation in the first second and then start framing in the next.  By the 30th second you don't have an almost-finished house that just needs some brickwork and plumbing, only for a big block to come down, destroy it, and wash you back to square one.

From the first second, your GPU is computing thousands of hashes, trying to get one share to submit. Nothing is lost when the block changes... the time since the last block is just reset.  At that point you have exactly the same chance in the next second as in every other second, with difficulty (coin) constant.

If you cut the difficulty in half, then over the span of a few days, my hypothesis would be that people would see their number of shares double, each share worth half as much, and everyone's profits relatively unchanged.


As I said before, I acknowledge I am speaking in simplifications, but I am only doing so because the answer is statistically the same as if it did work that way.

If 50% of the time my partial brick gets washed away, versus 50% of the time I either lay a full brick or none at all, the results are statistically the same.
ranlo
Legendary
*
Offline Offline

Activity: 1638
Merit: 1002



View Profile
August 20, 2013, 06:19:45 PM
 #1234

Forgot the /2 part, because you'll be at a random point of progress on the share each time a block changes, averaging to half complete

This is the point of misunderstanding.  There is no progress.  Each hash is an independent random trial.

If you only do one hash per block, that hash has the same chance as a hash done by a miner with the strongest rig.  He will do many more hashes, so he will have higher odds overall, but each hash, done by anyone, is equal.

GPUs and (probably) ASICs have a "scan time".  They can't do one hash; they have to do lots of hashes all at once.  The scan time is how long an entire run takes.

The lost hashing power is something like

(network latency + scan-time/2) / (block time)

There technically IS progress.

If you flip a quarter 50 times, you can expect to get heads 25 times and tails 25 times. If you got heads 49 times, you can reasonably expect that over the next hundreds/thousands/millions of flips you will end up getting more tails than heads, because statistically they should be about equal.

The same thing applies here. If you had a lot of 10s blocks when the average should be 30s, you can reasonably expect an equal number of 50s blocks. Otherwise the AVERAGE is lower than the average; you can't possibly have a lower average than the average. That's like saying 2=6. It's just not mathematically possible.
Damnsammit
Sr. Member
****
Offline Offline

Activity: 406
Merit: 250



View Profile
August 20, 2013, 06:20:55 PM
 #1235


If you couldn't trust averages, Vegas would still be a barren desert. The casinos know that, statistically speaking, their expected return is 1.01, or 1.05, or 1.10, or whatever on a particular game. Yes, it's mathematically possible that every patron placing every bet would win and bankrupt the casino, but that doesn't happen. Small variations even out over time, and they end up with virtually exactly what the statistics said they would.

But that's exactly what I am talking about.  Let's do the gambling analogy.  The Martingale betting strategy succumbs to the same gambler's fallacy that you are describing now.

Let's say you have a coin, heads and tails, no tricks.  The odds are 50/50 that it will land on either side.  What are the odds that it will hit tails five times in a row?  Not very high.  Statistically, you know it is a 3.125% chance.

Now you observe someone flipping a coin and the results are as follows:

Tails
Tails
Tails
Tails

What are the odds that it will hit tails again? Still 50/50.

Each coin flip is independent of all previous coin flips, just as each share submitted is independent of all previous shares.

They all have the same odds of being submitted and/or solving a block.
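The two different probabilities in play here can be separated by exact enumeration instead of analogy:

```python
from itertools import product

# All 32 equally likely sequences of five fair coin flips.
seqs = list(product("HT", repeat=5))

five_tails = sum(s == ("T",) * 5 for s in seqs)
four_tails_start = sum(s[:4] == ("T",) * 4 for s in seqs)

# Up-front odds of five tails in a row: 1/32 = 3.125% (the figure above).
print(five_tails / len(seqs))          # 0.03125
# Odds of a fifth tails *given* four tails already happened: still 50/50.
print(five_tails / four_tails_start)   # 0.5
```

Both posters' numbers are right; they are answers to two different questions.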
iGotSpots
Legendary
*
Offline Offline

Activity: 2016
Merit: 1054



View Profile WWW
August 20, 2013, 06:24:41 PM
 #1236


If you couldn't trust averages, Vegas would still be a barren desert. The casinos know that, statistically speaking, their expected return is 1.01, or 1.05, or 1.10, or whatever on a particular game. Yes, it's mathematically possible that every patron placing every bet would win and bankrupt the casino, but that doesn't happen. Small variations even out over time, and they end up with virtually exactly what the statistics said they would.

But that's exactly what I am talking about.  Let's do the gambling analogy.  Martingale betting strategy succumbs to the same Gambler's fallacy that you are speaking of now.  

Let's say you have a coin, heads and tails, no tricks.  The odds are 50/50 that it will land on either side.  What are the odds that it will hit tails five times in a row?  Not very high.  Statistically, you know it is a 3.125% chance.  

Now you observe someone flipping a coin and the results are as follows:

Tails
Tails
Tails
Tails

What are the odds that it will hit tails again? Still 50/50

Each coin flip is independent of all previous coin flips, just as each share submitted is independent of all previous shares.

They all have the same odds of being submitted and/or solving a block.

Pretty much what this guy said. It's not a cumulative probability; each is its own separate dice roll.

Discord  |  Giveaways  |  Minds  |  Twitter  |  YouTube
Liquidfire
Newbie
*
Offline Offline

Activity: 28
Merit: 0


View Profile
August 20, 2013, 06:29:34 PM
 #1237


If you couldn't trust averages, Vegas would still be a barren desert. The casinos know that, statistically speaking, their expected return is 1.01, or 1.05, or 1.10, or whatever on a particular game. Yes, it's mathematically possible that every patron placing every bet would win and bankrupt the casino, but that doesn't happen. Small variations even out over time, and they end up with virtually exactly what the statistics said they would.

But that's exactly what I am talking about.  Let's do the gambling analogy.  Martingale betting strategy succumbs to the same Gambler's fallacy that you are speaking of now.  

Let's say you have a coin, heads and tails, no tricks.  The odds are 50/50 that it will land on either side.  What are the odds that it will hit tails five times in a row?  Not very high.  Statistically, you know it is a 3.125% chance.  

Now you observe someone flipping a coin and the results are as follows:

Tails
Tails
Tails
Tails

What are the odds that it will hit tails again? Still 50/50

Each coin flip is independent of all previous coin flips, just as each share submitted is independent of all previous shares.

They all have the same odds of being submitted and/or solving a block.


You are actually saying the same thing as me, you just don't realize it. Each individual hash has a particular chance of hitting, just like each individual coin flip has a particular chance (50%). We aren't talking about one hash or one coin flip. We are talking about thousands, millions, billions.

You are suggesting solving a share is equivalent to one hash, one coin flip if you will. But each share still typically requires thousands or millions of hashes to solve.

That is the point of a proof-of-work algorithm. It is saying "statistically speaking, if you solved a share of difficulty X, I know you did about Y hashes, and I will reward you based on that assumption."
Liquidfire
Newbie
*
Offline Offline

Activity: 28
Merit: 0


View Profile
August 20, 2013, 06:31:01 PM
 #1238


If you couldn't trust averages, Vegas would still be a barren desert. The casinos know that, statistically speaking, their expected return is 1.01, or 1.05, or 1.10, or whatever on a particular game. Yes, it's mathematically possible that every patron placing every bet would win and bankrupt the casino, but that doesn't happen. Small variations even out over time, and they end up with virtually exactly what the statistics said they would.

But that's exactly what I am talking about.  Let's do the gambling analogy.  Martingale betting strategy succumbs to the same Gambler's fallacy that you are speaking of now.  

Let's say you have a coin, heads and tails, no tricks.  The odds are 50/50 that it will land on either side.  What are the odds that it will hit tails five times in a row?  Not very high.  Statistically, you know it is a 3.125% chance.  

Now you observe someone flipping a coin and the results are as follows:

Tails
Tails
Tails
Tails

What are the odds that it will hit tails again? Still 50/50

Each coin flip is independent of all previous coin flips, just as each share submitted is independent of all previous shares.

They all have the same odds of being submitted and/or solving a block.

Pretty much what this guy said. It's not a cumulative probability, each is its own separate dice roll

Incorrect. Predicting the time it takes to solve a share/block is exactly that: a cumulative probability.
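A way to reconcile the two positions: each hash is independent, yet the waiting time until the next share still has a predictable (geometric) distribution. A small sketch with an assumed per-second share probability:

```python
p = 1 / 60    # assumed chance of a share in any given second

# Independence per trial...
p_next_second = p                      # ...never changes, whatever the history.

# ...yet the waiting time is still predictable in aggregate:
p_within_minute = 1 - (1 - p) ** 60    # chance of a share within 60 s (~63.5%)
expected_wait = 1 / p                  # 60 s on average

print(round(p_within_minute, 3), expected_wait)
```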
Damnsammit
Sr. Member
****
Offline Offline

Activity: 406
Merit: 250



View Profile
August 20, 2013, 06:39:26 PM
 #1239


You are suggesting solving a share is equivalent to one hash, one coin flip if you will. Each share still typically requires thousands/millions of hashes to solve.

Yes.  I believe solving a share is equivalent to one independent hash, or nonce, if you will. Typically it will take thousands or millions of hashes to solve a share.  Sometimes the first nonce submitted might be lower than the target and thus a successfully submitted share.

Regardless, each hash is independent of the other hashes, whether it is one or a trillion.

Pool difficulty does not change that.
Liquidfire
Newbie
*
Offline Offline

Activity: 28
Merit: 0


View Profile
August 20, 2013, 06:45:54 PM
 #1240


You are suggesting solving a share is equivalent to one hash, one coin flip if you will. Each share still typically requires thousands/millions of hashes to solve.

Yes.  I believe solving a share is equivalent to one independent hash, or nonce, if you will. Typically it will take thousands or millions of hashes to solve a share.  Sometimes the first nonce submitted might be lower than the target and thus a successfully submitted share.

Regardless, each hash is independent of the other hashes, whether it is one or a trillion.

Pool difficulty does not change that.

You have a fundamental misunderstanding of statistics that you refuse to budge on. We can't have a legitimate discussion about the difficulty rate with that in the way.

I will try one more time.

A gambler sits down to flip 5 coins. Say the first 4 are heads. The GAMBLER'S FALLACY is that he thinks, since the first 4 were heads, he is "due" for a tails. This is obviously incorrect, as there is still a 50/50 chance. THAT is the gambler's fallacy.

Statistics/probability is when you predict, before any coins are flipped, how many will be heads/tails. The probability of 5 consecutive heads is 3.125%. Probability examines the entire set. The gambler may have gotten 4 heads that time, and be in a position of having a 50% chance, at this point, of getting all 5. But if he tries the whole sequence again, do you think he will once again get 4 heads?

Quote
Yes.  I believe solving a share is equivalent to one independent hash or nonce, if you will.
Quote
Typically it will take thousands of hashes or millions to solve a share.

Complete contradiction. That "typically" you talked about? That's the average. That's all it is, man.