MakerAZ (OP)
Jr. Member
Offline
Activity: 39
Merit: 5
|
|
July 28, 2024, 08:48:24 PM |
|
Hello!
I am solo mining with a Futurebit Apollo II Full Node Personal BTC Miner. It mines at roughly 5 TH/s. I run it mostly in ECO mode, though it can reach 10 TH/s in TURBO mode. According to the response I got from Futurebit's support team, the miner software is built in-house, so I don't know what algorithm it follows.
I wonder if Random extraNonce approach could add an Edge to Solo mining.
So, let's quickly go over the numbers:
Total Combinations: With a 32-bit nonce and a 32-bit extraNonce (I know it is not limited to 32 bits), the total number of combinations is 2^32 × 2^32 = 2^64 = 18,446,744,073,709,551,616 combinations.
Hashing Speed and Time: At 5 TH/s (5 trillion hashes per second), I can compute 5 × 10^12 hashes per second.
Total Time to Exhaust Entire Range: To exhaust the entire combined range:
Number of combinations: 18,446,744,073,709,551,616. Hashing speed: 5 × 10^12 hashes per second. Time required: 18,446,744,073,709,551,616 / (5 × 10^12) ≈ 3,689,349 seconds, which is about 61,489 minutes, or roughly 42.7 days. (As a sidenote: I wish I could test this. I am ready to spend all 42.7 days waiting for a valid block to be found, but I don't know how to put this into practice with the Apollo.)
Checking within 10 Minutes: In 10 minutes, I can compute: 5 × 10^12 hashes / second × 60 seconds / minute × 10 minutes = 3,000,000,000,000,000 hashes, which is just a fraction of the total combinations.
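The numbers above can be sketched in a few lines of Python (a back-of-the-envelope check, assuming the 32-bit extraNonce and 5 TH/s figures from this post):

```python
# Search-space arithmetic for a hypothetical 5 TH/s solo miner.
nonce_space = 2 ** 32          # 32-bit header nonce
extranonce_space = 2 ** 32     # 32-bit extraNonce assumed for this example
combinations = nonce_space * extranonce_space   # 2^64

hashrate = 5 * 10 ** 12        # 5 TH/s

seconds_to_exhaust = combinations / hashrate
print(f"combinations: {combinations:,}")
print(f"days to exhaust the range: {seconds_to_exhaust / 86400:.1f}")  # ~42.7

hashes_in_10_min = hashrate * 600
print(f"hashes in 10 minutes: {hashes_in_10_min:,}")  # 3,000,000,000,000,000
```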
Strategy for Random extraNonce Given these constraints, would favoring a random extraNonce be advantageous?
Randomizing extraNonce: Since hashing the entire space is impractical, could randomizing the extraNonce help explore a wider variety of block headers in a shorter amount of time?
Exploring a Larger Search Space: By randomly changing the extraNonce, do I increase the number of distinct block headers being tested? Can this help cover a broader search space in a given timeframe?
Can this approach really increase the likelihood of encountering a valid block header compared to a purely incremental extraNonce?
On the other hand, the block that I solo mine is unique, since it has my bitcoin address in the coinbase transaction, and it is mined by me and me ALONE. So I am actually not racing against other miners over my unique block; every other solo miner has their own unique block to mine. And with the uniform distribution of SHA-256 results, I am not sure whether a random approach to the extraNonce can increase my chances at all...
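That last point can be checked with a toy simulation (a sketch only: a uniform random draw stands in for SHA-256, and the target/space sizes are made up). Against a uniformly distributed hash, a sequential nonce order and a randomly chosen one win at the same rate:

```python
import random

random.seed(1)
TARGET = 2 ** 20     # toy target: a "hash" below this counts as a win
SPACE = 2 ** 32      # toy hash output space
TRIES = 200_000

def toy_hash(_nonce):
    # stand-in for SHA-256: each nonce maps to an independent uniform value
    return random.randrange(SPACE)

# sequential nonce order vs. randomly chosen nonces
seq_hits = sum(toy_hash(n) < TARGET for n in range(TRIES))
rand_hits = sum(toy_hash(random.randrange(SPACE)) < TARGET for _ in range(TRIES))

expected = TRIES * TARGET / SPACE  # ~49 expected wins for either strategy
print(seq_hits, rand_hits, expected)
```

Both counts cluster around the same expectation, which is what the uniform-distribution argument predicts.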
|
|
|
|
NotFuzzyWarm
Legendary
Online
Activity: 3864
Merit: 2757
Evil beware: We have waffles!
|
|
July 28, 2024, 10:01:57 PM |
|
Various ways to 'increase your odds of finding a block' have been thought of and tried numerous times over the past decades and NONE of them change the fact that Random is Random meaning that they do not work at all. The only way to increase your odds is to run more hash rate.
|
|
|
|
MakerAZ (OP)
Jr. Member
Offline
Activity: 39
Merit: 5
|
|
August 09, 2024, 05:55:17 AM |
|
Various ways to 'increase your odds of finding a block' have been thought of and tried numerous times over the past decades and NONE of them change the fact that Random is Random meaning that they do not work at all. The only way to increase your odds is to run more hash rate.
I would disagree. I am pretty sure that a random extraNonce approach can give an edge to solo miners. I will elaborate on this soon.
|
|
|
|
MakerAZ (OP)
Jr. Member
Offline
Activity: 39
Merit: 5
|
|
August 09, 2024, 09:32:43 PM |
|
The Random extraNonce approach actually guarantees that a Solo Miner will check some extraNonces before they are reached by pool miners. Unlike pools that follow a structured, incremental approach, a solo miner using a random approach explores different parts of the hash space, making it possible to find a valid block before the pools have checked those specific extraNonces. This provides a strategic edge in solo mining.
|
|
|
|
NotFuzzyWarm
Legendary
Online
Activity: 3864
Merit: 2757
Evil beware: We have waffles!
|
|
August 09, 2024, 10:13:37 PM Last edit: August 10, 2024, 12:37:32 AM by NotFuzzyWarm |
|
You may want to read about how BTC mining actually works instead of making guesses based on several false assumptions before falling down that rabbit hole... For a start, see Kano's help files: https://kano.is/index.php?k=mining and https://kano.is/index.php?k=minedet
You should also read Satoshi's whitepaper on the theory behind Bitcoin mining.
BTW: Kano is the sole remaining active primary of cgminer, which is what 99% of the miners in the world have been running since 2011. For what it's worth, the core software that the Apollos run is mining software from BitFury, as they use BitFury chips. BF basically copied the functions of cgminer while writing their own version of it, so the BF code is performing the exact same work functions that cgminer does.
Oh, and when it comes to producing work, pools and solo miners are doing the exact same thing. The only difference is that a pool has accounting procedures to track your work so you get paid based on your percentage of the effort expended by all miners. All miners receive unique work just as if they were connected to their own private node (solo).
|
|
|
|
Nexus9090
Member
Offline
Activity: 285
Merit: 95
So many numbers and so little time
|
|
August 09, 2024, 11:50:31 PM |
|
|
|
|
|
MakerAZ (OP)
Jr. Member
Offline
Activity: 39
Merit: 5
|
|
August 10, 2024, 05:55:01 AM |
|
You may want to read about how BTC mining actually works instead of making guesses based on several false assumptions before falling down that rabbit hole... For a start, see Kano's help files: https://kano.is/index.php?k=mining and https://kano.is/index.php?k=minedet
You should also read Satoshi's whitepaper on the theory behind Bitcoin mining.
BTW: Kano is the sole remaining active primary of cgminer, which is what 99% of the miners in the world have been running since 2011. For what it's worth, the core software that the Apollos run is mining software from BitFury, as they use BitFury chips. BF basically copied the functions of cgminer while writing their own version of it, so the BF code is performing the exact same work functions that cgminer does.
Oh, and when it comes to producing work, pools and solo miners are doing the exact same thing. The only difference is that a pool has accounting procedures to track your work so you get paid based on your percentage of the effort expended by all miners. All miners receive unique work just as if they were connected to their own private node (solo).
Thank you very much for all the references. It is indeed a rabbit hole; I will read through and come back here.
|
|
|
|
Nexus9090
Member
Offline
Activity: 285
Merit: 95
So many numbers and so little time
|
|
August 10, 2024, 09:54:26 AM Last edit: August 10, 2024, 10:28:20 AM by Nexus9090 |
|
I asked the same question a while back, it took a lot of reading and to be honest I'm still not sure I fully understand it.
AFAIK nonce+extranonce is 2^96 (79,228,162,514,264,337,593,543,950,336)
nonce being 4 bytes 2^32
extranonce being 8 bytes 2^64
and that accounts for the total adjustment range a miner can make to a header without tweaking the timestamp on each hash (which is still an option and is still done by some miners).
Theoretically the extranonce can be bigger, since the coinbase transaction has more space in it, but it's suggested that this is outside the protocol.
The mechanism I still don't have a grip on is how the nonce values are issued by a pool or bitcoind, and how a pool then keeps track of nonce values to prevent overlap between different miners. There's a good chance it doesn't, which would mean miners overlap work.
Like you suggest, if you start at zero each time you'll never see the whole picture and have a much lower chance of ever seeing a result, which kind of explains why high-hash-rate miners tend to hit blocks more often: they cover a bigger range in a shorter space of time. However, zero is supposed to be a random starting point in the range of 2^96; I'm not sure it is.
I'm assuming pools and bitcoind issue the nonce + extranonce unless work is generated locally; that's the part of the mystery I've not uncovered yet.
I expect work would probably be generated locally to stop your miner from hammering a pool with too many requests, so your miner only fetches new work from a pool once it has found a hash with a larger difficulty than the pool's difficulty setting. I'm not sure how that mechanism works when mining to bitcoind, or whether the locally generated work is contiguous from the last nonce value hashed or starts from a new random base.
It is a deep and complicated rabbit hole for sure.
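The ranges quoted above can be verified quickly (a sketch assuming the stratum-style 4-byte header nonce plus an 8-byte extranonce; the actual extranonce size varies by pool):

```python
# Combined search range: 4-byte header nonce + assumed 8-byte extranonce.
nonce_space = 2 ** 32        # 4 bytes
extranonce_space = 2 ** 64   # 8 bytes (pool-dependent; 8 assumed here)

combined = nonce_space * extranonce_space
print(combined)  # 79228162514264337593543950336, i.e. 2^96
```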
|
|
|
|
mikeywith
Legendary
Offline
Activity: 2464
Merit: 6678
be constructive or S.T.F.U
|
|
August 29, 2024, 01:31:32 PM |
|
There is no overlapping: the pool provides the block template and each miner uses a different nonce, so every miner is randomly hashing its own block, unless the miner software is bad and for some reason one miner duplicates the work of another. You need to understand that mining is not about solving a formula. It is not incremental; it's not that you try 1, 2, 3, 4 and then 5 is a block. The block could be nonce 1, so thinking you can jump here and there to hit a block faster is wrong. Give my post a read.
|
|
|
|
philipma1957
Legendary
Offline
Activity: 4354
Merit: 9170
'The right to privacy matters'
|
|
August 30, 2024, 03:06:58 AM |
|
There is no overlapping: the pool provides the block template and each miner uses a different nonce, so every miner is randomly hashing its own block, unless the miner software is bad and for some reason one miner duplicates the work of another. You need to understand that mining is not about solving a formula. It is not incremental; it's not that you try 1, 2, 3, 4 and then 5 is a block. The block could be nonce 1, so thinking you can jump here and there to hit a block faster is wrong. Give my post a read.
The part I don't understand is: if the diff is 90,000,000,000,000 and I need a value over 90T, how does the software decide to pick a share? I have seen really high winning shares, say 2,000,000,000,000,000, which is 2000T. So if an ASIC can go way past 90T, what's the top end of shares? What limits the high share number? And if the high share number is limited at, say, 20,000T, which is way over the current 90T winning value, what makes it hard to go high? There would seem to be way more numbers over 90T that win than there are losing numbers under 90T. Heck, 19,910T would be high enough and only 90T-1 would be too low. There are ways to make it harder to go over 90T, but this is what always tied me up. I have seen a winning share 10x the diff, which would seem to mean 10x more possible winners than losers.
|
|
|
|
mikeywith
Legendary
Offline
Activity: 2464
Merit: 6678
be constructive or S.T.F.U
|
|
August 30, 2024, 05:35:58 AM Last edit: August 30, 2024, 11:09:54 AM by mikeywith Merited by philipma1957 (3), BitMaxz (1) |
|
The maximum difficulty would be about 2^224, which is unrealistically huge; the current 90T diff is a tiny fraction of that. I think it will make sense once you understand what a target is. The target is what the logic is built on: we store the target in the block header, not the difficulty. The difficulty is just an easier representation of the target, so mining becomes more difficult when the target is smaller and vice versa. As silly as it sounds, when you see difficulty go up, it simply means the target went down.
So an easy target would be infinity, which means anything you guess below infinity would be valid; a difficult target would be 0, which means there is nothing smaller. The lower the target goes, the smaller the number of valid hashes.
The current target is 00000000000000000003255b0000000000000000000000000000000000000000, which in decimal (a more human-readable format) is 301319254070149585548971905645948786445483268317904896.
The target in the previous epoch was 000000000000000000033d760000000000000000000000000000000000000000, in decimal 310338180674118587457206844748640968959780028040609792.
Notice that the target in the previous epoch was larger (difficulty was lower).
To make things a lot clearer, here is the target of the first block Satoshi mined: 00000000ffff0000000000000000000000000000000000000000000000000000, in decimal 26959535291011309493156476344723991336010898738574164086137773096960.
Now, since difficulty = maximum target / given target, applying that to the first block gives a difficulty of 1 (the target was huge, so the difficulty was low; you could basically guess anything and get a block). Apply it to the current target and you get 89T.
Now, if you read the post I quoted in my previous reply, you will understand why it's easier to find a number with fewer leading zeros than more, i.e., it's easier to find larger numbers than smaller ones. Think of it this way: you have a box with an infinite number of pieces of paper ranging from 1 to infinity. If I tell you that you win the prize by picking any piece of paper smaller than 1000000000000000000000000000000000, that would certainly be easier than if I told you the number needs to be smaller than 10, because then there would be only 9 winning chances out of the infinite numbers in the box.
I know it's easier to understand difficulty than to understand the target, but really, if you don't understand the target there will always be missing pieces in your comprehension of how this works. Now, when you say a high share, it's actually a small-target share, which then translates to a high-diff share. So if you are required to find a number smaller than or equal to 1000 and you find 1, that's a very small target, which is a very high diff. When you see blocks found with a high share, the miner technically found a hash a lot smaller than needed to win the block. The smallest the target can be is a 256-bit number just above 0, so when you ask how high a share can be: it can be as high as 2^224.
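The arithmetic above can be reproduced directly from the hex targets quoted in this post (a sketch; the two non-genesis targets are simply the values given above):

```python
# difficulty = maximum target / given target, using the targets from this post
MAX_TARGET = 0x00000000ffff0000000000000000000000000000000000000000000000000000

current = 0x00000000000000000003255b0000000000000000000000000000000000000000
previous = 0x000000000000000000033d760000000000000000000000000000000000000000
genesis = MAX_TARGET  # the first block's target equals the maximum target

assert previous > current               # larger target = lower difficulty

print(MAX_TARGET / genesis)             # 1.0 -> genesis-era difficulty
print(f"{MAX_TARGET / current:.3e}")    # ~8.95e13, the ~89T mentioned above
```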
|
|
|
|
philipma1957
Legendary
Offline
Activity: 4354
Merit: 9170
'The right to privacy matters'
|
|
September 02, 2024, 03:42:44 AM |
|
The maximum difficulty would be about 2^224, that is unrealistically huge, and the current 90T diff is a tiny fraction of that. I think it would make sense when you understand what a target is, target is what the logic is built on, we store the target in the block header and not the difficulty, the difficulty is just an easier representation of the target, so mining becomes more difficult when the target is smaller and vice versa, so as silly as it sounds, when you see difficulty goes up, it simply means target went down. So an easy target would be infinity, which means anything that you guess below infinity would be valid, a difficult target would be 0, which means there is nothing smaller, the lower the target goes the smaller the number of valid hashes. The current target is 00000000000000000003255b0000000000000000000000000000000000000000 in decimal (more human-readable format) that is 301319254070149585548971905645948786445483268317904896 The target in the previous epoch was 000000000000000000033d760000000000000000000000000000000000000000 in decimal 310338180674118587457206844748640968959780028040609792 Notice that the target in the previous epoch was larger (difficulty was lower). 
to make things a lot clearer, here is the target of the first block Satoshi mined 00000000ffff0000000000000000000000000000000000000000000000000000 in decimal 26959535291011309493156476344723991336010898738574164086137773096960 now since Difficulty= maximum target/given target, if you apply that on the first block it would give you difficulty of 1 (target was too huge, difficulty was too low, you basically guess anything and get a block ), apply it for the current target and you get 89T Now if you read the post I quoted in my previous reply, you would understand how it's easier to hash a number that has less leading zero than more, i.e, it's easier to find larger numbers than smaller ones, think of it this way, you have a box that has an infinite number of pieces of paper ranging from 1 to infinity, I tell you if you manage to pick any piece of paper that is smaller than 1000000000000000000000000000000000, I 'l give you the prize, that would certainly be easier than if I told you the number you need is smaller than 10 because then there would be only 9 chances out of the infinite numbers in the box. I know it's easier to understand difficulty than to understand the target, but really, if you don't understand the target, there will always be missing pieces in your comprehension of how this works, now when you say high share, it's actually a small target share which then translate to a high diff share. So if you are required to find a number smaller or equal to 1000 and you find 1, then that's a very small target which is a very high diff, so when you see blocks getting a high share they technically found a hash a lot smaller than they had to win a block, so the smallest the target can be would be a 256-bit number, which is just right above 0, so when you ask how high share can be, it can be as high as 2^224. okay makes sense. thank you
|
|
|
|
MakerAZ (OP)
Jr. Member
Offline
Activity: 39
Merit: 5
|
|
September 26, 2024, 03:10:03 PM |
|
Thank you to everyone for your interest, time and explanations shared here.
My hashing speed is 10 TH/s = 10^13 hashes/second, which is 6 × 10^14 hashes per minute and 6 × 10^15 hashes per 10 minutes.
The network hashing speed is 600 EH/s = 6 × 10^20 hashes/second, which is 3.6 × 10^22 hashes per minute and 3.6 × 10^23 hashes per 10 minutes.
Initially my understanding was that I am racing against 600 EH/s with my 10 TH/s following them and stepping on the same path. However, it is not the case. There are thousands of unique block headers that are being mined within 10 minutes and I am mining my own unique block header, so I am NOT checking the same hashes that have already failed.
At the current target (roughly 78 leading zero bits), there is statistically only 1 valid block per ~3.6 × 10^23 hashes. Of course, the more hashes I can check, the bigger the chance of finding that 1 block. However, if that 1 valid block happens to fall within my mining range, I will find it within 10 minutes with my 10 TH/s.
Random extraNonce does not add an edge.
Mining with anything less than 1% of the total hash power is a pure lottery.
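The "pure lottery" conclusion can be put in numbers (a rough sketch; the ~90T difficulty is an assumed approximate value, and difficulty × 2^32 is the standard estimate of expected hashes per block):

```python
# Expected solo-mining wait for a small miner at current network difficulty.
difficulty = 9.0e13          # assumed approximate network difficulty
my_hashrate = 10 * 10 ** 12  # 10 TH/s (TURBO mode)

expected_hashes = difficulty * 2 ** 32          # expected hashes per block
expected_seconds = expected_hashes / my_hashrate
expected_years = expected_seconds / (86400 * 365)
print(f"expected solo wait: ~{expected_years:.0f} years")  # on the order of a millennium
```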
|
|
|
|
NotFuzzyWarm
Legendary
Online
Activity: 3864
Merit: 2757
Evil beware: We have waffles!
|
|
September 26, 2024, 10:44:47 PM |
|
By Jove, the OP has got it!. Merit given for seeing the light.
|
|
|
|
mikeywith
Legendary
Offline
Activity: 2464
Merit: 6678
be constructive or S.T.F.U
|
|
September 27, 2024, 10:44:32 PM |
|
Mining with anything less than 1% of the total hash power is a pure lottery.
Mining is all about luck; even with 100% of the hashrate you are still subject to luck, which is why variance exists at the network level, not just for individuals. The only difference is variance. It's very normal to have a block with 200% luck, meaning you need to do double the work or wait double the time to hit a block. So if your 100%-luck estimate says 1 block a day, it would be very normal and okay to wait 2 days to hit a block; however, if it says 5 years, then it's not okay to wait 10 years, let alone get hit with 500% luck and need to wait a few decades. Realistically, if we assume we could all mine to infinity, then even your 10 TH would hit a block at some point, but because you can't mine to infinity, you are more likely to be struck by lightning than to live to see your 10 TH hit a block.
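The "200% luck" point follows from block finding being a memoryless process, so waiting times are exponentially distributed (a sketch of that standard model, not anything specific to one miner):

```python
import math

# For an exponential (memoryless) waiting time,
# P(wait > k * expected time) = exp(-k).
for k in (1, 2, 5):
    print(f"P(wait > {k}x expected) = {math.exp(-k):.1%}")
# waiting at least twice the expected time happens ~13.5% of the time,
# so 200% luck on a given block is entirely ordinary
```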
|
|
|
|
philipma1957
Legendary
Offline
Activity: 4354
Merit: 9170
'The right to privacy matters'
|
|
September 28, 2024, 12:32:49 AM |
|
Mining with anything less than 1% of the total hash power is a pure lottery.
Mining is all about luck; even with 100% of the hashrate you are still subject to luck, which is why variance exists at the network level, not just for individuals. The only difference is variance. It's very normal to have a block with 200% luck, meaning you need to do double the work or wait double the time to hit a block. So if your 100%-luck estimate says 1 block a day, it would be very normal and okay to wait 2 days to hit a block; however, if it says 5 years, then it's not okay to wait 10 years, let alone get hit with 500% luck and need to wait a few decades. Realistically, if we assume we could all mine to infinity, then even your 10 TH would hit a block at some point, but because you can't mine to infinity, you are more likely to be struck by lightning than to live to see your 10 TH hit a block.
Over the years I have seen 1200%. I have also seen back-to-back 1000%. So if you do 1 block a day, you could go back to back and have 2 blocks instead of 20, or 375,000 instead of 3,750,000. Even if your hash was all S21 XPs at 13 watts a TH, you would likely be facing a 2,000,000 power bill with 375,000 income. Rounded the numbers, but close enough.
|
|
|
|
mikeywith
Legendary
Offline
Activity: 2464
Merit: 6678
be constructive or S.T.F.U
|
|
September 29, 2024, 09:09:16 AM Last edit: September 29, 2024, 04:33:49 PM by mikeywith Merited by philipma1957 (4) |
|
over the years I have seen 1200%
I also saw back to back 1000%.
So if you do 1 block a day you would do back to back and have 2 blocks vs 20.
or 375,000 instead of 3,750,000
even if your hash was all s21 xp at 13 watts a th you would likely be facing a 2,000,000 power bill with 375,000 income.
rounded the numbers but close enough.
Yes, huge variances like that do happen; it doesn't matter how big you are, you are always subject to variance. And while things would even out with time, the problem is what you have described (you may not be able to survive until that happens), which is why almost everyone uses a pool. It's also the reason why PPS pools charge a heavy fee: they could be paying millions out of their own pocket before they can claim the rewards from the blockchain, so in order not to go bankrupt they must charge a good fee and keep a lot of reserves. Every pool that tried to operate in PPS mode with low fees has failed and eventually upped their fees.
|
|
|
|
|