Let's say I have 3.0 ETH or LTC, for instance; the initial bet is 0.0001 with a 20% increase on each loss.
To work out exactly how many losses in a row it would take to lose the whole 3.0, I've written this:
double initialBetAmount = 0.0001;
double sumLoss = 0.0000;
int possibleBets = 0;

do
{
    initialBetAmount += (initialBetAmount / 100) * 20; // increase the bet by 20%
    sumLoss += initialBetAmount;                       // sum the losses
    possibleBets++;
} while (sumLoss < 3.0);

Console.WriteLine("Possible losses in a row: " + (possibleBets - 1));
Output:
Possible losses in a row: 46
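As a cross-check, the same count can be taken from the closed-form sum of the geometric series instead of a loop. Note that this version counts the very first bet at the base 0.0001 (the loop above bumps the bet by 20% before adding it, so that first bet never enters the sum), and the names baseBet/growth/bankroll are just illustrative:

// Cross-check via the geometric series: the first n losing bets cost
// baseBet * (growth^n - 1) / (growth - 1) in total.
double baseBet = 0.0001;   // first bet, before any increase
double growth = 1.2;       // 20% increase on every loss
double bankroll = 3.0;

// Largest n whose total cost still fits inside the bankroll.
int maxLosses = (int)Math.Floor(
    Math.Log(bankroll * (growth - 1) / baseBet + 1) / Math.Log(growth));

Console.WriteLine("Affordable losses in a row: " + maxLosses); // 47 with these numbers

Whether 46 or 47 is the "right" answer just depends on whether the initial 0.0001 bet itself counts as one of the losses.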
And based on that, here is the result I got from the ChatBot (odds command):
Odds of losing 46 bets in a row at 18.0000%: Once every 9216.43 bets.
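For what it's worth, that figure lines up with simply compounding the per-bet loss chance. Assuming the 18.0000% is the win chance of a single roll (so each roll loses with probability 0.82), a quick sketch:

// Assumes 18.0000% is the per-roll win chance, i.e. each roll loses 82% of the time.
double winChance = 0.18;
int lossesInARow = 46;

// Chance of losing that many independent rolls back to back,
// expressed in the bot's "once every X bets" form.
double streakChance = Math.Pow(1 - winChance, lossesInARow);
Console.WriteLine("Once every " + (1 / streakChance).ToString("F2") + " bets");
// prints roughly 9216, in line with the bot's 9216.43

So the ChatBot's number appears to be nothing more exotic than 1 / 0.82^46.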
So, for those who are familiar with Stake or dice games in general, are my calculations above anywhere near accurate? They look a bit unrealistic to me.