Bitcoin Forum
3081  Economy / Scam Accusations / Re: DADICE : exposing investors to more risk than their kelly, misleading informatio on: July 08, 2015, 07:24:57 PM
I invested a certain amount of btc in DADICE at a kelly of 10, as far as I know this means I can only lose 10% of my profit in 1 bet.
[...]
Then someone made 1 bet : #407989150
and my investment went immediately down by 29%

The 'effective bankroll' (ie. actual investments times Kelly multiplier) was over 1000 BTC at the time. That means the maximum profit per bet should have been over 10 BTC. So when someone wins 20 BTC because the site is risking too much, that's still no more than double the correct amount to be risking.

The most you should have been able to lose is 10% of your investment per roll.

Even taking this fixed 20 BTC maximum profit into account, the most you should have been able to lose is ~20% of your investment per roll.

So how on earth did you end up losing 29%?
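
A quick sketch of that arithmetic, using the approximate figures quoted above:

```python
# Figures quoted above (approximate, for illustration only)
effective_bankroll = 1000.0    # BTC: sum of investments times Kelly multipliers
max_profit_fraction = 0.01     # site rule: max profit per bet is 1% of effective bankroll
kelly = 0.10                   # "Kelly of 10": risk 10% of your investment per bet

max_profit = effective_bankroll * max_profit_fraction   # ~10 BTC
actual_win = 20.0                                       # BTC won on the bet in question

# If the site risked double the cap, an investor's loss should scale accordingly
implied_loss = kelly * actual_win / max_profit
print(implied_loss)   # 0.2, i.e. ~20% - still well short of the reported 29%
```

Even granting the over-sized 20 BTC win, the numbers only account for a ~20% loss, not 29%.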

It seems there's more than one bug here.

It will be interesting to hear DaDice staff try to explain their way out of this major fuckup.
3082  Economy / Gambling / Re: DaDice.com - Next Gen Social Gambling Dice Experience | Progressive Jackpot on: July 08, 2015, 06:48:44 PM
Can someone explain how this guy was able to win around 20 BTC when the effective bankroll is only around 1000 BTC after taking Kelly multipliers into account?



The maximum profit per roll is meant to be 1% of the effective bankroll, which would be around 10 BTC.

Will the investors be expected to pay out of their own pockets for this error, if that is what it is?
3083  Bitcoin / Mycelium / Re: Mycelium Bitcoin Wallet on: July 08, 2015, 05:54:08 PM
This message was just posted by Mycelium on Twitter:-

"All the spam flood clogged our Tor server, so we're having to defrag the 160gig database it's on. Will take a while. Sorry"

I went into settings and stopped using Tor and was successfully able to make transactions. My past transactions that did not appear on the blockchain have disappeared, and my balance is correct.

Twitter really does seem to be a very good place to get CS as no company wishes to give bad publicity in such a Public Forum.

I just tried finding that message. I checked https://twitter.com/myceliumcom but it wasn't there. Mostly that was retweets of stuff unrelated to Mycelium. Eventually I found it here: https://twitter.com/MyceliumCom/with_replies - mixed in with a lot of out-of-context replies to questions that I can't see without clicking - and the retweets are still there too. I have to scroll through pages of photos of people's feet and the like to find important announcements? Seems sub-optimal. There must be some better way of communicating important information that directly affects users of the wallet. Is the service back up now? Is it safe to use? If not, is there an ETA for when it will be? Those are the questions I'd want to see addressed, maybe on a dedicated "service status" page so I don't have to get distracted by pictures of feet or Julia in short shorts.

Try looking up problems and help here: https://github.com/mycelium-com/wallet/issues  

While github is a good resource, I don't think its issue tracker is the place for announcements about back-end server issues. The issue tracker is for tracking issues in the 'open-source' components of their wallet, which I don't think includes the back-end server databases.

I need to switch to another wallet asap...

Let us know if you find anything good for Android. I had high hopes for airbitz but it wanted access to so much stuff on my tablet that I ended up not installing it.
3084  Alternate cryptocurrencies / Announcements (Altcoins) / Re: [ANN][CLAM] CLAMs, Proof-Of-Chain, Proof-Of-Working-Stake on: July 08, 2015, 05:33:15 PM
The main concept behind the DigiShield and Kimoto Gravity Well was resistance to multi-pool hash "attacks".

If I understand Kimoto Gravity Well correctly, it worked by applying a curve such that more recent blocks have a higher weight in the equation that figures the next difficulty target.  This is of course similar to simple one block re-targets which were also a popular "solution".

DigiShield had implementation and exploitability issues. I know because I was closely involved in the project when that feature was launched. It functioned by making the difficulty adjustment non-symmetric. In testing, this seemed to show a reduction in the type of "wave" oscillation you see as adjustments overshoot. In practice, it didn't work all that well.

xploited might be able to offer some insight into DigiShield's problems, as he was even more closely involved in its inception.

I didn't realise you guys were at all involved with DigiShield, or I probably wouldn't have been so flippant about it. Which coin were you involved with which first added DigiShield? Was it a DOGE thing? I really don't pay much attention to other altcoins.

While we don't have to worry about multi-pools we do have to be able to cope with sudden changes to the total network stake weight. A big staker could go offline temporarily leaving the difficulty too high for a while, or someone could dig up a bunch of distribution CLAMs and start staking them, leaving the difficulty too low for a while. The effect is similar to a big multi-pool visiting.

With the first (single block) difficulty retargetting, what would happen is that due to variance we would occasionally see a 20 second gap between blocks, and the retargetting algorithm would overreact, setting the difficulty far too high, and it would take quite a while for things to recover. That would leave us with >2 minute blocks for a while, when all that had happened was that one staker got lucky once.

Are there any plans for a hard fork any time soon? If so, it would be worth experimenting with the various adjustment algorithms and seeing what we can do to improve things. I imagine running a simulation, plotting the difficulty over time with various different retarget algorithms, etc. If not, I don't think it's a big enough problem to warrant a hard fork on its own.
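
As a rough illustration of the kind of simulation I mean - a toy model with constant total stake weight, made-up parameters, and a clamp on the per-block adjustment like most real retarget code uses:

```python
import random
import statistics

TARGET = 60.0   # target block interval, seconds
BLOCKS = 2000   # blocks to simulate per run

def clamp(x, lo=0.25, hi=4.0):
    # Limit how far the difficulty can move in a single step
    return min(max(x, lo), hi)

def simulate(window):
    # window=1 is the old single-block retarget; larger windows smooth the response
    random.seed(1)
    diff = TARGET
    intervals = []
    for _ in range(BLOCKS):
        # Time to stake the next block: exponential, mean proportional to difficulty
        t = random.expovariate(1.0 / diff)
        intervals.append(t)
        recent = intervals[-window:]
        diff *= clamp(TARGET * len(recent) / sum(recent))
    return statistics.mean(intervals), statistics.stdev(intervals)

for window in (1, 10, 100):
    mean, sd = simulate(window)
    print(window, round(mean, 1), round(sd, 1))
```

With window=1 the difficulty whipsaws after every lucky or unlucky block, which should show up as a much larger spread of intervals; a longer window damps the oscillation at the cost of responding more slowly to genuine changes in stake weight.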
3085  Alternate cryptocurrencies / Announcements (Altcoins) / Re: [ANN][CLAM] CLAMs, Proof-Of-Chain, Proof-Of-Working-Stake on: July 08, 2015, 04:21:29 PM
If I have 1 clam and I split it as far as possible down to 100,000,000 "clamtoshis" in order to have the best chance to stake is this really the right thing to do?  And if everyone else does it wouldn't it make your UTXO set astronomically huge?  And wouldn't it make spending any reasonable amount of clams extremely expensive (because of all the nano-outputs you'd have to reference)?

I'm sure I'm missing something.  Thanks in advance!

Splitting a CLAM into 10^8 separate outputs is going to cost you a lot in transaction fees.

Splitting is a case of diminishing returns. The gains you get from splitting 1 CLAM into two halves are far bigger than any gain from splitting those two halves into 4 quarters.

And yes, merging those outputs back into a single output will also cost you a fortune in transaction fees.

So it's a trade-off. There's pressure in both directions (too little splitting and your coins spend too much of their life "maturing"; too much splitting and you pay a lot of transaction fees both when splitting and when re-joining, and use too much CPU time checking all your outputs for staking opportunities), and so you can solve some equations to find the optimal splitting level based on how long you anticipate staking for, the speed of your staking machine, the price of CLAM, etc. (Or you can just set all your outputs to be 13.37 CLAM each because you think the number is 'kewl').
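
As a back-of-the-envelope illustration of the fee side of that trade-off: transaction size grows linearly with the number of outputs (and later, inputs), and fees grow with size. The byte sizes and fee rate below are typical Bitcoin-family figures, not CLAM specifics:

```python
# Rough cost of splitting 1 CLAM into n outputs and later re-merging them.
# These are standard Bitcoin-style P2PKH sizes and a hypothetical fee rate,
# used only to show how the cost scales with n.
BYTES_PER_OUTPUT = 34      # P2PKH output
BYTES_PER_INPUT = 148      # P2PKH input (signature + pubkey)
TX_OVERHEAD = 10           # version, locktime, counts
FEE_PER_KB = 0.0001        # CLAM per 1000 bytes (made up)

def split_cost(n):
    # One input, n outputs
    return (BYTES_PER_INPUT + n * BYTES_PER_OUTPUT + TX_OVERHEAD) / 1000.0 * FEE_PER_KB

def merge_cost(n):
    # n inputs, one output
    return (n * BYTES_PER_INPUT + BYTES_PER_OUTPUT + TX_OVERHEAD) / 1000.0 * FEE_PER_KB

for n in (2, 100, 10**8):
    print(n, split_cost(n) + merge_cost(n))
```

At 10^8 outputs the round-trip fee dwarfs the 1 CLAM being split, which is the point: the fee cost grows linearly while the staking gain from each further split shrinks.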
3086  Alternate cryptocurrencies / Announcements (Altcoins) / Re: [ANN][CLAM] CLAMs, Proof-Of-Chain, Proof-Of-Working-Stake on: July 08, 2015, 04:16:43 PM
Black vortex. You know, something silly like that...
Do you mean kimoto gravity well?

Doesn't sound right to me. I think it was cooler sounding.

I guess I'm thinking of this, from https://en.wikipedia.org/wiki/Dogecoin:

Quote
On March 12, 2014, version 1.6 of the Dogecoin client was announced. Along with allowing for there to be a fixed reward per block, the new client update also introduced a new difficulty algorithm called DigiShield

But KGW looks interesting too:

http://bitcoin.stackexchange.com/questions/21730/how-does-the-kimoto-gravity-well-regulate-difficulty

My point is that there are probably better ways to retarget the difficulty than the naive way we're currently doing it (which in turn is better than the way we were doing it before...).
3087  Bitcoin / Mycelium / Re: Mycelium Bitcoin Wallet on: July 08, 2015, 09:38:27 AM
Do you have "only use Tor" enabled? It's towards the end of the settings.
Exactly where can I find that option? I only see the same as armedmilitia described: a use tor-network option that has two settings: "https" or "external tor (Orbot)". What should it be on?

Is there a way to NOT use Tor at all? I don't mind about the potential privacy risk involved. I never order child porn and terrorist attacks with Mycelium.

Seems like I misread the option. I see the same as you, and don't understand the options. Maybe there isn't an option not to use Tor, idk. I don't have Orbot installed, and so have always had 'Use https' selected, which shows up in the top level options as "(only https)". Maybe "Use https" means "don't use Tor", but it's really not clear.
3088  Alternate cryptocurrencies / Announcements (Altcoins) / Re: [ANN][CLAM] CLAMs, Proof-Of-Chain, Proof-Of-Working-Stake on: July 08, 2015, 06:12:53 AM
If we ever fork again, what would be the downside of increasing the window and further smoothing difficulty?

I just wonder if there isn't a better way of doing it that doesn't cause this constant swinging up and down.

Is there a chart of the DOGE difficulty over time, I wonder? They use some difficulty adjustment system with a fancy name that I forgot. Turbo lightning. Black vortex. You know, something silly like that...
3089  Alternate cryptocurrencies / Announcements (Altcoins) / Re: [ANN][CLAM] CLAMs, Proof-Of-Chain, Proof-Of-Working-Stake on: July 08, 2015, 05:22:27 AM
It's a very clever concept. Essentially you were able to reduce the amount of work involved in the proof-of-stake hashing, while doing it in a manner that does not compromise solving blocks, because the difficulty will adjust to make the statistics approximately correct.

I have always wondered why the CLAM difficulty chart moves in such smooth waves compared to other coins, I suppose this could have something to do with that. Although now I wouldn't be surprised if you clammers redesigned the difficulty adjustment code as well.

Three things:

1) When the time granularity changed from 1 second to 16 seconds, another accidental change went in at the same time which missed out a factor of 10^8 in the difficulty calculation. It made blocks 100 million times easier to solve, but gave us 16 times fewer chances to try, so the two factors combined made it 6.25 million times easier than before to stake. As a result everyone was staking every 16 seconds, everyone was orphaning everyone else for a few hours, the network difficulty adjusted upwards very quickly, and after 3 or 4 hours everything was back to normal. It was kind of amazing to watch the difficulty adjustment code do its thing and deal with the accidental 10^8 error. Do you have long-term difficulty charts? If so you'll easily see the time I'm talking about. You need a log scale, or it will look like the difficulty was 0 before the steep rise.
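
The combined factor, spelled out:

```python
difficulty_error = 10**8   # blocks accidentally 100 million times easier to solve
granularity = 16           # but only one staking attempt per 16 seconds instead of per second

net_advantage = difficulty_error / granularity
print(net_advantage)   # 6250000.0 - staking was 6.25 million times easier than intended
```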

Edit: I don't see how to adjust the date range on your chart, but http://blocktree.io/charts/CLAM shows this:



The difficulty wasn't really 0 before November 2014, it just looks that way when you don't use a log y-axis.

2) I don't think the CLAM project is responsible for the 16 second granularity idea or code. I think it is from one of the upstream projects. I've never paid attention to any of them, but I think it was dark coin, black coin, or something like that.

3) The difficulty adjustment code was written by me specifically for CLAM, and went live at the same time as the v2 protocol and the 10^8 error. It's not ideal how it waves up and down several times per day, but it's better than the previous system of difficulty adjustment, which was far too reactive to how long the last 1 block took to stake.
3090  Bitcoin / Mycelium / Re: Mycelium Bitcoin Wallet on: July 08, 2015, 04:28:34 AM
Just letting you guys know I'm having similar issues with the mycelium client, looks like my transactions just aren't being broadcasted.

Do you have "only use Tor" enabled? It's towards the end of the settings.
3091  Bitcoin / Mycelium / Re: Mycelium Bitcoin Wallet on: July 08, 2015, 04:00:31 AM
This message was just posted by Mycelium on Twitter:-

"All the spam flood clogged our Tor server, so we're having to defrag the 160gig database it's on. Will take a while. Sorry"

I went into settings and stopped using Tor and was successfully able to make transactions. My past transactions that did not appear on the blockchain have disappeared, and my balance is correct.

Twitter really does seem to be a very good place to get CS as no company wishes to give bad publicity in such a Public Forum.

Thanks for forwarding it here. I rarely use Twitter and it didn't occur to me as a place to look for Mycelium announcements.

I'm pretty sure I don't use Tor at all. I never enabled it in the settings, and so I guess the Tor server isn't the only server having trouble.
3092  Alternate cryptocurrencies / Announcements (Altcoins) / Re: [ANN][CLAM] CLAMs, Proof-Of-Chain, Proof-Of-Working-Stake on: July 08, 2015, 03:56:29 AM
Wow! Ok now that you say that I see the same 16 second mask in the check stake code. Now this all makes sense in the context dooglus put it in. This is a pretty cool way to hash for proof of stake. Now I know why I was so lost :/

The nSearchInterval variable used to (before v2 of the protocol) be set to how many seconds since the last time we tried hashing, and the hashing function would then iterate through all those missed seconds.

In v2 we don't want it doing that, so we set the variable to 1 so we get a single iteration of the loop.

The 16 second mask thing makes it 16 times easier to find a block than it was before, since we get to try 16 times less often, but still want the same frequency of blocks being solved. This makes it 16 times more likely that two peers will find a block at the same time, and so we end up with 16 times as many orphans as before. Other than that it's cool though.

Oh, and as for SuperClam's "dooglus can verify and expound on this; but", I don't need to. She pretty much nailed it.
3093  Alternate cryptocurrencies / Announcements (Altcoins) / Re: [ANN][CLAM] CLAMs, Proof-Of-Chain, Proof-Of-Working-Stake on: July 08, 2015, 02:03:57 AM
Again, I may be picturing this incorrectly in my head, so let me explain my reading of the code and see if it lines up with yours.

Good idea. When you're specific like that it's easier to pinpoint where we're talking at cross purposes.

15 seconds is the future drift limit (which is I think what you are referring to when you say time granularity?).

No, I'm talking about kernel.h:

Quote
// To decrease granularity of timestamp
// Supposed to be 2^n-1
static const int STAKE_TIMESTAMP_MASK = 15;

and main.cpp:

Quote

    if (IsProtocolV2(nBestHeight+1))
        txCoinStake.nTime &= ~STAKE_TIMESTAMP_MASK;
    int64_t nSearchTime = txCoinStake.nTime; // search to current time

That is 'AND'ing the time with binary ~1111, ie. zeroing the last 4 bits, ie. rounding down to a multiple of 16.
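
A quick sanity check of what that mask does (timestamp chosen arbitrarily):

```python
# Same constant as in kernel.h: binary 1111
STAKE_TIMESTAMP_MASK = 15

t = 1436400123                         # an arbitrary unix timestamp
rounded = t & ~STAKE_TIMESTAMP_MASK    # zero the low 4 bits

# rounded is t rounded down to the nearest multiple of 16
print(t, rounded, rounded % 16)
```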

and the search interval is 1 second.

https://github.com/nochowderforyou/clams/blob/master/src/main.cpp#L2603
Code:
int64_t nSearchInterval = IsProtocolV2(nBestHeight+1) ? 1 : nSearchTime - nLastCoinStakeSearchTime;

No, because look at the lines before that:

Quote
   if (nSearchTime > nLastCoinStakeSearchTime)
    {
        int64_t nSearchInterval = IsProtocolV2(nBestHeight+1) ? 1 : nSearchTime - nLastCoinStakeSearchTime;

It only gets into that block if nSearchTime is greater than the last search time. And nSearchTime has already been rounded down to a multiple of 16 seconds. So it will only get into this block at most once per 16 seconds.

I also can't find the relevant part of the code that tells it to pause looking for 16 seconds.

... and now you can! :)
3094  Alternate cryptocurrencies / Service Announcements (Altcoins) / Re: Scrypt.CC | Scrypt Cloud Mining on: July 08, 2015, 12:27:44 AM
Rewards will be back, and price also, its only my tip, I believe it /thats why I bought cheap KHSs/, I would bet on it.

This is interesting. What terms and stakes do you propose?
3095  Alternate cryptocurrencies / Service Announcements (Altcoins) / Re: Just-Dice.com : Invest in 1% House Edge Dice Game on: July 08, 2015, 12:22:12 AM
Dooglus can you, or do you already, optimize the transactions for staking? You wrote that a part of the calculation is the size of the clam amount per incoming transaction. Cant you create new transactions and send them automatically to another jd-address when enough coins were deposited? That way you could create clean integers and nothing is rounded down. Though maybe that wont have an effect with the other parameters.

Is there something optimizeable?

We already do - we split the outputs in the JD wallets to be around 13.37 CLAMs each. I think that the rounding down doesn't happen any more - that was in an old (the original?) version of the protocol I was describing.

I just replied to the posts you quoted in their own thread.
3096  Alternate cryptocurrencies / Announcements (Altcoins) / Re: [ANN][CLAM] CLAMs, Proof-Of-Chain, Proof-Of-Working-Stake on: July 08, 2015, 12:20:18 AM
4 Seconds? Is that needed for finding a block each time?

That's how long it takes to search for staking opportunities each 16 seconds. Most times you check you find that there isn't any such opportunity. There's only one per minute globally, so even with 100% of the staking weight you'd have around a 1 in 4 chance of any particular search being successful. When we do find a block, it will be at a random point through that 4 second search, and so on average a successful search takes 2 seconds ((0 + 4) / 2).
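
The arithmetic, spelled out:

```python
# One block is staked network-wide roughly once per minute, but staking
# attempts only happen when the clock hits a multiple of 16 seconds.
blocks_per_second = 1 / 60.0
seconds_per_tick = 16

# Expected blocks per tick = chance a given search succeeds,
# even with 100% of the staking weight:
p_success = blocks_per_second * seconds_per_tick
print(round(p_success, 3))   # 0.267 - about 1 in 4

# A successful search finishes at a uniformly random point in the 4 second
# scan, so on average it takes (0 + 4) / 2 = 2 seconds.
avg_success_time = (0 + 4) / 2
```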

Given a difficulty leading to 16 seconds these 4 seconds would be huge. I mean the difference to orphan blocks or get block orphaned. Or isnt it needed to calculate a block?

I don't think you're understanding still. "Difficulty leading to 16 seconds" isn't what's happening. The difficulty adjusts how hard it is to stake a block. It adjusts such that we find around one block per minute. But blocks can only be found when the time (in seconds since some date in 1970) is a multiple of 16. That only happens every 16 seconds. That's fixed by the protocol (until the developers change the protocol again, of course), and isn't related to the difficulty.

So the 4 seconds are real and you meant it goes through each of that outputs (shouldnt it be inputs as long as they arent sent out?) and checks if it finds a hash? Does the output amount matter here? I mean you described rounding the amount of clams down to an integer. Does this apply to the address these outputs are on, so a big amount of clams or only to the single output? If the latter then one could get an advantage by sending the clams in amounts of 1 to a new address. The chance to find a block would be maximized?

It's a real 4 seconds. 4 seconds out of every 16 seconds the CPU on one core of JD's staking wallet server is pegged at 100%. They're outputs of the transactions that created them. They're not the inputs of any transactions yet, or they wouldn't be unspent. They're potential inputs, if you like, but actual outputs. When they stake they become inputs of the staking transaction.

The rounding down to an integer was related to how the age of an output affected its staking power in an older version of CLAM. I think it used to multiply the value by the age and round down to an integer. I don't think it does that rounding any more, or the multiplication by the age. These days the staking power (called the "weight") is just the same as the value in CLAMs. Each output is considered separately. It doesn't matter if you have lots of 1 CLAMs outputs on a single address, or in lots of different addresses. They each get their own individual chance of staking, with a probability proportional to their own individual value in CLAMs.

There is a benefit to splitting your outputs up into several smaller outputs. Suppose you have 1000 CLAMs. It will stake very quickly, and become 1001 CLAMs. But then it will take 8 hours to mature before it can stake again. The best you could hope for is that it will stake 3 times per day (since that's how many 8 hour maturation periods you can fit into a day).

If instead you split it into 1000 outputs of size 1, each one tries to stake independently. Each one has a 1000 times lower chance of staking than the 1000 CLAM output did, but there are 1000 of them, so it takes roughly the same time for one of them to stake, and turn from 1 CLAM to 2 CLAMs. Then, however, only the 2 CLAM output is frozen for 8 hours while it matures. The other 999 CLAMs continue trying to stake. So you have saved yourself an 8 hour wait for 99.9% of your value.

If you split your value up into *too* many outputs, you'll have so much hashing to do every 16 seconds that you won't be able to get through it all. And if you ever want to spend your outputs, having them split up into millions of tiny pieces makes the transaction which spends them very big (and so very expensive in tx fees).

So there's a tradeoff - split enough, but not too much.
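
A crude steady-state model of the example above (the once-per-hour full-holding stake rate is made up; only the shape of the curve matters):

```python
# Each of n outputs alternates between an active period (mean n/r seconds,
# since each output holds 1/n of the value) and an 8 hour frozen maturation
# period, so total throughput is n / (n/r + M) stakes per second.
r = 1.0 / 3600.0    # stakes/sec the whole holding would earn if never frozen (assumption)
M = 8 * 3600        # 8 hour maturation, in seconds

def stakes_per_day(n_outputs):
    return 86400.0 * n_outputs / (n_outputs / r + M)

for n in (1, 2, 10, 100, 1000):
    print(n, round(stakes_per_day(n), 2))
```

One output caps out near 3 stakes per day, as described; going from 1 output to 2 nearly doubles the rate, while going from 100 to 1000 gains almost nothing - the diminishing returns mentioned earlier, before fees and CPU time are even counted.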
3097  Alternate cryptocurrencies / Announcements (Altcoins) / Re: [ANN][CLAM] CLAMs, Proof-Of-Chain, Proof-Of-Working-Stake on: July 08, 2015, 12:07:53 AM
Awesome work with the stake data chilly2k - we're happy to have you as part of the family :)

Deb calls it "the clamily".
3098  Alternate cryptocurrencies / Announcements (Altcoins) / Re: [ANN][CLAM] CLAMs, Proof-Of-Chain, Proof-Of-Working-Stake on: July 08, 2015, 12:06:49 AM
The JD staking wallet checks something like 30k outputs in 4 seconds.

I believe this means that you are missing 3 variations of hashes for each output per every 4 seconds?

I don't understand what you're asking, sorry.

The CLAM protocol has a time granularity of 16 seconds. Every 16 seconds it checks each of its unspent outputs to see if it can stake, then sleeps until the next 16-second "tick" of the clock.

If it takes you more than 16 seconds to check all your unspent outputs then you'll be missing out on staking opportunities, because you'll be falling further and further behind. JD is able to do all its staking work in 4 seconds and have a 12 second snooze before the next opportunity comes up.

I don't understand the bit about "missing 3 variations of hashes". There's only one hash per output per 16 seconds.

Now if JD was able to check all of its outputs in 0.001 seconds then it would probably have a lower orphan rate, since it is kind of a race. Maybe that's what you're referring to?
3099  Bitcoin / Mycelium / Re: Mycelium Bitcoin Wallet on: July 07, 2015, 11:42:07 PM
Is there some other place for Mycelium support? It's awfully quiet here for such a serious trouble. Where are the updates?
3100  Bitcoin / Mycelium / Re: Mycelium Bitcoin Wallet on: July 07, 2015, 05:58:03 PM
There has been some serious stress testing / spamming going on.
We are currently investigating if anything is stuck at our nodes, but besides that possibility, some transactions just need ages to confirm, some block explorers report inconsistent or outdated data and there is just in general a huge load on the systems. We are checking everything, sorry for the inconvenience!

I don't think that can be the problem (ie. I don't think it's an issue with the block explorers), because when I rebooted my tablet the unconfirmed transaction vanished. As if it had never happened. And I was able to successfully double-spend it using bitcoind. If the transaction was broadcast successfully surely it wouldn't vanish from Mycelium's transaction list would it?