I'm overlooking something here, and I'm sure someone can point out to me where my error is.

If a pool finds a block on average every 60 minutes ("100%" luck), then...

1. If that pool finds a block in 1 minute, it's 60x better than expected?

2. Therefore, at the opposite end of luck, 60x worse than expected is 3600 minutes?

3. However, the average of 1 and 3600 is 1800.5, not 60.

Is #3 the flawed premise, or is something wrong with #1 and #2? Or all of the above?
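To make the numbers concrete, here's a small sketch of what I mean. It assumes block discovery behaves like a Poisson process (so inter-block times are exponentially distributed with a 60-minute mean); that assumption is mine, not something I know about the pool:

```python
import math
import random

random.seed(1)

# Assumption: inter-block times are exponential with mean 60 minutes.
mean_minutes = 60.0
times = [random.expovariate(1.0 / mean_minutes) for _ in range(100_000)]

avg = sum(times) / len(times)
median = sorted(times)[len(times) // 2]
print(f"sample mean:   {avg:.1f} min")    # close to 60
print(f"sample median: {median:.1f} min") # noticeably below 60

# The two "60x" endpoints from my question:
fast, slow = 1.0, 3600.0
print((fast + slow) / 2)       # arithmetic mean: 1800.5, not 60
print(math.sqrt(fast * slow))  # geometric mean: 60.0
```

So the arithmetic mean of my two endpoints isn't 60, though their geometric mean is, which is part of what's confusing me.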

Thanks.

M