after each block:
difficulty+=.001*600-block.genTimeInSeconds
I assume you meant "difficulty += .001 * (600 - block.genTimeInSeconds)" instead?
Thus, the difficulty would adjust dynamically up or down every block, with the magnitude of the adjustments being in proportion to the influx or exodus of computing power during that last block.
Actually, in the current algorithm the magnitude of the adjustments is already proportional to the influx or exodus of computing power; it's just that the adjustment happens once every 2016 blocks instead of whenever a block is received.
I think that what you're proposing is for the algorithm to adjust the difficulty continuously rather than periodically.
If so, I think there are better ways.
For one, we shouldn't use floating point to compute the target hash, because rounding differences across architectures could cause some computers to accept a hash that others reject.
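To illustrate, here is a minimal sketch of a retarget step done with pure integer arithmetic, so every node computes a bit-for-bit identical target. The names (retarget, MAX_TARGET, the clamping factor of 4) are my own illustrative choices, not taken from any particular client:

```python
# Adjust a 256-bit target using integer-only arithmetic.
# Python ints are arbitrary precision, so there is no rounding
# ambiguity between architectures.

EXPECTED_TIMESPAN = 2016 * 600  # two weeks, in seconds
MAX_TARGET = 2**224             # illustrative cap on the easiest target

def retarget(old_target: int, actual_timespan: int) -> int:
    # Clamp the measured timespan to bound any single adjustment step.
    actual_timespan = max(EXPECTED_TIMESPAN // 4,
                          min(actual_timespan, EXPECTED_TIMESPAN * 4))
    # Multiply before dividing: exact integer math, deterministic result.
    new_target = old_target * actual_timespan // EXPECTED_TIMESPAN
    return min(new_target, MAX_TARGET)
```

Because the only operations are integer multiply, divide, min, and max, two nodes can never disagree on the resulting target the way they could with floats.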
Second, I don't understand where the ".001" constant comes from, and I'm not convinced that your algorithm would accurately reflect the difficulty of the hash generation. It looks like it would converge on the right difficulty eventually, but I'm not sure how quickly it would do that.
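A toy simulation makes the convergence concern concrete. The model below is entirely my own assumption: block time is treated as a deterministic difficulty / hashrate with no variance, and the proposed rule "difficulty += .001 * (600 - genTime)" is applied each block:

```python
# Toy model (assumptions mine): gen_time = difficulty / hashrate,
# updated per block by the proposed rule. Counts how many blocks
# pass before the block time gets within `tol` seconds of 600.

def blocks_to_converge(difficulty, hashrate, tol=1.0, max_blocks=1_000_000):
    for blocks in range(max_blocks):
        gen_time = difficulty / hashrate
        if abs(gen_time - 600) < tol:
            return blocks
        difficulty += 0.001 * (600 - gen_time)
    return max_blocks
```

In this model the error shrinks by a factor of (1 - 0.001 / hashrate) per block, so the convergence rate depends on the units the hashrate happens to be measured in; starting with blocks taking twice the target time, it takes thousands of blocks to settle, and doubling the hashrate makes it slower still. That is the sense in which the ".001" constant is arbitrary.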
I think that to achieve your goal, it would be much better to simply change the current algorithm to adjust the difficulty at the end of each block rather than at the end of each sequence of 2016 blocks.
In other words, every time we accept a new block, we'd look at the elapsed time of the latest 2016 blocks and calculate what the new target hash should be.
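The per-block sliding-window idea can be sketched as follows; the function name, the assertion style, and the clamping of each step are my own illustrative choices:

```python
# Sketch: on every new block, recompute the target from the elapsed
# time of the most recent 2016 blocks, rather than once per
# 2016-block period.

WINDOW = 2016
TARGET_SPACING = 600  # seconds per block
EXPECTED_TIMESPAN = WINDOW * TARGET_SPACING

def sliding_retarget(old_target: int, timestamps: list) -> int:
    # timestamps: block times of the latest WINDOW + 1 blocks, oldest first
    assert len(timestamps) == WINDOW + 1
    actual = timestamps[-1] - timestamps[0]
    # Clamp (an assumption of mine) to bound any single adjustment,
    # then use exact integer math so all nodes agree on the result.
    actual = max(EXPECTED_TIMESPAN // 4, min(actual, EXPECTED_TIMESPAN * 4))
    return old_target * actual // EXPECTED_TIMESPAN
```

When the latest 2016 blocks took exactly the expected timespan, the target is unchanged; when they took twice as long, the target doubles (difficulty halves), just as in the periodic rule, but the correction begins one block after hashpower changes rather than up to 2016 blocks later.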
This seems feasible and easy enough; the only problem is that it's not a backwards-compatible change, so you can only start doing this once everyone has upgraded to a new version that knows how to do the new calculation.