There seem to be a lot of technical problems with increasing the block size in Bitcoin, so I was wondering: could we recalibrate the difficulty formula so that blocks are generated faster?
This would be a hard fork change. Therefore, it would have many of the same "technical problems" as increasing the maximum allowed block size.
This would increase tx throughput while keeping blocks the same size.
It would, but it would also increase the rate at which the blockchain grows in size.
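To make the proposal concrete, here is a rough sketch of Bitcoin-style difficulty retargeting with the target block spacing changed from 10 minutes to 2 minutes. The 2016-block retarget interval and the 4x clamp follow Bitcoin's actual rules; the function name, the 2-minute spacing, and the sample target value are illustrative assumptions, not real node code.

```python
# Hypothetical sketch of Bitcoin-style difficulty retargeting, adjusted
# so blocks arrive every 2 minutes instead of 10. Names and the 2-minute
# constant are illustrative assumptions.

RETARGET_INTERVAL = 2016          # blocks between difficulty adjustments
OLD_BLOCK_SPACING = 10 * 60       # current target: one block per 600 s
NEW_BLOCK_SPACING = 2 * 60        # proposed target: one block per 120 s

def retarget(old_target: int, actual_timespan: int,
             block_spacing: int = NEW_BLOCK_SPACING) -> int:
    """Scale the proof-of-work target by how far off schedule the last
    retarget interval ran. A larger target means lower difficulty."""
    expected_timespan = RETARGET_INTERVAL * block_spacing
    # Bitcoin clamps the adjustment to a factor of 4 in either direction.
    actual_timespan = max(expected_timespan // 4,
                          min(actual_timespan, expected_timespan * 4))
    return old_target * actual_timespan // expected_timespan

# If the last 2016 blocks came at the old 10-minute pace but we now aim
# for 2-minute blocks, the target is raised (difficulty lowered) by the
# clamped factor of 4 per retarget, converging on the faster schedule.
old_target = 0x00000000FFFF0000000000000000000000000000000000000000000000000000
print(retarget(old_target, 2016 * OLD_BLOCK_SPACING) // old_target)
```

The point of the clamp is that a 5x speed-up would not happen in one retarget; the chain would approach the new 2-minute spacing over a couple of adjustment periods.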
This could be done one of two ways.
(1) Keep the block reward the same. The disadvantage, of course, is that coins would be mined faster, reaching the 21 million cap much sooner.
Correct.
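The speed-up in emission is easy to quantify. The 210,000-block halving interval below is Bitcoin's real parameter; the 2-minute spacing is the assumed faster schedule from the question.

```python
# Illustrative arithmetic: with 10-minute blocks, the 210,000-block
# halving interval takes roughly 4 years. At 5x the block rate with an
# unchanged reward, every halving (and therefore the approach to the
# 21 million cap) arrives 5x sooner.

HALVING_INTERVAL = 210_000        # blocks per reward halving (real value)

def years_per_halving(block_spacing_seconds: int) -> float:
    return HALVING_INTERVAL * block_spacing_seconds / (365.25 * 24 * 3600)

print(round(years_per_halving(600), 2))   # 10-minute blocks: ~4 years
print(round(years_per_halving(120), 2))   # 2-minute blocks: 5x faster
```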
The advantage would be compatibility with previous versions, which could mine on this chain.
Incorrect. Previous versions would not recognize any of these blocks as valid. They would reject them all and fork the blockchain. Previous versions would be incompatible and unable to mine on this chain.
Wouldn't this have the same problem as increasing the blocksize?
That is, nodes that aren't updated wouldn't accept the new blocks as they'd see your difficulty to be too low and you'd fork the main chain?
Yes, un-updated nodes would reject them.
However, if 5 times as many blocks are being created, then even with only 30% of the original main chain's hash power we would still be producing blocks at 150% of that chain's rate (5 × 0.30 = 1.5). Our chain would be longer.
It doesn't matter which chain is longer. Nodes only accept the longest "VALID" chain. An invalid chain can be 1,000,000% longer, and it would still be rejected. Since un-updated nodes won't recognize these faster blocks as being valid, it would create an altcoin fork.
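The chain-selection rule being described can be sketched as follows. This is a minimal toy model, not real node code: the `best_chain` helper, the dict-based blocks, and the difficulty threshold are all illustrative assumptions.

```python
# Minimal sketch of why "longest chain" alone doesn't win: a node
# validates every block first, and only compares chains whose blocks
# all pass its own consensus rules.

def best_chain(chains, is_valid_block):
    """Pick the longest chain among those that are fully valid."""
    valid = [c for c in chains if all(is_valid_block(b) for b in c)]
    return max(valid, key=len) if valid else []

# An un-updated node's rule: reject the new low-difficulty blocks.
old_rules = lambda block: block["difficulty"] >= 100

old_chain  = [{"difficulty": 100}] * 10   # 10 blocks, all valid
fast_chain = [{"difficulty": 20}] * 15    # 15 blocks, all invalid to old nodes

# The fast chain is 50% longer, but the old node ignores it entirely.
print(len(best_chain([old_chain, fast_chain], old_rules)))  # 10
```

Because validity is checked before length, the faster chain's extra blocks never enter the comparison, which is exactly why the split produces an altcoin fork rather than a reorganization.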