But still, for a system that's supposed to hold and secure all sorts of valuable assets, I know I'm not the first to raise the concern about infinite loops due to Turing completeness.
I'm satisfied that infinite loops can be handled by making callers pay per cycle.
What's more concerning is that nobody has bothered to do much actual use-case evaluation or cost/benefit analysis.
Ethereum is supposed to be, fundamentally, a blockchain-based computer.
Because it's a global blockchain, the cycle times need to be somewhere on the order of minutes to ensure synchronization.
Cycle times measured in minutes mean a clock frequency in the millihertz range.
The processor in your smartphone operates in the gigahertz range.
That's not quite a fair comparison: each tick of the Ethereum computer can do more useful work than a single tick of a traditional CPU. How much more work? Maybe about a million times more.
This means Ethereum is a computer with an effective clock speed in the kilohertz range. (The Intel 8086 processor, released in 1978, had a minimum clock speed of 5 megahertz.)
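The back-of-envelope arithmetic works out like this. The cycle time and work factor below are the illustrative assumptions from above (minutes-scale blocks, roughly a million times more useful work per tick), not measured figures:

```python
# Back-of-envelope: effective clock speed of a blockchain computer.
# Assumed, illustrative numbers: a 5-minute cycle time and ~1e6 times
# more useful work per tick than a conventional CPU tick.
cycle_time_s = 5 * 60          # minutes-scale cycle time, in seconds
raw_hz = 1 / cycle_time_s      # ~3.3 millihertz raw clock
work_factor = 1_000_000        # "about a million times more work" per tick
effective_hz = raw_hz * work_factor

print(f"raw clock:       {raw_hz * 1000:.1f} mHz")   # ~3.3 mHz
print(f"effective clock: {effective_hz / 1000:.1f} kHz")  # ~3.3 kHz
```

However you tweak the assumptions, you stay in kilohertz territory, some six orders of magnitude below the gigahertz chip in your pocket.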
Sure, it's Turing-complete, but the kinds of applications you're going to run on a computer with kilohertz-scale cycle times and access latencies measured in minutes are going to resemble the kind of programs that used to be written on punch cards more than they'll resemble Windows 8.
Oh, and how about the cost?
Let's assume there really are applications appropriate for the Ethereum computer. Not only that, but the market demand for these applications is great enough that transaction fees will pay for the operation of the Ethereum network (no network can pay for itself via currency printing forever, after all).
Computations performed on the CPU in your PC only need to involve the transistors in a single chip, and are generally only performed once.
Computations performed on Ethereum need to be duplicated by CPUs all over the world and broadcast all over the globe.
Running your applications on Ethereum (absent any currency-printing subsidy) is going to be millions of times more expensive than running them on a local CPU.
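The "millions of times" multiplier falls out of the replication itself. A minimal sketch, where the node count and per-node overhead factor are pure guesses for illustration, not measurements of any real network:

```python
# Rough cost multiplier for globally replicated computation.
# Assumed, illustrative numbers: every full node re-executes every
# computation; the node count and overhead factor are guesses.
local_executions = 1          # a normal CPU runs the computation once
replicating_nodes = 10_000    # hypothetical number of full nodes
overhead_per_node = 100       # guessed consensus/broadcast overhead factor

cost_multiplier = (replicating_nodes * overhead_per_node) / local_executions
print(f"~{cost_multiplier:,.0f}x the cost of running locally")
```

Even with far more charitable guesses, the multiplier stays in the tens of thousands at minimum, because you're paying every replicating node to redo the same work.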
Certainly there will be some applications that absolutely require what Ethereum does and will be willing to pay six orders of magnitude more than other alternatives in order to get it, but will there be enough of those applications to pay for the operation of the Ethereum network?