So can it be cheaper?
Of course. The problem with Bitcoin is that it uses a relatively simple hashing algorithm, and the simpler the algorithm, the more specialized the mining hardware you can build for it. That is bad for energy efficiency.
Example:
1. In a Bitcoin ASIC, most of the chip's circuit pathways are in use all the time. Let's say 10% of the transistors are active at any given moment, because there aren't that many different things that can happen in the algorithm.
2. In an ETH mining GPU, 99% of the transistors are idle and only 1% are in use at any given moment, because there are so many different paths the algorithm can take.
I'm pulling these percentages out of my ass since nobody really has those figures, but the idea holds: the simpler the algorithm, the more optimized the hardware, the fewer idle transistors, and the higher the energy draw. There is a physical limit to how many transistors the world's chip manufacturers can squeeze out, and combined with the utilization percentage, that pushes down the energy cost of more complicated algorithms.
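To make the utilization argument concrete, here's a minimal back-of-the-envelope sketch in Python. The percentages are the made-up figures from the list above, and the per-transistor power number is an equally made-up placeholder, so only the ratio matters:

```python
# Toy model: power draw scales with how many transistors are actually switching.
# All numbers here are illustrative placeholders, not measured figures.

def power_draw_watts(transistors, utilization, watts_per_active_transistor=1e-9):
    """Active transistors burn power; idle ones mostly don't."""
    return transistors * utilization * watts_per_active_transistor

# Same transistor budget, different algorithms:
asic = power_draw_watts(transistors=10e9, utilization=0.10)  # simple algorithm, dense usage
gpu  = power_draw_watts(transistors=10e9, utilization=0.01)  # complex algorithm, mostly idle

print(f"ASIC-style chip: ~{asic:.1f} W, GPU-style chip: ~{gpu:.1f} W")
# ASIC-style chip: ~1.0 W, GPU-style chip: ~0.1 W
```

With a fixed worldwide supply of transistors, the chip that keeps 10x more of them busy burns roughly 10x more power for the same silicon.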
There is an even better example: "PoC", or proof-of-capacity algorithms, which are really just PoW variants that lean on huge amounts of storage. These are mined with hard drives, and due to the physical limits of a spinning disk, 99.999…% of the hardware is idle even while working at maximum speed. A Burst transaction costs ~0.02 kWh in electricity, and it would stay that low even if Burst were priced the same as Bitcoin, because the network has already reached HDD mining supremacy (that is, most of the capable hardware out there is already mining Burst instead of doing something else).
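For a feel of why the hardware sits idle, here's a toy proof-of-capacity sketch. This is not Burst's actual plot format (real Burst plotting uses Shabal-256 and a scoop layout); it only shows the compute-once, read-cheaply structure:

```python
import hashlib

def plot(account_id, num_nonces):
    """One-time, compute-heavy step: fill the 'disk' with precomputed hashes."""
    return [hashlib.sha256(f"{account_id}:{n}".encode()).digest()
            for n in range(num_nonces)]

def mine(plots, block_seed):
    """Cheap per-block step: scan stored hashes for the lowest 'deadline'."""
    best = None
    for nonce, h in enumerate(plots):
        deadline = int.from_bytes(hashlib.sha256(block_seed + h).digest()[:8], "big")
        if best is None or deadline < best[0]:
            best = (deadline, nonce)
    return best

disk = plot(account_id=12345, num_nonces=100_000)  # done once, reused every block
print(mine(disk, block_seed=b"example-block"))
```

The expensive hashing happens once at plot time; every block after that is mostly sequential disk reads, which is why the drives spend almost all of their time waiting rather than crunching.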