I think the chart is based on bad data. It seems to calculate energy usage from an average efficiency figure (something like kilowatts per MH/s) measured back when GPUs were the dominant mining hardware. Now that ASIC mining hardware is in use, that number is much smaller. In other words, the chart does not account for the efficiency gains of newer mining hardware.
Yes. The chart must be using outdated power figures. ASICs use very little power compared to CPUs, GPUs, and FPGAs. The dominant cost of mining now is the hardware itself, which that chart doesn't reflect.
Power usage for GPUs is in the range of 500-1000 J/GH, but ASICs are in the range of 1-2 J/GH. At 1 J/GH, a 1 TH/s miner draws about 1 kW, so it uses roughly 24 kWh of energy per day while mining more than $250 worth of coins. At $0.10 per kWh, the electricity cost is only $2.40 per day.
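To make that arithmetic explicit, here's a small sketch (the function name and the example figures for hashrate, efficiency, and electricity price are just the ones quoted above, not from any real miner spec):

```python
def daily_energy_kwh(hashrate_ghs, efficiency_j_per_gh):
    """Daily energy use in kWh.

    Power in watts = (GH/s) * (J/GH), since J/s = W.
    Then watts * 24 h / 1000 gives kWh per day.
    """
    power_watts = hashrate_ghs * efficiency_j_per_gh
    return power_watts * 24 / 1000

# A 1 TH/s (= 1000 GH/s) ASIC at 1 J/GH draws 1000 W:
energy = daily_energy_kwh(1000, 1.0)   # 24.0 kWh/day
cost = energy * 0.10                   # $2.40/day at $0.10 per kWh
print(f"{energy} kWh/day, ${cost:.2f}/day")
```

The same function with a GPU-era efficiency of 500 J/GH gives 12,000 kWh/day for the same hashrate, which is the kind of overestimate the chart seems to be making.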