http://www.washingtonpost.com/blogs/wonkblog/wp/2013/04/12/what-bitcoin-teaches-us-about-the-internets-energy-use/

In this article the author grossly overestimates the electricity cost of mining. IMO the reporter, Brad Plumer, fails to do his due diligence and evaluate the actual mining infrastructure. In a fit of lazy reporting he takes an inaccurate guesstimate from
http://blockchain.info/stats, which states: "Electricity consumption is estimated based on power consumption of 650 Watts per gigahash and electricity price of 15 cent per kilowatt hour. In reality some miners will be more or less efficient." That 650 W/GH figure is outdated now that FPGAs and the relatively new ASICs are on the scene, so the author's stated $147,000/day in electricity can be cut at least in half. As more and more ASICs take over the mining landscape, this number will keep dropping. What do you guys think?
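For anyone who wants to check the math, here's a quick sketch of the blockchain.info estimate using the two constants from their quote (650 W per GH/s, $0.15/kWh). The ~62,800 GH/s network hashrate below is my own back-calculation from the $147,000/day figure, not a number from either article:

```python
# Sketch of blockchain.info's electricity estimate.
# Assumptions from their quote: 650 W per GH/s, $0.15 per kWh.
# The hashrate below is back-calculated from the quoted $147k/day, not official.

def daily_electricity_cost(hashrate_ghs, watts_per_ghs=650, price_per_kwh=0.15):
    """Estimated daily electricity cost in USD for a given network hashrate (GH/s)."""
    total_kw = hashrate_ghs * watts_per_ghs / 1000.0  # total draw in kilowatts
    return total_kw * 24 * price_per_kwh              # kWh per day * $/kWh

# With the quoted 650 W/GH, ~62,800 GH/s lands right around $147,000/day:
print(round(daily_electricity_cost(62_800)))   # ~147k

# Halving the watts-per-gigahash assumption (FPGAs/ASICs) halves the estimate:
print(round(daily_electricity_cost(62_800, watts_per_ghs=325)))
```

Point being: the whole "$147,000/day" headline number hangs entirely on that 650 W/GH assumption, which ASIC-era hardware blows past easily.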