We all know that the higher the temperature, the higher the leakage current in the transistors. But the question is: how much higher?
Take the same GPU with constant clock, fan speed, and workload: at 80 °C it draws more power than it does at 70 °C. That is because leakage current rises sharply with temperature, and VRM efficiency drops as the VRMs get hotter.
There is an interesting article here:
http://www.techpowerup.com/reviews/Zotac/GeForce_GTX_480_Amp_Edition/27.html
"so for every °C that the card runs hotter it needs 1.2W more power to handle the exact same load."
Have you ever measured your card to see how much additional power it draws when the temperature increases by 1 °C?
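If anyone wants to try, here is a rough logging sketch, assuming an NVIDIA card whose driver exposes board power through nvidia-smi (not all cards/drivers report power.draw, and this is the driver's sensor reading, not wall power). Keep clock, fan and workload fixed, let the card heat up, and compare readings:

    import subprocess
    import time

    def read_temp_and_power():
        """Query GPU temperature (°C) and board power draw (W) via nvidia-smi."""
        out = subprocess.check_output([
            "nvidia-smi",
            "--query-gpu=temperature.gpu,power.draw",
            "--format=csv,noheader,nounits",
        ], text=True)
        temp_str, power_str = out.strip().split(", ")
        return float(temp_str), float(power_str)

    samples = []
    for _ in range(60):                      # one sample per second for a minute
        temp, power = read_temp_and_power()
        samples.append((temp, power))
        print(f"{temp:.0f} °C  {power:.1f} W")
        time.sleep(1)

    # Crude estimate of extra watts per °C: compare coolest vs hottest samples.
    coolest = min(samples)                   # tuples sort by temperature first
    hottest = max(samples)
    if hottest[0] > coolest[0]:
        slope = (hottest[1] - coolest[1]) / (hottest[0] - coolest[0])
        print(f"~{slope:.2f} W per °C between {coolest[0]:.0f} °C and {hottest[0]:.0f} °C")

It is only a ballpark (the driver's power reading is not as accurate as the dedicated measurement rig TechPowerUp used), but it should show whether your card follows a similar W-per-°C trend.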