Please, someone explain. I am baffled.

Power consumption can be estimated as TDP * (current GPU voltage / stock GPU voltage)^2 * (current GPU clock / stock GPU clock).
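That rule of thumb is easy to sketch in code. A minimal Python version is below; the TDP, voltage, and clock numbers in the example are made up purely for illustration, not taken from the card being discussed:

```python
def estimate_power(tdp_w, cur_v, stock_v, cur_mhz, stock_mhz):
    """Rule-of-thumb estimate: power scales with the square of voltage
    and linearly with clock speed, relative to stock settings."""
    return tdp_w * (cur_v / stock_v) ** 2 * (cur_mhz / stock_mhz)

# Hypothetical example: a 180 W TDP card overvolted from 1.00 V to 1.10 V
# and overclocked from 1500 MHz to 1800 MHz.
print(estimate_power(180, 1.10, 1.00, 1800, 1500))  # 261.36 W
```

Note the voltage term is squared while the clock term is linear, which is why undervolting tends to save more power than downclocking.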

Power is not proportional to the square of voltage. According to the Shockley diode equation, current is exponentially proportional to voltage, so power actually goes something like P ~ V*e^(cV), where c is a constant governed by device physics and temperature.
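For comparison, here is a minimal sketch of that diode-style model. The saturation current and ideality factor below are made-up textbook-style values, not measured GPU parameters, so only the shape of the curve matters:

```python
import math

def diode_power(v, i_sat=1e-12, n=1.0, v_t=0.02585):
    """Shockley diode equation: I = I_s * (exp(V / (n * V_T)) - 1),
    so P = V * I grows roughly like V * e^(cV) with c = 1 / (n * V_T)."""
    current = i_sat * math.expm1(v / (n * v_t))
    return v * current

# With these toy parameters, a 10% voltage bump multiplies power
# far more than the square-law rule's 1.1**2 = 1.21x would predict.
p1 = diode_power(0.60)
p2 = diode_power(0.66)
print(p2 / p1)
```

Whether a GPU's total draw actually follows this exponential or the quadratic rule of thumb depends on how much of the power is conduction versus CMOS switching losses, which is the crux of the disagreement above.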

But what is this? In this scenario, the owner listed the card as drawing only 128 watts for the ENTIRE SYSTEM, yet by my calculations the card ALONE should be drawing 157 watts.

OP only has the GPU plugged into the PSU he is measuring. That reading is also skewed, because it doesn't capture the power drawn through the PCI-E slot.