February 09, 2014, 09:00:48 PM
Hi fghj,
Yes, current draw does increase as the cores get warmer; that is how any silicon behaves. But in this case the temperatures were the same (until the card overheated from drawing so much current), and in any case that wouldn't explain such a drastic difference. I also think the current reading is correct. I even tested Intel and AMD motherboards in the same case just to see if the card would behave differently.
I can tell there is something really screwed up on Intel mobos. Well, to be honest it has only happened on 2 Intel systems so far, both with an iGPU, but I don't have many of those machines to test. One had an ASRock mobo and the other an Asus mobo; one machine had an Ivy Bridge processor and the other a Sandy Bridge. (Tested with 2 different sets of R9 280Xes.)
For example, at 800 MHz core / 1000 MHz memory and about 0.925 V, both cards can run at 50%-70% fan speed and ~73-75C on any AMD system. But when an Intel CPU (with iGPU) is used, the upper card draws ~20 A, its fan hits 100% quickly, and the card still climbs past 82C even at 100% fan speed. I see no reason to believe this has anything to do with card temperatures, the cards themselves, or my settings (which are the same on every machine). Stranger still, GPU-Z shows the correct core voltage for the card that draws so much power. Also, I have burned the voltage settings into the card BIOS using VBE editor/atiflash, and I do not set them after booting the machine, so it can't be some setting applied to the wrong card, etc.
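For anyone wanting to reproduce the burned-in-BIOS setup, here is a rough sketch of the workflow I mean. This is only an outline, not my exact commands: the adapter index 0 and the ROM file names are placeholders, and you should list your adapters with `atiflash -i` and double-check which index is which card before flashing anything.

```shell
# Sketch of the VBE editor + atiflash workflow (adapter index and file
# names are assumptions; verify your adapter list with `atiflash -i`).
if command -v atiflash >/dev/null 2>&1; then
    # 1. Save the stock BIOS as a backup before touching anything.
    atiflash -s 0 stock-280x.rom
    # 2. Edit the voltage/clock tables in stock-280x.rom with VBE7 and
    #    save the result as undervolted-280x.rom, then program it:
    atiflash -p 0 undervolted-280x.rom
    ATIFLASH_STATUS="flashed adapter 0; reboot required"
else
    ATIFLASH_STATUS="atiflash not found; run from your flashing environment"
fi
echo "$ATIFLASH_STATUS"
```

With the voltages in the ROM itself, the card comes up undervolted on any host, which is why a per-OS tweaking tool being applied to the wrong card can be ruled out.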
Does anybody here run R9 280X cards on Intel systems and has tried reducing voltages but failed to reduce current draw? I don't know how many people are actually tuning their cards for efficiency.
Thanks!