Power Consumption
For this test we hooked our power supply up to a UPM power meter which logs the power consumption of the whole system twice every second. In order to stress the GPU as much as possible, we once again use the Batch Render test in 3DMark06 and let it run for 30 minutes to determine peak load power consumption, while letting the card sit at a stable Windows desktop for 30 minutes to determine peak idle power consumption. We have also included several other tests. Please note that after extensive testing, we have found that simply plugging a power meter into a wall outlet or UPS will NOT give you accurate power consumption numbers due to slight changes in the input voltage. Thus, we use a Tripp-Lite 1800W line conditioner between the 120V outlet and the power meter.
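For those curious how the peak and average figures fall out of a log like this, here is a minimal sketch of the arithmetic. The file name, the one-reading-per-line format, and the way the log is read are assumptions for illustration only; the UPM meter's actual export format may differ.

```python
# Minimal sketch: derive peak and average system power from a meter log.
# ASSUMPTIONS (not from the meter's real export format): one watt reading
# per line in "power_log.txt", sampled at 2 Hz over a 30-minute run.

def summarize(log_path: str) -> None:
    with open(log_path) as f:
        watts = [float(line) for line in f if line.strip()]

    duration_s = len(watts) / 2  # two samples per second
    peak = max(watts)            # the "peak power consumption" we report
    average = sum(watts) / len(watts)

    print(f"Samples: {len(watts)} ({duration_s / 60:.1f} minutes)")
    print(f"Peak draw:    {peak:.1f} W")
    print(f"Average draw: {average:.1f} W")

if __name__ == "__main__":
    summarize("power_log.txt")  # hypothetical log file
```

A 30-minute run at two samples per second works out to 3,600 readings, which is why a momentary spike that a wall-outlet meter might miss still shows up in our peak numbers.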
*Please note that the HD4870 1GB used is a custom Sapphire card which is not based upon the reference design.
Now these are some interesting results, since power consumption is all over God's creation without much rhyme or reason. Our usual power consumption test stresses the GPU while leaving the CPU relatively untouched, and this resulted in some perceptible savings when running full-tilt but, to be honest with you, it wasn't much to write home about. On the other hand, idle power consumption is one area where the 55nm core really shines. This is great for those of you who keep your computer on during the day or use it for non-GPU intensive tasks every now and then.
Game Power Consumption
As we move on to in-game testing, things become foggy since one game shows the 55nm card consuming less while at other times it tops out higher than the 65nm version. You wouldn't believe how many times I redid these benchmarks since I didn't believe what I was seeing, but the accuracy of the numbers was borne out in an email sent to us by EVGA:
The 65nm GTX260 takes 182 watts on average and the 55nm, 171 watts.
55nm will draw less power on average across apps but in some cases, can draw more.
Interesting, isn't it? Even with the move to 55nm, the engineers at Nvidia were only able to shave about eleven watts (roughly a 6% reduction) off power consumption in what is probably a "best case scenario", while in other cases consumption will actually increase over the older core. This may have more to do with other items on the PCB, such as the power distribution components and layout, than with the GPU itself, but we have yet to receive a response fully answering all of our questions regarding this issue.