FurMark vs GPU Caps - Stress Issue & Temps
Hey all. I'm going to keep this simple.
If I run the GPU Caps Viewer OpenGL stability test at 1680x1050 with MSAA x8, my Radeon 5850's temps hit 82°C max with the fan auto-regulated at 33%.
If I run the FurMark stress test at the same resolution, my Radeon 5850's temps top out at 72°C max, with the fan again auto-regulated at a peak of 33%.
The temps are measured with GPU-Z and Catalyst Control Center, and in every instance both read the same.
So what gives? Isn't 100% GPU utilization 100% GPU utilization? :doh:
Likely a "software" discrepancy in how the two tests load the card, not in how the temps are read...
Hmm, a few days late, but oh well.
Actually, not all 100% are equal, for CPU or GPU. Technically speaking, it's impossible to truly use 100% of either, because they have such a variety of functions they can perform. You might say that it's a little bit like trying to use 100% of Swiss Army knife - you can't open a bottle of wine while you're sawing a small branch while you clean the dirt out from under your fingernails. And even if you do try, none of those three activities will be happening at full speed.
In the case of your GPU, the temperature difference comes from the fact that some functions are more power intensive than others. So even if two different programs are both maxing out the GPU, one of them might be maxing it out in an area that draws more power. You can see the same thing with CPUs, where programs like Prime95, HyperPi, or Linpack will all make your task manager hit 100%, but the resulting temperatures can easily vary by 10°C or more.
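To make that concrete, here's a minimal sketch (my own illustration, not anything from the tools mentioned above) of two CPU workloads that would each show a core pegged at "100%" in Task Manager, while executing very different instruction mixes - one doing cheap integer arithmetic, the other hammering the floating-point unit, which typically draws more power:

```python
import time

def integer_spin(duration: float = 1.0) -> int:
    """Busy loop of simple integer arithmetic.
    Shows 100% utilization on one core, but only lightly loads it."""
    end = time.monotonic() + duration
    n = 0
    while time.monotonic() < end:
        n = (n + 1) & 0xFFFFFFFF  # cheap integer add + mask
    return n

def float_spin(duration: float = 1.0) -> float:
    """Busy loop of dependent floating-point multiply-adds.
    Same 100% utilization, but exercises the FPU harder."""
    end = time.monotonic() + duration
    x = 1.0001
    while time.monotonic() < end:
        x = x * 1.0000001 + 1e-9  # FP multiply + add each iteration
    return x

if __name__ == "__main__":
    # Both of these will peg a core for the given duration;
    # a hardware power meter would show them drawing different wattage.
    integer_spin(2.0)
    float_spin(2.0)
```

The utilization counter only says the core never went idle; it says nothing about which execution units were busy, which is why temperatures (and power draw) differ between "100%" workloads.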
GPUs can run pretty hot... you could always try turning that fan up to 80 percent and see how it performs. My 285 won't go over 60°C with the fan at 80%. If I leave the fan at stock it goes as high as or higher than yours.
Dunno if RivaTuner works on ATI cards...