NVIDIA GeForce GTX TITAN 6GB Performance Review

Author: SKYMTL
Date: February 20, 2013
Product Name: GeForce GTX TITAN

GPU Boost 2.0 Explained & Tested

When the GTX 680 was introduced, GPU Boost quickly became one of its most talked-about features. At its most basic, NVIDIA’s GPU Boost monitored the power requirements of your graphics card and dynamically adjusted clock speeds in order to keep it at a certain power target. Since most games don’t take full advantage of a GPU’s resources, in many cases this meant Kepler-based cards were able to operate at higher than reference frequencies.

In order to better define this technology, NVIDIA created a somewhat new lexicon for enthusiasts which was loosely based upon Intel’s current nomenclature. Base Clock is the minimum speed at which the GPU is guaranteed to operate under strenuous gaming conditions, while Boost Clock refers to the average graphics clock rate when the system detects sufficient TDP overhead. As we saw in the GTX 680 review, the card was able to Boost above the stated levels in non-TDP-limited scenarios, but the technology was somewhat limited in the way it monitored on-chip conditions. For example, even though the ASIC’s Power Limit could be modified to a certain extent, the monitoring solution treated TDP as the sole deciding factor instead of factoring in additional (and essential) metrics like temperature.

In an effort to bypass GPU Boost’s original limitations, the GPU Boost 2.0 available on Titan will use the available temperature headroom when determining clock speeds. This should lead to a solution that takes into account critical metrics before making a decision about the best clocks for a given situation. The Power Target is also taken into account but you can now tell EVGA Precision or other manufacturers’ software to prioritize either temperatures or power readings when determining Boost clocks.
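The prioritization described above can be sketched in pseudocode-style Python. This is purely an illustrative model of the behaviour the article describes; the function name, boost-bin granularity, and headroom weightings are all assumptions, not NVIDIA's actual firmware logic.

```python
# Hypothetical model of a GPU Boost 2.0-style decision: clock speed is
# gated by whichever metric (temperature or power) the user prioritizes,
# while the other metric still acts as a cap. All constants are assumed.

BASE_CLOCK = 837     # MHz, TITAN's advertised Base Clock
BOOST_STEP = 13      # MHz per boost bin (illustrative)
MAX_BOOST_BINS = 12  # illustrative ceiling

def boost_clock(temp_c, power_pct, temp_target=80, power_target=100,
                priority="temperature"):
    """Pick a boost clock from the available temperature/power headroom."""
    temp_headroom = max(0.0, (temp_target - temp_c) / temp_target)
    power_headroom = max(0.0, (power_target - power_pct) / power_target)
    # The prioritized metric dominates the decision; the secondary metric
    # can still clamp boosting when its headroom runs out.
    if priority == "temperature":
        headroom = min(temp_headroom * 2, power_headroom * 4, 1.0)
    else:
        headroom = min(power_headroom * 2, temp_headroom * 4, 1.0)
    bins = int(headroom * MAX_BOOST_BINS)
    return BASE_CLOCK + bins * BOOST_STEP
```

Under this toy model, a cooler core (more temperature headroom) earns more boost bins, which matches the behaviour observed in our testing below.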

While some other technologies described in this article will eventually find their way into Kepler and even Fermi cards, GPU Boost 2.0 will remain a GK110 exclusive.

In order to better demonstrate this new Boost Clock calculation, we ran some simple tests on a GeForce TITAN using different temperature targets while running at a 107% Power Offset. For this test we used EVGA’s Precision utility with priority on the Temp Target.

The success of GPU Boost 2.0 becomes readily apparent when different temperature targets and their resulting clock speeds are compared against one another. At default, the TITAN is set to run with a target of 80°C and a relatively pedestrian fan speed of between 35% and 45%, making it very quiet.

As we can see, voltage and clock speeds steadily decrease as the target is lowered. This is because the built-in monitoring algorithm is trying to strike a delicate balance between maximizing clock speeds and voltage while also taming noise output and temperatures. With a TDP of some 250W, accomplishing such a feat isn’t easy at lower temperature targets, so Boost clocks are cut off at the knees in some cases.

Increasing the Temperature Target above the default 80°C has a positive impact, to a certain extent. Since there are some limits imposed at GK110’s default Boost voltage (1.162V in this case), clock speeds tend to plateau between 990MHz and 1GHz without modifying the GPU Clock Offset or core voltage. Yes, the GK110 does indeed support voltage increases but more on that in the next section.

This new GPU Boost algorithm rewards lower temperatures, which will be a huge boon for people using water cooling or those willing to put up with slightly louder fan speeds. Simply keeping the Temp Target at its default 80°C setting and cooling the core to anything under that point will allow for moderately better clock speeds (up to 1GHz in our tests) with a minimum of effort. If you’re the enterprising type, a combination of voltage and a higher GPU Offset could allow a better cooling solution to start paying dividends in no time. Also remember that ambient in-case temperatures play a huge part in GPU Boost’s calculation, so ensuring a well-ventilated case could lead to potential clock speed improvements.

As you can imagine, mucking around with the temperature offset could potentially have a dramatic effect upon fan speeds, but NVIDIA has taken care of that concern. In the new GPU Boost, the fan speed curve dynamically adjusts itself to the new Temperature Target and will endeavor to hold a steady rotational speed without any distracting spikes. There’s hope the new algorithm will reduce TITAN’s acoustical footprint regardless of the temperature target.
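The self-adjusting fan curve can be illustrated with a short sketch. Again, this is a hypothetical model of the behaviour described, not NVIDIA's actual curve: the ramp window, anchor points, and duty-cycle limits are all assumptions for illustration.

```python
# Illustrative fan curve that re-anchors itself around a user-selected
# Temperature Target, so the same GPU temperature produces a gentler fan
# response when the target is raised. All constants are assumed.

def fan_speed(temp_c, temp_target=80, idle_speed=35, max_speed=85):
    """Map GPU temperature to a fan duty cycle (%) anchored at the target."""
    ramp_start = temp_target - 20      # begin ramping 20°C below the target
    if temp_c <= ramp_start:
        return idle_speed
    if temp_c >= temp_target + 10:     # hard ceiling well past the target
        return max_speed
    # Smooth linear ramp across the 30°C window, avoiding sudden spikes.
    fraction = (temp_c - ramp_start) / 30.0
    return round(idle_speed + fraction * (max_speed - idle_speed))
```

In this model, raising the Temperature Target from 80°C to 90°C shifts the whole ramp upward, so a core sitting at 80°C spins the fan noticeably slower, which is the kind of rescaling behaviour the article describes.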
