AMD's FreeSync: A Long-Term Review

Author: SKYMTL
Date: May 10, 2015
Product Name: FreeSync

Performance Outside the “Zone”

Prior to this page I've discussed how well FreeSync performs when it's enabled alongside V-Sync and framerates remain relatively high. But what happens when more demanding settings are used and the graphics card is rendering fewer frames? Let's be honest: when someone invests big-time money in a FreeSync monitor and a higher-end graphics card, they want to maintain optimum image quality and high detail levels. In many of today's triple-A titles, that means performance between 30FPS and 75FPS.

Throughout testing I noticed a massive performance drop-off in games like GTA V, with framerates absolutely tanking whenever they dipped below 40. During particularly heavy action sequences it was nearly impossible to react since framerates were so abysmally low. This never happened with the G-SYNC-equipped ROG Swift, so I was perplexed. Let's start off with the same Unigine test I conducted before, but this time at higher detail settings.

I obviously wasn't going crazy when I noticed some games degenerating into slideshows with FreeSync enabled. Whenever framerates remained above 40, AMD's technology delivered a superlative gaming experience by virtually eliminating stutter and tearing. It was awesome. Below that, things fell apart in a hurry.

Compare the FreeSync & V-Sync ON results in the chart above to those with V-Sync turned off and it becomes apparent that whenever framerates dip below 37 or 38, they go into freefall all the way down to 20 or so. In these situations it seems like Adaptive Sync turns off, letting V-Sync take over synchronization of frames with refresh rates. The result is jarring to watch, destroys an otherwise good gaming experience and makes V-Sync's usual slight stuttering look like child's play.

Grand Theft Auto V exhibits the exact same framerate cliff-diving whenever the action gets intense and, let me tell you, it kills immersion. To make matters worse, simply having V-Sync enabled arguably delivers a better experience since it doesn't cause instantaneous dips to 20FPS whenever framerates drop below 40. That area between 30 and 40 FPS is completely playable territory in GTA, Shadow of Mordor and many other games, but in cases like the one above, Adaptive Sync cuts it out of the equation completely.

So what is happening here? I asked Robert Hallock from AMD and his answer was wonderfully straightforward:

On this particular display (the BenQ XL2730Z), the LCD flickers if you rapidly switch between 40Hz, 144Hz and back to 40Hz. We’ve set the V-Sync rate to 40Hz on this display when you fall below the range, which means you would see factors of 40 as the V-Sync jumps, but flickering is completely eliminated.

This is all subject to change, as we can modify these behaviors in the driver. We’re always looking at stuff like this to find new/better ways to deal with the particulars of an LCD, given that each one has its own characteristics.

That's a simple, effective response but I'll chime in here as well. The entire idea behind Adaptive Sync is to implement variable panel refresh rates that synchronize properly with the frames output from the graphics card. Instead of utilizing the typical ratio step-downs for a 144Hz panel of 1:2 (72FPS), 1:3 (48FPS), 1:4 (36FPS) and so on, it fills in the spaces between those outputs, so to speak.
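Those step-down ratios are just the panel's refresh rate divided by whole numbers. A quick sketch of the arithmetic (my own illustration, not anything from AMD's driver):

```python
PANEL_HZ = 144  # the XL2730Z's maximum refresh rate

def vsync_step_rates(panel_hz, max_divisor=6):
    """Effective framerates a fixed-refresh panel can sync to under
    traditional V-Sync: the panel rate divided by integer divisors."""
    return [round(panel_hz / d, 1) for d in range(1, max_divisor + 1)]

print(vsync_step_rates(PANEL_HZ))
# [144.0, 72.0, 48.0, 36.0, 28.8, 24.0]
```

Adaptive Sync's whole value proposition is making every rate between those discrete steps a valid target, rather than forcing the GPU's output onto one of them.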

We also have to take into account that current panel technology has refresh rate limits and in the XL2730Z's case the upper and lower limits are 144Hz and 40Hz respectively. This creates a "zone", for lack of a better word, between those two points in which Adaptive Sync, and by association FreeSync, seems happy to operate.

Synchronizing panel refreshes and framerates below 40Hz is a challenge since the human eye begins picking up on peripheral flickering around the 35Hz (~29ms) mark, while discernible direct screen flicker can be detected at 30Hz (~33ms). If Adaptive Sync were allowed to tie framerates to refreshes below the 35FPS mark, flickering would quickly become an issue. Therefore, below the variable refresh rate zone, FreeSync turns off and simply lets regular V-Sync operation take over. That's why turning off V-Sync while keeping FreeSync enabled didn't exhibit this problem.
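Those millisecond figures fall straight out of the refresh interval, which is simply 1000ms divided by the rate. My own back-of-the-envelope check:

```python
def frame_time_ms(hz):
    """Time between panel refreshes, in milliseconds."""
    return 1000.0 / hz

# Near the peripheral-flicker threshold cited above:
print(round(frame_time_ms(35), 1))  # 28.6 ms (~29ms)
# Near the direct-flicker threshold:
print(round(frame_time_ms(30), 1))  # 33.3 ms
```

The longer a panel holds a single frame below these rates, the more visible the decay of the pixels becomes, which is exactly the flicker AMD is engineering around.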

At first glance this shouldn't have caused too much of a problem since V-Sync allows for several step-down ratios from 144Hz that fall below 40: 1:4 (36FPS), 1:5 (~29FPS) and 1:6 (24FPS) are all possibilities which would have caused some stuttering as the panel / GPU handoff varied between them, but nothing like what's occurring. Instead, AMD is syncing to a drastically lower refresh rate in an effort to eliminate flickering. The end result is pretty abysmal performance whenever framerates dip below 40 but, according to AMD, the behavior can be further refined in future driver iterations. In addition, there could be other monitors with lower minimum refresh rates which open a broader window for FreeSync to operate in.
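The "factors of 40" Hallock mentions can be sketched with a deliberately simplified model (my own, assuming a constant per-frame render time and ignoring buffering): once V-Sync is locked at 40Hz, any frame that misses a 25ms refresh deadline has to wait a full extra refresh, so effective output snaps from 40FPS directly to 20FPS with nothing in between.

```python
import math

def vsync_effective_fps(render_ms, vsync_hz=40.0):
    """With V-Sync locked at vsync_hz, a frame that misses one refresh
    deadline is held until the next, quantizing output to vsync_hz / n."""
    interval_ms = 1000.0 / vsync_hz  # 25 ms per refresh at 40Hz
    refreshes_waited = max(1, math.ceil(render_ms / interval_ms))
    return vsync_hz / refreshes_waited

print(vsync_effective_fps(24.0))  # renders inside 25ms -> 40.0 FPS
print(vsync_effective_fps(26.0))  # just misses the deadline -> 20.0 FPS
```

That cliff from 40 to 20 matches the freefall seen in the charts far better than the gentler 36/29/24FPS staircase a 144Hz sync target would have produced.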

In order to visually compare this situation to G-SYNC, I ran identical tests but modified settings so the GeForce card (in this case a GTX 980) was operating at even lower framerates than AMD's solutions.

In Unigine we can see that AMD's step-down process is a marked departure from NVIDIA's implementation, which provides a universally smooth output below 40FPS and even as low as 28FPS in some cases.

NVIDIA, meanwhile, appears to have this figured out since G-SYNC operates smoothly below 40FPS. How that has been accomplished is a closely guarded secret, but harnessing full control over the monitor's scaler via an add-in module clearly has its benefits.

If anything, the situation above shows why AMD so desperately needs FreeSync-compatible Crossfire drivers; with two cards working in tandem to boost performance, the chance of framerates dipping below the "zone" decreases substantially.
