AMD's FreeSync: A Long-Term Review

Author: SKYMTL
Date: May 10, 2015
Product Name: FreeSync

For the last three weeks or so, I've been living in a vacuum. Our lab only received our AMD FreeSync monitor relatively late (on the day of launch, actually) and since then I've been avoiding reading anyone else's opinions about the technology, its benefits and its potential drawbacks. Simply put, I needed to formulate an unbiased opinion based upon my game-time experiences rather than judging FreeSync against technological expectations.

Unlike many of our other reviews, you'll be seeing my personal, somewhat biased opinions alongside the usual litany of raw performance testing and a bit of technical jargon. Yes, it's different, but the only way to truly experience and understand the benefits of technologies like FreeSync is from a first-hand perspective. This took a bit longer than I would have liked (the BenQ XL2730Z now has over 175 hours of use on it) but in the end I believe this will result in a somewhat unique perspective.


Before I go on, there's a small admission I have to make. I'm a screaming fanboy of NVIDIA's G-SYNC technology. An Acer XB280HK has been my primary gaming display for the better part of four months now and I have logged over 250 hours on it. So when AMD said they were sending along a FreeSync monitor, I approached it with a mix of excitement and trepidation.

We've already seen how AMD's initiatives can offer up a mixed bag, with technologies like Eyefinity and Mantle meeting with huge success while others like HD3D and TrueAudio ultimately failed to meet expectations. I simply didn't want FreeSync to go down the latter path since, when taken at face value, it has so much to offer gamers.


In order to understand AMD's FreeSync, some basic knowledge of VESA's DisplayPort Adaptive Sync and V-Sync in general is necessary. Gamers who want the best possible performance and minimal mouse latency typically play with V-Sync disabled, which allows framerates to run independently of the monitor's refresh rate. This acts as a double-edged sword since screen tearing occurs when multiple frames are displayed onscreen during a single monitor refresh cycle. The end result may be lightning-quick reaction times, but also a distracting onscreen image filled with artifacts since the display's refreshes aren't properly synchronized with the frames being delivered by the graphics card.

Meanwhile, those who care about achieving optimum image quality tend to enable V-Sync, but that caps framerates at the monitor's maximum refresh rate. It also increases mouse latency and introduces noticeable stuttering if the graphics card can't keep up with the vertical refresh rate and has to buffer frames in preparation for the next monitor refresh. When this happens, the previous frame is repeated before the current frame is pushed onto the screen.

Another problem with enabling V-Sync is a step-down effect which happens when the system is delivering framerates below the monitor's native refresh rate. On a 60Hz monitor, that could lead to FPS jumping and stuttering between 60, 30, 15 and other integer divisors of the refresh rate as the graphics card tries to synchronize its output with the display. Panels with higher refresh rates somewhat mitigate the performance capping issues but still suffer from stuttering.
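To put a rough number on that step-down effect, here's a minimal sketch assuming a simple double-buffered V-Sync model in which every frame has to be held for a whole number of refresh intervals; the function name and sample framerates are purely illustrative.

```python
import math

def vsync_effective_fps(render_fps: float, refresh_hz: float) -> float:
    """Approximate double-buffered V-Sync: each frame is held for a whole
    number of refresh intervals, so the displayed rate snaps down to the
    nearest integer divisor of the panel's refresh rate."""
    if render_fps >= refresh_hz:
        return refresh_hz
    intervals = math.ceil(refresh_hz / render_fps)  # refreshes each frame occupies
    return refresh_hz / intervals

# On a 60Hz panel: 75 FPS gets capped to 60, 55 FPS snaps down to 30,
# and 29 FPS falls all the way to 20.
for fps in (75, 55, 29):
    print(fps, "->", vsync_effective_fps(fps, 60))
```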


Adaptive Sync is a technology baked into the DisplayPort 1.2a protocol that is meant to eliminate the aforementioned tearing and stuttering by synchronizing the GPU and monitor so frames are displayed when ready through a framerate-aware variable monitor refresh rate. However, while DisplayPort Adaptive Sync is a mechanism to achieve better onscreen fluidity, it requires system-end support to function properly.

FreeSync is simply the driver-side facilitator which allows refresh rate information to be passed between the source (in this case an AMD graphics card) and the panel. There's a handshake protocol which allows the monitor to tell the GPU the fastest and slowest rates at which it's ready to accept a new frame. In effect this gives the GPU full knowledge of what's happening without having to actually poll the monitor first, while the display works off of this information to vary its refresh rate accordingly.
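The broad strokes of that handshake can be sketched out in a few lines. To be clear, this is only a conceptual illustration and not AMD's actual driver logic; the 40Hz to 144Hz window is simply an assumed example of what a panel might report.

```python
# Conceptual sketch only: the real range comes from the monitor's EDID and the
# decisions happen in the display controller and driver, not in user code.
MIN_HZ, MAX_HZ = 40, 144             # assumed range reported by the panel
MIN_INTERVAL = 1.0 / MAX_HZ          # fastest the panel can accept a new frame
MAX_INTERVAL = 1.0 / MIN_HZ          # longest it can wait before refreshing anyway

def display_action(elapsed_since_refresh: float, frame_ready: bool) -> str:
    """What the display does, given the GPU's knowledge of the panel's window."""
    if frame_ready and elapsed_since_refresh >= MIN_INTERVAL:
        return "refresh with the new frame"    # frame shown the moment it's ready
    if elapsed_since_refresh >= MAX_INTERVAL:
        return "repeat the previous frame"     # panel can't hold the image any longer
    return "wait"                              # keep holding until the GPU delivers
```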

Since all of this is accomplished within hardware, FreeSync runs agnostically from any game-level hiccups that may be encountered. In short, like G-SYNC, FreeSync should be compatible with every game in existence, regardless of the API (DirectX, Mantle, OpenGL, etc.) being used since it doesn't rely on driver profiles to work.

As this functionality runs hand in hand with the DisplayPort specification, costs are kept down since additional hardware isn't needed. It should also allow FreeSync to be easily ported over to notebooks, a market segment which could seriously benefit from this technology.


Quite a few discussions have homed in on how FreeSync stacks up against NVIDIA's competing G-SYNC since both technologies parallel one another in many ways. They claim to accomplish the same set of goals by smoothing out onscreen animations and eliminating the image artifacts normally associated with enabling or disabling V-SYNC. It seems straightforward, right? Not so fast, because both companies' methods vary quite a bit.

While AMD is harnessing Adaptive Sync's benefits without the need for proprietary, expensive hardware, G-SYNC panels require an add-on module which replaces a monitor's scaler. As a result, FreeSync monitors are generally less expensive than their competitors since the necessary protocols are included directly in their EDID. Interestingly enough, this also means that any DP 1.2a-equipped monitor could be FreeSync compatible provided there's compatible firmware for it.

Raw cost of entry may be a major determining factor in the FreeSync versus G-SYNC battle, but some of the other differences are more nuanced. Even though AMD's technology can only run through DisplayPort, it also allows other connectivity like DVI, HDMI and even VGA to be built into supporting products, something NVIDIA doesn't offer. This may not be a big deal for many gamers, but during my time with G-SYNC I missed the ability to use my notebook on a larger screen via its HDMI output.

A lot of what you see in the chart above is also pure marketing mumbo jumbo. For example, AMD's FreeSync may support a wider refresh rate range than G-SYNC, but you'll always be limited in this respect by the panel itself, and no current panel even begins to approach the 9Hz to 240Hz claims. As I'll explain a bit later, the so-called "performance penalty" needs to be taken with a grain of salt as well.


One of the main complaints leveled by gamers at V-SYNC is mouse lag. While I'm more averse to onscreen artifacts than to a nearly imperceptible amount of input hesitation, I also prefer slower-paced strategy games, think the current first-person shooter genre is boring as hell… and I love rabbits. Even though I was once ranked among the top 50 global Counter Strike players (little-known fact alert!), these days I'm a good bit older and the reflexes are shot to hell from too much wine, so I won't cry bloody murder at a missed headshot.

With that being said, I have to applaud AMD for the way they're handling V-Sync here. Whereas NVIDIA automatically enables vertical synchronization whenever G-SYNC is turned on, FreeSync can operate independently of the screen's vertical synchronization locks. This allows for a gaming experience tailored to your liking. Want the best possible motion quality with typical input latency? Turn on V-Sync alongside FreeSync. Want to keep FreeSync's ability to minimize tearing and improve latency? Simply turn off V-Sync in whatever game you're playing but allow FreeSync to do its thing.
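If it helps to visualize the trade-off, the possible outcomes roughly boil down to the sketch below. Again, this is a simplified mental model rather than actual driver behaviour, and the 40Hz to 144Hz window is just an assumed example.

```python
def expected_behaviour(render_fps: float, vsync_on: bool,
                       min_hz: float = 40, max_hz: float = 144) -> str:
    """Rough mental model of FreeSync with V-Sync toggled on or off."""
    if min_hz <= render_fps <= max_hz:
        return "variable refresh: frames shown as they're ready, no tearing"
    if render_fps > max_hz:
        return ("capped at the panel's maximum refresh rate" if vsync_on
                else "uncapped framerate with possible tearing but lower latency")
    # below the supported window the panel reverts to conventional behaviour
    return "traditional V-Sync stutter" if vsync_on else "screen tearing"
```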


Alongside all of these potential benefits of FreeSync, there are some notable areas where it falls short as well. For starters, it boasts somewhat limited compatibility compared to G-SYNC. While AMD's feature is limited to current-generation cards, GeForce products that support G-SYNC date back to the GTX 600-series days, with everything faster than a GTX 650 Ti Boost Edition able to communicate with certified monitors.

Perhaps the largest miss for FreeSync is its lack of Crossfire support at launch. Adaptive synchronization technologies require higher framerates to showcase their true potential, and dips below the monitor's vertical refresh rate window tend to cause the exact artifacts AMD is seeking to avoid (more on this later).

Luckily, AMD does natively support Virtual Super Resolution (VSR) alongside FreeSync, but without Crossfire, internally boosting rendering resolutions would be counter-intuitive. NVIDIA, on the other hand, has always supported multi-card configurations with G-SYNC. G-SYNC can also be used alongside the Dynamic Super Resolution (DSR) supersampling setting, but their current drivers don't allow for a combination of SLI, G-SYNC and DSR.


Activating FreeSync couldn't be easier. Simply go into AMD's Catalyst Control Center and turn it on within the display settings dialog area. Typically a pop-up will appear when Windows starts and you can check the settings by following the onscreen instructions.


My time with FreeSync wasn't completely smooth though. At random intervals the warning above would present itself, but from what I could gather this didn't negatively affect anything and it didn't seem like things were going awry. I have a feeling it was a false positive, so don't panic.

When taken at face value, FreeSync looks like a bona fide competitor to G-SYNC but, when push comes to shove, both solutions' goals are exactly the same: to provide a feature that will draw people to purchase a given graphics architecture. We have to remember that NVIDIA has a year-long lead on AMD, but does that actually translate into a drastically different first-hand gaming experience? That's what I'm going to explore in the upcoming pages while also endeavoring to explain a few more of FreeSync's intricate nuts and bolts.
 
 
 
