A Week With NVIDIA's G-SYNC Monitor

Author: SKYMTL
Date: December 11, 2013
Product Name: G-SYNC

NVIDIA's G-SYNC technology has been the talk of the gaming world since it was announced at their Montreal event. While I was able to see it there first-hand, demos were held under strictly controlled conditions and the amount of actual face time with it was limited. Nonetheless, what I saw was impressive and, like many of you, I've been counting down the days until NVIDIA released supporting monitors into the wild so more users could experience what could very well be a watershed moment in gaming-oriented technology.

Unfortunately, G-SYNC's widespread release won't happen until Q1 2014, but an early alpha-stage prototype monitor did land on my doorstep last week and, as you can tell by the near lack of content on Hardware Canucks since then, it has ruined my productivity. Simply put, I'm addicted.

There's a lot to say about it but, with the sample not being a final product and the underlying feature set still evolving, this article is a quick rundown of G-SYNC and my time with it. The actual review with a full technical overview will be forthcoming in the new year, once retail monitors become available.

The Technical Bits

Before anything else, there needs to be some explanation of what NVIDIA is actually trying to accomplish with G-SYNC.

With most current graphics cards providing more than enough horsepower for playable framerates at 1080P or in some cases 1440P, there has been renewed interest in the next logical frontier: onscreen visual quality. Stutter, blur and screen tearing all contribute to lower visual standards but none of them have been properly addressed by monitor manufacturers or current panel technologies. That's where G-SYNC steps into the equation.

At its most basic, G-SYNC is a collaborative effort by NVIDIA and several key players in the monitor market (ASUS, BenQ, Acer, Philips and Viewsonic to name a few) to drastically improve the quality and clarity of onscreen motion sequences and provide the smoothest gaming experience possible. It accomplishes this by combining the benefits of VSync with the perceptual fluidity that can be achieved by running at extremely high framerates.

If you're like me and want to get the most responsive gameplay possible with a minimum of input lag, VSync is a pariah. Unfortunately, with the graphics card spewing information onto the screen in complete disregard for the monitor's refresh cycles, significant image tearing will occur as new frames are presented in the middle of a panel scan. These artifacts are most obvious when panning the camera or during an object's horizontal movement across the screen.

In order to counteract screen tearing and other visual artifacts, some folks tend to game with VSync enabled but this introduces its own set of problems, even when running a 120Hz gaming monitor. Not only can it add a substantial amount of input lag (a deal killer for FPS gamers) but the framerate also gets tied at the hip to the panel's refresh rate, causing noticeable stutters if framerates dip below the maximum scan cycle.

As long as the GPU / monitor interaction follows a steady draw frame, scan, draw frame, scan routine, fluidity won't be negatively affected. However, if the GPU can't keep up and is forced to buffer frames in preparation for the next monitor refresh, stuttering will occur as the framerate steps down to 30 or 15 on a 60Hz monitor. The effect is similar on a 120Hz or 144Hz screen.
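The step-down arithmetic can be sketched with a simple model. This is an illustrative approximation of double-buffered VSync behavior, not NVIDIA's actual pipeline: each finished frame has to wait for the next refresh tick, so every frame ends up occupying a whole number of refresh intervals.

```python
import math

def vsync_effective_fps(render_fps, refresh_hz=60):
    """Hypothetical model of traditional double-buffered VSync: a frame
    that misses a refresh tick waits for the next one, so each frame
    occupies a whole number of refresh intervals."""
    refresh_interval = 1.0 / refresh_hz
    render_time = 1.0 / render_fps
    # The frame is held until the first refresh boundary after it finishes.
    intervals = math.ceil(render_time / refresh_interval)
    return refresh_hz / intervals

# A GPU rendering at 59 FPS on a 60Hz panel falls all the way to 30 FPS:
print(vsync_effective_fps(59))   # 30.0
print(vsync_effective_fps(45))   # 30.0
print(vsync_effective_fps(16))   # 15.0 -- a lower "step" on the ladder
```

This is why a card that averages just under the refresh rate feels so much worse than the raw framerate suggests: the panel quantizes it down to the next whole divisor of the refresh rate.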

NVIDIA has already been partially successful in mitigating some of these basic VSync shortcomings with Adaptive VSync, which disables synchronization for any framerates that fall below the monitor's native refresh rate. For example, on a 60Hz monitor with Adaptive VSync enabled, framerates below 60FPS are fluidly maintained instead of going through the messy "step down" process mentioned above.
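Adaptive VSync's decision rule, as described above, can be sketched in a few lines. This is an illustrative model of the behavior, not NVIDIA's driver code:

```python
def adaptive_vsync(render_fps, refresh_hz=60):
    """Sketch of the Adaptive VSync rule: keep VSync on at or above the
    refresh rate to prevent tearing, switch it off below so the framerate
    degrades smoothly instead of stepping down. Illustrative only."""
    if render_fps >= refresh_hz:
        return ("vsync_on", float(refresh_hz))   # capped at refresh, tear-free
    return ("vsync_off", float(render_fps))      # uncapped and smooth, may tear

print(adaptive_vsync(90))   # ('vsync_on', 60.0)  -> still capped at 60FPS
print(adaptive_vsync(47))   # ('vsync_off', 47.0) -> no step-down, but can tear
```

The model makes the two remaining compromises obvious: above the refresh rate the cap (and VSync's input lag) is still there, and below it tearing comes back.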

Unfortunately, the so-called adaptive technology still doesn't provide optimal input lag conditions, caps maximum framerates in line with the monitor's synchronization frequency and exhibits tearing when operating below VSync's boundaries. For gamers with 120Hz and 144Hz panels, the cap isn't a problem provided their graphics cards can keep up, but 60Hz users want all the performance their cards can muster and a 60FPS / 60Hz ceiling wouldn't allow faster GPUs room to shine.

G-SYNC on the other hand allows the monitor and framerates to operate independently. Thus, the graphics card can present frames as quickly as possible, asynchronously to the monitor's refresh rate without introducing screen tearing. So even if a panel is refreshing at 60Hz, G-SYNC can ignore the usual boundaries and produce framerates anywhere above or below that mark. It also neatly avoids the input lag and stuttering normally associated with VSync since frames are presented as soon as they're available rather than being withheld by the GPU. This may not sound like much on paper but in reality, it makes a huge difference.
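As a rough model of this variable-refresh behavior: the panel scans out whenever a frame arrives, so the on-screen interval simply tracks the render time, clamped to whatever range the panel can physically sustain. The 30-144Hz bounds here are assumptions for illustration, not published G-SYNC limits:

```python
def gsync_frame_interval(render_time_ms, min_hz=30, max_hz=144):
    """Illustrative model of variable refresh: the display interval
    follows the GPU's render time instead of a fixed refresh tick,
    clamped to the panel's supported range (bounds are assumed)."""
    shortest = 1000.0 / max_hz   # can't scan out faster than max_hz
    longest = 1000.0 / min_hz    # must refresh at least every 1/min_hz
    return min(max(render_time_ms, shortest), longest)

# A 13ms frame (about 77 FPS) is displayed for exactly 13ms -- no waiting
# on a refresh tick, no tearing, no step-down to 60 or 30:
print(gsync_frame_interval(13.0))   # 13.0
```

Contrast this with the fixed-refresh model earlier: there, a 13ms frame on a 60Hz panel would be held for a full 16.7ms tick; here the panel just waits for the frame.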

NVIDIA's technology is packaged into a compact module which replaces the monitor's scaler. Since this is a hardware-centric solution (other than driver algorithms to ensure proper functionality with the GPU), it can be outfitted to any panel technology, be it TN, IPS, PVA, IGZO or others. Initially, we'll likely see it attached to 1080P TN "gaming" panels with 120Hz or 144Hz refresh rates but that will change as it cascades down (or up) into other segments as well.

The real benefits of G-SYNC will likely be most evident when partners begin launching 60Hz IPS displays with integrated support. Not only will they have the color accuracy and wide viewing angles IPS is known for, but 60Hz panels arguably have the most to gain from G-SYNC's stutter-reducing features.

G-SYNC's Current Limitations Explained

While G-SYNC may look like a cure-all for many of the glaring shortcomings in the display market, it isn't infallible and it can't rectify every issue. Stuttering will continue to rear its ugly head when games load textures or when a system storage device becomes a bottleneck. In addition, blur still occurs and detracts from the overall experience despite G-SYNC's improvements in other areas, but with tweaks to LightBoost this can be overcome as well.

According to NVIDIA, with input lag all but eliminated from the display side of the equation, you'll need a suitably high-end gaming mouse with an extreme polling interval to take full advantage of what G-SYNC offers. I didn't notice anything more than the usual improvements in response time and accuracy when moving from an MX510 to a G9x, but professional gamers (I'm certainly not one of those) may say otherwise since the screen's response time has been significantly reduced.

There are some additional limitations as well. While it can be used with an SLI system, G-SYNC can't operate in Windowed Mode, doesn't currently support NVIDIA's Surround multi-monitor technology and will only be available through a DisplayPort interface. Don't expect any external G-SYNC adapters either, since existing monitor scalers can't be modified to work with the technology, nor can they be bypassed via an external hub. As you might expect, the technology is only compatible with NVIDIA's Kepler-based cards but, due to its unique nature, I'm sure we can all understand why it needs to be held close to their chests.

A Week of Gaming on G-SYNC

After a week of using an ASUS VG248QE equipped with a G-SYNC module, I can honestly say that much of its hype is completely justified. In fast-paced shooters the ability to run with ultra-low input lag and very little stutter alongside the image quality benefits typically associated with VSync is nothing short of a revelation. The benefits within plodding RTS games are less pronounced but nonetheless still evident when doing quick viewpoint movements.

Typically my gaming is done on a 60Hz IPS monitor, so the difference between my current setup and one that's G-SYNC enabled has been extreme. The benefits were front and center with noticeable decreases in lag and an altogether cleaner sense of onscreen movement. Believe it or not, switching back to my old gaming methods brought about what can best be described as withdrawal: I yearned for the G-SYNC gaming experience even though my IPS panel provided a much richer image. Luckily, once IPS G-SYNC monitors become available, I won't have to make this trade-off.

Even folks currently using 120Hz or 144Hz panels will likely see improvements, even if their graphics card supports NVIDIA's Adaptive VSync and can consistently output between 90 and 120FPS at a given resolution. Switching between an Acer GD235HZ and the VG248QE wasn't quite as jarring as the switch between G-SYNC and 60Hz IPS, but once again I could certainly notice how much cleaner this new technology's output was.

On the input lag front, everyone will benefit and for gamers this one feature can make a massive difference in reaction times and accuracy. For an alpha-stage product my experience with G-SYNC thus far has been nothing short of revolutionary.

One of G-SYNC's most exciting aspects is its ability to run framerates and the monitor's vertical refresh rate asynchronously. This will result in massive benefits for lower-end setups where VSync leaves performance on the table as the framerates step down to accurately match panel sync times. Even 144Hz panels paired up with an enthusiast-grade system and ultra-high detail settings will see tangible fluidity enhancements from G-SYNC as framerates make their way below 60.

If there is one issue with G-SYNC it is that you have to see it to believe what it can accomplish. Describing the changes it made to how I approached gaming is impossible. Right now, you may think that your current 60Hz or 120Hz panel provides a great experience. I know I did. My tune changed from the moment I started playing Battlefield 4 multiplayer with G-SYNC enabled.

The question here is simple: is NVIDIA's G-SYNC a genre-defining technology? Even in its early form I'd have to respond with an emphatic "YES!". Plus, with some of NVIDIA's soon-to-be-announced features, supporting monitors will quickly become a must-have item for gamers, provided the resulting products can hit a variety of price points.

Until our full review is published early next year, I can safely say that after a week of using G-SYNC, there's really only one word to sum up my experience thus far: awesome.
