 


A Week With NVIDIA's G-SYNC Monitor

Author: SKYMTL
Date: December 11, 2013
Product Name: G-SYNC

NVIDIA’s G-SYNC technology has been the talk of the gaming world since it was announced at their Montreal event. While I was able to see it there first-hand, demos were held under strictly controlled conditions and the amount of actual face time with it was limited. Nonetheless, what I saw was impressive and, like many of you, I’ve been counting down the days until NVIDIA released supporting monitors into the wild so more users could experience what could very well be a watershed moment in gaming-oriented technology.

Unfortunately, the widespread release of G-SYNC won’t happen until Q1 2014, but an early alpha-stage prototype monitor did land on my doorstep last week and, as you can tell by the near lack of content on Hardware Canucks since then, it has ruined my productivity. Simply put, I'm addicted.

There’s a lot to say about it but with the sample not being a final product and the underlying feature set still evolving, this article will represent a quick rundown of G-SYNC and my time with it. The actual review with a full technical overview will be forthcoming in the new year, once retail monitors become available.


The Technical Bits


Before anything else, there needs to be some explanation of what NVIDIA is actually trying to accomplish with G-SYNC.

With most current graphics cards providing more than enough horsepower for playable framerates at 1080P or in some cases 1440P, there has been renewed interest in the next logical frontier: onscreen visual quality. Stutter, blur and screen tearing all contribute to lower visual standards but none of them have been properly addressed by monitor manufacturers or current panel technologies. That’s where G-SYNC steps into the equation.

At its most basic, G-SYNC is a collaborative effort by NVIDIA and several key players in the monitor market (ASUS, BenQ, Acer, Philips and Viewsonic to name a few) to drastically improve the quality and clarity of onscreen motion sequences and provide the smoothest gaming experience possible. It accomplishes this by combining the benefits of VSync with the perceptual fluidity that can be achieved by running at extremely high framerates.


If you’re like me and want to get the most responsive gameplay possible with a minimum of input lag, VSync is a pariah. Unfortunately, with the graphics card spewing information onto the screen in complete disregard for the monitor’s refresh cycles, significant image tearing will occur as new frames are presented in the middle of a panel scan. These artifacts are most obvious when panning the camera or during an object’s horizontal movement across the screen.
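
To make that a bit more concrete, here’s a toy model of my own (nothing from NVIDIA) of why the seam shows up where it does: whatever row the panel happens to be scanning when the new frame is swapped in becomes the visible boundary between the old image above it and the new image below it.

    # Toy model of a mid-scan buffer swap on a fixed-refresh panel (illustrative only,
    # ignores blanking intervals): the row being scanned out at the moment of the swap
    # becomes the tear line.
    def tear_line(swap_time_ms, refresh_hz=60.0, vertical_res=1080):
        interval_ms = 1000.0 / refresh_hz                      # time for one full panel scan
        scan_progress = (swap_time_ms % interval_ms) / interval_ms
        return int(scan_progress * vertical_res)

    for t in (3.0, 9.0, 14.0):
        print(f"swap {t:4.1f} ms into the refresh -> tear near row {tear_line(t)}")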


In order to counteract screen tearing and other visual artifacts, some folks tend to game with VSync enabled, but this introduces its own set of problems, even when running a 120Hz gaming monitor. Not only can it add a substantial amount of input lag (a deal killer for FPS gamers) but the framerate also gets joined at the hip to the panel’s refresh rate, causing noticeable stutters whenever framerates dip below the panel’s maximum refresh rate.

As long as the GPU / monitor interaction keeps to its draw frame, scan, draw frame, scan routine, fluidity won’t be negatively affected. However, if the GPU can’t keep up and is forced to buffer frames in preparation for the next monitor refresh, stuttering will occur as the framerate steps down to 30 or 15 FPS on a 60Hz monitor. The effect is similar on a 120Hz or 144Hz screen.
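
The arithmetic behind that step-down is simple enough to sketch. The snippet below is purely my own illustration of the effect (not driver code): with double-buffered VSync, a finished frame has to wait for a refresh tick, so the displayed framerate gets quantized to whole fractions of the refresh rate the moment rendering takes longer than one refresh interval.

    import math

    # Illustrative only: displayed framerate under double-buffered VSync when every
    # frame has to wait for the next refresh tick of a fixed-refresh panel.
    def effective_vsync_fps(render_ms, refresh_hz=60.0):
        interval_ms = 1000.0 / refresh_hz
        refreshes_per_frame = math.ceil(render_ms / interval_ms)  # ticks consumed per frame
        return refresh_hz / refreshes_per_frame

    for render_ms in (15.0, 17.0, 34.0, 55.0):
        print(f"{render_ms:4.1f} ms/frame -> {effective_vsync_fps(render_ms):.1f} FPS displayed")
    # 15 ms -> 60 FPS, 17 ms -> 30 FPS, 34 ms -> 20 FPS, 55 ms -> 15 FPS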


NVIDIA has already been partially successful in mitigating some of the basic VSync shortcomings with Adaptive VSync, which unlocks framerates that fall below the monitor’s native refresh rate. For example, on a 60Hz monitor with Adaptive VSync enabled, framerates below 60FPS are fluidly maintained instead of going through the messy “step down” process mentioned above.

Unfortunately, the so-called adaptive technology still doesn’t provide optimal input lag conditions, caps maximum framerates in line with the monitor’s synchronization frequency and exhibits tearing when operating below VSync’s boundaries. For gamers with 120Hz and 144Hz panels, the cap isn’t a problem provided their graphics cards can keep up, but 60Hz users want all the performance their cards can muster and a 60FPS / 60Hz ceiling wouldn’t give faster GPUs room to shine.
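
In essence, Adaptive VSync boils down to a simple switch. The sketch below is my own illustration of the behaviour described above, not anything taken from NVIDIA's driver.

    # Illustrative decision rule only: keep VSync engaged while the GPU can match the
    # refresh rate (no tearing, framerate capped), drop it below that point so the
    # framerate falls smoothly instead of stepping down -- at the cost of tearing.
    def adaptive_vsync_on(current_fps, refresh_hz=60.0):
        return current_fps >= refresh_hz

    for fps in (144, 75, 60, 45):
        state = "VSync on (capped at refresh)" if adaptive_vsync_on(fps) else "VSync off (tearing possible)"
        print(f"{fps:>3} FPS on a 60Hz panel -> {state}")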

G-SYNC on the other hand allows the monitor and framerates to operate independently. Thus, the graphics card can present frames as quickly as possible, asynchronously to the monitor’s refresh rate without introducing screen tearing. So even if a panel is refreshing at 60Hz, G-SYNC can ignore the usual boundaries and produce framerates anywhere above or below that mark. It also neatly avoids the input lag and stuttering normally associated with VSync since frames are presented as soon as they're available rather than being withheld by the GPU. This may not sound like much on paper but in reality, it makes a huge difference.
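
One way to picture the difference is to look at how long a finished frame sits around before the panel starts drawing it. The sketch below is a deliberately simplified model of my own (not NVIDIA's implementation): a fixed-refresh panel makes the frame wait for the next tick, while a variable-refresh panel can begin a new scan as soon as the frame arrives, limited only by how quickly it can physically refresh.

    import math

    # Illustrative model only: delay between "frame is ready" and "panel starts scanning it".
    def wait_fixed_refresh(frame_ready_ms, refresh_hz=60.0):
        interval_ms = 1000.0 / refresh_hz
        next_tick_ms = math.ceil(frame_ready_ms / interval_ms) * interval_ms
        return next_tick_ms - frame_ready_ms

    def wait_variable_refresh(frame_ready_ms, last_scan_ms, max_refresh_hz=144.0):
        min_gap_ms = 1000.0 / max_refresh_hz        # the panel's fastest possible refresh
        earliest_ms = last_scan_ms + min_gap_ms
        return max(0.0, earliest_ms - frame_ready_ms)

    # A frame finishes rendering 20 ms into the second, 10 ms after the previous scan began.
    print(f"fixed 60Hz refresh: frame waits {wait_fixed_refresh(20.0):.1f} ms")           # ~13.3 ms
    print(f"variable refresh:   frame waits {wait_variable_refresh(20.0, 10.0):.1f} ms")  # 0.0 ms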


NVIDIA’s technology is packaged into a compact module which replaces the monitor’s scaler. Since this is a hardware-centric solution (other than latent driver algorithms to ensure proper functionality with the GPU), it can be outfitted to any panel technology, be it TN, IPS, PVA, IGZO or others. Initially, we’ll likely see it attached to 1080P TN “gaming” panels with 120Hz or 144Hz refresh rates but that will change as it cascades down (or up) into other segments as well.

The real benefits of G-SYNC will likely be most evident when partners begin launching 60Hz IPS displays with integrated support. Not only will they have the color accuracy and wide viewing angles IPS is known for but 60Hz panels arguably have the most to gain from G-SYNC’s stutter-reducing features.


G-SYNC’s Current Limitations Explained


While G-SYNC may look like a cure-all for many of the glaring shortcomings in the display market, it isn’t infallible and it can’t rectify every issue. Stuttering will continue to rear its ugly head when games load textures or when a system storage device becomes a bottleneck. In addition, motion blur still occurs and detracts from the overall experience despite G-SYNC’s improvements in other areas, but with tweaks to LightBoost this can be overcome as well.

With input lag all but eliminated from the display side of the equation, NVIDIA says you’ll need a suitably high-end gaming mouse with a very high polling rate to take full advantage of what G-SYNC offers. I didn’t notice anything more than the usual improvements in response time and accuracy when moving from an MX510 to a G9x, but professional gamers (I’m certainly not one of those) may say otherwise since the screen’s response time has been significantly reduced.
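
As a rough rule of thumb (my own back-of-the-envelope numbers, not anything NVIDIA has specified), the worst-case delay a USB mouse adds is roughly one polling interval, which is why the polling rate starts to matter once the display is no longer the dominant source of lag.

    # Illustrative arithmetic only: worst-case delay added by the mouse's polling rate.
    for polling_hz in (125, 500, 1000):
        print(f"{polling_hz:>4} Hz polling -> up to {1000.0 / polling_hz:.1f} ms before the PC sees a movement")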

There are some additional limitations as well. While it can be used with an SLI system, G-SYNC can’t operate in Windowed Mode, doesn’t currently support NVIDIA’s Surround multi-monitor technology and will only be available through a DisplayPort interface. Don’t expect any external G-SYNC adapters either since existing monitor scalers can’t be modified to work with the technology, nor can they be bypassed via an external hub. As you might expect, the technology is only compatible with NVIDIA’s Kepler-based cards but, due to its unique nature, I’m sure we can all understand why it needs to be held close to NVIDIA’s chest.


A Week of Gaming on G-SYNC


After a week of using an ASUS VG248QE equipped with a G-SYNC module, I can honestly say that much of its hype is completely justified. In fast-paced shooters the ability to run with ultra-low input lag and very little stutter alongside the image quality benefits typically associated with VSync is nothing short of a revelation. The benefits within plodding RTS games are less pronounced but nonetheless still evident when doing quick viewpoint movements.

Typically my gaming is done on a 60Hz IPS monitor so the difference between my current setup and one that’s G-SYNC enabled has been extreme. The benefits were front and center with noticeable decreases in lag and an altogether cleaner sense of onscreen movement. Believe it or not, switching back to my old gaming methods brought about what can best be described as withdrawal. I yearned for the G-SYNC gaming experience even though my IPS panel provided a much richer image. Luckily, once IPS G-SYNC monitors become available, I won't have to make this trade-off.

Even folks currently using 120Hz or 144Hz panels will likely see improvements, even if their graphics card supports NVIDIA’s Adaptive VSync and can consistently output between 90 and 120FPS at a given resolution. Switching between an Acer GD235HZ and the VG248QE wasn't quite as jarring as the switch between G-SYNC and 60Hz IPS but once again I could certainly notice how much cleaner this new technology's output was.

On the input lag front, everyone will benefit and for gamers this one feature can make a massive difference in reaction times and accuracy. For an alpha-stage product my experience with G-SYNC thus far has been nothing short of revolutionary.


One of G-SYNC’s most exciting aspects is its ability to run framerates and the monitor’s vertical refresh rate asynchronously. This will result in massive benefits for lower end setups where VSync leaves performance on the table as the framerates step down to accurately match panel sync times. Even 144Hz panels paired up with an enthusiast-grade system and ultra-high detail settings will see tangible fluidity enhancements from G-SYNC as framerates make their way below 60.

If there is one issue with G-SYNC it is that you have to see it to believe what it can accomplish. Describing the changes it made to how I approached gaming is impossible. Right now, you may think that your current 60Hz or 120Hz panel provides a great experience. I know I did. My tune changed from the moment I started playing Battlefield 4 multiplayer with G-SYNC enabled.

The question here is simple: is NVIDIA’s G-SYNC a genre-defining technology? Even in its early form I’d have to respond with an emphatic “YES!”. Plus, with some of NVIDIA’s soon-to-be-announced features, supporting monitors will quickly become a must-have item for gamers provided the resulting products can hit a variety of price points.

Until our full review is published early next year, I can safely say that after a week of using G-SYNC, there’s really only one word to sum up my experience thus far: awesome.
 
 
