The Radeon Technologies Group Gets Visual

Author: SKYMTL
Date: December 7, 2015

Several months ago AMD announced the creation of the Radeon Technologies Group, a branch of the company put together in an effort to enhance their presence in the visual computing field. Headed by Raja Koduri, its primary mission is to push the envelope of modern graphics solutions while also implementing forward-looking technologies that will ultimately help distinguish upcoming AMD APUs and GPUs from the competition’s offerings. While certain initiatives like AMD’s excellent Crimson software were started well before the RTG was formed, they still live and will eventually evolve under its all-encompassing umbrella.

Putting up a unified front has its advantages and in 2016 we will begin seeing the fruits of the Radeon Technologies Group’s labors. To that end, over the next few weeks there will be a number of glimpses into the future of Radeon graphics which will eventually be rolled out into upcoming products. The first sneak peeks into the new world being forged by AMD and the RTG primarily concern how users experience games, movies and other multimedia displayed through their graphics solutions.


Whereas leadership in this field was once determined solely by framerates, today’s consumers are looking for features that target visual quality right alongside performance. Essentially, throwing frames onto a screen with wanton abandon is no longer good enough if the onscreen visual experience is sub-par. This is why technologies like adaptive synchronization, which deliver optimal motion clarity even in lower-framerate situations, are becoming so popular.

With this in mind, the Radeon Technologies Group’s initial “look ahead” announcements entail areas where they believe the most significant differences can be made in the coming year or so. That means support for HDR displays, rolling out FreeSync into more budget-friendly territories and moving beyond 4K with new display interconnects. Let’s get to it then…


Displays Go HDR


While the clarity brought about by 4K displays has changed the way many approach viewing an onscreen image, the Radeon Technologies Group has asked a simple question: don’t we need more than just pixels? The answer to that is an emphatic “yes” but that also means providing better pixels. One of the first steps towards that goal is support for upcoming high dynamic range (HDR) displays.

Without getting too technical or running too far off base, I’m going to try to explain what display manufacturers are trying to achieve here. In effect, increasing resolution past a certain point runs into a law of diminishing returns since there’s a point where the human eye stops distinguishing between individual pixels. That means the next great leap forward in display technology won’t entail beyond-UHD pixel counts but rather widening native color spaces, offering higher contrast ratios and delivering vastly improved peak luminance. This is where high dynamic range comes in since, with its support, displays should technically be able to deliver vastly more realistic images.
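That diminishing-returns argument can be made concrete with a little trigonometry. The numbers below (a 28-inch 4K panel at roughly 157 PPI, a 24-inch viewing distance, and the oft-cited ~60 pixels-per-degree limit of 20/20 vision) are illustrative assumptions, not figures from any AMD material:

```python
import math

def pixels_per_degree(ppi, distance_in):
    """Pixels spanned by one degree of visual angle at a given viewing distance.

    One degree of arc at distance d covers 2 * d * tan(0.5 deg) inches;
    multiplying by pixel density gives pixels per degree.
    """
    return 2 * distance_in * math.tan(math.radians(0.5)) * ppi

# A 28" 4K monitor (~157 PPI) viewed from 24 inches:
ppd = pixels_per_degree(157, 24)  # ~66 pixels per degree
```

At roughly 66 pixels per degree, such a setup already sits at the ~60 ppd resolving limit of normal vision, so pushing pixel counts further buys very little at that distance; improving each pixel’s color and luminance does.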


In order to understand HDR, it’s important to first discuss luminance since everything in the world around us gives off some measure of it. Measured in nits, luminance spans an enormous range of levels, some of them well beyond the ability of humans to actually see, but within the range we can perceive, a wide span makes images more lifelike. It allows blacks to look black rather than grey and colors to have a three dimensional “pop” effect. Photographers have long used high dynamic ranges to enhance the look of their photos.

So what does this have to do with current displays? Simply put, they come up well short when trying to mimic the luminance field our eyes can perceive all around us. Even the best of them cannot display absolute blacks, nor can their peak abilities extend into the higher nit levels necessary for crisper whites and true tonal highlights. These capabilities typically range from 0.1 nits to about 400 nits, which covers just a fraction of what I’ve discussed above.
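To give a rough sense of how far apart these ranges are, dynamic range can be expressed in photographic “stops” (doublings of luminance). Only the 0.1 to 400 nit figure comes from the discussion above; the HDR-panel and human-vision numbers below are illustrative ballpark assumptions:

```python
import math

def stops(peak_nits, black_nits):
    """Dynamic range expressed in photographic stops (doublings of luminance)."""
    return math.log2(peak_nits / black_nits)

sdr = stops(400, 0.1)    # a good current panel: ~12 stops
hdr = stops(1000, 0.05)  # a hypothetical HDR panel: ~14.3 stops
eye = stops(1e8, 1e-6)   # very rough span of human vision across adaptation
```

Each extra stop doubles the luminance range a panel can reproduce, which is why even a couple of additional stops make highlights and shadows look dramatically more lifelike.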

Things will soon change though since HDR-supporting panels are on the way, many of which will be available by next Christmas. This means LCDs that feature true local dimming and OLEDs will push the boundaries of image quality and offer those better pixels I alluded to before.


While high dynamic range for color may seem like the holy grail, actually getting images with its broad color reproduction abilities onto the screen is an extreme challenge. Historically, much of what the human eye could see has been lost across the various image-processing stages between the original subject and what we see on the screen. The resulting standard dynamic range likeness is what we have come to live with but it lacks the telltale signs of a “living” recreation.


One of the primary abilities of HDR is to expand the color recreation of today’s content. Whereas the human visual range is truly vast, many of today’s color spaces fail to reach anything near the full coverage necessary for faithful recreation of every hue and tone we can see. For example, the typical sRGB color range displayed by many of today’s monitors and HDTVs covers only a fraction of what the eye can perceive.

The next step in this evolution is the so-called “HDR-10” format which, by pairing 10-bit output with the wide BT.2020 color space, will move display capabilities into a range we can only dream of right now. Naturally, this will require new encoding methods but that’s a topic for another time.
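For the curious, the encoding method HDR-10 specifies is the SMPTE ST 2084 “PQ” transfer function, which maps absolute luminance to a perceptually-spaced signal. A minimal sketch, using the constants from the published standard (the usage note at the end is illustrative):

```python
def pq_encode(nits):
    """SMPTE ST 2084 'PQ' inverse EOTF: maps absolute luminance (0-10000 nits)
    to a 0-1 signal value, which HDR-10 then quantizes to 10 bits."""
    m1, m2 = 2610 / 16384, 2523 / 4096 * 128
    c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32
    y = nits / 10000.0          # normalize to the 10,000-nit PQ ceiling
    yp = y ** m1
    return ((c1 + c2 * yp) / (1 + c3 * yp)) ** m2

# 100 nits (a typical SDR peak) lands around signal value ~0.508,
# leaving roughly half of the 10-bit code range for highlights above it.
code_10bit = round(pq_encode(100) * 1023)
```

The curve deliberately spends its code values where the eye is most sensitive, which is how 10 bits can cover a 0 to 10,000 nit range without visible banding.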


The place of Radeon graphics in this equation is an interesting one since the RTG has the opportunity to directly impact how HDR images are displayed on upcoming monitors that support the standard. However, their challenge is a multi-faceted one: right now, so-called “HDR” in games is actually rendered as such within the game engine, but it then has to be remapped to SDR for output to today’s displays. The end result is highlights that often look blown out and colors that are inaccurate.

RTG’s solution to this is an enhanced tone-mapping algorithm which allows for a more faithful recreation of in-game HDR information. When interfacing with one of the upcoming HDR displays, the end result is supposed to be much more faithful to real life than ever before.


While future Radeon graphics cards will of course sport full support for 10-bit HDR displays, the good news is that many of AMD’s existing products have forward compatibility as well. Currently, all R9 300 series, Fury and Nano cards will support HDR-10 output in games and movies through their HDMI 1.4b and DisplayPort 1.2 connectors at rates of up to 4K at 30Hz. Meanwhile, upcoming cards will roll that support out to 4K at 60Hz over HDMI 2.0a and DisplayPort 1.2 / 1.3.

As for availability of the HDR displays, well that’s a bit of a moving target right now but it looks like they’ll start being available sometime in late 2016. At least that’s something to look forward to.


FreeSync Expands to HDMI


FreeSync has been a hotly contested topic here at Hardware Canucks, both among our staff and within the forums. Initial impressions ran from extremely positive to slightly disappointing due to a number of different factors but as time marched on, AMD’s technology began distinguishing itself as being much more than an “also-ran” when compared directly against NVIDIA’s competing G-SYNC. Indeed, through the launch of some spectacular monitors and some key software-side additions like Low Framerate Compensation, it has become an extremely appealing solution for smoothing out on-screen motion.

The Radeon Technologies Group is now moving on to something particularly interesting: supporting FreeSync over a standard HDMI interface. This is notable since, unlike DisplayPort with its existing adaptive-sync protocol, current HDMI specifications contain no variable refresh rate implementation. One is planned for the future but it is far from being rolled out. As such, the RTG is utilizing HDMI’s vendor-specific extension protocol to implement a variable refresh signal without bypassing any certifications.


If that doesn’t make you sit up and take notice, this likely will: this addition will work with any existing or future GPUs and APUs that already support FreeSync. This means it is backwards compatible with HDMI 1.4 while also being forward compatible with HDMI 2.0 (a connector not yet available on Radeon cards). It carries over into notebooks as well, some of which may not have DisplayPort-based FreeSync support. Unfortunately, this technology does need a particular set of monitor timings to work properly, so it isn’t compatible with any existing displays we are aware of.

In order to facilitate this rollout, AMD has partnered up with some pretty major players, a list that is bound to expand in the coming months. More importantly, a number of these like Acer, LG and Samsung are already working on displays that will natively support this technology.


Starting in Q1 2016 we will begin seeing numerous displays with FreeSync over HDMI, which certainly points towards a quick turn-around. However, there is one small caveat about this particular version of FreeSync: the HDMI 1.4 connector on current Radeon cards has some serious bandwidth constraints when compared to DisplayPort. As such, the resolutions and refresh rates it supports are capped at 1080P at 120Hz and 4K at just 30Hz. For a gaming-focused technology, that’s a pretty limiting factor but not insurmountable.
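A quick back-of-the-envelope calculation shows where those two ceilings come from. The sketch below assumes 24-bit color and ignores blanking overhead, so real-world limits are slightly tighter than these raw active-pixel figures:

```python
def data_rate_gbps(w, h, hz, bpp=24):
    """Raw active-pixel data rate in Gbps, ignoring blanking overhead."""
    return w * h * hz * bpp / 1e9

# HDMI 1.4 tops out at a 340 MHz TMDS clock; with 3 data channels each
# carrying 8 payload bits per clock (8b/10b coding), usable video
# bandwidth is about 8.16 Gbps.
hdmi14_budget = 340e6 * 3 * 8 / 1e9

r_1080p120 = data_rate_gbps(1920, 1080, 120)  # ~5.97 Gbps: fits
r_4k30     = data_rate_gbps(3840, 2160, 30)   # ~5.97 Gbps: fits
r_4k60     = data_rate_gbps(3840, 2160, 60)   # ~11.9 Gbps: does not fit
```

Both 1080p/120Hz and 4K/30Hz squeeze under the roughly 8.16 Gbps budget once blanking is accounted for, while 4K/60Hz overshoots it outright, hence the caps above.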

In order to navigate around the aforementioned bandwidth limitations, all of the initial monitors will boast resolutions low enough to ensure refresh rates can remain above the 60Hz mark…so prime FreeSync territory. The only exception to this is the Samsung CF791, but supposedly that particular display will use an upscaler so lower-resolution signals can be processed.


DisplayPort 1.3: A Feature On Upcoming GPUs


Like it or not, things like HDR-10, 4K resolutions, high refresh rates and other technologies take a phenomenal amount of video interconnect bandwidth. While HDMI 2.0 and DisplayPort 1.2 are able to offer some future-proofing, they simply aren’t enough for what’s on the horizon. Hence AMD will be introducing DisplayPort 1.3 on their 2016 mobile (and likely desktop) graphics solutions.


Announced back in 2014, DisplayPort 1.3 offers a substantial boost in overall bandwidth over every other current interconnect when its High Bit Rate 3 (HBR3) signaling is used: 8.1 Gbps per lane versus HBR2’s 5.4 Gbps, for 32.4 Gbps across all four lanes. The standard is fully backwards compatible with existing DP-equipped products and supports the same feature set as its predecessor with some key additions. There’s an HDMI 2.0 compatibility mode with HDCP 2.2 content protection, support for stereo 3D on a 4K screen and potential output to a pair of standard 4K displays. Those are some epic numbers.
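A rough calculation (24-bit color, active pixels only, blanking overhead ignored) shows what HBR3’s extra headroom actually buys:

```python
def effective_gbps(lane_rate_gbps, lanes=4):
    """Usable bandwidth after DisplayPort's 8b/10b line coding (80% efficiency)."""
    return lane_rate_gbps * lanes * 0.8

hbr2 = effective_gbps(5.4)  # DP 1.2: 17.28 Gbps usable
hbr3 = effective_gbps(8.1)  # DP 1.3: 25.92 Gbps usable

# 4K at 120Hz with 24-bit color, counting active pixels only:
need_4k120 = 3840 * 2160 * 120 * 24 / 1e9   # ~23.9 Gbps
# Too much for HBR2, but comfortably inside HBR3's budget.
```

That roughly 23.9 Gbps requirement sits well above HBR2’s usable bandwidth yet inside HBR3’s, which is exactly the gap DisplayPort 1.3 closes.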


With DisplayPort 1.3, a world of possibilities opens up on the monitor front for Radeon GPUs as well. Currently, due to bandwidth limitations, 4K monitors cannot support 120Hz FreeSync, but that dream will become a reality in late 2016 (yeah, that’s a long time to wait). Meanwhile, we will also see next generation displays combining ultra high resolutions, high refresh rates and properly processed HDR gameplay. The possibilities truly are endless.


FreeSync Comes to Notebooks…FINALLY!



Last but not least, there is one final announcement on the FreeSync front and boy did we have to wait a while for this one! Finally there’s a notebook which has an integrated FreeSync panel. Considering how long G-SYNC alternatives have been around, AMD needed a win on the notebook front quite badly.

The Lenovo Y700 includes a 1080P IPS display alongside an AMD FX-8800P processor and dedicated R9 M380 GPU. A price of just $899 will likely put AMD’s sole notebook FreeSync offering on quite a few gamers’ radars, but hopefully this won’t be the first and last product to include these tempting specifications.
 
 
