Image Quality (Uniformity / Panel & Gamma Performance)
Please remember that the settings below have been calibrated for our specific environment and your viewing conditions may differ from ours.
Mode Used: "User Mode"
- All tests were done at default settings at 120 cd/m².
- Unless otherwise noted, the tests were carried out via DisplayPort or HDMI.
In a perfect world a screen’s brightness output would be equal across the entire panel. This is not a perfect world, but the lower the variation, the less likely you are to notice overly bright or dark sections of the screen. For the consumer LCD marketplace a variance of 10% is our gold standard, but anything below 15% can be considered excellent, as we doubt anyone will notice a ±7.5% variation. A variation above 15% but below 24% can be considered adequate; anything above that does not meet our basic minimum standards.
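To illustrate how a score like this is arrived at, the sketch below computes panel variance as the largest deviation from a center reference reading and maps it onto the tiers above. The 3x3 zone layout, the readings, and the function name are our own illustrative assumptions, not actual measurements:

```python
# Hypothetical luminance readings (cd/m^2) for a 3x3 measurement grid;
# the center zone serves as the 120 cd/m^2 reference point.
readings = {
    "top-left": 112.0, "top-center": 116.0, "top-right": 111.0,
    "mid-left": 117.0, "center": 120.0, "mid-right": 118.0,
    "bottom-left": 110.0, "bottom-center": 115.0, "bottom-right": 109.0,
}

def max_deviation_pct(zones, reference="center"):
    """Largest absolute deviation from the reference zone, in percent."""
    ref = zones[reference]
    return max(abs(v - ref) / ref * 100 for v in zones.values())

variance = max_deviation_pct(readings)
if variance <= 10:
    rating = "gold standard"
elif variance <= 15:
    rating = "excellent"
elif variance <= 24:
    rating = "adequate"
else:
    rating = "below minimum standards"
```

With these hypothetical readings the worst zone is about 9.2% off the center, which would land in the "gold standard" tier.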
Considering how downright massive this monitor is, a panel variance of 13% is not all that bad; in fact, it is above average. Above average or not, most of this variance comes down to the panel’s curvature. The curve may be gentle compared to some, but it is still enough to make maintaining a consistent brightness level across the entire screen genuinely difficult.
In a perfect world a screen’s real-world response rate would be so high that motion blur, ‘ghosting’, and ‘reverse ghosting’ would be a thing of the past. No matter how fast the action on screen, all images would be rendered in pristine condition, similar in quality to a static image. This is not a perfect world, but the less blurring that occurs, the less likely you are to notice the issue in real-world scenarios. While the panel’s response time (ms) and refresh rate (Hz) can give a rough idea of how much blurring to expect, they are not the be-all and end-all.
To this end we have taken PRAD’s Pixel Persistence Analyzer ‘Streaky Pictures’ program and used a high-speed camera to capture exactly how much, and what kind of, motion blur you can expect from a given monitor.
As expected, the combination of 100Hz and G-SYNC is superior to the XR341CK’s 75Hz + FreeSync combo. However, while high-speed cameras will pick up a difference between the two, the human eye likely won't be able to distinguish one from the other. The only exception is operation below each screen's respective adaptive sync range. While AMD has done an admirable job of rectifying their limitations here, G-SYNC is still arguably superior and continues to provide the best option for blur- and judder-free motion.
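The refresh-rate gap above can be put in simple numbers: on a sample-and-hold LCD, each frame stays on screen for 1000/Hz milliseconds, which sets a floor on perceived motion blur. A quick sketch (the function name is our own):

```python
def frame_time_ms(refresh_hz):
    """Time each frame is held on screen at a given refresh rate, in ms."""
    return 1000.0 / refresh_hz

# XR341CK's 75Hz FreeSync ceiling vs. this monitor's 100Hz:
print(frame_time_ms(75))   # ~13.3 ms per frame
print(frame_time_ms(100))  # 10.0 ms per frame
```

A gap of roughly 3.3 ms per frame is real and measurable, but small enough that, as noted above, a high-speed camera will see it long before most eyes do.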
Gamma correction is one of the hardest terms to explain. For our purposes, however, the gamma of any electronic device determines how bright or dark an image will appear on its screen.
All PC devices now use 2.20 gamma as the default. Any variance from this will result in an image that is either too dark, with black crush and lost shadow detail (gamma above 2.20), or washed out, with elevated black levels and an over-exposed look (gamma below 2.20).
While 2.20 is the gold standard, a minor deviation of 0.10 will in all likelihood never be noticed by anyone other than professional photographers. Higher levels of deviation, however, will be noticed by just about everyone.
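The size of a gamma deviation is easy to quantify: a display maps a normalized input level v (0.0 to 1.0) to a relative output luminance of v^gamma, so a small gamma shift moves the mid-tones most. A sketch, with our own helper name and rounded values:

```python
def relative_luminance(level, gamma=2.2):
    """Map a normalized input level (0.0-1.0) to relative output luminance."""
    return level ** gamma

# A 50% gray input at the 2.20 reference vs. a measured 2.22:
ref = relative_luminance(0.5, 2.20)       # ~0.218
measured = relative_luminance(0.5, 2.22)  # ~0.215
```

The two mid-gray outputs differ by less than 2% relative luminance, which is why a 0.02 gamma deviation is effectively invisible in practice.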
At 2.22 this monitor may be slightly worse than the XR341CK, but it is still close enough to perfection that most users won't care. The only people who will be disappointed by this minor deviation are professionals whose livelihood depends on producing perfectly accurate images, and those professionals will own a colorimeter and use it before they ever bother with 'factory' settings. All in all, this may come down to panel-to-panel variance rather than any limitation of the X34.