From G-Sync Variable Refresh To G-Sync HDR Gaming Experience

The original FreeSync and G-Sync were solutions to a specific and longstanding problem: fluctuating framerates caused either screen tearing or, with V-Sync enabled, stutter and input lag. The result of variable refresh rate (VRR) technology has been a considerably smoother experience in the 30 to 60 fps range, and an equally important benefit has been compensating for dips and peaks across the wider ranges introduced by higher refresh rates like 144Hz. So both technologies were very much tied to a single specification that directly described the experience, even if the numbers sometimes didn’t do the experience justice.

Meanwhile, HDR for gaming is a whole suite of capabilities that essentially allows for greater brightness, deeper blacks, and better/more colors. Just as importantly, it requires developer support in applications and the production of HDR content. The end result is not nearly as cut-and-dried as VRR, as much depends on the game’s implementation – or in NVIDIA’s case, sometimes on Windows 10’s implementation. Done properly, even a simple increase in brightness can bring perceived enhancements in colorfulness and contrast, known as the Hunt effect and Stevens effect, respectively.
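To put ‘greater brightness, blacker darkness’ in concrete terms: HDR10 content is encoded with the SMPTE ST 2084 ‘PQ’ transfer function, which maps signal values to absolute luminance of up to 10,000 nits, versus the roughly 100-nit reference white of SDR. The short Python sketch below evaluates the standard PQ EOTF purely as an illustration of that range; it is not part of any FreeSync 2 or G-Sync HDR requirement, and the sample code values are arbitrary.

    # Illustrative sketch: the SMPTE ST 2084 (PQ) EOTF used by HDR10.
    # Maps a normalized PQ signal value in [0, 1] to absolute luminance in nits.

    M1 = 2610 / 16384            # ST 2084 constants
    M2 = 2523 / 4096 * 128
    C1 = 3424 / 4096
    C2 = 2413 / 4096 * 32
    C3 = 2392 / 4096 * 32

    def pq_eotf(signal):
        """Convert a normalized PQ code value (0..1) to luminance in cd/m^2 (nits)."""
        e = signal ** (1 / M2)
        y = max(e - C1, 0.0) / (C2 - C3 * e)
        return 10000.0 * y ** (1 / M1)

    # A PQ value of ~0.75 already corresponds to roughly 1000 nits (the peak this
    # class of panel targets), while 1.0 maps to the format's 10,000-nit ceiling.
    for code in (0.25, 0.5, 0.75, 1.0):
        print("PQ signal %.2f -> %8.1f nits" % (code, pq_eotf(code)))

The key point is that PQ encodes absolute luminance: content mastered at 1000 nits asks the display for 1000 nits, rather than ‘whatever the panel’s maximum happens to be’ as with SDR gamma.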

So we can see why both AMD and NVIDIA are pushing the idea of a ‘better gaming experience’, though NVIDIA is explicit about this with G-Sync HDR. The downside is that the required specifications for both the FreeSync 2 and G-Sync HDR certifications are closed off and only discussed broadly, deferring to VESA’s DisplayHDR standards. Their situations, however, are very different. AMD’s explanations are a little more open, and outside of HDR requirements, FreeSync 2 also has a lot to do with standardizing SDR VRR quality, with mandated low framerate compensation (LFC), a wider VRR range, and lower input lag. AMD has also stated that FreeSync 2’s color gamut, maximum brightness, and contrast ratio requirements are broadly comparable to those in DisplayHDR 600, though the HDR requirements do not overlap completely. And with FreeSync/FreeSync 2 support on Xbox One models and upcoming TVs, FreeSync 2 appears to be the more straightforward specification.

For NVIDIA, the push is much more general and holistic, less about feature standards and more about the specific products themselves. At the same time, the company discussed the need for consumer education on the spectrum of HDR performance. While there are specific G-Sync HDR standards as part of the G-Sync certification process, those specifications are only known to NVIDIA and the monitor manufacturers. Nor was much detail provided on minimum requirements beyond HDR10 support, 1000 nits peak brightness, and unspecified DCI-P3 coverage for the 4K G-Sync HDR models; NVIDIA cited its certification process and deferred detailed capabilities to whatever other certifications G-Sync HDR monitors may hold – in this case, the UHD Alliance Premium and DisplayHDR 1000 certifications for the ASUS PG27UQ. Which is to say that, at least for the moment, the only G-Sync HDR displays are those that adhere to some very stringent standards; there aren't any monitors under this moniker that offer limited color gamuts or subpar dynamic contrast ratios.

At least with UHD Premium, the certification is specific to 4K resolution, so while the announced 65” 4K 120Hz Big Format Gaming Displays will almost surely qualify, the 35” curved 3440 × 1440 200Hz models won’t. Practically speaking, all the capabilities of these monitors are tied to the AU Optronics panels inside them, and we know that NVIDIA worked closely with AUO as well as the monitor manufacturers. As far as we know, those AUO panels are only found in G-Sync HDR displays, and vice versa. No other standardized specification was disclosed; NVIDIA only referred back to its own certification process and the ‘ultimate gaming display’ ideal.

As much as NVIDIA mentioned consumer education on the HDR performance spectrum, the consumer is hardly any more educated about a monitor’s HDR capabilities by the G-Sync HDR branding. Detailed specifications are left to monitor certifications and manufacturers, which is the status quo. Lacking a dedicated G-Sync HDR page, NVIDIA lists G-Sync HDR features under the G-Sync page, and while those features are labeled as G-Sync HDR, there is no explanation of the full differences between a G-Sync HDR monitor and a standard G-Sync monitor. The NVIDIA G-Sync HDR whitepaper is primarily background on HDR concepts along with a handful of generalized G-Sync HDR details.

For all intents and purposes, G-Sync HDR is presented not as a specification or a technology but as branding for a premium product family, and right now it is more useful for consumers to think of it that way.

91 Comments

  • Flunk - Tuesday, October 2, 2018 - link

    I'd really like one of these, but I can't really justify $2000 because I know that in six months to a year competition will arrive that severely undercuts this price.
  • imaheadcase - Tuesday, October 2, 2018 - link

    That's just technology in general. But keep an eye out: around that time, this monitor is coming out with a revision that will remove the "gaming" features but still maintain the refresh rate and size.
  • edzieba - Tuesday, October 2, 2018 - link

    The big omission to watch out for is the FALD backlight. Without that, HDR cannot be achieved outside of an OLED panel (and even then OLED cannot yet meet the peak luminance levels). You're going to see a lot of monitors that are effectively SDR panels with the brightness turned up, sold as 'HDR'. If you're old enough to remember when HDTV was rolling out, remember the wave of budget 'HD' TVs that used SD panels but accepted and downsampled HD inputs? Same situation here.
  • Hixbot - Tuesday, October 2, 2018 - link

    Pretty sure edge-lit displays can hit the wider gamut by using a quantum dot filter.
  • DanNeely - Tuesday, October 2, 2018 - link

    Quantum dots increase the color gamut; HDR is about increasing the luminance range on screen at any given time. Edge-lit displays only have a handful of dimming zones at most (no way to get more when your control consists of only one configurable value per row/column). You need backlighting where each small chunk of the screen can be controlled independently to get anything approaching a decent result. Per-pixel is best, but only doable with OLED or jumbotron-size displays (microLED – we can barely make normal LEDs small enough for this scale). OTOH, if costs can be brought down, microLED does have the potential to power a FALD backlight with an order of magnitude (or more) more dimming zones than current LCD models; enough to largely make halo effects around bright objects a negligible issue.
  • Lolimaster - Tuesday, October 2, 2018 - link

    There is also miniLED, which will replace regular LEDs for the backlight.

    MicroLED = OLED competition
    MiniLED: up to 50,000 zones (cheap "premium" phones will come with 48 zones).
  • crimsonson - Tuesday, October 2, 2018 - link

    I think you are exaggerating a bit. HDR is just a transform function. There are several standards that say what the peak luminance should be to be considered HDR10 or Dolby Vision, etc. But that itself is misleading.

    Define "(and even then OLED cannot yet meet the peak luminance levels)"
    Because OLED can definitely reach 600+ nits, which is one of the proposed standards for HDR.
  • edzieba - Tuesday, October 2, 2018 - link

    "HDR is just a transform function"

    Just A transform function? [Laughs in Hybrid Log Gamma]

    Joking aside, HDR is also a set of minimum requirements. Claiming that panels which do not even come close to meeting those requirements are also HDR is akin to claiming that 720x468 is HD, because "it's just a resolution". The requirements range far beyond just peak luminance levels, which is why merely slapping a big-ass backlight onto a panel and claiming it is 'HDR' is nonsense.
  • crimsonson - Wednesday, October 3, 2018 - link

    "
    Just A transform function? [Laughs in Hybrid Log Gamma],"

    And HLG is again just a standard for how to handle HDR and SDR. It is not required to display HDR images.

    "HDR is also a set of minimum requirements"

    No, there are STANDARDS that attempt to address HDR features across products and in video production. But that in itself does not mean violating those standards equates to a non-HDR image. Dolby Vision, for example, supports dynamic metadata. HDR10 does not. Does that make HDR10 NOT HDR?
    Eventually, the market and the industry will congregate behind one or two SETS of standards (since it is not only about one number or feature). But we are not there yet. Far from it.

    Since you like referencing these standards, you do know that VESA has HDR standards as low as 400 and 600 nits, right?

    And I think you are conflating wide gamut with dynamic range. FALD is not needed to achieve wide gamut.

    And using HD to illustrate your point shows you don't understand how standards work in broadcast and manufacturing.
  • edzieba - Thursday, October 4, 2018 - link

    "And HLG is again just a standard of how to handle HDR and SDR. It is not required or needed to display HDR images."

    The joke was that there are already at least three HDR transfer function standards, and some (e.g. Dolby Vision) allow for on-the-fly modification of the transfer function.

    "And I think you are conflating wide gamut vs Dynamic Range. FALD is not needed to achieve wide gamut."

    Nobody mentioned gamut. High dynamic range requires, as the name implies, a high dynamic range. LCD panels cannot achieve that high dynamic range on their own; they need a segmented backlight modulator to do so.
    As much as marketers would want you to believe otherwise, a straight LCD panel with an edge-lit backlight is not going to provide HDR.

    "And using HD to illustrate your points exemplifies you don't understand how standards work in broadcast and manufacturing."

    Remember how "HD ready" was brought in to address exactly the same problem of devices marketing capabilities they did not have? And how it brought complaints about allowing 720p devices to also advertise themselves as "HD Ready"? Is this not analogous to the current situation where HDR is being erroneously applied to panels that cannot achieve it, and how VESA's DisplayHDR has complaints that anything below Display HDR1000 is basically worthless?
