Advanced displays clearly drive sales of higher-end GPUs, so NVIDIA has multiple initiatives to make high-end gaming monitors available. One such initiative is the BFGD (big format gaming display) project, designed to bring ultra-large gaming LCDs to market. Several display makers have signed on to BFGD, but at present only HP's 64.5-inch Omen X Emperium is available. That is set to change later this year, as ASUS is prepping to launch its own BFGD.

The ASUS ROG Swift PG65UQ uses a 64.5-inch 8-bit 4K Ultra-HD AMVA panel featuring 750-1000 nits brightness (typical/HDR), a 3200:1-4000:1 contrast ratio (minimum/typical), 178° viewing angles, a 120 - 144 Hz refresh rate (normal/overclocked), and a 4 ms GtG response time with overdrive enabled. Since the monitor belongs to the G-Sync HDR range, it has a 384-zone full-array local dimming (FALD) backlight enhanced with quantum dots to guarantee high contrast as well as precise reproduction of 95% of the DCI-P3 color space.
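To put those refresh-rate figures in context, a rough back-of-the-envelope check shows why 4K at 144 Hz pushes current display links to their limits. This is only a sketch: the function name is illustrative, and it counts active pixels only, ignoring blanking intervals (real link requirements are somewhat higher).

```python
# Lower-bound link-bandwidth estimate for uncompressed video.
# Counts active pixels only; actual timings add blanking overhead.

def min_bandwidth_gbps(width, height, refresh_hz, bits_per_pixel):
    """Minimum uncompressed video data rate in Gbit/s (active pixels only)."""
    return width * height * refresh_hz * bits_per_pixel / 1e9

# 4K at 144 Hz with 8-bit RGB (24 bits per pixel)
rate = min_bandwidth_gbps(3840, 2160, 144, 24)

# DisplayPort 1.4 (HBR3) carries ~25.92 Gbit/s of video data after
# 8b/10b coding overhead, so full 4K/144 RGB does not quite fit.
print(f"{rate:.1f} Gbit/s needed vs ~25.92 Gbit/s available")  # 28.7 Gbit/s
```

This is why G-Sync HDR monitors in this class resort to chroma subsampling at their highest refresh rates.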

While the ROG Swift PG65UQ uses the same panel as HP's Omen X Emperium, the monitor is configured differently. Unlike HP, ASUS decided not to equip its 64.5-inch gaming display with a high-end soundbar or an integrated NVIDIA Shield set-top box. The display does, however, retain a USB hub and an IR receiver.

The decision not to offer a soundbar and the STB is pretty logical, as hardcore gamers (especially among the ROG clientele) tend to use speakers of their own choosing. Besides, without the soundbar and the STB the monitor costs less to build, potentially enabling ASUS to sell it at a lower price point than HP.

So far, ASUS has announced neither a firm launch date nor an MSRP for the ROG Swift PG65UQ, but NVIDIA is already demonstrating the monitor, so the unit appears to be more or less ready.

Want to keep up to date with all of our Computex 2019 Coverage?
Follow AnandTech's breaking news here!

  • Opencg - Friday, May 31, 2019 - link

    I'm so tired of full array. It does nothing for local contrast.
  • quiksilvr - Friday, May 31, 2019 - link

    I'm ready for OLED. With sufficient pixel shifting it would be ideal for gaming.
  • nathanddrews - Friday, May 31, 2019 - link

    It's already ideal for gaming - you just have to be willing to accept image retention as a part of life and the likelihood of eventual, permanent burn-in. If you're the type of person that upgrades displays every few years, then you have far more to gain than to lose.
  • Kamus - Friday, May 31, 2019 - link

    Burn-in is possible, but mostly FUD. Plasmas had more burn-in potential, yet my 8-year-old plasma doesn't have any.

    My Note 4, which I used for 4 years before retiring it, never got any either.
  • Opencg - Friday, May 31, 2019 - link

    Anyone heard much about these double-layered displays? Supposedly they are an alternative to OLED that offers high contrast on a per-pixel basis.
  • nathanddrews - Saturday, June 1, 2019 - link

    Nothing you can buy quite yet. Panasonic introduced a proof of concept back in 2016 that used two layers of IPS. HiSense recently showed something too, I think. Basically, you use a cheap, monochrome 1080p IPS panel behind the color 4K IPS panel and shine a SH!T-TON of light through it. The 1080p panel effectively acts like 2-million-zone FALD behind the 4K panel that further assists with blocking light to reduce blooming and increase contrast.

    So you get better-than-VA contrast (won't beat OLED) with the viewing angles of IPS. From articles and interviews I've seen, it's the power efficiency that's holding it back. By the time it reaches the market (2021?), we may also see MicroLED hit the scene, who knows.
  • reckless76 - Friday, May 31, 2019 - link

    Well, that's just not going to fit on my desk...
    What exactly is the use case for this? I kinda thought if I wanted my PC hooked up to a TV in the living room, I'd buy a TV. Speaking of which, this screen looks a lot like a Samsung Q series in game mode, with G-Sync instead of FreeSync.
  • Flunk - Friday, May 31, 2019 - link

    You'd buy it because the specs vastly exceed any TV on the market. The main difference between a current-gen TV and a current-gen monitor is the inclusion of a tuner (and maybe some picture tuning).
  • Flunk - Friday, May 31, 2019 - link

    Although I imagine most people would never buy this anyway because the specs on this monitor imply that it will be absurdly expensive.
  • Alistair - Friday, May 31, 2019 - link

    How does it "vastly" exceed a current gen OLED TV from LG? Or the Samsung Q90? My understanding is that those TVs would be similar once an HDMI 2.1 source is available. Spec tables don't mean much without a review.

    Samsung's Q90 hits 10,000:1 contrast with full-array local dimming and 1300 nits brightness in HDR, and does 4K at 120 Hz on port 4 (currently limited to 4:2:0 color with HDMI 2.0 sources, but it is supposedly an HDMI 2.1 port).

    It also has an optical layer that Samsung calls 'Ultra Viewing Angle,' which greatly improves the viewing angles at the expense of a lower native contrast ratio, basically making the VA panel similar to IPS in terms of viewing angles while still retaining much better contrast, like most VA TVs.

    Samsung's TV is very expensive ($3000), but we are hearing the ASUS will cost a lot more than that. Buying an LG OLED is still my preference, since it supposedly also has an HDMI 2.1 port. We need an HDMI 2.1 video card first, from NVIDIA or AMD.
