
How to Choose an HDR Gaming Monitor

Although you don’t have to spend a fortune on the essentials, a good high dynamic range (HDR) gaming monitor isn’t cheap.

For a better-looking image, scenes are rendered with brighter highlights, more shadow detail, and a wider range of colors. Unlike TV HDR, game HDR means more than just a prettier picture: you’re more likely to spot lurking enemies and find clues if you can see what’s hiding in the brightest and darkest areas. Keep in mind, though, that most games are still designed to play to the middle of the range: everything you really need to see sits in the middle of the brightness range.

Games still need explicit HDR support for the best results, although the addition of Auto HDR in Windows 11 and on the Xbox Series X/S changes that somewhat: the operating systems can automatically expand the brightness and color gamut of non-HDR games. It’s not the same as a game specifically designed to take advantage of the wider dynamic range, but it can give a game a boost and make it look better than it otherwise would.

What is HDR and why do I want it?

HDR mixes a number of components to work its magic. First, it uses a wider range of brightness levels than a standard monitor’s 256 displayable levels, and in some cases goes beyond even a great monitor’s true 1,024 levels. It also covers more colors than the least-common-denominator sRGB gamut. And it includes the profiles needed to map content’s color and brightness ranges optimally to the display’s capabilities, a decoder in the monitor that understands that mapping, and all the related technologies that hold the puzzle pieces together, not least the operating system.
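To put those numbers in perspective, here’s a tiny Python sketch (illustrative only) showing how bit depth translates into tonal steps per channel and total colors; the 10-bit figure is where the “billion colors” marketing claim comes from.

```python
# Rough illustration: how many distinct brightness steps a panel can address
# at a given bit depth. 8-bit SDR panels have 256 steps per channel; 10-bit
# HDR panels have 1,024, so gradients in bright skies or dark shadows band
# far less.

def tonal_steps(bits_per_channel: int) -> int:
    """Number of distinct code values per color channel."""
    return 2 ** bits_per_channel

for bits in (8, 10, 12):
    print(f"{bits}-bit panel: {tonal_steps(bits):,} steps per channel, "
          f"{tonal_steps(bits) ** 3:,} total colors")

# 8-bit:  256 steps per channel,  16,777,216 total colors
# 10-bit: 1,024 steps per channel, 1,073,741,824 total colors (the "billion colors" claim)
# 12-bit: 4,096 steps per channel, 68,719,476,736 total colors
```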

HDR often doesn’t matter, either because there aren’t many scenes with extreme highlights or deep shadows, or because the game doesn’t take advantage of the wider tonal range. But if a game supports it, you’ll likely see better visuals in AAA titles, more scares in horror titles, fewer ambushes in FPS titles, and so on.

The question isn’t whether you want it, but how much you want it, and how much you’re willing to spend on it: not just for a display that lists “HDR10” in its specifications, but for a monitor that will actually deliver HDR-quality visuals.

Will an HDR gaming monitor work with the Xbox Series X/S and PS5?

Yup! Under the banner of the HDR Gaming Interest Group (HGIG), Sony, Microsoft, and a plethora of other relevant companies have even created a set of publicly available best practices for HDR game development and monitor design for their consoles and Windows. However, HGIG is not a standards body and does not certify products, so you still have to review specifications carefully. And the confusion only grows from there.

‘HDMI 2.1’ caveats

Unfortunately, the HDMI protocol has devolved into such a tangle that you can no longer infer capabilities from the version number alone. The specification no longer requires any of the significant new features; in other words, all the glitzy features that made HDMI 2.1 desirable, especially for consoles, are now optional. Going forward, even a connection with only the HDMI 2.0 feature set can be labeled HDMI 2.1 (or 2.1a).

The bottom line: if you want a monitor for your console that can handle 4K at 120Hz, variable refresh rate, and auto low-latency mode, you must confirm support for each feature separately. The same holds true if you want a PC monitor that can connect over HDMI and support source-based tone mapping (covered below), as well as bandwidth-intensive combinations of high resolution, fast refresh rates, and high color depth for HDR.
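To see why those combinations are bandwidth-intensive, here’s a back-of-the-envelope Python sketch estimating the uncompressed bit rate of a 4K, 120Hz, 10-bit RGB signal. The 15 percent blanking overhead is an assumption for illustration, and the ~18Gbps and ~48Gbps figures are the nominal link rates of HDMI 2.0 and full-bandwidth HDMI 2.1.

```python
# Back-of-the-envelope estimate of uncompressed video bandwidth.
# The blanking overhead (~15%) is an approximation; real CTA timings vary.

def video_bandwidth_gbps(width, height, refresh_hz, bits_per_channel,
                         channels=3, blanking_overhead=0.15):
    """Approximate uncompressed bit rate in Gbps, including blanking."""
    pixels_per_frame = width * height * (1 + blanking_overhead)
    bits_per_pixel = bits_per_channel * channels
    return pixels_per_frame * refresh_hz * bits_per_pixel / 1e9

# 4K at 120Hz with 10-bit RGB: a typical HDR gaming target
needed = video_bandwidth_gbps(3840, 2160, 120, 10)
print(f"~{needed:.1f} Gbps needed")                    # roughly 34 Gbps
print("Fits HDMI 2.0 (~18 Gbps)?", needed < 18)        # False
print("Fits full HDMI 2.1 (~48 Gbps)?", needed < 48)   # True
```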

Monitor manufacturers are required to explicitly list which features are supported; if they don’t, either avoid the monitor or dig deeper. TFT Central does a good job of explaining the problems if you want the detailed breakdown.

What do I look for in an HDR gaming monitor?

Because marketers have stretched the definition to cover displays in the most popular price range (less than $400), the term “HDR” has become rather muddied. To some extent, you have to weigh a number of specifications to determine whether a monitor can deliver a true HDR experience.

To help you narrow down your options, the VESA display industry organization developed DisplayHDR, a set of standards and criteria for describing HDR quality levels in consumer monitors. DisplayHDR 400 is the baby pool of HDR, given its color gamut and brightness requirements, but it’s a fine option if you’re only looking for a bright SDR display.

It gets confusing because several manufacturers now refer to monitors as “HDR 600,” for instance. It’s never clear whether they’re using that as shorthand for the corresponding DisplayHDR level because they don’t want to pay for the logo certification program, or whether they’re using it loosely to indicate that they can hit that tier’s peak brightness. Manufacturers can choose not to use the logo and instead run the certification tests for internal verification. (The DisplayHDR Test tool, available in the Microsoft Store, lets you do this as well.)
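As a rough sanity check on those labels, the tier names themselves encode the minimum peak brightness VESA tests for. The Python sketch below checks brightness only; actual certification also covers color gamut, bit depth, black level, and dimming behavior.

```python
# Minimum peak brightness (nits) implied by each DisplayHDR tier name.
# Real VESA certification also tests color gamut, bit depth, black level,
# and dimming behavior; the OLED-oriented True Black tiers are omitted here.

DISPLAYHDR_TIERS = {
    "DisplayHDR 400": 400,
    "DisplayHDR 500": 500,
    "DisplayHDR 600": 600,
    "DisplayHDR 1000": 1000,
    "DisplayHDR 1400": 1400,
}

def highest_possible_tier(peak_nits: float) -> str:
    """Best tier a panel's peak brightness alone could qualify for."""
    qualifying = [t for t, nits in DISPLAYHDR_TIERS.items() if peak_nits >= nits]
    return qualifying[-1] if qualifying else "below DisplayHDR 400"

print(highest_possible_tier(450))    # DisplayHDR 400
print(highest_possible_tier(1100))   # DisplayHDR 1000
```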

HDR10 and HDR10 Plus Gaming

Compliance with the HDR10 standard is the most basic requirement for a monitor to qualify as “HDR” (and the cheapest to include), but on its own it’s essentially meaningless: it indicates only that the monitor can interpret the data stream and render it, not that it can render it well. It simply means the monitor supports what an operating system needs in order to map HDR content to the monitor’s capabilities, including brightness mapping, the 10-bit math required for that mapping (the SMPTE ST 2084 EOTF, or “PQ” gamma), working with compressed color sampling in video, and handling and mapping colors as notated by the standard.
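For the curious, the SMPTE ST 2084 “gamma” mentioned above is the perceptual quantizer (PQ) EOTF, which maps a normalized signal value to an absolute luminance of up to 10,000 nits. Here’s a minimal Python sketch of that curve using the published PQ constants; real monitors tone-map the result down to whatever they can actually display.

```python
# SMPTE ST 2084 (PQ) EOTF: maps a normalized signal value (0..1) to an
# absolute luminance in nits (cd/m^2), up to a theoretical 10,000 nits.

M1 = 2610 / 16384          # 0.1593017578125
M2 = 2523 / 4096 * 128     # 78.84375
C1 = 3424 / 4096           # 0.8359375
C2 = 2413 / 4096 * 32      # 18.8515625
C3 = 2392 / 4096 * 32      # 18.6875

def pq_eotf(signal: float) -> float:
    """Decode a normalized PQ signal (0..1) to luminance in nits."""
    e = signal ** (1 / M2)
    y = max(e - C1, 0.0) / (C2 - C3 * e)
    return 10000.0 * y ** (1 / M1)

# A 10-bit code value of 1023 maps to the full 10,000 nits; mid-range code
# values land far lower, which is why most of a game sits in SDR territory.
for code in (0, 512, 767, 1023):
    print(f"code {code:4d} -> {pq_eotf(code / 1023):8.1f} nits")
```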

The HDR10 standard’s backers revealed a new level at CES 2022, the upcoming HDR10 Plus Gaming standard, a variant of the HDR10 Plus spec that has been on TVs for some time. Its Source Side Tone Mapping (SSTM) expands the brightness range on a scene-by-scene basis using information inserted by the game developer, in contrast to HDR10, which has a single range that must serve the entire game. It also adds support for variable refresh rates at 4K 120Hz on consoles, plus the ability to automatically engage a display’s low-latency mode to offset the extra overhead of the HDR data (more crucial for TVs than monitors); as of this writing, it still isn’t implemented on the PS5.

Because the license also covers usage rights to certain member manufacturers’ patents, HDR10 Plus requires certification and a license fee from hardware makers (including GPU vendors), but not from software developers. At CES, Samsung announced that all of its 2022 gaming displays will support HDR10 Plus.

Color and brightness

A screen’s brightness is a measure of how much light it can produce, typically expressed in nits (candelas per square meter). Standard dynamic range (SDR) desktop monitors normally run from 250 to 350 nits, while HDR displays also specify a peak brightness they can reach for brief periods, and usually only for a small section of the screen. HDR-capable displays should have a peak brightness of at least 400 nits; the brightest now reach 1,600 nits. (Laptop screens benefit from higher brightness even without HDR, because they have to remain viewable in all kinds of lighting, including direct sunlight.)

Regardless of how bright they can get, OLED panels can drop to near-zero black levels, which is what gives them such strong contrast. And contrast is one of the main factors in how we judge the quality of an image.
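Contrast ratio is simply peak luminance divided by black level, which is why those near-zero blacks matter so much. A quick sketch with illustrative numbers (not measurements of any particular panel):

```python
# Contrast ratio = peak luminance / black level.
# The figures below are illustrative, not measurements of a specific panel.

def contrast_ratio(peak_nits: float, black_nits: float) -> float:
    return peak_nits / black_nits

lcd  = contrast_ratio(peak_nits=400, black_nits=0.4)      # typical edge-lit LCD
oled = contrast_ratio(peak_nits=400, black_nits=0.0005)   # OLED black is nearly zero

print(f"LCD : {lcd:>9,.0f}:1")    # 1,000:1
print(f"OLED: {oled:>9,.0f}:1")   # 800,000:1
```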

The color space you care most about for gaming, and for monitors in general, is P3, which comes in two slightly different varieties: DCI-P3 and D65 P3. In practice, their only difference is the white point; DCI is slightly warmer (6300K versus 6500K) and was developed for digital cinema. Nonetheless, I frequently see DCI-P3 listed in specifications when D65 is what’s meant. That’s okay, because the D65 variant, which Apple championed for its own screens, is the one that matters for gaming monitors. They both cover the same gamut, so I simply use “P3” unless I’m explicitly distinguishing between the two. (Trained eyes can tell the two whites apart, but most people won’t care.)

Gamuts are also frequently expressed as a percentage of Adobe RGB, which is acceptable. Because printers use cyan ink, Adobe RGB skews somewhat toward the green/cyan end of the spectrum, while P3 extends further into the greens and yellows, which are easier for high-quality monitors to produce. That’s also why a claim of “over a billion colors” (the result of using 10-bit math) is worthless on its own: which billion matters.

HP and Microsoft created the sRGB color space in 1996 to provide least-common-denominator color matching in Windows. It’s roughly equivalent to the Rec 709 SDR video standard, so any monitor you’re considering for decent HDR viewing should unquestionably cover much more than 100% of that range. Compare the gamuts and it’s clear why sRGB-calibrated monitors and images have such dull greens and why everything looks low in contrast: sRGB can’t reach high saturation levels for most colors.
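To put those gamut comparisons in rough perspective, the sketch below compares the triangle areas of the sRGB, Adobe RGB, and DCI-P3 primaries in CIE 1931 xy space. Triangle area is only a crude proxy for coverage (real coverage figures are computed differently), but the relative sizes show why exceeding 100% of sRGB is a low bar.

```python
# Compare gamut sizes by the area of each gamut's triangle of R/G/B
# primaries in CIE 1931 xy space. A crude proxy for "coverage," but it
# shows how much larger Adobe RGB and P3 are than sRGB.

PRIMARIES = {  # (x, y) chromaticities of the red, green, and blue primaries
    "sRGB":      [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)],
    "Adobe RGB": [(0.640, 0.330), (0.210, 0.710), (0.150, 0.060)],
    "DCI-P3":    [(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)],
}

def triangle_area(pts):
    """Shoelace formula for the area of a triangle given three (x, y) points."""
    (x1, y1), (x2, y2), (x3, y3) = pts
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

srgb_area = triangle_area(PRIMARIES["sRGB"])
for name, pts in PRIMARIES.items():
    print(f"{name:9s}: {triangle_area(pts) / srgb_area * 100:5.1f}% of sRGB's area")
```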

In my experience, a decent HDR monitor should be able to hit a peak brightness of between 600 and 1,000 nits and cover at least 95% of the P3 or Adobe RGB color gamut. (The Windows desktop itself looks terrible in HDR mode because the operating system’s interface elements were built around the sRGB-only gamut and lower brightness levels, and the math that maps them into HDR doesn’t flatter them.)

Backlight type

Every screen technology except OLED creates an image by passing light from a backlight through a stack of liquid crystal layers and color filters; OLED pixels emit their own light. Most backlit panels show some anomalies, most notably so-called “backlight bleed,” which is really an artifact of edge illumination and appears as light leaking around the edges of a dark screen.

Mini LED, a newer backlighting method that’s excellent for HDR, lets a monitor use local dimming, as TVs do, to produce high brightness with minimal bleed and fewer bright halos where bright content sits next to dark areas; the brighter the display, the more obvious that unwanted glow tends to be. The latest generation of HDR displays, with brightness of 1,000 nits or higher, uses mini LED, and as with TVs, more local dimming zones are better.

All those LEDs can generate quite a bit of heat, however. One trend has been to reduce the number of zones compared with the first mini LED monitors: some 2022 monitors contain only half the zones of the original 1,152-zone models.
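A quick bit of arithmetic shows why the zone count matters. Assuming a 4K panel for illustration, even 1,152 zones is coarse relative to the pixel grid, and halving the zone count doubles the area each zone controls:

```python
# How many pixels each local-dimming zone covers on a 4K panel (assumed
# resolution for illustration). Fewer zones means each zone is larger, so
# halos around small bright objects become more noticeable.

PIXELS_4K = 3840 * 2160  # 8,294,400 pixels

for zones in (1152, 576):  # original mini LED monitors vs. halved-zone models
    print(f"{zones:4d} zones -> ~{PIXELS_4K // zones:,} pixels per zone")

# 1152 zones -> ~7,200 pixels per zone
#  576 zones -> ~14,400 pixels per zone
```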

Samsung’s QD-OLED displays are a newer innovation that pairs Quantum Dot color technology with a blue OLED emissive layer, delivering great contrast and fast response times while the Quantum Dot layer renders a wide range of colors. The Alienware 34 QD-OLED is the first monitor to ship with the panel. The AW34 straddles the brightness range by offering both a more constrained 1,000-nit mode and a regular 400-nit HDR mode, which is better than it sounds thanks to the contrast provided by its nearly perfect blacks. It gives off some heat, but nowhere near as much as conventional 1,000-nit-plus monitors.

Since prices climb as brightness climbs, 400-nit panels are particularly attractive to both buyers and sellers. Adding features like a high refresh rate for gaming pushes the cost up further.