RTX HDR Tested – The Ultimate Game Changer





How Does it Work?

So, the big question: how does RTX HDR work? Despite what the name might suggest, it doesn’t actually use ray tracing. Instead, it leverages machine learning models trained on vast amounts of SDR and HDR game data. The process starts by feeding the SDR game output into a model, which expands the colour space of the original SDR content based on its training. The result is a game that appears to have HDR-like qualities, with an expanded colour range and contrast.
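To make that a bit more concrete, here’s a very rough sketch of what a hand-written SDR-to-HDR expansion (often called inverse tone mapping) looks like. This is purely illustrative Python, not NVIDIA’s implementation: RTX HDR replaces a fixed curve like this with a learned model running on the Tensor Cores.

```python
import numpy as np

def expand_sdr_to_hdr(sdr_rgb: np.ndarray, peak_nits: float = 1000.0) -> np.ndarray:
    """Toy inverse tone map: stretch an SDR frame towards an HDR brightness range.

    sdr_rgb is a gamma-encoded frame with values in [0, 1]; the return value is
    an approximate luminance per channel in nits, up to the display's peak.
    """
    # Undo the display gamma to get roughly linear light (sRGB approximated as 2.2).
    linear = np.clip(sdr_rgb, 0.0, 1.0) ** 2.2

    # Treat SDR reference white as ~200 nits, then boost highlights progressively
    # harder the brighter they are, so only near-white pixels reach peak_nits.
    sdr_white_nits = 200.0
    highlight_boost = (peak_nits / sdr_white_nits) ** linear
    return linear * sdr_white_nits * highlight_boost

# Example: a random stand-in for a 1080p SDR game frame.
frame = np.random.rand(1080, 1920, 3)
hdr_frame = expand_sdr_to_hdr(frame)   # values now span roughly 0-1000 nits
```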

The machine learning model at the heart of RTX HDR has been trained on a wide variety of game scenes, allowing it to make intelligent decisions about how to enhance different types of environments. For example, it might handle a bright, colourful outdoor level differently from a dark, moody dungeon. This approach allows for more natural-looking results compared to simpler SDR-to-HDR conversion methods.
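As a loose analogy for that scene-aware behaviour, the toy snippet below adjusts how aggressively highlights are expanded based on a frame’s average brightness. Again, this is a hand-written heuristic standing in for what the trained model infers, not how RTX HDR actually makes its decisions.

```python
import numpy as np

def expansion_strength(sdr_frame: np.ndarray) -> float:
    """Pick an expansion strength (0-1) from simple scene statistics.

    A bright, colourful outdoor level can tolerate aggressive highlight
    expansion, while a dark, moody dungeon needs a gentler curve so shadow
    detail isn't crushed. RTX HDR's model makes far more nuanced judgements.
    """
    avg_luma = float(np.mean(sdr_frame))   # 0.0 = all black, 1.0 = all white
    if avg_luma > 0.6:
        return 1.0      # bright scene: expand highlights fully
    if avg_luma < 0.2:
        return 0.4      # dark scene: expand gently, protect shadow detail
    return 0.7          # middle ground
```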

This is where RTX HDR differs from something like Microsoft’s AutoHDR. While AutoHDR does a commendable job of adding HDR effects to SDR games, it’s still an approximation based on more general rules. RTX HDR, on the other hand, uses its AI training to potentially provide more nuanced and context-aware conversions.

One of the key benefits of using RTX HDR over AutoHDR is the level of control it offers. While AutoHDR is more of a one-size-fits-all solution, RTX HDR allows users to fine-tune various parameters to achieve their desired look. This can be particularly useful for enthusiasts who want to get the best possible results from their SDR games.
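To give a feel for that fine-tuning, here’s an illustrative set of per-game tuning knobs. The parameter names and ranges below are placeholders chosen for this example, not NVIDIA’s exact slider names or an actual API.

```python
from dataclasses import dataclass

@dataclass
class HDRTuning:
    """Hypothetical per-game RTX HDR-style tuning knobs (illustrative only)."""
    peak_brightness_nits: int = 1000   # match what your display can actually hit
    mid_gray_nits: float = 25.0        # anchors overall image brightness
    contrast: float = 1.0              # >1 deepens shadows and highlights
    saturation: float = 1.0            # colour intensity after the expansion

# The kind of per-game presets an enthusiast might settle on.
presets = {
    "bright_open_world": HDRTuning(peak_brightness_nits=1000, contrast=1.1),
    "dark_horror_game": HDRTuning(mid_gray_nits=15.0, contrast=0.9, saturation=0.95),
}
```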

Now, you might be wondering what you need to actually use RTX HDR. First and foremost, you’ll need an NVIDIA RTX series GPU from the RTX 20 series or newer, meaning RTX HDR works on all RTX 20, 30, and 40 series cards. The technology relies on the Tensor Cores found in these cards, which are specifically designed for this kind of AI workload. This means that older NVIDIA cards, like the GTX series, and GPUs from other manufacturers can’t take advantage of this feature, because they lack Tensor Cores. On the display side, you’ll want an HDR-capable monitor that supports HDR10, with a minimum of 1000 nits of peak brightness.
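If you want to sanity-check the GPU side of that requirement from a script, something like the snippet below works as a rough test. It simply parses the name reported by nvidia-smi and matches on “RTX”, which is a heuristic rather than a proper Tensor Core capability query, and it assumes nvidia-smi is installed and on your PATH.

```python
import subprocess

def has_rtx_gpu() -> bool:
    """Rough check for an RTX-class NVIDIA GPU via nvidia-smi."""
    try:
        out = subprocess.run(
            ["nvidia-smi", "--query-gpu=name", "--format=csv,noheader"],
            capture_output=True, text=True, check=True,
        ).stdout
    except (FileNotFoundError, subprocess.CalledProcessError):
        return False  # no NVIDIA driver or tooling found
    # GTX cards (e.g. GTX 1080) lack Tensor Cores; RTX 20/30/40 cards have them.
    return any("RTX" in line for line in out.splitlines())

print("RTX-class GPU detected:", has_rtx_gpu())
```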

On the software side, there are a few important requirements to keep in mind. You’ll need to be running Windows 11, as the feature isn’t available on earlier versions of Windows, and you’ll also need to disable AutoHDR in your Windows settings, as RTX HDR is designed to replace that functionality. It’s also worth noting that, at the moment, RTX HDR doesn’t work with multi-monitor configurations. However, NVIDIA has stated that support for multi-monitor setups is planned for the future. For now though, you’ll need to stick to a single display if you want to take advantage of this feature.
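The Windows 11 requirement is easy to verify from a script, since Windows 11 reports itself as major version 10 with build numbers of 22000 or higher. The check below is a small illustration of that; disabling AutoHDR itself still has to be done manually in the Windows HDR settings.

```python
import sys

def is_windows_11() -> bool:
    """Windows 11 identifies as major version 10 with build 22000 or newer."""
    if sys.platform != "win32":
        return False
    ver = sys.getwindowsversion()
    return ver.major == 10 and ver.build >= 22000

print("Running Windows 11:", is_windows_11())
```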
