AMD has just updated its drivers to support FreeSync, and Sapphire also demonstrated it during CeBIT 2015. This has once again sparked the debate between the free and open AMD standard and Nvidia's proprietary G-Sync technology: is Nvidia's solution worth the extra cost? Forbes recently had an interesting talk with Tom Petersen, Distinguished Engineer at Nvidia, about the situation and why G-Sync is better.
First of all, let's get one thing clear: whether you use G-Sync or FreeSync, you'll have a far better gaming experience than with neither. The improvements are big and you will notice them, but there has to be a reason for the $200-250 premium on G-Sync monitors, and according to Petersen there is.
The interview starts out as many do: someone dodging the actual question by pointing out a flaw in the way it's asked. The question is, of course, about the price premium on G-Sync-enabled monitors, and the answer boils down to people paying the price if it's worth it. True, but moot, as there is no alternative for owners of Nvidia graphics cards. But okay, let's put that aside for now. It costs what it costs, and the purchase is optional.
The reason Nvidia's G-Sync is better than AMD's FreeSync, according to Petersen, is simply that Nvidia controls both sides of the signal, not just the output. There are a lot of different panels on the market, and AMD 'speaks' with them at the driver level, while Nvidia tunes its module specifically for the monitor it's built into. FreeSync runs great while operating at the panel's sweet spot, but gets into trouble when the frame rate goes higher or lower; a problem Petersen says G-Sync doesn't have.
Tom Petersen: “There’s also a difference at the high frequency range. AMD really has 3 ranges of operation: in the zone, above the zone, and below the zone. When you’re above the zone they have a feature which I like (you can either leave V-Sync On or V-Sync Off), that we’re going to look at adding because some gamers may prefer that. The problem with high refresh rates is this thing called ghosting. You can actually see it with AMD’s own Windmill Demo or with our Pendulum Demo. Look at the trailing edge of those lines and you’ll see a secondary image following it.”
So the issue lies mainly in the transitions from high frame rates to low frame rates and back again, which result in both flicker and ghosting, although far from the extent seen without any adaptive sync. Another argument for G-Sync is GPU compatibility, reaching all the way back to the GTX 650 Ti.
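To make Petersen's "three ranges of operation" concrete, here is a minimal sketch in Python of how a display pipeline might treat a frame rate relative to a panel's variable-refresh window. The 40-144 Hz window and the vsync_above flag are assumptions for illustration only, not AMD's or Nvidia's actual figures or code.

```python
# Minimal sketch (assumed values, not vendor code): classifying a game's frame
# rate against a panel's variable-refresh window, i.e. "in the zone",
# "above the zone" and "below the zone".

PANEL_MIN_HZ = 40    # assumed lower bound of the panel's variable-refresh range
PANEL_MAX_HZ = 144   # assumed upper bound

def refresh_behaviour(fps: float, vsync_above: bool = True) -> str:
    """Describe what the display pipeline would do at a given frame rate."""
    if PANEL_MIN_HZ <= fps <= PANEL_MAX_HZ:
        # In the zone: the panel refreshes exactly when a new frame is ready.
        return f"adaptive refresh at {fps:.0f} Hz"
    if fps > PANEL_MAX_HZ:
        # Above the zone: either cap at the panel maximum (V-Sync on)
        # or push frames through uncapped and accept tearing (V-Sync off).
        return f"capped at {PANEL_MAX_HZ} Hz" if vsync_above else "uncapped, tearing possible"
    # Below the zone: the panel cannot hold a frame long enough on its own;
    # without extra handling this is where flicker and judder appear.
    return "below panel minimum, needs frame repetition (see the next sketch)"

if __name__ == "__main__":
    for fps in (240, 90, 30):
        print(f"{fps:>3} fps -> {refresh_behaviour(fps)}")
```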
Nvidia paid special attention to the low end of the refresh range, so when a game swings from 45 fps down to 25 fps and back during intense moments, the G-Sync module kicks in and helps deliver a smooth experience, even outside the panel's normal area of operation.
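A rough sketch of that idea, repeating a frame when the frame rate drops below the panel's minimum, could look like the following. This is purely illustrative and not Nvidia's firmware; the 40-144 Hz window is again an assumed example.

```python
# Minimal sketch of frame repetition at low frame rates. Not Nvidia's
# implementation; it only illustrates scanning out the previous frame again so
# the panel's effective refresh rate stays inside its supported window.
import math

PANEL_MIN_HZ = 40    # assumed lower bound of the panel's variable-refresh range
PANEL_MAX_HZ = 144   # assumed upper bound

def effective_refresh(fps: float) -> tuple[int, float]:
    """Return (scans per frame, resulting panel refresh rate) for a given fps."""
    if fps >= PANEL_MIN_HZ:
        return 1, fps                        # in range: scan out each frame once
    repeats = math.ceil(PANEL_MIN_HZ / fps)  # repeat until the refresh is back in range
    return repeats, fps * repeats

if __name__ == "__main__":
    # The 45 -> 25 fps swing mentioned above: 45 fps stays as-is,
    # while at 25 fps each frame is scanned twice, so the panel runs at 50 Hz.
    for fps in (45, 25):
        repeats, hz = effective_refresh(fps)
        print(f"{fps} fps -> each frame scanned {repeats}x, panel refreshes at {hz:.0f} Hz")
```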
The video above demonstrates AMD's FreeSync Windmill demo on different monitors. The stuttering is an artifact of the high-speed recording; it's the trailing lines, the ghosting, that you should pay attention to.
Thanks to Forbes for providing us with this information.