AMD has just updated its drivers to support FreeSync, and Sapphire also demonstrated it during CeBIT 2015. This has once again sparked the debate between AMD's free and open standard and Nvidia's proprietary G-Sync technology: is Nvidia's solution worth the extra cost? Forbes recently had an interesting talk with Tom Petersen, Distinguished Engineer at Nvidia, about the situation and why G-Sync is better.
First of all, let's get one thing clear: whether you use G-Sync or FreeSync, you'll have a far better gaming experience than with neither. The improvements are significant and you will notice them, but there has to be a reason for the $200-250 premium on G-Sync monitors, and according to Petersen there is.
The interview starts out as many do, with someone dodging the actual question by pointing out a flaw in the way it's asked. The question, of course, concerns the premium on G-Sync-enabled monitors, and Petersen's answer boils down to: people will pay the price if it's worth it. True, but moot, since there isn't any alternative for owners of Nvidia graphics cards. But okay, let's put that aside for now. It costs what it costs, and the purchase is optional.
According to Petersen, the reason Nvidia's G-Sync is better than AMD's FreeSync is simply that Nvidia controls both sides of the signal, not just the output. There are a lot of different panels on the market, and AMD talks to them at the driver level, while Nvidia tunes the G-Sync module specifically for the monitor it's built into. FreeSync runs great while operating in a panel's sweet spot, but gets into trouble when the frame rate goes higher or lower; a problem Petersen says G-Sync doesn't have.
Tom Petersen: “There’s also a difference at the high frequency range. AMD really has 3 ranges of operation: in the zone, above the zone, and below the zone. When you’re above the zone they have a feature which I like (you can either leave V-Sync On or V-Sync Off), that we’re going to look at adding because some gamers may prefer that. The problem with high refresh rates is this thing called ghosting. You can actually see it with AMD’s own Windmill Demo or with our Pendulum Demo. Look at the trailing edge of those lines and you’ll see a secondary image following it.”
So the issue lies mainly in the transition from high frame rates to low ones and back again, which can result in both flicker and ghosting – although far from the extent you would see without adaptive sync at all. Another argument for G-Sync is GPU compatibility, which reaches all the way back to the GTX 650 Ti.
Nvidia paid special attention to the low end of the refresh range, so as a game swings from 45fps down to 25fps and back during intense moments, the G-Sync module kicks in and helps deliver a smooth experience, even outside the panel's normal operating range.
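The article doesn't spell out how the module manages this, but the widely reported approach is frame repetition: when the game's frame rate falls below the panel's minimum refresh rate, each frame is shown more than once so the panel stays inside its supported window. The sketch below illustrates that idea in Python; the function name, panel figures, and logic are illustrative assumptions, not Nvidia's actual implementation.

```python
# Minimal sketch (not Nvidia's firmware): keep a panel inside its supported refresh
# window by repeating frames when the game's frame rate drops below the panel's
# minimum refresh rate. All names and numbers are illustrative assumptions.

def schedule_refresh(frame_rate_fps: float, panel_min_hz: float, panel_max_hz: float):
    """Return (refreshes_per_frame, effective_refresh_hz) for a given frame rate."""
    if frame_rate_fps >= panel_min_hz:
        # Inside the panel's window: one refresh per frame, clamped to the panel maximum.
        return 1, min(frame_rate_fps, panel_max_hz)

    # Below the window: show each frame multiple times so the panel still refreshes
    # at a rate it supports (e.g. 25 fps on a 30-144 Hz panel -> each frame shown twice at 50 Hz).
    repeats = 2
    while frame_rate_fps * repeats < panel_min_hz:
        repeats += 1
    return repeats, frame_rate_fps * repeats

# Example: a panel rated for 30-144 Hz while the game renders at 25 fps.
repeats, hz = schedule_refresh(25, 30, 144)
print(f"show each frame {repeats}x, panel refreshes at {hz:.0f} Hz")
```

Run as-is, the example reports that each 25fps frame would be displayed twice, keeping the hypothetical 30-144Hz panel at an effective 50Hz instead of forcing it below its rated minimum.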
The video above demonstrates AMD's FreeSync Windmill demo on different monitors. The stuttering is an artifact of the high-speed recording; it's the trailing lines, or ghosting, that you should pay attention to.
Thanks to Forbes for providing us with this information.