FreeSync Features

In many ways FreeSync and G-SYNC are comparable. Both refresh the display as soon as a new frame is available, at least within their normal range of refresh rates. There are differences in how this is accomplished, however.

G-SYNC uses a proprietary module that replaces the normal scaler hardware in a display. Beyond the added cost, this means that any company looking to make a G-SYNC display has to buy that module from NVIDIA. Of course, the reason NVIDIA went with a proprietary module is that adaptive sync didn’t exist when they started working on G-SYNC, so they had to create their own protocol. The G-SYNC module controls the core features of the display like the OSD, but it’s not as full-featured as a “normal” scaler.

In contrast, as part of the DisplayPort 1.2a standard, Adaptive Sync (which is what AMD uses to enable FreeSync) will likely become part of many future displays. The major scaler companies (Realtek, Novatek, and MStar) have all announced support for Adaptive Sync, and it appears most of the changes required to support the standard could be accomplished via firmware updates. That means even if a display vendor doesn’t have a vested interest in making a FreeSync branded display, we could see future displays that still work with FreeSync.

Having FreeSync integrated into most scalers has other benefits as well. All the normal OSD controls are available, and the displays can support multiple inputs – though FreeSync itself requires DisplayPort, as Adaptive Sync doesn’t work over DVI, HDMI, or VGA (D-SUB). AMD mentions in one of their slides that G-SYNC also lacks support for audio over DisplayPort, and the same slide brings up color processing, though this is somewhat misleading. NVIDIA's G-SYNC module supports color LUTs (Look Up Tables), but it doesn't offer multiple color presets like the "Warm, Cool, Movie, User, etc." modes that many displays have; NVIDIA states that the focus is on properly reproducing sRGB content, and so far the G-SYNC displays we've looked at have done quite well in this regard. We’ll look at the “Performance Penalty” aspect as well on the next page.
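
As a quick illustration of what a color LUT actually is (a minimal sketch: the 8-bit table size and the choice of the sRGB curve here are illustrative, and this is not how the G-SYNC module itself is implemented), a LUT is just a precomputed table mapping each input code value to an output value, applied per channel:

    def srgb_encode(linear):
        # Standard sRGB transfer function for a linear value in [0, 1].
        if linear <= 0.0031308:
            return 12.92 * linear
        return 1.055 * linear ** (1 / 2.4) - 0.055

    # Precompute once: 8-bit input code value -> 8-bit output code value.
    lut = [round(srgb_encode(i / 255) * 255) for i in range(256)]

    def apply_lut(r, g, b):
        # Applying the LUT costs one table lookup per channel per pixel.
        return lut[r], lut[g], lut[b]

    print(apply_lut(64, 128, 192))  # -> (137, 188, 225)

Because the mapping is precomputed, applying it is extremely cheap, which is why even modest scaler hardware can offer LUT-based color processing.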

One other feature that differentiates FreeSync from G-SYNC is how things are handled when the frame rate falls outside of the dynamic refresh range. With G-SYNC enabled, the system behaves as though VSYNC is enabled whenever frame rates are above or below the dynamic range; NVIDIA's goal was to have no tearing, ever. That means if you drop below 30 FPS you can get the stutter associated with VSYNC, while going above the display's 60Hz/144Hz maximum is simply not possible – the frame rate is capped. Admittedly, neither situation is a huge problem, but AMD provides an alternative with FreeSync.

Instead of always behaving as though VSYNC is on, FreeSync can revert to either VSYNC off or VSYNC on behavior when frame rates fall outside the supported range. With VSYNC off you could still get image tearing, but at higher frame rates there would be a reduction in input latency. Again, this isn't necessarily a big flaw with G-SYNC – and I’d assume NVIDIA could probably rework the drivers to change the behavior if needed – but having a choice is never a bad thing.
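
To make the two fallback behaviors concrete, here is a minimal sketch of the decision in Python. The refresh range, function name, and return strings are all hypothetical; the real logic lives in the driver and the display, not in application code:

    # Hypothetical dynamic refresh range; actual limits vary per display.
    VRR_MIN_HZ, VRR_MAX_HZ = 40, 144

    def present(frame_time_ms, vsync_fallback):
        # vsync_fallback=True mimics VSYNC on outside the range (G-SYNC's
        # behavior, and one of FreeSync's two options); False mimics
        # VSYNC off (tearing possible, but lower input latency).
        fps = 1000.0 / frame_time_ms
        if VRR_MIN_HZ <= fps <= VRR_MAX_HZ:
            return "refresh the display now (adaptive sync)"
        if vsync_fallback:
            # Above the range the frame rate is capped; below it, frames
            # repeat on a fixed cadence (VSYNC-style stutter).
            return "wait for the next scheduled refresh (no tearing)"
        return "scan out immediately, mid-refresh if needed (tearing possible)"

    for ft in (4.0, 10.0, 30.0):  # 250 FPS, 100 FPS, ~33 FPS
        print(round(1000 / ft), "FPS ->", present(ft, vsync_fallback=True))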

There’s another aspect to consider with FreeSync that might be interesting: as an open standard, it could potentially find its way into notebooks sooner than G-SYNC. We have yet to see any shipping G-SYNC enabled laptops, and it’s unlikely most notebook manufacturers would be willing to pay $200 or even $100 extra to get a G-SYNC module into a notebook; there's also the question of power requirements. Then again, earlier this year there was an inadvertent leak of some alpha drivers that allowed G-SYNC to function on the ASUS G751j notebook without a G-SYNC module, so it’s clear NVIDIA is investigating other options.

While NVIDIA may do G-SYNC without a module for notebooks, there are still other questions. With many notebooks using a form of dynamic switchable graphics (Optimus and Enduro), support for Adaptive Sync by the Intel processor graphics could certainly help. NVIDIA might work with Intel to make G-SYNC work (though it’s worth pointing out that the ASUS G751 doesn’t support Optimus, so it’s not a problem for that notebook), and AMD might be able to convince Intel to adopt DP Adaptive Sync, but to date neither has happened. There’s no clear direction yet, but there’s definitely a market for adaptive refresh in laptops, as many are unable to reach 60+ FPS at high-quality settings.

Comments

  • YukaKun - Thursday, March 19, 2015 - link

    Until we have video showing the two of them running side by side, we can't decide on a winner. There might be a lot of metrics for measuring "tearing", but this is not about "hard metrics", it's about how the bloody frame sequences look on your screen. Smooth or not.

    Cheers!
  • eddman - Thursday, March 19, 2015 - link

    The difference cannot be shown on video. How can a medium like video, which has a limited and constant frame rate, be used to demonstrate a dynamic, variable frame rate technology?

    This is one of those scenarios where you can experience it only on a real monitor.
  • Murloc - Thursday, March 19, 2015 - link

    putting it on video makes the comparison kinda useless.
  • invinciblegod - Thursday, March 19, 2015 - link

    I am one of those who switch every time I upgrade my GPU (which is every few years). Sometimes AMD is on top, while other times Nvidia is better. Now I must be locked into one forever or buy 6 monitors (3 for Eyefinity and 3 for Nvidia Surround)!
  • jackstar7 - Thursday, March 19, 2015 - link

    If they can put out a confirmed 1440p 21:9 w/Freesync they will get my money. The rumors around the Acer Predator are still just rumors. Please... someone... give me the goods!
  • Black Obsidian - Thursday, March 19, 2015 - link

    It's pretty likely that LG will do just that. They already make two 1440p 21:9 monitors, and since it sounds like FreeSync will be part of new scalers going forward, you can probably count on the next LG 1440p 21:9 picking up that ability.
  • xthetenth - Thursday, March 19, 2015 - link

    I'm right there with you. I'm already preparing to get the update on the LG 1440 21:9 and a 390X, because if the card is anything like the rumors, it's going to be fantastic, and after getting a 21:9 for work I can't make myself use any other resolution.
  • Black Obsidian - Thursday, March 19, 2015 - link

    Same deal here. If nVidia supported FreeSync and priced the Titan X (or impending 980 Ti) in a more sane manner I'd consider going that way because I have no great love for either company.

    But so long as they expect to limit my monitor choices to their price-inflated special options and pretend that $1K is a reasonable price for a flagship video card, they've lost my business to someone with neither of those hangups.
  • kickpuncher - Thursday, March 19, 2015 - link

    I have no experience with 144Hz screens. I've been waiting for FreeSync to come, but you're saying the difference is negligible with a static 144Hz monitor? Is that with any FPS, or does the FPS also have to be very high? (In regards to the 4th paragraph on the last page.) Thanks
  • JarredWalton - Thursday, March 19, 2015 - link

    I'd have to do more testing, but 144Hz redraws the display every 6.9ms compared to 60Hz redrawing every 16.7ms. With pixel response times often being around 5ms in the real world (not the marketing claims of 1ms), the "blur" between frames will hide some of the tearing. And then there's the fact that things won't change as much between frames that are 7ms apart compared to frames that are 17ms apart.

    Basically at 144Hz tearing can still be present, but it ends up being far less visible to the naked eye. Or at least that's my subjective experience using my 41-year-old eyes. :-)
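
For reference, the frame intervals quoted in that last comment fall straight out of the refresh rate; a trivial calculation, with nothing display-specific assumed:

    for hz in (60, 144):
        print(hz, "Hz ->", round(1000.0 / hz, 1), "ms per refresh")
    # 60 Hz -> 16.7 ms, 144 Hz -> 6.9 ms. With real-world pixel response
    # around 5 ms, a 6.9 ms interval leaves far less time for a torn
    # frame to remain visible before the next refresh overwrites it.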
