Introduction to FreeSync and Adaptive Sync

The first time anyone talked about adaptive refresh rates for monitors – specifically applying the technique to gaming – was when NVIDIA demoed G-SYNC back in October 2013. The idea seemed so logical that I had to wonder why no one had tried it before. Certainly there are hurdles to overcome: what to do when the frame rate is too low or too high, getting a panel that can handle adaptive refresh rates, and supporting the feature in the graphics drivers. Still, it was an idea that made a lot of sense.

The impetus behind adaptive refresh is to overcome the visual artifacts and stutter caused by the normal way of updating the screen. Briefly, the display is updated with new content from the graphics card at set intervals, typically 60 times per second. While that's fine for normal applications, in games there are often cases where a new frame isn't ready in time, causing a stall or stutter in rendering. Alternatively, the screen can be updated as soon as a new frame is ready, but that often results in tearing – where the top part of the screen shows the previous frame and the bottom part shows the next frame (or frames, in some cases).
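
To make the timing concrete, here's a minimal sketch in Python (the render times are made up and the function names are just for illustration) of why a fixed refresh rate stutters: under vsync, a frame that misses the ~16.7ms deadline of a 60Hz display is held until the next refresh, while an adaptive refresh display simply waits for the GPU.

```python
import math

REFRESH_INTERVAL_MS = 1000 / 60  # fixed 60Hz display refresh interval (~16.7ms)

def display_time_vsync(render_ms):
    """With vsync, the frame appears at the next refresh boundary."""
    return math.ceil(render_ms / REFRESH_INTERVAL_MS) * REFRESH_INTERVAL_MS

def display_time_adaptive(render_ms):
    """With adaptive refresh, the frame appears as soon as it's ready
    (within the panel's supported refresh range)."""
    return render_ms

for render_ms in (15.0, 18.0, 25.0):
    print(f"render {render_ms:4.1f}ms -> vsync {display_time_vsync(render_ms):4.1f}ms, "
          f"adaptive {display_time_adaptive(render_ms):4.1f}ms")
```

An 18ms frame, for example, misses one 16.7ms refresh and isn't shown until 33.3ms – exactly the stutter described above – while an adaptive refresh display would show it at 18ms.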

Neither input lag/stutter nor image tearing is desirable, so NVIDIA set about creating a solution: G-SYNC. Perhaps the most difficult aspect for NVIDIA wasn't creating the core technology but rather getting display partners to create and sell what would ultimately be a niche product – G-SYNC requires an NVIDIA GPU, which rules out a large chunk of the market. Not surprisingly, G-SYNC took some time to reach the market as a mature solution, with the first displays that supported the feature requiring modification by the end user.

Over the past year we’ve seen more G-SYNC displays ship that no longer require user modification, which is great, but pricing of the displays so far has been quite high. At present the least expensive G-SYNC displays are 1080p144 models that start at $450; similar displays without G-SYNC cost about $200 less. Higher spec displays like the 1440p144 ASUS ROG Swift cost $759 compared to other WQHD displays (albeit not 120/144Hz capable) that start at less than $400. And finally, 4Kp60 displays without G-SYNC cost $400-$500 whereas the 4Kp60 Acer XB280HK will set you back $750.

When AMD demonstrated their alternative adaptive refresh rate technology and cleverly called it FreeSync, it was a clear jab at the added cost of G-SYNC displays. As with G-SYNC, it has taken some time from the initial announcement to actual shipping hardware, but AMD worked with VESA to implement the underlying technology as an open standard – Adaptive Sync, now part of DisplayPort 1.2a – and AMD isn't collecting any royalties from it. That's the "Free" part of FreeSync, and while it doesn't necessarily guarantee that FreeSync-enabled displays will cost the same as non-FreeSync displays, the initial pricing looks quite promising.

There may be some additional costs associated with making a FreeSync display, though these mostly come from using higher quality components. The major scaler companies – Realtek, Novatek, and MStar – have all built support for FreeSync (DisplayPort Adaptive Sync) into their latest products, and since most displays require a scaler anyway there's no significant price increase. But if you compare a FreeSync 1440p144 display to a "normal" 1440p60 display of similar quality, the support for higher refresh rates inherently increases the price. So let's look at what's officially announced right now before we continue.

Comments

  • Midwayman - Thursday, March 19, 2015 - link

    If you have a display with backlight strobing (newest LightBoost, BenQ blur reduction, etc.) the difference is readily apparent. Motion clarity is way, way better than without. The issue is it's like a CRT, and strobing is annoying at low rates: 75Hz is about the absolute minimum, but 90Hz and above is better. I doubt any of the displays support strobing and adaptive sync at the same time currently, but when you can push the frames, it's totally worth it. The new BenQ mentioned in the article will do both, for example (maybe not at the same time). That way you can have adaptive sync for games with low FPS and strobing for games with high FPS.
  • darkfalz - Thursday, March 19, 2015 - link

    Games at 100+ FPS look much smoother. Think of it like perfect motion blur. If you can keep your game between 72-144 Hz it's gaming nirvana.
  • eddman - Thursday, March 19, 2015 - link

    I'm not a fan of closed, expensive solutions, but this hate towards g-sync that some here are showing is unwarranted.

    nvidia created g-sync at a time when no alternative existed, so they built it themselves, and it works. No one was/is forced to buy it.

    It was the only option and those who had a bit too much money or simply wanted the best no matter what, bought it. It was a niche market and nvidia knew it.

    IMO, their mistake was to make it a closed, proprietary solution.

    Those consumers who were patient can now enjoy a cheaper and, in certain aspects, better alternative.

    Now that DP adaptive-sync exists, nvidia will surely drop the g-sync hardware and introduce a DP-compatible software g-sync. I don't see anyone buying a hardware g-sync monitor anymore.
  • Murloc - Thursday, March 19, 2015 - link

    you don't understand the hate because you think nvidia will drop g-sync immediately.
    It's likely you're right, but it's not a given.
    Maybe it will be a while before the market forces nvidia to support adaptive sync.
  • MikeMurphy - Thursday, March 19, 2015 - link

    nVidia will protect manufacturers that invested resources into G-Sync. They will continue supporting G-Sync and later introduce support for FreeSync.
  • ddarko - Thursday, March 19, 2015 - link

    The fact that only AMD cards work with FreeSync now is not because FreeSync is closed but because Nvidia refuses to support it. It takes a perverse kind of Alice in Wonderland logic to use the refusal of a certain company to support an open standard in its hardware as proof that the open standard is in fact "closed."

    FreeSync is open because it is part of the open DisplayPort standard, and any display or GPU maker can take advantage of it by supporting the relevant DisplayPort specification, which is free to use. Nvidia's G-Sync is "closed" because Nvidia decides who gets to support it, and on what terms.

    Whatever the respective technical merits of FreeSync and G-Sync, please stop trying to muddy the water with sophistry about open and closed. Nvidia GPUs could work with FreeSync monitors tomorrow if Nvidia wanted it – enabling FreeSync support wouldn't cost Nvidia a dime in licensing fees or require the permission of AMD or anyone else. The fact that they choose not to support it is irrelevant to the definition of DisplayPort 1.2a (of which FreeSync is a part) as an open standard.
  • mrcaffeinex - Thursday, March 19, 2015 - link

    Are NVIDIA's partners able to modify their cards' BIOS and/or provide customized drivers to support FreeSync, or do they have to rely on NVIDIA to adopt the feature? I know different manufacturers have made custom cards in the past with different port layouts and such. I never investigated whether those required a custom driver from the manufacturer, though. Is it possible that this is an obstacle that EVGA, ASUS, MSI, etc. could overcome on their own?
  • JarredWalton - Thursday, March 19, 2015 - link

    It would at the very least require driver level modifications, which the card manufacturers wouldn't be able to provide.
  • chizow - Thursday, March 19, 2015 - link

    How is this even remotely a fact when AMD themselves have said Nvidia can't support FreeSync, and even many of AMD's own cards in relevant generations can't support it? Certainly Nvidia has said they have no intention of supporting it, but there's also the possibility AMD is right and Nvidia can't support it.

    So in the end, you have two effectively closed and proprietary systems, one designed by AMD, one designed by Nvidia.
  • iniudan - Thursday, March 19, 2015 - link

    Nvidia cannot use FreeSync, as it is AMD's implementation of VESA's Adaptive Sync; they would have to come up with their own implementation of the specification.
