We met with AMD and among other things, one item they wanted to show us was the essentially final versions of several upcoming FreeSync displays. Overall, AMD and their partners are still on target to launch FreeSync displays this quarter, with AMD telling us that as many as 11 displays could hit the market before the end of March. For CES, AMD had several displays running, including a 28” 60Hz 4K display from Samsung, a 27” 144Hz QHD display from BenQ, and a 75Hz 2560x1080 34” display from LG. The three displays were each driven by a different GPU: an R9 285 for the BenQ, an R9 290X for the Samsung, and an A10-7850K APU for the LG UltraWide.

More important than the displays and hardware powering them is the fact that FreeSync worked just as you’d expect. AMD had several demos running, including a tearing test demo with a large vertical block of red moving across the display, and a greatly enhanced version of their earlier windmill demo. We could enable/disable FreeSync and V-SYNC, set the target rendering speed from 40 to 55 Hz in 5Hz increments, or set it to vary (sweep) over time between 40 Hz and 55 Hz. The Samsung display meanwhile was even able to show the current refresh rate in its OSD, and with FreeSync enabled we could watch the fluctuations, as can be seen here. [Update: Video of demo has been added below.]

Having seen and used G-SYNC, there was nothing particularly new being demonstrated here, but it is proof that AMD’s FreeSync solution is ready and delivering on all of AMD's feature goals, and it should be available in the next few months. Meanwhile AMD also took a moment to briefly address the issue of minimum framerates and pixel decay over time, stating that the minimum refresh rate will vary on a per-monitor basis, depending on how quickly that monitor's pixels decay. The most common outcome is that some displays will have a minimum refresh rate of 30Hz (33.3ms), while others with faster-decaying pixels will have a 40Hz (25ms) minimum.
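The frame times quoted in parentheses follow directly from the refresh rates. As a quick sketch (assuming nothing beyond the standard 1000 ms / Hz conversion):

```python
# Frame time in milliseconds for a given refresh rate in Hz: t = 1000 / f.
def frame_time_ms(refresh_hz: float) -> float:
    return 1000.0 / refresh_hz

# The two minimum refresh rates mentioned above:
print(round(frame_time_ms(30), 1))  # 33.3
print(round(frame_time_ms(40), 1))  # 25.0
```

In other words, the lower a panel's minimum refresh rate, the longer it can hold a frame before the pixels decay and the display must refresh anyway.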

On the retail front, what remains to be seen now is just how much more FreeSync displays will cost on average compared to non-FreeSync displays. FreeSync is royalty free, but that doesn’t mean that there are not additional costs involved with creating a display that works with FreeSync. There’s a need for better panels and other components which will obviously increase the BoM (Bill of Materials), which will be passed on to the consumers.

Perhaps the bigger question though will be how much FreeSync displays end up costing compared to G-SYNC equivalents, as well as whether Intel and others will support the standard. Meanwhile if FreeSync does gain traction, it will also be interesting to see if NVIDIA begins supporting FreeSync, or if they will remain committed to G-SYNC. Anyway, we should start to see shipping hardware in the near future, and we’ll get answers to many of the remaining questions over the coming year.

  • chizow - Friday, January 9, 2015 - link

    @tuxroller, yes I fully understand this, but it is clear Creig is quoting this dated spec info in a misleading attempt to back his point that FreeSync is superior to G-Sync, when in reality FreeSync's supported frequency bands are in fact worse. What is the purpose of this if not to mislead and misinform? It's a disservice to everyone involved and interested in these products to try and claim this as an advantage when, in actual implementation, the displays are not capable of these low refresh rates, nor are desktop (or even HDTV) displays capable of driving up to 240Hz.

    Would it not be misleading to say "Hey, this product is better because it is capable of time travel.*"

    *Time travel not available anytime soon, if ever.
  • Creig - Friday, January 9, 2015 - link

    @chizow:

    First of all, there is nothing at all wrong with the specs I listed. According to the information that has been released, FreeSync is capable of operating anywhere from 9Hz to 240Hz. It does so in ranges.

    9Hz - 60Hz
    17Hz - 120Hz
    21Hz - 144Hz
    36Hz - 240Hz

    As I highly doubt that any one panel out there will be capable of operating in the full range of 9Hz - 240Hz, I don't see what the problem is. The monitor manufacturer simply chooses which range will cover the specs of the panel they intend to produce. The fact that no panels out there today can go as low as 9Hz or as high as 240Hz yet is irrelevant. FreeSync will be ready for them if and when they eventually make it to market. Your "issue" is a non-issue.
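    The range selection described here can be sketched as a simple lookup. The four ranges are the ones listed above; the panel numbers and the helper name are illustrative assumptions, not anything AMD or VESA has published:

```python
# The four Adaptive-Sync operating ranges quoted above, as (min_hz, max_hz).
RANGES = [(9, 60), (17, 120), (21, 144), (36, 240)]

def covering_ranges(panel_min, panel_max):
    """Return the listed ranges that fully cover a panel's refresh window
    (hypothetical helper for illustration)."""
    return [(lo, hi) for lo, hi in RANGES
            if lo <= panel_min and panel_max <= hi]

# A hypothetical 40-144Hz panel fits inside two of the listed ranges:
print(covering_ranges(40, 144))  # [(21, 144), (36, 240)]
```

    The point being: a manufacturer just picks whichever range brackets the panel they intend to ship.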

    From your quote: "there MIGHT be existing monitors on the market", "PERHAPS possible via a firmware update". See the words "MIGHT" and "PERHAPS"? He didn't say "WILL". Monitor firmware updates are beyond AMD's ability to control. It was obvious that the monitor they used was capable of FreeSync with an updated firmware, but it is up to the manufacturer to offer the update, not AMD. You may as well blame the weather on the television forecasters while you're at it. It makes just about as much sense as what you just said.

    Only AMD will support "FreeSync" because "FreeSync" is simply AMD's implementation of Adaptive-Sync. As far as other companies such as Intel, they are free to develop their own version of FreeSync because Adaptive-Sync is a VESA industry standard. The VESA board evidently considered the spec to be of great enough benefit to include it in their latest version. So it's not only AMD who found merit in its design. Intel has had eDP for a couple of years now so it's entirely possible that they already have Adaptive-Sync capability built into their shipping products. If they don't already possess it, I can't see why they wouldn't want to include it in the future. It's an industry standard spec, there are no licensing costs and it gives the end user an overall better experience.

    I was pulling from memory the GPUs that will be FreeSync capable. I thought I read somewhere that Tahiti will have partial FreeSync support in that they will handle the video playback aspect, but not the 3D rendering. I'll see if I can find that info. And even if it turns out that it's only Hawaii and newer, what of it? There will always be new technology that isn't compatible with old hardware. There has to be a cutoff line somewhere. Are you raging because there are no more AGP slots on motherboards? Are you upset because you can't find a new laptop that comes with a floppy drive? Newer technology isn't always compatible with older technology. Does that mean we should simply stop innovating?

    Both websites of PCPER and Blur Busters have personally been to the AMD booth at CES and both websites have reported no visual difference between FreeSync and G-Sync. Obviously we'll have to wait for official reviews to get the final word, but I fail to see why you are still trying to claim that FreeSync is inferior to G-sync in nearly every way when people who have actually seen both in operation are saying otherwise.

    Really chizow, you might be taken a bit more seriously around here if you would simply tone down your pro-nvidia "RAH RAH RAH" eight or nine levels.
  • chizow - Friday, January 9, 2015 - link

    @Creig.

    So you can admit, that because the frequency range FreeSync monitor mfgs chose to support are inferior to what G-Sync supports, FreeSync is an inferior solution to G-Sync correct? Because I would hate for someone to get the impression FreeSync is better than G-Sync based on some dated specs you pulled off an AMD whitepaper when in reality, there are no monitors on the market that support anything close to these ranges on either the top or bottom end. Just making sure. :)

    So after that quote about FreeSync being free, and after we have seen there were in fact no displays on the market that could just support FreeSync with a firmware update, for free, you can admit what AMD said was misleading, and that their entire naming structure is really just a misnomer. Do all the people AMD misled with their claims deserve an apology, in your opinion? Do you think AMD should have made these claims without first verifying any of it being true, first? Just wondering. :)

    LOL gotta love the "It is open but they can't use AMD's implementation, but as usual, anyone is free to develop" take you are heading towards here with FreeSync. Keep toeing that company line though! I know that is the favored mantra for AMD and their fanboys when they develop something proprietary under the guise of "Openness" just so the dim-witted and misinformed can parrot it and find no fault with AMD. Just like Mantle right? Intel, Nvidia and anyone else are free to develop their own Mantle implementation? :D Wonder who spread that bit of noise/FUD around...

    But yeah you were wrong about Tahiti, so you really should be more careful when referencing major points if you are going to try and counter my point, which in this case, was market share. You are now of course trying to downplay the fact that only a tiny fraction of AMD cards even support FreeSync, which are only a minority share of the dGPU market to begin with, but it reinforces my point that FreeSync is the technology that has the huge uphill battle because so few GPUs can even make use of it. It's also quite funny that you are now trying to downplay the importance of hardware install-base. Who is going on about floppy drives and AGP slots? We are talking about relevant DX11 hardware from the last 3 years, most of which can't support AMD's own FreeSync standards. If install-base and legacy support aren't important, what chances would you give FreeSync to succeed if they started the ticker at 1 starting with their next 14 or 20nm GPU, against the tens of millions of Kepler and newer GPUs that support G-Sync? You don't think the fact many AMD users will have to upgrade both their GPU *AND* their monitor will be a barrier to entry, and an additional cost of adoption for FreeSync? Maybe it's time to reassess the fees attached and total cost of ownership once you factor in a new AMD GPU too?

    And reported no visual difference? Wrong, they reported no visual difference until the demos went out of supported frequency band, at which point, everything fell apart. This cannot happen with G-Sync, by design. Also, visual difference in the form of tearing and stutter was only part of the equation and problem with Vsync that was solved by G-Sync; the other half of the equation was input lag/latency, which we have no insight on because the demos weren't truly interactive. But again, the impressions and various screenshots indicate AMD's solution is still tied to V-Sync, so there is a strong possibility they were not able to resolve this input lag as G-Sync does.

    And tone down the RAH RAH tone? Hahah, that's funny from the guy who is now forced to re-scream all the nonsense of the past 12 months from the rooftops, but I fully understand, you backed yourself into this position long ago when you referenced and gave credence to all the nonsense AMD claimed about FreeSync that ultimately ended up being BS.
  • Will Robinson - Sunday, January 11, 2015 - link

    I'll enjoy watching you eat humble pie over this.
    Be sure to be man enough to admit your rants were wrong and heckling everyone over their rebuttals was juvenile.
  • chizow - Sunday, January 11, 2015 - link

    @Will Robinson,

    Are you enjoying that humble pie defending all the FUD/misinformation AMD said about Not-So-FreeSync before they actually did the work and productized it? Certainly you are "man enough" to admit much of what AMD said over the past year regarding FreeSync was in fact misleading?
  • FlushedBubblyJock - Tuesday, February 24, 2015 - link

    I hope to soon SUE the lying AMD company to THE HILT OF THEIR EMPTY BANK ACCOUNT - because they have lied to me about Freesync and my Hawaii core AMD cpu !
    Yes, it has 4GB of ram, but when the screen is shredding and tearing apart - what good is it ?!
  • Intel999 - Friday, January 9, 2015 - link

    What about future hardware? Users have the choice to purchase a GPU that supports monitors $150-$200 cheaper than a GPU that requires a more expensive monitor to get similar performance. Only hardcore team green loyalists will choose the latter. AMD will hold onto their loyalists. And those with common sense who go back and forth between green and red will have one more reason to go to AMD. Especially in the back half of this year, where it appears team Red will have the best performing cards for at least six months. How Red prices the new GPUs should dictate the success of market share gains. I suspect a $125 premium to Nvidia cards will be the case.
  • chizow - Sunday, January 11, 2015 - link

    @Intel999

    Or more likely, the 70% of the market that already owns an Nvidia dGPU from the last 3 years (Kepler launched in Mar 2012) can just buy a new G-Sync monitor, the same set of users that already saw a benefit from Nvidia products independent of the relatively new innovation of G-Sync.

    But yes, if both Nvidia/AMD hold onto their "loyalists" or repeat buyers, you can already see, AMD is going to run up against a huge uphill battle where they control an extremely minor share of the dGPU market (desktop and mobile) at a ~70/30 clip. What's the point of referencing future GPUs of unknown commodity at this point? You don't think Nvidia is going to release another high-end GPU to combat AMD's next offering?

    And you can't say for sure these users will have 1 more reason to go AMD, because there is a premium and value for better technology. G-Sync may just be better than FreeSync, and while we don't know this for sure right now, we do know G-Sync does everything Nvidia said it would for over a year, which has held up against the test of time from both reviewers and consumers alike.

    We simply can't say the same about FreeSync right now, can we?
  • medi03 - Sunday, January 11, 2015 - link

    70% eh?

    http://www.anandtech.com/show/8446/the-state-of-pc...
  • chizow - Sunday, January 11, 2015 - link

    @medi03, actually 70% would be a generous number for AMD GPUs in this discussion, because again, Nvidia supports G-Sync with all Kepler and Maxwell-based GPUs, which goes back to March 2012. AMD GPUs that can support FreeSync are far fewer, with only GPUs based on Hawaii, Tonga, and Bonaire and any APU from Kaveri onwards.

    While AMD has stated all new GPUs based on new ASICs will support FreeSync, the most recent market data shows they are getting destroyed in the marketplace, which is no surprise given the reception Maxwell has received:

    Source: Jon Peddie Research
    http://jonpeddie.com/publications/add-in-board-rep...

    "Nvidia continues to hold a dominant market share position at 72%."
    http://jonpeddie.com/images/uploads/publications/A...

    That was only with 1 month of sales for the new Maxwell cards, market reports are expecting similar, if not more pronounced results in favor of Nvidia for Q4, but I wouldn't be surprised to see a slight decline with AMD's price cuts as 72% is REALLY hard to improve upon.
