We met with AMD, and among other things they wanted to show us essentially final versions of several upcoming FreeSync displays. Overall, AMD and their partners are still on target to launch FreeSync displays this quarter, with AMD telling us that as many as 11 displays could hit the market before the end of March. For CES, AMD had several displays running, including a 28” 60Hz 4K display from Samsung, a 27” 144Hz QHD display from BenQ, and a 34” 75Hz 2560x1080 display from LG. The three displays were each driven by a different GPU: an R9 285 for the BenQ, an R9 290X for the Samsung, and an A10-7850K APU for the LG UltraWide.

More important than the displays and the hardware powering them is the fact that FreeSync worked just as you’d expect. AMD had several demos running, including a tearing test with a large vertical block of red moving across the display, and a greatly enhanced version of their earlier windmill demo. We could enable/disable FreeSync and V-SYNC, set the target rendering speed from 40 to 55 Hz in 5Hz increments, or have it vary (sweep) over time between 40 Hz and 55 Hz. The Samsung display, meanwhile, was even able to show the current refresh rate in its OSD, and with FreeSync enabled we could watch the fluctuations, as can be seen here. [Update: Video of demo has been added below.]
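The behavior on display is easy to model. The following sketch is a toy model of frame presentation, not AMD's actual scheduling logic; the function names and the 40/144 Hz bounds are purely illustrative. It shows why a variable refresh display avoids the latency penalty of a missed vblank:

```python
import math

def vsync_present(finish_ms, refresh_hz=60.0):
    """With fixed-refresh V-SYNC, a finished frame waits for the next vblank."""
    interval = 1000.0 / refresh_hz
    return math.ceil(finish_ms / interval) * interval

def adaptive_present(finish_ms, last_present_ms, min_hz=40.0, max_hz=144.0):
    """With adaptive sync, the frame is shown as soon as the panel allows:
    no sooner than the max-refresh interval after the previous present,
    and no later than the min-refresh interval (the panel's decay limit)."""
    earliest = last_present_ms + 1000.0 / max_hz
    latest = last_present_ms + 1000.0 / min_hz
    return min(max(finish_ms, earliest), latest)

# A frame finishing at 17 ms just misses the 16.67 ms vblank and is held
# until 33.3 ms at a fixed 60 Hz, but appears at 17 ms with adaptive sync.
print(vsync_present(17.0))                          # ~33.33
print(adaptive_present(17.0, last_present_ms=0.0))  # 17.0
```

That forced wait for the next scheduled vblank is exactly the stutter the tearing and windmill demos make visible when the dynamic refresh modes are toggled off.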

Having seen and used G-SYNC, there was nothing particularly new being demonstrated here, but it is proof that AMD’s FreeSync solution is ready, delivering on all of AMD's feature goals, and it should be available in the next few months. Meanwhile, AMD also took a moment to briefly address the issue of minimum framerates and pixel decay over time, stating that the minimum refresh rate will be set on a per-monitor basis, depending on how quickly that panel's pixels decay. In practice, some displays will have a minimum refresh rate of 30Hz (a 33.3ms frame time) while others with faster-decaying pixels will have a 40Hz (25ms) minimum.
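As a quick sanity check on those numbers (a trivial illustration, not anything AMD provided), the frame times in parentheses follow directly from frame time = 1000 ms ÷ refresh rate:

```python
def frame_time_ms(refresh_hz):
    """Maximum time a panel holds a frame at a given minimum refresh rate."""
    return 1000.0 / refresh_hz

print(frame_time_ms(30))  # ~33.33 ms, for slower-decaying panels
print(frame_time_ms(40))  # 25.0 ms, for faster-decaying panels
```

A panel whose pixels decay faster cannot hold a frame as long, so its maximum frame time is shorter and its minimum refresh rate correspondingly higher.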

On the retail front, what remains to be seen is just how much more FreeSync displays will cost on average compared to non-FreeSync displays. FreeSync is royalty free, but that doesn’t mean there are no additional costs involved in creating a display that works with FreeSync. There’s a need for better panels and other components, which will obviously increase the bill of materials (BoM), a cost that will be passed on to consumers.

Perhaps the bigger question though will be how much FreeSync displays end up costing compared to G-SYNC equivalents, as well as whether Intel and others will support the standard. Meanwhile if FreeSync does gain traction, it will also be interesting to see if NVIDIA begins supporting FreeSync, or if they will remain committed to G-SYNC. Anyway, we should start to see shipping hardware in the near future, and we’ll get answers to many of the remaining questions over the coming year.


  • chizow - Friday, January 9, 2015 - link

    Interesting take on this Jarred.

    So posting pro-AMD diatribes and throwing out misinformation doesn't create noise? That's an interesting view on things, since we ARE STILL trying to filter out all the noise AMD threw out there for the past 12 months, perpetuated and regurgitated by their pro-AMD fanboys. Shall we recount?

    1) Koduri telling everyone at CES last year FreeSync would "effectively be free" and might even be supported on existing monitors with just a firmware update. Only much later do we see them reverse course and say they were only referencing "royalties" that no one, including Nvidia, has ever confirmed, existing. Clearly this lie has legs, because there are STILL tech sites perpetuating this myth even to this day!

    2) Koduri and AMD saying FreeSync wouldn't need additional hardware, that they could work with existing DP specs that supported eDP. Months later, after actually doing the work, pushing a spec, and working with scaler makers, we see these monitors will in fact need more expensive scalers, and the Free in FreeSync is no longer "essentially free", it's "royalty free".

    3) Koduri telling everyone that Nvidia needed expensive hardware because the display controllers in their GPUs couldn't handle Adaptive Sync as well as their own display controllers. Yet months later, we find that even many of AMD's own display controllers weren't quote AMDawesome enough to handle this!

    4) AMD saying FreeSync would support 9-240Hz bands, spurring fanboys like Creig to repeatedly quote this misinformation even in light of the fact FreeSync in actual pre-production samples is only supporting 40-60Hz and 30-144Hz in the models on display.

    5) AMD claiming G-Sync will die because it is proprietary, while FreeSync is open and "royalty free", when in reality, AMD is the only one supporting FreeSync but with a much smaller share of the total addressable market for these products than Nvidia, with Nvidia having shipped and sold their product for close to a year already. Oh, and we STILL don't know how much "Free'er" these monitors will be, do we?

    So now let's compare that with G-Sync and the mysteries surrounding its launch:

    1) Nvidia announces, demos, and launches G-Sync live for all the world to see and ships it in actual product 2 months later, and it does everything they said it does, from Day 1.

    So again Jarred, where is all the noise coming from, again? :) It seems to me, if AMD said nothing at all about FreeSync until it was actually done, there wouldn't be all the noise surrounding it.
  • Creig - Friday, January 9, 2015 - link

    1) "MIGHT" be supported on existing monitors, not "WILL" be supported. It's up to the monitor manufacturer to release a firmware update to support FreeSync, not AMD. Therefore, AMD did not lie.

    2) FreeSync was already shown to work on laptops with the required specs without any additional hardware. However, laptop displays are not desktop displays. It is necessary to have a panel that is capable of variable refresh rates. This is why we have had to wait for monitor manufacturers to produce desktop displays with the VRR ability of certain laptop displays. Therefore, AMD did not lie.

    3) AMD certainly does have GPUs that are Adaptive-Sync capable while Nvidia does not. Therefore, AMD did not lie.

    4) FreeSync reportedly can operate anywhere from 9Hz to 240Hz. Just because current panels cannot go that high or that low does not mean that FreeSync is not capable of operating at those frequencies. Therefore, AMD did not lie.

    5) Whether or not G-sync will die remains to be seen. It is a fact, however, that AMD charges no royalties connected with FreeSync while it is reported that Nvidia does collect fees for every G-sync monitor sold.

    The noise, chizow, is coming from you distorting the facts to fit your twisted view of FreeSync.
  • Creig - Friday, January 9, 2015 - link

    Just to clarify #2). It appears that the only difference between a FreeSync capable monitor and a non-FreeSync capable monitor is the scaler. Most current scalers are v1.2 and FreeSync requires 1.2a. As old monitors get replaced with new versions, it will be a simple and inexpensive matter for manufacturers to update them with DP1.2a or DP1.3 scalers which will make them FreeSync compatible and give them the necessary variable refresh rate capability.

    I am not claiming infallibility with the points I bring up. It's possible that I may make a mistake and state something that is in error. But I am trying to be as factual as possible.
  • chizow - Friday, January 9, 2015 - link

    See, this is a perfect example of how all the misinformation and FUD AMD has put out there over the last year regarding FreeSync, just dies hard. You now have all these half-truths, lies, myths and straight nonsense put out there from AMD, perpetuated by their fanboys who of course feel compelled to continue the myths, FUD, lies and half-truths simply because they backed themselves into these untenable positions months ago, coupled with the fact they simply can't acknowledge AMD lied or was spreading misinformation regarding FreeSync this whole time.

    1) Creig, you are simply trying to argue semantics here when it is obvious what AMD said regarding FreeSync being essentially free, or possibly being supported with just a firmware update, was misinformation, plain and simple. How you as an AMD fan and supporter aren't disappointed by this dishonesty is somewhat unsurprising, but to continue covering for them and thus perpetuating the lie is somewhat shocking. Is there a single monitor on the market that is upgradeable via just a firmware update, essentially free, that can support FreeSync? No, there is not; therefore, AMD lied, whether intentionally or not.

    2) No, FreeSync was not shown to work, unless you consider a fixed refresh demo of a windmill a working version of FreeSync. But thanks again for providing another example where AMD was less than honest and forthcoming about what they showed and what they said they demonstrated.

    3) Great! So I guess that debunks your claims that Nvidia is the one choosing not to support FreeSync, when in reality, it's certainly possible AMD designed a spec that only their hardware could support, knowing Nvidia and Intel GPUs could not. While FreeSync may not be proprietary in name, it certainly is in practice, is it not? Which brings us back to my original point: AMD is currently the only GPU vendor that supports FreeSync, just as Nvidia is the only GPU vendor that supports G-Sync, but of course, that also means Nvidia commands an overwhelming % of the TAM for these displays. The rest is just "noise".

    4) No, it just means G-Sync as currently implemented is BETTER than FreeSync, just as I originally stated. You can claim FreeSync can lower your mortgage on paper, but if it doesn't do it in reality, who gives a rat's ass? 9-240Hz on a piece of paper is just a way to deceive the ignorant and non-technical into thinking FreeSync is better because it "supports" a wider range of frequencies, when we see in reality, the supported band is MUCH smaller. Mission accomplished, it seems!

    5) Again, reported by whom? AMD? LOL. Again, the noise regarding royalties and fees has come from AMD and AMD only, but as we have seen, they have continually backed off this stance, saying there is now additional BoM cost due to better scalers and better displays capable of handling these refresh rates and LCD decay times. Yet, somehow, the $200 G-Sync premium for Nvidia's BoM, R&D and QA costs per board is unjustified??? And we STILL don't know how much more AMD's solution will cost, so again, why is AMD saying anything at all until they know for sure?

    And, your clarification is wrong too, there are more differences than just the scalers, the panels themselves are higher quality also, as they need to support lower decay times to address the minimum refresh rates. 4K, IPS and 120+Hz will also command premium panel prices.

    So yes, as usual, the noise originated from AMD and has been echoed and parroted by AMD and their fanboys like you, Creig. If AMD simply shut their mouths and waited til they introduced actual product this week at CES, you wouldn't feel the need for all this backpedaling and revisionist history to cover all the misinformation they've been spreading over the last 12 months, but thanks for proving my point with your elaborate attempt to cover-up all of AMD's missteps.
  • Creig - Friday, January 9, 2015 - link

    1) Are there monitors out there that can be made FreeSync compatible with nothing but a firmware flash? Yes. End of story.

    2) From what I understand, the laptop FreeSync demo was to showcase the fact that they could display a variable refresh rate. Full implementation of FreeSync requires dynamic variable refresh rates. The demo simply showed that FreeSync was possible, even if it didn't have all the features yet. Try to keep in mind that it was a demonstration of a beta work-in-progress just to show that FreeSync was possible.

    3) So AMD shouldn't have come out with FreeSync simply because Nvidia cards might not currently have the capability of utilizing it? And we don't know whether or not Intel currently has hardware that is Adaptive-Sync compatible. But since it's an industry standard now, Nvidia or Intel are free to incorporate it into their own hardware. That's more than can be said about G-sync.

    4) First you said AMD's support of 9Hz to 240Hz was a lie. Now you've admitted that it isn't a lie. It isn't AMD's fault that current monitors don't go that low or that high. But when they do, FreeSync will be ready to support them. How can you possibly twist that into a BAD thing?

    5) However you want to look at it, FreeSync will be cheaper than G-sync. It's an inescapable truth. FreeSync simply needs an updated scaler while G-sync requires the entire scaler to be replaced! And that replacement scaler has to be custom tuned to the panel in question. And don't forget about Nvidia's royalty fees. FreeSync will end up being cheaper than G-sync. No question about it.

    There is no backpedaling or revisionist history going on here. In fact, the only thing going on (and on and on and on) is you. I realize that you're upset that AMD appears to have beaten Nvidia at its own game and that the industry is excited about the forthcoming release of FreeSync. But no amount of ranting on your part is going to change that. So please just calm yourself down and try to stick to facts.
  • chizow - Saturday, January 10, 2015 - link

    LMAO, again, it is amazing you're not ashamed to continue perpetuating these lies and myths. And for what? To try and defend the accumulated lies and misinformation from AMD, or as Jarred would say, "noise" that has piled up over the last year regarding FreeSync?

    1) No, there are not any monitors that can be made FreeSync compatible with just a firmware flash. Will you personally guarantee this level of support out of pocket for anyone misled by this statement? Will AMD stand by this? Will the monitor mfg? No. No one wants to guarantee this because there ARE costs associated with "Free"Sync and no one is willing to guarantee this.

    2) Well you understood incorrectly because AMD *WAS* telling people this was FreeSync with dynamic refresh rates when in fact, it was not. I mean how can you even sit here and say this was a demonstration of FreeSync and a worthy analogue to G-Sync when it was not even close to feature complete, including missing the MOST important aspect which is actual DYNAMIC/ADAPTIVE frame rate adjustments. Only someone intent on deception or misinformation would even throw this out there as you did as a counterpoint to try and prove AMD had already shown working demos of FreeSync to try and back AMD's original lie about existing panels on the market being able to support FreeSync with just firmware updates. So again, now that FreeSync is complete, why can't these older panels just support a feature complete FreeSync with just a firmware update? Oh right, because they can't. They lack the necessary hardware, hardware which AMD originally claimed wasn't necessary. But AMD subsequently developed more advanced scalers because they found out you couldn't actually just upgrade to FreeSync for free, with just a firmware update. Conclusion: AMD lied and fed the public misinformation, whether intentional or not, and their fanboys like you CHOOSE to continue to perpetuate this "noise" rather than just admit AMD was wrong and move on.

    3) Who said anything of the sort? AMD is always free to develop whatever they like to improve their products for existing customers and to entice future customers, but what they shouldn't be doing is making MISLEADING statements about what their competitors can or cannot do. I mean it would be just as disingenuous as Nvidia saying, well AMD can support G-Sync at any time if they want to, they just have to invest a huge amount of R&D to ensure their display controllers work with a custom board and FPGA ASIC. And as for Intel, we do know they have shown no interest whatsoever in supporting FreeSync. We also don't know if they are even capable, again, given not even all AMD GPUs have the secret display controller sauce to support FreeSync. But neither are "free" to implement because again, this may take a considerable level of effort and R&D and in Intel's case, they may not even care. In Nvidia's case, why bother when they already have a better solution they brought to market before FreeSync? And again, who cares if its an open standard if only AMD supports it? I guess we can give it the same chances as other failed open standards, like HD-DVD? How'd that go?

    4) Where did I admit it wasn't a lie? LMAO. Quoting AMD's bullshit is one thing, but please don't misquote me, thanks. But back on topic, do any of the demonstrated displays support 9Hz on the low end in FreeSync mode, or 240Hz on the high end, in FreeSync mode? No, they do not. Continuing to quote this lie as a benefit in favor of FreeSync is dishonest, simple as that, but I fully expect AMD and their disingenuous fanboys like you Creig, to continue to perpetuate this lie for years to come, because even now that we have actual demonstrated FreeSync monitors, none of them support these ranges.

    5) No, it's not really an inescapable truth, Creig. If you are one of the >70% of dGPU owners that own an Nvidia Kepler or Maxwell based GPU, you have 2 options:

    a) Buy a G-Sync monitor, game happily on a solution that does everything it says it does.
    b) Buy an AMD GPU that supports FreeSync and buy a FreeSync monitor.

    Are you and AMD willing to back your claim that FreeSync is the cheaper solution out of your own pocket for this subset of users just to perpetuate a myth and your flawed analysis? And what royalty fees are you talking about again? The ones AMD associated to G-Sync? LOL. Again, more "noise" as Jarred would say.

    Hahah no revisionist history. That's classic, you've backpedaled on every point when it was shown AMD did NOT demonstrate or make good on what they said on the original lies, myths and FUD you continue to try and perpetuate even after we see FreeSync in its final form has moved beyond most of these lies. That folks, is "noise".

    And AMD has beaten Nvidia at its own game? LOL. Yes, once again getting ahead of ourselves aren't we Creig? Because AMD sure has a great track record when it comes to supporting their initiatives, how's that Open Mantle SDK coming along btw? But I am sure in a few months once FreeSync actually makes it to market, we can throw this up there as another not-quite-as-good reactionary half-baked solution from AMD in an attempt to match Nvidia, the industry leader that developed and introduced this tech:

    SLI > CF
    Shadowplay > some AMD junk in their bloated adware client
    DSR > VSR
    GPU Boost > Turbo Core
    CUDA > OpenCL
    3D Vision > HD3D
    PhysX > BulletPhysics
    Optimus > Enduro

    I'm sure there's more, but that's just a small sample of how AMD has "beaten Nvidia at its own game" in the past. Don't worry, there's plenty more room on the list for G-Sync > FreeSync, too! :D
  • Creig - Saturday, January 10, 2015 - link

    If there's a fanboy here, Chizow, it's you. I'm not going to bother with your rantings any longer as it's obvious that you refuse to acknowledge facts. As more and more FreeSync capable monitors hit the market, everybody will just laugh at you all the harder. So just keep on tilting at those AMD windmills. I'm sure you'll stop them in their tracks single-handed.
  • chizow - Sunday, January 11, 2015 - link

    Haha the difference Creig, is that I actually use their products because they are the best at satisfying my informed demands as a user, and I'm not willing to perjure myself to suit their agenda, as you CLEARLY are.

    What is your excuse again for buying inferior tech? Save a few bucks? What's your excuse for defending all of these accumulated lies and misinformation, ie. "noise"? I'm simply trying to set the record straight here, y'know, filter out all the "noise", because it is clear where that noise originated (AMD), and it is clear there are certain folks whose agenda is to perpetuate that noise in order to confuse the market or create a competitive advantage, distorting the reality that FreeSync is the one that faces the greater obstacles on the market, not G-Sync.

    But yes, until then, we will just continue enjoying our G-Sync monitors, laughing as FreeSync falls further and further away from AMD's original claims because we have the benefit and luxury of knowing G-Sync does everything it said it would and has been for close to a year!
  • Will Robinson - Tuesday, January 13, 2015 - link

    SLI is better than Crossfire?
    LOL...get current, dude. The XDMA bus is a far better and more elegant solution, as demonstrated in its total ownership of SLI for both frame rates and frame pacing.
    Take off the green goggles before you go blind.
  • chizow - Tuesday, January 13, 2015 - link

    LMAO, ah yes, frame pacing, the problem AMD fanboys like you spent months, years, downplaying and sweeping under the rug as if it didn't exist, right? FCAT was just an Nvidia viral marketing scam until it actually forced AMD to go back and fix their broken CF implementation YEARS later, right?

    But it's OK, I do recognize progress and superior tech when it is appropriate, and XDMA is certainly a better approach than Nvidia's aging SLI bridge.

    But unfortunately for you and AMD users, XDMA and hardware is only part of the problem, and they have only fixed part of their CF implementation by sorting out the frame pacing/microstutter problems AMD fanboys spent years downplaying.

    The biggest problem for AMD and their CF implementation is that the end-user is STILL bound to AMD's driver updates because they don't expose the compatibility bits in their CF profiles, as Nvidia has done for years. There are some half-baked workarounds that require you to copy profiles from other games, but this has the chance to break other features, like AA, because you don't have the granularity to change individual bit settings as you can with Nvidia profiles via a simple XML change using something like Nvidia Inspector.

    But yes, all in all, AMD fans can once again thank Nvidia and their supporters for bringing about positive change for your AMD products. Because we sure as hell know nothing would've gotten fixed on the CF front otherwise! Not surprising when you have a fan base that is content with mediocrity and would rather downplay and sweep a problem under the rug, rather than demand support and a fix for it!
