FreeSync Features

In many ways FreeSync and G-SYNC are comparable. Both refresh the display as soon as a new frame is available, at least within their normal range of refresh rates. There are differences in how this is accomplished, however.

G-SYNC uses a proprietary module that replaces the normal scaler hardware in a display. Beyond the cost factor, this means any company looking to make a G-SYNC display has to buy that module from NVIDIA. The reason NVIDIA went with a proprietary module is that Adaptive Sync didn’t exist when they started working on G-SYNC, so they had to create their own protocol. The G-SYNC module controls all the regular core features of the display, like the OSD, but it’s not as full-featured as a “normal” scaler.

In contrast, as part of the DisplayPort 1.2a standard, Adaptive Sync (which is what AMD uses to enable FreeSync) will likely become part of many future displays. The major scaler companies (Realtek, Novatek, and MStar) have all announced support for Adaptive Sync, and it appears most of the changes required to support the standard could be accomplished via firmware updates. That means even if a display vendor doesn’t have a vested interest in making a FreeSync branded display, we could see future displays that still work with FreeSync.

Having FreeSync integrated into most scalers has other benefits as well. All the normal OSD controls are available, and the displays can support multiple inputs – though FreeSync of course requires DisplayPort, as Adaptive Sync doesn’t work with DVI, HDMI, or VGA (D-SUB). AMD mentions in one of their slides that G-SYNC also lacks support for audio over DisplayPort, and there’s mention of color processing as well, though the latter is somewhat misleading. NVIDIA's G-SYNC module supports color LUTs (lookup tables), but it doesn't support multiple color presets like the "Warm, Cool, Movie, User, etc." modes that many displays have; NVIDIA states that the focus is on properly producing sRGB content, and so far the G-SYNC displays we've looked at have done quite well in this regard. We’ll look at the “Performance Penalty” aspect as well on the next page.
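As a rough illustration of what a display-side color LUT does – this is a hypothetical sketch, not the G-SYNC module's actual firmware, and the function names are our own – a 1D lookup table simply maps each 8-bit input code to a panel drive level, precomputed here from the standard sRGB transfer function so per-pixel work reduces to a table lookup:

```python
def srgb_to_linear(c):
    """Standard sRGB electro-optical transfer function for c in [0, 1]."""
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

# Precompute a 256-entry LUT once; after that, color processing per
# channel is a single table lookup instead of a power-function evaluation.
LUT = [round(srgb_to_linear(i / 255) * 255) for i in range(256)]

def apply_lut(pixel_rgb):
    """Map one (R, G, B) tuple of 8-bit values through the LUT."""
    return tuple(LUT[v] for v in pixel_rgb)
```

A display aiming at accurate sRGB ships one carefully calibrated table like this; the "Warm, Cool, Movie" presets on a conventional scaler are essentially alternative tables the user can switch between.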

One other feature that differentiates FreeSync from G-SYNC is how things are handled when the frame rate falls outside the dynamic refresh range. With G-SYNC enabled, the system behaves as though VSYNC is enabled when frame rates are either above or below the dynamic range; NVIDIA's goal was to have no tearing, ever. That means if you drop below 30FPS you can get the stutter associated with VSYNC, while going above 60Hz/144Hz (depending on the display) is not possible – the frame rate is capped. Admittedly, neither situation is a huge problem, but AMD provides an alternative with FreeSync.

Instead of always behaving as though VSYNC is on, FreeSync can revert to either VSYNC off or VSYNC on behavior when your frame rate is too high or too low. With VSYNC off you could still get image tearing, but at higher frame rates input latency is reduced. Again, this isn't necessarily a big flaw with G-SYNC – and I’d assume NVIDIA could rework the drivers to change the behavior if needed – but having the choice is never a bad thing.
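To make the two behaviors concrete, here's a hypothetical sketch – not actual driver code, and the function and parameter names are our own – of how a variable-refresh scheduler might treat one frame, with G-SYNC's always-VSYNC handling versus FreeSync's user-selectable fallback outside the panel's dynamic range:

```python
def next_refresh(frame_interval_ms, panel_min_hz, panel_max_hz, fallback="vsync_on"):
    """Return (refresh_interval_ms, note) for one rendered frame.

    Inside the dynamic range both technologies simply refresh when the
    frame is ready. Outside it, G-SYNC always acts like VSYNC on, while
    FreeSync lets the user pick the fallback ("vsync_on" or "vsync_off").
    """
    min_interval = 1000.0 / panel_max_hz  # fastest the panel can refresh
    max_interval = 1000.0 / panel_min_hz  # slowest before it must refresh anyway

    if min_interval <= frame_interval_ms <= max_interval:
        return frame_interval_ms, "adaptive: refresh as soon as the frame is ready"

    if frame_interval_ms < min_interval:  # FPS above the panel's maximum
        if fallback == "vsync_on":
            return min_interval, "frame rate capped at max refresh (no tearing)"
        return min_interval, "tearing possible, but lower input latency"

    # FPS below the panel's minimum: the panel has to refresh regardless,
    # so the previous frame is repeated (the VSYNC-like stutter case).
    return max_interval, "frame repeated at min refresh"
```

For example, on a 40–144Hz panel a 60FPS frame (16.7ms) is refreshed as-is, a 200FPS frame (5ms) is held to the 6.9ms minimum interval (capped with VSYNC on, torn-but-faster with VSYNC off), and a 20FPS frame (50ms) forces a repeat at the 25ms maximum interval.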

There’s another aspect to consider with FreeSync that might be interesting: as an open standard, it could potentially find its way into notebooks sooner than G-SYNC. We have yet to see any shipping G-SYNC enabled laptops, and it’s unlikely most notebook manufacturers would be willing to pay $200 or even $100 extra to get a G-SYNC module into a notebook, and there’s also the question of power requirements. Then again, earlier this year there was an inadvertent leak of some alpha drivers that allowed G-SYNC to function on the ASUS G751j notebook without a G-SYNC module, so it’s clear NVIDIA is investigating other options.

While NVIDIA may do G-SYNC without a module for notebooks, there are still other questions. With many notebooks using a form of dynamic switchable graphics (Optimus and Enduro), support for Adaptive Sync by Intel's processor graphics could certainly help. NVIDIA might work with Intel to enable G-SYNC (though it’s worth pointing out that the ASUS G751 doesn’t support Optimus, so it’s not a problem with that notebook), and AMD might be able to convince Intel to adopt DP Adaptive Sync, but to date neither has happened. There’s no clear direction yet, but there’s definitely a market for adaptive refresh in laptops, as many are unable to reach 60+ FPS at high quality settings.

350 Comments

  • medi03 - Thursday, March 19, 2015 - link

    nVidia's royalty cost for GSync is infinity.
    They've stated they were not going to license it to anyone.
  • chizow - Thursday, March 19, 2015 - link

    It's actually nil, they have never once said there is a royalty fee attached to G-Sync.
  • Creig - Friday, March 20, 2015 - link

    TechPowerup
    "NVIDIA Sacrifices VESA Adaptive Sync Tech to Rake in G-SYNC Royalties"

    WCCF Tech
    "AMD FreeSync, unlike Nvidia G-Sync is completely and utterly royalty free"

    The Tech Report
    "Like the rest of the standard—and unlike G-Sync—this "Adaptive-Sync" feature is royalty-free."
  • chizow - Friday, March 20, 2015 - link

    @Creig

    Again, please link confirmation from Nvidia that G-Sync carries a penny of royalty fees. BoM for the G-Sync module is not the same as a Royalty Fee, especially because as we have seen, that G-Sync module may very well be the secret sauce FreeSync is missing in providing an equivalent experience as G-Sync.

    Indeed, a quote from someone who didn't just take AMD's word for it: http://www.pcper.com/reviews/Displays/AMD-FreeSync...

    "I have it from people that I definitely trust that NVIDIA is not charging a licensing fee to monitor vendors that integrate G-Sync technology into monitors. What they do charge is a price for the G-Sync module, a piece of hardware that replaces the scalar that would normally be present in a modern PC display. "
  • JarredWalton - Friday, March 20, 2015 - link

    Finish the quote:

    "It might be a matter of semantics to some, but a licensing fee is not being charged, as instead the vendor is paying a fee in the range of $40-60 for the module that handles the G-Sync logic."

    Of course, royalties for using the G-SYNC brand are the real question -- not royalties for using the G-SYNC module. But even if NVIDIA doesn't charge "royalties" in the normal sense, they're charging a premium for a G-SYNC scaler compared to a regular scaler. Interestingly, if the G-SYNC module is only $40-$60, that means the LCD manufacturers are adding $100 over the cost of the module.
  • chizow - Friday, March 20, 2015 - link

    Why is there a need to finish the quote? If you get a spoiler and turbo charger in your next car, are they charging you a royalty fee? It's not semantics to anyone who actually understands the simple notion: better tech = more hardware = higher price tag.
  • AnnihilatorX - Sunday, March 22, 2015 - link

    Nvidia is making profit over the Gsync module, how's that different from a royalty?
  • chizow - Monday, March 23, 2015 - link

    @AnnihilatorX, how is "making profit from" suddenly the key determining factor for being a royalty? Is AMD charging you a royalty every time you buy one of their graphics cards? Complete and utter rubbish. Honestly, as much as some want to say it is "semantics", it really isn't, it comes down to clear definitions that are specific in usage particularly in legal or contract contexts.

    A Royalty is a negotiated fee for using a brand, good, or service that is paid continuously per use or at predetermined intervals. That is completely different than charging a set price for Bill of Material for an actual component that is integrated into a product you can purchase at a premium. It is obvious to anyone that additional component adds value to the product and is reflected in the higher price. This is in no way, a royalty.
  • Alexvrb - Monday, March 23, 2015 - link

    Jarred, Chizow is the most diehard Nvidia fanboy on Anandtech. There is nothing you could say to him to convince him that Freesync/Adaptive Sync is in any way better than G-Sync (pricing or otherwise). Just being unavailable on Nvidia hardware makes it completely useless to him. At least until Nvidia adopts it. Then it'll suddenly be a triumph of open standards, all thanks to VESA and Nvidia and possibly some other unimportant company.
  • chizow - Monday, March 23, 2015 - link

    And Alexvrb is one of the staunchest AMD supporters on Anandtech. There is quite a lot that could convince me FreeSync is better than G-Sync, that it actually does what it sets out to do without issues or compromise, but clearly that isn't covered in this Bubble Gum Review of the technology. Unlike the budget-focused crowd that AMD targets, Price is not going to be the driving factor for me especially if one solution is better than the other at achieving what it sets out to do, so yes, while Freesync is cheaper, to me it is obvious why, it's just not as good as G-Sync.

    But yes, I'm honestly ambivalent to whether or not Nvidia supports Adaptive Sync, as long as they continue to support G-Sync as their premium option then it's np. Supporting Adaptive Sync as their cheap/low-end solution would just be one less reason for anyone to buy an AMD graphics card, which would probably be an unintended consequence for AMD.
