FreeSync Features

In many ways FreeSync and G-SYNC are comparable. Both refresh the display as soon as a new frame is available, at least within their normal range of refresh rates. There are differences in how this is accomplished, however.
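To make the core idea concrete, here's a toy timing model in Python (the frame times, the 60Hz fixed refresh, and the function names are all illustrative assumptions, not actual driver or scaler code): it compares when frames reach the screen on a fixed-refresh display versus one that refreshes as soon as a frame is ready.

```python
import math

# Toy model: compare when frames appear on screen with a fixed 60 Hz
# refresh vs. an adaptive refresh that scans out as soon as a frame is
# ready (within the panel's supported range).

def fixed_refresh_display_times(frame_ready_ms, refresh_hz=60):
    period = 1000.0 / refresh_hz
    # Each frame appears at the first refresh tick at or after the
    # moment it finished rendering.
    return [math.ceil(t / period) * period for t in frame_ready_ms]

def adaptive_display_times(frame_ready_ms):
    # The display refreshes immediately, so frames appear as soon as
    # they are done rendering.
    return list(frame_ready_ms)

frames = [12.0, 30.0, 55.0]  # hypothetical render completion times (ms)
print(fixed_refresh_display_times(frames))  # frames wait for the next tick
print(adaptive_display_times(frames))       # frames appear immediately
```

On the fixed-refresh display, the 30ms frame waits for the ~33.3ms tick; on the adaptive display it appears at 30ms. That eliminated waiting is the stutter and latency reduction both technologies target.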

G-SYNC uses a proprietary module that replaces the normal scaler hardware in a display. Beyond the added cost, this means that any company looking to make a G-SYNC display has to buy that module from NVIDIA. Of course, the reason NVIDIA went with a proprietary module is that adaptive sync didn’t exist when they started working on G-SYNC, so they had to create their own protocol. Basically, the G-SYNC module controls all the regular core features of the display like the OSD, but it’s not as full featured as a “normal” scaler.

In contrast, as part of the DisplayPort 1.2a standard, Adaptive Sync (which is what AMD uses to enable FreeSync) will likely become part of many future displays. The major scaler companies (Realtek, Novatek, and MStar) have all announced support for Adaptive Sync, and it appears most of the changes required to support the standard could be accomplished via firmware updates. That means even if a display vendor doesn’t have a vested interest in making a FreeSync branded display, we could see future displays that still work with FreeSync.

Having FreeSync integrated into most scalers has other benefits as well. All the normal OSD controls are available, and the displays can support multiple inputs – though FreeSync of course requires the use of DisplayPort, as Adaptive Sync doesn’t work with DVI, HDMI, or VGA (DSUB). AMD mentions in one of their slides that G-SYNC also lacks support for audio input over DisplayPort, and there’s mention of color processing as well, though this is somewhat misleading. NVIDIA's G-SYNC module supports color LUTs (Look Up Tables), but it doesn't support multiple color options like the "Warm, Cool, Movie, User, etc." modes that many displays have; NVIDIA states that the focus is on properly producing sRGB content, and so far the G-SYNC displays we've looked at have done quite well in this regard. We’ll look at the “Performance Penalty” aspect as well on the next page.
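For readers unfamiliar with the term, a LUT simply maps each possible input color level to a precomputed output level. Here's a minimal Python sketch (using a plain 1/2.2 power curve purely as a stand-in; the real sRGB transfer function is piecewise, and this is not NVIDIA's implementation):

```python
# A 1D LUT: precompute an output level for each of the 256 possible
# 8-bit input levels. The curve here is a simple 1/2.2 gamma, chosen
# only for illustration.
lut = [round(255 * (i / 255) ** (1 / 2.2)) for i in range(256)]

def apply_lut(pixel_rgb):
    # Look up each channel; no math happens per pixel at display time,
    # just indexing, which is why LUTs are cheap to implement in
    # scaler (or G-SYNC module) hardware.
    return tuple(lut[c] for c in pixel_rgb)
```

A "Warm" or "Cool" preset would simply be a different set of precomputed tables per channel; supporting only one sRGB-targeted table is the limitation being described.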

One other feature that differentiates FreeSync from G-SYNC is how things are handled when the frame rate is outside of the dynamic refresh range. With G-SYNC enabled, the system will behave as though VSYNC is enabled when frame rates are either above or below the dynamic range; NVIDIA's goal was to have no tearing, ever. That means if you drop below 30FPS you can get the stutter associated with VSYNC, while going above 60Hz/144Hz (depending on the display) is not possible – the frame rate is capped. Admittedly, neither situation is a huge problem, but AMD provides an alternative with FreeSync.

Instead of always behaving as though VSYNC is on, FreeSync can revert to either VSYNC off or VSYNC on behavior if your frame rates are too high/low. With VSYNC off, you could still get image tearing but at higher frame rates there would be a reduction in input latency. Again, this isn't necessarily a big flaw with G-SYNC – and I’d assume NVIDIA could probably rework the drivers to change the behavior if needed – but having choice is never a bad thing.
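The two out-of-range policies can be sketched as a simple decision function (hypothetical Python with an assumed 40-144Hz window and illustrative return values; this is not AMD's or NVIDIA's actual driver logic):

```python
def present_frame(frame_interval_ms, vrr_min_hz=40, vrr_max_hz=144,
                  out_of_range_mode="vsync_off"):
    """Decide how a frame is displayed given the panel's VRR window.

    out_of_range_mode models FreeSync's user choice ('vsync_on' or
    'vsync_off') outside the window; G-SYNC always behaves like
    'vsync_on'.
    """
    rate_hz = 1000.0 / frame_interval_ms
    if vrr_min_hz <= rate_hz <= vrr_max_hz:
        # Inside the window: the display refreshes as soon as the
        # frame is ready.
        return "adaptive refresh"
    if out_of_range_mode == "vsync_on":
        # No tearing, but possible stutter below the window and a
        # frame rate cap above it (the G-SYNC behavior).
        return "wait for next fixed refresh"
    # VSYNC off: scan out right away; possible tearing, lower latency.
    return "scan out immediately"
```

For example, a frame that took 40ms (25FPS) falls below the assumed 40Hz window: the VSYNC-off path scans it out immediately at the cost of possible tearing, while the VSYNC-on path waits for the next fixed refresh.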

There’s another aspect to consider with FreeSync that might be interesting: as an open standard, it could potentially find its way into notebooks sooner than G-SYNC. We have yet to see any shipping G-SYNC enabled laptops, and it’s unlikely most notebook manufacturers would be willing to pay $200 or even $100 extra to get a G-SYNC module into a notebook, and there's also the question of power requirements. Then again, earlier this year there was an inadvertent leak of some alpha drivers that allowed G-SYNC to function on the ASUS G751j notebook without a G-SYNC module, so it’s clear NVIDIA is investigating other options.

While NVIDIA may do G-SYNC without a module for notebooks, there are still other questions. With many notebooks using a form of dynamic switchable graphics (Optimus and Enduro), support for Adaptive Sync by the Intel processor graphics could certainly help. NVIDIA might work with Intel to make G-SYNC work (though it’s worth pointing out that the ASUS G751 doesn’t support Optimus so it’s not a problem with that notebook), and AMD might be able to convince Intel to adopt DP Adaptive Sync, but to date neither has happened. There’s no clear direction yet but there’s definitely a market for adaptive refresh in laptops, as many are unable to reach 60+ FPS at high quality settings.

350 Comments

  • JarredWalton - Friday, March 20, 2015 - link

    FYI, ghosting is a factor of the display and firmware, not of the inherent technology. So while it's valid to say, "The LG FreeSync display has ghosting..." you shouldn't by extension imply FreeSync in and of itself is the cause of ghosting.
  • chizow - Friday, March 20, 2015 - link

    So are you saying a firmware flash is going to fix this, essentially for free? Yes that is a bit of a troll but you get the picture. Stop making excuses for AMD and ask these questions to them and panel makers, on record, for real answers. All this conjecture and excuse-making is honestly a disservice to your readers who are going to make some massive investment (not really) into a panel that I would consider completely unusable.

    You remember that Gateway FPD2485W that you did a fantastic review of a few years ago? Would you go back to that as your primary gaming monitor today? Then why dismiss this problem with FreeSync circa 2015?
  • chizow - Friday, March 20, 2015 - link

    Who said no ghosting? lol. There's lots of ghosting, on the FreeSync panels.
  • TheJian - Sunday, March 22, 2015 - link

    You're assuming gsync stays the same price forever. So scalers can improve pricing (in your mind) to zero over time, but NV's will never shrink, get better revs etc...LOL. OK. Also you assume they can't just lower the price any day of the week if desired. Microsoft just decided to give away Windows 10 (only to slow android but still). This is the kind of thing a company can do when they have 3.7B in the bank and no debt (NV, they have debt but if paid off, they'd have ~3.7b left). They could certainly put out a better rev that is cheaper, or subsidize $50-100 of it for a while until they can put out a cheaper version just to slow AMD down.

    They are not equal. See other site reviews besides an AMD portal site like anandtech ;)

    http://www.pcper.com/reviews/Displays/AMD-FreeSync...
    There is no lic fee from NV according to PCper.
    "It might be a matter of semantics to some, but a licensing fee is not being charged, as instead the vendor is paying a fee in the range of $40-60 for the module that handles the G-Sync logic."
    Which basically shows VENDORS must be marking things up quite a lot. But that is to be expected with ZERO competition until this week.

    "For NVIDIA users though, G-Sync is supported on GTX 900-series, 700-series and 600-series cards nearly across the board, in part thanks to the implementation of the G-Sync module itself."
    Not the case on the AMD side as he says. So again not so free if you don't own a card. NV people that own a card already are basically covered, just buy a monitor.

    Specs of this are misleading too, which anandtech just blows by:
    "The published refresh rate is more than a bit misleading. AMD claims that FreeSync is rated at 9-240 Hz while G-Sync is only quoted at 30-144 Hz. There are two issues with this. First, FreeSync doesn’t determine the range of variable refresh rates, AdaptiveSync does. Second, AMD is using the maximum and minimum refresh range published by the standards body, not an actual implementation or even a planned implementation from any monitor or display. The 30-144 Hz rating for G-Sync is actually seen in shipping displays today (like the ASUS PG278Q ROG Swift). The FreeSync monitors we have in our office are either 40-144 Hz (TN, 2560x1440) or 48-75 Hz (IPS, 2560x1080); neither of which is close to the 9-240 Hz seen in this table."

    Again, read a site that doesn't lean so heavily to AMD. Don't forget to read about the GHOSTING on AMD. One more point, PCper's conclusion:
    "My time with today’s version of FreeSync definitely show it as a step in the right direction but I think it is far from perfect."
    "But there is room for improvement: ghosting concerns, improving the transition experience between VRR windows and non-VRR frame rates and figuring out a way to enable tear-free and stutter-free gaming under the minimum variable refresh rate."
    "FreeSync is doing the right things and is headed in the right direction, but it can’t claim to offer the same experience as G-Sync. Yet."

    Ok then...Maybe Freesync rev2 gets it right ;)
  • soccerballtux - Friday, March 20, 2015 - link

    you must be a headcase or, more likely, are paid for by NVidia to publicly shill. Gsync requires a proprietary NVidia chip installed in the monitor that comes from, and only from, NVidia.

    It's much easier to simply set a flag-byte in the DisplayPort data stream that says "ok render everything since the last render you rendered, now". There's nothing closed about that.
  • chizow - Friday, March 20, 2015 - link

    And? Who cares if it results in a better solution? LOL only a headcase or a paid AMD shill would say removing hardware for a cheaper solution that results in a worse solution is actually better.
  • soccerballtux - Friday, March 20, 2015 - link

    wellll, if it's cheaper and a better solution, then the market cares.
  • chizow - Friday, March 20, 2015 - link

    Except it's cheaper and worse, therefore it should be cheaper.
  • bloodypulp - Friday, March 20, 2015 - link

    Oh darn... so what you're saying is that I have to purchase the card that costs less, then I have to purchase the monitor that costs less too? Sounds like a raw deal... ROFL!!

    And as far as your bogus openness argument goes: There is nothing preventing Nvidia from supporting Adaptive Sync. NOTHING. Well, unless you count hubris and greed. In fact, Mobile G-Sync already doesn't even require the module! I guess that expensive module really wasn't necessary after all...

    And lastly, Nvidia has no x86 APU to offer, so they can't offer what AMD can with their Freesync-supporting APUs. Nvidia simply has nothing to compete with there. Even gamers on a tight budget can enjoy Freesync! The same simply cannot be said for GSync.
  • Denithor - Friday, March 20, 2015 - link

    ONLY closed because NVIDIA refuses to support freesync. Much like OpenCL. And PhysX.
