Introduction to FreeSync and Adaptive Sync

The first time anyone talked about adaptive refresh rates for monitors – specifically applying the technique to gaming – was when NVIDIA demoed G-SYNC back in October 2013. The idea seemed so logical that I had to wonder why no one had tried to do it before. Certainly there are hurdles to overcome, e.g. what to do when the frame rate is too low, or too high; getting a panel that can handle adaptive refresh rates; supporting the feature in the graphics drivers. Still, it was an idea that made a lot of sense.

The impetus behind adaptive refresh is to overcome the visual artifacts and stutter caused by the normal way of updating the screen. Briefly, the display is updated with new content from the graphics card at set intervals, typically 60 times per second. While that’s fine for normal applications, in games there are often cases where a new frame isn’t ready in time, causing a stall or stutter in rendering. Alternatively, the screen can be updated as soon as a new frame is ready, but that often results in tearing – where the top part of the screen still shows the previous frame while the bottom part shows the next frame (or frames in some cases).
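To make the timing concrete, here’s a minimal sketch (a toy model with made-up frame times, not measured data) of when frames actually reach the screen under a fixed 60Hz refresh versus an adaptive one. A frame that takes even slightly longer than the ~16.7ms refresh budget has to wait for the next fixed tick with v-sync, while adaptive refresh can show it the moment it’s ready:

```python
# Toy timing model: when does each frame appear on screen?
# Fixed 60 Hz refresh (v-sync) vs. adaptive refresh. Times in milliseconds.

REFRESH = 1000 / 60  # one fixed refresh interval, ~16.7 ms

def vsync_display_times(render_times_ms):
    """With v-sync, a finished frame waits for the next fixed refresh tick."""
    shown = []
    t = 0.0  # running time at which each frame finishes rendering
    for r in render_times_ms:
        t += r
        ticks = -(-t // REFRESH)  # ceiling division: next tick at or after t
        shown.append(ticks * REFRESH)
    return shown

def adaptive_display_times(render_times_ms):
    """With adaptive refresh, a frame is shown as soon as it is ready
    (ignoring the panel's minimum/maximum refresh limits for simplicity)."""
    shown = []
    t = 0.0
    for r in render_times_ms:
        t += r
        shown.append(t)
    return shown

# A 17 ms frame just misses the ~16.7 ms budget, so with v-sync it slips
# a whole refresh interval; with adaptive refresh it appears at 17 ms.
renders = [16.0, 17.0, 16.0]
print(vsync_display_times(renders))     # roughly [16.7, 33.3, 50.0]
print(adaptive_display_times(renders))  # [16.0, 33.0, 49.0]
```

The point of the sketch is the second frame: it finishes only 0.3ms late, yet v-sync pushes its on-screen appearance to the next full tick, which is exactly the kind of judder adaptive refresh eliminates.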

Neither input lag/stutter nor image tearing is desirable, so NVIDIA set about creating a solution: G-SYNC. Perhaps the most difficult aspect for NVIDIA wasn’t creating the core technology but rather getting display partners to create and sell what would ultimately be a niche product – G-SYNC requires an NVIDIA GPU, so that rules out a large chunk of the market. Not surprisingly, the result was that G-SYNC took a bit of time to reach the market as a mature solution, with the first displays that supported the feature requiring modification by the end user.

Over the past year we’ve seen more G-SYNC displays ship that no longer require user modification, which is great, but pricing of the displays so far has been quite high. At present the least expensive G-SYNC displays are 1080p144 models that start at $450; similar displays without G-SYNC cost about $200 less. Higher spec displays like the 1440p144 ASUS ROG Swift cost $759 compared to other WQHD displays (albeit not 120/144Hz capable) that start at less than $400. And finally, 4Kp60 displays without G-SYNC cost $400-$500 whereas the 4Kp60 Acer XB280HK will set you back $750.

When AMD demonstrated their alternative adaptive refresh rate technology and cleverly called it FreeSync, it was a clear jab at the added cost of G-SYNC displays. As with G-SYNC, it has taken some time from the initial announcement to actual shipping hardware, but AMD has worked with VESA to standardize the underlying technology as DisplayPort Adaptive-Sync, an open addition to DisplayPort 1.2a, and they aren’t collecting any royalties from it. That’s the “Free” part of FreeSync, and while it doesn’t necessarily guarantee that FreeSync enabled displays will cost the same as non-FreeSync displays, the initial pricing looks quite promising.

There may be some additional costs associated with making a FreeSync display, though mostly these costs come in the form of higher quality components. The major scaler companies – Realtek, Novatek, and MStar – have all built FreeSync (DisplayPort Adaptive-Sync) into their latest products, and since most displays require a scaler anyway there’s no significant price increase. But if you compare a FreeSync 1440p144 display to a “normal” 1440p60 display of similar quality, the support for higher refresh rates inherently increases the price. So let’s look at what’s officially announced right now before we continue.

FreeSync Displays and Pricing
350 Comments

  • Welsh Jester - Friday, March 20, 2015 - link

    To add to my last post, I think 1440p FreeSync screens will be good for people with a single higher end GPU, since they won't have to deal with the micro stuttering that multi-card setups bring: a smooth experience at 40+ fps. Good news.
  • FlushedBubblyJock - Saturday, March 21, 2015 - link

    I was just about ready to praise AMD but then I see "must have and use display port"...
    Well, at least it appears AMD got it to work on their crap, and without massive Hz restrictions as they were on earlier reviews.
    So, heck, AMD might have actually done something right for once ?
    I am cautious - I do expect some huge frikkin problem we can't see right now --- then we will be told to ignore it, then we will be told it's nothing, then there's a workaround fix, then after a couple years it will be fully admitted and AMD will reluctantly "fix it" in "the next release of hardware".
    Yes, that's what I expect, so I'm holding back the praise for AMD because I've been burned to a crisp before.
  • cykodrone - Saturday, March 21, 2015 - link

    I actually went to the trouble to make an account to say sometimes I come here just to read the comments, some of the convos have me rolling on the floor busting my guts laughing, seriously, this is free entertainment at its best! Aside from that, the cost of this Nvidia e-penis would feed 10 starving children for a month. I mean seriously, at what point is it overkill? By that I mean is there any game out there that would absolutely not run good enough on a slightly lesser card at half the price? When I read this card alone requires 250W, my eyes popped out of my head, holy electric bill batman, but I guess if somebody has a 1G to throw away on an e-penis, they don't have electric bill worries. One more question, what kind of CPU/motherboard would you need to back this sucker up? I think this card would be retarded without at least the latest i7 Extreme(ly overpriced), can you imagine some tool dropping this in an i3? What I'm saying is, this sucker would need an expensive 'bed' too, otherwise, you'd just be wasting your time and money.
  • cykodrone - Saturday, March 21, 2015 - link

    This got posted to the wrong story, was meant for the NVIDIA GeForce GTX Titan X Review, my humble apologies.
  • mapesdhs - Monday, March 23, 2015 - link

    No less amusing though. ;D

    Btw, I've tested 1/2/3x GTX 980 on a P55 board, and it works a lot better than one might think.
    I also tested 1/2x 980 with an i5 760; again it works quite well. Plus, the heavier the game, the
    less it tends to rely on main CPU power, especially as the resolution/detail rises.

    Go to 3dmark.com, Advanced Search, Fire Strike, enter i7 870 or i5 760 & search, my
    whackadoodle results come straight back. :D I've tested with a 5GHz 2700K and 4.8GHz
    3930K as well; the latter are quicker of course, but not that much quicker, less than most
    would probably assume.

    Btw, the Titan X is more suited to solo pros doing tasks that are mainly FP32, like After Effects.
    However, there's always a market for the very best, and I know normal high street stores make
    their biggest profit margins on premium items (and the customers who buy them), so it's an
    important segment - it drives everything else in a way.

    Ian.
  • mapesdhs - Monday, March 23, 2015 - link

    (Damn, still no edit, I meant to say the 3-way testing was with an i7 870 on a P55)
  • Vinny DePaul - Sunday, March 22, 2015 - link

    I am a big fan of open standards. The more the merrier. I stay with nVidia because they support their products better. I was a big fan of AMD GPUs but the drivers were so buggy. nVidia updates their drivers so quickly and the support is just a lot better. I like G-Sync. It's worth the extra money. I hope my monitor can support FreeSync with a firmware upgrade. (Not that I have an AMD GPU.)
  • Teknobug - Sunday, March 22, 2015 - link

    Now if only I could find a 24" monitor with these features; anything bigger than 24" is too large for me.
  • gauravnba - Monday, March 23, 2015 - link

    Lot of G-Sync versus AMD bashing here. Personally it all comes down to whether or not I am being confined to an eco-system when going for either technology. If nVidia starts to become like Apple in that respect, I'm not very comfortable with it.
    However, I wonder whether adapting to FreeSync takes much addition or modification of hardware on the GPU end. That might be one reason that nVidia didn't have to change much of their GPU architecture during the G-SYNC launch and confined the changes to the scaler.
    AMD worked with VESA to get this working on GCN 1.1, but not on GCN 1.0. This may be another area where the technologies are radically different- one is heavily reliant on the scaler while the other may be able to divide the work to a certain extent? Then again, I'm quite ignorant of how scalers and GPUs work in this case.
  • PixelSupreme - Monday, March 23, 2015 - link

    To be honest I don't give half a fudge about FreeSync or G-Sync. What gets my attention is ULMB/strobed backlight. An IPS display (or well, OLED but...), WQHD, and strobing that works on a range of refresh rates, including some that are multiples of 24. That would be MY holy grail. The announced Acer XB270HU comes close but ULMB apparently only works at 85Hz and 100Hz.
