Earlier today NVIDIA announced G-Sync, its variable refresh rate technology for displays. The basic premise is simple. Displays refresh themselves at a fixed interval, but GPUs render frames at a completely independent frame rate. The disconnect between the two is one source of stuttering. You can disable v-sync to try and work around it, but the end result is at best tearing, and at worst both stuttering and tearing.

NVIDIA's G-Sync is a combination of software and hardware technologies that allows a modern GeForce GPU to control a variable display refresh rate on a monitor equipped with a G-Sync module. In traditional setups a display will refresh the screen at a fixed interval, but in a G-Sync enabled setup the display won't refresh the screen until it's given a new frame from the GPU.
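
To make the difference concrete, here's a minimal timing sketch (my own illustration, not NVIDIA's implementation): with a fixed refresh a finished frame has to wait for the next scheduled tick, while a G-Sync-style display simply refreshes when the frame arrives.

```python
# Illustrative model only -- not NVIDIA's code. Compare when finished frames
# actually reach the screen under the two presentation models.

def fixed_refresh_present(frame_done_times, refresh_hz=60.0):
    """With a fixed refresh, each frame becomes visible at the next refresh tick."""
    period = 1.0 / refresh_hz
    return [(int(t / period) + 1) * period for t in frame_done_times]

def variable_refresh_present(frame_done_times):
    """With a G-Sync-style display, the refresh happens when the frame is delivered."""
    return list(frame_done_times)

# A GPU finishing a frame every ~16.7 ms maps cleanly onto a 60 Hz panel either way;
# the interesting cases are the mismatched ones covered later in the article.
steady_60 = [i / 60.0 + 0.001 for i in range(1, 7)]
print(fixed_refresh_present(steady_60))
print(variable_refresh_present(steady_60))
```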

NVIDIA demonstrated the technology on 144Hz ASUS panels, which obviously caps the max GPU present rate at 144 fps, although that's not a limit of G-Sync itself. There's a lower bound of 30Hz as well, since anything below that starts to run into issues with flickering. If the frame rate drops below 30 fps, the display will present duplicates of each frame.
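
That lower bound presumably amounts to something like the logic sketched below (my assumption based on the description above, not anything NVIDIA detailed): if no new frame arrives within the panel's maximum hold time, the previous frame is scanned out again so the effective refresh rate never falls below roughly 30Hz.

```python
# Hypothetical sketch of the sub-30 fps behaviour described above: if the GPU
# hasn't delivered a new frame within the panel's maximum hold time, redraw the
# previous frame so the panel never goes longer than ~1/30 s without a refresh.

MAX_HOLD = 1.0 / 30.0   # longest the panel is allowed to go without refreshing

def refresh_times(frame_interval, duration=1.0):
    """Panel refresh times for content arriving every `frame_interval` seconds."""
    refreshes, last_refresh, next_frame = [], 0.0, frame_interval
    while last_refresh < duration:
        if next_frame - last_refresh <= MAX_HOLD:
            last_refresh = next_frame        # a new frame arrived in time
            next_frame += frame_interval
        else:
            last_refresh += MAX_HOLD         # repeat the previous frame
        refreshes.append(last_refresh)
    return refreshes

print(len(refresh_times(1 / 45)))   # ~45 refreshes/second: one refresh per frame
print(len(refresh_times(1 / 20)))   # ~40 refreshes/second: each 20 fps frame shown twice
```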

There's a bunch of other work done on the G-Sync module side to deal with some funny effects of LCDs when driven asynchronously. NVIDIA wouldn't go into great detail other than to say that there are considerations that need to be taken into account.

The first native G-Sync enabled monitors won't show up until Q1 next year; however, NVIDIA will be releasing the G-Sync board for modding before the end of this year. The board will initially support ASUS's VG248QE monitor: end users will be able to mod their monitor to install the board, or alternatively professional modders will be selling pre-modified monitors. Otherwise, in Q1 of next year ASUS will be selling the VG248QE with the G-Sync board built in for $399, while BenQ, Philips, and ViewSonic are also committing to rolling out their own G-Sync equipped monitors next year. I'm hearing that NVIDIA wants to try and get the module down to below $100 eventually. The G-Sync module itself looks like this:

There's a controller and at least 3 x 256MB memory devices on the board, although I'm guessing there's more on the back of the board. NVIDIA isn't giving us a lot of detail here so we'll have to deal with just a shot of the board for now.

Meanwhile we do have limited information on the interface itself; G-Sync is designed to work over DisplayPort (since it’s packet based), with NVIDIA manipulating the timing of the v-blank signal to indicate a refresh. Importantly, this indicates that NVIDIA may not be significantly modifying the DisplayPort protocol, which at least cracks open the door to other implementations on the source/video card side.
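
Conceptually (and this is only my reading of that description, not a documented protocol), variable refresh over a packet-based link comes down to stretching the vertical blanking interval: scan-out of each frame takes a fixed amount of time, and the v-blank that follows simply lasts until the next frame is ready. A rough sketch, with made-up timing constants:

```python
# Conceptual sketch of variable v-blank timing -- an assumption based on the
# description above, not NVIDIA's documented protocol. Scan-out takes a fixed
# time; the blanking interval between frames stretches until a new frame arrives.

SCANOUT_TIME = 1.0 / 144.0   # time to scan out one frame on a 144Hz-class panel
MIN_VBLANK = 0.0005          # some minimum blanking interval (illustrative value)

def refresh_starts(frame_ready_times):
    """When each refresh begins if v-blank is held until a frame is ready."""
    starts, earliest_next = [], 0.0
    for ready in frame_ready_times:
        start = max(ready, earliest_next)   # sit in v-blank until the frame arrives
        starts.append(start)
        earliest_next = start + SCANOUT_TIME + MIN_VBLANK   # respect the max refresh rate
    return starts

# Irregular frame delivery simply pulls each refresh along with it:
print(refresh_starts([0.010, 0.022, 0.041, 0.048, 0.070]))
```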

Although we only have limited information on the technology at this time, the good news is we got a bunch of cool demos of G-Sync at the event today. I'm going to have to describe most of what I saw, since it's difficult to present this otherwise. NVIDIA had two identical systems configured with GeForce GTX 760s; both featured the same ASUS 144Hz displays, but only one of them had NVIDIA's G-Sync module installed. NVIDIA ran through a couple of demos to show the benefits of G-Sync, and they were awesome.

The first demo was a swinging pendulum. NVIDIA's demo harness allows you to set min/max frame times, and for the initial test case we saw both systems running at a fixed 60 fps. The performance on both systems was identical as was the visual experience. I noticed no stuttering, and since v-sync was on there was no visible tearing either. Then things got interesting.

NVIDIA then dropped the frame rate on both systems down to 50 fps, once again at a static frame rate. The traditional system started to exhibit stuttering as we saw the effects of a mismatch between the GPU frame rate and the monitor refresh rate. Since the case itself was pathological in nature (you don't always have a constant mismatch between the two), the stuttering was extremely pronounced. The same demo on the G-Sync system? Flawless, smooth.
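
Some back-of-the-envelope arithmetic shows why a constant mismatch looks so bad (assuming the traditional side is refreshing at 60Hz, as the earlier 60 fps comparison implies): frames finish every 20 ms but can only appear at 16.7 ms refresh ticks, so most frames sit on screen for one refresh interval while every few frames one gets held for two. A rough sketch of that cadence, with illustrative numbers only:

```python
# Rough model of steady 50 fps content on a fixed 60 Hz display with v-sync on.
refresh = 1.0 / 60.0                                   # 16.7 ms refresh interval
frame_done = [0.005 + i * 0.020 for i in range(8)]     # a frame finishes every 20 ms

shown_at = [(int(t / refresh) + 1) * refresh for t in frame_done]
on_screen = [round((b - a) * 1000, 1) for a, b in zip(shown_at, shown_at[1:])]
print(on_screen)   # [16.7, 16.7, 16.7, 33.3, 16.7, 16.7, 16.7] -- the uneven cadence seen as stutter
                   # A variable refresh panel would instead show every frame for an even 20 ms.
```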

NVIDIA then dropped the frame rate even more, down to an average of around 45 fps but also introduced variability in frame times, making the demo even more realistic. Once again, the traditional setup with v-sync enabled was a stuttering mess while the G-Sync system didn't skip a beat.

Next up was disabling v-sync in hopes of reducing stuttering, which resulted in both stuttering (there's still a refresh rate/fps mismatch) and now tearing. The G-Sync system, once again, handled the test case perfectly. It delivered the same smoothness and visual experience as if we were looking at a game rendering perfectly at a constant 60 fps. It's sort of ridiculous and completely changes the overall user experience. Drops in frame rate no longer have to be drops in smoothness. Game devs relying on the presence of G-Sync can throw higher quality effects at a scene, since they don't need to be as afraid of frame rate excursions below 60 fps.

Switching gears NVIDIA also ran a real world demonstration by spinning the camera around Lara Croft in Tomb Raider. The stutter/tearing effects weren't as pronounced as in NVIDIA's test case, but they were both definitely present on the traditional system and completely absent on the G-Sync machine. I can't stress enough just how smooth the G-Sync experience was, it's a game changer.

The combination of technologies like GeForce Experience, a ton of GPU performance, and G-Sync can really deliver a new level of smoothness, image quality, and overall experience in games. We've seen a resurgence of PC gaming over the past few years, but G-Sync has the potential to take the PC gaming experience to a completely new level.

Update: NVIDIA has posted a bit more information about G-Sync, including the specs of the modified ASUS VG248QE monitor and the system requirements.

NVIDIA G-Sync System Requirements
Video Card: GeForce GTX 650 Ti Boost or higher
Display: G-Sync equipped display
Driver: R331.58 or higher
Operating System: Windows 7/8/8.1

Comments

  • TheJian - Saturday, October 19, 2013 - link

    Incorrect. Mantle and TrueAudio will need heavy investment from a broke AMD. Much like OpenCL...why code for it? Just to be open? That isn't free coding, and CUDA is already there for every app, funded by billions from NV over 7 years. On the other hand, G-Sync will be used and is desired by all who are tired of tearing/stutter. It's a lot easier to push something that we all want, and will likely be asked to pay a premium for (thus companies see more profit and it pushes new product purchases), vs. something that just costs more to implement for a small subset of AMD's small discrete market share. You can't charge $80 for a Mantle-optimized game. People will still only pay $60 (not so with enhanced monitors etc). Mantle died the second it said it would help people get off consoles, which of course caused consoles to freak. MS had an immediate response and Sony will do the same (BLOCK IT). If your games run great elsewhere you don't need my consoles. Thus I block your tech forever. $8 million for DICE to use Mantle on Battlefield 4. At that price AMD can't afford another game or two, let alone dozens yearly. Mantle has been slain, like it or not. TrueAudio is the same story, same subset, doesn't allow higher priced games etc. Just costly to implement. I already have a sound card that works OK.

    NV wins the proprietary war (good or bad, those just seem to be the facts). The choice is over if it's based on tech and the chances of that tech being implemented everywhere. :) I will gladly pay $100 for years of great gaming across titles without anything but NV driver work, rather than praying for every dev to code for it. My current 24in/22in combo (Dell 24 + LG 22) are both 5+ years old (the Dell is over 6 years). Less than $20 a year for a vastly improved experience all the time. Sign me up.

    "I can't stress enough just how smooth the G-Sync experience was, it's a game changer."
    All I need to know as I'm planning a Maxwell purchase and a new monitor (27 or 30in to drop my 22in) next year. They'll get a Tegra 5/6 tablet or Shield rev 2 or 3 out of my wallet eventually to keep it all tied together (much like getting sucked into the Apple ecosystem used to do). I'm not seeing how AMD's new tech will stop me from switching, knowing it probably won't be supported unless paid for by AMD.

    To get the tablet from me they'll need to put Tegra 5/6 in a 1080p 13 or 20in though. 2560x1600 just makes games too slow until 16/14nm. My Radeon 5850 has to be jacked all around just for my 1920x1200 Dell 24, and when that fails I go to the 22in 1680x1050 for happiness...LOL. Expecting a tablet to push great games at 2560x1600 in a 6-8W envelope or so is just stupid. My dad's Nexus 10 looks great, but gaming is pretty crap at that res for a lot of stuff, and I have no need for a res higher than 1080p for movies in that small of a form factor (nor games; my Dell looks great at 1920x1200). In a 13-20in, 1080p is fine for me and a perfect fit for movies on the go etc.
  • SlyNine - Saturday, October 19, 2013 - link

    You talk and talk, but it's all your opinion or complete conjecture. You sound very arrogant. As if anyone really believes you have some magical crystal ball...

    Mantle is pretty interesting; we will see what happens. Don't kid yourself into thinking you know.
  • Klimax - Saturday, October 19, 2013 - link

    All he needs is verified facts for a base, and then to just build up the case. Which he did.

    You'll need a bit more than the cheap ad hominem and misdirection.
  • SlyNine - Saturday, October 19, 2013 - link

    What facts did he use again?
  • TheJian - Monday, October 28, 2013 - link

    Paul Graham's Hierarchy of Disagreement
    https://en.wikipedia.org/wiki/Ad_hominem
    You need to read this :)

    The facts I gave were that TrueAudio and Mantle require INVESTMENT from AMD, as devs don't just pony up their own funds for free. Writing for Mantle for 1/3 of the market (and less than that, as not all cards from AMD support it anyway, so you're writing for a niche) will be on TOP of writing for everyone else (AMD cards that don't support it, NVIDIA, Intel etc). They will just write for ALL unless AMD pays them. That is a FACT. AMD paid $8 million for BF4, get it? You can't charge more for a Mantle-optimized game, right? So it gains devs nothing financially, so again: AMD, pay me or buzz off with your "Mantle extra code I need to write" crap...Get it?

    On the other hand, we ALL want stutter-free, tear-free gaming. I can sell that for more money (if I'm a monitor maker etc). So I'm inspired to pitch that, as it adds new products and EXTRA profits over regular models. NV doesn't have to pay me to want more money, I just get to charge it anyway...LOL. FACT. Easy to push something like that vs. something that gets a dev NOTHING. Understand the difference? EA can't charge more for BF4 because it works better on Mantle cards, so they gain nothing but extra dev cost, which they won't pay, hence the check from AMD.

    Each game needs Mantle/TrueAudio help from devs (extra work), but once I get a monitor with G-Sync I'm gold for years. Which is easier to push? FACT: the one that takes less work and makes a lot of people extra cash (G-Sync).

    Also, with G-Sync devs get something they've wanted for a long time, at some point if desired (they don't have to use it, but they can once it's ubiquitous). When you are above the perf you need in a portion of the game (say when you're hitting 100fps), they can code to use the extra fps for extra graphics. So when you have extra power the game AMPS up, so to speak, on the fly. It's NV's job to get that on-the-fly part right (or AMD's, if they end up on board or make their own compatible G-Sync type). I'd much rather be dependent on drivers than on 100 devs etc to figure out if I can run full blast here or there. It puts more work on NV rather than a dev. They are tired of writing for 30/60fps and being limited by the lowest point in a game; they want to take that extra power to amp up when available.

    Also as the OP stated I added to my base statements with a quote from Anand himself:
    "I can't stress enough just how smooth the G-Sync experience was, it's a game changer."
    Then I explained why it affects my purchase and why most will do the same. They get me for the monitor because I just want that stutter-free smooth stuff and better games later, then they get me for a card, probably a tablet etc, just like the Apple ecosystem. Then I note that AMD has no way (currently?) that I can see of stopping me from getting hit by an ecosystem.

    BASE (central point, whatever)
    Supporting statements
    Conclusion

    You countered with nothing. 'you have an opinion and have no crystal ball'...Pfft. Best you got?

    It's not conjecture to see Apple's ecosystem suck people in and keep them. This is the same strategy. AMD has a strategy too, it's just not the right one to win, for all the reasons I pointed out (they have to PAY people to do it for no return on the time, no higher-priced games etc). IF they could get away with charging $80 instead of $60 because it's a "special Mantle game" then maybe devs would bite. But the reality is that's a tough sell. So if you can't get your tech ubiquitous then it won't sell more of your cards/tech. AMD's ecosystem is a tough sell. NV's isn't, and it makes money for companies, pushes products, raises prices (at least until G-Sync is a commodity item at some point). You don't like the facts, so you choose to ignore them. I never said Mantle isn't interesting, and I don't even care if it's AWESOME (could be, but who cares? only AMD). It's not EXTRA profitable for game devs to write for it, so it's useless to them. Understand? Code will ALWAYS have to be written for everyone else anyway (until AMD owns 90% of the market...LOL), so this will never be adopted. Remember Glide? Did you watch the Carmack, Andersson, and Sweeney video discussing this? They all said this is NOT the way to go with more APIs, and they hope NV doesn't pull this too. Nuff said. No love from the top three devs; they only hope it leads to ideas in the mainstream (like stuff added into OpenGL or something), not that they want it to live. Carmack came right out and said we already have this with NVIDIA extensions etc; if you want direct-to-hardware, it's pretty much already in there. Translation: he's not impressed.
  • JacFlasche - Thursday, November 14, 2013 - link

    It seems to me that your entire harangue is based upon a faulty premise. One of the main objectives of Mantle, as recently stated in the interview on TH, is that the companies that code games for consoles will be able to port low-level work directly to a PC release that will be able to use it as-is. This, in my mind, entirely negates your assumption that it would add expense for game studios. As far as your economic forecasts go, Forbes disagrees with you. http://www.forbes.com/sites/jasonevangelho/2013/10... At any rate, they will be able to sell twice as many of these if they are eventually AMD compatible. I don't see it as an either/or situation. I want an AMD card with Mantle and a G-Sync type monitor. And there will be other types of G-Sync tech, in my OPINION. At any rate, from reading about both techs, I still think Mantle is the more significant development, and if things stand as they presently are, I will upgrade with Mantle and wait to upgrade my monitor until a head-mounted display like the Rift is perfected in high res (way better and cheaper than a monitor), hopefully with a G-Sync-like tech, but if not, you will still have Mantle working on it. My next monitor purchase will be an OLED monitor, for which I have been waiting for a decade now. When they hit 2k in price, I will buy. Until then I will make do with my old Samsung and a new Oculus Rift, when they are more perfected. How is that G-Sync going to help me on a Rift? It won't, unless it is included in the hardware, which I doubt. What would really be nice is a G-Sync module that you can jack into any display device. Then it would be killer.
  • SlyNine - Saturday, October 19, 2013 - link

    And I didn't come to any conclusion; I simply countered with: you don't know any more than anyone else. Which is true, is it not?
  • treeroy - Saturday, October 19, 2013 - link

    Next-gen games are already $80, so your argument that "people will only pay $60" is a joke.
    Mantle games are not going to cost AMD $8 million each - it was only that high because it's the launch title, and it has much more to do with marketing than investment in games. Mantle will not cause games to go up in price, that's completely illogical - game prices have been frozen for the past 6 years or so, and there has been PLENTY of new technology introduced in that time.

    And the notion that consoles will block Mantle is crazy. The PS4 for one is being as open as possible, so Sony isn't going to say "Oh actually you can't make that game on our platform because it will move people to PC", and I imagine it's the same on the green side of the consoles. Moreover, AMD has complete dominance in the next-gen console generation, so even if Mantle doesn't take off (I think we're all sceptical of it), multiplatform games will still get optimised for AMD technology, which for many people is going to keep them/move them into the AMD side of the graphics war.

    You seem to be an nvidia fanboy. You should probably stop.
  • medi02 - Sunday, October 20, 2013 - link

    If Mantle is close to the APIs exposed to console devs, it will surely take off.
  • ninjaquick - Thursday, October 24, 2013 - link

    lol, DICE's repi has already said that while Mantle adds some overhead, it really has been designed to be practically copy-paste. The only reason it exists at all is because developers want it. $8 million for DICE to use Mantle? lolwut?

    Then there is TrueAudio, which supports FMOD and Wwise, the two most popular audio engines, through a plugin. Again, minimal developer input is required to use the hardware. It isn't some mystical X-Fi style solution either; it simply takes FMOD and Wwise instructions and runs them through a dedicated compute path, away from the CPU.

    Lastly, Microsoft cannot actually block Mantle. They simply do not support it - as in provide customer support for it - on the Xbone. Sony has no reason to decline Mantle support, as they are not trying to force developers to use Direct3D code.

    You think Mantle will fail, which is funny since Maxwell is including an ARM co-processor to do basically the same thing - provide an alternative programming path for developers - except it is fully proprietary and only guaranteed to work on Maxwell cards, whereas Mantle will work on any GCN (VLIW 4/1) cores.
