Computex 2014: AMD Demonstrates First FreeSync Monitor Prototype
by Ryan Smith & Anand Lal Shimpi on June 5, 2014 6:00 AM EST
Posted in: Displays, AMD, GPUs, FreeSync, Computex 2014
Our very own Anand Shimpi just got off the Computex show floor for a bit after paying a visit to AMD’s booth. Among the items AMD is showing at Computex is the current status of their FreeSync project, whose underlying Adaptive-Sync feature was recently added to the DisplayPort 1.2a standard as an optional extension.
First announced at CES 2014, FreeSync is AMD’s take on variable refresh monitors, utilizing variable refresh functionality first designed for embedded DisplayPort (eDP). At the time AMD was showing off the concept on laptops (due to the need for eDP), but the company is back at Computex with an update on the project.
Here at Computex AMD is showing off the first FreeSync-capable prototype monitor, which interestingly enough is based on a retail monitor whose hardware was already capable of variable refresh and could be converted with a firmware update. AMD’s actual demo hasn’t changed – they’re still running the fan blade demo we saw at CES 2014 – but it’s now running on external monitors. The monitor in question operates over a fairly narrow refresh range of just 40Hz to 60Hz, which for the purposes of a prototype is good enough to showcase that the technology works, though it is narrower than the refresh ranges AMD expects for retail monitors.
At this point AMD is emphasizing that while they were able to get FreeSync up and running on existing hardware, owners shouldn’t expect firmware updates for current monitors, as this is very unlikely to happen (though it is ultimately up to monitor manufacturers). Instead, AMD is using the prototype to demonstrate that panels and scalers capable of variable refresh already exist, and that retail monitors should not require significant/expensive technology upgrades. Meanwhile AMD is also holding to their earlier projection of 6-12 months for retail monitor availability, with retail prototypes expected around September. That puts final retail availability potentially as early as very late this year, but more likely in the first half of 2015.
Finally, we have a video interview of the FreeSync demo in action. It bears mentioning that YouTube is limited to 30fps, so while we can give you some idea of how FreeSync performs, it’s not a fully accurate representation. That will have to wait for closer to release, when we can sit down with a high speed camera.
Comments
shivabowl - Thursday, June 5, 2014 - link
It looks like a Nixeus 27" DisplayPort monitor that AMD is using to demonstrate FreeSync.
Septor - Tuesday, June 10, 2014 - link
Yup, cause it is. Apparently with updated firmware however.
Anonymous Blowhard - Thursday, June 5, 2014 - link
> The monitor in question operates with a fairly narrow range of rates of just 40Hz to 60Hz

Works for me. If you're dropping below that, you should adjust game detail until you're closer to 60 again. IMO G-Sync/FreeSync should be a way to minimize the impact of minor dips in framerate, not an excuse to accept lower performance.
Guspaz - Thursday, June 5, 2014 - link
Doesn't work for me. Games should support 30-60Hz, but you need to support as low as ~23Hz if you want to work with films.

Technically you can set a monitor to 23.976Hz... but only if the timing of all components works out (many GPUs don't allow timing that precise, or make it very difficult to set/use), and you need really complex software on the player side to synchronize that precisely.
With FreeSync/G-Sync, all these problems go away. Everything becomes super simple: the player decodes the video at what it thinks is real-time, and updates the monitor every time it wants to present a frame. The exact timing is less important, because the average framerate will be 23.976 even if frame present times aren't super precise, and dropped/duplicated frames simply won't be a problem at all.
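To make that concrete, here's a minimal sketch of what a video player's presentation loop could look like on a variable-refresh display. This is purely illustrative: decode_next_frame and present_frame are hypothetical placeholder APIs, not a real decoder or driver interface.

```cpp
#include <chrono>
#include <thread>

// Hypothetical decoded frame; pts = presentation timestamp in seconds.
struct Frame { double pts_seconds; /* pixel data omitted for brevity */ };

// Assumed placeholder APIs, not a real decoder/driver interface.
bool decode_next_frame(Frame& out);   // fills the next frame; false at end of stream
void present_frame(const Frame& f);   // triggers an immediate panel refresh

void play_video()
{
    using clock = std::chrono::steady_clock;
    const auto start = clock::now();

    Frame frame;
    while (decode_next_frame(frame)) {
        // Sleep until this frame's timestamp. Small jitter is fine: the panel
        // refreshes whenever we present, so the average rate converges to the
        // content rate (e.g. 23.976fps) with no pulldown or dropped frames.
        std::this_thread::sleep_until(
            start + std::chrono::duration_cast<clock::duration>(
                        std::chrono::duration<double>(frame.pts_seconds)));
        present_frame(frame);
    }
}
```

The key point is that the loop never has to hit a fixed refresh grid; presentation time only has to be right on average.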
tuxRoller - Thursday, June 5, 2014 - link
http://www.forbes.com/sites/jasonevangelho/2014/05...

According to that article, there are four supported ranges: 36-240Hz, 21-144Hz, 17-120Hz, and 9-60Hz. My guess is that the standard, or the AMD hardware, has a limited number of modes it can handle (or these ranges are defined for faster transitions between modes).
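As an aside, a bounded range doesn't necessarily shut out lower content rates. One plausible approach (purely illustrative here, not a documented AMD or VESA algorithm) is for the driver to present each frame an integer number of times so the effective refresh rate lands back inside the supported window:

```cpp
// A panel's variable-refresh window, e.g. {40.0, 60.0} for this prototype.
struct RefreshRange { double min_hz, max_hz; };

// Returns how many times to present each frame so the effective refresh
// rate (repeats * content_fps) falls inside the window; 0 if impossible.
int repeats_for(double content_fps, RefreshRange r)
{
    for (int n = 1; n * content_fps <= r.max_hz; ++n)
        if (n * content_fps >= r.min_hz)
            return n;
    return 0;  // no integer multiple fits; fall back to a fixed refresh rate
}

// Example: repeats_for(24.0, {40.0, 60.0}) == 2 -> 24fps film shown at 48Hz,
// each frame presented twice; repeats_for(50.0, {40.0, 60.0}) == 1.
```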
Gigaplex - Thursday, June 5, 2014 - link
You still need to make sure the frame times sync with the audio, so it isn't entirely super simple.
yasamoka - Friday, June 6, 2014 - link
...which we have been doing since 3D gaming with audio became the norm...

You just need to display frames directly to the monitor as they are completed, instead of combining parts of one frame with parts of others and displaying that combination whenever the monitor has to refresh (every fixed interval).
edzieba - Friday, June 6, 2014 - link
As-they-arrive frame updating actually makes audio sync a LOT easier. If you can assume that 24 frames per second will be displayed as 24 equally spaced frames per second (23.976, you get the idea), rather than the current case of having to double and shift frames to fit into a 60Hz monitor refresh rate, keeping audio in sync is made dramatically simpler.

Same with game audio. You run audio at game logic speed, and display updates happen as soon as a frame is rendered. You don't need to worry about delaying certain sound effects (but not background music or environmental sound) when a frame is late and you have to repeat the last frame to stay in sync with the display refresh.
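A minimal sketch of that simplification: with on-demand refresh, the player can slave video presentation directly to the audio clock. As before, decode_next_frame, audio_clock_seconds, and present_frame are assumed placeholders rather than a real player API.

```cpp
#include <chrono>
#include <thread>

struct Frame { double pts_seconds; /* pixel data omitted */ };

// Assumed placeholders: a decoder, a query for the current playback
// position of the audio stream, and an on-demand panel refresh call.
bool decode_next_frame(Frame& out);
double audio_clock_seconds();
void present_frame(const Frame& f);

void sync_video_to_audio()
{
    Frame frame;
    while (decode_next_frame(frame)) {
        // Wait for the audio clock to reach this frame's timestamp; a real
        // player would compute a sleep target instead of polling like this.
        while (audio_clock_seconds() < frame.pts_seconds)
            std::this_thread::sleep_for(std::chrono::milliseconds(1));
        // No fixed 60Hz grid to hit: no pulldown, no doubled/shifted frames.
        present_frame(frame);
    }
}
```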
Senti - Friday, June 6, 2014 - link
> you need to support as low as ~23 if you want to work with films

Nonsense. 48 (47.952), 72, or 120 (each an integer multiple of 24) work even better than 24 for film refreshes, since in case of a random glitch it would last only a fraction of a frame, unlike a full frame time in 24fps mode.
HisDivineOrder - Friday, June 6, 2014 - link
Eh. Adaptive Sync sounds great, but it's unproven. FreeSync is just AMD's brand name for that. Again, unproven.

Meanwhile, G-Sync is in shipping products. AMD really needed to show up with the fully armed and operational battlestation if they wanted to come out looking great today.
Otherwise, they're just telling us what we already know. In a year or so, we'll see a FEW monitors with Adaptive Sync (not to be confused with Adaptive V-Sync, a dig at nVidia) that are overpriced to account for the "extra certification" and are more broadly compatible than G-Sync. But we won't know how WELL they work until later, because all we're seeing from AMD is a vague proof of concept that doesn't really prove the concept.
Prove the concept by showing us the lower and upper ranges you're advertising, or go home, AMD. nVidia has said for a while now that going as low as AMD is promising is not ideal, and yet there's AMD promising it without being able to actually SHOW it.