AMD Crossfire FreeSync Update: Delayed
by Ryan Smith on April 30, 2015 6:45 PM EST

When AMD launched FreeSync back in March, one of the limitations of the initial release was that only single-GPU configurations were supported. Multi-GPU Crossfire systems could not be used with FreeSync, forcing users to choose between Crossfire and FreeSync. At the time, AMD said that Crossfire FreeSync support would be coming in April; however, as April comes to a close, it has become clear that such a release isn't going to happen.
To that end, AMD has posted a short update on the status of Crossfire FreeSync over on their forums. In the update, AMD states that after QA testing they believe Crossfire FreeSync "is not quite ready for release" and that they will be holding it back as a result. Unfortunately, AMD is not committing to a new release date for the feature, but given that it's more important to get this right than to get it out quickly, this is likely for the best.
Source: AMD
Vayra - Tuesday, May 5, 2015
ULMB for example gets disabled on many a monitor, quite similar to the lack of anti-ghosting on FreeSync...

doggghouse - Wednesday, May 6, 2015
ULMB is disabled on *all* monitors when G-Sync is enabled. However, that's also the case for FreeSync. I'm pretty sure that strobing the backlight is not possible when dealing with variable refresh rates... at least for now.

I think comparing ULMB to pixel overdrive (anti-ghosting) is apples to oranges, anyway.
Alexvrb - Friday, May 1, 2015
Everyone knows he's a rabid Nvidia fan. Wait, I take that back. That statement might be unfair to average, everyday rabid Nvidia fans. There is no changing Chizow's mind or talking any kind of sense into him. He will NEVER agree with anything that isn't pro-Nvidia.

My personal favorite is how he insisted that the R9 285 would throttle like crazy, and that said throttling was completely unacceptable. When it didn't, he refused to admit he was wrong and changed arguments. Then it was proven that some Maxwell cards throttle like crazy even on regular games, and suddenly it wasn't important if a card throttles.
Anyway, he'd be perfectly happy to pay a hefty premium for a proprietary solution that only works on Nvidia cards. Many normal people would prefer a solution which is based on VESA standards and can be implemented by any GPU vendor. That way you can buy one monitor and you aren't locked into a single vendor to use one of its biggest features.
chizow - Saturday, May 2, 2015
LMAO, from Alexvrb, the captain of the AMD turd polishing patrol. Did you personally work on Turdga, btw? I have to imagine so, as that is the only possible explanation for why someone would so vigorously defend such an awful part. There are now rumors that Turdga wasn't even the fully enabled ASIC, so there's a good chance it would've actually used significantly more power without much of a performance increase. But only a massive AMD fanboy like you would consider this a positive for AMD and attempt to draw parallels to Nvidia's Maxwell arch as you did. You didn't have much to say when Nvidia completely killed it with the 970/980, did you?

But yes, just as I stated, the 285's TDP was bogus; in order to avoid boost issues, they needed to slap a 3rd party cooler on it, at which point its TDP was closer to the 280 with barely the performance to match. Truly, only a turd that a die-hard AMD fanboy like you would appreciate (or bother to defend as you did so vigorously).
Alexvrb - Sunday, May 3, 2015
Unlike you, I own products from multiple vendors. My tablet even has a Tegra 3 (which, granted, is due for replacement soon, probably with an Intel-powered model - such an AMD fanboy!). The only reason I appear to be an "AMD fanboy" to you is that you're buried so deep in Nvidia Cult Worship that any time ANYONE disagrees with you it's blasphemy and they must be vigorously attacked with multiple strawman arguments and other such nonsense designed to move the goalposts so you can "win".

chizow - Sunday, May 3, 2015
Actually, I do own products from multiple vendors as well, most recently an R9 290X that was given to me by a vendor to validate, and of course it reinforced the fact that AMD Radeon products simply aren't good enough for my uses at either home or work.

I also looked seriously into getting a Kaveri-based HTPC system, but after adding up the total cost, the need for a chassis and PSU, and the total power usage, it simply did not make sense for such a simple task as a DVR/mediaplex.
And I've said many times to AMD fanboys such as yourself: if the roles were reversed and there had been an AMD logo on any of the Nvidia products I've purchased in the 9 years since G80, I would've bought AMD without hesitation. Can you say the same? Of course not, because you buy inferior AMD products as any loyal fanboy would.
But you seem confused: you appear to be an AMD fanboy because you defend their most egregiously bad products, plain and simple. I mean, anyone who bothers to defend Turdga as vigorously as you did is going to set off fanboy alarms anywhere, even at AMDZone HQ.
chizow - Saturday, May 2, 2015
And of course I'd pay a premium for something that actually worked and did all it claimed it would. Who would pay a hefty premium for a monitor (AMD FreeSync panels still carry a premium) that didn't do all it said it would and actually performed WORSE than a standard panel in many use cases?

Oxford Guy - Sunday, May 3, 2015
Because we can all thank Nvidia for Mantle/Vulkan/DX12.

Oh, that's right... they were too busy coming up with 28 GB/s VRAM partition schemes.
chizow - Sunday, May 3, 2015
You're right, we have the consoles to thank for low-level APIs, and Microsoft in particular for DX12.

And a 28GB/s VRAM partition scheme on a part that completely decimated AMD's product stack was certainly welcome, even for AMD fans, who could find their favorite cards for a fraction of the price, thanks to Nvidia.
Oxford Guy - Sunday, May 3, 2015
If you're not even going to bother with the pretense of objectivity and rationality then don't reply to my comments.