This is something that caught me off-guard when I first realized it, but AMD historically hasn’t liked to talk about their GPU plans much in advance. On the CPU side we’ve heard about Carrizo and Zen years in advance, and AMD’s competitor in the world of GPUs, NVIDIA, releases basic architectural information over a year ahead of time as well. With AMD’s GPU technology, however, we typically don’t hear about new developments until the first products implementing them launch.

With AMD’s GPU assets having been reorganized under the Radeon Technologies Group (RTG) and led by Raja Koduri, RTG has recognized this as well. As a result, the new RTG is looking to chart a somewhat different course, aiming to be more transparent and more forthcoming than the group has been in the past. The end result isn’t quite like what AMD has done with their CPU division or what their competition has done with GPU architectures – RTG will say more or less depending on the subject – but among the several major shifts in appearance, development, and branding we’ve seen since the formation of the RTG, this is another way in which RTG is trying to set itself apart from AMD’s earlier GPU groups.

As part of AMD’s RTG technology summit, I had the chance to sit down and hear about RTG’s plans for their visual technologies (displays) group for 2016. Though RTG isn’t announcing any new architectures or chips at this time, the company has put together a roadmap for what it wants to do with both hardware and software for the rest of 2015 and into 2016. Much of what follows isn’t likely to surprise regular observers of the GPU world, but it nonetheless sets some clear expectations for what is in RTG’s future over much of the next year.

DisplayPort 1.3 & HDMI 2.0a: Support Coming In 2016

First and foremost then, let’s start with RTG’s hardware plans. As I mentioned before, RTG isn’t announcing any new architectures, but they are announcing some of the features that the 2016 Radeon GPUs will support. Among these changes is a new display controller block, upgrading the display I/O functionality that has been a cornerstone of AMD’s GPU designs since GCN 1.1 first launched in 2013.

The first addition here is that RTG’s 2016 GPUs will include support for DisplayPort 1.3. We’ve covered DisplayPort 1.3 separately in the past, when the VESA announced the finalized standard in 2014. DisplayPort 1.3 introduces a faster signaling mode – High Bit Rate 3 (HBR3) – which in turn allows DisplayPort 1.3 to offer 50% more bandwidth than the current DisplayPort 1.2 and HBR2, boosting DisplayPort’s raw bandwidth to 32.4 Gbps before overhead.

DisplayPort Supported Resolutions
Standard                  | Max Resolution (RGB/4:4:4, 60Hz) | Max Resolution (4:2:0, 60Hz)
DisplayPort 1.1 (HBR1)    | 2560x1600                        | N/A
DisplayPort 1.2 (HBR2)    | 3840x2160                        | N/A
DisplayPort 1.3 (HBR3)    | 5120x2880                        | 7680x4320

The purpose of DisplayPort 1.3 is to offer the additional bandwidth necessary to support higher resolution and higher refresh rate monitors than the 4K@60Hz limit of DP1.2. This includes supporting higher refresh rate 4K monitors (120Hz), 5K@60Hz monitors, and 4K@60Hz with color depths greater than 8 bits per channel (necessary for a good HDR implementation). DisplayPort’s scalability via tiling has meant that some of these monitor configurations have been possible even with DP1.2 by utilizing MST over multiple cables; however, with DP1.3 it will now be possible to support those configurations in a simpler SST configuration over a single cable.
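
To put some rough numbers on this, below is a quick back-of-the-envelope sketch comparing the approximate bandwidth a few modes need against the effective (post-8b/10b) capacity of HBR2 and HBR3 links. The ~5% blanking allowance and the small helper function are my own illustrative assumptions rather than exact VESA timings, so treat the output as ballpark figures only.

```python
# Back-of-the-envelope link budget: approximate bandwidth needed by a video mode
# versus the effective payload rate of a 4-lane DisplayPort link after 8b/10b
# encoding. Blanking overhead is assumed at ~5% (roughly CVT reduced blanking);
# this is an illustration, not a VESA-exact timing calculator.

DP_EFFECTIVE_GBPS = {
    "DP 1.2 (HBR2)": 4 * 5.4 * 0.8,  # 17.28 Gbps usable
    "DP 1.3 (HBR3)": 4 * 8.1 * 0.8,  # 25.92 Gbps usable
}

def mode_gbps(width, height, refresh_hz, bits_per_pixel=24, blanking=1.05):
    """Approximate bandwidth (Gbps) a mode needs, including assumed blanking."""
    return width * height * refresh_hz * bits_per_pixel * blanking / 1e9

modes = [
    ("3840x2160 @ 60Hz,  8bpc RGB", mode_gbps(3840, 2160, 60)),
    ("3840x2160 @ 120Hz, 8bpc RGB", mode_gbps(3840, 2160, 120)),
    ("3840x2160 @ 60Hz, 12bpc RGB", mode_gbps(3840, 2160, 60, bits_per_pixel=36)),
    ("5120x2880 @ 60Hz,  8bpc RGB", mode_gbps(5120, 2880, 60)),
]

for name, need in modes:
    fits = [link for link, cap in DP_EFFECTIVE_GBPS.items() if need <= cap]
    print(f"{name}: ~{need:.1f} Gbps -> {', '.join(fits) or 'neither'}")
```

Even with generous rounding, 4K@120Hz, deep-color 4K, and 5K@60Hz all land above what HBR2 can carry over a single cable but within HBR3’s budget, which is exactly the gap DP1.3 is meant to close.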

For RTG this is important on several levels. The first is very much pride – the company has always been the first GPU vendor to implement new DisplayPort standards. But at the same time DP1.3 is the cornerstone of multiple other efforts for the company. The additional bandwidth is necessary for the company’s HDR plans, and it’s also necessary to support the wider range of refresh rates at 4K required for RTG’s FreeSync Low Framerate Compensation (LFC) tech, which requires a maximum refresh rate at least 2.5 times the minimum in order to function. That in turn has meant that while RTG has been able to apply LFC to 1080p and 1440p monitors today, they won’t be able to do so with 4K monitors until DP1.3 gives them the bandwidth necessary to support 75Hz+ operation.
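
To illustrate why that ratio matters, here is a minimal sketch of the frame-repetition idea behind LFC. The 30Hz floor, the refresh windows, and the simple repetition loop are assumptions for illustration only, not RTG’s actual algorithm or any panel’s real specifications.

```python
# Minimal sketch of the frame-repetition idea behind LFC: when the framerate
# falls below the panel's variable refresh floor, frames are repeated 2x/3x/...
# so the effective refresh lands back inside the supported window. This is a
# simplification for illustration, not RTG's actual algorithm; in practice the
# window needs headroom around the boundary, hence 2.5:1 rather than a bare 2:1.

LFC_RATIO = 2.5

def lfc_refresh(fps, min_hz, max_hz):
    """Effective refresh LFC would drive for a given framerate, or None."""
    if fps >= min_hz:
        return fps  # inside the native variable refresh window, no repetition
    multiple = 2
    while fps * multiple < min_hz:
        multiple += 1
    target = fps * multiple
    return target if target <= max_hz else None

# A 30-75Hz window (the 75Hz+ case DP1.3 enables at 4K) always finds a multiple:
for fps in (20, 25, 28):
    print(f"30-75Hz panel, {fps} fps -> {lfc_refresh(fps, 30, 75)} Hz")

# A narrow 40-60Hz window (a 1.5:1 ratio) leaves a hole where no multiple fits:
print(f"40-60Hz panel, 35 fps -> {lfc_refresh(35, 40, 60)}")

# With an assumed 30Hz floor, the 2.5:1 rule implies the refresh ceiling a 4K
# link would need to carry:
print(f"Required ceiling for a 30Hz floor: {LFC_RATIO * 30:.0f} Hz")
```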

Meanwhile DisplayPort 1.3 isn’t the only I/O standard planned for RTG’s 2016 GPUs. Also scheduled for 2016 is support for HDMI 2.0a, the latest generation of the HDMI standard. HDMI 2.0 was launched in 2013 as an update to the HDMI standard, significantly increasing HDMI’s bandwidth to support 4Kp60 TVs and bringing it roughly on par with DisplayPort 1.2 in terms of total bandwidth. Along with the increase in bandwidth, HDMI 2.0/2.0a also introduced support for other new features in the HDMI specification such as the next-generation BT.2020 color space, 4:2:0 chroma sampling, and HDR video.

That HDMI has only recently caught up to DisplayPort 1.2 in bandwidth at a time when DisplayPort 1.3 is right around the corner is one of those consistent oddities in how the two standards are developed, but nonetheless this is important for RTG. HDMI is not only the outright standard for TVs, but it’s the de facto standard for PC monitors as well; while you can find DisplayPort in many monitors, you would be hard pressed not to find HDMI. So as 4K monitors become increasingly cheap – and likely start dropping DisplayPort in the process – supporting HDMI 2.0 will be important for RTG for monitors just as much as it is for TVs.

Unfortunately for RTG, they’re playing a bit of catch-up here, as the HDMI 2.0 standard is already more than 2 years old and has been supported by NVIDIA since the Maxwell 2 architecture in 2014. Though they didn’t go into detail, I was told that AMD/RTG’s plans for HDMI 2.0 support were impacted by the cancellation of the company’s 20nm planar GPUs, and as a result HDMI 2.0 support was pushed back to the company’s 2016 GPUs. The one bit of good news here for RTG is that HDMI 2.0 is still a bit of a mess – not all HDMI 2.0 TVs actually support 4Kp60 with full chroma sampling (4:4:4) – but that is quickly changing.
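
To put the chroma sampling caveat in perspective, the rough comparison below shows what 4Kp60 needs at 4:4:4, 4:2:2, and 4:2:0 against approximate effective HDMI link rates. The link capacities and the simple bits-per-pixel model are my own ballpark assumptions; note too that 4:2:0 signaling is only defined as of HDMI 2.0, even though it happens to fit within HDMI 1.4-class bandwidth.

```python
# Rough comparison: bandwidth 4Kp60 needs at different chroma samplings versus
# approximate effective HDMI link rates (TMDS rate x 0.8 for 8b/10b encoding).
# Uses the standard 594 MHz 4Kp60 timing (4400 x 2250 x 60Hz); figures are
# illustrative estimates, not an exact HDMI timing calculation.

PIXEL_CLOCK_HZ = 594e6  # CEA-861 4Kp60 timing

# Bits carried per pixel at 8 bits per component
BITS_PER_PIXEL = {"4:4:4": 24, "4:2:2": 16, "4:2:0": 12}

# Approximate usable bandwidth in Gbps after 8b/10b overhead
LINKS = {
    "HDMI 1.4-class (10.2 Gbps TMDS)": 10.2 * 0.8,
    "HDMI 2.0 (18 Gbps TMDS)": 18.0 * 0.8,
}

for fmt, bpp in BITS_PER_PIXEL.items():
    need = PIXEL_CLOCK_HZ * bpp / 1e9
    fits = [name for name, cap in LINKS.items() if need <= cap]
    print(f"4Kp60 {fmt}: ~{need:.2f} Gbps -> fits {', '.join(fits) or 'neither'}")
```

Full 4:4:4 only just squeezes into HDMI 2.0’s budget, which is why early sets built around slower TMDS receivers accept 4Kp60 only at 4:2:0.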

FreeSync Over HDMI to Hit Retail In Q1’16
Comments (99)

  • Azix - Wednesday, December 9, 2015 - link

    GCN is just a name. It does not mean there aren't major improvements. Nvidia constantly changing their architecture name is not necessarily an indication it's better; it's usually the same improvements over an older arch.

    Also it seems AMD is ahead of the game with GCN and nvidia is playing catchup, having to cut certain things out to keep up.
  • BurntMyBacon - Thursday, December 10, 2015 - link

    @Azix: "Also it seems AMD is ahead of the game with GCN and nvidia is playing catchup, having to cut certain things out to keep up."

    I wouldn't go that far. nVidia is simply focused differently than ATi at the moment. ATi gave up compute performance for gaming back in the 6xxx series and brought compute performance back with the 7xxx series. Given AMD's HSA initiative, I doubt we'll see them make that sacrifice again.

    nVidia on the other hand decided to do something similar going from Fermi to little Kepler (6xx series). They brought compute back to some extent for big Kepler (high end 7xx series), but dropped it again for Maxwell. This approach does make some sense as the majority of the market at the moment doesn't really care about DP compute. The ones that do can get a Titan, a Quadro, or if the pattern holds, a somewhat more DP capable consumer grade card once every other generation. On the other hand, dropping the DP compute hardware allows them to more significantly increase gaming performance at similar power levels on the same process. In a sense, this means the gamer isn't paying as much for hardware that is of little or no use to gaming.

    At the end of the day, it is nVidia that seems to be ahead in the here and now, though not by as much as some suggest. It is possible that ATi is ahead when it comes to DX12 gaming and their cards may age more gracefully than Maxwell, but that remains to be seen. More important will be how the tech plays out when DX12 games are common. Even then, I don't think that Maxwell will have as much of a problem as some fear given that DX11 will still be an option.
  • Yorgos - Thursday, December 10, 2015 - link

    What are the improvements that Maxwell offers?
    That it can run the binary blobs from crapworks better?
    e.g. lightning effects http://oi64.tinypic.com/2mn2ds3.jpg
    or efficiency
    http://www.overclock.net/t/1497172/did-you-know-th...
    or 3.5 GB vram,
    or an obsolete architecture for the current generation of games (which has already started)

    Unless you have money to waste, there is no other option in the GPU segment.
    A lot of GTX 700 series owners say so (amongst others).
  • Michael Bay - Thursday, December 10, 2015 - link

    I love how you conveniently forgot to mention not turning your case into a working oven, and then grasped sadly for muh 3.5 GBs as if it matters in the real world with overwhelming 1080p everywhere.

    And then there is the icing on the cake in the form of a hopeless wail about the "current generation of games (which has already started)". You sorry amdheads really don't see irony even if it hits you in the face.
  • slickr - Friday, December 11, 2015 - link

    Nvidia still pretty much uses the same techniques/technology as they had in their old 600 series graphics. Just because they've named the architecture differently doesn't mean it actually is different.

    AMD's major architectural change will be in 2016 when they move to 14nm FinFET, as will Nvidia's when they move to 16nm FinFET.

    AMD already has design elements like HBM in their current GCN Fiji architecture that they can more easily implement for their new GPUs, which are supposed to start arriving in late Q2 2016.
  • Zarniw00p - Tuesday, December 8, 2015 - link

    FS-HDMI for game consoles, DP 1.3 for Apple who would like to update their Mac Pros with DP 1.3 and 5k displays.
  • Jtaylor1986 - Tuesday, December 8, 2015 - link

    This is a bit strange since almost the entire presentation depends on action by the display manufacturer industry and industry standard groups. I look forward to them announcing what they are doing in 2016, not what they are trying to get the rest of the industry to do in 2016.
  • bug77 - Tuesday, December 8, 2015 - link

    Well, most of the stuff a video card does depends on action by the display manufacturer...
  • BurntMyBacon - Thursday, December 10, 2015 - link

    @Jtaylor1986: "This is a bit strange since almost the entire presentation depends on action by the display manufacturer industry and industry standard groups."

    That's how it works when you want a non-proprietary solution that allows your competitors to use it as well. ATi doesn't want to give away their tech any more than nVidia does. However, they also realize that Intel is the dominant graphics manufacturer in the market. If they can get Intel on board with a technology, then there is some assurance that the money invested isn't wasted.

    @Jtaylor1986: "I look forward to them announcing what they are doing in 2016, not what they are trying to get the rest of the industry to do in 2016."

    Point of interest: It is hard to get competitors to work together. They don't just come together and do it. Getting Acer, LG, and Samsung to standardize on a tech (particularly a proprietary one) means that there has already been significant effort invested in the tech. Also, getting Mstar, Novatek, and Realtek to make compatible hardware is similar to getting nVidia and ATi or AMD and Intel to do the same. IBM forced Intel's hand. Microsoft's DirectX standard forced ATi and nVidia (and 3DFX for that matter) to work in a compatible way.

    Beyond that, it isn't as if ATi is doing nothing. It is simply that their work requires cooperation from all parties involved. Cooperation that they've apparently obtained. This is what is required when you think about the experience beyond just your hardware. nVidia does something similar with their Tesla partners, The Way It's Meant to Be Played program, and CUDA support. Sure, they built the chip themselves with G-Sync, but they still had to get support from monitor manufacturers to get the chip into a monitor.
  • TelstarTOS - Thursday, December 10, 2015 - link

    and the display industry has been EXTREMELY slow at producing BIG, quality displays, not even counting 120 and 144Hz. I don't see them embracing HDR (10-12 bit panels) any time soon :(
    My monitor bought last year is going to last a while, until they catch up with the video card makers.
