The Future: Competition, Secrecy, & the Unexpected

Finally, while Apple developing their own GPU is not unexpected given their interests and resources, the ramifications of it may very well be. There hasn’t been a new, major GPU vendor in almost a decade – technically Qualcomm’s team would count as the youngest, though it’s a spin-off of what’s now AMD’s Radeon Technologies Group – and in fact like the overall SoC market itself, the market for GPU vendors has been contracting as costs go up and SoC designers settle around fewer, more powerful GPU vendors. So for someone as flush with cash as Apple to join the GPU race is a very big deal; just by virtue of starting development of their own GPU, they are now the richest GPU designer.

Of course, once they start shipping their custom GPU, this will also open them up to patent challenges from those other players. While it has largely stayed on the back burner of public attention, this decade has seen a few GPU vendors take SoC vendors to court. This includes NVIDIA's suit against Samsung and Qualcomm (a case NVIDIA lost), while AMD's case against LG, MediaTek, Sigma, and Vizio is still ongoing.

GPU development is a lot more competitive because developers and compiled programs aren't tied to a specific architecture – the abstraction of the APIs insulates against individual architectures. However, it also means that there are a lot of companies developing novel technologies, and all of those companies are moving in the same general direction with their designs. This potentially makes it very difficult to develop an efficient GPU, as the best means of achieving that efficiency have often already been patented.
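
To put that abstraction in concrete terms, consider what a developer actually ships: API calls and portable shader source, nothing more. The short Swift/Metal sketch below is purely illustrative – the kernel and its name are invented for this example – but it shows the principle: the application never targets a GPU's instruction set, and it's the vendor's driver that compiles the portable code into whatever the installed GPU actually executes.

import Metal

// A deliberately trivial compute kernel. The app ships portable MSL source and
// portable API calls; it never sees or targets the GPU's native instruction set.
// (The kernel and its name are made up purely for illustration.)
let shaderSource = """
#include <metal_stdlib>
using namespace metal;

kernel void scale_by(device float *data     [[buffer(0)]],
                     constant float &factor [[buffer(1)]],
                     uint id                [[thread_position_in_grid]]) {
    data[id] *= factor;
}
"""

// Whatever Metal-capable GPU is present – PowerVR, Apple, AMD, Intel, or NVIDIA –
// the same calls work; the vendor's driver handles the translation to native code.
let device = MTLCreateSystemDefaultDevice()!
let library = try! device.makeLibrary(source: shaderSource, options: nil)
let pipeline = try! device.makeComputePipelineState(
    function: library.makeFunction(name: "scale_by")!)
print("Compiled for \(device.name); max threads per threadgroup: \(pipeline.maxTotalThreadsPerThreadgroup)")

The upshot is that nothing an application ships identifies the underlying GPU architecture, which is what lets an SoC vendor swap GPU designs out from under developers – and, as we'll get to in a moment, what makes it so easy to keep a GPU's internals a black box.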

What exists then is an uneasy balance between GPU vendors, and a whole lot of secrets. AMD and NVIDIA keep each other in check with their significant patent holdings, Intel licenses NVIDIA patents, etc. And on the flip side of the coin, some vendors like Qualcomm simply don’t talk about their GPUs, and while this has never been stated by the company, the running assumption has long been that they don’t want to expose themselves to patent suits. So as the new kid on the block, Apple is walking straight into a potential legal quagmire.

Unfortunately, I suspect this means that we'll be lucky to get any kind of technical details out of Apple on how their GPUs work. They can't fully hide how their CPUs work due to how program compilation works (which is why we know as much as we do), but the abstraction provided by graphics APIs makes it very easy to hide the inner workings of a GPU and make it a black box. Even when we know how something works, features and implementation details can be hidden right under our noses.

Ultimately today's press release is a bit bittersweet for all involved in the industry. On the one hand, it absolutely puts Imagination, a long-time GPU developer, on the back foot. That's not to spell doom and gloom, but the company will have to work very hard to make up for losing Apple. On the other hand, with a new competitor in the GPU space – albeit one we've been expecting – it's a sign that things are about to get very interesting. If nothing else, Apple enjoys throwing curveballs, so expect the unexpected.

144 Comments

  • tipoo - Monday, April 3, 2017 - link

    That would be cool, the VR focus. Hopefully they increase screen resolutions at the same time; VR is atrocious on my 6S (half of 750p per eye up close is not pretty)
  • BillBear - Tuesday, April 4, 2017 - link

    There is some evidence that Apple has already switched the compute core of their current GPU over to their own design, as previously theorized by the same guy who figured out that NVIDIA had switched over to tile-based rendering.

    http://www.realworldtech.com/apple-custom-gpu/

    He goes into the difference between Imagination's compute engine, which handles 16-bit math by running it at 32 bits and then ignoring the other 16 bits you don't care about, and Apple's compute engine, which handles 16-bit math natively (see the FP16 sketch after the comments).

    Now what other heavy workload for mobile devices already uses the hell out of 16-bit math?

    The sort of deep learning AI algorithms that Apple prefers to run locally on your mobile devices for privacy reasons.
  • Meteor2 - Wednesday, April 5, 2017 - link

    Ooo, good spot!
  • Glaurung - Monday, April 3, 2017 - link

    So basically Apple has decided they can do phone/tablet GPUs better if they do them in-house. And I wonder what this means for their continued reliance on Intel/Nvidia/AMD for Mac GPUs?

    Are they going to make something that they can use in future Mac designs as well as in phones? Switching Mac CPUs to an Apple-made design has a big problem because of all the Intel-only software out there, and all the Mac users depending on x86 compatibility for VMs or Boot Camp.

    But there's not nearly the same degree of lock-in for GPU architecture. If they come up with something better (i.e., more power-efficient and fast enough) than Intel's GPUs, or better (for pro apps, they don't give a damn about games) than AMD/Nvidia, then that would be an even more interesting shakeup.
  • Eyered - Monday, April 3, 2017 - link

    Apple's new GPU would have to be top notch. AMD has Vega and Nvidia has Volta coming soon, both of which will be crazy powerful and efficient compared to what we have today. It would be a tough road to get to the point that they could compete. I'd be more than happy for the extra competition in the GPU market. Well, I'm guessing it would be locked down to a Mac, though.

    As for the x86 stuff, it's here to stay I think. I wonder how much needing to have an x86 CPU holds us back.
  • epobirs - Monday, April 3, 2017 - link

    Microsoft is adding x86 support to Windows 10 on ARM. Apple should be able to do the same with macOS.
  • loekf - Tuesday, April 4, 2017 - link

    AFAIK, the Darwin kernel (or OS X itself) already supports 64-bit ARM as an architecture. Guess it's just for Apple's internal usage.

    Still, I doubt that Apple would do another architecture change for OS X, after PPC to x86 a while ago.
  • name99 - Tuesday, April 4, 2017 - link

    Jesus, the ignorance. What kernel do you think iOS uses?
  • willis936 - Monday, April 3, 2017 - link

    Hopefully Imagination's lawyers know their business. Apple engineers stare at IP that a company owns and sells, then say "hey, we can do that without paying for the IP". Apple will need to be extremely careful with their design if they don't want to get sued to the moon.
  • tipoo - Monday, April 3, 2017 - link

    It would be a lot more interesting to me if they scaled this up to the Mac. Then there would be some new blood in the graphics arena, even if the third party was Apple and locked down to Apple's hardware. But I wonder how much better they could do than Nvidia on efficiency, if any.
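
Going back to BillBear's FP16 observation above, the software side of "native 16-bit math" is easy to sketch. The Swift/Metal example below is purely illustrative – the kernels are invented, and nothing here reflects Apple's actual compute hardware – but it shows the distinction being described: half in Metal is a 16-bit float, and whether it executes at a higher rate or is simply promoted to FP32 and run at FP32 speed is entirely a property of the GPU sitting behind the API.

import Metal

// Two versions of the same trivial kernel, one in FP32 and one in FP16 (half).
// On a GPU with native FP16 ALUs – what the comment above attributes to Apple's
// design – the half version can run at a higher rate with lower register and
// bandwidth pressure; on hardware that merely promotes FP16 to FP32 it runs at
// FP32 speed. Kernel names and the workload are made up for illustration.
let precisionSource = """
#include <metal_stdlib>
using namespace metal;

// FP32: every multiply occupies a full 32-bit lane.
kernel void mul_fp32(device const float *a [[buffer(0)]],
                     device const float *b [[buffer(1)]],
                     device float *out     [[buffer(2)]],
                     uint id               [[thread_position_in_grid]]) {
    out[id] = a[id] * b[id];
}

// FP16: the kind of low-precision math neural network inference leans on.
kernel void mul_fp16(device const half *a [[buffer(0)]],
                     device const half *b [[buffer(1)]],
                     device half *out     [[buffer(2)]],
                     uint id              [[thread_position_in_grid]]) {
    out[id] = a[id] * b[id];
}
"""

// Both variants compile and dispatch identically; the only difference the app
// sees is the data type it uploads. Whether half is actually faster is
// entirely up to the hardware behind the API.
let device = MTLCreateSystemDefaultDevice()!
let library = try! device.makeLibrary(source: precisionSource, options: nil)
let fp32Pipeline = try! device.makeComputePipelineState(function: library.makeFunction(name: "mul_fp32")!)
let fp16Pipeline = try! device.makeComputePipelineState(function: library.makeFunction(name: "mul_fp16")!)
print("Built FP32 and FP16 pipelines for \(device.name)")

From the application's perspective the two variants dispatch identically; the only visible difference is the data type uploaded, which is why FP16-heavy workloads like the on-device inference BillBear mentions map so naturally onto hardware with genuine FP16 execution.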
