http://www.anandtech.com/show/11243/apple-developing-custom-gpu-dropping-imagination

Apple currently uses modified Imagination GPUs in their iDevices. They use some combination of Intel, AMD, and Nvidia GPUs in their laptops and desktops.
It will be interesting to see how far Apple plans to go with this. Apparently they're going to drop Imagination entirely. But what about the laptop and desktop form factors? Will Apple build bigger GPUs with enough performance for those form factors, or will they stay with external GPU vendors indefinitely? If they use their own GPUs entirely, that would probably be the end of dual-booting Windows, unless Apple decides to make a Windows driver for their GPUs.
Apple already builds their own CPUs for iPads and iPhones, but uses Intel CPUs for laptops and desktops. Pairing a CPU from one vendor with a GPU from another is rather awkward if you're going to use integrated graphics, as many Apple products do. There have long been rumors that Apple would use their own CPU cores in place of Intel's in higher performance situations. That may or may not happen, but having their own GPU that they can integrate would help if they do want to go that route.
Still, there's a big difference between having a good enough integrated GPU for a low power laptop and having a high performance GPU suitable for the multi-GPU version of the Mac Pro. Apple insists that they're not abandoning the Mac Pro. Of course, it's also possible to use their own GPU when they need an integrated GPU and have a discrete card from AMD or Nvidia when more performance is called for.
As Apple provides about half of Imagination's licensing revenue, this is devastating for Imagination and could plausibly even be the end of the company. Imagination's stock dropped by more than 60% almost immediately after the news broke that they were losing Apple as a customer.
Comments
¯\_(ツ)_/¯
Also worth noting: I read a report yesterday that Apple has ordered 75M OLED screens from Samsung for the upcoming iPhone 8.
And what bothers me most is that, given the above two things, Apple sues a lot of companies for patent infringement. The nerve they have when they're not making anything themselves (anymore) but just assembling stuff, much like a toddler with his LEGO bricks...
I think Apple will continue using Intel, AMD, and nVidia GPUs for their laptops and desktops. They will build their GPU design on a licensed architecture, like they did with the mobile CPU. Considering the complexity of GPUs and that they are not a part of the GPU patent cross-licensing partnerships, I don't think they can design their own desktop/laptop GPUs that are very powerful, much like old Macs with their proprietary GPUs. If they do manage to make a powerful desktop/laptop GPU, they will probably face several years of legal challenges.
So they're looking to try to replicate their previous horrifically bad business failings and drive the company into the ground?
Sounds good to me.
"The surest way to corrupt a youth is to instruct him to hold in higher esteem those who think alike than those who think differently."
- Friedrich Nietzsche
I honestly think they don't want to increase their market share much or they'd risk losing their luxury label. They have the highest profits of all tech companies...
Heck, the transition from IBM/Motorola PPC to Intel seemed to happen overnight; it took Apple just nine months to transition every product from PPC to Intel. Apple even helped develop the PowerPC architecture (it was a consortium called AIM - Apple/IBM/Motorola), and when they switched from 68k to PPC in 1994, it was a huge deal, as it was a huge leap forward in performance, at one point even being touted as "The World's First Personal Supercomputer" (the G4 was the first consumer CPU able to perform >1 GFLOPS).
Apple switched to Intel in 2006, with the first of the Core brand. Reportedly the transition was driven by the PPC vendors' inability to stick to their roadmap (they were never able to go past 3GHz while Apple used them, and there were long gaps between generations), and Intel's breakthrough in performance-per-watt with Core.
I know a lot of people here think that Boot Camp is a major selling point for Apple, but I still fail to see it, especially with the availability of good hypervisors/virtualization. So I don't think Windows compatibility is even a consideration in Apple's mind.
The Ax line does license Imagination technology, but it's still very much an Apple design. Just like they license ARM, but the Ax is still unique when compared to say, Snapdragon or Exynos.
Apple will still need patents from somewhere to build a GPU - not many people have them. It's entirely likely that in all the small acquisitions they do, that they think they have enough to cover their product without Imagination. And the patents you need to build a good mobile GPU don't necessarily translate to what is needed for a good desktop GPU - there's a good reason we don't have Imagination, Adreno or Mali in competition with AMD and nVidia on the desktop. The opposite holds true as well, you don't exactly see a lot of GCN or nVidia in mobile either (as much as nVidia is certainly trying).
As to Apple doing their own GPUs, look at how long Intel has tried that, and their GPUs still suck. They could do it for low-end devices, but it's not going to happen on Macs.
"Be water my friend" - Bruce Lee
The ~servers~ are not Macs - although Apple did have a rackmount server for a period of time, Apple has never been a big player in the server arena. Web/Internet servers are mostly Linux, a decent bit of Windows Server, and some other odds and ends in there at the fringes. Those run on anything from Raspberry Pi to a datacenter with rooms full of racks and everything in between.
The ~development~ takes a lot of different shapes, and there Apple is pretty popular. Graphics/image work is done almost exclusively on Apple machines (go ahead and say graphics aren't web development, then show me how many popular web pages don't load a graphics file somewhere). A lot of code development is done on Apple as well. You really don't need monster workstations for most internet app development, as the code really runs and is stressed on the server; you just need a development platform that makes communication with the server easy and can edit text.
With the rise in mobile apps, that is probably where Apple sells most of their machines for development, since iOS development needs OS X, and OS X can also be used as an Android development platform. That isn't necessarily the same thing as network application development though.
So while a JavaScript library really needs to run on the target browser (since JavaScript usually runs on the client, not the server, and browsers can be as much or as little OS-dependent as they like), the development tools don't, and I don't mean to speak for Torval, but that's what I understood him to be saying.