There have been some rumors lately that Intel was going to license AMD graphics technology. It perhaps started at HardOCP last December:
http://www.hardocp.com/news/2016/12/06/amd_licensing_radeon_graphics_to_intel

And now, he says that it's not merely going to happen, but will happen this year:
https://hardforum.com/threads/from-ati-to-amd-back-to-ati-a-journey-in-futility-h.1900681/page-72#post-1042797289

So does this make sense? If it's a one-time thing, I don't think it does. But if it lets Intel shut down their GPU division entirely, that's a different story altogether.
Intel has spent a ton of money over the years to develop their own GPU architectures. It's not just fabricating the chips that is the problem. It costs a lot of money to design the architecture, make particular chips using it and fix problems, license patents from the major GPU vendors, and write drivers. Intel has spent a lot of money doing this, with results ranging from rather bad to shockingly awful.
If they license someone else's GPU, all of those development costs go away. They probably get a much better product, and provided that they pay substantially less than their own development costs would have been but substantially more than nothing (so that whoever sells the GPU makes money, too), it can be a win-win. Intel did this in the past, licensing an Imagination GPU for one generation of Atom CPUs.
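To make that win-win concrete, here's a toy calculation. Every dollar figure below is invented purely for illustration; none of the real numbers are public.

```python
# Toy model of the licensing economics. Every dollar figure here is
# invented for illustration; none of the real numbers are public.
intel_inhouse_gpu_cost = 1_000_000_000  # hypothetical annual in-house GPU R&D spend
license_fee = 400_000_000               # hypothetical annual fee paid to the GPU vendor
vendor_extra_cost = 100_000_000         # hypothetical extra cost the vendor incurs

intel_savings = intel_inhouse_gpu_cost - license_fee
vendor_profit = license_fee - vendor_extra_cost

# Both sides come out ahead whenever the fee lands between the vendor's
# marginal cost and Intel's own development cost.
print(f"Intel saves ${intel_savings:,}; the GPU vendor nets ${vendor_profit:,}")
```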
Plenty of cell phone chip manufacturers license a GPU, generally from ARM or Imagination. And that's in the same SoC as the CPU, even. The rumor here is a multi-chip module, with an Intel CPU and an AMD GPU as separate chips in the same package. Intel has done this in the past with Clarkdale, and also had multi-chip modules for pure CPU products, most notably the Core 2 Quad. AMD has also done multi-chip modules with a number of CPUs, generally all of their high end server chips from Magny-Cours onward.
So why license AMD's GPU in particular? The markets Intel CPUs target tend to ask for higher GPU performance than cell phones provide. AMD and Nvidia are the only two proven GPU vendors for high performance GPUs, though Imagination would probably claim that they could offer the performance Intel needs, too. AMD and Nvidia also happen to conveniently write drivers for Windows and Linux, so there's virtually no additional driver creation cost. So why AMD and not Nvidia? I don't know, but it could plausibly be that AMD offered a better price.
So why would AMD do this? Don't they want a GPU advantage to drive sales of Raven Ridge? Suppose that you're AMD and you have a choice: make $50 in profit on every CPU you sell and nothing on every CPU Intel sells, or make $50 on every CPU you sell plus $5 on every CPU Intel sells. Even if the deal means you end up with 12% market share rather than 15% because some of the people who care about the integrated GPU buy Intel, you still come out ahead. This could also greatly reduce AMD's risk, keeping them in business even if their CPU side has another hiccup. AMD has been actively looking for ways to monetize their GPU IP, and this is one.
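A quick sanity check of that arithmetic, using the hypothetical $50/$5 figures and the 15%/12% market-share numbers from the paragraph above:

```python
# Sanity check of the hypothetical above, per 100 CPUs sold market-wide.
# The $50 and $5 figures and the 15%/12% shares are the ones from the text.
profit_per_amd_cpu = 50    # AMD's profit on each CPU it sells (hypothetical)
royalty_per_intel_cpu = 5  # AMD's cut on each Intel CPU (hypothetical)

no_deal = 15 * profit_per_amd_cpu                                 # 15 AMD CPUs, no royalty
with_deal = 12 * profit_per_amd_cpu + 88 * royalty_per_intel_cpu  # 12 AMD + 88 Intel

print(f"No deal:   ${no_deal} per 100 CPUs sold")    # $750
print(f"With deal: ${with_deal} per 100 CPUs sold")  # $1040
```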
Does Intel want to be dependent on AMD for their GPUs? Of course not, but even with this deal, they wouldn't be over the long term, even if they shut down their own GPU division. A multi-chip module means that you could license an AMD GPU one generation and an Nvidia GPU the next, or ARM or Imagination or Qualcomm. Game consoles have no problem bouncing between GPU vendors from one generation to the next. Apple does the same thing.
Another possibility that I'd like to introduce as my own speculation is that it could be Apple driving this. Apple has long been unhappy with Intel GPUs and AMD CPUs, for obvious enough reasons. If they threatened to ditch x86 in favor of ARM unless something like this happened, it could provide the impetus to get something done. Of course, if that's what happened, the chip might end up being Mac-exclusive.
It's all speculation and rumors at this point, so I don't know if this is actually going to happen. But it will be interesting to see if it does.
Comments
I'd be surprised if Apple were the driving force here, but it wouldn't be the first time (Crystalwell). Apple could just about jump the Intel ship entirely and move to their own CPU without a huge impact on typical productivity if they wanted to. And OS X machines account for pretty much a rounding error in total PC sales (as much as I like Apple, that's the fact).
I would suspect the bigger driving force is the HPC/AI market, where nVidia currently looks more or less unchallenged and which Intel doesn't want to cede without a fight. Phi is pretty much where Intel's accelerated IGP was born (they had IGPs before, but they were extremely basic and didn't support gaming).
While Larrabee was originally supposed to eventually make its way into Intel GPUs, Intel seems to have abandoned that idea pretty quickly. I suspect that Intel looked at performance numbers and saw that they got destroyed by AMD and Nvidia and gave up. The modern Xeon Phi parts are basically Intel Atom cores plus wider AVX instructions, which is a really terrible way to do graphics. It's not just that the fixed-function graphics hardware like tessellation and rasterization isn't there. It's that the cache hierarchy is all wrong for graphics, e.g., having to go to L2 cache for things that any modern GPU would keep in registers and never touch a higher level of cache.
Xeon Phi is also a terrible way to do most other embarrassingly parallel algorithms, but that's a different story.
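As a crude illustration of the cache point above, here's a toy latency model. The cycle counts are invented round numbers, not measurements from any real chip, and it ignores latency hiding and pipelining entirely:

```python
# Crude latency model of the cache point above. The cycle counts are
# invented round numbers, not measurements, and latency hiding is ignored.
REG_LATENCY = 1   # operand already sitting in a register
L2_LATENCY = 20   # hypothetical L2 round-trip latency in cycles

ops = 1_000_000  # back-to-back operations on the same accumulator

# GPU-style: the accumulator lives in a register for the whole loop,
# so each operation pays only the register-access cost.
register_resident_cycles = ops * REG_LATENCY

# If the working set spills and every step round-trips to L2 instead,
# each operation pays the full L2 latency.
l2_bound_cycles = ops * L2_LATENCY

print(f"Register-resident: {register_resident_cycles:,} cycles")
print(f"L2-bound:          {l2_bound_cycles:,} cycles ({L2_LATENCY}x worse)")
```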
Doesn't make sense considering AMD is a competitor and Nvidia is not.
http://forums.mmorpg.com/discussion/456132/amd-makes-a-bold-move/p8
Actually, that sounds familiar. Didn't nVidia start out with something made for graphics first, then move on to HPC and AI and other workloads that can leverage SIMD?
So if all this HPC stuff starts out as graphics, wouldn't it make sense to have your first cross-license deal with a graphics company also start out with... graphics? And then migrate it to HPC, since that seems to be how the evolution of pretty much everything HPC has gone (except IBM or the new Chinese supercomputers; no idea what they are using, honestly).
Now, that's just as much speculation as saying Apple is driving the deal. Honestly, I think it's more to do with patents for general GPU architecture than anything. But it's fun to speculate.
It would definitely be interesting if AMD designed Intel's GPUs. It would help secure AMD financially. It might also turn out better for Intel: they'd keep an uncompetitive AMD around as a nominal competitor, rather than facing a rival with more resources or dealing with FTC antitrust mandates.
Before AMD purchased ATI, they were in serious merger negotiations with Nvidia. At the time, an AMD/Nvidia merger looked like it would make a lot more sense: AMD had great ties with Nvidia and had partnered with them in several successful ventures. ATI was working with Intel on a couple of projects, and if AMD had purchased Nvidia, that would have removed a huge antitrust hurdle, allowing Intel to buy ATI. But the CEO of Nvidia made some unacceptable demands that once again proved his jerkiness, causing the whole deal to fall apart. AMD quickly made an offer for ATI that was probably pretty favorable to ATI stockholders (since it was accepted very quickly), and the rest is history.
IDK man, you're trying to say AMD was merging with Nvidia and somehow the CEO made unreasonable demands? One company was worth $4 billion and the other was worth over $15 billion.
As to the thought that Apple might move away from Intel, I tend to discount that because I know a lot of people who buy Apple because they can run Windows on their machines for games when they need to. If Apple were to ditch x86, it would hurt their PC sales significantly.
Some of you don't seem to grasp that Huang, Nvidia's CEO, is compared to Larry Ellison a lot when it comes to business dealings. I have seen many comments that dealing with Nvidia is like dealing with a viper's nest.
***ADDED AS AN EDIT***
I've often wondered what would have happened if the AMD/Nvidia merger had gone through. If it had, and Intel had purchased ATI, I think several things would have turned out very differently.
First off, AMD is a company that is more willing to take chances than a mega company like Intel. When Intel released the Core 2 Duo, AMD would have invested heavily in Nvidia to recoup their losses. Nvidia has some really top-notch engineers, and with a lot of funding and impetus, they could have choked the GPU market, pretty much owning it (back then the market share split between Nvidia and ATI was a lot closer). With the extra profit from the GPU market, they might have been able to make the CPU race a lot closer, especially if they were marketing Athlon and Phenom CPUs with an Nvidia GPU built in.
Remember, most sales back then were desktops and laptops to non-gamers. An Athlon II X4 with an Nvidia 8400 or 8600 built in would have been great for the masses. It would have been top notch for the web, graphics, and movies, and even capable of gaming. Even if the CPUs were slower, I think the graphics improvement would have been enough to keep their market share strong. And such an APU running dual graphics with a true 8600 card using SLI tech would have been pretty doggone good, probably the entry level for serious gaming.
Of course, having ATI patents and tech, along with some really good engineers who would finally be getting paid what they were worth by Intel, would have been a boon for Intel. If every Intel chipset motherboard had an ATI graphics processor on it, those Core 2 Duos would have looked really, really good to consumers, way better than with the lame GMA 950 video on most i945, i965, and X31 & X35 motherboards at the time. And when Intel started building video into their i3, i5, and i7 chips, if it had used ATI tech, those chips would have been very big, maybe bigger than they ended up being.
But the bottom line is, I think we would have had a far more competitive market if AMD had merged with Nvidia and Intel had acquired ATI...
Right now, the only thing really keeping Apple from going ARM-only is that ARM isn't quite up to the performance they need for their OS X machines (yet). Apple is willing to put out a new generation that doesn't represent a speed increase (in fact, they have shipped generations that were slightly slower than the previous one), but it has to come with some other benefit to outweigh that: a different design, better battery, something. The A10 started to post numbers close to Intel's own Pentium/Celeron lines, but it's still a ways from the Core lineup that Apple has been using to date.
Apple has changed desktop processor architectures before (twice, in fact: 68k -> PPC -> x86, not counting the fact that iOS runs on ARM). Apple has supported multiple CPU architectures in a single operating system before (and supposedly they already have a branch of OS X that runs on ARM). None of that is easy, but it's nothing Apple hasn't done before. If/when ARM has enough advantages, Apple will jump in a heartbeat and not look back.
The only reason Intel is even remotely interested in what Apple does (apart from the fact that they like to sell CPUs) is not the volume of CPUs Apple buys; it's that where Apple goes, the rest of the industry seems to follow.
As to ARM, they have a LONG way to go to be competitive with Intel CPUs. Adding more cores generally does not help that much; it all comes down to how the software was written. When you consider how cheap PCs are right now, putting out a low-end ARM-based computer would just get laughed at, and of course Apple could not price it at the low end.
Bootcamp also sells a lot of Apple computers: writing a game for the Apple OS is usually not worth the effort, so if you want to play games, Bootcamp is necessary, and Apple darn well knows it.
http://www.theverge.com/2016/9/16/12939310/iphone-7-a10-fusion-processor-apple-intel-future
I don't think Apple cares about gaming on the desktop; otherwise they would do things like release timely graphics driver updates, provide better graphics performance and hardware, and support more modern versions of OpenGL, or Vulkan at all. Instead, they seem to only care about gaming on iOS, maybe because they get a big cut there. Metal was initially for iOS, and was only later ported to OS X (mainly so that developers could do iOS -> OS X ports).
Heck, most OS X capable computers are sold with no discrete graphics at all, only an Intel IGP.
Even if you argue that Bootcamp is how Apple provides that support, the latest Bootcamp drivers date from Aug 12, 2015.
At one point they did. But that hasn't been the case for a long while now, at least on OS X.
http://www.fudzilla.com/news/graphics/43663-intel-is-licensing-amd-graphics
Still just a rumor as far as I'm concerned, as no official sources have confirmed it. But it's keeping an old rumor alive.
http://www.barrons.com/articles/intel-refutes-rumor-of-licensing-amd-graphics-technology-1495064908
"If MMORPG players were around when God said, "Let their be light" they'd have called the light gay, and plunged the universe back into darkness by squatting their nutsacks over it."
-Luke McKinney, The 7 Biggest Dick Moves in the History of Online Gaming
"In the end, SWG may have been more potential and promise than fulfilled expectation. But I'd rather work on something with great potential than on fulfilling a promise of mediocrity."
-Raph Koster
http://www.marketwatch.com/story/intel-and-amd-license-rumors-should-finally-be-dead-2017-05-22