http://www.incgamers.com/2015/03/amd-makes-cryptic-statement-about-future-of-mantle
"Unfortunately, Mantle’s future isn’t sounding too hot. Particularly in this section of AMD’s latest statement: “The Mantle SDK also remains available to partners who register in this co-development and evaluation program. However, if you are a developer interested in Mantle “1.0” functionality, we suggest that you focus your attention on DirectX 12 or GLnext.” "
"The surest way to corrupt a youth is to instruct him to hold in higher esteem those who think alike than those who think differently."
- Friedrich Nietzsche
Comments
Mantle needs to become more open? It could hardly become less open. A few weeks ago, I tried to find the Mantle documentation just to get some idea of what it could do and hoping it would explain some details about AMD's GCN architecture. It was password protected and you had to apply for access and tell AMD exactly what game you're working on that you want to use Mantle for. Fortunately, I was able to find the architectural details I wanted in AMD's OpenCL documentation.
We don't need yet another vendor pushing yet another proprietary API. If AMD officially retires Mantle once DirectX 12 and OpenGL 5.0 are out, that would be a good thing. That would mean that Mantle basically offered a preview of the new APIs before they were out in a few sponsored games, which isn't a bad thing, really, but hardly justifies caring about Mantle when buying a video card.
DirectX 12 is really set to make a big impact this time around. If any of the rumors floating around (before the GDC announcement) are true then we may be in for some happy times ahead.
Being able to use multiple GPUs of any vendor, make, or model, with the only requirement being that they support DX12, is a fairly big deal. Nvidia + ATI cards working together in a single machine as one large graphics resource pool. No more limited SLI or Crossfire issues; DirectX 12 will just pull all the graphics resources in the machine together into a single very large pool and use them as needed. Will be interesting to see how it all plays out.
I imagine the big cross-platform play between Xbox One and Windows 10+ will be a large target for developers. Since DirectX 12 will be the underlying API on both, porting between the two will be a piece of cake.
Couldn't possibly agree with you more on that.
Posted this more for the AMD people to have something to chew on.
"The surest way to corrupt a youth is to instruct him to hold in higher esteem those who think alike than those who think differently."
- Friedrich Nietzsche
Isn't Mantle the single reason for the directions that DirectX 12 and OpenGL are going?
Anyone want to offer thoughts on this?
http://semiaccurate.com/2015/03/03/amd-breaks-new-ground-liquidvr-sdk/
Never mind Mantle. What will DirectX 12 or OpenGL Next (now called Vulkan) offer that wasn't already available in at least one of:
1) traditional graphics APIs,
2) recent OpenGL extensions, or
3) OpenCL?
Now, there probably will be some minor tinkering around the edges there. And merging the functionality of some disparate things into one API for games does have some serious value. But just because Mantle is the first next-gen graphics API that you heard of doesn't mean that it's the cause of the others.
Now, there probably is a lot of overlap between what went into Mantle and what will go into the next generation APIs. AMD put what they thought the next generation APIs should look like into Mantle. And they probably advocated for basically the same things when discussing development of DirectX and OpenGL. AMD has major sway in creating graphics APIs for obvious reasons. (So does Nvidia, of course.) And other vendors may well advocate the same things as AMD for the same reasons; everyone has wanted to reduce driver overhead for many years, for example.
It took Glide to really get us to DirectX, and to push OpenGL to look beyond workstation-scale rendering machines toward a gaming-oriented API.
No one uses Glide today really, but that doesn't mean it wasn't significant. It was sort of the thing that kicked off even having a graphics API for gaming.
I see Mantle in sort of the same light. Even if it doesn't see a lot of game support now, it has pushed the entire conversation in that direction, and in that regard it's significant.
It's interesting that the major focus is on the CPU-side code. Mantle uses HLSL as the language for its GPU-side code--the same language as DirectX. The newly announced Vulkan uses GLSL--the same language as OpenGL. So they're basically treating writing GPU code as a solved problem, and justifiably so.
Vulkan, on the other hand, takes a totally different approach to CPU-side code from OpenGL. Remember that OpenGL dates all the way back to 1992--an era when computations were expensive and memory capacity was expensive, but bandwidth to connect the CPU to the GPU was cheap. So it was sloppy with bandwidth, but tried heavily to save on computational power and memory capacity. Today, the situation is reversed enough that the CPU talking to the GPU is often the major bottleneck, and that's what they're trying to relieve. Vulkan is intentionally breaking compatibility with a bunch of legacy bloat.
They talk about Vulkan and DirectX 12 taking more talented programmers than previous graphics APIs, but I'm skeptical of this. My experience with GPU programming has been that the hard part is figuring out what can reasonably be done on the GPU and then optimizing that--meaning the shader or kernel code. The code for the CPU to pass stuff to the GPU is simple and straightforward. It's vastly easier than the shader code, and easier than a whole lot of pure CPU-side stuff elsewhere in a game engine. I don't expect Vulkan or DirectX 12 to change that, even if it is more complicated than today's OpenGL API commands.
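To give a sense of what I mean, here's roughly what the CPU-side code to hand a triangle to the GPU looks like in plain OpenGL. This is only a sketch: it assumes your usual headers/loader (GLEW, GLAD, whatever) are set up, a context exists, and a linked shader program called shaderProgram was built elsewhere.

    // Upload three 2D vertices once, at load time.
    GLfloat verts[] = {
        -0.5f, -0.5f,
         0.5f, -0.5f,
         0.0f,  0.5f,
    };
    GLuint vao = 0, vbo = 0;
    glGenVertexArrays(1, &vao);
    glBindVertexArray(vao);
    glGenBuffers(1, &vbo);
    glBindBuffer(GL_ARRAY_BUFFER, vbo);
    glBufferData(GL_ARRAY_BUFFER, sizeof(verts), verts, GL_STATIC_DRAW); // copy to the GPU
    glEnableVertexAttribArray(0);
    glVertexAttribPointer(0, 2, GL_FLOAT, GL_FALSE, 0, nullptr);         // attribute 0 = vec2 position

    // Every frame:
    glUseProgram(shaderProgram); // compiled and linked from GLSL elsewhere
    glBindVertexArray(vao);
    glDrawArrays(GL_TRIANGLES, 0, 3);

That's about the whole job on the CPU side; the hard thinking all lives in the shaders.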
You know what does take more talented programmers? Tessellation--the main new feature of DirectX 11 and OpenGL 4. Lots of graduate-level math involved, which is why I've yet to see a released game that uses it sensibly. The proficiency barrier to using Vulkan or DirectX 12 CPU-side code will probably be much lower than that, and well within what a bright computer science person can handle.
I love how people act as if, the moment Mantle released, the people behind Vulkan and DirectX suddenly threw their hands up in the air and shouted "Eureka!" like their minds had just exploded with a realisation.
Both APIs were already well into development when Mantle was released. Beyond that, the idea of low-level APIs is not new. It's one of the reasons consoles were able to last as long as they did: they traditionally had their own "direct to hardware" low-level APIs.
Mantle was released first because DX12 and Vulkan are both being designed to work with any graphics card, not just AMD cards. It's much easier to write a low-level API for one specific set of hardware than for every bit of hardware out there.
"The surest way to corrupt a youth is to instruct him to hold in higher esteem those who think alike than those who think differently."
- Friedrich Nietzsche
Excellent post.
Unfortunately I don't have the background or experience to comment on the difficulty of the programming, other than that the people I know who work with graphics all agree low-level APIs can be a pain to code for. Either way, a very informative post.
"The surest way to corrupt a youth is to instruct him to hold in higher esteem those who think alike than those who think differently."
- Friedrich Nietzsche
A lot depends on how low-level it is. If you're writing pure assembly, then yeah, that's a pain unless you're doing something awfully simple. But I doubt that's where this is going; the recent OpenGL extensions were able to eliminate most of the CPU overhead without being a nuisance to use--and while keeping nearly all of the API exactly the same as standard OpenGL.
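Take ARB_buffer_storage (core in OpenGL 4.4), one of the main "approaching zero driver overhead" extensions: you allocate a buffer once, map it once, and then just write into it every frame instead of going through glBufferData or map/unmap calls the driver has to validate each time. A rough sketch, assuming a 4.4 context and the usual GL headers, with error handling omitted:

    const GLsizeiptr bufferSize = 1 << 20; // say, 1 MB of per-frame vertex data
    GLuint buf = 0;
    glGenBuffers(1, &buf);
    glBindBuffer(GL_ARRAY_BUFFER, buf);
    GLbitfield flags = GL_MAP_WRITE_BIT | GL_MAP_PERSISTENT_BIT | GL_MAP_COHERENT_BIT;
    glBufferStorage(GL_ARRAY_BUFFER, bufferSize, nullptr, flags);        // immutable storage
    void* ptr = glMapBufferRange(GL_ARRAY_BUFFER, 0, bufferSize, flags); // map once, keep it mapped

    // Each frame: write vertex data straight into ptr, then issue draws as usual.

It's still recognizably the same OpenGL; the driver just gets out of the way.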
I'm sure those APIs were already in development... but no one was talking about them until Mantle. And Mantle was first to market - that counts for a good deal.
Maybe without Mantle, Vulkan and DX12 wouldn't be the same as what we are seeing them evolve to be - the buzz would be different, the marketing/pitch to developers would be different, and we'd have seen a different emphasis out the gate. Hard to say, since, you know, Mantle did release and all.
AMD and the Khronos Group announced yesterday that Mantle is such a big part of and inspiration for the Vulkan API that you could say it's an evolution of Mantle, or Mantle 2.0 :P
“Vulkan is a significant Khronos initiative to provide developers the choice of a state-of-the-art GPU API that is open and portable across multiple platforms, at a time where platform diversity is increasing,” said Neil Trevett, president of the Khronos Group and vice president at NVIDIA.
The last part of that sentence is the kicker! You're gonna shit bricks when you figure it out, muahaha xD
You do realize that Nvidia is just as rampant about pushing things out to screw over AMD, right? In a lot of cases it's even MORE guilty of this. It's silly to go fanboy-bashing one company in support of your own when yours is just as devious, if not more so.
It's a shame, really, if Mantle was made to be less open, though at the same time you could argue it was kept more "enclosed" while it was being tested and experimented on. It's hard to say exactly what they were doing with it. If it was meant to be something limited, I don't think that's as bad a thing.
Going on about how you are GLAD Mantle is being discontinued, though, strikes me as quite anti-consumer in general. A big issue IS that we only have DirectX and OpenGL. DirectX just has so much more funding behind it that OpenGL has trouble keeping up, and Microsoft has little reason to invest much in DirectX since they can do the minimum and not worry much about putting effort into their product. Having another API would make things more competitive and really help push things forward. MORE competition is always better than less, and I find it a bit silly that some seem to care so little that it's being lost, primarily out of a sense of fanboyism for one side of the CPU/GPU market over the other. Even if it wasn't open sourced as they originally claimed (and I would hope it WOULD be), it would be vastly better for you as a consumer.
Sorry if I'm putting words in your mouth and that's not where you stand; the tone just leans more toward the "good riddance" side, which I think is a shame really, even if the reasons they designed it were more on the negative side.
Probably because we're not talking just about diversity in terms of console vs. PC, but also about different versions of Windows. DirectX is not backwards compatible, so the new DirectX is only compatible with Windows 10, and each iteration of Windows before it has its own version of DirectX that isn't compatible with the previous one. I would imagine that makes things difficult for game developers, because they have to code for the "lowest common denominator" or risk their games not working on a significant proportion of the market. For instance, if developers had coded a game to use the version of DirectX that came with Windows 8, they would have limited themselves to barely 10% of the PC gaming market. It's no surprise, then, that so many games still use DirectX 9.0c as a base, with support for newer versions kind of "tacked on", which must be a headache for developers. Being able to make a game that works and looks the same regardless of which version of Windows the player is using would be huge, particularly if it also meant it worked equally well on Linux etc.
Pretty pointless post given Windows 10 upgrade is going to be free.
I go after Nvidia for doing proprietary nonsense a lot, too. For that matter, if:
1) AMD's next architecture is reasonably competitive with Maxwell, and
2) Nvidia refuses to support adaptive sync,
then in a couple of months or so, I might well start telling people to dismiss Nvidia out of hand until they announce that they're going to support adaptive sync. G-sync, a proprietary version of the same thing that adds about $100 per monitor, is not an acceptable substitute.
-----
Video drivers have to have a bunch of different compilers in them. They have to support DirectX, OpenGL, OpenGL ES, WebGL, OpenCL, possibly some proprietary stuff, and different versions of all of the above, too. That means you have to have a bunch of different compilers that compile a bunch of different shader/kernel languages for a bunch of different architectures, and it needs to all work. That's a huge mess, and it's probably the biggest reason why there are so many video driver bugs.
Gratuitously adding more APIs requiring more compilers to that makes the situation worse, not better. That's why the Khronos group is pushing SPIR-V: have GLSL, OpenCL C, and anything else that anyone wants to write shaders/kernels in compile to SPIR-V in an architecture-independent way. That way, the video drivers only need to compile SPIR-V to the particular architecture, rather than needing separate architecture-dependent compilers for HLSL, GLSL, OpenCL C, CUDA C, and everything else that anyone wants to write GPU code in.
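In concrete terms, once SPIR-V is the interchange format, the driver never sees GLSL or OpenCL C at all: your build step compiles the shaders offline and the program hands the driver a binary blob. Something roughly like this is where it looks to be headed--the names are my guess at the eventual Vulkan API, so treat it purely as a sketch, and it assumes a VkDevice already exists plus a loadFile() helper you'd write yourself:

    #include <vulkan/vulkan.h>
    #include <cstdint>
    #include <vector>

    // 'code' holds the bytes of a precompiled .spv file.
    std::vector<char> code = loadFile("shader.vert.spv"); // hypothetical helper

    VkShaderModuleCreateInfo info = {};
    info.sType    = VK_STRUCTURE_TYPE_SHADER_MODULE_CREATE_INFO;
    info.codeSize = code.size();                                    // size in bytes
    info.pCode    = reinterpret_cast<const uint32_t*>(code.data()); // SPIR-V words

    VkShaderModule module = VK_NULL_HANDLE;
    vkCreateShaderModule(device, &info, nullptr, &module); // 'device' is an existing VkDevice

The only compiler the driver has to carry is SPIR-V to its own architecture.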
Now, adding new APIs because they do something that the old didn't can be useful. Having OpenCL available beats having to cram all GPU programming into a graphics pipeline. And Vulkan will offer a cleaner way for the CPU to communicate with the GPU, jettisoning decades of accumulated cruft in OpenGL. Probably the biggest problem with OpenGL today is that it has both everything you need and a ton of legacy stuff that you don't need and shouldn't use, and it can be hard to figure out which is which.
I'm actually considering porting my game to Vulkan and ditching OpenGL entirely, though that will depend on:
1) seeing the actual Vulkan specification to make sure that it does what I think it will do, and
2) someone making some good Java bindings for it so that I don't have to port my game to C, which is out of the question.
Because Vulkan uses exactly the same GLSL as OpenGL, it won't require altering any shaders.
There's kind of supporting DirectX 12 and then there's really supporting DirectX 12. There are some mobile GPUs that support DirectX 11 feature level 9_3--by which they mean, the subset of DirectX 11 that was also available in DirectX 9.0c, but not the newer stuff that first arrived in DirectX 10 or later. Microsoft lets companies just say they support DirectX 11 and leave it at that.
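The feature level is something a program asks for explicitly, by the way. Here's roughly what that check looks like in Direct3D 11 (just a sketch; real code would check the HRESULT and release the interfaces):

    #include <d3d11.h>

    // Ask for the best feature level available, from 11_0 down to 9_3.
    const D3D_FEATURE_LEVEL wanted[] = {
        D3D_FEATURE_LEVEL_11_0, D3D_FEATURE_LEVEL_10_0, D3D_FEATURE_LEVEL_9_3
    };
    D3D_FEATURE_LEVEL got = D3D_FEATURE_LEVEL_9_1;
    ID3D11Device* device = nullptr;
    ID3D11DeviceContext* context = nullptr;

    HRESULT hr = D3D11CreateDevice(
        nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
        wanted, sizeof(wanted) / sizeof(wanted[0]), D3D11_SDK_VERSION,
        &device, &got, &context);
    // 'got' now tells you whether that "DirectX 11" GPU is really 11_0 or only 9_3.

Presumably DirectX 12 will end up with the same sort of arrangement, which is why "supports DirectX 12" on a spec sheet doesn't tell you much by itself.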
If support for the OpenGL extensions that eliminate most driver overhead is a guide, then most likely, AMD GCN and Nvidia Kepler and later cards (Radeon HD 7000 and GeForce 600 series, excluding older cards rebranded into the new lines) will support the full DirectX 12, while older GPUs will have only partial support.
It isn't true that each version of Windows had its own differing version of DirectX.
WinXP supported up through DX9
Vista introduced DX10, but it was never really adopted by developers, and as such there was never a big clamouring to back-port it to WinXP. DX11 was going to be a Windows 7 exclusive, but it was eventually supported by Vista as well.
So there is some distinction between DirectX revision and Windows version, but it's not direct, and certainly each version of Windows does not have a differing and non-compatible version of DirectX.
And then there is the "Feature Level" debacle introduced with DX10. A hardware developer can choose to implement some of the features, but not all, and still call their product DX-compatible. (This is how I think cross-vendor multi-GPU support will get killed by a certain GPU manufacturer: they just won't support it and will still get to call their GPUs DX12-compatible, just at a different feature level, which most people won't see in the footnotes and won't care about because "the other guy's drivers all suck anyway.")
---
And the more I research the OP, the more it looks like Mantle is more or less getting rolled into Vulkan, which I support (I'm sure it's not that simple, but that's a bit of what it appears to be on the surface). Going from something proprietary to something open is good, in my eyes anyway, and I wouldn't cry if Mantle just silently slipped away (although it makes me chuckle that about a year ago a lot of people were pushing for AMD cards simply because of promised Mantle support in some upcoming games).