Doesn't prove anything really. There are no Nvidia drivers at this time, and they even stated the game is being developed more towards the console, which has AMD hardware. We'll see when the game comes out and Nvidia puts out its game-ready drivers.
It has DX12 features; that's as much PC as it gets. No driver will fix a lack of DX12 hardware.
It's unfortunately Nvidia who is holding back advancement, as all "the way it's meant to be played" titles are DX11. BUT they can't hold it down forever, not across two generations of GPUs in any case.
I'm not sure what you're trying to say. DX12 will help close the gap for AMD, but you still need an optimized driver regardless of what API they use. The 380 doesn't have DX12 cores, so I'm not sure where you come up with that. Also, as stated in the video, the game will be optimized for AMD. I'm not sure what you mean by Nvidia holding back advancement when they helped develop DX12. Also, it's up to the game developers to use the API, not Nvidia.
All the latest GPUs are supported by DX12.
The R9 380 is just a rebrand anyway, so I'm not sure what you mean by a lack of DX12 hardware.
AMD Radeon R9 380: Tonga Makes the Switch from R9 285 to R9 380
Antigua Pro (previously called Tonga) makes a reappearance as well in the Radeon R9 380. This card is another rebrand, plain and simple. But at least it’s built on a more recent version of AMD’s GCN architecture than Pitcairn. It was launched in September of 2014.
The only improvement over its predecessor seems to be higher GPU and memory clock rates. The card's power consumption increases right along with them, which means that the new revision likely doesn't include any actual changes.
AMD has had it since the first GCN in 2011. I'm sure you're not sure what I'm trying to say, because you don't have a clue what it's about or what Nvidia cut out of their chips to make them more power efficient.
The game will NOT be optimized for AMD; it will be optimized for DX12, an industry-standard API, the same as any DX12 game. Async compute will have more or less impact on performance, but not using it just means you might as well do DX11; there's no real point in using DX12 without its most important feature.
Yes, Nvidia lied about having DX12 support at the hardware level. They were called out on it quite a while ago; they have software emulation of async compute. There are still no fully DX12-capable cards, but AMD has the most important part (which isn't just more fluff like previous DX versions). Polaris will most likely be the first full DX12 GPU, while consumer Pascal will just be a smaller Maxwell.
They are holding back because their GPUs suck at DX12, and they obviously couldn't make it work sufficiently well, otherwise we would certainly have seen something by now instead of letting AMD wipe the floor with their GPUs. The kicker is that their next gen is most likely also crippled; they haven't said a SINGLE word about consumer Pascal GPUs.
Nvidia bribes developers and distributes cheat sheets of "what not to use". But DX12 is here. I was wondering back then why they were doing that, since all estimates showed Pascal would be out by the time DX12 took a bigger role. The silence on the matter, and threatening Stardock for calling them out, have put things in perspective.
I don't like being bullshitted to, and that's what Nvidia did to all their customers.
Oh, and just a note: Vulkan is the same in that respect, since it's based on Mantle, so you can swap "DX12" for "Vulkan". And Mantle was 100% AMD (Nvidia could have supported it but didn't).
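For what it's worth, the only part of this you can actually check from code is what the driver reports through the API. Below is a minimal sketch, assuming a Windows machine with the D3D12 headers and with error handling mostly omitted, that asks a device for its feature level and a couple of the optional DX12 feature tiers. It only shows what the driver exposes, not whether something like async compute runs on dedicated hardware underneath.

    #include <d3d12.h>
    #include <wrl/client.h>
    #include <cstdio>
    #pragma comment(lib, "d3d12.lib")

    using Microsoft::WRL::ComPtr;

    int main() {
        // Create a device on the default adapter; 11_0 is the minimum level D3D12 accepts.
        ComPtr<ID3D12Device> device;
        if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0,
                                     IID_PPV_ARGS(&device)))) {
            std::printf("No D3D12-capable device/driver found.\n");
            return 1;
        }

        // Ask which of the higher feature levels the adapter actually supports.
        D3D_FEATURE_LEVEL levels[] = { D3D_FEATURE_LEVEL_11_0, D3D_FEATURE_LEVEL_11_1,
                                       D3D_FEATURE_LEVEL_12_0, D3D_FEATURE_LEVEL_12_1 };
        D3D12_FEATURE_DATA_FEATURE_LEVELS fl = {};
        fl.NumFeatureLevels = static_cast<UINT>(sizeof(levels) / sizeof(levels[0]));
        fl.pFeatureLevelsRequested = levels;
        device->CheckFeatureSupport(D3D12_FEATURE_FEATURE_LEVELS, &fl, sizeof(fl));
        std::printf("Max feature level: 0x%x\n",
                    static_cast<unsigned>(fl.MaxSupportedFeatureLevel));

        // Optional DX12 feature tiers: what the driver exposes, not how it is implemented.
        D3D12_FEATURE_DATA_D3D12_OPTIONS opts = {};
        device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS, &opts, sizeof(opts));
        std::printf("Resource binding tier: %d\n",
                    static_cast<int>(opts.ResourceBindingTier));
        std::printf("Conservative raster tier: %d\n",
                    static_cast<int>(opts.ConservativeRasterizationTier));
        return 0;
    }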
Every GPU needs optimized drivers to perform well. That's hardly specific to AMD.
What are "dx12 cores"? All AMD GCN cards are going to support DirectX 12.
Some of the newer AMD GPUs have some asynchronous compute hardware that simply isn't physically present in Nvidia silicon. I don't expect it to be all that big of a deal, but you couldn't use it on PC at all before DirectX 12 and Vulkan, so we'll see.
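To make the async compute point concrete: under D3D12 an engine can create a separate compute queue alongside the normal graphics queue and feed both independently; whether that work actually overlaps on the GPU is up to the hardware and driver. A rough, hypothetical sketch (Windows with the D3D12 headers assumed; a real engine would go on to record command lists and synchronize with fences):

    #include <d3d12.h>
    #include <wrl/client.h>
    #include <cstdio>
    #pragma comment(lib, "d3d12.lib")

    using Microsoft::WRL::ComPtr;

    int main() {
        // Device on the default adapter; feature level 11_0 is the minimum DX12 accepts.
        ComPtr<ID3D12Device> device;
        if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0, IID_PPV_ARGS(&device))))
            return 1;

        // The usual "direct" queue: accepts graphics, compute and copy work.
        D3D12_COMMAND_QUEUE_DESC gfxDesc = {};
        gfxDesc.Type = D3D12_COMMAND_LIST_TYPE_DIRECT;
        ComPtr<ID3D12CommandQueue> graphicsQueue;
        device->CreateCommandQueue(&gfxDesc, IID_PPV_ARGS(&graphicsQueue));

        // A second, compute-only queue. Work submitted here *may* overlap with the
        // graphics queue; whether it really runs concurrently is up to the GPU and
        // driver, which is exactly what the hardware-vs-software argument is about.
        D3D12_COMMAND_QUEUE_DESC compDesc = {};
        compDesc.Type = D3D12_COMMAND_LIST_TYPE_COMPUTE;
        ComPtr<ID3D12CommandQueue> computeQueue;
        device->CreateCommandQueue(&compDesc, IID_PPV_ARGS(&computeQueue));

        // Command lists of matching types would be submitted to each queue with
        // ExecuteCommandLists, with ID3D12Fence used where one depends on the other.
        std::printf("Created a graphics queue and a separate compute queue.\n");
        return 0;
    }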
I had never used an AMD card before; it was always Nvidia cards until last December. After reviewing my options on these forums, the most logical choice for my money was an AMD card.
I have had no problems so far, and that's all that counts.
I could have bought a GTX 960 but I opted for an R9 290 which is A LOT better.
AMD cards have historically been somewhat problematic for people, usually because of driver issues. Fortunately, afaik, that is no longer the case, but there remains a certain stigma because of it. However, since a GTX 960 generally costs about £180 and the R9 290 £300+, I would be very surprised if it wasn't a better card; it would be kind of embarrassing for AMD if it wasn't, would it not.
Historically, people had problems with BOTH. Historically, I had far more problems with Nvidia's shoddy drivers (even now on a GTX 970) and usually needed to keep 5-10 driver versions around for different games. Historically there's not much difference between the two; both had their fair share of brainfart drivers. It's just a mantra dummies repeat because it's a popular thing to say when you run out of facts and arguments.
The 290 was only marginally more expensive at the right time, and idiots STILL suggested Nvidia at that point, which is pure insanity (to put it mildly). The 290 matches a GTX 970; getting a 960 for a similar price... you'd have to be out of your mind. Same as paying 30-50% more for a 970.
And just look at how the GPUs aged: the state of the Rx 2xx cards compared to anything Nvidia below the 9xx series. "Pitcairn is immortal" lol
Those are all facts.
And DX12 tests suggest Nvidia pooched it again. I guess we can move from "MMO miracle patches" to "GPU miracle drivers" lol. Even though Nvidia claims they "have been working for years on DX12 drivers alongside MS". Shouldn't they have a good driver by now? lol. Or at least hardware support in the next gen (which seems to be just Maxwell on a smaller node, so no async compute again, and again they claim they knew all about DX12 for years).
They actively lied about their GPUs' DX12 capabilities until Stardock called them out; they even threatened Stardock at one point, but it's all out now. Software emulation.
This is funny, because I've only had two Nvidia driver issues. One was with Tomb Raider and the other was when I made the switch to Windows 10. My last AMD card was a 5870; I changed to a 570 and never looked back. I've only bought EVGA and have never had one of them have issues. No overheating or driver issues except the two I spoke of. Now, my XFX 5870 fried on me 3 times. The drivers never even bugged me; it's the fact that the card was a super heater. After I made the change from AMD to Intel and Nvidia I never looked back. All-around better performance.
If AMD really shows better performance with Zen and gives us a good video card that doesn't make me need to open windows in the middle of winter, I'll check them out again.
Wow, you swapped to Fermi and called it superior. Pure gold rofl
This is what you bought:
Plenty of "cooking with Nvidia" videos from that time.
I don't really think you even realize how stupid you sound.
No love for OpenGL 4.5... Well, not here anyway. It already has DX11 emulation, whose lighting in my opinion looks a million times better than DX11 alone. I can't imagine DX12 being that big of a deal, let alone the hardware it works on. Both brands are just as good as each other, dollar for dollar.
Sorry for a slightly off-topic chime-in here. I'm a 290X owner; it's a great card and it's awesome for what I use it for, and that's all that matters. I will never upgrade to or buy Windows 10, so I couldn't care less about DX12. OpenGL is where it's at.
It should be expected when the GPU makers are stuck on a process node for so long that they will run hot. There is only so much that can be done with a more mature process node, the rest comes from higher wattage. Fermi was pretty bad for power consumption, but it was alright in regards to heat. Yes it got hot, but its components were built to withstand the heat. If they actually burnt out like the nVidia 8600Ms then there would be a cause for concern.
Let me know which you'd rather own 10,000 shares of and which company has a better chance of being alive on its own in two years.
AMD's business problems are mostly due to their CPUs, in particular Bulldozer being such a disaster. Swap the performance of the Sandy Bridge/Ivy Bridge/Haswell/Broadwell/Skylake cores with the Bulldozer/Piledriver/Steamroller/Excavator cores, without changing anything about the GPUs, and AMD would be massively more profitable than Nvidia.
I couldn't care less what anyone thinks of AMD and their cards. I have owned both, and I buy what I feel is the best card for the best value at the time I'm buying it.
Every Radeon card I have ever had was a piece of shit. I don't care what charts or graphs say I would never put another one of those in any system again.
I've had crap cards from both makers because I didn't do my homework. No one should have to do that sort of homework. It should be easy and straight forward, but it's not. You can get a good card by both makers.
Yes, both produce some good cards. I pretty much look at top-of-the-line cards every 5-6 years to get a new system.
A lot of my preference now comes down to: how hot does it run, and how loud is it?
Those two questions top my list for video cards I'm looking to buy, along with how they top the charts. Having something hot or loud running next to me for a few years is a big no-no for me.
Every Radeon card I have ever had was a piece of shit. I don't care what charts or graphs say I would never put another one of those in any system again.
Here's another person comparing a $150 AMD card to a $500 Nvidia card. You are clearly doing that. Because as an owner of an R9 290X... I can tell you that I have no regrets. This card is an absolute beast and has never malfunctioned, neither it nor its driver. And don't tell me I'm just being lucky. There are plenty of people on this forum with this card in particular; they'll say the same thing.
I have an Asus R9 290X and I love this card. I had a 4870X2 before this one. Personally, I would never overpay for an Nvidia card. The old 4870X2 was one of the first dual-GPU cards and it ran HOT. Do not get any dual-GPU card that funnels air over both GPUs; they simply get too hot because one GPU has hot air flowing over it to cool it. This new Asus card has fans for each GPU, so it runs nice and cool. The old 4870X2 is still kicking though, after like 7 years of heavy use.
This new R9 290X is nice; I also like the Gaming Evolved software that comes with it.
If you ever plan on running two cards, you will only get like 70% from each card with SLI on Nvidia, whereas with AMD you get closer to 90% from each with CrossFire. You would only need two cards for 4K though, and would need hella air cooling or liquid cooling for that.
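Rough numbers on that scaling claim, purely as a back-of-the-envelope sketch: the 70%/90% figures are the post's own, and the 45 fps single-card baseline is made up for illustration; real scaling varies per game.

    #include <cstdio>

    int main() {
        // "70% from each card" with SLI vs "closer to 90% from each" with CrossFire,
        // applied to an invented 45 fps single-card baseline at 4K.
        const double singleCardFps = 45.0;
        const double sliPerCard = 0.70;        // claimed effective contribution per card (SLI)
        const double crossfirePerCard = 0.90;  // claimed effective contribution per card (CrossFire)

        std::printf("One card:              %.0f fps\n", singleCardFps);
        std::printf("Two cards (SLI):       %.0f fps\n", 2.0 * sliPerCard * singleCardFps);       // ~63 fps
        std::printf("Two cards (CrossFire): %.0f fps\n", 2.0 * crossfirePerCard * singleCardFps); // ~81 fps
        return 0;
    }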
Nvidia stock price.
AMD stock price.
Click the 5 year chart for both and compare.
VS
much better thermal performance, lower power requirements, higher o/c margins, better driver suite.
The choice is obvious for me.
2 Titan X's in SLI, with default blowers exhausting heat out the back.
Only viable solution for 4K atm.
And I can afford whatever I want because I invest in companies whose stock goes up.
The point still stands: at the same price, 970 vs 390, the 390 is the better buy, especially if you're buying long term (same for the 380/Fury) *shrug*
And you have lost money on their shares lately.