As expected, it's a salvage part of GP102. It disables a little bit of a lot of things, from compute units to memory channels. Nvidia decided not to repeat the GTX 970 "4 GB" fiasco and will market it as an 11 GB card. Both the GPU itself and the memory will be clocked higher than the Titan X, making it likely a hair faster in typical situations.
With the card not yet launched and reviews not yet out, we don't officially know exactly how it will perform. But really, we pretty much do from the paper specs, as it's a salvage part of a GPU already on the market.
With the GTX 1080 Ti available at $700, a GTX 1080 at $600 looks rather overpriced. Nvidia agreed and slashed prices on the latter to $500. They're also cutting prices on the GTX 1070 founders edition to $400, but you can get other GTX 1070s for cheaper than that, so that's not terribly interesting.
One could ask why it took Nvidia seven months after the launch of the Titan X for the salvage part to arrive. My guess is that yields weren't very good, and as Nvidia had the top end all to themselves with the GTX 1080 anyway, they didn't feel the need to rush. Do respins, let the process node mature, or whatever else it takes, and get yields up before you start producing GP102 in high volume.
This sets the bar for AMD Vega, due out in the second quarter of this year. Vega should beat a GTX 1080 Ti in most of the paper specs (TFLOPS, memory bandwidth, etc.), but that's been the case for most of the last decade, and Nvidia was able to counter with the ability to more efficiently use the hardware available. For a CPU analogue, think of it as Nvidia had better IPC but AMD had more cores--except that in the GPU world, unlike consumer CPUs, this sometimes meant AMD won.
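The IPC-versus-cores analogy can be made concrete with a toy calculation. The numbers below are purely hypothetical illustrations (the utilization factors are invented for the example, not measured figures): think of peak TFLOPS as "core count times clock" and utilization as "IPC".

```python
# Toy illustration with HYPOTHETICAL numbers: delivered performance is
# peak throughput scaled by how well the architecture keeps its shaders busy.
def delivered_tflops(peak_tflops: float, utilization: float) -> float:
    """Effective throughput = raw peak * fraction of peak actually achieved."""
    return peak_tflops * utilization

# Suppose an AMD card peaks higher but utilizes its shaders less well:
amd = delivered_tflops(8.6, 0.65)     # higher peak, lower utilization
nvidia = delivered_tflops(6.1, 0.95)  # lower peak, higher utilization
print(amd < nvidia)  # True: the lower-peak card can still come out ahead
```

The point of the sketch is only that the ordering of peak specs and the ordering of delivered performance need not agree.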
GCN was a nice architecture in its day, but AMD has been selling derivatives of it for over five years now. If Vega is nothing more than a minorly tweaked GCN, AMD's GPU side could be in trouble, as that's not going to catch Pascal. A scaled up Polaris isn't going to catch a GTX 1080 Ti inside of 300 W and 500 mm^2. That might be why AMD didn't bother to produce such a chip last year. GCN was competitive in 2012 and 2013, but Nvidia made Maxwell a lot more efficient in 2014, and AMD has been generally behind ever since.
If Vega is an all new architecture, or at least a massive overhaul, then it could be just about anything. It could be this generation's equivalent of the Radeon 9700 Pro that lapped the competition, or it could be this generation's equivalent of the Radeon HD 2900 XT that was hot, late, and slow.
The upshot of this is, if you want a high end gaming card, you really should wait until next week. I don't see a dire need to wait for Vega, as I'd be very surprised if Vega blows away Pascal or forces Nvidia to sell the GTX 1080 Ti for $500 and the GTX 1080 for $350. If the top end Vega does match a GTX 1080 Ti in performance, it will probably about match it in price tag, too.
Comments
Yeah, AMD got one over on Intel for sure. But getting one over on Nvidia isn't going to be easy at all.
I would also wait for benchmarks. Vega is releasing in May; this is releasing in April. Are they doing a soft release in order to drive down Vega sales, or is it a hard release because they know the card is not competitive?
I don't know that yields had as much to do with it; they may just have wanted something in their back pocket to take the wind out of the sails of an impending Vega. Except they didn't wait for Vega, they attacked Ryzen. That seems odd if you think of nVidia being worried about a CPU competing against their GPU business, but in a business sense, the CPU business is what AMD is betting the bank on right now, so if you wanted to put them down for the count, you'd attack them where it hurts the most. I don't think the 1080Ti has really done anything to drown out Ryzen news, though. Ryzen talk is everywhere, and I barely hear anything about the 1080Ti, probably because... yawn... it's not really any different from the Titan X, just a bit cheaper. Ironically, the same could be said of Ryzen: it's not really any different from Intel HEDT, just a bit cheaper. But there it's between two companies, not one company competing against itself, so I guess that does make it newsworthy.
As far as GCN goes, I don't know whether it necessarily needs a massive overhaul. I guess Vega will be a big tell on that front. My understanding is that GCN does receive some pretty significant updates generation over generation; Fiji isn't the same GCN that's in Tahiti. I always thought it was more like x86: a common platform whose implementations get more and more efficient across generations, with additional instruction sets added over the years.
That being said, I think the pricing of this card is decent if you have the CPU to drive it and are 'in need' of a replacement (say, coming from a 780 or 970).
I would say it's a great day to buy something from a 980 Ti up to a 1080... but getting a 1080 Ti now would be a bad call, considering there are no games out there that those other cards wouldn't perform very well in.
My 5 cents.
People bought the Titan X, and it went for $1,200. So there's a market out there of people who want/need all the performance they can get. If you're trying to do something like push MSAA at 4K, I doubt even the 1080Ti would get you 60 FPS in a lot of games.
The question I think you're really trying to ask is: does the 1080Ti meet ~my~ needs? There are certainly games out there that it will perform well on. It sounds like you're happy, and a lot of other people would certainly agree with you (I'm perfectly happy with my 980 right now, but then again I'm pretty forgiving; I don't need a constant 60+ FPS and I'm OK with adjusting some settings down). But there are some people who want more, and a smaller set of folks who actually really could use more power.
If you're gaming at 1080p, it's absolutely still overkill. The 1080 continues to be overkill for that market segment, so that's nothing new.
The 1440p/2K folks: you could make a case for it either way. It just depends on whether you're one of those MAX-MAX folks or a 60+ FPS purist, or not (and have the budget to buy it in the first place).
If you're going to be getting into 4K, then you want all the power you can get your hands on.
Vega may still be a decent ways out. It was announced as a second-quarter release, which could be as early as April, but it could also be as late as July, with every possibility of it slipping back further than that again. There are also rumors (just rumors and speculation, but hey, what else do we have to go on around here) that Vega is competitive in the existing 1070/1080 market. But here we're looking at a performance tier (slightly) above that, so if you're after all the performance you can get and are looking to spend $600+ on your GPU anyway, Vega may not be worth waiting for. The Titan X and 1080Ti are it, and that probably won't change much even after Vega's release (unless Vega blows us all away).
But if you're in that $300-$500 range, Vega could shake things up there a good deal. AMD is already competitive in the sub-$300 market with the RX 4xx series, so Vega won't do much in that market.
I know a lot of people who jumped from the 1080 to the Titan X and just eBayed their used 1080s for around $400-450. If Vega (or whatever, really) proves to be better than what they have now, they'll fork out the cash and just flip whatever they had before.
It's not a big market, but it definitely exists. I wish I had that kind of disposable income.
nVidia probably decided that releasing the 1080Ti alongside Ryzen means a lot of people will put this new GPU in their new Ryzen builds.
Of course, those same people would have probably put the 1080 in that build but I guess nVidia wanted to give people a new and shiny GPU to pair with their new and shiny CPU.
A 1080ti or Titan X Pascal can make sense with a Ryzen build depending on your workload. For instance, if you use Blender a lot, then nVidia makes a bit more sense here, as most of the GPGPU functionality is written in CUDA. There is also a 3D scanning tool that composes the final model using CUDA. It's typical to do test renders on the GPU and the final render on the CPU: GPUs are designed to calculate 32-bit data very fast, while it's easier to get 64-bit precision from the CPU for calculating the light bounces. Personally, I would still wait for Volta.
As to Vega, if AMD can put a price competitive GPU out there it will sell well because Nvidia has been resting on their laurels for too long, clearly shown by the high prices and poor driver support lately. If nothing else, all of you should be congratulating AMD for making them sit up and take notice they are not the only game in town.
IF Vega matches 1080Ti
THEN why does it matter that the 1080 would be considered last generation at that point? You're comparing Vega to the 1080Ti, not the 1080, so whatever the 1080 may or may not be would be irrelevant at that point.
IMO, Vega needs to improve on the Ti to be successful. They've had so much time to strategize and plan.
At minimum, it needs to be equal and 10% less $$$$$$
Would be nice to see AMD competitive at the high end; it's been so long.
The R9 Fury X was not that long ago; it was competitive at the high end and still competes against the 1070. For my part, I have no intention of buying a Pascal or Polaris card for my next GPU. If Vega is not very good, it's a wait for Volta.
AMD cards have always had significantly more teraflops and memory bandwidth than Nvidia cards of the same price bracket and tier, and often more than the tier above, too.
Real gaming performance is all that matters.
980 Ti: 6.5 TFLOPS vs Fury X: 8.5 TFLOPS
780 Ti: 5.04 TFLOPS vs 290X: 5.6 TFLOPS
680: 3.1 TFLOPS vs 7970: 3.8 TFLOPS
These examples clearly show that more TFLOPS doesn't always mean more performance in games.
AMD has always had more FP32 performance, and it sounds good on paper or when someone quotes it, but in real-world situations for gamers, the performance of their flagships was always about equal.
This is why I feel the 1080 Ti with 11.5 TFLOPS will beat a Vega 10.
An overclocked 1080 Ti at around 14 TFLOPS will be a beast. We could see Vega reaching 15 TFLOPS of FP32 with some high-end card. 4096 shaders running at 1500 MHz would give about 12.3 TFLOPS of FP32. I believe AMD will keep the clock at 1500 MHz base, 1600 MHz boost, and give their partners the room to clock the card between 1600 MHz and 1900 MHz with better PCB designs, sold at a higher price.
Vega will have a big advantage over Nvidia's Pascal cards in DX12 and Vulkan, and there's nothing Nvidia can do to fix that. I also expect the AMD reference design at around 12.5 TFLOPS of FP32 for $599, with AMD partners offering higher clocks, around 14.5 TFLOPS of FP32 on better PCB designs, for $699, the same price as the 1080 Ti.
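For anyone wondering where these peak FP32 figures come from, here's a quick sketch. Each shader can retire one fused multiply-add (two floating-point ops) per clock, so peak TFLOPS is just shader count times clock times two. The Vega clocks are the rumored figures from this thread, not confirmed specs.

```python
# Peak FP32 throughput: shaders * clock * 2 ops (one FMA) per shader per clock.
def fp32_tflops(shaders: int, clock_mhz: float) -> float:
    """Peak FP32 TFLOPS, assuming 2 FP ops per shader per clock."""
    return shaders * clock_mhz * 2 / 1e6

# Rumored Vega 10: 4096 shaders at 1500 MHz
print(round(fp32_tflops(4096, 1500), 1))  # 12.3

# Fury X for comparison: 4096 shaders at 1050 MHz
print(round(fp32_tflops(4096, 1050), 1))  # 8.6
```

Note that these are theoretical peaks; as the earlier comparisons show, delivered game performance can land well below them.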
But just because something has historically been the case doesn't mean that it has to be that way forever. If Vega is just GCN/Polaris plus HBM2, then yeah, Nvidia is still going to be ahead for gaming, and a Radeon RX Vega would probably be a little faster than a GeForce GTX 1080 and shy of a GTX 1080 Ti. But if it's a very new architecture that can match what Maxwell did for scheduling three years ago, then the question isn't so much whether Vega will beat a GTX 1080 Ti but by how much. And on that, we just don't know yet.
Just to add a car example, which I've used before. I had a buddy who has an '03 Mustang Cobra with a Kenne Bell supercharger kit. He dynoed around 540 WHP at 6,000 ft of altitude. He raced a guy with a 670 WHP Toyota Supra from a rolling 40 mph and walked him by three car lengths by the time they hit 100.
Now, if you know anything about cars, it's more than just peak HP figures that determine performance. Yes, the 'Stang had a lower peak HP, but it had a higher HP figure at literally every other point on the dyno chart, meaning it had more area under the curve. From, say, 2000-5000 RPM he was putting significantly more power to the ground, whereas the Supra was anemic below about 5500 RPM.
Now, how is this relevant?
The numbers don't always tell the whole picture. Even things as simple as drivetrain type can make a difference: different transmissions are more or less efficient, different gear ratios matter, etc.
So just looking at the "raw power" of a video card, like TFLOPS or memory bandwidth, generally isn't going to tell you everything you need to know.
Basically like quizzical said, AMD typically had more "Horsepower" but had trouble putting that horsepower to good use, and was getting beat by a "less powerful" card that had more tricks up its sleeve to utilize that horsepower. Think rear wheel drive vs all wheel drive.
"The surest way to corrupt a youth is to instruct him to hold in higher esteem those who think alike than those who think differently."
- Friedrich Nietzsche