In September, Nvidia launched its GM204 chip as the GeForce GTX 980 and GTX 970. The GTX 980 is, at least for gaming purposes, the fastest card out there, and at $550, is priced accordingly. But that's not what I'm interested in here.
The GTX 970 is about 80% of a GTX 980, but at $330, is only 60% of the price. Sure, it was hard to find at that price for a while, but it's available now, albeit at $330 before shipping and with no rebates available:
http://www.newegg.com/Product/Product.aspx?Item=N82E16814500362
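To make the price/performance gap explicit, here's a quick back-of-envelope calculation; the 80% figure is the rough gaming average cited above, not a measured benchmark.

```python
# Rough price/performance comparison using the approximate figures above
# (80% of the performance at 60% of the price), not measured benchmarks.
cards = {
    "GTX 980": {"price": 550.0, "perf": 1.00},  # performance baseline
    "GTX 970": {"price": 330.0, "perf": 0.80},  # ~80% of a GTX 980
}

for name, c in cards.items():
    value = c["perf"] / c["price"] * 1000  # performance per $1000
    print(f"{name}: {value:.2f} performance per $1000")

# GTX 980: 1.82 -- GTX 970: 2.42, about a third more performance per dollar.
```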
That price tag put it in competition with the GeForce GTX 770 and the Radeon R9 280X. It was a superior card to both of those in just about every way you can think of, making this Nvidia's first aggressive pricing move in many years. Not coincidentally, it was also the first time in years that Nvidia clearly had the superior architecture.
AMD has now responded, it seems. The Radeon R9 290X can be had for $320 with free shipping and before a $20 rebate:
http://www.newegg.com/Product/Product.aspx?Item=N82E16814202079
That's a higher power card than the GTX 970, but otherwise superior in nearly every way. Meanwhile, the Radeon R9 290 drops to $260:
http://www.newegg.com/Product/Product.aspx?Item=N82E16814127774
I was surprised to see these price cuts from AMD. The Hawaii chip in the R9 290X is a large, expensive chip. While smaller than the GK110 in the GeForce GTX 780 Ti, it's still bigger and more expensive than the GM204 in Nvidia's GTX 970 and GTX 980. A 512-bit memory bus rather than 256-bit is expensive, too, as is the power circuitry needed to handle a 250 W chip. If AMD provokes a price war with Nvidia while having the less efficient architecture, that doesn't seem likely to end well for AMD.
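To put rough numbers on the die-size point, here's a minimal dies-per-wafer sketch using the standard circle-packing approximation; the die areas (roughly 438 mm² for Hawaii, 398 mm² for GM204) are the commonly cited figures, and yield, which punishes the bigger die further, is ignored.

```python
import math

# Standard approximation for candidate dies per wafer; ignores defect
# density and edge exclusion, so treat the results as rough estimates.
def dies_per_wafer(die_area_mm2: float, wafer_diameter_mm: float = 300.0) -> int:
    d = wafer_diameter_mm
    gross = math.pi * (d / 2) ** 2 / die_area_mm2          # pure area ratio
    edge_loss = math.pi * d / math.sqrt(2 * die_area_mm2)  # partial dies at the edge
    return int(gross - edge_loss)

# Commonly cited die areas; treat these as approximate.
for name, area in (("Hawaii (R9 290X)", 438.0), ("GM204 (GTX 970/980)", 398.0)):
    print(f"{name}: ~{dies_per_wafer(area)} candidate dies per 300 mm wafer")

# Hawaii: ~129 dies; GM204: ~144 dies -- roughly 12% more chips per wafer
# for Nvidia before yield differences even enter into it.
```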
Now, one way that slashing prices makes sense is if AMD has something much better coming, and soon. Recall that in mid-2009, AMD was selling its top of the line Radeon HD 4890 for under $200, and the Radeon HD 4870 that had formerly been its top of the line was as low as $130. Granted, those were much smaller, cheaper chips, but those prices were so aggressive that they convinced Nvidia to discontinue its high end cards several months before replacements would be ready.
And why did AMD do this? Because they knew they had the Radeon HD 5000 series coming. Nvidia, meanwhile, had a bunch of chips coming that basically didn't work. It was a clearance sale to get rid of old chips and make way for the new. AMD does this from time to time, most notably with the Radeon HD 5850 at $140 in April 2011--which, apart from the architecture being rather dated, would still have been a nice deal on a price/performance basis a full two years later. More recently, they've been getting rid of the Radeon R9 280 at $200.
But a clearance sale on the Radeon R9 290X? That's AMD's top of the line. Now, $320 isn't exactly giving them away for free. But still, this is the largest, most expensive GPU that AMD has ever made. The days when its top of the line chip was less than half the size of Nvidia's are long over.
Now, it is about time for AMD to have something new, of course. The Radeon R9 280X, which now has nowhere to go on price, being up against AMD's own superior Radeon R9 290, is nearly three years old, at least if you count its original incarnation as the Radeon HD 7970. The GCN architecture was nice for its day, but it has been on the market just as long.
There are two ways that GPU vendors typically come up with something better. One is a new process node, and the other is a better architecture on an old process node. Getting big gains from the latter alone isn't terribly common. The aforementioned GTX 970 and GTX 980 are the most recent example of it, of course, but before that, you have to go all the way back to 2008 with the Radeon HD 4850 and 4870 for another good example.
There's also the possibility of a new process node on the way. 20 nm chips are now selling in cell phones, so the process nodes are out there. It's not at all clear that any 20 nm process node is appropriate to a high-powered GPU, though. AMD has promised to release a 20 nm successor to Beema/Mullins in 2015, but a process node suitable for 1 W cell phone chips or even 4.5 W tablet chips isn't automatically good for 200 W discrete video cards. But if AMD has cards coming that are going to beat out the GTX 980 and soon, then the recent price cuts make sense.
Comments
Interesting information, thanks for sharing.
The latter is a bad idea. SLI is very driver-dependent, and even if Nvidia is still putting effort into making SLI work on 2 1/2 year old cards, they probably won't for much longer.
As for the former, is it enough of an upgrade to justify? On a large enough budget, perhaps. But ordinarily, I recommend at least doubling your old GPU performance when you buy a new one, and the Radeon R9 290X is nowhere near that.
Try the new game on your old card and see how it does. There's no need to upgrade until you pick up a game where your old card isn't good enough. A game that doesn't run well on a GTX 680 is going to find an awfully small market.
Quizz,
You seem to be a guru on this subject, so if I may, I'd like your thoughts on whether or not I should consider an upgrade.
I am running a GTX 570. I like Nvidia, but this card has given me "driver has stopped responding" errors in many different games pretty much since I got my rig about 3 years ago.
Would the "900" series be a worthwhile improvement for me?
I agree that SLI with 680s is a bad idea to go for now. But a 290X is a solid upgrade. It's typically 50-60% faster than a 680, which is enough to move many recent games at full settings from 30-40 fps up toward 60 fps.
We know large chips are expensive, but there are a couple of things we don't know.
We don't know the actual cost per chip that AMD or Nvidia is paying. AMD knows, and they know exactly how low they can price their products. We also don't know the terms of the production runs they ordered; they may be on the tail end of a contractual production run and simply need to move product before it ends.
There are also reasons a company may choose to take a loss, and they aren't all necessarily bad.
The latest news I heard was that TSMC's 16 nm node has been pushed back until 2016, which affects Nvidia's GM210, and probably affects AMD to some extent as well.
http://wccftech.com/tsmc-buys-14b-worth-equipment-16nm-volume-prediction-begins-q2q3-2015/
I don't know if it will fix the "driver stopped responding" error messages. I used to get those in Civilization IV on a Radeon X1300 Pro, and never did figure out the cause.
My rule of thumb is that if you're going to upgrade a GPU, you want to at least double the performance. So that would mean looking at a Radeon R9 290, R9 290X, GeForce GTX 970, or GTX 980. All of those are enough of an upgrade to be worthwhile if you need more GPU performance.
But you seem to be dismissing the Radeon cards out of hand, and I think that's a mistake. The only good reasons to get a GTX 970 over a Radeon R9 290X today that I see are being sensitive to power consumption or needing Nvidia-specific features. Considering that you're running a rather inefficient GTX 570 today, you're presumably not worried about power consumption. And very, very few people need vendor-specific features, though if you have a G-Sync monitor today or you need to use a CUDA program that doesn't also have an OpenCL implementation, then maybe you do need Nvidia. "I think I could plausibly use it in the future" is not at all similar to "I need it today".
That leaves fanboydom as the other reason to have a strong brand preference, and that's something I advise against. But ultimately, it's your money, not mine, so if you're willing to pay more money for a slower card in order to get an Nvidia logo, that's your decision.
A GTX 980 is a very different proposition. While not good on a performance per dollar basis, if you need the fastest gaming card on the market, it costs what it costs and you pay a price premium for it. That's the argument for--and against--the GTX 980.
It's a philosophical decision about whether to get more frequent, smaller upgrades or less frequent, larger upgrades. For example, one could buy a new card that is 20% faster than the old every year, or buy a card that is twice as fast as the old every four years. I'd strongly favor the latter--and perhaps buying a more expensive card up front so that you usually have a faster GPU even while spending less with the latter approach.
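To put numbers on that trade-off (using the illustrative 20%-per-year and 2x-per-4-years figures from the paragraph above, not real product data):

```python
# Compare the two upgrade cadences described above over a four-year span.
speedup_yearly = 1.0
for year in range(4):
    speedup_yearly *= 1.20  # a card 20% faster than the old one, every year

print(f"Four yearly +20% upgrades: {speedup_yearly:.2f}x the original card")  # ~2.07x
print("One upgrade after four years: 2.00x the original card")

# Nearly the same end point, but the yearly path means four purchases and
# four rounds of driver/hardware churn instead of one.
```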
Certainly, an R9 290X is a faster card than the GTX 680. If you had neither and wanted to pick between them today at the same price, of course you'd take the R9 290X. But upgrading a card is a different matter, and there's no sense in upgrading to something only slightly faster.
There's a significant cost to upgrading, even apart from the time it takes. It's not guaranteed that your other, older parts will work flawlessly with a video card that didn't exist when the rest of your drivers were written. Nor is it guaranteed that uninstalling the old drivers and installing the new drivers for your new card will go seamlessly. It takes time to physically uninstall one card and install a new one. None of those are huge barriers, and everything will probably work, but regarding a lot of minor costs as "free" is a common reason for people to make stupid economic decisions.
How much of an upgrade does it take to justify it? You may argue that 50% is enough, but I'd argue that it's not. My usual rule is that if you're going to upgrade a GPU, you want the new one to be at least twice as fast as the old. There are four exceptions, sketched in code after this list:
1) The old card is dead entirely or otherwise malfunctioning, and the primary reason for the upgrade is that you want something that works.
2) The decision to replace the card is being driven primarily by feature support and not performance. For this to not be completely stupid is pretty rare, but it does happen. If, for example, you need four monitors and the old card only has the ports to support three, you might be happy with an "upgrade" that is no faster than the previous card but does have the features you need.
3) The old card is something ancient and horribly slow, such as a GeForce 6150 that certain disreputable vendors were still selling not very long ago. There, you want a lot more than double: even ten times the performance is still awful, so what you really want is a modern card with modern capabilities and performance, even if it's no faster than modern integrated graphics.
4) You need to upgrade, but there isn't anything twice as fast as your old card on the market. Here, you kind of need to go for the top of the line to justify it, or maybe something slightly slower than the top of the line but massively cheaper. This situation only makes sense on a huge budget, but there are people who buy every new flagship card that comes out--or maybe two for CrossFire or SLI.
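Here's the rule and its exceptions condensed into a sketch; the function name and flags are made up for illustration, not any real API.

```python
# Illustrative encoding of the upgrade rule of thumb above. The names and
# flags are invented for this sketch; only the logic comes from the post.
def should_upgrade(speedup: float, *,
                   old_card_broken: bool = False,       # exception 1
                   need_missing_feature: bool = False,  # exception 2
                   old_card_ancient: bool = False,      # exception 3
                   buying_top_of_line: bool = False     # exception 4
                   ) -> bool:
    # Exceptions 1-3: performance isn't the deciding factor at all.
    if old_card_broken or need_missing_feature or old_card_ancient:
        return True
    # The usual bar: the new card should be at least twice as fast.
    if speedup >= 2.0:
        return True
    # Exception 4: nothing twice as fast exists, so only the top of the
    # line (on a huge budget) justifies the jump.
    return buying_top_of_line

print(should_upgrade(1.55))  # GTX 680 -> R9 290X at ~1.5-1.6x: False
```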
I covet that 290X, but I still can't justify it, since my "old" 7970 is still chugging along quite nicely, playing everything I play at max...
... well, at least on my 1920x1200 monitor it does. The only thing that has me even thinking about it is wanting to go up to a higher-res monitor while still maxing the graphics without using CrossFire.
And as nice as the 290X is, I'm still not convinced that I shouldn't wait a bit longer for both a GFX and monitor upgrade. Tempting as this is, I think I'll wait a bit.
Of course, everybody has different motivations for upgrading and different minimum requirements for an improvement; there's never going to be a general rule.
My view is that if you buy a top of the range card (like the 680 was), you're expecting a level of performance that will hold up for 2-3 years. A 680 now is not capable of playing recent games at 60fps on high settings, which is what most people buying higher end cards want.
The 290X is currently the best value card in terms of performance per price, and it's a very significant upgrade over a 680. Games are also starting to expect much more video RAM, which is something to bear in mind. At the same time, the 680 is more than capable of playing newer games at lowered settings (just as current relatively cheap cards are), so it all comes down to what you want.
Hey, I've got an HD 7850, and I always try to get the second-tier GFX card out there. Is it worth it for me to get a Radeon R9 290?
And will my power supply handle it? It is only 750W.
http://www.anandtech.com/bench/product/1076?vs=1056
That shows 7850 vs. R9 290 benchmarks.
Looks like it's right around 2x the performance.
Will your power supply handle it? Well, 750W is enough for anything, really, but the question isn't quantity, it's quality. If it's a good quality modern (<3-4 years old) power supply, and it natively has the 6+8 pin PCIe cables, I'd say it would be fine.
If you need pin adapters to get the GPU power hooked up correctly, if it's an off-brand or suspect power supply, or if it's 4 years old or older, I'd probably replace it along with your GPU.
650W is enough to comfortably run any single GPU/single CPU computer today, with room for modest overclocks.
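For what it's worth, here's the back-of-envelope arithmetic behind that claim; the TDPs are typical vendor figures and the margins are rules of thumb, not measurements.

```python
# Worst-case power budget for a single-GPU, single-CPU box. All figures
# are typical vendor TDPs and rule-of-thumb allowances, not measurements.
gpu_tdp = 250           # W -- an R9 290/290X class card, about the hottest single GPU
cpu_tdp = 130           # W -- a hot high-end desktop CPU
rest_of_system = 75     # W -- motherboard, RAM, drives, fans (generous)
overclock_margin = 1.2  # rough allowance for modest overclocks

peak_draw = (gpu_tdp + cpu_tdp + rest_of_system) * overclock_margin
print(f"Estimated worst-case load: {peak_draw:.0f} W")  # ~546 W, inside 650 W
```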
The newer AMD cards also support Mantle, which can replace DirectX in a few recent games. It can give anything from a small performance increase to a massive one, depending on your CPU, the game, etc.
The 7850 was never really a high end card; the 7950 was the step down from the 7970. A 290 is a huge leap up, as the benchmarks above show.
Exactly what power supply do you have? If it's a good quality 750 W power supply, you'll be fine. But if it's some random piece of junk that says 750 W on the box for purely marketing reasons, it could be more of a problem.
I just took a look at the specs, and it seems I was mistaken. It is only a 600W one.
Specifically: http://www.corsair.com/en/gs600w
So I guess I need a new PSU as well...
Not really. What you have is an okay quality power supply. It's not great, but the difference between okay and great is much smaller than the chasm between okay and awful. I won't discourage you from getting a better power supply if you've got the money and are inclined to do so, but a decent 600 W power supply will handle a 250 W video card just fine so long as you leave stuff at stock speeds and you're not doing something strange or stupid.
I just put together a build for someone with some money; they put in a nice Corsair AXi power supply that happens to tell you exactly how much power the computer is drawing (of note: a $30 Kill-A-Watt will do much the same thing).
This computer had a GTX 980 and an i7-4790, with some overclocks, a good handful of fans (6, I think), an SSD and HDD, and a few other toys. It ended up being a nice build, but it wasn't exactly a budget build.
I decided to see what the actual power draw was while I was doing some burn-in testing after the system was put together.
At stock clocks, with Prime95 and Furmark both running: 395W
With some "hit-the-button" auto overclocks: 475W
Now, the 980 only has a TDP of around 165W, and Haswell is more power efficient than the chips that came before it (though that tends to go out the window with overclocking). But you can see that even with some top-end components, the myth that you need a high-wattage power supply is pretty well bunk. You need a good quality power supply, but more wattage won't get you anything. What you really want is stability, not just raw power.
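One wrinkle worth flagging: if those numbers were read at the wall (as a Kill-A-Watt does), they include the PSU's own conversion losses, so the real DC load is a bit lower still. A minimal sketch, assuming a typical ~90% efficiency for a good 80 Plus Gold/Platinum unit at that load:

```python
# Convert wall-socket readings into the DC load the PSU actually delivers.
# The 90% efficiency figure is a typical assumption for a good 80 Plus
# Gold/Platinum unit at this load, not a measured value for this build.
efficiency = 0.90

for label, wall_watts in (("stock", 395), ("auto-overclocked", 475)):
    dc_load = wall_watts * efficiency
    print(f"{label}: {wall_watts} W at the wall ~ {dc_load:.0f} W DC load")

# Even overclocked, that's roughly 430 W of real load -- comfortable for
# any decent 550-650 W unit.
```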
A lot of people just assume a bigger number = a better power supply. Or that if you just go total overkill, that not running it at 100% is somehow better for it. A PSU that can't run at 100% reliably probably isn't going to run reliably at any power output.
Corsair, for the most part, is a pretty safe brand. You can overpay for the name, but you won't get burned on quality or customer support.
Ok, I will try with my current PSU and see if it stays stable. I do have an overclocked Intel i5-3570 @ 4.0 GHz, though. Not sure if that matters.
Thanks for the advice though, much appreciated!
Thanks, I did not know this. Seems weird to advertise a certain wattage but not be able to deliver it.
More than just being able to deliver the power goes into it. Things like consistency, voltage fluctuations, and whatnot factor into it as well.
Others can explain it better, as I am no expert on PSUs. I am running Seasonic 550W gold-rated PSUs in both my machines. They cost more, but are worth it to me for the quality.
Not to sidetrack this GFX thread into more PSU stuff, but I'll do one more little detour...
A lot of power supplies are made by OEMs and re-branded. The better Corsairs are usually made by Seasonic or Channel Well, both outstanding manufacturers. Here's a chart that tells you, by model, who made what: http://www.tomshardware.com/reviews/power-supply-oem-manufacturer,2913-5.html
I've now used the same Corsair HX1000 (made by Channel Well) in 3 builds. I've had it since 2009, and its voltage regulation is just as tight as it was 5+ years ago.