OK, the time to upgrade my GPU continues to draw nearer. Which is the best value? Which one would you get in the price range of these two cards?
Edit: Gaming at 22" 1080p and would consider 1440p possibly before too long though probably not for another year after I do my build.
Comments
If power consumption is a huge deal to you, then the GTX 970. If you think you'll need more than 4 GB in the useful life of the card (which I'm skeptical of), then the R9 390. If you're planning on getting a new monitor with adaptive sync in the useful life of the card, then probably the R9 390, unless Nvidia decides to support it. (G-sync does the same thing, but adds about $150 to the price tag of a monitor over adaptive sync.)
Otherwise, let's wait and see some reviews first. On performance alone, likely the GTX 970, but I'd want to see some reviews to see just what AMD got out of a respin.
Hands down the 970. The 300 series is just another rebrand of the power-hungry 200 series with minor tweaks, overclocked about as far as it will go. The 300 series cards are heat machines with no headroom left for further overclocking. Maxwell beats all of them on efficiency.
Tomshardware: So, is this just another rebrand? Sadly, that’s pretty much the bottom line. There’s no real innovation to speak of in AMD’s 300 series, at least as far as these models go. Let’s hope for the company’s sake that Fiji doesn’t turn into Bermuda.
If you are looking towards 4K, neither of these cards is in that arena. You'll pay $600+ to have something that can handle that comfortably.
You ask us about two gfx cards yet you fail to provide the most important piece of information: the size of your monitor and the resolutions you play at.
Anyway, the 970 runs quieter and way cooler, consumes much less power, and is easy to overclock quite high without raising its temperatures. Performance-wise, both score about the same at 1080p. At higher resolutions the AMD card pulls ahead.
Temperature and noise depend on other things, especially the cooling system, and not just the GPU chip itself. Now, it is easier to make a card cooler and quieter with a GPU chip that doesn't put out very much heat.
But "it's easier" isn't the same as "it's always done". The GeForce FX 5800 was notoriously loud with a TDP no higher than 75 W (and possibly a lot less than that; I can't find a reliable source). The original Pentium was notoriously hot with a TDP of 5.5 W.
Nonsense. I'd be willing to put money on almost all gamers still using 60 Hz monitors; the enthusiasts using 144 Hz etc. are a tiny minority. Cutting edge is not the norm.
I use HDMI to the TV when playing games like FIFA 2015, The Witcher, etc... I must not be a proper gamer, going by your thinking.
I would look into upgrading to 1440p with the cards you are interested in; otherwise it's like putting a Hemi in a Toyota Corolla with a speed limiter set at 60 mph.
PCI? or PCI-e 2.0?
If it's PCIe x16 2.0, it's still very viable tech; I don't think even slotting a GTX 980 Ti will be able to saturate the bandwidth of PCIe x16 2.0. If you were slotting an Intel 750 Series SSD, then you'd want 3.0. By the fall, though, a lot of these prices will drop, and the GPU wars have just gotten started with AMD's new lineup. Some tech sites have predicted SSD prices will drop to 10 cents per GB.
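For a rough sense of the numbers behind that claim, here's a quick back-of-the-envelope sketch. The per-lane transfer rates and encoding overheads are the standard PCIe 2.0/3.0 figures; nothing here is measured from an actual card.

```python
# Back-of-the-envelope PCIe link bandwidth, per direction.
# Per-lane rates and encoding overheads are the published PCIe spec figures.

def pcie_bandwidth_gb_s(gt_per_s, lanes, encoding_efficiency):
    """Usable one-direction bandwidth in GB/s for a PCIe link."""
    # Each transfer carries 1 bit per lane; line encoding eats part of the raw rate.
    return gt_per_s * lanes * encoding_efficiency / 8  # bits -> bytes

# PCIe 2.0: 5 GT/s per lane, 8b/10b encoding (80% efficient)
gen2_x16 = pcie_bandwidth_gb_s(5.0, 16, 8 / 10)      # ~8 GB/s
# PCIe 3.0: 8 GT/s per lane, 128b/130b encoding (~98.5% efficient)
gen3_x16 = pcie_bandwidth_gb_s(8.0, 16, 128 / 130)   # ~15.8 GB/s

print(f"PCIe 2.0 x16: {gen2_x16:.1f} GB/s per direction")
print(f"PCIe 3.0 x16: {gen3_x16:.1f} GB/s per direction")
```

So a 2.0 x16 slot still gives you roughly 8 GB/s each way, which is why a single gaming GPU doesn't tend to be bus-limited there; the Intel 750 is the kind of device where the extra 3.0 headroom actually matters.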
Whatever you choose for your GPU, make sure you figure out what resolution you want to play at and then go from there: buy the best GPU you can afford if you are going to use the PC mostly for gaming. If you are going to use the PC for professional work, then you'll have to balance your budget between the CPU and GPU. For gaming, get a good CPU and splurge on the GPU; don't hold out on the GPU, but don't skimp on the other components either. Keep an eye on the Fury X, especially if you're going to jump on the VR bandwagon.
Breh, it's got bigger numbers. Everyone knows that when the numbers go up (insert sarcasm) it's soooooo much better, just like my Alienware. My Alienware has so much Gee Bees - like, a lot. I had the Alienware X but then I got the XX and gave it to my younger brother because I got Alienware X Black xtreme Gee Bee edition w/ additional molecules. Not with HDMI, not with DVI, but with DisplayPort 96. Because MOBAs and CS:GO.
I don't care to correct all of the problems with your post, but the adaptive sync standard requires use of DisplayPort. DVI is a legacy port on its way out.
HDMI 1080p has been the budget gamer's standard for many years now. I could make up some statistics, but I won't; it has been a very popular interface for whatever reason. Probably because the first wave of "affordable" LCD monitors back in the early/mid-2000s were mostly rebranded televisions, and it just carried over after that.
If you build it in the fall, then I'd wait. The AMD Fury (X) is about to release, and from what I read the Fury X2 will release in the fall. Maybe prices will drop with its release, so my suggestion is to wait a bit.
Also, the 290X and the 390X perform nearly the same, so that could be something to consider.