
Video upgrade from GTX 260, suggestions?


Comments

  • Loke666 Member Epic Posts: 21,441

    Originally posted by Quizzical

    Your recommendation seems to be basically to ignore prices.  A GeForce GTX 560 Ti or GeForce GTX 570 can sometimes be a reasonably good value for the money.  A GeForce GTX 550 Ti or GeForce GTX 580 pretty much never are.

    Agreed, the 550 is junk and the 580 is not worth the money; if you have loads of it, you would get the new 680 instead.

    The 570 is pretty nice for the price you can find it at if you look around a little, though.

  • Loke666 Member Epic Posts: 21,441

    Originally posted by fiontar

    Hi guys. I'm looking to upgrade my GTX 260 on my current rig, looking for "best bang for the buck" in the $180 to $280 price range.

    My system specs:

    Win 7 Pro

    AMD Phenom II X6 1055T Thuban 2.8 GHz, OC rock solid at 4.03 GHz

    8 GB Corsair XMS3 DDR3-1600 RAM (2x4 GB)


    Corsair CMPSU-850TX 850-Watt TX Series 80 Plus Certified Power Supply

    I know the price range is a bit broad, so suggestions at both ends of the range and thought on how much more performance I might expect for the extra $100 would be very much appreciated. :)

    Also, I would be curious if any new products worth waiting for in the price range are expected in the next couple of months. I could put off the upgrade for another month or two without issue.

    Thanks in advance.

    This one is 5 bucks over your price range, but you get a huge step up for those 5 bucks. Saving a few bucks on the card will come back to haunt you later; a good card will last longer, so it really costs the same in the long run, and you'll get a lot better graphics for the next 2 years (a good card lasts 3-4 years, a crap one 2 at best): http://www.newegg.com/Product/Product.aspx?Item=N82E16814134125

  • Loke666 Member Epic Posts: 21,441

    Originally posted by Amjoco

    2 GTX 460 sli! Works great and is cheap. OC fer sure

    No, it won't.

    Rule number one: SLI is great if you use 2 state-of-the-art cards. A single high-end card will give you more performance in the real world than 2 mid-range cards. A single 570 is a much better choice, particularly for MMO players. Many MMOs don't even support SLI, and most of the ones that do don't do it well enough. SLI is typically best for people who only play high-end FPS games.

    2 GTX 680s, on the other hand, are fine if you can afford them. Not a choice for the OP, though.

  • Quizzical Member Legendary Posts: 25,507

    Originally posted by Loke666

    This one is 5 bucks over your price range, but you get a huge step up for those 5 bucks. Saving a few bucks on the card will come back to haunt you later; a good card will last longer, so it really costs the same in the long run, and you'll get a lot better graphics for the next 2 years (a good card lasts 3-4 years, a crap one 2 at best): http://www.newegg.com/Product/Product.aspx?Item=N82E16814134125

    It's also ECS, and I wouldn't trust them for build quality.  I guess that's mainly a verdict on their motherboards, more so than their video cards, but still.  I'd sooner pay a few dollars more (and it's a difference of less than $3) for a Galaxy card, as the original poster did.

  • Amjoco Member Uncommon Posts: 4,860

    Originally posted by Loke666

    Originally posted by Amjoco

    2 GTX 460 sli! Works great and is cheap. OC fer sure

    No, it won't.

    Rule number one: SLI is great if you use 2 state-of-the-art cards. A single high-end card will give you more performance in the real world than 2 mid-range cards. A single 570 is a much better choice, particularly for MMO players. Many MMOs don't even support SLI, and most of the ones that do don't do it well enough. SLI is typically best for people who only play high-end FPS games.

    2 GTX 680s, on the other hand, are fine if you can afford them. Not a choice for the OP, though.

    Ew I wasn't comparing to anything, just said it works great and is cheap. :)

     

    OP, here's just a little graph. I think some others have also posted some. http://www.sweclockers.com/image/diagram/1970?k=2f29fbaae1fc8fea55a32bcd56cdda9e

    Death is nothing to us, since when we are, Death has not come, and when death has come, we are not.

  • Hopscotch73 Member Uncommon Posts: 971

    Quick question for Quizzical -

    I'm in a similar boat to the OP (except my card is a 240), and I was already looking at the 560 Ti. I came across an MSI N560GTX-Ti Twin Frozr II 2GD5/OC and was wondering if the 2 GB of VRAM makes it a better choice; it's only €20 more than the 1 GB version. While more is generally better in terms of VRAM, I know it's not always the case, so I thought I'd ask the resident guru.

    OP - sorry for slight derail!

  • fiontar Member Uncommon Posts: 3,682

    Originally posted by Quizzical

    Maximum power consumption is tricky to test, because if AMD and Nvidia know that it's a power consumption test, they'll have their drivers clock the card back severely to look like they're lower power.  My view is that you need to give performance figures and power consumption from the same test in order to have a meaningful result.

    I'd question their methodology as well.  If you're measuring system power consumption, then that includes other components, not just the video card.  Games tend to push a processor in addition to a video card, so you get some added power consumption there.  Using a Bloomfield/X58 setup that is horrible on power consumption will make that effect a lot worse than it needs to be.  Even if they somehow found an application that doesn't touch the processor at all, you're still adding in power supply inefficiencies.  That they don't account for that (which really isn't that hard to do) is rather shocking.  So they're probably seriously overestimating how much power the video cards use in their test.

    And then you should consider that that's nowhere near a real max load.  Consider, for example, that reviews of the Radeon HD 6970 found that Metro 2033 was able to make PowerTune kick in now and then.  That's with a cap of 250 W, which means that Metro 2033 was able to pull 250 W from the video card alone.  The Guru 3D test finds power consumption of 209 W, and once you subtract off the reasons why they're overestimating things, they're probably seeing under 200 W of power consumption.  That's not a max load on the card.  It might be typical power consumption for a game where the video card is the bottleneck, but the T in TDP doesn't stand for "typical".

    The reviewing site uses their own testing method, rather than benchmarks that the GPU manufacturers can game via drivers. The chart compares the cards tested by the site using their method. So, it's internally accurate, but may disagree with some of the values determined by other sites. :)

    Want to know more about GW2 and why there is so much buzz? Start here: Guild Wars 2 Mass Info for the Uninitiated

  • Quizzical Member Legendary Posts: 25,507

    Originally posted by Hopscotch73

    Quick question for Quizzical -

    I'm in a similar boat to the OP (except my card is a 240), and I was already looking at the 560 Ti. I came across an MSI N560GTX-Ti Twin Frozr II 2GD5/OC and was wondering if the 2 GB of VRAM makes it a better choice; it's only €20 more than the 1 GB version. While more is generally better in terms of VRAM, I know it's not always the case, so I thought I'd ask the resident guru.

    OP - sorry for slight derail!

    What else do you have in your system?  If you don't have the case and power supply to handle it, sticking a much higher power video card in your system can fry things.

    You don't really need more than 1 GB of video memory unless you're trying to use a resolution above 1920x1200.  If you want to play games at a resolution of 2560x1440 or larger, then you'd want a 2 GB card--but probably something faster than a GeForce GTX 560 Ti for that high of resolution.  As with system memory, video memory is largely a matter of, either you have enough or you don't.  If you have enough, then it doesn't matter if you have a little more than you need or ten times as much as you need.  The extra won't get used, anyway.
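
    The "either you have enough or you don't" point can be sketched as a toy threshold check (a minimal sketch; the GB figures below are illustrative assumptions, not measured requirements for any particular game):

    ```python
    # Toy illustration: video memory behaves like a threshold, not a gradient.
    # The numbers are made up purely to show the shape of the behavior.

    def vram_verdict(vram_needed_gb: float, vram_on_card_gb: float) -> str:
        if vram_on_card_gb >= vram_needed_gb:
            return "fine - any VRAM beyond what's needed simply sits unused"
        return "trouble - data spills out of VRAM and performance drops sharply"

    print(vram_verdict(0.8, 1.0))  # ~1920x1200-class load on a 1 GB card: fine
    print(vram_verdict(0.8, 2.0))  # same load on a 2 GB card: no faster, extra unused
    print(vram_verdict(1.6, 1.0))  # very high resolution on a 1 GB card: trouble
    ```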

    Cards that double the normal amount of video memory tend to be low volume parts, and that drives prices up.  It's very rare that they're a good value for the money, and I strongly suspect that you're saying, it's only €20 more than some other card that is also overpriced.  Where are you looking at buying a video card, anyway?

  • Jetrpg Member Uncommon Posts: 2,347

    Originally posted by Quizzical

    Maximum power consumption is tricky to test, because if AMD and Nvidia know that it's a power consumption test, they'll have their drivers clock the card back severely to look like they're lower power.  My view is that you need to give performance figures and power consumption from the same test in order to have a meaningful result.

    I'd question their methodology as well.  If you're measuring system power consumption, then that includes other components, not just the video card.  Games tend to push a processor in addition to a video card, so you get some added power consumption there.  Using a Bloomfield/X58 setup that is horrible on power consumption will make that effect a lot worse than it needs to be.  Even if they somehow found an application that doesn't touch the processor at all, you're still adding in power supply inefficiencies.  That they don't account for that (which really isn't that hard to do) is rather shocking.  So they're probably seriously overestimating how much power the video cards use in their test.

    And then you should consider that that's nowhere near a real max load.  Consider, for example, that reviews of the Radeon HD 6970 found that Metro 2033 was able to make PowerTune kick in now and then.  That's with a cap of 250 W, which means that Metro 2033 was able to pull 250 W from the video card alone.  The Guru 3D test finds power consumption of 209 W, and once you subtract off the reasons why they're overestimating things, they're probably seeing under 200 W of power consumption.  That's not a max load on the card.  It might be typical power consumption for a game where the video card is the bottleneck, but the T in TDP doesn't stand for "typical".

    So?

    There be a chart there; they all used the same method, so even if it's not 100% accurate, it shows a degree or trend of accuracy.

    Nm, quizz is always correct, Quizz is always correct (I wonder if this will keep me from having to respond to 3-5 paragraphs, I doubt it).

    "Society in every state is a blessing, but government even in its best state is but a necessary evil; in its worst state an intolerable one ..." - Thomas Paine

  • Ridelynn Member Epic Posts: 7,383


    Originally posted by Jetrpg
    So?
    There be a chart there; they all used the same method, so even if it's not 100% accurate, it shows a degree or trend of accuracy.
    Nm, quizz is always correct, Quizz is always correct (I wonder if this will keep me from having to respond to 3-5 paragraphs, I doubt it).

    You're better off looking at the TDP of the card - that's how much thermal power the card is rated to produce, and since there is a thing called the First Law of Thermodynamics, what comes out equals what goes in. So that means the power drawn by the video card is roughly the TDP rating - the waste heat produced by the card - give or take the electricity that actually goes into producing work (which is amazingly little), the power to run the cooling system, conversion inefficiencies, etc., which all together are a pretty small part of that number.
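
    To put rough numbers on that energy balance, here is a back-of-the-envelope sketch (the 250 W TDP, 150 W for the rest of the system, and 85% PSU efficiency are illustrative assumptions, not measurements of any particular setup):

    ```python
    # Back-of-the-envelope energy balance: essentially all of the electrical power a
    # GPU draws leaves it again as heat, so the card's draw is roughly its rated TDP.
    # All numbers below are illustrative assumptions.

    gpu_tdp_w = 250.0         # assumed card TDP (heat the cooler must handle)
    rest_of_system_w = 150.0  # assumed CPU, motherboard, drives, fans under a gaming load
    psu_efficiency = 0.85     # assumed power supply efficiency at this load

    card_draw_w = gpu_tdp_w   # ~TDP, give or take fan power and the tiny "work" term
    wall_reading_w = (card_draw_w + rest_of_system_w) / psu_efficiency

    print(f"Card draw from the PSU:    ~{card_draw_w:.0f} W")
    print(f"Meter reading at the wall: ~{wall_reading_w:.0f} W")
    # The wall reading folds in the rest of the system plus PSU conversion losses,
    # which is why whole-system measurements overstate the card's own draw.
    ```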

    Now, I won't claim that TDPs are entirely accurate; nVidia in particular has been known to stretch the truth on these when it suits them - mainly because they have no way of actually capping a card to prevent it from going beyond its rated TDP. Overclocks invalidate them too, and you can get some really nice yield chips that will come in under TDP. But they should be close, and fairly representative, and they make for a good comparison tool if nothing else.

    But it's probably a better benchmark than someone sitting with a $20 Kill A Watt meter plugged into a computer, trying to divine how much of that total load is due to the video card and how much is due to driver differences, or a hard-coded exception in the driver (*cough* Furmark), or one card being able to offload physics, etc.

    I do agree - that one website is internally consistent if it runs the same test and just swaps out video cards. However, they can't possibly account for things like internal driver hardcodes - which are very common and, before PowerTune, were the only way that GPU manufacturers had to protect their cards against runaway power consumption. If you were to actually bench something like Furmark without the hard code, nVidia cards would run off the charts (possibly damaging themselves in the process), and AMD cards would just keep bumping up against their PowerTune settings. So while it's internally consistent, it can't account for things like this.

  • Jetrpg Member Uncommon Posts: 2,347

    I'll take ACTUAL readings, scientific measurements (even if not totally controlled), over manufacturer's ratings any day. Similarly, I can see the flop rate and computing power of two video cards, CPUs, etc., and the highest isn't always the best performer. Well, by that rating it should be.

    This isn't to say the rating is wrong, only that I feel actual measurements seem more controlled/accurate.

    Though I totally believe the 570 has higher power consumption; I just would have used sources to prove that, or told them to google it some more.

    PS - edit - Power consumption is a joke; picking one GPU or another using that as a metric is silly.

    "Society in every state is a blessing, but government even in its best state is but a necessary evil; in its worst state an intolerable one ..." - Thomas Paine

  • Quizzical Member Legendary Posts: 25,507

    Originally posted by Jetrpg

    I'll take ACTUAL readings, scientific measurements (even if not totally controlled), over manufacturer's ratings any day. Similarly, I can see the flop rate and computing power of two video cards, CPUs, etc., and the highest isn't always the best performer. Well, by that rating it should be.

    This isn't to say the rating is wrong, only that I feel actual measurements seem more controlled/accurate.

    Though I totally believe the 570 has higher power consumption; I just would have used sources to prove that, or told them to google it some more.

    PS - edit - Power consumption is a joke; picking one GPU or another using that as a metric is silly.

    Look at the chart in question.  It says "Calculated card TDP".  TDP doesn't mean "power used when playing a typical game".

    TDP stands for Thermal Design Power.  It means, this is the most power that the card could plausibly use for thermally-significant lengths of time (1 minute is significant, but 1 ms is not).  TDP is a way for card manufacturers to tell people who build systems, this is the amount of power that you need to be able to deliver to the video card, and the amount of heat that you need to be able to dissipate from it.  If you can deliver this much power and dissipate this much heat (while keeping an ample supply of suitably cool air delivered to the card intake fan(s)), then we promise that the card will work safely in your system.

    The amount of power that a card uses in an average game is irrelevant here.  If a given video card uses 100 W in game A, 150 W in game B, and 200 W in game C, then you might reasonably say that it uses 150 W in an average game.  But if the card is designed to be safe at 150 W and not at 200 W, then game C might well fry the card.  The TDP needs to be at least 200 W.
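
    As a trivial sketch of that point, using the same made-up 100/150/200 W figures (the 150 W design limit is likewise hypothetical):

    ```python
    # Why TDP has to cover the worst realistic case rather than the average case.
    # The per-game wattages are the made-up figures from the paragraph above.

    game_power_w = {"game A": 100, "game B": 150, "game C": 200}
    design_limit_w = 150  # what the card was (hypothetically) built to handle safely

    average_w = sum(game_power_w.values()) / len(game_power_w)
    worst_w = max(game_power_w.values())

    print(f"Average gaming draw: {average_w:.0f} W")  # fine for estimating electric bills
    print(f"Worst-case draw:     {worst_w} W")        # this is what the TDP must cover

    for game, watts in game_power_w.items():
        if watts > design_limit_w:
            print(f"{game}: {watts} W exceeds the {design_limit_w} W limit - risk of frying the card")
    ```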

    Which measurement is relevant depends on what you're trying to do.  If the question is what your electric bills are going to be, then TDP doesn't matter.  Average power consumption is what matters--and not just in games, either; idle power consumption actually plays a huge role here.  If you're trying to pick out a case and power supply, then average gaming power consumption doesn't matter.  If you want to avoid frying things, the TDP is what matters.

    AMD cards with PowerTune have a simple way to give the TDP:  whatever the PowerTune cap is.  If a card has a PowerTune cap of 200 W, and some power virus shows up that would make the card pull 250 W, then PowerTune will throttle back the clock speeds so that the card still only uses 200 W.

    Furthermore, the TDP is adjustable by the end-user, as you can change the PowerTune cap.  If you get a Radeon HD 7970 with its TDP of 250 W, and then set the PowerTune cap to 200 W, then the card now has a TDP of 200 W.  That won't reduce typical gaming power consumption by 50 W; in many games it won't have any effect at all, and even for games where it does, it might reduce typical power consumption by 5 or 10 W, not 50.
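
    A simplified sketch of how a PowerTune-style cap behaves (an illustration of the idea only; real PowerTune works from fine-grained activity estimates, and the linear clock-to-power scaling, the 880 MHz base clock, and the wattages assumed here are all simplifications):

    ```python
    # Toy model of a PowerTune-style cap: if a workload would push the card past the
    # cap, clocks are scaled back until estimated power fits under it.
    # Assumes power scales roughly linearly with clock speed, which is a simplification.

    def apply_power_cap(base_clock_mhz: float, uncapped_power_w: float, cap_w: float):
        """Return (effective_clock_mhz, power_used_w) under a PowerTune-style cap."""
        if uncapped_power_w <= cap_w:
            return base_clock_mhz, uncapped_power_w   # cap never kicks in
        throttle = cap_w / uncapped_power_w           # fraction of full clocks allowed
        return base_clock_mhz * throttle, cap_w       # clocks drop, power pinned at cap

    # A "power virus" that would pull 250 W on a card capped at 200 W:
    clock, power = apply_power_cap(base_clock_mhz=880, uncapped_power_w=250, cap_w=200)
    print(f"Clock throttled to ~{clock:.0f} MHz, power held at {power:.0f} W")

    # A typical game that only wants 180 W is untouched by the same cap:
    clock, power = apply_power_cap(base_clock_mhz=880, uncapped_power_w=180, cap_w=200)
    print(f"Clock stays at {clock:.0f} MHz, power {power:.0f} W")
    ```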

    For cards that don't have anything analogous to PowerTune, the TDP is a number that the company has to pick.  For Fermi cards, the TDP that Nvidia listed was basically a lie, to try to cover up how bad the cards were on power consumption.  The generation before that, when Nvidia and AMD were competitive on performance per watt, Nvidia listed more honest TDPs that would be rather difficult to reach in real games.

    "There be a chart there, they all used the same method, at least if not 100% accurate shows a degree or trend of accuracy."

    That's why they play the games.  Well, some sites don't.  But that's why we look to Hard OCP for video card reviews.  If you can get higher performance with lower power consumption in real games (not just canned benchmark demos of real games), then that's not cheating benchmarks.  That's what video cards are supposed to try to do.

  • Lazureus Member Posts: 26

    Nvidia GTX 560 Ti, either single or in SLI.


  • Ridelynn Member Epic Posts: 7,383


    Originally posted by Jetrpg
    I'll take ACTUAL readings, scientific measurements (even if not totally controlled), over manufacturer's ratings any day. Similarly, I can see the flop rate and computing power of two video cards, CPUs, etc., and the highest isn't always the best performer. Well, by that rating it should be.
    This isn't to say the rating is wrong, only that I feel actual measurements seem more controlled/accurate.
    Though I totally believe the 570 has higher power consumption; I just would have used sources to prove that, or told them to google it some more.
    PS - edit - Power consumption is a joke; picking one GPU or another using that as a metric is silly.


    My point is, these readings aren't even close to scientific.

    At least with the TDP, if it's totally bogus, you can hold the company publishing that number responsible.

    Would I pick a video card based on a power metric? All other things equal, or near equal, I absolutely would. Small form factor, or low noise, I absolutely would. Laptop - exclusively and over nearly all other factors. It's not just some meaningless statistic; it has real-world ramifications, even in desktop computers. If the first wave of GTX 480s didn't prove that, the GTX 590 certainly did.

    And if someone doesn't care to google (or research, as a scientific-minded person may put it) my statements to determine their validity - I don't care. You can believe me, or verify what I say, or just dismiss it. It doesn't affect the truth (or ignorance) of my statements.

    I'll even, this time, include some links to "prove" my points:

    http://hardocp.com/article/2010/09/30/my_quiet_galaxy_geforce_gtx_480_sli_build/


    Running these 480 cards at full load for a period of 30 minutes has the ability to raise the temperature in my office a few degrees easily. I can certainly feel the 480 exhaust pouring out from under my desk while gaming. Actually I can even feel the heat with the box simply at a 2D desktop. Even though I have gotten the box a bit further from me, I am still thinking about building a little heat shield so it directs the heat behind my back rather than at my side.


    (this one made me laugh out loud)

    http://hardforum.com/showthread.php?t=1596174

    http://www.guru3d.com/article/geforce-gtx-590-review/21



    Wrapping up, you know I'm still a little flabbergasted as to why NVIDIA clocked the card as low as they did. Likely they are ying-yang with the fact that this card performance roughly equal slash close to R6990 performance and simply opted for low noise and a fashionable power consumption.
    (Guess why they had to clock it as low as they did - because they had to, to fit the PCI power envelope)
  • Hopscotch73 Member Uncommon Posts: 971

    Originally posted by Quizzical

    What else do you have in your system?  If you don't have the case and power supply to handle it, sticking a much higher power video card in your system can fry things.

    You don't really need more than 1 GB of video memory unless you're trying to use a resolution above 1920x1200.  If you want to play games at a resolution of 2560x1440 or larger, then you'd want a 2 GB card--but probably something faster than a GeForce GTX 560 Ti for that high of resolution.  As with system memory, video memory is largely a matter of, either you have enough or you don't.  If you have enough, then it doesn't matter if you have a little more than you need or ten times as much as you need.  The extra won't get used, anyway.

    Cards that double the normal amount of video memory tend to be low volume parts, and that drives prices up.  It's very rare that they're a good value for the money, and I strongly suspect that you're saying, it's only €20 more than some other card that is also overpriced.  Where are you looking at buying a video card, anyway?

    Same power supply as the OP; the mobo is an Asus P7P55 LX, the processor is an i7 860 at 2.80 GHz (not OC'd), 8 GB of DDR3-1600 RAM, and the case is an Antec 900. No SSD, just a single SATA HD, so I think I'm good as far as power draw and cooling go.

    I built it about 18 months ago but stuck the card from my old rig into it due to budget constraints. The 240 copes, but I generally have to turn off shadows and run at medium resolution in games to preserve framerates.

    I use dual monitors at work, but at home I have a single 24" HPw2216 - which isn't in that scope (2560x1440) where native resolution is concerned. I was thinking of going dual monitor at home too, but I wanted to upgrade my gfx card first.

    I'm in Ireland, so was looking at web based retailers here, the card I was looking at was this one:

    http://www.komplett.ie/Komplett/product/nvidia/80004678/msi_geforce_n560gtx_ti_tf_ii_oc_2gb_pci_e/details.aspx

    But based on what you're saying, this would make more sense for me:

    http://www.komplett.ie/Komplett/product/nvidia/20076109/asus_geforce_engtx560_ti_dcii_2di_1gd5_1gb_pci_e/details.aspx

    The €20 price difference was between 2 MSI cards, the 2GB one linked above and the 1GB version.

    I really wish we had a newegg equivalent over here, we get hammered on prices and tax.

    Thanks for taking the time to get back to me - much appreciated!

  • Quizzical Member Legendary Posts: 25,507

    That's way too much to pay for a GeForce GTX 560 Ti.  For a 2 GB card, you can probably find a Radeon HD 7850 that is superior to the GTX 560 Ti in nearly every way for cheaper than that.  Well, the cooler won't be as good, but it won't need to be, as the card will only use maybe 2/3 as much power--while offering better performance.  For example:

    http://www.elara.ie/productdetail.aspx?productcode=MMEP700502

    Even the 1 GB card is really too much to pay for a GeForce GTX 560 Ti.  For example, you could get this:

    http://www.komplett.ie/Komplett/product/ati__amd/20080832/sapphire_radeon_hd_6870_1gb_pci_e/details.aspx

    That will get you around 90% of the performance for under 73% of the price tag.

    Even if you want a GeForce GTX 560 Ti, you can get one cheaper elsewhere:

    http://www.pixmania.ie/ie/uk/8932341/art/gainward/geforce-gtx-560-ti-1-gb-g.html

    That's not as good of a cooler, though, and Gainward really isn't the best brand.  (They used to sell AMD cards, too, and AMD was like, your cards are terrible, so we're not going to sell you any more GPU chips.)

    Or you could pick up a faster GeForce GTX 570 for not much more:

    http://www.pixmania.ie/ie/uk/11677086/art/inno3d/geforce-gtx-570-game-pack.html

    I'm not familiar with Inno3D, as they don't sell cards in the US.

  • moosecatlol Member Rare Posts: 1,531

    680 for the power usage, something anyone can appreciate after the 590's monthly bills.

  • Hopscotch73 Member Uncommon Posts: 971

    Thanks Quizzical  - your search-fu is a thing of wonder. 

    Plenty of options there - I think I'll go for the 6870. I have been sticking to nVidia for the past few years out of habit, but given what you've said about power/heat in nVidia cards, I think I'll give AMD a shot this time round.

    Again, thank you for your help!

     

     

  • Quizzical Member Legendary Posts: 25,507

    Excessive power consumption is really only a problem if either you don't have the case and/or power supply to handle it, or you find excessive heat coming out of the case and into your room to be a problem.  Not having an adequate case can be either because you've got a cheap junk case (e.g., what you might get from Dell or HP), or because you're trying for a small form factor such as a laptop.  For a given level of performance, you'd nearly always rather have less power consumption than more, but if you've got the case and power supply to handle it, more power consumption isn't really that big of a deal.

    Just to give you a ballpark approximation, in order to get one made-up unit of graphical performance, a video card will need to use about:

    Radeon HD 4000 series, GeForce 100/200/300 series, GeForce GTX 465/470/480 cards:  2 W

    GeForce 400 and 500 series cards other than 465/470/480:  1.7 W

    Radeon HD 5000 and 6000 series cards:  1.4 W

    Radeon HD 7000 series and GeForce 600 series cards:  1 W

    Of course, the problem with GeForce 600 series cards is that most of them aren't out yet, and the only one that is (GTX 680) is generally out of stock everywhere.

    If you think power consumption is a huge deal, you can get a Radeon HD 7770 for about the same price as the 6870, but it will only give maybe 80% of the performance of the 6870.  I'd advise against going that route unless you're trying to cram the card into a rather inadequate case, however.  The "Barts" chip of the 6870 is presumably already discontinued, and once those cards disappear and AMD can get better volume from TSMC, the 7770 will probably drop in price a bit.
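
    If it helps, those ballpark figures can be turned into a quick estimator (a rough sketch only; the watts-per-unit values are the made-up approximations from the list above, and the performance "units" are arbitrary, so only the relative comparison means anything):

    ```python
    # Rough power estimator built from the ballpark watts-per-performance-unit list above.
    # Performance "units" are arbitrary; only the relative comparison is meaningful.

    watts_per_unit = {
        "Radeon HD 4000 / GeForce 100-300 / GTX 465-480": 2.0,
        "GeForce 400/500 series (other than 465/470/480)": 1.7,
        "Radeon HD 5000/6000 series": 1.4,
        "Radeon HD 7000 / GeForce 600 series": 1.0,
    }

    target_units = 100  # pick an arbitrary performance target

    for family, wpu in watts_per_unit.items():
        print(f"{family}: ~{target_units * wpu:.0f} W for {target_units} units")
    # The newest families hit the same made-up performance target for about half the
    # power of the oldest ones, which is the whole point of the ballpark list.
    ```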
