
Overclocked 9xx--can it touch the 2500k?

boikymar Member Posts: 60

Hey there. I'm well aware that no AMD CPU can keep up with Sandy Bridge processors at stock speeds. Obviously, if you overclock both the 955 and the 2500K, the 2500K will blow away the 955 just as badly as it does at stock speeds. But will a Phenom II X4 955 BE OCed to 4.0 GHz (give or take) hold its own against a 2500K at its stock 3.3 GHz? As you may have guessed, I'm a 955 owner and I'm debating upgrading to Sandy/Ivy. Money's tight right now, but the benchmarks I've seen for the 2500K are incredible.

Comments

  • Kabaal Member UncommonPosts: 3,042

    No, it won't, due to the built-in turbo mode on the 2500K: it automatically boosts itself to 3.7 GHz right out of the box when under load. It's a bit silly comparing them with the 955 overclocked and the 2500K not, as the 2500Ks are incredible overclockers. That being said, you might be better off spending the money on a good graphics card, depending on what you already have.

  • Mehve Member Posts: 487

    I suppose it depends on where you get your benchmarks, but I seriously doubt anything under 4.2 GHz is even going to be a contender against a stock 2500K - there's a pretty significant difference in clock-for-clock efficiency between the two.

    A Modest Proposal for MMORPGs:
    That the means of progression would not be mutually exclusive from the means of enjoyment.

  • Quizzical Member LegendaryPosts: 25,507

    No, it won't.

    Still, I'd advise against getting a Core i5 2500K.  A Phenom II X4 955 is still a capable gaming processor.  If you've got the itch to upgrade, then I'd wait for Ivy Bridge, which is rumored to launch on April 29.

    Even so, there's the question of whether the performance difference matters.  If a Phenom II X4 955 gets you 70 frames per second at stock speeds and 80 when overclocked, and a Core i5 2500K gets you 90 frames per second at stock speeds and 110 when overclocked, does that difference matter?  Eventually you'll hit a game where the performance difference does matter, but that could be years away.
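    A quick sketch of that arithmetic in Python, using the hypothetical FPS numbers above (they're illustrative, not benchmarks). Converting frame rates into per-frame times shows how small the real-time gap actually is:

```python
# Convert the hypothetical FPS figures into per-frame render times.
# At high frame rates, a big FPS gap is only a few milliseconds per frame.
def frame_time_ms(fps):
    """Milliseconds spent rendering each frame at a given frame rate."""
    return 1000.0 / fps

for label, fps in [("955 stock", 70), ("955 OC", 80),
                   ("2500K stock", 90), ("2500K OC", 110)]:
    print(f"{label}: {frame_time_ms(fps):.1f} ms per frame")
```

    The gap between 70 and 90 FPS works out to about 3 ms per frame, which is why the difference is hard to notice until a game actually becomes CPU-limited.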

  • boikymar Member Posts: 60

    ^ Genius answer, Quizzical. I've had an "itch" to upgrade for the past month or so...but I think I'm content with my 955. I have it OCed to 3.9 GHz and am getting 250-300 FPS on Guild Wars with max settings and 4x AA. I'm guessing that 60+ FPS in GW2 will be a breeze...although I'm sure I'll have to upgrade my 6770 in order to do so on max. I'm thinking a 6950 2GB or a GTX 570. I run dual monitors @ 1080p but only play my games on one monitor. The other monitor usually plays host to Mozilla, iTunes, etc. I'm leaning towards the 570--just waiting for a price drop.

  • ShakyMo Member CommonPosts: 7,207
    6950, fella - there's a good chance you won't need to change your PSU to support it
  • ShakyMo Member CommonPosts: 7,207
    i5 with 69xx for a high rig
    i7 with 79xx or 680 for an uber (but expensive, and not needed for a few years yet) rig
    FX4 or Phenom II with 68xx for a highish rig that will play all current games on high at 1080p
  • ShakyMo Member CommonPosts: 7,207
    Never buy FX6 or FX8 though - they're server chips, actually slower for a games rig than an FX4 or the high-end Phenom IIs
  • boikymar Member Posts: 60

    My Antec Earthwatts 650W will handle any upgrades I'm thinking of at this point. But like I said, I'm going to stick with my 955. I will be looking at a higher-end card, though. I'm thinking that whenever I switch over to Intel, I'll crossfire/SLI the card that I'm using in my AMD rig. I'm leaning towards a 570 just because of drivers. Anyone want to advise me otherwise?

  • Caldrin Member UncommonPosts: 4,505

    Nothing wrong with ATI drivers these days; I've been using them for years now.

    Currently running 2 x 6850s.

  • boikymar Member Posts: 60

    I haven't experienced any problems firsthand with ATI drivers, but I've heard some horror stories about Crossfire. SLI supposedly emulates a single card's performance much better than Crossfire does--mainly by reducing microstuttering. I've never run Crossfire, and there are so many different benchmarks and opinions that it's hard to use them to form my own.

  • Quizzical Member LegendaryPosts: 25,507

    Cayman (6900 series) cards are long gone, so I wouldn't try to hunt those down.

    A Radeon HD 6870 will give you a little less than double the performance of your current video card.  And it's also cheaper than you might think:

    http://www.newegg.com/Product/Product.aspx?Item=N82E16814102948

    Note the promo code:  that's under $160 with shipping and before rebate.

    Above a Radeon HD 6870, you end up having to pay a lot more for something only somewhat faster.  Depending on budget and what happens to be on sale that day, you can sometimes make a case for:

    GeForce GTX 560 Ti around $200

    Radeon HD 7850 around $240

    GeForce GTX 570 around $270

    Radeon HD 7870 around $340

    GeForce GTX 580 around $370

    The prices there matter, not just the card names.  If you find a GeForce GTX 560 Ti for $250, then that's a massive waste of money.  For what it's worth, a GeForce GTX 580 is maybe 50% faster than a Radeon HD 6870, while costing more than twice as much.  So performance per dollar drops as you go up the chain, but that's part of the price you pay to get a higher end card.

    For what it's worth, the Radeon HD 7000 series cards have massively better (about 70% better) performance per watt than the GeForce 500 series cards, as well as a better feature set.
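    A rough way to see that diminishing-returns curve is to divide relative performance by price. The prices are the ones listed above; the relative-performance numbers here are illustrative guesses (only the ~50% figure for the GTX 580 vs. the 6870 comes from this post), so treat this as a sketch of the math, not a benchmark:

```python
# Performance-per-dollar sketch: Radeon HD 6870 = 1.00 baseline.
# Prices are from the post above; relative performance numbers are
# illustrative guesses, except the GTX 580's ~1.50x vs. the 6870.
cards = {
    "Radeon HD 6870":     (160, 1.00),
    "GeForce GTX 560 Ti": (200, 1.10),
    "Radeon HD 7850":     (240, 1.25),
    "GeForce GTX 570":    (270, 1.35),
    "Radeon HD 7870":     (340, 1.45),
    "GeForce GTX 580":    (370, 1.50),
}

perf_per_dollar = {name: perf / price for name, (price, perf) in cards.items()}
for name, ppd in sorted(perf_per_dollar.items(), key=lambda kv: -kv[1]):
    print(f"{name}: {ppd * 100:.2f} performance units per $100")
```

    Whatever exact numbers you plug in, the 6870 comes out on top of the value ranking and the 580 at the bottom, which is the point: you pay a value premium for each step up the chain.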

  • Quizzical Member LegendaryPosts: 25,507

    Originally posted by boikymar

    I haven't experienced any problems firsthand with ATI drivers, but I've heard some horror stories about Crossfire. SLI supposedly emulates a single card's performance much better than Crossfire does--mainly by reducing microstuttering. I've never run Crossfire, and there are so many different benchmarks and opinions that it's hard to use them to form my own.

    SLI doesn't emulate a single card's performance at all.  Neither does CrossFire.  Microstutter and increased latency are simply problems intrinsic to alternate frame rendering, and there's nothing you can do about them other than going with a single card.

  • boikymar Member Posts: 60

    Originally posted by Quizzical

    Originally posted by boikymar

    I haven't experienced any problems firsthand with ATI drivers, but I've heard some horror stories about Crossfire. SLI supposedly emulates a single card's performance much better than Crossfire does--mainly by reducing microstuttering. I've never run Crossfire, and there are so many different benchmarks and opinions that it's hard to use them to form my own.

    SLI doesn't emulate a single card's performance at all.  Neither does CrossFire.  Microstutter and increased latency are simply problems intrinsic to alternate frame rendering, and there's nothing you can do about them other than going with a single card.

    Thanks for the response, Quiz! I didn't mean that SLI literally acted as one card. There was a pretty extensive test regarding microstuttering differences between SLI and Crossfire (Tom's Hardware, I think?), and it showed that SLI's sudden FPS drops were way less significant than Crossfire's. That's what I meant to say. My bad.

  • Quizzical Member LegendaryPosts: 25,507

    Originally posted by boikymar

    Originally posted by Quizzical


    Originally posted by boikymar

    I haven't experienced any problems firsthand with ATI drivers, but I've heard some horror stories about Crossfire. SLI supposedly emulates a single card's performance much better than Crossfire does--mainly by reducing microstuttering. I've never run Crossfire, and there are so many different benchmarks and opinions that it's hard to use them to form my own.

    SLI doesn't emulate a single card's performance at all.  Neither does CrossFire.  Microstutter and increased latency are simply problems intrinsic to alternate frame rendering, and there's nothing you can do about them other than going with a single card.

    Thanks for the response, Quiz! I didn't mean that SLI literally acted as one card. There was a pretty extensive test regarding microstuttering differences between SLI and Crossfire (Tom's Hardware, I think?), and it showed that SLI's sudden FPS drops were way less significant than Crossfire's. That's what I meant to say. My bad.

    It varies from game to game and from card to card.  Tech Report does some pretty in-depth testing on this, as they'll look for stuff like the 99th percentile frame time to try to catch hitching.  And sometimes they'll produce results such as saying that for a given game, the same brand has both the cards that scale the best and also the cards that scale the worst.
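    For anyone curious what a 99th-percentile frame-time number actually measures, here's a minimal sketch (nearest-rank method; the two frame-time lists are made up purely to show why the metric catches hitching that an average-FPS number hides):

```python
# Sketch of a 99th-percentile frame-time metric: sort per-frame render
# times and look near the worst end, where hitching shows up even when
# the average frame rate looks fine.
def percentile_frame_time(frame_times_ms, pct=99):
    """Frame time (ms) at the given percentile, nearest-rank method."""
    ordered = sorted(frame_times_ms)
    rank = max(0, int(round(pct / 100 * len(ordered))) - 1)
    return ordered[rank]

# Two hypothetical 100-frame runs with nearly identical averages:
smooth = [16.7] * 99 + [18.0]          # consistent pacing
stutter = [15.0] * 90 + [33.0] * 10    # periodic hitching
print(percentile_frame_time(smooth))   # stays low
print(percentile_frame_time(stutter))  # the hitches dominate
```

    Both runs average roughly 60 FPS, but the stuttering run's 99th-percentile frame time is about double the smooth run's, which is exactly the kind of difference that frame-time testing surfaces and average FPS hides.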

    One thing that is important to realize is that the big tech sites usually test popular games--precisely the sort of games that AMD and Nvidia put a lot of work into optimizing their drivers for.  Tests in those games are not necessarily representative of results in obscure indie games that the AMD and Nvidia driver teams never tested, let alone put a bunch of work into optimizing.

  • boikymar Member Posts: 60

    Thanks Quiz. I, like many others, appreciate your insight.
