
Will an Asus G74SX (Best Buy version) be able to handle GW2?


Comments

  • Kalfer Member Posts: 779

    Originally posted by makii

    Originally posted by wojtekpl


    Originally posted by Alot

    I hang around quite a lot in the Guild Wars 2 Guru - Technical Support section, and it seems that the computers at GamesCom had 4 GB RAM, GTX 460s, and Intel i5-750s, and ran Guild Wars 2 at maximum settings with 4x AA at 1920x1080.

    I recommend you take a look at the Grand Royal Unified GW2 Requirements Discussion Thread.

    Hehe

    I have an HD 6850 OC [equivalent to a GTX 460 OC], an i5-760, and 4 GB RAM :)

     

    I can say the following:

     

    I am 100% sure that I will be able to run the game at 1920x1200 with 16x AF and 8-16x AA [normal, not uber] at 40+ FPS.

    Reason? I can run Age of Conan in DX10 with absolutely everything maxed: all view distances, 16x AF, 16x AA, blur, and every other setting. We all know AoC's coding sucks, so if I can run it with those settings at 40-60 FPS, then I'm pretty sure everyone with a PC similar to mine will be able to run the game on high-to-max settings. Everyone with a weaker PC will most likely be able to run it at medium settings, no problem.

     

    GW2 does not need superb graphics; it needs to have character, that certain "something" which it does have.

     

    Age of Conan has had a lot of graphics-code improvements since release to guarantee smooth gameplay.

    But you guys forget that GW2 uses an enhanced version of the GW1 engine. So everyone who could play GW1 at 60 FPS at 1920x1200 should be able to play GW2 at the same resolution with maybe 40 FPS at least, which is more than enough.

     

    I have nothing to back this up, but I think this is unrealistic. They have said that they will strive to support low-end systems, but I still suspect (based on videos and trailers) that we will see some particle effects, shadows, view distance, AA, and so on that will be taxing. Perhaps quite taxing.

     

    I also wonder how demanding the game will be in WvWvW. They say that those scenarios can contain several hundred players along with all sorts of siege weapons. I suspect rendering that many models with all their individual armor pieces, and rendering all those spell effects and animations, will take its toll. It's one thing to have a steady 60 FPS in an open-world area with a few dozen mobs, but it's another to have hundreds of human players going bananas in epic warfare!

     

     

    My experience (to the OP) is that laptops feel outdated quite fast. I suspect the OP would be better off buying a powerful desktop now plus a cheap netbook/subnotebook for his truly portable needs, because I don't think the G74 is a portable laptop, unless he is a LAN person who really plays a lot at friends' houses.

  • Quizzical Member Legendary Posts: 25,507

    Originally posted by Alot

    Quizzical, do you by chance know when AMD's Bulldozer CPUs are expected to come out? Quite a few people believe that the release of those CPUs will cause significant drops in hardware prices.

    As of the start of June, AMD said 60-90 days, which roughly means August. AMD at the time insisted that the Interlagos and Valencia server processors are not delayed, but will come out in Q3 2011 as scheduled all along, so that sounds like a fab capacity problem and not a "needs more respins" problem. If the die is fine but there isn't enough fab capacity to satisfy all markets, servers get precedence because they command higher price tags, and AMD would rather sell one $600 server processor than two $200 desktop processors. So, incidentally, would Intel.

    Don't expect price drops in response to the launch of Zambezi, though, with the possible exception of some high bins of Deneb and Thuban that aren't a good value for the money now and still won't be even after the price drops.

    Intel desktop processors under $200 already aren't a good value for the money.  Intel doesn't care.  They figure people will buy them anyway because they're Intel and they have a good marketing department.  Bulldozer won't change that.  I'd expect the Core i5 2500K to still be a decent value for the money after Zambezi hits, so that won't put price pressure on Intel, either.  Gulftown will be absurdly overpriced, but that's already the case, and Intel doesn't care.  It's basically, we made these server processors, and if you want to buy one for a desktop, you can, but we're charging server prices.

    AMD's Phenom II X4 965 and lower processors aren't going to be in the same price bracket as Bulldozer to begin with, so that won't put any pressure on those processors.  AMD is already selling Deneb, Propus, and Regor pretty cheaply for their respective die sizes, so don't expect price cuts from AMD there.  The only real exception is that higher bins of Deneb and Thuban might get price cuts to get rid of them, especially when AMD decides to end production of the dies.  That basically means that maybe the price that gets you a 3.4 GHz Phenom II X4 today will get you a 3.6 GHz one instead early next year.

    What will lead to price cuts, however, is Global Foundries bringing more capacity online for their 32 nm HKMG SOI process node. If a 5% market share for the new processors means selling every single one they make as fast as they can make them, then AMD will price the processors to get a 5% market share. Once Global Foundries has more capacity and can supply 10% of the market on the new process node, then AMD will cut prices to claim a 10% market share for the new processors. This won't be an abrupt thing where a bunch of capacity suddenly appears. Rather, it will be a slow trickle over the course of the next year, as AMD makes fewer processors on the older 45 nm node and more on the new 32 nm node.

  • Quizzical Member Legendary Posts: 25,507

    Originally posted by Emeraq

    Originally posted by Quizzical


    Originally posted by Emeraq

    I'm running an Alienware MX17 with a Core 2 Duo P8600 @ 2.4 GHz, 4 GB of memory, and an Nvidia 260M

    .....

    That Asus has an Nvidia 560M, which seems to be twice as nice as my card... I would think you'd have no problem running any game currently out or coming out shortly on that machine. But I'm no expert on the subject.

    Alienware doesn't make it run any faster or slower. It's the hardware inside that matters. And you haven't said what hardware you have, even if you think you have. There are two different cards branded as a 260M by Nvidia: the GeForce GTS 260M and the GeForce GTX 260M. They're not merely different, but completely unrelated. Different architectures, different process nodes, different feature sets (including different DirectX version compatibility), and so forth. And they're both different architectures and process nodes from the desktop GeForce GTX 260. Yes, Nvidia does that intentionally, to confuse you.

    A GeForce GTX 560M will usually outperform a GeForce GTX 260M by a pretty good margin, but not always, as the latter has far superior texture performance. It's not double the performance, though. Maybe 50% faster or so on average. A GeForce GTX 560M is basically an underclocked GeForce GTX 550 Ti, and the underclocking means its performance will typically trail a GeForce GTS 450, Radeon HD 5750, GeForce 9800 GTX+, or Radeon HD 4850.

    A GeForce GTX 560M will tend to run hotter than a GeForce GTX 260M, and will use nearly double the power of a GeForce GTS 260M.  Nvidia didn't get the performance per watt improvements that they should have gotten from moving to a 40 nm process node, so they compensate for it by just letting their cards run hotter rather than giving up performance.  That's why you're far better off getting an AMD card this generation in a gaming laptop if you can.  The problem is that on a $1200 budget, you're looking at a gaping hole in the market where the products that should exist (Sandy Bridge+Juniper, appropriately configured and aggressively priced) simply don't, because no laptop vendor can be bothered to build it.

    Ah, okay, so my laptop has the GTX 260M card... And you're saying the GTX 560M is only half again as powerful as this one?

    Now that I check the specs, it would probably be considerably less than that, even.  Maybe closer to 30%.

    A GeForce GTX 550 Ti is about 50% faster than a GeForce 9800 GT in typical games.  Not many reviews will show both, but a Radeon HD 5770 performs about the same as a GeForce GTX 550 Ti, as you can see in 550 Ti reviews, and about 50% better than a GeForce 9800 GT, as you can see in 5770 reviews.  If you take both of those cards and underclock and undervolt them to put them in laptops, what do you think happens?  Getting apples to apples benchmark tests in laptops is impractical, but a naive guess would be that they each lose about the same percentage of their desktop performance, which would result in a GTX 560M being 50% faster.

    But even that probably overestimates the GTX 560M's performance.  The GTX 560M loses 25% of the desktop card's clock speeds, and even more than that in memory bandwidth.  The GTX 260M loses less than 10% of the clock speed on both counts.  Thus, I'd expect the GTX 560M to win by maybe 20%-30%.
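
    For what it's worth, the clock-scaling argument above works out as a quick back-of-the-envelope calculation. This is only a sketch of the estimate in this post (the percentages are the rough figures quoted here, not benchmarks, and the variable names are mine):

        # Rough estimate of GTX 560M vs. GTX 260M in laptops, using the
        # percentages quoted above (illustrative figures, not benchmarks).
        desktop_gap = 1.50          # GTX 550 Ti ~50% faster than a 9800 GT on the desktop
        gtx560m_clocks_kept = 0.75  # the 560M keeps ~75% of the desktop 550 Ti's clocks
        gtx260m_clocks_kept = 0.90  # the 260M keeps ~90% of the desktop 9800 GT's clocks

        # Naive assumption: performance scales linearly with clock speed.
        relative = desktop_gap * gtx560m_clocks_kept / gtx260m_clocks_kept
        print(f"Estimated GTX 560M advantage: {relative - 1:.0%}")  # prints 25%

    The 560M's extra memory bandwidth loss isn't modeled here, which is why the real figure probably lands at the lower end of that 20%-30% range.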

    Another way to look at it is that Nvidia got about 20% better performance per watt from the transition from TSMC's 55 nm bulk silicon process node to the 40 nm node.  The GTX 260M is made on the former process node, and the GTX 560M on the latter.  They use about the same power.  How do you think they compare in performance?  Maybe the GTX 560M wins by 20%, or perhaps a little more, if it uses more power?
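
    The same sort of sketch covers the performance-per-watt comparison: at roughly equal power draw, the expected speedup is just the perf-per-watt ratio. Again, these are this post's rough figures, not measurements:

        # Same-power comparison across process nodes (illustrative figures).
        perf_per_watt_gain = 1.20  # ~20% better performance per watt on 40 nm
        power_ratio = 1.0          # both cards draw roughly the same power

        speedup = perf_per_watt_gain * power_ratio
        print(f"Expected GTX 560M advantage: {speedup - 1:.0%}")  # prints 20%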

    One thing to remember is that this isn't just a comparison of cards two generations apart, where you'd expect two generations' worth of improvements. It's comparing cards from a generation that Nvidia won handily in laptops to a generation that Nvidia lost badly in laptops. Nvidia didn't make very much progress across those two generations, while AMD made a lot more progress than one might expect from two generations of parts, since AMD also fixed their laptop drivers and idle power consumption in that time. Two years ago, a GeForce GTX 260M made perfect sense for a gaming laptop. Today, an Nvidia GPU of any sort is undesirable in a laptop.
