
Will this run GW2 well?

Adren Member Uncommon Posts: 69
Windows 8 - 64-bit version - 3rd generation Intel Core i7-3517U processor (1.9GHz/3.0GHz w/ Turbo Boost) - 6GB DDR3 memory - 500GB 5400RPM hard drive - 11.6 HD widescreen CineCrystal LED LCD display (1366 x 768) - Intel HD Graphics 4000 - Mobile Intel HM77 Express chipset - webcam - multi-gesture touchpad - Gigabit LAN - Wi-Fi 802.11a/b/g/n - Bluetooth 4.0 - HDMI & VGA - 1 x USB 3.0, 2 x USB 2.0 - card reader - 6-cell Li-ion Battery (up to 8-hour) - 11.2 x 8.0 x 1.1 in.

Comments

  • Redcor Member Posts: 426

    Civilized men are more discourteous than savages because they know they can
    be impolite without having their skulls split, as a general thing.
    -Robert E. Howard

  • Adren Member Uncommon Posts: 69
    Haven't bought the laptop yet :P Getting it for my gf, who really wants to play GW2. Thanks for the site Redcor, btw, I could use it for later.
  • Greyhooff Member Posts: 654

    Your description doesn't list a dedicated graphics card (something like an Nvidia 650M or an ATI equivalent). I'd strongly urge you to get a graphics card in any machine you want to play games on.


  • Adren Member Uncommon Posts: 69
    The graphics card is there... it's the Intel HD Graphics 4000, I believe.
  • Vorch Member Uncommon Posts: 793
    Originally posted by Adren
    The graphics card is there... it's the Intel HD Graphics 4000, I believe.

    He means a dedicated gfx card. The HD 4000 is integrated into your CPU (APU).

    While a dedicated gfx card is always preferred, I think the HD 4000 should be fine if you are OK with medium settings.

    Here are some vids of people playing GW2 on that setup:

    [embedded videos]

    This is one of the RARE cases where I would recommend getting an Alienware instead of the laptop if you are afraid of building your own computer. The entry-level X51 is $700 and comes with a warranty should anything happen to it.

    "As you read these words, a release is seven days or less away or has just happened within the last seven days— those are now the only two states you’ll find the world of Tyria."...Guild Wars 2

  • Adren Member Uncommon Posts: 69
    Thanks for your replies.
  • Quizzical Member Legendary Posts: 25,499
    Originally posted by Adren
    Intel Core i7-3517U

    Nope.  That's ultra-low end hardware, at least for a laptop form factor (as opposed to tablets or cell phones).  Any Core i*-**17U chip means it's an ultra-low voltage 17 W chip.  Lower wattage means lower performance, as it has to run everything at very low clock speeds.  And then you add to that the problem that Intel's graphics architectures are very inefficient in performance per watt.  Low wattage times poor performance per watt means very poor performance.

    Don't be misled by the "Intel HD Graphics 4000" moniker.  That's integrated graphics, but integrated graphics isn't intrinsically bad.  Radeon HD 7660D integrated graphics would probably be able to run Guild Wars 2 pretty well on fairly high (but not max!) settings.

    The real problem is that any Ivy Bridge chip with 16 EUs gets the Intel HD Graphics 4000 moniker.  Different bins and different chips with the same graphical silicon can clock the graphics wildly differently, depending on how limited they are by power consumption.  When Intel launched Ivy Bridge, they sent out a bunch of Core i7-3920XM chips with a 55 W TDP as the review samples for Intel HD Graphics 4000.  Those run the graphics at 1300 MHz most of the time while playing games.

    But when you're limited to a 17 W TDP, you'll see GPU clock speeds a lot closer to the stock speed of 350 MHz.  Naturally, clocking the same chip at 350 MHz rather than 1300 MHz will tend to affect your performance.  The chip officially allows graphics to clock up to 1150 MHz in some circumstances, but you'll only ever see that in programs that don't push the GPU very hard and while you aren't using the CPU much at the same time.  In other words, it's not going to come anywhere near that while you're playing a real game.  (See the rough sketch at the end of this post for what that clock gap means in practice.)

    Don't be confused by Intel's Ultrabook marketing campaign.  The point of Ultrabooks is thinness, and they sacrifice everything else to make laptops a few millimeters thinner.  So you get a high price tag, low end performance, poor battery life, poor reliability, no upgrade options, no possibility of repair, and few features.  You could fix all of those problems by making it 5 mm thicker, but the entire point of Ultrabooks is thinness.

    There's no reason to even consider an Ultrabook unless thinness is your top priority and you're willing to sacrifice everything else to get it.  And even if you are, you might want to look at a MacBook Air instead of an Ultrabook.  Whatever Apple's deficiencies may be, they can produce slick form factors.

    And in particular, if you care about games, you definitely don't want an Ultrabook.  If you care about games, there's no real reason to consider an Intel-based laptop on a sub-$800 budget.  On larger budgets, you probably want a Core i7-3630QM together with a discrete card, most likely a GeForce GTX 660M, 670MX, 675MX, or 680M.  There's also the option of the MSI GX60, which has an AMD A10-4600M and a Radeon HD 7970M, which gets you high end graphical performance in a $1200 laptop by going with a cheaper CPU.

    Actually, if you care about games, you want a desktop unless it absolutely doesn't work for you for some reason.
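
    A rough back-of-envelope sketch of the clock argument above: this is illustrative only, it assumes HD 4000 throughput scales roughly linearly with the graphics clock it can actually sustain, and it uses the 1300 MHz, 1150 MHz, and 350 MHz figures quoted in this post.

    ```python
    # Illustrative only: assumes HD 4000 performance scales roughly linearly
    # with the graphics clock it can sustain, ignoring memory bandwidth and
    # CPU bottlenecks. Clock figures are the ones quoted in the post above.

    REVIEW_SAMPLE_CLOCK_MHZ = 1300  # Core i7-3920XM (55 W) review samples in games
    RATED_BOOST_CLOCK_MHZ = 1150    # i7-3517U's official maximum graphics clock
    BASE_CLOCK_MHZ = 350            # stock graphics clock under a 17 W budget

    def relative_performance(sustained_clock_mhz: float) -> float:
        """Fraction of the 55 W review sample's graphics throughput."""
        return sustained_clock_mhz / REVIEW_SAMPLE_CLOCK_MHZ

    print(f"near base clock: {relative_performance(BASE_CLOCK_MHZ):.0%}")         # ~27%
    print(f"at rated boost:  {relative_performance(RATED_BOOST_CLOCK_MHZ):.0%}")  # ~88%, rarely sustained in games
    ```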

  • DaezAster Member Uncommon Posts: 788
    A graphics card is a must. You will regret not having one, and putting all that work on the CPU in a laptop is asking for heating issues and failure!
  • Quizzical Member Legendary Posts: 25,499
    Originally posted by DaezAster
    A graphics card is a must. You will regret not having one, and putting all that work on the CPU in a laptop is asking for heating issues and failure!

    Actually, a discrete video card adds quite a bit of heat, and that adds to the cooling problems.  That's one reason why AMD's recent integrated graphics that really are good enough for gaming are such a big deal.

  • Thorkune Member Uncommon Posts: 1,969

    I played GW2 with the following specs and had no issues on tweaked medium graphics:

    Intel i3-2350M 2.30GHZ CPU

    Intel HD3000 Graphics

    4GB RAM

    Windows 7 Home Premium 64 bit OS

    5400 RPM 500GB HD

  • Battlerock Member Common Posts: 1,393
    Budget quality = ASUS G75VX
  • Quizzical Member Legendary Posts: 25,499
    Originally posted by Thorkune

    I played GW2 with the following specs and had no issues on tweaked medium graphics:

    Intel i3-2350M 2.30GHZ CPU

    Intel HD3000 Graphics

    4GB RAM

    Windows 7 Home Premium 64 bit OS

    5400 RPM 500GB HD

    That's a lot faster than what the original poster was looking at, both on the CPU and GPU sides.  Ivy Bridge is more efficient than Sandy Bridge, but not nearly enough to make up the difference between 35 W (yours) and 17 W.
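
    As a rough illustration of that wattage gap, treating TDP as a crude proxy for sustainable performance (a simplification, using the round 35 W and 17 W figures from this thread):

    ```python
    # Crude illustration: treat TDP as a proxy for sustainable performance.
    # The wattages are the ones discussed in this thread.

    SANDY_BRIDGE_TDP_W = 35    # Thorkune's i3-2350M class of mobile chip
    IVY_BRIDGE_ULV_TDP_W = 17  # the i7-3517U in the original post

    # Performance-per-watt gain the 17 W chip would need just to break even
    # with the 35 W chip under this crude model.
    required_gain = SANDY_BRIDGE_TDP_W / IVY_BRIDGE_ULV_TDP_W
    print(f"~{required_gain:.1f}x better performance per watt needed")
    # ~2.1x -- far more than Ivy Bridge's real efficiency gain over Sandy Bridge
    ```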

  • DaezAster Member Uncommon Posts: 788
    Originally posted by Quizzical
    Originally posted by DaezAster
    A graphics card is a must. You will regret not having one, and putting all that work on the CPU in a laptop is asking for heating issues and failure!

    Actually, a discrete video card adds quite a bit of heat, and that adds to the cooling problems.  That's one reason why AMD's recent integrated graphics that really are good enough for gaming are such a big deal.

    I have a MacBook Pro and it runs the Intel HD 4000 graphics, and running Civ turns my laptop into a space heater, so much so that I don't run games on my laptop anymore. I ran the GW2 Mac client and it does not hold its own compared to my PC with a graphics card, which is a much older system. I personally wouldn't go with a laptop for gaming, but if I did I would make sure it had a dedicated graphics card and was designed to vent that heat, which will be on the card as opposed to the CPU.

  • Quizzical Member Legendary Posts: 25,499
    Originally posted by KingofHartz
    Budget quality = ASUS G75VX

    That will handle games all right, but $1250+ is most certainly not a budget laptop.

  • Quizzical Member Legendary Posts: 25,499
    Originally posted by DaezAster
    Originally posted by Quizzical
    Originally posted by DaezAster
    A graphics card is a must. You will regret not having one, and putting all that work on the CPU in a laptop is asking for heating issues and failure!

    Actually, a discrete video card adds quite a bit of heat, and that adds to the cooling problems.  That's one reason why AMD's recent integrated graphics that really are good enough for gaming are such a big deal.

    I have a MacBook Pro and it runs the Intel HD 4000 graphics, and running Civ turns my laptop into a space heater, so much so that I don't run games on my laptop anymore. I ran the GW2 Mac client and it does not hold its own compared to my PC with a graphics card, which is a much older system. I personally wouldn't go with a laptop for gaming, but if I did I would make sure it had a dedicated graphics card and was designed to vent that heat, which will be on the card as opposed to the CPU.

    And so your argument is that a 45 W CPU is too much heat, but the same 45 W CPU together with a 75 W video card in the same system is not?  Integrated graphics in the same chip as the CPU actually helps with cooling tremendously: in programs that can push the CPU and GPU hard simultaneously, the chip will clock them both lower to compensate.
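
    A minimal sketch of that heat-budget point, using the 45 W and 75 W figures from this exchange (round numbers, ignoring the rest of the system):

    ```python
    # Round numbers only; sketches why a discrete card adds to the cooling load
    # while integrated graphics stays inside the CPU's own power budget.

    CPU_TDP_W = 45           # the 45 W CPU discussed above
    DISCRETE_GPU_TDP_W = 75  # the 75 W video card discussed above

    # With a discrete card, both chips can hit their rated power at once,
    # so the chassis has to remove their combined heat.
    heat_with_discrete_gpu = CPU_TDP_W + DISCRETE_GPU_TDP_W   # 120 W

    # With integrated graphics, CPU cores and GPU share one package budget;
    # the chip clocks one or both down to stay under it.
    heat_with_integrated_gpu = CPU_TDP_W                      # 45 W

    print(heat_with_discrete_gpu, heat_with_integrated_gpu)   # 120 45
    ```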

  • Grunty Member Epic Posts: 8,657
    Originally posted by DaezAster
    Originally posted by Quizzical
    Originally posted by DaezAster
    A graphics card is a must. You will regret not having one, and putting all that work on the CPU in a laptop is asking for heating issues and failure!

    Actually, a discrete video card adds quite a bit of heat, and that adds to the cooling problems.  That's one reason why AMD's recent integrated graphics that really are good enough for gaming are such a big deal.

    I have a MacBook Pro and it runs the Intel HD 4000 graphics, and running Civ turns my laptop into a space heater, so much so that I don't run games on my laptop anymore. I ran the GW2 Mac client and it does not hold its own compared to my PC with a graphics card, which is a much older system. I personally wouldn't go with a laptop for gaming, but if I did I would make sure it had a dedicated graphics card and was designed to vent that heat, which will be on the card as opposed to the CPU.

    Dell Precision Workstation portable systems have one heat pipe/sink and half a fan for the CPU and 2 heat pipe/sinks and 1.5 fans for the discrete GPU. GPUs put out a lot of heat.

    "I used to think the worst thing in life was to be all alone.  It's not.  The worst thing in life is to end up with people who make you feel all alone."  Robin Williams
  • SirFubar Member Posts: 397
    Like Quizzical said, no. You will need a better CPU than the one you listed if you want to run GW2. For the GPU, you should be okay with an HD 4000 since GW2 is a lot more CPU-demanding than GPU-demanding, but if you can afford a discrete GPU, I would go for that instead of the HD 4000 for obvious reasons.
  • DaezAster Member Uncommon Posts: 788
    Originally posted by Grunty
    Originally posted by DaezAster
    Originally posted by Quizzical
    Originally posted by DaezAster
    A graphics card is a must. You will regret not having one, and putting all that work on the CPU in a laptop is asking for heating issues and failure!

    Actually, a discrete video card adds quite a bit of heat, and that adds to the cooling problems.  That's one reason why AMD's recent integrated graphics that really are good enough for gaming are such a big deal.

    I have a MacBook Pro and it runs the Intel HD 4000 graphics, and running Civ turns my laptop into a space heater, so much so that I don't run games on my laptop anymore. I ran the GW2 Mac client and it does not hold its own compared to my PC with a graphics card, which is a much older system. I personally wouldn't go with a laptop for gaming, but if I did I would make sure it had a dedicated graphics card and was designed to vent that heat, which will be on the card as opposed to the CPU.

    Dell Precision Workstation portable systems have one heat pipe/sink and half a fan for the CPU and 2 heat pipe/sinks and 1.5 fans for the discrete GPU. GPUs put out a lot of heat.

    I agree with what you and Quizz are saying 100%. My point is the heat of the processor itself. A friend has a gaming laptop which is designed with venting and cooling for the graphics, which is what I suggested, like what you mentioned. I run Pro Tools on my laptop and it takes a while before the heat builds to the point of the fans really picking up, but games put it there almost immediately. Case design is a big part of the equation here: my laptop vents through the keys, while my friend's, which is designed for games and houses a graphics card, is thicker and has vents on the sides that air is moved in and out through.

  • Quizzical Member Legendary Posts: 25,499
    Computer chips have a specified TDP, which is a way of saying that the chip will never put out more than that amount of heat for thermally significant amounts of time.  If a laptop can't handle a chip that is actually putting out its rated TDP, then the problem is a faulty laptop design.
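
    A tiny sketch of that design rule, with hypothetical numbers; the only point is that the chassis cooling should be sized for at least the chip's rated TDP:

    ```python
    # Hypothetical numbers; illustrates the rule that a laptop's cooling should
    # be sized for at least the chip's rated TDP, since the chip won't sustain
    # more heat output than that for thermally significant lengths of time.

    def cooling_is_adequate(cooler_capacity_w: float, chip_tdp_w: float) -> bool:
        """True if the chassis can remove the chip's rated sustained heat output."""
        return cooler_capacity_w >= chip_tdp_w

    print(cooling_is_adequate(cooler_capacity_w=40, chip_tdp_w=45))  # False -> faulty laptop design
    print(cooling_is_adequate(cooler_capacity_w=50, chip_tdp_w=45))  # True  -> within spec
    ```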