
Neverwinter performance scaling: doesn't need much of a video card, but does need a stronger CPU th


Comments

  • BizkitNLBizkitNL Member RarePosts: 2,546
    50 during a game is actually quite, quite fine.
    10
  • QuizzicalQuizzical Member LegendaryPosts: 25,499
    Originally posted by DeniZg

    My old Intel e5200 with Ati4850 can run this around 25FPS at close to maximum settings, with 1680*1050 rez. Drops to 15-20 in the city.  My i5-3470 with AMD 7850 runs this at 1080p, absolute max setting, without any drops below 60FPS, anywhere.

    Around 25 frames per second, and drops to 15-20 in some areas, and you don't see any problem with that?  I don't know about you, but I like higher frame rates.

  • BetaguyBetaguy Member UncommonPosts: 2,629
    Originally posted by Dibdabs
    I am using an old dual-core and there's no problems at all.  Is it PCs with more cores than that having issues?

     Same, I am using a 2-3 year old machine: Core 2 Duo 2.8GHz, 4 GB RAM, and a crappy ATI 6430HD 1GB.  ZERO LAG, ZERO FRAME DROPS. Just saying.

    "The King and the Pawn return to the same box at the end of the game"

  • MibletMiblet Member Posts: 333
    Originally posted by ShakyMo
    Constantly above 50 isn't normal

    He said under load he reaches 59-61c, which I would assume would mean 35-42c idle.

    50c and above is fine under load, and yes, it is normal.  Having 35c under load is far, far better than you'd normally expect for an i5 (really, it's damn good even with a top-end air cooling solution; if you don't have that then hats off to you... or your temperature sensors are badly off ;p).

    My system gets 60fps and rarely dips to 50ish (pretty much only when in Protector's Enclave), and runs up to 60c under load in Neverwinter on max settings. I have an i5 3570k @ 4GHz.

  • ShakyMoShakyMo Member CommonPosts: 7,207
    See, that's mad. I'm getting the same fps as you with a 6300 and much lower temps.

    Now I do have a fairly high end gpu, factory clocked / cooled 7950 but...
  • ShakyMoShakyMo Member CommonPosts: 7,207
    Ah, unless Intels naturally run hotter? But I'd expect it to be the other way round, with AMD using a larger die size and what have you.
  • MibletMiblet Member Posts: 333
    Originally posted by ShakyMo
    See, that's mad. I'm getting the same fps as you with a 6300 and much lower temps.

    Now I do have a fairly high end gpu, factory clocked / cooled 7950 but...

    Depending on how your GPU is clocked, you will be getting roughly the same fps as me; gaming is not going to produce massive differences between the two chips, with the 6300 being better price/performance.  At best, assuming identical GPUs and operating environments, you will see up to a 5fps difference, and if your GPU is clocked higher than mine (very possible, as mine is only very slightly overclocked - stability issues with this card when clocking, luck of the draw, etc.) you will see that diminish or likely be overtaken.

    The Intel chip should be running cooler at stock, IIRC, which is why I said your temps are very, very good for an air cooled solution.  35c under load is amazing; my ambient room temperature would make that near impossible to attain with an air cooling solution.  Though my overclock did lead to a hefty temperature rise in itself.

  • RidelynnRidelynn Member EpicPosts: 7,383

    Keep in mind that temperature is a function of two inputs: the thermal load (CPU) and the cooling capacity (heat sink).

    While 60C may be unusual for you, it is perfectly within design specifications. Intel lists a maximum Tcase of 72.6C. Tcase is the temperature on top of the die, not the internal core temp, which will be several degrees hotter.

    So long as your cooling solution can keep the CPU below 72.6C on top of the die, it's fine. Cooler is better, sure, but 60C is hardly broken. Now, if it were running at 40C and all of a sudden jumped to 60C, that would indicate a problem, but even then it still isn't necessarily broken.
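    The two-input framing above can be sketched as a simple lumped thermal model: steady-state case temperature is roughly ambient plus CPU power draw times the thermal resistance of the cooling path. A minimal sketch (the power and resistance figures below are made-up illustration values, not specs):

```python
# Simplified lumped thermal model: case temperature rises above ambient
# in proportion to the heat the CPU dumps through the cooling path.
def case_temp_c(ambient_c: float, cpu_power_w: float,
                thermal_resistance_c_per_w: float) -> float:
    """Steady-state case temperature in degrees Celsius."""
    return ambient_c + cpu_power_w * thermal_resistance_c_per_w

# 25C room, ~95W under game load, 0.35 C/W cooler path (hypothetical values):
print(case_temp_c(25.0, 95.0, 0.35))  # ~58C, comfortably under a 72.6C Tcase limit
```

    Same load with a worse cooler (higher C/W) or a hotter room pushes the result up, which is why the same CPU can read 35c on one rig and 60c on another.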

  • ShakyMoShakyMo Member CommonPosts: 7,207
    Ah I do live in England.
  • The user and all related content has been deleted.


    Somebody, somewhere has better skills than you have, more experience than you have, is smarter than you, has more friends than you do, and can stay online longer. Just pray he's not out to get you.
  • ReizlaReizla Member RarePosts: 4,092
    Originally posted by Quizzical

    http://www.tomshardware.com/reviews/neverwinter-performance-benchmark,3495-8.html

    Basically, they took the game engine of Champions Online and Star Trek Online that tried to do way too much on the CPU rather than offloading it to the GPU and then made it do even more on the CPU, likely just by having to draw more stuff without changing the engine itself much.  And they did that without making it scale well to many CPU cores.  Oops.

    Crappy piece of work on Tom's Hardware for a change :(

    Neverwinter uses the DX9 renderer by default, and that's what's killing performance on DX11 cards. Not to mention it forces the card to run at 100%, generating a buttload of heat as well.

    When you go to Graphics -> Troubleshooting you can change the renderer to DX11. This improves the FPS a lot and lowered GPU usage by 25% on my system (set to highest settings). I've dropped from 16x AA to 8x AA (with the rest still at max) and the GPUs now run at only 55% tops, with an extra 10% FPS improvement.

    More info on my blog.

  • ShakyMoShakyMo Member CommonPosts: 7,207
    Ah that's another difference, I always run with the settings on my card, rather than "let game decide"
  • 9ineven9ineven Member UncommonPosts: 168
    The game runs better than GW2 on my old Quad Core Q8400, DDR2 RAM and a GTX 560.
  • khartokhar3khartokhar3 Member UncommonPosts: 486

    Haven't had any problems yet. Actually, I was kinda surprised how good the performance is atm. Almost every MMO released in the last few months has had some performance issues, but this one seems to work pretty well.

    my specs:

    cpu: amd phenom ii x4 (3ghz)

    gfx: xfx 5770 gt 1gb

    ram: 4gb

    system: vista 32

  • vanquishedangelvanquishedangel Member Posts: 1

    Just to add my two cents tho it may not be needed

     

    The game seems to be cpu based but maybe not all of it, here is my end

     

    I am running an ancient piece of computer, 8 years old, but upgraded.

     

    I have:

    HP dc 5750

    AMD dual core 4800+ 2.5 ghz (mine slightly oced to 2.6 ghz)

    8 gigs ram

    ATI radeon 6450, 1 gig vram (oced to 725 mhz GPU, 807 mhz ram with amd overdrive.)

     

    I tried running this under Ubuntu 13.04 64-bit and it worked, with about 20-40+ fps in caves and outer maps, 10 fps in Protector's, and skirmishes a slideshow at best, but this is through the Wine compatibility layer; stronger computers have it running fine this way.

    Under Windows XP 64-bit, I get 30-55 fps in outer maps, 15-20 fps in Protector's, and skirmishes are playable. A big difference was made by slightly oc'ing with the help of ClockGen. This computer's BIOS is very locked down and not oc friendly, but I can oc 2 clicks in ClockGen. This also oc's the RAM, CPU, PCI, PCIe, and FSB at the same time. This oc created great stability in the game. I also installed an unofficial DX10 patch, then went to Microsoft and downloaded DX10+DX11 and it installed just fine, plus I have all updates for Windows XP 64-bit SP2. Windows is a terrible headache to install and get running; I had long forgotten what that hell was like. All the software to download and set up: firewall, Defraggler, cleaner, drivers, antivirus, etc.... I miss Linux already.

    The graphics oc showed very little difference, but it did show some; the cpu etc. oc showed almost no difference in fps, but lag was reduced heavily and the game is a ton smoother, so this was actually the biggest difference. Definitely very cpu dependent.

     

    On a side note, from my research (average speed according to benchmarks, if processors all had the same coding):

    single core = 100% (will run single threaded applications better than any multi core)

    ht= 134% ( in general)

    dual core = 150% (when running programs written for it)

    quad core =170% (when running programs written for it)

    If a program isn't written for the extra threads then they "can" be counterproductive, but the better coding and hardware in them usually makes this a moot point.
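    The diminishing returns in those rough percentages line up with Amdahl's law: if only a fraction of the work can run in parallel, extra cores only speed up that fraction. A quick sketch (the 50% parallel fraction is a made-up illustration, not a measured figure for any game):

```python
def amdahl_speedup(parallel_fraction: float, cores: int) -> float:
    """Best-case speedup when only part of the workload parallelizes."""
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

# Assuming half the work parallelizes, the curve flattens fast,
# much like the rough benchmark averages above:
for n in (1, 2, 4):
    print(n, "cores ->", round(amdahl_speedup(0.5, n), 2))
# 1 core -> 1.0, 2 cores -> 1.33, 4 cores -> 1.6
```

    This is also why a game that only uses a few threads gains little from a quad core over a dual core.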

     

    It should also be known that this game's engine has several memory leaks (they may not be noticeable on some machines); it suffers from the same memory leaks as Star Trek Online and Champions Online.

  • Dreamo84Dreamo84 Member UncommonPosts: 3,713
    Uh, pretty sure my i7 3820 is "commercially available" and it's higher than anything Tom's Hardware even benchmarked lol. OP is trying too hard.

  • DAOWAceDAOWAce Member UncommonPosts: 436

    Dat necro.  I'll throw in my 5 cents then.

    The engine is severely unoptimized.

    It stresses my 670 more than most games, and I run the game capped at 60FPS.

    I'm forced to lower most of the view distance options to stay at 60FPS most of the time.

    Shadows are murder on framerate, but the game looks horrible with them on low/off, so I keep them on high and turn off other things as a tradeoff.

    High detail objects can drastically lower framerate in certain areas, turned them off too.

    Worst of all, the UI has a 25% performance hit.  This has still not been fixed and I still don't see any talks about it getting fixed.

    I can't even stream the game at 720p30 without a fair bit of FPS loss on top of everything else.

     

    They did optimize some areas (notably town) in the balance patch, but the engine itself still requires too much out of our systems.  It is not optimized well and I doubt it ever will be.

    I mind it less and less with every passing day because I become less and less interested in the game every passing day.  I may fully quit in the next week or two.

  • Mopar63Mopar63 Member UncommonPosts: 300

    First, at 1080 and maxed-out detail sliders in game, I have seen an A10 5800K at stock with an HD 6950 run smooth as butter. The game is CPU heavy on load, but then again so are most MMOs; nothing new here. Tom's article is so full of crap I do not know where to start. I am running a lesser video card and CPU than their test rig, yet I get the same frame rates they got at 1080 while running at 1440.

    The game code does need to be optimized no doubt but the OP and the topic of this thread are WAY OFF.

    Concerning the temps of an i5 or i7, running 60 or so under game load is NOT unusual. In fact I would say it is about the norm. Expresso, I am curious: you have not mentioned the cooler you are running, or whether the processor is overclocked. These are important factors. As for 70C being the limit for the new CPUs, that is incorrect; they throttle at about 90C.

    DAOWAce, not sure what your issue is, but it's likely a CPU bottleneck. I am running my 7950 at 1440 resolution with the sliders at max and I peg 60 with Vsync, with some overhead to spare if I need it. The loss of performance when streaming is another indicator of a CPU bottleneck. Using my Hauppauge PVR2 I can stream at max 1080 detail with zero performance hit.

  • Mondo80Mondo80 Member UncommonPosts: 194
    What you need to understand is that Tom's Hardware site is an Intel pusher.  Has anyone else proven that the 8350 is that bad of a CPU for that price?  They mention that Neverwinter is CPU hungry but not which one the game's engine is optimized for or the performance on the i7 or how it performs on Windows XP,7,8 or Linux.
  • DAOWAceDAOWAce Member UncommonPosts: 436


    Originally posted by Mopar63
    DAOWAce, not sure what your issue is, but it's likely a CPU bottleneck. I am running my 7950 at 1440 resolution with the sliders at max and I peg 60 with Vsync, with some overhead to spare if I need it. The loss of performance when streaming is another indicator of a CPU bottleneck. Using my Hauppauge PVR2 I can stream at max 1080 detail with zero performance hit.

    You've said GTX 6850 and now a Radeon 7950; the first card doesn't exist, which one do you have?

    A 3770K at 4.5GHz is not a bottleneck to the game, nor is it to streaming this game.

    The bottleneck is the severe lack of optimization in the engine itself, nothing more.

    And of course you can stream without a performance hit, you're using a capture card, it's designed to do that.

  • Mopar63Mopar63 Member UncommonPosts: 300
    Originally posted by DAOWAce

     


    Originally posted by Mopar63
    DAOWAce, not sure what your issue is, but it's likely a CPU bottleneck. I am running my 7950 at 1440 resolution with the sliders at max and I peg 60 with Vsync, with some overhead to spare if I need it. The loss of performance when streaming is another indicator of a CPU bottleneck. Using my Hauppauge PVR2 I can stream at max 1080 detail with zero performance hit.

     

    You've said GTX 6850 and now a Radeon 7950; the first card doesn't exist, which one do you have?

    A 3770K at 4.5GHz is not a bottleneck to the game, nor is it to streaming this game.

    The bottleneck is the severe lack of optimization in the engine itself, nothing more.

    And of course you can stream without a performance hit, you're using a capture card, it's designed to do that.

    Thanks for the heads up; that should be HD 6950, major typo there. My point however was that streaming performance loss is a CPU, not GPU, bottleneck.

  • AkulasAkulas Member RarePosts: 3,029
    My computer always sounds like it's taking off when I play this game. It only does that in other games for the first 5 seconds when I first start them up, but it constantly makes rocket-launching noises while in Neverwinter. I just thought using a GeForce 200 series card, which has survived 3 computers (and still plays all the modern games on highest settings), was the reason. But using a crappy AMD Athlon 270 dual-core processor on a game which needs better would be the better reason why. My motherboard is small and lacking too, and was built before they used the colour coding system to grade them.

    This isn't a signature, you just think it is.

  • evilastroevilastro Member Posts: 4,270
    Runs at max settings with no slowdown on my 2 year old i7 CPU.  I'm sure you can pick those up pretty cheap these days.  
  • AnslemAnslem Member CommonPosts: 215
    Originally posted by Reizla
    Originally posted by Quizzical

    http://www.tomshardware.com/reviews/neverwinter-performance-benchmark,3495-8.html

    Basically, they took the game engine of Champions Online and Star Trek Online that tried to do way too much on the CPU rather than offloading it to the GPU and then made it do even more on the CPU, likely just by having to draw more stuff without changing the engine itself much.  And they did that without making it scale well to many CPU cores.  Oops.

    Crappy piece of work on Tom's Hardware for a change :(

    Neverwinter uses the DX9 renderer by default, and that's what's killing performance on DX11 cards. Not to mention it forces the card to run at 100%, generating a buttload of heat as well.

    When you go to Graphics -> Troubleshooting you can change the renderer to DX11. This improves the FPS a lot and lowered GPU usage by 25% on my system (set to highest settings). I've dropped from 16x AA to 8x AA (with the rest still at max) and the GPUs now run at only 55% tops, with an extra 10% FPS improvement.

    More info on my blog.

    Hi!

    Will following your suggestion help with the lag around cities/people?  I played until level 5 or so last night - loved the targeting - and as soon as the game sent me to an area with people and chat spam, my ancient laptop lagged like crazy.

    I can play WoW/Rift with no issues, but had graphics issues with TERA.  Thanks in advance.  Might be time for an upgrade... :(

    Played: Ultima Online - DaoC - WoW -

  • sushimeessushimees Member Posts: 489
    Originally posted by ShakyMo
    The game uses 3 threads

    If you're on a dual core, it will be slow.

     

    And the Intel i3 2100 is a dual core with 4 threads. Please don't misinform people by implying that a dual core has fewer than 3 threads.
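    The logical-vs-physical distinction here is easy to check: the OS reports logical processors (hardware threads), so a Hyper-Threaded dual core like the i3 2100 shows up as 4. A stdlib-only sketch:

```python
import os

# os.cpu_count() reports LOGICAL processors (hardware threads), not
# physical cores -- an i3 2100 (2 cores + Hyper-Threading) returns 4 here.
logical = os.cpu_count()
print("logical processors:", logical)
```

    Getting the physical core count portably needs a third-party library (e.g. psutil's `cpu_count(logical=False)`), since the stdlib only exposes the logical count.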

