Comments
Well, since everyone else is doing it, I'm going to wade in with my specs too.
Intel P4 (single core) 2.8 GHz
2 GB RAM (elderly motherboard only supports 2 GB max)
ATI HD2600XT 512 MB (middle of the road, I know, but budgetary issues forced a compromise)
The processor is marginally below the minimum required spec (3 GHz single core or 2.4 GHz dual), but I'm going to take a chance on it. I figure AoC will be more memory/GPU intensive than CPU intensive. Also, Task Manager is your friend; all should be well if I just kill everything non-essential (explorer.exe included).
Anyone who has played games for more than a couple of years has run into games that the publisher claims shouldn't run at all on their specs, yet run flawlessly. I know I have. My theory, then, is that as long as I've got the recommended RAM and GPU, there should be a wee bit of leeway on the CPU spec.
Should have the game by Monday, and then I'll know whether my gamble pays off.
My system is middle of the road, but the following improved my fps by about 15.
Intel E8400 3.0 GHz
2 GB RAM
8500 GT 512 MB
Go to Video Options.
Select "High" graphics.
Go to Advanced.
Drop the shader from 3.0 down to 2.0.
Disable shadows.
Apply.
This works for quite a few users on the official forum, but it doesn't work for everyone, so don't complain if your frame rates don't go up.
Some players report that alt-tabbing out of AoC and then going back in helps with fps. It didn't for me.
Update:
Upgraded the nVidia 7300 GS/256 to an nVidia 8400 GS/512
Went from 5-10 fps to 18-21 average, and I'm now on medium settings.
1024x768 at medium settings with shader 2.0: 18-21 fps, rarely dropping to 10-12, and up to 30 fps in some empty areas.
Big note - shader 3.0 drives the fps back below 10, so shader 2.0 needs to be used on rigs like this. The game also has a bad habit of switching back to 3.0 after some zone loads. When I see my fps drop through the floor, I reset the shader to 2.0 and all is good again.
~\_/~\_O
Going to break from the pattern here.
The OP actually is NOT good. Just because something appears technical doesn't mean it is useful or accurate.
That is NOT an accurate representation of what is going on with pixel OR vertex shaders. The two are very different routines.
Pixel shading routines apply effects at the per-pixel level. Imagine each frame of an animation as an individual bitmap image. Before effects are applied, the image is flat (like old-school games such as King's Quest). A pixel shader does some quick, fancy math based on parameters to decide what new color EACH PIXEL should be AFTER the effect.
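To make that concrete, here's a rough CPU-side sketch of the idea in Python (a real pixel shader does this math on the GPU, massively in parallel; the random frame and the sepia tint here are just stand-ins for a real frame and effect):

```python
# CPU-side illustration only: a pixel shader conceptually runs math like this
# once per pixel, but on the GPU and massively in parallel.
import numpy as np

frame = np.random.randint(0, 256, size=(1024, 1280, 3), dtype=np.uint8)  # stand-in frame

def sepia_pass(img):
    """Decide a new color for EACH PIXEL based only on its original color."""
    rgb = img.astype(np.float32)
    r = 0.393 * rgb[..., 0] + 0.769 * rgb[..., 1] + 0.189 * rgb[..., 2]
    g = 0.349 * rgb[..., 0] + 0.686 * rgb[..., 1] + 0.168 * rgb[..., 2]
    b = 0.272 * rgb[..., 0] + 0.534 * rgb[..., 1] + 0.131 * rgb[..., 2]
    return np.clip(np.stack([r, g, b], axis=-1), 0, 255).astype(np.uint8)

shaded_frame = sepia_pass(frame)
print(shaded_frame.shape)  # (1024, 1280, 3) - same pixels, new colors
```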
A vertex shader performs a similar function, but does it at a higher level: the vertex level (the naming of these routines is kept pretty simple). The vertices can be broadly defined as an object's 3D definition.
There are also geometry shaders, which sit between the two. Geometry shaders work on primitives, or shapes - like a triangle or a square.
So now imagine a textured cube is the image. The vertex shader starts, sees it as an actual cube, and applies some effects. Next, the geometry shader sees it as a collection of squares and triangles (based on the angle being shown) and applies effects. Last, the pixel shader sees it as a bitmap (just 1280x1024 pixels, or whatever the resolution is) and applies effects.
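Here's that walk-through sketched in plain Python (hypothetical numbers, and it skips everything a real driver and GPU actually do; it's only meant to show which "view" of the cube each stage works on):

```python
import numpy as np

# Vertex stage: the object is still "a cube" - eight corner points in 3D.
cube_vertices = np.array([[x, y, z] for x in (-1, 1)
                                    for y in (-1, 1)
                                    for z in (-1, 1)], dtype=float)

# A vertex shader might animate the object by transforming every vertex,
# e.g. a 30-degree rotation around the Y axis.
angle = np.radians(30.0)
rotate_y = np.array([[ np.cos(angle), 0.0, np.sin(angle)],
                     [ 0.0,           1.0, 0.0          ],
                     [-np.sin(angle), 0.0, np.cos(angle)]])
transformed = cube_vertices @ rotate_y.T

# Geometry stage: the same data seen as primitives - two triangles making up
# one face of the cube (indices into cube_vertices; the x = -1 face here).
face_triangles = [(0, 1, 3), (0, 3, 2)]

# Pixel stage: after rasterization the cube is just a grid of pixels
# (1280x1024 of them, or whatever the resolution is), and per-pixel math
# like the sepia example above decides each final color.
print(transformed.shape, face_triangles)
```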
Usually, in modern game design, a combination of all of these routines is used, depending on which is the least expensive (i.e., most efficient) for the particular effect.
NVidia's G80s use a unified shader architecture with big, fully programmable units that can do whatever the software tells them. Older architectures had dedicated, hard-wired units that were either pixel or vertex/geometry.
DirectX (and OpenGL) are the software API side of this. When the hardware paired with them can handle a routine, it runs on the hardware. These routines are VERY expensive and typically can't happen in software: at 1280x1024 and 60 fps that's roughly 79 million pixel-shader invocations per second, each doing several floating-point operations, and the CPU is just too slow at that volume of floating-point work and too busy to do it.
So the opening paragraph completely misunderstands how the pipeline works, and therefore, to me, the whole article is pretty useless.
I think what is happening is that, once again, the unified shader model of the new cards is causing trouble for devs. Nearly EVERY new game seems to have stuttering problems on the BEST hardware. This is an optimization issue or an incompatibility somewhere. Throwing endless RAM at it MASKS it, but that isn't the root cause.
I don't understand how this stuff is slipping past QA, to be honest.
Also, benchmarks like 3DMark 06 (or whatever year) aren't very reliable for predicting how AoC will run. It's a very different animal, and the -only- way to know how it runs on your system is to play it, collecting fps while experimenting with the various advanced gfx settings in the game control panel.
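If you want to compare settings a bit more rigorously than eyeballing the counter, something like the sketch below works; the fps samples are hypothetical placeholders for numbers you'd note down yourself while playing the same area under each configuration:

```python
# Quick way to summarize fps numbers collected under two different settings.
# The sample lists below are hypothetical placeholders - swap in your own values.
def summarize(label, samples):
    ordered = sorted(samples)
    worst = ordered[: max(1, len(ordered) // 100)]  # roughly the worst 1%
    print(f"{label}: avg {sum(ordered) / len(ordered):.1f} fps, "
          f"min {ordered[0]} fps, 1% low {sum(worst) / len(worst):.1f} fps")

summarize("shader 3.0, shadows on ", [9, 11, 10, 12, 8, 10, 9, 11])
summarize("shader 2.0, shadows off", [19, 21, 18, 22, 20, 17, 21, 20])
```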
~\_/~\_O
I'm getting great FPS in this game with an 8800 GTS G92 edition and an AMD 4200+. Since the "Low" and "Medium" settings are bugged right now, turn it to "High" and turn shadows off. It gave me a 30 fps boost ^-^.
Great info OP, I will try these techniques sometime.
I have been playing AoC since the 3-day early access, and to put it as simply as I can, the post I'm replying to pretty much sums it up. The best PC on the planet can't play this without crashing every 15 minutes. I love the game and waited 3 years to play it, but it is definitely INDEED BROKEN. And every day that goes by, "Bugcom" is losing more of their player base because of it. It is definitely sad, it is, but it is also definitely the truth: Bugcom is putting themselves out of business almost as fast as Vanguard, if not faster.
EXACTLY.
Am I doomed? System Requirements Lab says I have the minimum standards, yet I failed according to them. I have a laptop with an Intel Pentium 4 at 2.8 GHz (rated at 4.19), 2 GB of RAM, and a 120 GB drive.
SRL says - Video Card Minimum: 128 MB DirectX 9.0c graphics card with Shader 2.0 support (NVIDIA GeForce 6800+ / ATI Radeon 9800+). You have: ATI MOBILITY RADEON X600 (ATI display adapter (0x3150)).
Video card features (minimum attributes of your video card): Video RAM: required 128 MB, you have 128.0 MB. 3D acceleration: required yes, you have yes. Hardware transform & lighting: required yes, you have yes. Vertex shader version: required 2.0, you have 2.0. Pixel shader version: required 2.0, you have 2.0.
Should I call their bluff and get the game to see if it works?
What about this is confusing you? It states the minimum required video card, and it states the video card you have. Your card does indeed match all the listed features, but it does not match the required speed. Does SRL not say anything about that?
Anyway, you could check this page for video card comparisons:
http://www.gpureview.com/show_cards.php?card1=318&card2=19
Then you will see that your X600 card is at about half the speed for every specific operation (fill rates, shader operations) compared to the minimum required 9800.
In conclusion, I'm fairly confident that the game will indeed start on your laptop, but it will run far too slow to be enjoyable.
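As a rough back-of-the-envelope check on that "about half" figure (the pipeline counts and clock speeds below are quoted from memory, so treat them as approximate):

```python
# Back-of-the-envelope pixel fill rate = pixel pipelines x core clock.
# Figures are approximate and from memory; the point is the ratio, not the exact values.
cards = {
    "Radeon 9800 Pro (min. spec)": {"pipes": 8, "clock_mhz": 380},
    "Mobility Radeon X600":        {"pipes": 4, "clock_mhz": 400},
}
for name, c in cards.items():
    gpix_per_s = c["pipes"] * c["clock_mhz"] / 1000.0
    print(f"{name}: ~{gpix_per_s:.1f} Gpixels/s")
# Prints roughly 3.0 vs 1.6 Gpixels/s - about half, as noted above.
```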
The information here was great. Thank you for taking the time to compile it all. Hopefully by now the specs needed for smooth gameplay have dropped a bit. I guess I'll see. Thanks again.
Danyel Hale
Hm, I really want to try the game, but I'm not sure if my computer is up for it. And since there is no free trial, I don't want to waste my money if the game would run badly.
I've got a P4 3.5 GHz, 1 GB RAM (I definitely need 2 GB or more, right?), and a Radeon X1950 Pro.
Could anyone tell me if this is okay? What graphics settings can I use so it runs smoothly? I don't want to play it if I can only use minimum settings, because this is a game where the graphics actually count.
Is this a viable video card for this game?
EVGA e-GeForce 8500 GT, 1 GB
Thanks for the guide.
I'm pretty sure my constant disconnecting is due to having 2 GB of RAM on XP.