I'm sorry, I know it's new, but the game just doesn't look good enough to warrant such poor performance. I can max every single other MMO out there, but right now, on my Phenom II X4 / 5870, I'm getting worse performance than Crysis. To me, that's a dealbreaker. Optimize the engine or don't release the game; there's no way the slightly above-average graphics of Rift should be taxing my system like they are.
I agree, my performance was terrible; same processor. I admit I only have a 9600GT, but the game on the lowest settings made no difference compared to high settings: same FPS.
I never planned to play Rift anyway, but it is an issue.
I was breaking 80 FPS with settings on high last night. It dipped down to about 30 with big rift events and 60+ players on screen (plus dozens of invasions) all spamming spells, etc.
Make sure you are using the recommended drivers from Trion. They aren't always the latest version, and the difference in performance can be eye-opening.
All time classic MY NEW FAVORITE POST! (Keep laying those bricks)
"I should point out that no other company has shipped out a beta on a disc before this." - Official Mortal Online Lead Community Moderator
Proudly wearing the Harbinger badge since Dec 23, 2017.
Coined the phrase "Role-Playing a Development Team" January 2018
"Oddly Slap is the main reason I stay in these forums." - Mystichaze April 9th 2018
I love these posts that try to say the development team doesn't know how to optimize their engine, when it is usually the end-user that needs to optimize their drivers/settings/etc.
Developing for the PC market is extremely complicated because, unlike consoles, there is no single hardware set running the application. Even providing "minimum specs" is a tough decision to make, as that assumes hardware and not client configuration. The person with the minimum specs may try to raise the graphics level and thereby hurt their own performance.
Guys, the game is running great now... I've been playing up until 5 a.m. this morning and haven't had any issues, and I'm on a crap PC. Keep in mind this is a beta and they were doing a stress test on all servers; every game has had login issues like this. Move on and try to enjoy the rest of the beta event.
What's your graphics card? AA puts a heavy load on your GPU. Getting a total of 4GB+ of RAM would help, since I know Rift takes about 1900MB for me.
Forgot to write that down... here is my complete system:
Mainboard: Asus M4A785TD-V Evo
CPU: AMD Phenom II X4 945
RAM: G.Skill 2GB DDR3
Graphics: Radeon HD 5770, 1GB
If it really takes 1900MB I should get new RAM, but I don't have spare money at the moment :-/ ... I hate components that simply break without a reason :-/
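For what it's worth, the ~1900MB figure is easy to sanity-check on your own machine. Here is a minimal sketch in Python using psutil; the process name "rift.exe" is an assumption, so substitute whatever name Task Manager actually shows for the client:

```python
# Rough sketch: report how much RAM the Rift client is using right now.
# Assumes the client process is named "rift.exe" (adjust to match Task Manager).
import psutil

def client_memory_mb(process_name="rift.exe"):
    for proc in psutil.process_iter(["name", "memory_info"]):
        if (proc.info["name"] or "").lower() == process_name:
            # rss = resident set size, i.e. physical RAM currently in use
            return proc.info["memory_info"].rss / (1024 * 1024)
    return None

mb = client_memory_mb()
print("Client not running" if mb is None else f"Rift client is using roughly {mb:.0f} MB of RAM")
```

If the number really is close to 1900MB, a 2GB system will be paging constantly, which could explain why lowering the graphics settings barely changes the FPS.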
I was breaking 80 FPS with settings on high last night. It dipped down to about 30 with big rift events and 60+ players on screen (plus dozens of invasions) all spamming spells, etc.
Make sure you are using the recommended drivers from Trion. They aren't always the latest version, and the difference in performance can be eye-opening.
And which one do I need for my Radeon HD 5770? Is there any information on the official site?
Sadly, the engine is a tweaked version of the Warhammer engine, developed by the people who made Warhammer. It's going to lag regardless of how much hardware you throw at it. I hope I'm wrong, but the precedent is hard to ignore.
This has nothing to do with the engine at all. If you were around for the first betas, you would know how smooth this thing can run. That said, they only had around 50k players max by the end of beta 4, spread out over 16 or so servers between the EU and US. Now they have all the pre-orders and all the mass VIP code grabbers, and the population has exploded. I read a few days ago something about Steam reporting nearly 300k pre-orders for Rift over the past couple of weeks; now add that to Trion's sales, GameStop sales, Amazon sales and D2D.
This is and was a planned stress test, and I for one am thankful they are doing it now rather than waiting until a week or two before release and botching it. Given what I have seen since the start, I have no doubt in my mind they will get this fixed in no time, as they have with everything else in the past. As a matter of fact, servers were just brought down a little bit ago for a server stability patch. Just give it a day or two and then see how things are running; that's when judgment should be passed, not just a few hours after the servers open on the first stress test day.
We were told the exact same thing in the Warhammer beta. I really hope they do pull it off, but it's pretty much déjà vu at this point.
I've been in the Rift beta for a good while now. I was also in the WAR beta and played it for about a year after launch. Rift runs tons and tons better than WAR. They overloaded the servers on this beta and jacked up the rifts and invasions to stress test. I am using an AMD 920 overclocked to 3.0GHz, 6GB of RAM and an AMD 4850 video card, and I am getting 45 fps on max settings (minus shadows) at 1920 x 1280, and the game runs very smooth, even with a lot of people on the screen. They are still optimizing the client and server. I am convinced that by launch the game will be running very well.
Had no performance issues at all and was playing for a solid five hours overnight, apart from the initial login problems, but I would expect that from any beta test.
What's your graphics card? AA puts a heavy load on your GPU. Getting a total of 4GB+ of RAM would help, since I know Rift takes about 1900MB for me.
Forgot to write that down... here is my complete system:
Mainboard: Asus M4A785TD-V Evo
CPU: AMD Phenom II X4 945
RAM: G.Skill 2GB DDR3
Graphics: Radeon HD 5770, 1GB
If it really takes 1900MB I should get new RAM, but I don't have spare money at the moment :-/ ... I hate components that simply break without a reason :-/
I had a couple of friends last night with 5770s complain of the same thing; it turned out they had other shit running in the background that was killing their fps. I suspect ATI drivers; they are usually the culprit.
I'm playing this game on a computer made out of old Russian stove parts and toaster guts. OK, my computer isn't that bad; it's a single core with 2 gigs of RAM and a 512MB graphics card. But seriously, the game looks and runs amazingly well, even when there are dozens of people around.
The ones with no problems are the lucky ones, and some (probably high) percentage of players will fall into that category. It will never get to 100%; every game has people who cannot play it due to the specific composition of their hardware/software/network.
Throughout beta the number of folks having issues should continue to go down, and more people will be able to run the game at an acceptable level (or you'll end up with the situation we had with Vanguard or AoC).
Just trying to live long enough to play a new, released MMORPG, playing New Worlds atm
Fools find no pleasure in understanding but delight in airing their own opinions. Pvbs 18:2, NIV
Don't just play games, inhabit virtual worlds™
"This is the most intelligent, well qualified and articulate response to a post I have ever seen on these forums. It's a shame most people here won't have the attention span to read past the second line." - Anon
I'm sorry, I know it's new, but the game just doesn't look good enough to warrant such poor performance. I can max every single other MMO out there, but right now, on my Phenom II X4 / 5870, I'm getting worse performance than Crysis. To me, that's a dealbreaker. Optimize the engine or don't release the game; there's no way the slightly above-average graphics of Rift should be taxing my system like they are.
I run on a 2.4GHz single core with an 8800GTX and 2 gigs of RAM.
My performance is acceptable until I get into raid battles, and then it drops to unplayable levels.
I am not shocked by this and am very happy to have had the opportunity to play the game and realize I need to upgrade. This seems the best time to upgrade, so Trion and Rift get an A+ from me.
Those getting frame rate issues running the 5000-series ATI cards should give 10.9 a try; I think that is the driver version that Trion had suggested a bit ago. I also believe there may have been a revision 2 of the latest (10.12) that made some improvements. I have one computer with an nVidia 280GTX and another with an ATI 5870, and the 5870 gets a good 10-20 fps less. It just looks like ATI cards need a bit more tweaking to get into good frames.
It's a Jeep thing... you wouldn't understand.
I'm sorry, I know it's new, but the game just doesn't look good enough to warrant such poor performance. I can max every single other MMO out there, but right now, on my Phenom II X4 / 5870, I'm getting worse performance than Crysis. To me, that's a dealbreaker. Optimize the engine or don't release the game; there's no way the slightly above-average graphics of Rift should be taxing my system like they are.
I run everything on max without any problem. I use an Nvidia GTX 460; you should be doing fine with that display card of yours, unless the Phenom II X4 sucks.
The ones with no problems are the lucky ones, and some (probably high) percentage of players will fall into that category. It will never get to 100%; every game has people who cannot play it due to the specific composition of their hardware/software/network.
Throughout beta the number of folks having issues should continue to go down, and more people will be able to run the game at an acceptable level (or you'll end up with the situation we had with Vanguard or AoC).
Time will tell.
Nah, this game will not have the whole EQ2 / AoC / Vanguard issue.
It's just not that graphically intense.
Like Skyrim? Need more content? Try my Skyrim mod "Godfred's Tomb."
The ones with no problems are the lucky ones, and some (probably high) percentage of players will fall into that category. It will never get to 100%; every game has people who cannot play it due to the specific composition of their hardware/software/network.
Throughout beta the number of folks having issues should continue to go down, and more people will be able to run the game at an acceptable level (or you'll end up with the situation we had with Vanguard or AoC).
Time will tell.
Most of my guild were playing with me last night, about 15 of us. There was a broad range of machines involved. Apart from a few needing to tweak their settings, nobody was reporting any performance problems. This is just from my experience, but I don't consider it to be a big problem. Reading some of the comments, though, it's obviously an issue for some; I just don't think it's as widespread as it's being made out to be in this thread. But as you say, time will tell.
The ones with no problems are the lucky ones, and some (probably high) percentage of players will fall into that category. It will never get to 100%; every game has people who cannot play it due to the specific composition of their hardware/software/network.
Throughout beta the number of folks having issues should continue to go down, and more people will be able to run the game at an acceptable level (or you'll end up with the situation we had with Vanguard or AoC).
Time will tell.
Most of my guild were playing with me last night, about 15 of us. There was a broad range of machines involved. Apart from a few needing to tweak their settings, nobody was reporting any performance problems. This is just from my experience, but I don't consider it to be a big problem. Reading some of the comments, though, it's obviously an issue for some; I just don't think it's as widespread as it's being made out to be in this thread. But as you say, time will tell.
It's probably one of two things:
1. The game, for whatever technical reason, runs differently on different machines and configurations regardless of how beefy the system is.
2. Players having issues might think they have good systems, but their systems might not be as good as they think they are. There are probably people putting the game on ultra who shouldn't be putting the game on ultra.
Like Skyrim? Need more content? Try my Skyrim mod "Godfred's Tomb."
I have not yet tried to log into this beta phase since there were so many reports of people not getting in. In previous betas I had no issues at all, averaging about 70fps. It is possible they changed something that is giving people all these issues, but I think part of the reason they invited so many for this phase was to test more systems and more load. I will try to log in later and see how things are. I spent all my time today creating signatures and have not yet resurfaced from my paint program.
I'm sorry, I know it's new, but the game just doesn't look good enough to warrant such poor performance. I can max every single other MMO out there, but right now, on my Phenom II X4 / 5870, I'm getting worse performance than Crysis. To me, that's a dealbreaker. Optimize the engine or don't release the game; there's no way the slightly above-average graphics of Rift should be taxing my system like they are.
Something is wrong with your machine then. I run this game on high at 32-36 FPS using an Intel Q8300 2.5GHz quad, 4GB of DDR2-800 and an 8800GT 512MB card. The parts in my machine (graphics/RAM) are 3 years old or more.
I'm running a similar older rig, a Dell XPS 410 with an EVGA Nvidia 8800GT SSC (1GB RAM), and getting 25 FPS on ultra settings. No complaints so far in my first hour.
You realize 25 frames per second is horrible, right?
You realize standard smooth animation is 30 frames per second, right?
It's 60.
It's 30.
You are wrong. It's 60. There is a HUGE difference between 30 and 60.
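Whichever number you treat as "smooth," the gap is easier to reason about in frame time than in frame rate. A quick worked example (plain arithmetic, nothing game-specific):

```python
# Convert frame rate to frame time: how long each frame stays on screen.
for fps in (24, 30, 45, 60, 120):
    frame_time_ms = 1000.0 / fps
    print(f"{fps:>3} fps -> {frame_time_ms:5.1f} ms per frame")

# 30 fps means ~33 ms between frames, 60 fps means ~17 ms. Whether that
# 16 ms difference is noticeable is exactly what this argument is about.
```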
[mod edit]
Mission in life: Vanquish all MMORPG.com trolls - especially TESO, WOW and GW2 trolls.
I have to warn everyone that I am a huge fanboy of this game. I preordered after beta 4 and set it up for a 6-month sub. That said, I have to agree with what some others are saying here: there are performance issues. My guess is that the ongoing tweaking, and possibly debug code left in the beta build, could be the cause.
I built a machine for gaming 6-9 months ago:
Core i7 overclocked to 4GHz
12GB RAM
ATI 5970
I cannot run this game at ultra settings and maintain 60 fps, so to me it's doubtful anyone else can either. I had to turn down shadows and particle effects, and then I can usually maintain 60fps or close to it. That said, there are also performance issues related to particle effects, even with the slider turned down to about 25%. For instance, in the starter areas when those purple bomb things are blowing up all over, I can be running around at 60fps, then one goes off right in front of me and my frame rate drops to 35 until the bomb finishes exploding, and then I get back up to 60fps again.
I also tried a good bit of tweaking (turning off CrossFire on my card, and locking the affinity of the CPU to only one core), and both of those actions caused drops in frame rate.
I also think that the antialiasing in this game needs some major tweaking. One of the advantages of having a 5970 is that a lot of the extra power can go to higher levels of AA without affecting my frame rate. That is not the case here, where even 2x AA brings my frame rate down from 60 to about 45 on average.
I use V-Sync so 60 FPS is my cap.
I use FRAPS on my G19 to monitor my frame rate.
The last comment I wanted to make is that, as others have said, 60fps is really the standard for PC gaming; it's arguable on console systems. As others have pointed out, the standard for NTSC television is 30fps (29.97fps actually) and movies are roughly 24fps. That said, if you go into an electronics store or a trade show like CES and watch standard 60Hz TVs playing the same content as 120Hz or higher TVs, you will immediately see the difference. The general premise here is the same. I personally notice it in any PC game when I am panning the camera around quickly to see what's behind me and I am getting anything less than 60fps; it seems a bit clunky, and it certainly does not feel fluid.
So I guess to summarize, I am not going to call anyone a liar for claiming that they are getting 60fps on max quality settings, because it's possible ATI users are currently getting less performance than Nvidia users, but I still find it doubtful at this stage in the game.
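One way to read that 2x AA drop, for anyone comparing numbers: expressed as frame time, going from 60 to 45 fps means each frame needs roughly a third more GPU time. A small sketch of the arithmetic (the 60 and 45 figures are just the ones quoted above):

```python
# Express an FPS drop as added GPU time per frame; frame time is a steadier
# way to compare costs than raw FPS deltas.
def frame_time_ms(fps):
    return 1000.0 / fps

no_aa, with_2x_aa = 60.0, 45.0          # figures quoted in the post above
added = frame_time_ms(with_2x_aa) - frame_time_ms(no_aa)
print(f"AA off: {frame_time_ms(no_aa):.1f} ms/frame")
print(f"2x AA : {frame_time_ms(with_2x_aa):.1f} ms/frame")
print(f"2x AA adds ~{added:.1f} ms per frame (+{added / frame_time_ms(no_aa):.0%})")
```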
I have to warn everyone that I am a huge fanboy of this game. I preordered after beta 4 and set it up for a 6-month sub. That said, I have to agree with what some others are saying here: there are performance issues. My guess is that the ongoing tweaking, and possibly debug code left in the beta build, could be the cause.
I built a machine for gaming 6-9 months ago:
Core i7 overclocked to 4GHz
12GB RAM
ATI 5970
I cannot run this game at ultra settings and maintain 60 fps, so to me it's doubtful anyone else can either. I had to turn down shadows and particle effects, and then I can usually maintain 60fps or close to it. That said, there are also performance issues related to particle effects, even with the slider turned down to about 25%. For instance, in the starter areas when those purple bomb things are blowing up all over, I can be running around at 60fps, then one goes off right in front of me and my frame rate drops to 35 until the bomb finishes exploding, and then I get back up to 60fps again.
I also tried a good bit of tweaking (turning off CrossFire on my card, and locking the affinity of the CPU to only one core), and both of those actions caused drops in frame rate.
I also think that the antialiasing in this game needs some major tweaking. One of the advantages of having a 5970 is that a lot of the extra power can go to higher levels of AA without affecting my frame rate. That is not the case here, where even 2x AA brings my frame rate down from 60 to about 45 on average.
I use V-Sync so 60 FPS is my cap.
The last comment I wanted to make is that, as others have said, 60fps is really the standard for PC gaming; it's arguable on console systems. As others have pointed out, the standard for NTSC television is 30fps (29.97fps actually) and movies are roughly 24fps. That said, if you go into an electronics store or a trade show like CES and watch standard 60Hz TVs playing the same content as 120Hz or higher TVs, you will immediately see the difference. I personally notice it in any PC game when I am panning the camera around quickly to see what's behind me and I am getting anything less than 60fps.
So I guess to summarize, I am not going to call anyone a liar for claiming that they are getting 60fps on max quality settings, because it's possible ATI users are getting less performance than Nvidia users, but I still find it doubtful at this stage.
But why would you have to get 60 fps, or even 40 fps, if the game is running smoothly at less?
If the human eye can't detect the difference between 40 fps and 60 fps, then the point seems moot.
I would once again say: if you are getting smooth gameplay, then you are in good standing, regardless of some number that people are pulling out of their hat.
During the first beta I was going along, happy at ultra, when I started seeing people complaining that they weren't getting huge amounts of fps. I chuckled, thinking "I must be doing really well then," and put my cursor over the small computer icon, only to find that I was doing about 20 fps.
Yet had no one said anything, I wouldn't have been any the wiser, as I really wasn't having any issues.
So I wonder whether people are really getting a slideshow of a game, and if they are, whether it's because they have bad equipment or because something about the hardware they have is causing issues.
I would also wonder if people are just upset that they aren't seeing "60fps" because they feel that is some benchmark they need to meet, regardless of actual gameplay.
Like Skyrim? Need more content? Try my Skyrim mod "Godfred's Tomb."
Actually, the notion that the human eye can only detect so many FPS is somewhat of a misunderstood myth. While it is true that much beyond 15FPS (not 30) the human eye cannot distinguish individual frames inserted into the stream of frames, the human eye can distinguish the difference in fluidity when more frames are added to the overall presentation.
I again challenge you to go to your local electronics store and watch a comparison between a standard 60Hz TV, a 120Hz TV, and a 240Hz TV all playing the same content. You will be able to tell the difference and put all three televisions in order from lowest to highest without first being told what they are. If the human eye really couldn't detect the changes in fluidity, then this would not be possible.
People who play PC games and are used to getting a solid 60FPS on max settings are probably the ones complaining about the lower frame rates. If your PC typically only garners 30fps on max settings in most games, you most likely won't notice. It's very much like the TVs: I never had a complaint about my 60Hz LCD until I bought a 120Hz LCD. Now I immediately see the difference on all 60Hz TVs.
I have to warn everyone that I am a huge fanboy of this game. I preordered after beta 4 and set it up for a 6-month sub. That said, I have to agree with what some others are saying here: there are performance issues. My guess is that the ongoing tweaking, and possibly debug code left in the beta build, could be the cause.
I built a machine for gaming 6-9 months ago:
Core i7 overclocked to 4GHz
12GB RAM
ATI 5970
I cannot run this game at ultra settings and maintain 60 fps, so to me it's doubtful anyone else can either. I had to turn down shadows and particle effects, and then I can usually maintain 60fps or close to it. That said, there are also performance issues related to particle effects, even with the slider turned down to about 25%. For instance, in the starter areas when those purple bomb things are blowing up all over, I can be running around at 60fps, then one goes off right in front of me and my frame rate drops to 35 until the bomb finishes exploding, and then I get back up to 60fps again.
I also tried a good bit of tweaking (turning off CrossFire on my card, and locking the affinity of the CPU to only one core), and both of those actions caused drops in frame rate.
I also think that the antialiasing in this game needs some major tweaking. One of the advantages of having a 5970 is that a lot of the extra power can go to higher levels of AA without affecting my frame rate. That is not the case here, where even 2x AA brings my frame rate down from 60 to about 45 on average.
I use V-Sync so 60 FPS is my cap.
I use FRAPS on my G19 to monitor my frame rate.
The last comment I wanted to make is that, as others have said, 60fps is really the standard for PC gaming; it's arguable on console systems. As others have pointed out, the standard for NTSC television is 30fps (29.97fps actually) and movies are roughly 24fps. That said, if you go into an electronics store or a trade show like CES and watch standard 60Hz TVs playing the same content as 120Hz or higher TVs, you will immediately see the difference. The general premise here is the same. I personally notice it in any PC game when I am panning the camera around quickly to see what's behind me and I am getting anything less than 60fps; it seems a bit clunky, and it certainly does not feel fluid.
So I guess to summarize, I am not going to call anyone a liar for claiming that they are getting 60fps on max quality settings, because it's possible ATI users are currently getting less performance than Nvidia users, but I still find it doubtful at this stage in the game.
I do believe it is mostly an issue with ATI support in the game. I run an i7 at 4.02GHz, 6 gigs of RAM and a GTX460 at 1920x1200 and easily exceed 60fps (between 58 and 66, depending on what's going on) at Ultra. I usually just run on High because there isn't too much notable difference between Ultra and High (to me, that is), which lets me get around 70fps. I use EVGA Precision to monitor FPS on my G15.
Now, I agree on the AA. Turn it on to 2x and, while on High, my FPS will drop from around 70 to about 50; 4x drops me down to near 35. Smoothing really doesn't make the game look all that much better, so I just keep AA off (until they get it to work a little better...).
There are 3 types of people in the world. 1.) Those who make things happen 2.) Those who watch things happen 3.) And those who wonder "What the %#*& just happened?!"
Comments
I agree, my performance was terrible; same processor. I admit I only have a 9600GT, but the game on the lowest settings made no difference compared to high settings: same FPS.
I never planned to play Rift anyway, but it is an issue.
Q9300 (2.5GHz) CPU
ATI 250 Video Card
3 GB RAM
Vista
I was breaking 80 FPS with settings on high last night. It dipped down to about 30 with big rift events and 60+ players on screen (plus dozens of invasions) all spamming spells, etc.
Make sure you are using the recommended drivers from Trion. They aren't always the latest version, and the difference in performance can be eye-opening.
All time classic MY NEW FAVORITE POST! (Keep laying those bricks)
"I should point out that no other company has shipped out a beta on a disc before this." - Official Mortal Online Lead Community Moderator
Proudly wearing the Harbinger badge since Dec 23, 2017.
Coined the phrase "Role-Playing a Development Team" January 2018
"Oddly Slap is the main reason I stay in these forums." - Mystichaze April 9th 2018
I love these posts that try to say the development team doesn't know how to optimize their engine, when it is usually the end-user that needs to optimize their drivers/settings/etc.
Developing for the PC market is extremely complicated because, unlike consoles, there is no single hardware set running the application. Even providing "minimum specs" is a tough decision to make, as that assumes hardware and not client configuration. The person with the minimum specs may try to raise the graphics level and thereby hurt their own performance.
Playing: Rift, LotRO
Waiting on: GW2, BP
Forgot to write that down... here is my complete system:
Mainboard: Asus M4A785TD-V Evo
CPU: AMD Phenom II X4 945
RAM: G.Skill 2GB DDR3
Graphics: Radeon HD 5770, 1GB
If it really takes 1900MB I should get new RAM, but I don't have spare money at the moment :-/ ... I hate components that simply break without a reason :-/
And which one do I need for my Radeon HD 5770? Is there any information on the official site?
I've been in the Rift beta for a good while now. I was also in the WAR beta and played it for about a year after launch. Rift runs tons and tons better than WAR. They overloaded the servers on this beta and jacked up the rifts and invasions to stress test. I am using an AMD 920 overclocked to 3.0GHz, 6GB of RAM and an AMD 4850 video card, and I am getting 45 fps on max settings (minus shadows) at 1920 x 1280, and the game runs very smooth, even with a lot of people on the screen. They are still optimizing the client and server. I am convinced that by launch the game will be running very well.
Had no performance issues at all and was playing for a solid five hours overnight, apart from the initial login problems, but I would expect that from any beta test.
I had a couple of friends last night with 5770s complain of the same thing; it turned out they had other shit running in the background that was killing their fps. I suspect ATI drivers; they are usually the culprit.
I'm playing this game on a computer made out of old Russian stove parts and toaster guts. OK, my computer isn't that bad; it's a single core with 2 gigs of RAM and a 512MB graphics card. But seriously, the game looks and runs amazingly well, even when there are dozens of people around.
The ones with no problems are the lucky ones, and some (probably high) percentage of players will fall into that category. It will never get to 100%; every game has people who cannot play it due to the specific composition of their hardware/software/network.
Throughout beta the number of folks having issues should continue to go down, and more people will be able to run the game at an acceptable level (or you'll end up with the situation we had with Vanguard or AoC).
Time will tell.
"True friends stab you in the front." | Oscar Wilde
"I need to finish" - Christian Wolff: The Accountant
Just trying to live long enough to play a new, released MMORPG, playing New Worlds atm
Fools find no pleasure in understanding but delight in airing their own opinions. Pvbs 18:2, NIV
Don't just play games, inhabit virtual worlds™
"This is the most intelligent, well qualified and articulate response to a post I have ever seen on these forums. It's a shame most people here won't have the attention span to read past the second line." - Anon
I run on a 2.4GHz single core with an 8800GTX and 2 gigs of RAM.
My performance is acceptable until I get into raid battles, and then it drops to unplayable levels.
I am not shocked by this and am very happy to have had the opportunity to play the game and realize I need to upgrade. This seems the best time to upgrade, so Trion and Rift get an A+ from me.
Well, I run everything on Ultra and my fps tends not to be extremely high, but the gameplay is smooth. That is what I judge my performance on.
I can take part in larger rifts and my computer doesn't become a slideshow, even though my fps is certainly not 60.
If people are judging their performance by the number but are still getting smooth performance, then I don't see the issue.
It seems to me that for some people, their "mileage may vary."
Godfred's Tomb Trailer: https://youtu.be/-nsXGddj_4w
Original Skyrim: https://www.nexusmods.com/skyrim/mods/109547
Serph toze kindly has started a walk-through. https://youtu.be/UIelCK-lldo
Those getting frame rate issues running the 5000-series ATI cards should give 10.9 a try; I think that is the driver version that Trion had suggested a bit ago. I also believe there may have been a revision 2 of the latest (10.12) that made some improvements. I have one computer with an nVidia 280GTX and another with an ATI 5870, and the 5870 gets a good 10-20 fps less. It just looks like ATI cards need a bit more tweaking to get into good frames.
It's a Jeep thing... you wouldn't understand.
I run everything on max without any problem. I use an Nvidia GTX 460; you should be doing fine with that display card of yours, unless the Phenom II X4 sucks.
Nah, this game will not have the whole EQ2 / AoC / Vanguard issue.
It's just not that graphically intense.
Godfred's Tomb Trailer: https://youtu.be/-nsXGddj_4w
Original Skyrim: https://www.nexusmods.com/skyrim/mods/109547
Serph toze kindly has started a walk-through. https://youtu.be/UIelCK-lldo
Performance is just fine on my system
Phenom x6 @ 4GHz
4GB RAM
2x 6950s
Most of my guild were playing with me last night, about 15 of us. There was a broad range of machines involved. Apart from a few needing to tweak their settings, nobody was reporting any performance problems. This is just from my experience, but I don't consider it to be a big problem. Reading some of the comments, though, it's obviously an issue for some; I just don't think it's as widespread as it's being made out to be in this thread. But as you say, time will tell.
It's probably one of two things:
1. The game, for whatever technical reason, runs differently on different machines and configurations regardless of how beefy the system is.
2. Players having issues might think they have good systems, but their systems might not be as good as they think they are. There are probably people putting the game on ultra who shouldn't be putting the game on ultra.
Godfred's Tomb Trailer: https://youtu.be/-nsXGddj_4w
Original Skyrim: https://www.nexusmods.com/skyrim/mods/109547
Serph toze kindly has started a walk-through. https://youtu.be/UIelCK-lldo
I have not yet tried to log into this beta phase since there were so many reports of people not getting in. In previous betas I had no issues at all, averaging about 70fps. It is possible they changed something that is giving people all these issues, but I think part of the reason they invited so many for this phase was to test more systems and more load. I will try to log in later and see how things are. I spent all my time today creating signatures and have not yet resurfaced from my paint program.
You are wrong. It's 60. There is a HUGE difference between 30 and 60.
[mod edit]
Mission in life: Vanquish all MMORPG.com trolls - especially TESO, WOW and GW2 trolls.
I have to warn everyone that I am a huge fanboy of this game. I preordered after beta 4 and set it up for a 6-month sub. That said, I have to agree with what some others are saying here: there are performance issues. My guess is that the ongoing tweaking, and possibly debug code left in the beta build, could be the cause.
I built a machine for gaming 6-9 months ago:
Core i7 overclocked to 4GHz
12GB RAM
ATI 5970
I cannot run this game at ultra settings and maintain 60 fps, so to me it's doubtful anyone else can either. I had to turn down shadows and particle effects, and then I can usually maintain 60fps or close to it. That said, there are also performance issues related to particle effects, even with the slider turned down to about 25%. For instance, in the starter areas when those purple bomb things are blowing up all over, I can be running around at 60fps, then one goes off right in front of me and my frame rate drops to 35 until the bomb finishes exploding, and then I get back up to 60fps again.
I also tried a good bit of tweaking (turning off CrossFire on my card, and locking the affinity of the CPU to only one core), and both of those actions caused drops in frame rate.
I also think that the antialiasing in this game needs some major tweaking. One of the advantages of having a 5970 is that a lot of the extra power can go to higher levels of AA without affecting my frame rate. That is not the case here, where even 2x AA brings my frame rate down from 60 to about 45 on average.
I use V-Sync so 60 FPS is my cap.
I use FRAPS on my G19 to monitor my frame rate.
The last comment I wanted to make is that, as others have said, 60fps is really the standard for PC gaming; it's arguable on console systems. As others have pointed out, the standard for NTSC television is 30fps (29.97fps actually) and movies are roughly 24fps. That said, if you go into an electronics store or a trade show like CES and watch standard 60Hz TVs playing the same content as 120Hz or higher TVs, you will immediately see the difference. The general premise here is the same. I personally notice it in any PC game when I am panning the camera around quickly to see what's behind me and I am getting anything less than 60fps; it seems a bit clunky, and it certainly does not feel fluid.
So I guess to summarize, I am not going to call anyone a liar for claiming that they are getting 60fps on max quality settings, because it's possible ATI users are currently getting less performance than Nvidia users, but I still find it doubtful at this stage in the game.
But why would you have to get 60 fps, or even 40 fps, if the game is running smoothly at less?
If the human eye can't detect the difference between 40 fps and 60 fps, then the point seems moot.
I would once again say: if you are getting smooth gameplay, then you are in good standing, regardless of some number that people are pulling out of their hat.
During the first beta I was going along, happy at ultra, when I started seeing people complaining that they weren't getting huge amounts of fps. I chuckled, thinking "I must be doing really well then," and put my cursor over the small computer icon, only to find that I was doing about 20 fps.
Yet had no one said anything, I wouldn't have been any the wiser, as I really wasn't having any issues.
So I wonder whether people are really getting a slideshow of a game, and if they are, whether it's because they have bad equipment or because something about the hardware they have is causing issues.
I would also wonder if people are just upset that they aren't seeing "60fps" because they feel that is some benchmark they need to meet, regardless of actual gameplay.
Godfred's Tomb Trailer: https://youtu.be/-nsXGddj_4w
Original Skyrim: https://www.nexusmods.com/skyrim/mods/109547
Serph toze kindly has started a walk-through. https://youtu.be/UIelCK-lldo
Actually, the notion that the human eye can only detect so many FPS is somewhat of a misunderstood myth. While it is true that much beyond 15FPS (not 30) the human eye cannot distinguish individual frames inserted into the stream of frames, the human eye can distinguish the difference in fluidity when more frames are added to the overall presentation.
I again challenge you to go to your local electronics store and watch a comparison between a standard 60Hz TV, a 120Hz TV, and a 240Hz TV all playing the same content. You will be able to tell the difference and put all three televisions in order from lowest to highest without first being told what they are. If the human eye really couldn't detect the changes in fluidity, then this would not be possible.
People who play PC games and are used to getting a solid 60FPS on max settings are probably the ones complaining about the lower frame rates. If your PC typically only garners 30fps on max settings in most games, you most likely won't notice. It's very much like the TVs: I never had a complaint about my 60Hz LCD until I bought a 120Hz LCD. Now I immediately see the difference on all 60Hz TVs.
I have a new i7-930 with 6GB RAM and an Nvidia GTX 465, and had both overclocked last night.
I was getting 17 fps with ultra settings and 4x supersampling turned on. The graphics looked awesome, but at a cost in fps.
Lowering it to high settings with 4x supersampling, it went to 25 fps. The graphics did not significantly decrease in apparent quality from ultra.
Not sure why it's so slow for me, but from what I recall I always got a low frame rate where I live with my other, older high-end computer.
I use Cox Premier cable internet, with or without a router; same difference.
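If that "supersampling" option is true SSAA, frame rates in the teens are not surprising: 4x supersampling shades roughly four times as many pixels before downsampling. A rough back-of-the-envelope sketch (the 1920x1080 resolution is only an assumed example; substitute your own):

```python
# Rough cost model for 4x supersampling, assuming plain SSAA:
# render at 4x the pixel count, then downsample to the display resolution.
width, height, ss_factor = 1920, 1080, 4   # example resolution, not from the post

base_pixels = width * height
ss_pixels = base_pixels * ss_factor
print(f"Native render : {base_pixels:,} pixels per frame")
print(f"4x supersample: {ss_pixels:,} pixels per frame (~{ss_factor}x the shading work)")
```

Note that the connection type (cable internet, router or not) has essentially no bearing on rendering frame rate; that part is a red herring.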
I do believe it is mostly an issue with ATI support in the game. I run an i7 at 4.02GHz, 6 gigs of RAM and a GTX460 at 1920x1200 and easily exceed 60fps (between 58 and 66, depending on what's going on) at Ultra. I usually just run on High because there isn't too much notable difference between Ultra and High (to me, that is), which lets me get around 70fps. I use EVGA Precision to monitor FPS on my G15.
Now, I agree on the AA. Turn it on to 2x and, while on High, my FPS will drop from around 70 to about 50; 4x drops me down to near 35. Smoothing really doesn't make the game look all that much better, so I just keep AA off (until they get it to work a little better...).
There are 3 types of people in the world.
1.) Those who make things happen
2.) Those who watch things happen
3.) And those who wonder "What the %#*& just happened?!"