Sorry for the poll, did something wrong with the post.
Well, first, it's not FSP...
It's FPS, and it means First Person Shooter.
Ahh, the FPS, now I understand.
The numbers in some games when you play.
It's the processor lag, I think.
FPS - frames per second
If I'm not mistaken, the higher the FPS, the better your gameplay experience
however! there's this "ping", and the higher it is, the slower your game will perform
See, the 1.10 patch to D2 didn't only bring new items.... it brought some knowledge to me ^^
---------------------------------------------------------------------
Played- Runescape, Conquer
Tested- EQ, RYL, Freeworld
Shitty then - used to have 40 FPS or around that, dropped a little to 30
In most cases, yes. In some cases the FPS may be too high, which is usually just a driver issue, and that will make for choppy gameplay.
FPS = Frames Per Second
If I remember right, the human eye sees things at 30 FPS. Yet games normally run higher than that and I notice it, so I don't know, I never understood that.
--------------------------------------
Wait, how could something run faster than the human FPS rate? I would think that would be the highest?
I'm not sure I fully understand it... but I have an idea. Article:
http://amo.net/NT/02-21-01FPS.html
--------------------------------------
A quick crash course in FPS
Everyone have their No 2 pencils? Good...
The brain is a wonderful thing. Games, video, animated GIFs, the mouse cursor: none of these really move. It's just an illusion. What you're really seeing is separate pictures being redrawn at an accelerated rate (think of the "flip drawings" we used to make in grade school). Each image is a "frame" shown by the TV or monitor. Your brain takes these frames in sequence and is able to put them together to simulate movement.
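(Not part of the original explanation, but if it helps to see the flip-book idea as code, here's a rough Python sketch; draw_frame is just a made-up stand-in for whatever rendering a real game or video player does.)

import time

TARGET_FPS = 60                 # say the game is aiming for 60 frames per second
FRAME_TIME = 1.0 / TARGET_FPS   # how long each still image stays on screen

def draw_frame(n):
    # Made-up stand-in for the real rendering work.
    print(f"frame {n}")

start = time.perf_counter()
for n in range(TARGET_FPS):     # roughly one second of "animation"
    draw_frame(n)
    # Sleep off whatever is left of this frame's slot so the stills come out
    # at a steady rate instead of as fast as the CPU allows.
    deadline = start + (n + 1) * FRAME_TIME
    time.sleep(max(0.0, deadline - time.perf_counter()))

Each pass of the loop is one "frame"; run them fast enough and the stills read as motion.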
The standard cinematic 24 FPS goes back to Thomas Edison. Edison set out to test the human ability to see movement perfectly, and he determined the human eye can detect no flicker at 48 FPS. For an example of this, take a look at the nearest fluorescent light: standard AC current causes the bulb to switch on and off ~60 times/sec. The problem with the 48 FPS speed was that film stock was expensive (the film contains silver) and 48 FPS chewed up tons of film. At first, films were shot at 16 FPS, with each image being flashed twice (tricking the brain), but when sound pictures were invented shortly after, the number was increased to 24 to improve audio quality. At that point, each image was flashed three times instead of twice.
Fast forward to the 50s and the advent of TV. Since the early signals were weak and prone to interference, 60 Hz was a practical standard to manufacture, and it matched up with the frame rate of any movies shown on the new medium. FYI: the early movies did in fact run faster when put on TV (60/2 = 30 FPS), but this was actually a benefit since time was at a premium, and there was no noticeable difference to the viewers (think ~5 minutes saved per every two hours). Nowadays the broadcasts are synced with the receiver.
Here's what it boils down to: you'll notice a difference in games up to about 48 FPS, and past that it doesn't matter. That's also why monitors don't usually go past 85 Hz.
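To put rough numbers on that (just quick arithmetic, not from any article): each frame sits on screen for 1/FPS seconds, so

for fps in (24, 30, 48, 60, 85):
    print(f"{fps:>2} FPS -> {1000 / fps:.1f} ms per frame")

prints about 41.7, 33.3, 20.8, 16.7 and 11.8 ms. Past ~48 FPS the gaps between frames get too small for most people to notice, which is the point above.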
A couple of final points---
First, the reason why TV has not changed its speed (30 FPS in the US) comes down to tradition and compatibility. You can watch the same films and TV on the earliest sets; if the rate were changed, though, the picture would get garbled because the old technology couldn't handle the additional images. Increasing the FPS too much would make billions of TVs obsolete overnight. Also, since films are still shot at 25-30 FPS, there would be no point, as that's where viewers would see the most difference anyway.
Lastly, you may be wondering why current TV images seem so much more fluid than early shows and games. Due to a process called interlacing, multiple images are shown simultaneously, and the brain sees less of a flicker than with a game running at 30 FPS. Below is what happens in the first 1/6th of a second:
Time You see:
1/30 Frames 1&2
2/30 Frames 2&3
3/30 Frames 3&4
4/30 Frames 4&5
5/30 Frames 5&6
And so on...
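(Again just my own toy snippet, not something from the post, but it spits out that same table and shows the pattern: at every 1/30 s tick two neighbouring frames overlap on screen.)

for tick in range(1, 6):
    print(f"{tick}/30  Frames {tick} & {tick + 1}")

So the eye never sees a clean cut from one frame to the next, which is why it looks smoother than a game drawing one whole frame at a time at 30 FPS.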