With all the buzz around high-refresh-rate monitors, I'm wondering how demanding they are on the computer.
If you compare a game running at 1920x1080 @ 60 Hz against the same game at 1920x1080 @ 120 Hz, is the latter twice as demanding on the graphics card, since it has to draw twice as many frames?
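Back-of-the-envelope, twice the frames means half the time budget per frame; a tiny sketch, just for the arithmetic:

    # Sketch: per-frame time budget at each refresh rate, assuming every
    # frame must be finished before the next refresh.
    for hz in (60, 120):
        print(f"{hz} Hz -> {1000.0 / hz:.2f} ms per frame")
    # 60 Hz -> 16.67 ms per frame
    # 120 Hz -> 8.33 ms per frame

Whether that actually works out to "twice as demanding" presumably depends on how much of each frame's cost is GPU-bound.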
Comments
But I'd never see a higher refresh rate as an argument in itself against getting a monitor. If all your video card can deliver at given settings in a given game is 50 frames per second, it will be completely busy delivering those 50 frames per second, regardless of whether you have a 60 Hz monitor or a 120 Hz one. Those 50 frames per second will also look smoother at 120 Hz than at 60 Hz, as the monitor can space the frames more evenly. Or better yet, if you have a monitor with adaptive sync, it can space the frames perfectly and make 50 frames per second look pretty smooth, without the judder you'd see at 60 Hz with no adaptive sync.
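To make that spacing argument concrete, here's a minimal sketch, assuming simple vsync where each finished frame waits for the next refresh (all numbers are made up for illustration):

    # Sketch: put a steady 50 fps frame stream onto 60 Hz and 120 Hz vsync
    # grids (each frame shown at the first refresh after it is ready) and
    # look at the gaps between on-screen frames.
    import math

    def on_screen_gaps(fps, hz, frames=8):
        grid = 1000.0 / hz                                  # refresh period, ms
        ready = [i * 1000.0 / fps for i in range(frames)]   # frame-ready times, ms
        shown = [math.ceil(t / grid - 1e-9) * grid for t in ready]
        return [round(b - a, 1) for a, b in zip(shown, shown[1:])]

    print(60, on_screen_gaps(50, 60))    # [33.3, 16.7, 16.7, 16.7, 16.7, 33.3, 16.7]
    print(120, on_screen_gaps(50, 120))  # [25.0, 16.7, 25.0, 16.7, 16.7, 25.0, 16.7]

On the 120 Hz grid the worst gap deviates from the ideal 20 ms by only 5 ms instead of 13.3 ms, and with adaptive sync every gap would be a flat 20 ms.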
People were playing CS in PC bangs for years before anyone anywhere even knew what a 120 Hz monitor was.
It's also extremely helpful in any twitch shooter, MOBA, etc.
Not to mention most other games start looking a lot smoother above about 75-80 Hz (90 Hz and higher for people with a keen sense of motion and good eyesight).
Oh, and as for not being able to deliver the FPS: keep in mind that in most competitive shooters and MOBAs the proper thing to do is to lower all non-essential graphics settings to minimum, not to increase them.
For instance, in CS:GO you only want global shadow quality on highest, to give you proper, smooth shadows: that lets you, in a corner situation (where you've positioned yourself properly in relation to the fixed light source), spot the enemy before they spot you, because you can see their shadow and they can't see yours.
Everything else is lowered to minimum except the optional anti-aliasing/texture filtering, which some people set to 2x or 4x simply so they aren't distracted by distant objects flickering when zooming in and out.
Do you think higher refresh rates could mean reduced nausea from games? Or is that mainly a framerate issue?
The problem has never been Hz for me. I've run games at 200 FPS on a 60 Hz monitor, and while there may be a bit of annoying flicker because of the refresh rate, the game still runs butter smooth, which is the most important factor to me.
A few possible causes of the nausea:
1) Poor frame rates that make the motion irregular
2) Poor monitor quality, especially flickering
3) Large granularity of images that makes stuff appear too pixelated
4) Perspective-incorrect field of view such that movement or rotation is an optical illusion of sorts
Of those, (3) is still a big issue with VR headsets and possibly other stereoscopic 3D, but otherwise shouldn't be a problem. (2) could be a problem with a bad monitor, but is readily fixed by getting a good monitor. I'd only suspect (4) if the nausea varies wildly from game to game--and in particular, isometric games are completely fine. I'd be very surprised if (1) is a problem when you're getting a steady 60 frames per second.
#4 definitely gets to me. It's one of the reasons I haven't done any multi-monitor gaming - you gotta get the perspective just right or it's like looking through a fisheye lens, and it does make me carsick, and not every game has easily adjustable FOV controls.
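For what it's worth, the perspective-correct horizontal FOV is just geometry, 2*atan(screen width / (2 * viewing distance)); a small sketch, with placeholder measurements rather than recommendations:

    # Sketch: perspective-correct horizontal FOV from screen width and
    # eye-to-screen distance. Measurements are placeholders, not advice.
    import math

    def correct_hfov_deg(screen_width_cm, view_distance_cm):
        return math.degrees(2 * math.atan(screen_width_cm / (2 * view_distance_cm)))

    print(correct_hfov_deg(60, 60))    # ~53 deg: one ~27" 16:9 panel at 60 cm
    print(correct_hfov_deg(180, 60))   # ~113 deg: three such panels side by side

The fisheye look comes from rendering a much wider FOV than the screen actually subtends at your eye.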
I've also had the motion blur effect in some games make me a bit queasy - that one is usually easy to turn off.
Everquest drives me nuts - I get crazy high FPS nearly everywhere, except the Guild Lobby where everyone hangs out. You mouselook across where people are sitting AFK, and the framerate tanks down into the single digits. I had to get very good at navigating the lobby while staring at the floor or walls, because if you accidentally looked the wrong direction, it would take many, many seconds to get your viewing window back to someplace where the computer would respond.
That isn't a hardware problem, it's a very old software problem. But yeah.
Don't start with that "your eye can only see so many FPS" bullcrap unless you know what you're talking about. It is an undeniable fact that how quickly you move your mouse determines how smooth the rendered result looks at your current FPS. I will nonetheless give you a quick lesson.
If your game is running at 30 FPS and you do not move your mouse, just stare at the wall, you are correct: your eye can't tell the difference between 30 FPS and 300, because there is no movement to test the technical limitations of your computer's hardware, or the software limitations (a 30 FPS frame lock) set by the developers.
HOWEVER, once you start whipping your mouse around rapidly, EVERYTHING changes. For the picture in motion to approach the clarity of a still image, your computer HAS to render higher than 30 FPS, or even 60 FPS, to keep the frames smooth. AND the faster you move your mouse, the more important this becomes.
If you're a slow mouse turner in game, FPS or otherwise, and never force your monitor to redraw faster than 30 FPS to maintain still-picture clarity, you will agreeably NEVER notice the difference between 30 and 300. HOWEVER, if you whip your mouse around, say changing your view 180 degrees in 0.25 seconds, you will VISIBLY see a blurring of the image on screen as the monitor struggles to redraw that much of the environment in 0.25 seconds. Your 30 FPS effectively gives you fewer than 10 frames to cover that 0.25 seconds of motion, resulting in a slideshow of images.
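The arithmetic behind that 180-degrees-in-0.25-seconds example, as a quick sketch:

    # Sketch: degrees the view jumps between frames during a 180-degree
    # flick that takes 0.25 s, at a few frame rates.
    flick_deg, flick_s = 180.0, 0.25
    for fps in (30, 60, 144):
        frames = fps * flick_s
        print(f"{fps} fps: {frames:.1f} frames, {flick_deg / frames:.1f} deg per frame")
    # 30 fps: 7.5 frames, 24.0 deg per frame
    # 60 fps: 15.0 frames, 12.0 deg per frame
    # 144 fps: 36.0 frames, 5.0 deg per frame

At 24 degrees per step, a quarter-second flick really is closer to a slideshow than to continuous motion.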
I'm sick of explaining this to people who play games in a manner that never tests the limits of their hardware, who don't even understand those limits, and who then say "the game plays smooth for me". The guy who barely changes his field of view.
Having an average FPS just barely at or slightly over the refresh rate means next to nothing if your minimum FPS at times falls below it, and because of how differently games are coded, there can be a lot of FPS drops.
It's a completely different problem from trying to reduce motion blur and/or input lag with 120/144 Hz and/or LightBoost.
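One way to put numbers on that, loosely modeled on the "1% low" metric benchmarkers use (the frame times here are invented for illustration):

    # Sketch: average FPS vs. "1% low" FPS from a list of frame times (ms).
    # The frame times are made up: mostly smooth with a few heavy spikes.
    frame_times_ms = [16.7] * 95 + [40.0] * 5
    avg_fps = 1000.0 * len(frame_times_ms) / sum(frame_times_ms)
    worst_1pct_ms = sorted(frame_times_ms)[int(0.99 * len(frame_times_ms))]
    print(f"average: {avg_fps:.0f} fps, 1% low: {1000.0 / worst_1pct_ms:.0f} fps")
    # average: 56 fps, 1% low: 25 fps

An average near 60 FPS can hide minimums that dip well below the refresh rate.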