I can tell you right now that when I went shopping back when 1080p was heavily hyped, I lined them all up and could not tell the difference between 1080p and 720p; they looked the same, and at times the 720p looked better for whatever reason.
Most people cannot tell the difference between 720p and 1080p, in my experience.
Haha what does that even mean?
Claims "most people can't tell"; her evidence: "from my experience".
huh
Again you cut out the rest...
I will try to simplify it for you...
What experience do YOU have where you know what "most people" can and cannot do?
What experience do YOU have where you know what "most people" can and cannot do?
Because I know these experiments from biology classes.
You put up a screen with slits and check how long it takes before the person can say which rows are slits and which are not.
The faster they can tell, the higher the resolution they can see.
You can test this with animals too: instead of shrinking the slits, you just move the animal farther and farther away. By setting food behind the slits, you can check which side it picks; as long as it picks the right side, it can still resolve that detail.
Or you just put actual screens with slits in front of the animal and hide food behind one.
Hawks, for example, can resolve detail several times finer than our eyes can; dogs cannot.
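For what it's worth, the geometry behind that slit test is simple enough to sketch. A rough Python sketch (the ~1 arcminute human acuity threshold and the example slit sizes are assumed textbook-style figures, not from any study cited in this thread):

```python
import math

def angular_size_arcmin(feature_size_m, distance_m):
    """Angular size of one slit as seen from distance_m, in arcminutes."""
    return math.degrees(math.atan2(feature_size_m, distance_m)) * 60

def can_resolve(feature_size_m, distance_m, acuity_arcmin=1.0):
    """True if a viewer with the given acuity can still separate the slits."""
    return angular_size_arcmin(feature_size_m, distance_m) >= acuity_arcmin

# Moving the animal farther away shrinks the angular size, which is exactly
# what the food-behind-the-slits test exploits:
print(can_resolve(0.002, 3))   # 2 mm slits at 3 m: ~2.3 arcmin -> True
print(can_resolve(0.002, 10))  # same slits at 10 m: ~0.7 arcmin -> False
```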
Cost of entry for 4K is certainly a barrier for many, but there's another reason too. The best hardware can only just run 4K, and to get there you need SLI or CrossFire, which introduces other issues that many of us simply don't want to deal with. When the shrink to 14 nm happens later this year, you'll see a LOT more people making the jump to 4K.
She's just posting imaginary facts, as usual.
If you want a real fact, here it is:
It all depends on the viewing distance and the size of your screen. I have a 40" 4K screen at 1 m (3.3 feet), and you definitely see the difference between 1080p and 4K. And while a 40" screen seems small by today's standards in a living room, when it's on a computer desk and you watch a movie, you feel like you're in the movie theater.
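That claim checks out geometrically. A rough sketch of the math (screen size and distance taken from the post above; the ~1 arcminute acuity limit is a commonly quoted approximation, not something established in this thread):

```python
import math

def pixel_arcmin(diag_in, horiz_px, distance_m, aspect=(16, 9)):
    """Angular size of one pixel, in arcminutes, for a flat screen viewed
    straight on (small-angle approximation)."""
    w, h = aspect
    width_m = diag_in * 0.0254 * w / math.hypot(w, h)
    return math.degrees(width_m / horiz_px / distance_m) * 60

# 40" screen at 1 m, as in the post above:
print(round(pixel_arcmin(40, 1920, 1.0), 2))  # ~1.59: 1080p pixels visible
print(round(pixel_arcmin(40, 3840, 1.0), 2))  # ~0.79: 4K pixels near the limit
```

At that distance, 1080p pixels sit well above the ~1 arcminute threshold while 4K pixels drop below it, which is consistent with the difference being plainly visible.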
Agreed. I recently moved from 1080p to 1440p, and I can definitely see the improvement, so much so that I can't imagine going back to 1080p, and I most certainly wouldn't put up with an even lower resolution. The reason I don't play games at 4K is simply that, while I have a 980 Ti, I only get about 45 fps in the more graphically intensive games, and I prefer at least 60 fps, which at 1440p is easily achievable.
I could play Minecraft or some graphically awesome game; once I am really into the game, I don't register the difference. I would have to take a step back and decide to look at how pretty the game is to notice it. That is also the reason I usually play games on lower settings than I strictly need to. So while a 4K setup would look great, most of the time I would not notice the difference anyway, so it's not worth the money.
cagan said: I remember going to 1k resolution from 800x600.
Wait till the prices drop and everyone will be 4k
I really don't think this will happen. 4K drops your framerate by around 70% compared to 1080p.
That is a massive loss for a very marginal gain.
And this is a proportional loss: a few years from now, you will still lose ~70% of your FPS by rendering at 4K.
It's not like an AA filter, where the cost decreased over time.
It was different with 800x600; there the gain was noticeable.
From HD to 4K is a very, very small gain; some people can barely tell the difference.
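The ~70% figure is roughly what a naive fill-rate model predicts. A hedged sketch (this assumes rendering cost scales purely with pixel count, which real games only approximate):

```python
# Naive fill-rate model: if a game is limited purely by per-pixel work,
# frame rate scales inversely with pixel count.
RES = {"720p": (1280, 720), "1080p": (1920, 1080),
       "1440p": (2560, 1440), "4k": (3840, 2160)}

def pixels(name):
    w, h = RES[name]
    return w * h

def projected_fps(fps_measured, at, target):
    """Project fps at `target` resolution from a measurement at `at`."""
    return fps_measured * pixels(at) / pixels(target)

# 4K has exactly 4x the pixels of 1080p, so the model predicts a 75% drop,
# in the same ballpark as the ~70% quoted above (real games are partly
# CPU/geometry bound, hence the slightly smaller observed loss):
print(projected_fps(100, "1080p", "4k"))  # 25.0
```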
You know, going from 640x480 to 1080p drops your framerate. By a lot. But you know what else happened? Graphics hardware got faster. You can't honestly sit here and say 4K will always suck because it will always be slower; otherwise we'd still be sitting at vanilla VGA 640x480 (or worse) because it's always going to be faster (and I hear some competitive players do exactly that, so good on them).
I happen to be a big believer in higher resolutions and higher DPI. I can definitely tell; it looks considerably better to me. So what if I can't run 8x MSAA at 4K? I wouldn't need to, because increasing the resolution is better than simulating the effect via AA anyway.
Some people who can barely tell the difference are not ~ALL~ people. No one is forcing anyone to use 4K, or to upgrade to 4K immediately. If you are in total love with 1080p, then by all means keep right on using it, and be happy that computers can render at any of a myriad of resolutions without too much headache.
I honestly can't see a dot pitch ever getting too small, at least until we go back to true analog displays and vector-based graphics.
But I did a little experiment.
I took a high-quality still image, 1920x1080, from a movie. I made sure it's a sharp image with lots of fine detail.
I turned it into a 1280x720 image.
What I did next was take that 720p image and upscale it back to 1080p.
I won't tell you where I put it... it's either above or below the 1080p image.
If you can't tell where I put it, you can't tell the difference between 1080p and 720p.
(feel free to click the image for the full resolution)
(only view the image at 100%; no extra "Ctrl + zooming" beyond the normal screen resolution, otherwise it's no longer fair)
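The experiment above is easy to reproduce at home. A sketch using Pillow (a synthetic noise frame stands in for the movie still, since the original image isn't included here; any sharp 1920x1080 still would do):

```python
import os
import random
from PIL import Image

# Synthetic noise frame stands in for a sharp 1080p movie still, since the
# original image from the post isn't available here.
src = Image.frombytes("RGB", (1920, 1080), os.urandom(1920 * 1080 * 3))

small = src.resize((1280, 720), Image.LANCZOS)    # downscale to 720p
back = small.resize((1920, 1080), Image.LANCZOS)  # upscale back to 1080p

# Stack the original and the round-tripped copy in a random order; if you
# can't pick out the round-tripped half at 100% zoom, you can't tell 720p
# from 1080p on that material.
halves = [src, back]
random.shuffle(halves)
combo = Image.new("RGB", (1920, 2160))
combo.paste(halves[0], (0, 0))
combo.paste(halves[1], (0, 1080))
combo.save("which_is_which.png")
```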
According to actual studies, players cannot tell the difference between 30 fps and 60 fps. Yet somehow everyone thinks they must have 60 fps.
Yes, and every one of those 0.07% will complain incessantly if a game doesn't support it. They've got to justify spending enough cash to make a down payment on a home on their gaming rigs somehow, I suppose.
I'm perfectly happy with 1080p, and most likely will be for quite some time.
AN' DERE AIN'T NO SUCH FING AS ENUFF DAKKA, YA GROT! Enuff'z more than ya got an' less than too much an' there ain't no such fing as too much dakka. Say dere is, and me Squiggoff'z eatin' tonight!
We are born of the blood. Made men by the blood. Undone by the blood. Our eyes are yet to open. FEAR THE OLD BLOOD.
Jean-Luc_Picard said: PS: it's the bottom image with Jon Snow which has been up-scaled. It's very obvious to see, especially in the hair. In the top image, the image is precise and you can see every single lock of hair; in the bottom image it's all blurry. So much for the demonstration...
I found her demonstration convincing and helpful. I was totally unable to tell any difference between the two pictures, although I did sense a minor blurring on one image and had no idea what it meant. Not at all obvious to me.
The bottom Jon Snow is the 720p one... and fighter pilots can reportedly perceive changes up to 960 Hz, with 480 Hz being the cutoff for most people.
The difference between 144 Hz and 60 Hz is so big for me that I get physical discomfort when I'm at someone else's place: I complain about the dragging cursor and the horrible delay and input lag, and I can't do anything in any game without hours of adjustment...
I could play Minecraft or some graphically awesome game; once I am really into the game, I don't register the difference. I would have to take a step back and decide to look at how pretty the game is to notice it. That is also the reason I usually play games on lower settings than I strictly need to. So while a 4K setup would look great, most of the time I would not notice the difference anyway, so it's not worth the money.
This is where I notice the difference the most. Going from 480p back in the day to 1080p today is MOST noticeable to me when gaming. I tend to examine distant trees, like in JC3, or texture resolution in games, just because I really like graphics. It's a big deal for me when gaming.
I couldn't care less when watching a movie. These days I'm generally watching it on a tablet anyway
As for the Jon Snow image, the most noticeable part is the area to the right of and above his eye: the transition between hair and face, and the individual strands of hair, are heavily blurred.
Then again, my big MMO these days is EQ Classic... obviously it's not mandatory.
There's a simple test to see if your monitor resolution is high enough that you wouldn't be able to tell the difference with a higher resolution, and you can do it yourself. Pick a game that offers anti-aliasing and try turning anti-aliasing on and off. See if you can tell the difference between anti-aliasing on and anti-aliasing off. If you can't, then your monitor resolution is high enough and you have nothing to gain with a higher resolution. If there's an obvious difference, then a higher resolution would look better to you.
The blurred picture really isn't a good demonstration, as your monitor pixels are whatever size they are and rescaling a picture doesn't change that. You can make it into a better demonstration by blowing everything up by a factor of 2 and not interpolating, so that one pixel of the original becomes four pixels of the new. But that will only tell you whether your current resolution will be better than half your current resolution, which it will and by a wide margin.
I wouldn't trust photo editing tools to refrain from any sort of interpolation, though some might. But you can get a clear demonstration of this in an actual game. Try loading Champions Online and turn half resolution on. See if you can tell the difference. You will be able to, as it's a huge difference. Even if you're on a 4K monitor, the difference is considerable.
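The "one pixel becomes four" upscale described above is easy to do by hand, which sidesteps any photo editor's hidden interpolation entirely. A minimal pure-Python sketch:

```python
def upscale_2x_nearest(pixels):
    """2x nearest-neighbour upscale of a row-major grid of pixel values:
    every source pixel becomes a 2x2 block, with no interpolation."""
    out = []
    for row in pixels:
        doubled = [p for p in row for _ in (0, 1)]  # duplicate each column
        out.append(doubled)
        out.append(list(doubled))                   # duplicate the row
    return out

print(upscale_2x_nearest([[1, 2],
                          [3, 4]]))
# [[1, 1, 2, 2], [1, 1, 2, 2], [3, 3, 4, 4], [3, 3, 4, 4]]
```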
According to actual studies, players cannot tell the difference between 30 fps and 60 fps. Yet somehow everyone thinks they must have 60 fps.
That is wildly false. It might be true if you're only watching a pre-recorded video and not interacting with it. But when playing a game, even if the animations look smooth to someone else watching, a lower frame rate adds latency, so that there's a longer delay between when you press a button and when it takes effect on the screen. And that's very noticeable, even for the difference between 30 and 60 frames per second.
The effect is similar to using a software cursor rather than a hardware cursor. Champions Online lets you manually force a software cursor. Try using it instead of the hardware cursor for a while and see how long it takes to drive you nuts.
There are diminishing returns to higher frame rates, of course. You gain a lot less by going from 60 Hz to 120 Hz than you gained by going from 30 Hz to 60 Hz. In terms of latency, 60 Hz is as close to 30 Hz as it is to infinite frame rate. But higher frame rates are always better in terms of gameplay experience.
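The latency arithmetic behind that point: frame time is the inverse of frame rate, so each doubling of the rate saves half as many milliseconds as the previous one.

```python
def frame_time_ms(fps):
    """Time one frame stays on screen: the floor on input-to-photon delay."""
    return 1000.0 / fps

for fps in (30, 60, 120, 240):
    print(f"{fps:3d} fps -> {frame_time_ms(fps):5.1f} ms per frame")

# Going 30->60 Hz shaves 16.7 ms per frame; going 60->infinity can only
# ever shave another 16.7 ms. That's the "60 Hz is as close to 30 Hz as
# it is to infinite frame rate" point above.
```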
I forgot, I do play 4K gaming: my phone is 1440x2560 at 640 dpi. Very beautiful screen. I can see the difference when I force a lower resolution to get better framerates. Of course, phone games aren't as graphically intensive in most cases.
I can tell the difference between when I'm getting 60fps and when it's lower. I can also tell the difference between 1080p and various resolutions.
Making a statement about people not knowing the difference exposes you as not knowing what you're talking about, making claims with no basis in fact or personal experience. Have the decency to AT LEAST speak from personal experience.
"As far as the forum code of conduct, I would think it's a bit outdated and in need of a refre *CLOSED*"
I am gaming in 4K and I absolutely love it. However, I am not using a 15-inch monitor; I am gaming on my 65-inch Vizio 4K TV, and it looks awesome. Yeah, they are coming out with 4K 27-inch monitors, but heck with that lol.
Also, the Jon Snow "test" is flawed: how was the image captured? Did the person grabbing the image take .jpg compression artifacts into consideration? Is the grab from source? Why are the black levels different? A still is not a representation of motion.
"As far as the forum code of conduct, I would think it's a bit outdated and in need of a refre *CLOSED*"
Well, obviously, if prices are low enough people will buy just because they need a new monitor, or maybe a whole system. Adding in expensive video cards, super-fast CPUs, and high-end monitors with fast response times, you'll end up with a VERY marginal improvement, not nearly worth the cost.
I can tell you right now that when I went shopping back when 1080p was heavily hyped, I lined them all up and could not tell the difference between 1080p and 720p; they looked the same, and at times the 720p looked better for whatever reason. People tend to let their brains dictate: it costs more, it's hyped, it's better on paper, so I should buy it!! Then of course you have to be careful of a storefront setting up the monitors to make 4K look way better. I do not foresee a single game needing these; we are lucky to see texture sizes of 1200x1200 being used, and often much lower resolutions. So in essence you'll be paying for a FAKE picture, not really a better picture.
There are other factors that go into it as well, like your connections; if the connections are ancient tech, you're downgrading the purpose of going 4K.
The difference between 720p and 1080p: ~1.2 million pixels
The difference between 1080p and 4K: ~6.2 million pixels
If you don't see the difference then you need glasses; go see a doctor.
___
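The raw pixel arithmetic, for anyone who wants to check it (the exact 720p-to-1080p gap is about 1.15 million pixels):

```python
# Total pixel counts for the three common resolutions in this thread.
RES = {"720p": 1280 * 720, "1080p": 1920 * 1080, "4k": 3840 * 2160}

print(RES["1080p"] - RES["720p"])  # 1152000 -> ~1.15 million extra pixels
print(RES["4k"] - RES["1080p"])    # 6220800 -> ~6.2 million extra pixels
```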
Texture size has nothing to do with resolution. You don't need 4K textures to make a game look better in 4K. That has to be the most ridiculous myth, and it shows people have no idea how 3D games are rendered.
Tell me, how often do you see a game texture fill your whole screen (using all 2 million pixels of your 1080p monitor)? Pretty much NEVER, unless you like to stare at walls at close range.
Now ask yourself why you would even need a 1920x1080 texture if you're never looking at it flat across the whole screen. 99.9% of the time you do not see textures at full size on screen.
___
Ancient connections: another uninformed myth. ALL MODERN CONNECTIONS ARE DIGITAL. There is no loss!
There are only TWO connections that work with 4K: the newest HDMI 2.0 and DisplayPort 1.3. Both are fully digital; there is no quality loss at all.
HDMI 2.0
Max Resolution: 4096x2160
Refresh: 60 Hz
Audio: 32 channels
DisplayPort 1.3
Max Resolution: 7680x4320
___
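Both links have the headroom for that. A quick bandwidth sanity check (raw pixel data only, ignoring blanking intervals and link-encoding overhead):

```python
def pixel_rate_gbps(width, height, fps, bits_per_pixel=24):
    """Raw uncompressed video bandwidth in Gbit/s, ignoring blanking
    intervals and link-encoding overhead."""
    return width * height * fps * bits_per_pixel / 1e9

print(round(pixel_rate_gbps(3840, 2160, 60), 1))  # ~11.9 Gbit/s (UHD)
print(round(pixel_rate_gbps(4096, 2160, 60), 1))  # ~12.7 Gbit/s (DCI 4K)
# HDMI 2.0's 18 Gbit/s link and DisplayPort 1.3's 32.4 Gbit/s both clear
# that bar, which is why both can drive 4K at 60 Hz.
```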
You need to inform yourself before posting stuff like that. Obviously you've never seen a 4K monitor, and you have no idea about resolution or 3D games either.
"It's pretty simple, really. If your only intention in posting about a particular game or topic is to be negative, then yes, you should probably move on. Voicing a negative opinion is fine, continually doing so on the same game is basically just trolling." - Michael Bitton Community Manager, MMORPG.com
"As an online discussion about Star Citizen grows longer, the probability of a comparison involving Derek Smart approaches 1" - MrSnuffles's law
"I am jumping in here a bit without knowing exactly what you all or talking about." - SEANMCAD
Wish I was that way myself, but I can tell cheap shit when it's cheap shit.
http://www.digitaltrends.com/home-theater/720p-vs-1080p-can-you-tell-the-difference-between-hdtv-resolutions/
http://lifehacker.com/can-you-tell-the-difference-between-720p-1080p-and-4k-1731323537
http://carltonbale.com/1080p-does-matter/
Thanks for the additional clarification, though.
And yes, I filled in the Steam Survey lol