
Ugh, why Acer why... (New Gaming Monitor News)


Comments

  • Quizzical Member Legendary Posts: 25,531
    Originally posted by Kilrane
    I apologize if I came off as argumentative; that was not my intention. First off, you and Quizz know a lot more than I do; I only posted the results I was able to dig up. I honestly like the cheaper route, and I have no intention of buying a G-Sync monitor anytime soon. I was just confused because, from what I was able to gather with my google-fu skills, I was under the impression that AMD's solution was only going to be directly supported by AMD's GPU line.

    It's kind of like saying that only Nvidia has CUDA cores, while only AMD has stream processors.  They're just different names for the same thing:  the thing that we used to call shaders.

    FreeSync is AMD's name for their implementation of adaptive sync.  But anyone else can also implement adaptive sync and call it whatever they want.  It won't use the FreeSync name, but it will be exactly the same thing by another name.  And given that the original point of adaptive sync was to reduce power consumption by not making monitors refresh as often, I'd expect to see broad support sooner rather than later--and that includes Intel.  I don't know if tablets and cell phones already have some way of doing the same thing, but if they don't, they probably will soon.

  • Quizzical Member Legendary Posts: 25,531
     
    AMD just announced adaptive sync monitors from five vendors:  BenQ, LG, Nixeus, Samsung, and Viewsonic.  The LG monitors use a stupid resolution, and the Samsung models are all 4K resolution with not enough inches.  But the others could be interesting if they use some panel technology that gives good image quality.  And, of course, these are the first monitors to support adaptive sync; they'll hardly be the last.
  • F0URTWENTY Member Uncommon Posts: 349

     

    For anyone who doesn't understand: at 1440p you will not be running any games at 144 fps to make use of that 144 Hz refresh rate. Stick to 1920x1080 monitors for gaming right now, or be prepared to run all games on lowest settings, in which case the 1440p is a handicap. When video cards can handle higher resolutions better, then start thinking about resolutions above 1920. GPUs will need much more memory than the current 4 GB standard to handle resolutions above 1920x1080, and that is not a focus for Nvidia or AMD at the moment.

  • Quizzical Member Legendary Posts: 25,531
    Originally posted by Sketch420

     

    For anyone who doesn't understand: at 1440p you will not be running any games at 144 fps to make use of that 144 Hz refresh rate. Stick to 1920x1080 monitors for gaming right now, or be prepared to run all games on lowest settings, in which case the 1440p is a handicap. When video cards can handle higher resolutions better, then start thinking about resolutions above 1920.

    But the point of adaptive sync is that the 144 Hz is an upper bound, not some fractional multiple thing.  If you buy a 144 Hz adaptive sync monitor and your video card can deliver a smooth 113 Hz, the monitor displays it as a smooth 113 Hz, with frame display times as even as the video card can deliver.  If the video card can deliver 79 Hz, you get 79 Hz, again as smooth as the video card can deliver.  You get it with lower display latency and no judder.  If the video card can deliver 113 Hz in one area, and then it drops to 79 Hz in another, you get the maximum possible frame rate in both, with no judder, no tearing, no having to adjust monitor settings yourself, and reduced display latency.
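
    If a toy model helps, here is roughly what the monitor does (a minimal Python sketch with made-up numbers; it ignores the panel's minimum refresh rate, below which real implementations repeat frames):

        # Toy model: an adaptive sync panel refreshes when a new frame arrives,
        # but never faster than its maximum rate (144 Hz here, ~6.94 ms minimum).
        MAX_REFRESH_HZ = 144
        MIN_INTERVAL_MS = 1000 / MAX_REFRESH_HZ

        def displayed_interval_ms(render_time_ms):
            """How long each frame stays on screen: the GPU's frame time,
            floored at the panel's minimum refresh interval."""
            return max(render_time_ms, MIN_INTERVAL_MS)

        for fps in (113, 79, 200):
            shown_ms = displayed_interval_ms(1000 / fps)
            print(f"GPU at {fps} fps -> new frame every {shown_ms:.2f} ms "
                  f"(~{1000 / shown_ms:.0f} Hz effective)")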

  • Wizardry Member Legendary Posts: 19,332

    Marketing is just that, marketing: they will always tell you the good side, never any downside. When I think of Acer I automatically think budget, low-end peripherals, no matter what they say about their products.

    A perfect example is my monitor. I could look at the stats and say my monitor is crap, but I bought it after looking at all the monitors with my own eyes and it looked the best, and that is all that matters to me.

    I know ghosting is most certainly real; I have seen the results, and on screens that were supposed to be really good. So you always need to see for yourself and not rely on stat sheets or on what Tom's Hardware or any other tech site is saying. Go look for yourself, compare it to other screens on sale, and just look; don't read stat sheets.

    My gut feeling is that when people buy something new they get all excited and want to see something better than is really there. I used to get all caught up in buying and checking out all the new stuff, get it home, and then realize: wow, this is not really worth the money for such a small improvement. In other words, if you are happy with a much cheaper screen, then it is plenty good enough.

    You only buy the very best and most expensive because you have money to waste, not because you NEED it.

    Never forget Three Mile Island, and never trust a government official or company spokesman.

  • F0URTWENTY Member Uncommon Posts: 349
    Originally posted by Quizzical
    *snip*

    But the point of adaptive sync is that the 144 Hz is an upper bound, not some fractional multiple thing.  If you buy a 144 Hz adaptive sync monitor and your video card can deliver a smooth 113 Hz, the monitor displays it as a smooth 113 Hz, with frame display times as even as the video card can deliver.  If the video card can deliver 79 Hz, you get 79 Hz, again as smooth as the video card can deliver.  You get it with lower display latency and no judder.  If the video card can deliver 113 Hz in one area, and then it drops to 79 Hz in another, you get the maximum possible frame rate in both, with no judder, no tearing, no having to adjust monitor settings yourself, and reduced display latency.

     

    Right, but competitive gamers want to play at >120 Hz. All of the top 60 CS:GO players have >120 Hz monitors for a reason. There's no sense buying an expensive 1440p 144 Hz monitor when you can't run any games at that frame rate at nearly twice the resolution of 1920x1080.
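
    (For reference, 2560x1440 has about 1.78 times the pixels of 1920x1080, which is where the "nearly twice" figure comes from:)

        # Pixel count ratio between 2560x1440 and 1920x1080.
        print((2560 * 1440) / (1920 * 1080))  # ~1.78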

  • Tamanous Member Rare Posts: 3,030

    Man, companies sure can get people to buy crap that is useless. I understand higher refresh rates and other factors outside of resolution being the drive for true image quality, but these TVs and monitors already being pushed without DP 1.3 or HDMI 2.0 are a complete waste of money. You can't even see 4K resolution until you're 10 feet from a 140" screen.

     

    Without improved bandwidth from the video card to the screen, you are actually reducing your image performance, with zero increase in noticeable resolution, by going to such high resolutions. This approach to selling isn't anything new, but the number of people blowing cash on "new tech" for absolutely no performance gain, with the chance it even comes with legacy hardware and no way to upgrade to functional standards like DP 1.3 and HDMI 2.0, is crazy. Luckily this LCD supports HDMI 2.0 at least.

    You stay sassy!

  • Quizzical Member Legendary Posts: 25,531
    Originally posted by Sketch420
    *snip*

     

    Right, but competitive gamers want to play at >120 Hz. All of the top 60 CS:GO players have >120 Hz monitors for a reason. There's no sense buying an expensive 1440p 144 Hz monitor when you can't run any games at that frame rate at nearly twice the resolution of 1920x1080.

    The difference in GPU load between 2560x1440 at 144 Hz and 1366x768 at 30 Hz is about the difference in GPU performance between relatively good laptop integrated graphics and a high end desktop video card.  Are games playable on laptop integrated graphics?  Not necessarily at max settings, but most of them are at reduced settings.
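
    A back-of-the-envelope way to see that gap, counting raw pixel throughput only (real GPU load does not scale perfectly with pixels per second, since geometry and CPU work don't grow with resolution):

        # Pixels pushed per second at each target.
        high = 2560 * 1440 * 144   # 2560x1440 at 144 Hz
        low = 1366 * 768 * 30      # 1366x768 at 30 Hz
        print(high / low)          # ~16.9x more pixel throughput required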

  • 13lake Member Uncommon Posts: 719

    Sketch, all of the top 60 CS:GO players play at either 800x600 or 1024x768; heck, 99% of all pro CS:GO players play at one of those two resolutions regardless of what monitor they use. A 1440p monitor would just let them stretch their Twitch and broadcaster programs more, stretch out the Twitch chat, and rearrange their desktop.

    Also, because ultra low motion blur and similar features cannot be used while G-Sync adaptive sync is on (only when it's turned off; it's one or the other), the pro players never got to use adaptive sync before, are not using it now, and probably never will.

    However, I also don't know whether it can be used with FreeSync or adaptive sync in general. The question is whether adaptive sync can run when the strobe is set to 50% or less for decreasing motion blur to the levels of 500 Hz and 1000 Hz.

    Considering it doesn't work with G-Sync, it's highly likely it doesn't work with FreeSync. I vaguely remember reading that adaptive sync and LightBoost/ultra low motion blur are mutually exclusive, but I don't recall the specifics of why; John Carmack did a whole huge explanation of it.

    If it somehow could be used, then all FreeSync monitors would automatically become de facto CS monitors, and none of the pro scene would touch anything with a G-Sync board with a ten-foot pole.

  • Hrimnir Member Rare Posts: 2,415

    So, a guy clarified this for me on the AnandTech forums.  DP 1.2 is capable of 32-bit color at 144 Hz/1440p; here was his explanation, for anyone who cares:

    err, strike part of that; I was off by one DP generation (and 2x speed) mentally. DP 1.2 is able to fit both of those resolutions in a single stream. For high-bit-rate color, output levels are 30 bits (10 bits/channel), which does just squeeze in at 1440p/144 Hz. What is normally called 32-bit color on your GPU is 24-bit color plus an 8-bit alpha (transparency) channel. The alpha is only used in rendering and is not output to the monitor. I'm not sure if 30-bit color is bit-packed to 40 bits (to minimize memory consumption) when an alpha channel is needed, or expanded to 64 bits/pixel (memory accesses are easier to program and generally faster when aligned to the size of processor data words).
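
    A rough check of the "just squeeze in" part, counting raw pixel data only (ignoring blanking and protocol overhead, and taking DP 1.2 HBR2's commonly quoted ~17.28 Gbit/s payload figure):

        # 30-bit color (10 bits per channel) at 2560x1440, 144 Hz.
        bits_per_second = 2560 * 1440 * 30 * 144
        print(bits_per_second / 1e9)   # ~15.93 Gbit/s, under the ~17.28 Gbit/s budget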

    "The surest way to corrupt a youth is to instruct him to hold in higher esteem those who think alike than those who think differently."

    - Friedrich Nietzsche

  • KyutaSyuko Member Uncommon Posts: 288
    Originally posted by Quizzical
    Originally posted by Kilrane
    *snip*

    *snip*

    Nvidia hasn't officially announced that they'll support adaptive sync just yet, but I'll be very, very surprised if they don't.  It would be suicidal not to, as you'd be telling people considering buying video cards that they have to pay an extra $100 per monitor to get the same thing that AMD offers without the $100 surcharge.

    *snip*

    Excuse my ignorance, but in the nVidia Control Panel under the "Manage 3D settings" Vertical sync has Adaptive and Adaptive (half refresh rate) options.  I'm guessing that's not the same thing as what you're discussing now?

  • Quizzical Member Legendary Posts: 25,531
    Originally posted by KyutaSyuko
    *snip*

    Excuse my ignorance, but in the nVidia Control Panel under the "Manage 3D settings" Vertical sync has Adaptive and Adaptive (half refresh rate) options.  I'm guessing that's not the same thing as what you're discussing now?

    As you guessed, that's something different.  Suppose that you have a 60 Hz monitor but your computer can only deliver 50 frames per second.  What should it do?

    Traditional vertical sync says: when you finish rendering a frame, stop for a bit to synchronize with when the monitor is next ready to display a frame.  That means that your computer will stop rendering much of the time, and you actually see 30 frames per second--though every frame will be displayed for the same amount of time.

    Triple buffering says, keep rendering frames all the time, and when it's time to display one, show the most recent completed frame.  If your system can render frames faster than your monitor can display them, then some of the frames never get shown, which is a waste of power.  This will get you 50 frames per second in the initial scenario, but some frames will be displayed twice as long as others, as the monitor has to refresh every 1/60 of a second, so it repeats the previous frame when it doesn't have one ready.

    Nvidia's adaptive v-sync says, don't stop rendering a frame unless you're rendering frames faster than the monitor can display them, in which case, you slow down to match the monitor refresh rate.  This is kind of a hybrid option between triple buffering and traditional vertical sync, combining the benefits of both.  If you're over the monitor refresh rate, it acts like traditional vertical sync, to ease the load on a GPU.  If you're under the monitor refresh rate, it acts like triple buffering, to get you the highest frame rate possible.

    The VESA standard adaptive sync says, let's not make the monitor refresh every 1/60 of a second.  Let's let it refresh whenever there is a new frame ready, up to some maximum rate.  So in the initial scenario, let's have the monitor refresh every 1/50 of a second.  This keeps your 50 frames per second, but unlike triple buffering or adaptive v-sync, every frame is displayed for 1/50 of a second, rather than some frames being up for 1/60 of a second and others for 1/30.  This allows smoother animations.

    Also important is that adaptive sync reduces display latency.  With any of the other options, once the monitor has a frame ready to display, it just sits on it for, on average, several milliseconds, waiting until the next time the monitor refreshes before doing anything with it.  Adaptive sync says, as soon as you have a new frame to display, refresh the monitor immediately rather than sitting and waiting.  It only waits if you're rendering frames at faster than the monitor's refresh rate, so that every refresh draws a new frame.
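
    To make the 50 frames per second on a 60 Hz monitor example concrete, here is a toy Python sketch of the cadence each option produces (not driver-accurate; Nvidia's adaptive v-sync isn't listed separately since, per the description above, below the refresh rate it behaves like the triple buffering case):

        # Toy timeline of the 50 fps / 60 Hz example above (times in ms at
        # which a *new* frame appears on screen).
        RENDER_MS = 1000 / 50     # GPU finishes a frame every 20 ms
        REFRESH_MS = 1000 / 60    # fixed-refresh monitor updates every ~16.7 ms

        def vsync(n=6):
            """Traditional v-sync: each frame waits for a refresh and the GPU
            stalls, so you effectively fall to 30 fps, with even pacing."""
            return [round(2 * REFRESH_MS * i, 1) for i in range(n)]

        def triple_buffering(n=6):
            """Triple buffering: the monitor refreshes every ~16.7 ms and shows
            the newest finished frame; some refreshes repeat a frame, so the
            pacing of new frames is uneven."""
            times, last = [], None
            i = 0
            while len(times) < n:
                t = REFRESH_MS * i
                newest = int(t // RENDER_MS)  # frame k finishes at k * RENDER_MS
                if newest != last:
                    times.append(round(t, 1))
                    last = newest
                i += 1
            return times

        def adaptive_sync(n=6):
            """VESA adaptive sync: the panel refreshes when each frame is ready,
            so every frame is shown for an even 20 ms with no extra waiting."""
            return [round(RENDER_MS * i, 1) for i in range(n)]

        print("v-sync        :", vsync())             # 30 fps, even pacing
        print("triple buffer :", triple_buffering())  # ~50 fps, uneven pacing
        print("adaptive sync :", adaptive_sync())     # 50 fps, even pacing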

    Nvidia's adaptive v-sync was a good thing, and arguably the most important proprietary video driver feature of the last several years.  (The usual hyped proprietary stuff like GPU PhysX, CUDA, and Mantle is just stupid.)  If Nvidia fanboys were smart, they'd have pointed to it as a reason to claim that Nvidia's video drivers were better, rather than making vague claims that my neighbor's dog had an AMD card once thirty years ago and it spontaneously combusted and gave him fleas.  But with the arrival of adaptive sync, adaptive v-sync is going to be obsolete very soon.

  • Ridelynn Member Epic Posts: 7,383


    Originally posted by Wizardry
    A perfect example is my monitor. I could look at the stats and say my monitor is crap, but I bought it after looking at all the monitors with my own eyes and it looked the best, and that is all that matters to me. I know ghosting is most certainly real; I have seen the results, and on screens that were supposed to be really good. So you always need to see for yourself and not rely on stat sheets or on what Tom's Hardware or any other tech site is saying. Go look for yourself, compare it to other screens on sale, and just look; don't read stat sheets.

    I find this to be an extremely relevant statement with regard to monitors, and probably the biggest reason I haven't jumped on the hype train for any of this just yet.

    Monitor specs are notorious for being completely fabricated. Two monitors retailed from different companies with the exact same panel could list widely different "specs" on the side of their box.

    And none of that accounts for biological differences in visual ability - eyesight, color perception, peripheral vision, acuity - which all vary a good deal from person to person, and often result in one person saying something is great while another says it looks horrible.

    With regard to a monitor, all that really matters is how it looks and performs to you. Maybe you can (or can't) tell 1080p from 4k, or 144Hz from 60Hz from 30Hz, or 24-bit color from 32-bit color, or 10ms latency from 2ms latency, but that doesn't mean that everyone sees it that way.

  • Ridelynn Member Epic Posts: 7,383


    Originally posted by Hrimnir
    So, a guy clarified for me on the anandtech forums.  DP 1.2 is capable of 32bit color at 144hz/1440p, here was his explanation, for anyone who cares:

    err, strike part of that; I was off by one DP generation (and 2x speed) mentally. DP 1.2 is able to fit both of those resolutions in a single stream. For high-bit-rate color, output levels are 30 bits (10 bits/channel), which does just squeeze in at 1440p/144 Hz. What is normally called 32-bit color on your GPU is 24-bit color plus an 8-bit alpha (transparency) channel. The alpha is only used in rendering and is not output to the monitor. I'm not sure if 30-bit color is bit-packed to 40 bits (to minimize memory consumption) when an alpha channel is needed, or expanded to 64 bits/pixel (memory accesses are easier to program and generally faster when aligned to the size of processor data words).


    There are machines driving 5K at 60 Hz, 32-bit color, over a single DisplayPort connection (the Retina 5K iMac).

    You are absolutely right about the alpha channel - thanks for posting that here for clarification.

    DP 1.2 supports up to 17.28 Gbit/s of video data in HBR2 (High Bit Rate 2) mode.

    Just because I wanted to see the numbers for myself, I ran the calculations through. I was mildly interested, and bored stuck at work.

    2560x1440 is "1440p" - which is 3,686,400 pixels
    32-bit color actually only has 24 bits dedicated to color, per pixel
    144 Hz is 144 refreshes per second
    2^30 = 1 Gigabit, just to get our units straight

    Putting it all together:
    (2560 x 1440 x 24 x 144) / (2^30) = 11.87 Gbits/s
    well within the standard of 17.28 Gbit/s

    In fact, if you reverse engineer that, you can find the max display resolution for any given color depth or refresh rate:
    z = SQRT( 17.28 * 2^30 / ( timing * colordepth * 16 * 9) )
    16z x 9z = Maximum screen resolution for given video bandwidth (16:9 aspect ratio)
    Solved for truecolor color depth:
    60Hz is 4786x2692
    144Hz is 3089x1737
    Now those aren't real resolutions, just theoretical maximums. And if we do actually get adaptive refresh technology working well - then the timing variable becomes a bigger part of that since it's no longer a static number.

    The 5K problem:
    5120x2880 resolution, 60Hz, 32-bit color
    (5120 x 2880 x 24 x 60) / (2^30) = 19.78 Gbit/s

    How did Apple get 5K to work over a single DP connection? They haven't said, although the rumor is that, since it's a proprietary system and they don't have to interface with any other standard, they just "overclocked" their DisplayPort hardware (they claim a "custom display interface").
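
    For anyone who wants to replay those numbers, here is the same arithmetic as a small Python sketch (it follows the 2^30-bits-per-Gbit convention used above; DP's 17.28 figure is usually quoted in decimal gigabits, but the conclusions come out the same either way):

        import math

        DP12_HBR2_GBIT = 17.28   # approximate DP 1.2 HBR2 video payload, as above

        def gbit_per_s(width, height, bits_per_pixel, hz):
            """Raw uncompressed video bandwidth, in 2^30-bit 'Gbit' units."""
            return width * height * bits_per_pixel * hz / 2**30

        def max_16x9(hz, bits_per_pixel, budget=DP12_HBR2_GBIT):
            """Largest 16:9 resolution that fits the budget at a given refresh
            rate and color depth (theoretical maximum; ignores blanking)."""
            z = math.sqrt(budget * 2**30 / (hz * bits_per_pixel * 16 * 9))
            return int(16 * z), int(9 * z)

        print(gbit_per_s(2560, 1440, 24, 144))  # ~11.87, fits in 17.28
        print(max_16x9(60, 24))                 # (4786, 2692)
        print(max_16x9(144, 24))                # (3089, 1737)
        print(gbit_per_s(5120, 2880, 24, 60))   # ~19.78, exceeds 17.28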

  • Quizzical Member Legendary Posts: 25,531
    Originally posted by Ridelynn
    *snip*

    It's also possible to make a 5K "monitor" that is really two monitors right next to each other with no bezel between them.  Some have done that, and in an all-in-one, the multiple monitor connections wouldn't mean multiple external cables.  I'm not claiming that Apple did that; I'm only saying that it's possible and some have done it.

  • Ridelynn Member Epic Posts: 7,383


    Originally posted by Quizzical
    It's also possible to make a 5K "monitor" that is really two monitors right next to each other with no bezel between them.  Some have done that, and in an all-in-one, the multiple monitor connections wouldn't mean multiple external cables.  I'm not claiming that Apple did that; I'm only saying that it's possible and some have done it.

    You are right - the Apple one is confirmed to be a single display with a single interface, not tiled displays. It's presumed to be the same panel as in the Dell UP2715K; that panel uses two DP connections to drive 5K @ 60 Hz, so maybe that's all Apple is doing internally as well.
