Ugh, why Acer why... (New Gaming Monitor News)

Hrimnir Member RarePosts: 2,415

Why do I have the feeling I'm going to end up spending $1000-1500 in March 2015...

http://www.anandtech.com/show/8812/acer-announces-two-gaming-monitors-at-ces-2015

"The first monitor is the Acer XB270HU, which, according to the Acer press release, is the world’s first IPS monitor with G-SYNC capability. It is a 27” 2560x1440 resolution IPS panel, with a maximum refresh rate of 144 Hz. This will give much better viewing angles (up to 178°) and generally a better color accuracy as well, although that will have to be tested. The XB270HU also comes on a height adjustable stand which offers tilt and swivel. The specifics of the panel are not mentioned, so at this time we cannot say whether it is a 6 or 8 bit panel. Availability is March 2015."

"The surest way to corrupt a youth is to instruct him to hold in higher esteem those who think alike than those who think differently."

- Friedrich Nietzsche


Comments

  • Quizzical Member LegendaryPosts: 25,531

    Unless adaptive sync is a complete fraud, it will launch just in time to be obsolete.  After all, do you really want something that works properly only with an Nvidia GPU, or do you want something that works with everything for $100 cheaper?

    There's no way to deliver 32-bit color at 144 Hz and 2560x1440 resolution on a single DisplayPort 1.2a connection, let alone something lower bandwidth like HDMI.  That means using lower color depth, making it into multiple monitors with no bezel between them, or using DisplayPort 1.3 and requiring the use of a video card that doesn't exist yet.
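    If you want to sanity-check that bandwidth claim, here's a rough back-of-the-envelope sketch (my own numbers and assumptions, not from the article; DP 1.2 link rates and the blanking overhead are approximated):

    ```python
    # Does 2560x1440 @ 144 Hz fit on one DisplayPort 1.2a link at 24-bit vs 32-bit color?
    # Assumes DP 1.2 HBR2 (4 lanes x 5.4 Gbps, 8b/10b coding) and ~9% reduced-blanking overhead.

    DP12_EFFECTIVE_GBPS = 4 * 5.4 * (8 / 10)  # = 17.28 Gbps usable

    def link_demand_gbps(h, v, hz, bits_per_pixel, blanking_overhead=0.09):
        pixels_per_second = h * v * hz * (1 + blanking_overhead)
        return pixels_per_second * bits_per_pixel / 1e9

    for bpp in (24, 32):
        need = link_demand_gbps(2560, 1440, 144, bpp)
        verdict = "fits" if need <= DP12_EFFECTIVE_GBPS else "does NOT fit"
        print(f"{bpp} bpp: ~{need:.1f} Gbps needed vs {DP12_EFFECTIVE_GBPS:.2f} Gbps -> {verdict}")
    ```

    (24 bpp fits at roughly 13.9 Gbps; 32 bpp needs about 18.5 Gbps, which is over the 17.28 Gbps limit, hence the color depth question.)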

  • JayFiveAlive Member UncommonPosts: 601
    Seems like a lot of money for the hertz :P Is it really worth it? I like my Korean 27" AH-IPS 1440p monitor that was $350 a year and a half ago. For that much money, I would personally wait for a 30+ inch 4K IPS, but that's me heh.
  • Zarf42 Member Posts: 250
    Originally posted by Quizzical

    Unless adaptive sync is a complete fraud, it will launch just in time to be obsolete.  After all, do you really want something that works properly only with an Nvidia GPU, or do you want something that works with everything for $100 cheaper?

    There's no way to deliver 32-bit color at 144 Hz and 2560x1440 resolution on a single DisplayPort 1.2a connection, let alone something lower bandwidth like HDMI.  That means using lower color depth, making it into multiple monitors with no bezel between them, or using DisplayPort 1.3 and requiring the use of a video card that doesn't exist yet.

    If you only use Nvidia GPUs, then yes. The second monitor announced has a red bezel, which looks terrible and would be distracting IMO.

  • Belgaraath Member UncommonPosts: 3,205
    Acer is HORRIBLE with supply and demand. Their last monitor has been difficult to find for months now unless you purchase an entire new system to get it OR pay more than the list price on Amazon. I'm waiting patiently for Nvidia to release cards that will easily support 4K at 60fps without SLI and still play on high settings, and it looks as if we are still quite a ways off.

    There Is Always Hope!

  • Hrimnir Member RarePosts: 2,415
    Originally posted by Quizzical

    Unless adaptive sync is a complete fraud, it will launch just in time to be obsolete.  After all, do you really want something that works properly only with an Nvidia GPU, or do you want something that works with everything for $100 cheaper?

    There's no way to deliver 32-bit color at 144 Hz and 2560x1440 resolution on a single DisplayPort 1.2a connection, let alone something lower bandwidth like HDMI.  That means using lower color depth, making it into multiple monitors with no bezel between them, or using DisplayPort 1.3 and requiring the use of a video card that doesn't exist yet.

    Considering the only way I will EVER buy an AMD GPU at this point is if Nvidia no longer exists as a company... No, I'm not particularly worried about it.

    Edit: Also, I forgot to mention, there is a benefit to G-Sync that adaptive sync doesn't offer: primarily removing the input lag, which on IPS monitors can be very significant.

    "The surest way to corrupt a youth is to instruct him to hold in higher esteem those who think alike than those who think differently."

    - Friedrich Nietzsche

  • Hrimnir Member RarePosts: 2,415
    Originally posted by keithian
    Acer is HORRIBLE with supply and demand. Their last monitor has been difficult to find for months now unless you purchase an entire new system to get it OR if you pay more than the list price on amazon. I'm waiting patiently for Nvidia to release cards that will easily support 4K at 60fps without SLI and still play on high settings and it looks as if we are quite a ways off still. 

    I honestly don't get the 4K thing. You have to be using either a stupidly large monitor or sitting incredibly close to your monitor to be able to see jaggies at 1440p. 4K is COMPLETE overkill for the screen sizes we normally use on PCs. It's very useful when you have a 40-foot diagonal screen in a theater, but outside of some applications in home theater, it's just stupid and wasteful.
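    For what it's worth, here's the back-of-the-envelope pixel-density math I'm basing that on (my own rough numbers, using the common ~1 arcminute / 20/20-vision rule of thumb; real-world aliasing perception obviously varies):

    ```python
    import math

    # How close do you have to sit to a 27" panel before individual pixels become
    # resolvable, assuming ~1 arcminute of visual acuity per pixel?
    def ppi(h_px, v_px, diagonal_in):
        return math.hypot(h_px, v_px) / diagonal_in

    def resolvable_distance_in(ppi_value):
        pixel_pitch_in = 1 / ppi_value
        return pixel_pitch_in / math.tan(math.radians(1 / 60))   # 1 arcminute

    for name, res in [("27in 1440p", (2560, 1440)), ("27in 4K", (3840, 2160))]:
        density = ppi(*res, 27)
        d = resolvable_distance_in(density)
        print(f"{name}: {density:.0f} PPI, pixels visible only closer than ~{d:.0f} in ({d * 2.54:.0f} cm)")
    ```

    In other words, sitting a normal 2.5+ feet from a 27" 1440p screen you're already near the limit of what the eye resolves, which is why 4K feels like overkill to me at these sizes.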

    Edit: Unfortunately I forgot to mention: Acer *may* not be at fault for this. From what I understand there are only 2 G-Sync monitors available at all, and people have been buying them in bulk to resell on places like Amazon or eBay above the original MSRP, because people (dumbasses, IMO) are willing to pay through the nose for them.

    "The surest way to corrupt a youth is to instruct him to hold in higher esteem those who think alike than those who think differently."

    - Friedrich Nietzsche

  • Quizzical Member LegendaryPosts: 25,531
    Originally posted by Zarf42
    Originally posted by Quizzical

    Unless adaptive sync is a complete fraud, it will launch just in time to be obsolete.  After all, do you really want something that works properly only with an Nvidia GPU, or do you want something that works with everything for $100 cheaper?

    There's no way to deliver 32-bit color at 144 Hz and 2560x1440 resolution on a single DisplayPort 1.2a connection, let alone something lower bandwidth like HDMI.  That means using lower color depth, making it into multiple monitors with no bezel between them, or using DisplayPort 1.3 and requiring the use of a video card that doesn't exist yet.

    If you only use Nvidia GPUs, then yes. The second monitor announced, has a red bezel, which looks terrible and would be distracting IMO.

    I can understand not caring if it works with AMD, if you're an Nvidia fanboy.  But why pay ~$100 extra for the privilege of having a monitor that won't work properly with an AMD video card?  An adaptive sync version of the same monitor would cost Acer $100 less to build without losing any feature support, and you can bet that the difference would get passed on to the consumer.

    Because adaptive sync isn't available in commercial products just yet, I suppose it's still possible that the whole thing is a complete fraud and there will be lawsuits and firings and so forth.  But I don't regard that as terribly likely.

    -----

    There's also the issue that industry standards tend to be supported forever, while proprietary things that are obviously no better than an industry standard version don't necessarily have driver support that lasts as long.  How long did Nvidia keep offering driver support for Glide, again?  They still support DirectX and OpenGL today.

    If you buy an Nvidia card today, it will support G-sync, though not necessarily with the bandwidth needed for that monitor.  But what if you buy a new Nvidia card in five years, and no one has cared about G-sync in four years by then?  Will the new card still support it?  How about ten years?  Monitors don't last forever, but they do tend to last longer than video cards.

  • Quizzical Member LegendaryPosts: 25,531
    Originally posted by Hrimnir

    Edit: Also, I forgot to mention, there is a benefit to G-Sync that adaptive sync doesn't offer: primarily removing the input lag, which on IPS monitors can be very significant.

    It's not possible to eliminate display latency entirely.  G-sync will reduce it, yes, but adaptive sync should likewise reduce it in the same way for the same reasons.  And the place where those can reduce display latency is irrelevant to IPS versus TN.  Both can eliminate the "we've got a full frame on the monitor ready to display but we're going to just sit on it until the next scheduled frame time" latency, but neither can affect IPS taking longer than TN to switch pixels from one color to another.
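    If it helps to see that scheduling wait in numbers, here's a tiny toy model (my own made-up frame times, nothing measured):

    ```python
    import math, random

    # Toy model: with a fixed 60 Hz refresh, a finished frame sits in the buffer
    # until the next scheduled scan-out; with variable refresh (G-Sync / adaptive
    # sync) that scheduling wait is ~0. Pixel response time (the IPS vs TN part)
    # is a separate effect and isn't modeled here.
    PERIOD_MS = 1000 / 60
    random.seed(0)

    t, waits = 0.0, []
    for _ in range(1000):
        t += random.uniform(10, 20)                        # hypothetical render time per frame
        next_refresh = math.ceil(t / PERIOD_MS) * PERIOD_MS
        waits.append(next_refresh - t)                     # how long the finished frame waits

    print(f"fixed 60 Hz: average wait {sum(waits)/len(waits):.1f} ms, worst {max(waits):.1f} ms; "
          f"variable refresh: ~0 ms scheduling wait")
    ```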

  • Zarf42 Member Posts: 250
    I've only ever purchased Nvidia GPUs since the '90s. I guess I've never had a reason not to. Does that make me a fanboy? Does G-Sync make that much of a difference?
  • Ridelynn Member EpicPosts: 7,383

    I have been waiting for IPS panels to hit the higher-end gaming scene, as I'm not a twitch gamer and I prefer the better color and image of IPS panels to the faster response of TN.

    I don't know if this particular monitor will be the one I was waiting for though.

  • Hrimnir Member RarePosts: 2,415
    Originally posted by Quizzical
    Originally posted by Zarf42
    Originally posted by Quizzical

    *snip*

    *snip*

    I can understand not caring if it works with AMD, if you're an Nvidia fanboy.  But why pay ~$100 extra for the privilege of having a monitor that won't work properly with an AMD video card?  If there's an adaptive sync version of the same monitor, that costs $100 less for Acer to build without losing any feature support, and you can bet that the difference gets passed on to the consumer.

    Because adaptive sync isn't available in commercial products just yet, I suppose it's still possible that the whole thing is a complete fraud and there will be lawsuits and firings and so forth.  But I don't regard that as terribly likely.

    -----

    There's also the issue that industry standards tend to be supported forever, while proprietary things that are obviously no better than an industry standard version don't necessarily have driver support last as long.  How long did Nvidia keep offering driver support for GLide, again?  They still support DirectX and OpenGL today.

    If you buy an Nvidia card today, it will support G-sync, though not necessarily with the bandwidth needed for that monitor.  But what if you buy a new Nvidia card in five years, and no one has cared about G-sync in four years by then?  Will the new card still support it?  How about ten years?  Monitors don't last forever, but they do tend to last longer than video cards.

    Well, you hit the nail on the head: "If there's an adaptive sync version of the same monitor."

    Obviously I'll re-evaluate things once that actually exists. But for the time being this is the only thing on the horizon that ticks all the checkboxes, and I do mean all of them.

    "The surest way to corrupt a youth is to instruct him to hold in higher esteem those who think alike than those who think differently."

    - Friedrich Nietzsche

  • Ridelynn Member EpicPosts: 7,383


    Originally posted by Zarf42
    I've only ever purchased Nvidia GPUs, since the 90s. I guess I've never had a reason not to. Does that make me a fanboy? Does G-sync make that much of a difference?

    G-Sync and Freesync are essentially identical.

    One is nVidia's brand: proprietary, and it requires additional hardware installed in the monitor (as well as a supported nVidia video card). The other just requires using DisplayPort with a compatible monitor; AMD has made the standard open and royalty-free, and nVidia hasn't commented on whether it will support it via its drivers. AMD is supporting Freesync with pretty much every GCN-based video card with a DisplayPort output, and possibly some earlier than that.

    Both essentially do the same thing though:

    A traditional monitor runs at a set refresh rate: usually 60Hz, but not always. Your video card produces frames independent of that rate; if they don't sync up well (syncing them is what Vsync tries to do), you can get tearing or stuttering or a host of other problems. Enabling Vsync to fix those problems artificially limits your video card to a lower frame rate, and if it can't hit those frame rates exactly, it will produce a huge additional performance hit, which can cut your effective FPS in half or more.

    G-Sync/Freesync allow the video card to drive the refresh rate, and for that refresh rate to be variable: when a frame is ready, it's updated on the monitor. It results in a much more fluid visual experience on the monitor, without huge drastic dips in framerate associated with refresh rate or VSync.
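    A rough worked example of that halving effect (my own illustrative numbers, simple double-buffered model):

    ```python
    import math

    # With Vsync on a 60 Hz panel, a finished frame is held until the next refresh
    # boundary, so each frame occupies a whole number of 16.7 ms intervals. With
    # variable refresh the panel just updates when the frame is ready (within its range).
    PERIOD_MS = 1000 / 60

    for render_ms in (15.0, 17.0, 25.0):                  # hypothetical GPU frame times
        intervals = math.ceil(render_ms / PERIOD_MS)      # refreshes each frame occupies
        vsync_fps = 60 / intervals
        adaptive_fps = 1000 / render_ms
        print(f"{render_ms:.0f} ms/frame: Vsync {vsync_fps:.0f} FPS, "
              f"variable refresh ~{adaptive_fps:.0f} FPS")
    ```

    The 17 ms case is the nasty one: the card is producing 59 frames a second, but Vsync shows you 30.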

    I haven't seen either of these work in person, but I have seen the effects of tearing and Vsync - and I know if the technology works as advertised and they do go away I'd be a lot happier.

    As far as if using nVidia makes you a fanboy or not - it's your money, buy what you want. I'd only consider you a fanboy if you were pushing other people to spend their money based on your preferences.

  • Ridelynn Member EpicPosts: 7,383

    I have been modestly interested in this one (I run a side-by-side monitor setup now):

    LG 21:9 Freesync monitor

  • Cryptor Member UncommonPosts: 523
    Only $1500 for a monitor that will make my game look exactly the same as my $300 monitor. Oh boy, what a bargain, you should get three!
  • Kilrane Member UncommonPosts: 322

    Please, correct me if I'm wrong, but aren't the Freesync monitors only made to work on specific AMD video cards? If this is true, what's the difference between Freesync and G-Sync, outside of the fact that Freesync is based on an open standard while G-Sync is a closed standard?

     

    To me it's no different if the reasoning is that you can't use a G-Sync monitor on non-Nvidia hardware in the future.

     

    I did a quick Google search and I didn't see anything pop up as far as adaptive sync goes, other than the Freesync monitors that will be hitting the market soon.

     

    For what it's worth, I do run an Nvidia card and I have been very interested in obtaining a G-Sync monitor. However, I'm holding out till it looks like long-term support is all but guaranteed. Plus, I think the prices are a bit too high for my wallet for the time being. If there are going to be monitors that will do exactly what G-Sync does, for less, and support my current video card (GTX 970), I'll keep an eye out for them.

  • Ridelynn Member EpicPosts: 7,383


    Originally posted by Kilrane
    Please, correct me if I'm wrong, but aren't the Freesync monitors only made to work on specific AMD video cards? if this is true, whats the difference between Freesync and G-Sync, outside of the fact that Freesync is based off of an open standard while G-Sync is a closed standard?


    Freesync is designed to work via the DisplayPort standard - so theoretically, any GPU that supports the updated DisplayPort standards can work with Freesync, and monitor manufacturers don't have to include extra hardware, and video card manufacturers don't have to pay AMD licensing fees for official driver support.

    That being said, any other video card manufacturer could also work with G-Sync, the difference being that the monitor manufacturer has to buy the G-Sync module from nVidia, and the video card company would likely have to pay licensing fees in order to include driver support.

    So GSync right now is exclusive to nVidia, but it wouldn't necessarily have to be that way if Intel or AMD wanted to license it. Freesync is developed by AMD, but has explicitly been made an open standard, and the only thing preventing Intel/nVidia from supporting it as well is for them to include support with their driver.

  • breadm1x Member UncommonPosts: 374

    144Hz refresh?

    All it's missing is the "3D" label and it will sell like hotcakes.

    Anyway, it's not an IPS panel; it's called AHVA.

    http://www.tftcentral.co.uk/news_archive/31.htm#144hz_ips


  • Hrimnir Member RarePosts: 2,415
    Originally posted by breadm1x

    144hz refresh ?

    all its missing is the "3D" label and it will sell like hotcakes.

    Anyways its not an IPS panel its called AHVA

    http://www.tftcentral.co.uk/news_archive/31.htm#144hz_ips

    AHVA is a marketing term from AUO; the screen uses In-Plane Switching (IPS).

    http://www.tftcentral.co.uk/articles/content/panel_technologies_content.htm#ahva

    "Again while not strictly an IPS panel variant, we have left AHVA in this section as it is designed as an "IPS-mode" technology. Introduced first at the end of 2012 this technology is designed by AU Optronics as another alternative to IPS. Confusingly the AHVA name makes it sound like it's a VA-type panel, which AU Optronics have been manufacturing for many years. It should not be confused with AMVA which is their current "true" VA technology produced. To date there has only been one AHVA panel produced, a 27" 2560 x 1440 resolution module which has been used in only one screen. The BenQ BL2710PT featured this new technology and gave us some insight into the performance characteristics of AHVA.

    Response time specs reach as low as 4ms G2G on paper but in reality the matrix does not perform any better than the faster IPS or PLS panels. Contrast ratios can reach up to the advertised 1000:1 and viewing angles are also very comparable to IPS. There is no off-centre contrast shift like you see on normal VA panels, but a pale glow is visible on dark content from an angle like with IPS/PLS. There is no support for higher refresh rates than 60Hz at this time. All in all AHVA does seem to be very comparable to IPS in practice. It remains to be seen whether AU Optronics will continue to invest in this technology much and whether other sized panels will emerge."

    "The surest way to corrupt a youth is to instruct him to hold in higher esteem those who think alike than those who think differently."

    - Friedrich Nietzsche

  • Kilrane Member UncommonPosts: 322
    Originally posted by Ridelynn

     


    Originally posted by Kilrane
    Please, correct me if I'm wrong, but aren't the Freesync monitors only made to work on specific AMD video cards? if this is true, whats the difference between Freesync and G-Sync, outside of the fact that Freesync is based off of an open standard while G-Sync is a closed standard?

     


    Freesync is designed to work via the DisplayPort standard - so theoretically, any GPU that supports the updated DisplayPort standards can work with Freesync, and monitor manufacturers don't have to include extra hardware, and video card manufacturers don't have to pay AMD licensing fees for official driver support.

    That being said, any other video card manufacturer could also work with G-Sync, the difference being that the monitor manufacturer has to buy the G-Sync module from nVidia, and the video card company would likely have to pay licensing fees in order to include driver support.

    So GSync right now is exclusive to nVidia, but it wouldn't necessarily have to be that way if Intel or AMD wanted to license it. Freesync is developed by AMD, but has explicitly been made an open standard, and the only thing preventing Intel/nVidia from supporting it as well is for them to include support with their driver.

    AMD's FAQ about Freesync states otherwise. 

     

    "To take advantage of the benefits of Project FreeSync, users will require: a monitor compatible with DisplayPort Adaptive-Sync, a compatible AMD Radeon™? GPU with a DisplayPort connection, and a compatible AMD Catalyst™ graphics driver. AMD plans to release a compatible graphics driver to coincide with the introduction of the first DisplayPort Adaptive-Sync monitors."

     

    http://support.amd.com/en-us/search/faq/216

  • Cleffy Member RarePosts: 6,414
    It's an LG panel. They are probably gonna sell it for less than the 27" LG, since LG commonly sells their monitors for more than their partners do.
  • Ridelynn Member EpicPosts: 7,383


    Originally posted by Kilrane
    AMD's FAQ about Freesync states otherwise.  "To take advantage of the benefits of Project FreeSync, users will require: a monitor compatible with DisplayPort Adaptive-Sync, a compatible AMD Radeon™ GPU with a DisplayPort connection, and a compatible AMD Catalyst™ graphics driver. AMD plans to release a compatible graphics driver to coincide with the introduction of the first DisplayPort Adaptive-Sync monitors." http://support.amd.com/en-us/search/faq/216

    Of course it says that on AMD's web page.

    The fact that it's been adopted by VESA as a royalty-free standard for DisplayPort 1.2a for use by any digital video output device doesn't mean anything, I suppose.

    So.. if you have any graphics card with DisplayPort 1.2a (or later, when 1.3 and such get released), you technically have everything you need for Freesync. It's just a matter of if the vendor decides to include the feature in their driver package or not.

    Now I suppose Intel and nVidia could choose to consciously forgo DisplayPort from here forward (HDMI does work nearly as well, after all), or stubbornly refuse to support any revision beyond 1.2, but I don't know why they would, since you do have to pay royalties for HDMI (payable to HDMI Forum, Inc., whose members include Sony, Hitachi, Philips, Matsushita, RCA, and a few others).

    So hmm... I guess you can read into AMD's FAQ what you want. You aren't incorrect in that right now only AMD has announced support for Freesync, and only nVidia has announced support for GSync.

    But the only thing preventing anyone else from supporting Freesync is adding it to a driver, whereas the only thing preventing GSync is a hardware module in a monitor and paying nVidia some amount of cash, as well as adding it to your driver.

    So I'm with Quiz on this one: You have two competing standards that do essentially the same thing. One is proprietary and costs a lot of money. The other is free and open. There are examples where proprietary and costly wins out, but only when it has a clear and obvious technical advantage, and that has yet to be seen with Adaptive Sync technologies; 2015 will be the telling year.

    I'd put my money on Freesync, and nVidia adding it to their driver package sooner or later. That said, if you already have an nVidia card, GSync is available now and you don't lose anything by getting it, as you already have the supported hardware.

  • Kilrane Member UncommonPosts: 322
    I apologize if I came off as argumentative; that was not my intention. First off, you and Quizz know a lot more than I do; I only posted the results I was able to dig up. I honestly like the cheaper route and I have no intention of buying a G-Sync monitor anytime soon. I was just confused because, from what I was able to gather with my google-fu skills, I was under the impression that AMD's solution was only going to be directly supported by AMD's GPU line.
  • Braindome Member UncommonPosts: 959
    Gaming monitor huh? Bet you can't even play Duck Hunt w/lightgun on that thing. :p
  • Loke666 Member EpicPosts: 21,441

    Well, my one-year-old Dell screen doesn't have the refresh rate, but it cost me 500 Euro back then, same size and same resolution.

    So while this is a nice screen, it won't be worth the price.

  • Quizzical Member LegendaryPosts: 25,531
    Originally posted by Kilrane

    Please, correct me if I'm wrong, but aren't the Freesync monitors only made to work on specific AMD video cards? if this is true, whats the difference between Freesync and G-Sync, outside of the fact that Freesync is based off of an open standard while G-Sync is a closed standard?

     

    To me it's no different if the reasoning is that you can't use a G-Sync monitor on non-Nvidia hardware in the future.

     

    I did a quick google search and I didn't see anything pop up as far as adaptive sync other than the Freesync monitors that will be hitting the market soon.

     

    For what it's worth, I do run an Nvidia card and I have been very interested in obtaining a G-Sync monitor. However, I'm holding out till it looks like long term support is all but guaranteed. Plus, I think the prices are a bit too high for my wallet for the time being. If there are going to be monitors that will do exactly what G-Sync does, for less, and supports my current video card (GTX 970) I'll keep an eye out for them.

    FreeSync is what AMD calls their implementation of the industry standard adaptive sync.  If a monitor supports adaptive sync, then it supports FreeSync--but it can also do exactly the same thing on a GPU built by any other vendor willing to support the industry standard.  Nvidia hasn't officially announced that they'll support adaptive sync just yet, but I'll be very, very surprised if they don't.  It would be suicidal not to, as you'd be telling people considering buying video cards that they have to pay an extra $100 per monitor to get the same thing that AMD offers without the $100 surcharge.

    G-sync is proprietary to Nvidia, and no one else can support it in their drivers even if they want to.

    I'd regard it as likely that all DisplayPort monitors that launch after some date in the future support adaptive sync.  Once you figure out how to make it work, supporting it is free.  G-sync, however, requires a monitor vendor to buy $100 in proprietary hardware from Nvidia.  You don't want to do that if you can get the same thing for free.

    I wouldn't be terribly surprised if Nvidia decides to overload the term G-sync and use the name for their own implementation of adaptive sync.
