Checklist for my next computer

Quizzical Member Legendary Posts: 25,499

The ideal time to buy a new computer is right after a bunch of important product launches, not right before.  Obviously, if your old computer dies prematurely and you need something new, you buy something new rather than going without a computer for a year.  But if you don't have such an emergency situation, it's useful to be able to time your purchases.  And a good time to buy a new computer is probably coming soonish.

For my next computer, I want:

1)  A Radeon R9 Fury X,

2)  Monitors with a resolution higher than 1080p, a refresh rate of at least 120 Hz, an IPS panel, adaptive sync support, and the ability to pivot into portrait orientation rather than landscape,

3)  A Skylake quad core CPU, and

4)  Windows 10.

On (1), a GeForce GTX 980 Ti could be substituted in if Nvidia decides to support adaptive sync.  At the moment, it doesn't look like that will happen in the near future, though I do expect them to support it eventually because it would be suicidal not to.  Regardless, with both AMD and Nvidia having recently launched new, high end video cards, it's not a bad time to buy one there.

On (2), we have the recent release of this:

http://www.newegg.com/Product/Product.aspx?Item=N82E16824236466

I'm not saying that I'm definitely going to get that monitor.  But it checks all the boxes, and I'm not aware of any other monitor on the market that does everything I want.

On (3), Internet rumors say August.  I don't know exactly how good Skylake will be, but it's very plausible that it will push desktop CPU performance forward more than any generation since Sandy Bridge, which brought a big jump at the start of 2011.

On (4), we're looking at the end of July.  Yes, yes, Microsoft offers a free upgrade from Windows 7 or 8.1, but it's better to do a clean install of the OS you want than to install some other OS and hope that an OS upgrade process goes smoothly.

So I'm probably looking at an August timeframe for my next computer.  Most other components are mature markets now; we're not going to see some big jump in power supply or memory performance from a revolutionary new product.  In memory, that's especially the case if you have DDR4, which Skylake presumably will use.

Coming immediately after major launches on all of the components where major launches bring big jumps is the perfect time to buy a new computer.  And that's what we're probably going to be looking at in August.

I haven't yet mentioned SSDs, and those are going to have some transitions soon with movement to PCI Express, TLC NAND, and 3D NAND.  There are already some of each on the market, but they're all going to become a lot more common.  But I'm not keen on TLC NAND at all, and I'm not willing to pay a large price premium for the others, as they don't matter much.  3D NAND might eventually bring large increases in capacity for a given price tag, but that could easily be years away.


Comments

  • Quizzical Member Legendary Posts: 25,499

    Modern computers have a bunch of PCI Express lanes of different sorts.  For example, a Haswell based system with a Z97 chipset has a PCI Express 3.0 x16 connection that can be split into two x8 or an x8/x4/x4 configuration.  It also has 8 PCI Express 2.0 lanes from the chipset, probably with an option for an x4 connection and several x1, though I'm not sure how it is done.  Which PCI Express connections from a chipset go to which motherboard slots (or are used to offer extra SATA or USB ports, etc.) is dependent on the motherboard.  You're always going to have to use some PCI Express lane(s) to attach a PCI Express device, but it's a question of how it's set up and finding a motherboard that does it the way you want it.
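
    As a rough sketch of that lane budget (how the lanes map to physical slots and onboard controllers is up to the motherboard vendor, so treat this as illustrative rather than a spec):

        # Rough sketch of the Haswell + Z97 lane budget described above; the exact
        # slot wiring and what the chipset lanes get spent on varies by motherboard.
        cpu_pcie3_splits = [
            (16,),       # one x16 slot gets everything
            (8, 8),      # two-way split for two cards
            (8, 4, 4),   # three-way split
        ]
        chipset_pcie2_lanes = 8  # PCI Express 2.0 lanes hanging off the Z97 chipset

        for split in cpu_pcie3_splits:
            assert sum(split) == 16  # the CPU's 16 lanes just get divided up
            print("CPU PCIe 3.0 split:", " / ".join("x%d" % n for n in split))
        print("Chipset PCIe 2.0 lanes:", chipset_pcie2_lanes)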

    My plan is to get three monitors in portrait mode for an enormous combined resolution, with plenty of vertical screen space, which is usually the limiting factor anyway.  Three monitors in landscape mode don't add as much vertical space, and the stuff way off to the far sides wouldn't be that useful anyway.
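
    Rough arithmetic on that, assuming 2560x1440 panels like the ASUS monitor linked in the opening post (the panel resolution is my assumption here):

        # Three 2560x1440 panels, pivoted to portrait and placed side by side.
        panel_w, panel_h = 2560, 1440

        portrait_combined = (3 * panel_h, panel_w)   # 4320 wide x 2560 tall
        landscape_combined = (3 * panel_w, panel_h)  # 7680 wide x 1440 tall

        print("3x portrait: ", portrait_combined)
        print("3x landscape:", landscape_combined)
        # Same total pixel count either way, but portrait gives 2560 pixels of
        # vertical space instead of 1440.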

    It's likely that eventually most SSDs are going to use TLC 3D NAND like the Samsung 850 Evo does, but we're not there yet.  And considering how much trouble Samsung has had trying to get it to work right, I'd say it's still immature, so I'll just avoid it for now.  I don't think it's an accident that the rest of the industry avoided TLC NAND in SSDs for about two years after Samsung started using it, nor that Samsung had plenty of problems with it in that time.

    In the meantime, it's not like it's unduly expensive to get a reasonable amount of MLC NAND.  For example:

    http://www.newegg.com/Product/Product.aspx?Item=N82E16820226689

  • Entropy14 Member Uncommon Posts: 675

    I will be buying a new computer in August as well; my system is 5 or 6 years old now, so it's time for a complete overhaul.

     

    Skylake does look like it will be a pretty nice chip.  The only thing I am on the fence about is the video card.  Do I buy a good high end one now and make it last a few years, or do I buy a $200ish card and wait for 2016 to buy a high end one, since we are at the end of the cycle for these video cards?  I really want the 16nm video cards with all the new bells and whistles; they should be a huge step up over these very outdated 28nm cards.

     

     

  • Nephelai Member Uncommon Posts: 185

    Just replaced my dead 780 Ti with a 980 Ti and finished in the > 96% range of 3DMark. My old 2700K @ 4.8 GHz on air is still kicking butt.

    Can't see any reason to upgrade a CPU until Intel ups the herbz - I couldn't care less about green improvements.

    Fury X - how could you even consider that after the reviews? They had to slap water cooling on it to get it to within 5% of a 980 Ti.

    Imagine a water cooled 980 Ti.

     

     

  • Nitth Member Uncommon Posts: 3,904


    Originally posted by Quizzical
    My plan is to get three monitors in portrait mode for an enormous resolution that way, with plenty of vertical screen space.

    Good luck trying to keep your sanity with those AMD drivers.

    TSW - AoC - Aion - WOW - EVE - Fallen Earth - Co - Rift - || XNA C# Java Development

  • wanderica Member Uncommon Posts: 371
    Originally posted by Entropy14

    I will be buying a new computer in August as well; my system is 5 or 6 years old now, so it's time for a complete overhaul.

    Skylake does look like it will be a pretty nice chip.  The only thing I am on the fence about is the video card.  Do I buy a good high end one now and make it last a few years, or do I buy a $200ish card and wait for 2016 to buy a high end one, since we are at the end of the cycle for these video cards?  I really want the 16nm video cards with all the new bells and whistles; they should be a huge step up over these very outdated 28nm cards.

     

     

    That's a tough choice.  With the 14/16 nm shrinks and DX12 right around the corner, I would be tempted, now more than ever, to wait.  However, with 28 nm having run its course, so to speak, the leap in current gen technology has really pushed what these cards can do.  The 980s and the Fury X (once air-cooled versions are released and the drivers are worked out) are great cards.  It would likely be a long time before you could no longer play at the settings you desire.  My suggestion would be to grab a 290 or 970 now and upgrade soon after the next gen comes out.

     

    OP may not have that luxury.  Custom resolutions, like we see with 3 monitor setups, can really test a single card.  Keeping up with the Joneses gets harder and more expensive once you get into these high end multi-monitor setups.  Man, are they sweet to look at, though.


  • Quizzical Member Legendary Posts: 25,499
    Originally posted by Nephelai

    Just replaced my dead 780 Ti with a 980 Ti and finished in the > 96% range of 3DMark. My old 2700K @ 4.8 GHz on air is still kicking butt.

    Can't see any reason to upgrade a CPU until Intel ups the herbz - I couldn't care less about green improvements.

    Fury X - how could you even consider that after the reviews? They had to slap water cooling on it to get it to within 5% of a 980 Ti.

    Imagine a water cooled 980 Ti.

    You say that as though liquid cooling is a bad thing.  It ordinarily adds about $100 or so to the price of a card, which is the main reason people usually go with air cooling.

    But if you want to know why the Fury X, try reading my posts.  I'm planning on getting three monitors at the same time.  G-sync adds about $150 to the price of a monitor over adaptive sync.  So if I go with Nvidia, three monitors at an extra $150 each adds about $450 to the price of the computer.

    Is a GeForce GTX 980 Ti a better card than a Radeon R9 Fury X?  Perhaps, though it's close.  Is it $450 worth of better?  Most certainly not.  If Nvidia still doesn't support adaptive sync when it's time to buy, getting the Fury X will be an easy call.  If they do, then I'll have something to think about.

  • Quizzical Member Legendary Posts: 25,499
    Originally posted by Nitth

     


    Originally posted by Quizzical
    My plan is to get three monitors in portrait mode for an enormous resolution that way, with plenty of vertical screen space.


     

    Good luck trying to keep your sanity with those AMD drivers.

    AMD drivers are about as good as Nvidia drivers and have been for several years.  I've got a Radeon HD 5850 now and it hasn't had much in the way of driver problems.

  • Quizzical Member Legendary Posts: 25,499
    Originally posted by wanderica
    Originally posted by Entropy14

    I will be buying a new computer in August as well; my system is 5 or 6 years old now, so it's time for a complete overhaul.

    Skylake does look like it will be a pretty nice chip.  The only thing I am on the fence about is the video card.  Do I buy a good high end one now and make it last a few years, or do I buy a $200ish card and wait for 2016 to buy a high end one, since we are at the end of the cycle for these video cards?  I really want the 16nm video cards with all the new bells and whistles; they should be a huge step up over these very outdated 28nm cards.

     

     

    That's a tough choice.  With the 14/16 nm shrinks and DX12 right around the corner, I would be tempted, now more than ever, to wait.  However, with 28 nm having run its course, so to speak, the leap in current gen technology has really pushed what these cards can do.  The 980s and the Fury X (once air-cooled versions are released and the drivers are worked out) are great cards.  It would likely be a long time before you could no longer play at the settings you desire.  My suggestion would be to grab a 290 or 970 now and upgrade soon after the next gen comes out.

    OP may not have that luxury.  Custom resolutions, like we see with 3 monitor setups, can really test a single card.  Keeping up with the Joneses gets harder and more expensive once you get into these high end multi-monitor setups.  Man, are they sweet to look at, though.

    Current cards will mostly support DirectX 12, which borrows quite a bit from AMD Mantle, after all.  Indeed, AMD cards will support more of DirectX 12 than Nvidia cards will, though developers will probably avoid anything not supported by both outside of sponsored games.

    I'm not expecting much in the way of frame rate problems.  Those tend to be caused by excessively demanding graphical settings that don't do much other than hurt your frame rates.  Depth of field usually makes games look worse, not better.  Same with shadows.  Ambient occlusion makes a game look different, but not really better or worse.  Turn those off and you'll more than double your frame rate in a lot of demanding games.

  • Ridelynn Member Epic Posts: 7,383

    My only real question is why Adaptive Sync is such a priority in the first place?

    You're getting a top-tier video card and not expecting frame rates to be a problem. Adaptive Sync really only benefits you when you're struggling, to eliminate the VSync penalty. Maybe with 3x1440 you're pushing a Fury X hard enough to get that, but as you say, if you're having frame rate issues, you just lower the settings a bit...

    To compound it, both sides are having first-generation issues with their products, and it's unclear if those are just driver issues, or hardware spec issues, or what. It could very well be that we don't see this particular technology carried forward (or at least not in a manner that is backwards compatible). Like SLI and CFX, when it works, it works great. When it doesn't, you're better off turning it off.

    The choice between FreeSync and G-Sync, yeah, that's mostly preferential (and financial), and I guess if you're buying monitors you'd rather have it than not... but as far as needing your video card to utilize it right now, that's the part I don't really understand yet.

  • Quizzical Member Legendary Posts: 25,499

    All that a monitor needs to do to be able to support adaptive sync is to tell a video card what frame rates it can handle, refresh when the video card tells it to, and perhaps do something sane if the video card asks it to do something that it can't.

    The hard part is figuring out when the video card should ask a monitor to refresh.  There could easily be some gains left to be had by driver improvements there.  But I'd be shocked if the hardware is fundamentally broken and needs replacing.  What the hardware has to do here shouldn't be hard.

    And this is not at all like CrossFire/SLI, which require understanding some finer details of a game engine to get them to work properly, in addition to varying by hardware even for a given game.  All that drivers need to decide on for adaptive sync is when it makes sense for a monitor to refresh, and all that you really need there is some timing data on when the last however many frames were ready.  Picking the optimal refresh time from that timing data is not trivial, but it doesn't depend on the game or the GPU, and only depends on the monitor to the extent that different monitors can have different max and min refresh rates.
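
    A minimal sketch of the kind of decision I mean, purely illustrative rather than anyone's actual driver logic: predict the next frame interval from the recent frame timestamps and clamp it to the window the monitor reports (the 40-144 Hz limits below are placeholder values).

        # Illustrative only; the min/max refresh rates are placeholder assumptions.
        def next_refresh_interval_ms(frame_times_ms, min_hz=40.0, max_hz=144.0):
            """frame_times_ms: timestamps (ms) of the last several completed frames."""
            gaps = [b - a for a, b in zip(frame_times_ms, frame_times_ms[1:])]
            predicted = sum(gaps) / len(gaps) if gaps else 1000.0 / max_hz
            shortest = 1000.0 / max_hz  # can't refresh faster than the panel allows
            longest = 1000.0 / min_hz   # must refresh before the panel's minimum rate
            return max(shortest, min(predicted, longest))

        # Frames arriving roughly every 14 ms -> ask for a roughly 14 ms refresh interval.
        print(next_refresh_interval_ms([0, 14, 29, 42, 57]))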

  • Ridelynn Member Epic Posts: 7,383

    The technology isn't at all like CFX/SLI, but the marketing and hype very well could be.

    If it doesn't catch on, if it doesn't drive sales - do you think it will continue to be supported?

    Stereoscopic 3D is in a similar boat, as are physics coprocessors (they were an add-on board once upon a time) and discrete sound cards; I think VR will be as well...

    All those things were crazy neat when they first came out, and either evolved into something else, or just faded away into relative obscurity or extreme niche markets.

    In that regard, Adaptive Sync could very well be like all those other neat, but ultimately insignificant, technologies.

    And I do think it could be a game changer, and I am interested in the technology behind it, but more in the vein of making lower-performing cards seem to perform better; or rather, making it less distracting when they are not capable of performing as well. I can see it being a big deal in the laptop area in particular. But in a desktop, with a top tier card... I wouldn't consider it a priority, and by the time games and such have advanced to the point where that build would benefit greatly from it, Adaptive Sync could look a lot different than it does today: a different physical interface, different hardware requirements, it could be native in every video card by then, or it could have passed by the wayside and been largely ignored and irrelevant. It's all first-gen right now, and hard to tell where it will go.

  • Quizzical Member Legendary Posts: 25,499

    Let's suppose that I drop most of the monitor requirements, and just want an IPS panel with a high (> 80 Hz) refresh rate.  What are my options?  There's the Asus panel I linked earlier, and there's this:

    http://www.newegg.com/Product/Product.aspx?Item=N82E16824009742

    Nearly the same thing from Acer, except G-sync instead of adaptive sync and $200 more.  And, as best as I can tell, that's it.  I know that you value IPS image quality, and surely you can see the value of higher refresh rates for gaming.  Even if you regard adaptive sync as a nearly meaningless throw-in, it's not likely that it would change my monitor decision.

    Unlike stereoscopic 3D or CrossFire/SLI, adaptive sync doesn't require a bunch of extra hardware.  All it requires in the monitor is the ability to refresh when the video card tells it to, and to communicate with the video card as to when it should refresh.  If some other monitor port comes in to replace DisplayPort, I'd bet that it is built to have the little bit of signaling data that requires, too.

    The original impetus for adaptive sync surely isn't going away, either:  a monitor that can refresh irregularly can refresh less often when it makes sense to do so, thus saving power in a laptop.  That's why I'd bet on Intel supporting the standard, too, and sooner rather than later.  Intel is not going to be happy if using an AMD APU rather than Intel enables the monitor to use 1 W less power.

  • Ridelynn Member Epic Posts: 7,383

    "Because I want to" is a perfectly valid reason. Lord knows I waste enough of my money using that excuse.

  • Quizzical Member Legendary Posts: 25,499

    30 fps is an awfully low threshold.  If a game can get 60 fps on average, being able to refresh when it makes sense will on average reduce display latency by 8 ms, because you can skip the "sit on a frame and wait for the next scheduled refresh time" portion.  For 40 fps, that's 12 ms.  In both cases, this is in addition to making all frames display for about the same amount of time rather than having judder, and that means smoother animations.
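
    To spell that arithmetic out, under the rough model above where a finished frame waits on average about half a frame interval for the next scheduled refresh:

        # Back-of-the-envelope numbers; real latency depends on the whole display chain.
        for fps in (60, 40):
            frame_time_ms = 1000.0 / fps
            avg_wait_ms = frame_time_ms / 2  # the wait that adaptive sync lets you skip
            print("%d fps: about %.0f ms of added latency avoided on average" % (fps, avg_wait_ms))
        # -> roughly 8 ms at 60 fps and 12 ms at 40 fps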

    Now, as the frame rate gets higher, this matters less because the frames are so short.  But if I want to play games at a resolution of 4320x2560, I'm not at all confident that most games are going to be able to deliver 144 frames per second, even on high end hardware and at reduced graphical settings.

    But as I said above, even if I said I don't care about adaptive sync, but just want a high frame rate and IPS panel, that doesn't change the choice of monitor.  It's probable that going forward, all or nearly all high refresh rate monitors are going to have adaptive sync or G-Sync, at least until Nvidia decides to stop pushing the latter.

  • Ridelynn Member Epic Posts: 7,383

    My point would be that you're basing your video card decision on a technology that is immature right now, and that, in your application, doesn't even really matter that much.

    The 980 Ti, by most benchmarks right now, appears to be the better card. It certainly uses less power. And it costs less right now (although a good sale could easily swing that either way). The only area where AMD wins right now is that the Fury X is physically smaller, and considerably so.

    And how many times have I heard you say if Item XYZ performs as well or better than ABC, and for less money, then why pay more for equal or less?

    So sure, it may not affect which monitor you would be getting, but no one is suggesting you get a different monitor, or change your monitor specs. It's your particular insistence on FreeSync that has locked you out of a better performing and less expensive video card option that we are questioning. It seems like you're basing this on "because the monitor supports it I'm going to use it", or rather "I want this tech", more so than on whether it will do you any good or not - pretty much the same argument people use for getting PhysX support.

    I'm sure you've read the Anand review, which pretty much echoes what I'm saying:
    http://www.anandtech.com/show/9390/the-amd-radeon-r9-fury-x-review/27

    If it's just "because I want to" - that's fine. But you seem to be ignoring your own advice here, and your responses always skirt the questions. You don't have to justify it to us, though; it's your money and your computer.

  • Hrimnir Member Rare Posts: 2,415
    Originally posted by Quizzical

    Let's suppose that I drop most of the monitor requirements, and just want an IPS panel with a high (> 80 Hz) refresh rate.  What are my options?  There's the Asus panel I linked earlier, and there's this:

    http://www.newegg.com/Product/Product.aspx?Item=N82E16824009742

    Nearly the same thing from Acer, except G-sync instead of adaptive sync and $200 more.  And, as best as I can tell, that's it.  I know that you value IPS image quality, and surely you can see the value of higher refresh rates for gaming.  Even if you regard adaptive sync as a nearly meaningless throw-in, it's not likely that it would change my monitor decision.

    Unlike stereoscopic 3D or CrossFire/SLI, adaptive sync doesn't require a bunch of extra hardware.  All it requires in the monitor is the ability to refresh when the video card tells it to, and to communicate with the video card as to when it should refresh.  If some other monitor port comes in to replace DisplayPort, I'd bet that it is built to have the little bit of signaling data that requires, too.

    The original impetus for adaptive sync surely isn't going away, either:  a monitor that can refresh irregularly can refresh less often when it makes sense to do so, thus saving power in a laptop.  That's why I'd bet on Intel supporting the standard, too, and sooner rather than later.  Intel is not going to be happy if using an AMD APU rather than Intel enables the monitor to use 1 W less power.

    The other thing to consider is that it's likely the only reason G-Sync is more expensive than adaptive sync is that there isn't any real competition... yet.

    Once adaptive sync monitors come out I see one of two things happening.

    1. Adaptive Sync monitors come out at a lower price point and force G-Sync monitors to be cheaper.

    2. Adaptive Sync monitors come out at similar price points to G-Sync (this, I think, is most likely to happen).

    I honestly don't see it just being the case where the G-Sync monitors end up $150 more than an adaptive sync one.  That's just bad business; they wouldn't sell, and the companies would be forced to reduce the costs.  Sadly, like I said, I think the more likely option, given that those are more "enthusiast" grade monitors, is that the adaptive sync ones will just come out at a similar price point.

    Either way I would wait to buy your Fury X until the adaptive sync monitors are securely on the market at a settled price point to make your decision.  Because frankly, the ONLY reason it would be worth it over a 980 Ti would be if it does pan out the way you think it will.  If it doesn't, then you're better off with the Nvidia / G-Sync solution.

    "The surest way to corrupt a youth is to instruct him to hold in higher esteem those who think alike than those who think differently."

    - Friedrich Nietzsche

  • Zandil Member Uncommon Posts: 252

    I'm by far no expert on market or hardware pricing, but the VR headsets are all due out between the end of this year and Q1 next year.

    Oculus has stated a minimum of a GTX 970 / R9, and I assume the Vive will be around the same. A lot of people will be going for either these or higher end versions in the next 6 months, so there are a lot of incoming sales in the GFX card area. I wonder if this will affect prices?

  • Ridelynn Member Epic Posts: 7,383


    Originally posted by Zandil
    I'm by far no expert on market or hardware pricing, but the VR headsets are all due out between the end of this year and Q1 next year.

    Oculus has stated a minimum of a GTX 970 / R9, and I assume the Vive will be around the same. A lot of people will be going for either these or higher end versions in the next 6 months, so there are a lot of incoming sales in the GFX card area. I wonder if this will affect prices?


    Only a few things have historically moved GPU prices after the MSRP has been announced:

    a) Competition on the price/performance curve from the other side. Both major GPU manufacturers will often juggle prices significantly in response to a new product or price cut from the opposite side. There is no mystery about why the 980 Ti released at the MSRP it did, and why the 980 price was slashed, just before the Fury X was announced. AMD does the same thing - there's a reason the 390X costs what it does. Both want to maximize profit margin, but they have to balance that against the competition - no surprises there.

    That is the most common method of price adjustments in most tech sectors.

    b) An extreme crunch where demand outstrips supply and pushes prices up. You don't see this very often. The bitcoin craze threw 6970 and 7970 prices through the roof for a little while - and that was mostly on the backs of entrepreneurs and retailers buying them at MSRP when they could and (re)selling them at whatever markup the market would bear - the GPU manufacturers don't see much of that price hike.

    New cards don't tend to discount previous cards - they stay at their price until the inventory goes, or a retailer is willing to take the loss to clear shelf space (these are the fire sales you see just before a new product release - the retailer takes that loss). A mild surge in demand won't affect prices majorly - you'd see fewer rebates and other incentives, but the sticker price isn't going to budge much.

    Something like VR coming out - I don't see that impacting much on prices at all. It will be first generation, with little software support, and extremely niche. It would take some time before VR could start driving hardware sales with a measurable impact, and we'd likely be on a new generation (or two) beyond what we have now if/when that occurs anyway.

  • Ridelynn Member Epic Posts: 7,383


    Originally posted by Hrimnir

    1. Adaptive Sync monitors come out at a lower price point and force G-Sync monitors to be cheaper.

    2. Adaptive Sync monitors come out at similar price points to G-Sync (this, I think, is most likely to happen).

    I honestly don't see it just being the case where the G-Sync monitors end up $150 more than an adaptive sync one.  That's just bad business; they wouldn't sell, and the companies would be forced to reduce the costs.  Sadly, like I said, I think the more likely option, given that those are more "enthusiast" grade monitors, is that the adaptive sync ones will just come out at a similar price point.

    Either way I would wait to buy your Fury X until the adaptive sync monitors are securely on the market at a settled price point to make your decision.  Because frankly, the ONLY reason it would be worth it over a 980 Ti would be if it does pan out the way you think it will.  If it doesn't, then you're better off with the Nvidia / G-Sync solution.


    Well, right now, neither side has a lot of traction; both are still very much first-gen products, with nVidia having about a year, maybe 18 months, of lead time on AMD. And the reality right now is that, yes, G-Sync does cost $150 more, which I'm sure is no small part of why there just aren't a lot of G-Sync monitors out there.

    That being said, AMD was able to get theirs included in the DisplayPort specification (which is probably no small part of why it's so much later to market), while nVidia's requires a separate hardware module. So there is a cost factor for G-Sync for that hardware module that isn't there for FreeSync. Now, that's probably not $150, but it's more than $0.

    But the real reason that G-Sync costs $150 more is that enough people are paying it. Sure, nVidia could drive more adoption by choosing to support FreeSync, or by lowering the cost of G-Sync. But they haven't - which means either that they are getting enough sales and are comfortable with the adoption rate (and the nice profit margin doesn't hurt, but mostly it's just a marketing bullet point they can use to drive card sales), or they are still evaluating the market impact of the technology, or they have Rev 2.0 in the works and are just letting 1.0 float until they can release 2.0.

    I suspect the latter, and the latest Origin laptop announcement by nVidia gives a decent sneak peek at what nVidia's overall strategy is. I do think you're right: if this picks up, nVidia can't compete against free for too much longer; even the Green fanboys will start to balk at that. But the nVidia brand has always carried some premium price, and I wouldn't be shocked in the least if nVidia never supports FreeSync outright, if G-Sync 2.0 is really the same thing as FreeSync but with an nVidia licensing fee (such that it gets rid of the additional hardware costs, since AMD already did that for them), and if G-Sync always carries some price premium on a monitor (even if it's only $20-50).

  • Ridelynn Member Epic Posts: 7,383

    For the record, I am interested in Adaptive Sync (I don't really care which brand) - but it's not to the point that it's usable for me.

    I run 2x 1920x1200 monitors right now. Neither are GSync/FreeSync - the newest is a couple of years old, the oldest is ... right at 10 years now?

    I don't play in extended monitor mode - I game on one screen, and have web browser/etc open on the second. So I'm essentially gaming at 60Hz 1080p (a bit higher, because I run 16x10, the difference amounts to about an extra inch of vertical on my 24" monitors).

    Since I tab over to the other screen a lot, I usually play in Windowed mode (Windowed Fullscreen where available, and when it's not, I use a program like Gamers Window Relocator to get it as close as I can).

    Neither flavor of Adaptive Sync, to my knowledge, works in Windowed mode; it requires Fullscreen. And in my case, I'm effectively only driving 1080p, which doesn't require a lot of GPU power to keep a decent framerate - and indeed, in nearly every game I'm able to keep a decent 60 clip in most situations. I'm not particularly sensitive to it anyway; I'm getting old and slow, and 30fps honestly is playable to me, although I can notice the difference.

    I'm due for a new monitor (or two, or three), but I'm not willing to sacrifice the convenience of Windowed gaming for Adaptive Sync yet, so I'm brute forcing enough GPU power and adjusting game settings accordingly to maintain a high enough frame rate - just like we've always done up until Adaptive Sync.

  • syntax42 Member Uncommon Posts: 1,385
    Originally posted by Quizzical

    3D NAND might eventually bring large increases in capacity for a given price tag, but that could easily be years away.

    3D NAND might be closer than you think.

    http://www.engadget.com/2015/03/27/toshiba-intel-3d-nand-chips/

    http://www.gizmag.com/high-capacity-3d-flash-memory/36782/

     

    According to the first article, Micron and Intel are already manufacturing the technology required to produce 10TB 2.5" form factor SSDs.  Even if they aren't available to consumers within a year, something cheaper at a lower capacity should be.  At this point, SSD technology is just a waiting game.  It will get cheaper than hard drives on a price per GB basis soon.  The longer you can wait to buy, the better the SSD you'll get for your money.

  • Quizzical Member Legendary Posts: 25,499
    Originally posted by Hrimnir
    Originally posted by Quizzical

    Let's suppose that I drop most of the monitor requirements, and just want an IPS panel with a high (> 80 Hz) refresh rate.  What are my options?  There's the Asus panel I linked earlier, and there's this:

    http://www.newegg.com/Product/Product.aspx?Item=N82E16824009742

    Nearly the same thing from Acer, except G-sync instead of adaptive sync and $200 more.  And, as best as I can tell, that's it.  I know that you value IPS image quality, and surely you can see the value of higher refresh rates for gaming.  Even if you regard adaptive sync as a nearly meaningless throw-in, it's not likely that it would change my monitor decision.

    Unlike stereoscopic 3D or CrossFire/SLI, adaptive sync doesn't require a bunch of extra hardware.  All it requires in the monitor is the ability to refresh when the video card tells it to, and to communicate with the video card as to when it should refresh.  If some other monitor port comes in to replace DisplayPort, I'd bet that it is built to have the little bit of signaling data that requires, too.

    The original impetus for adaptive sync surely isn't going away, either:  a monitor that can refresh irregularly can refresh less often when it makes sense to do so, thus saving power in a laptop.  That's why I'd bet on Intel supporting the standard, too, and sooner rather than later.  Intel is not going to be happy if using an AMD APU rather than Intel enables the monitor to use 1 W less power.

    The other thing to consider is that it's likely the only reason G-Sync is more expensive than adaptive sync is that there isn't any real competition... yet.

    Once adaptive sync monitors come out I see one of two things happening.

    1. Adaptive Sync monitors come out at a lower price point and force G-Sync monitors to be cheaper.

    2. Adaptive Sync monitors come out at similar price points to G-Sync (this, I think, is most likely to happen).

    I honestly don't see it just being the case where the G-Sync monitors end up $150 more than an adaptive sync one.  That's just bad business; they wouldn't sell, and the companies would be forced to reduce the costs.  Sadly, like I said, I think the more likely option, given that those are more "enthusiast" grade monitors, is that the adaptive sync ones will just come out at a similar price point.

    Either way I would wait to buy your Fury X until the adaptive sync monitors are securely on the market at a settled price point to make your decision.  Because frankly, the ONLY reason it would be worth it over a 980 Ti would be if it does pan out the way you think it will.  If it doesn't, then you're better off with the Nvidia / G-Sync solution.

    In order to make a monitor with G-sync, the monitor manufacturer has to buy a module from Nvidia for about $100 and incorporate it into the monitor.  Adaptive sync does not analogously require extra hardware to add to the cost of building the monitor.  After various markups from everyone in the chain, that $100 difference ends up about $150 at retail.
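
    As a rough illustration of how a roughly $100 module becomes roughly $150 at retail (the markup percentages below are placeholder assumptions, not actual figures from anyone in the chain):

        # Hypothetical markup chain; the 20% and 25% margins are assumptions for illustration.
        module_cost = 100.00                          # what the monitor maker pays Nvidia
        after_monitor_maker = module_cost * 1.20      # monitor maker's margin on the extra cost
        after_retailer = after_monitor_maker * 1.25   # retailer's margin on top of that
        print("Extra cost at retail: about $%.0f" % after_retailer)  # roughly $150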

  • Quizzical Member Legendary Posts: 25,499
    Originally posted by syntax42
    Originally posted by Quizzical

    3D NAND might eventually bring large increases in capacity for a given price tag, but that could easily be years away.

    3D NAND might be closer than you think.

    http://www.engadget.com/2015/03/27/toshiba-intel-3d-nand-chips/

    http://www.gizmag.com/high-capacity-3d-flash-memory/36782/

     

    According to the first article, Micron and Intel are already manufacturing the technology required to produce 10TB 2.5" form factor SSDs.  Even if they aren't available to consumers within a year, something cheaper at a lower capacity should be.  At this point, SSD technology is just a waiting game.  It will get cheaper than hard drives on a price per GB basis soon.  The longer you can wait to buy, the better the SSD you'll get for your money.

    If you want an SSD with 3D NAND, you can buy it today:

    http://www.newegg.com/Product/Product.aspx?Item=N82E16820147361

    But you pay a hefty price premium for it.  I'm not interested in paying that price premium.

    Eventually 3D NAND will probably be the cheapest way to build chips of a given capacity, at which point, it could bring SSD prices down considerably.  But we're not there yet.

  • Hrimnir Member Rare Posts: 2,415
    Originally posted by Quizzical
    Originally posted by Hrimnir
    Originally posted by Quizzical

    Let's suppose that I drop most of the monitor requirements, and just want an IPS panel with a high (> 80 Hz) refresh rate.  What are my options?  There's the Asus panel I linked earlier, and there's this:

    http://www.newegg.com/Product/Product.aspx?Item=N82E16824009742

    Nearly the same thing from Acer, except G-sync instead of adaptive sync and $200 more.  And, as best as I can tell, that's it.  I know that you value IPS image quality, and surely you can see the value of higher refresh rates for gaming.  Even if you regard adaptive sync as a nearly meaningless throw-in, it's not likely that it would change my monitor decision.

    Unlike stereoscopic 3D or CrossFire/SLI, adaptive sync doesn't require a bunch of extra hardware.  All it requires in the monitor is the ability to refresh when the video card tells it to, and to communicate with the video card as to when it should refresh.  If some other monitor port comes in to replace DisplayPort, I'd bet that it is built to have the little bit of signaling data that requires, too.

    The original impetus for adaptive sync surely isn't going away, either:  a monitor that can refresh irregularly can refresh less often when it makes sense to do so, thus saving power in a laptop.  That's why I'd bet on Intel supporting the standard, too, and sooner rather than later.  Intel is not going to be happy if using an AMD APU rather than Intel enables the monitor to use 1 W less power.

    The other thing to consider is that it's likely the only reason G-Sync is more expensive than adaptive sync is that there isn't any real competition... yet.

    Once adaptive sync monitors come out I see one of two things happening.

    1. Adaptive Sync monitors come out at a lower price point and force G-Sync monitors to be cheaper.

    2. Adaptive Sync monitors come out at similar price points to G-Sync (this, I think, is most likely to happen).

    I honestly don't see it just being the case where the G-Sync monitors end up $150 more than an adaptive sync one.  That's just bad business; they wouldn't sell, and the companies would be forced to reduce the costs.  Sadly, like I said, I think the more likely option, given that those are more "enthusiast" grade monitors, is that the adaptive sync ones will just come out at a similar price point.

    Either way I would wait to buy your Fury X until the adaptive sync monitors are securely on the market at a settled price point to make your decision.  Because frankly, the ONLY reason it would be worth it over a 980 Ti would be if it does pan out the way you think it will.  If it doesn't, then you're better off with the Nvidia / G-Sync solution.

    In order to make a monitor with G-sync, the monitor manufacturer has to buy a module from Nvidia for about $100 and incorporate it into the monitor.  Adaptive sync does not analogously require extra hardware to add to the cost of building the monitor.  After various markups from everyone in the chain, that $100 difference ends up about $150 at retail.

    I understand that, but that module doesn't cost $100.  Nvidia is charging that much for the module, but it probably costs them less than $10 to produce.  Ridelynn made a great post.  The point I was trying to make is that the $150 price premium is only plausible right now because there is no market competition.  Once adaptive sync monitors start coming out, Nvidia will be forced to shit or get off the pot.  Like Ride said, they might be able to get away with a 20 or 40 dollar price premium, but certainly not 150.  My big concern, because companies do this kind of crap in the enthusiast market all the time, is that rather than try to undercut Nvidia, they will just sell their products at the same or similar price point to G-Sync so they can reap bigger profits.  The market has already proven that people will pay that kind of money for those monitors (look at me).  So why sell something for less than you could?

    "The surest way to corrupt a youth is to instruct him to hold in higher esteem those who think alike than those who think differently."

    - Friedrich Nietzsche

  • syntax42 Member Uncommon Posts: 1,385
    Originally posted by Quizzical
    Originally posted by syntax42
    Originally posted by Quizzical

    3D NAND might eventually bring large increases in capacity for a given price tag, but that could easily be years away.

    3D NAND might be closer than you think.

    http://www.engadget.com/2015/03/27/toshiba-intel-3d-nand-chips/

    http://www.gizmag.com/high-capacity-3d-flash-memory/36782/

     

    According to the first article, Micron and Intel are already manufacturing the technology required to produce 10TB 2.5" form factor SSDs.  Even if they aren't available to consumers within a year, something cheaper at a lower capacity should be.  At this point, SSD technology is just a waiting game.  It will get cheaper than hard drives on a price per GB basis soon.  The longer you can wait to buy, the better the SSD you'll get for your money.

    If you want an SSD with 3D NAND, you can buy it today:

    http://www.newegg.com/Product/Product.aspx?Item=N82E16820147361

    But you pay a hefty price premium for it.  I'm not interested in paying that price premium.

    Eventually 3D NAND will probably be the cheapest way to build chips of a given capacity, at which point, it could bring SSD prices down considerably.  But we're not there yet.

    That isn't far off from Samsung's typical price premium for their SSDs.  The 840 series was slightly more expensive on a cost per GB basis than the market rate when it launched, too.

     

    If you're looking to buy now, Samsung seems to be the only 3D NAND option.  Obviously, it isn't worth the premium at this moment.  If you're considering buying in the next two years, my point was that 3D NAND will play a part in the rapid decrease in SSD pricing.  One article predicted SSDs will drop below hard drives in price per GB by the end of 2016.

    Link to pretty graph:  http://wikibon.org/w/images/4/44/Projection2015-2020CapacityDiskNANDflash.png

    The closer to December of 2016 that you can wait, the better your options will be.
