
I facepalm AMD (due to first-day reviews; it changed quite a bit in the last few days), it's been beaten ev


Comments

  • JohnP0100JohnP0100 Member UncommonPosts: 401
    So AMD's flagship is not as good as Nvidia's, and their other products are just refreshes.
    I mean... wow...

    It shows what PvP games are really all about, and no, it's not about more realism and immersion. It's about cowards hiding behind a screen so they can bully other defenseless players without any risk of direct retaliation like there would be if they acted like asshats in "real life". -Jean-Luc_Picard

    Life itself is a game. So why shouldn't your game be ruined? - justmemyselfandi

  • JayFiveAliveJayFiveAlive Member UncommonPosts: 601
    Is this honestly a surprise to anyone? :P The Fury X was very suspect from the delays and crap. Hopefully Samsung or some other company buys them and makes them a better company.
  • HrimnirHrimnir Member RarePosts: 2,415
    Originally posted by Classicstar

    In some games the 980 even came close to the Fury X.

    AMD will not survive this.

    Sorry, I really thought AMD would come back with a big bang and be KING again. Oh man, was I wrong. I don't know what AMD is doing, but as I see it now this is the last nail in the coffin.

    I won't upgrade. The only thing they improved is TEMPERATURE, which is extremely good, but noise, wattage and performance are mediocre at best. Very sad indeed.

    I will never buy Nvidia because they are a greedy, manipulative company, but now I won't upgrade because AMD FAILED HARD, claiming the fastest card in the world while it is way behind the 980 Ti and Titan X.

    Also, I refuse to pay 750 euros for a card that's not even faster than the 980 Ti.

    Sorry AMD, unless DX12 does wonders (no game is in DX12) or HBM 2.0 next year (I don't think you survive this disaster), I'm done with upgrading. I'll stick with my 290X x2 forever :P

    Sorry guys, you can burn me down now as you all do so well (it was never personal, but some always think it is, so I'm done with the hardware section).

    This is all based on the GURU3D REVIEW, so burn them if they made a bad review?

    ROFLMAO.  How is this possible? You were SO confident it was the giant killer.  I was just about to come here and post this link:

    http://www.pcgamer.com/amd-radeon-r9-fury-x-tested-not-quite-a-980-ti-killer/

    Anyways, I don't want to rub too much salt in the wound (not that I'm particularly happy about this; I would rather it have been a strong competitor, just to push Nvidia and benefit the consumer, etc.).  I will admit I'm surprised to see you willing to admit that it's not actually faster.  I was expecting lots of spinning and hemming and hawing, to be honest.

    "The surest way to corrupt a youth is to instruct him to hold in higher esteem those who think alike than those who think differently."

    - Friedrich Nietzsche

  • BaitnessBaitness Member UncommonPosts: 675
    What a bad bunch of decisions by AMD.  They have built a business around being the cheaper option, and they pull this crap?  I have always preferred nvidia products, but team green is all too willing to skyrocket their prices if AMD does not provide real competition.  Fix your shit, AMD.
  • PurutzilPurutzil Member UncommonPosts: 3,048
    If you need the latest and greatest, it's Nvidia. If you want value, it's AMD. For me, I stand by AMD since it gives good bang for the buck and, to a smaller degree, because it is the lesser of two evils by a decent gap.
  • F0URTWENTYF0URTWENTY Member UncommonPosts: 349
    Originally posted by Purutzil
    If you need the latest and greatest its Nvidia. If you want value its AMD. TO me I stand by AMD since it gives good bang for the buck and with a small portion being it is the lesser of two evils by a decent gap. 

     

    Have you even looked at the prices? Where I live, the AMD Fury X is more expensive than the 980 Ti. So no, they are not offering more bang for the buck; in this case they are offering much less with their flagship card.

    Their rebranded cards are more bang for the buck, sure, but the Fury X is worse and more expensive.

  • ThorqemadaThorqemada Member UncommonPosts: 1,282

    What do you expect - they use the same old 28nm process Nvidia uses and run into physical limitations.

    You should be happy AMD drives Nvidia to cut their GPU prices - aside from that, I would not support Nvidia's bad business practices and sect-like attitudes!

    "Torquemada... do not implore him for compassion. Torquemada... do not beg him for forgiveness. Torquemada... do not ask him for mercy. Let's face it, you can't Torquemada anything!"

    MWO Music Video - What does the Mech say: http://www.youtube.com/watch?v=FF6HYNqCDLI
    Johnny Cash - The Man Comes Around: https://www.youtube.com/watch?v=Y0x2iwK0BKM

  • HrimnirHrimnir Member RarePosts: 2,415
    Originally posted by skeaser
    Originally posted by Laughing-man
    Originally posted by skeaser

    Originally posted by Thane

    each user has to believe in AMD once, and buy at least one of their cards.

    after that, that user is usualy healed of the AMD idea.

    Pretty much this.

    Originally posted by Laughing-man
    So the underdog can't compete against the evil empire that is cornering the market through sneaky tactics.  Even with all their honest marketing and transparent corporate intentions they just can't compete with $$$$$.  Seems like the oldest tale ever told.

    I don't see how NVidia is the evil empire. They make good stuff. 

     Guess you haven't been paying attention to G-sync and other such things that will essentially make it so if you aren't running Nvidia on Nvidia based products then you are going to have a bad time. Nvidia has been making game makers 'optimize' games for years so they run 'better' on Nvidia systems.  Nvidia now is having you buy a monitor to match the card you're buying, and if you are trying to run AMD with these things it will just slow everything down because the code isn't 'optimized' for AMD.    Nvidia has always had the money to convince developers to make an unfair playing field for those two companies.

    Oh, they're evil like Apple is evil for not making their apps run on Android, or how Microsoft is evil for not making sure their programs run on Linux.

    You expect NVidia to spend money on tech then open-source it so their competitor can make money? On what planet does that work?

    I have been paying attention to G-sync, I'm saving up for my Acer XB27HU 1440P IPS g-sync monitor and I'm stoked. I don't care if NVidia shuts AMD out because AMD sucks. I broke away from NVidia once and bought the hype about how AMD is better for cheaper and it was the worst build I've ever made. 

    Bottom line is if you buy NVidia, you won't have to worry about everything you're talking about.

    Seriously LOVE that monitor, bought one a little over a month ago and don't regret the purchase in any way shape or form.

    Granted, it did put me in the position of needing a new video card, so I dropped the cash on a 980 Ti, but overall I'm pretty set for the next 2-3 years or so on the video card, and short of the monitor 'sploding I should be good for a good long while on the monitor.

    "The surest way to corrupt a youth is to instruct him to hold in higher esteem those who think alike than those who think differently."

    - Friedrich Nietzsche

  • The user and all related content has been deleted.


    Somebody, somewhere has better skills than you have, more experience than you have, is smarter than you, has more friends than you do and can stay online longer. Just pray he's not out to get you.
  • QuizzicalQuizzical Member LegendaryPosts: 25,507

    I think people underestimate just how much of a test part the Fury X is.

    There have, over the course of the years, been a number of video cards that, while commercially available, existed primarily as an experiment.  See, for example, the Radeon HD 4770.  AMD surely lost money on that card directly, as yields were abysmal.  But what AMD learned from it contributed greatly to AMD having markedly superior hardware for the entire duration of the 40 nm process node, about a 2 1/2 year span.  And that paid for the 4770's development many times over.

    The Radeon HD 4770 was hardly a top end, flagship card.  It was a dinky little chip less than 1/4 of the size of Fiji or GM200.  It served primarily to test out TSMC's 40 nm process node.  One could argue that GM107 was a test part to some degree, testing out the Maxwell architecture so that Nvidia could learn how stuff worked and apply it to the rest of the chips of that architecture that would come many months later.

    But test parts are basically never flagship parts.  Besides Fiji, I can't think of another that has been.  The problem is that a small test part for HBM doesn't make a bit of sense.  You don't stick a 1200 mm^2 silicon interposer into a part that sells for $100 at retail and expect to make money.  I don't know if anyone else in the whole history of computing had ever made a single chip with a 4096-bit memory bus like Fiji has.  I'm not aware of any others wider than the 512 bits in the Radeon R9 290X, Radeon HD 2900 XT, GeForce GTX 280/285, Intel Xeon Phi Knight's Corner, and other parts based on those same dies.
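
    As a rough back-of-the-envelope illustration of what those bus widths translate to, here is a tiny Python sketch (peak bandwidth = bus width x per-pin data rate / 8; the per-pin rates below are the commonly cited figures for each card, taken as assumptions for illustration rather than from this thread):

        # Rough peak-memory-bandwidth arithmetic for the cards mentioned above.
        def peak_bandwidth_gb_s(bus_width_bits, data_rate_gbps_per_pin):
            """Theoretical peak memory bandwidth in GB/s."""
            return bus_width_bits * data_rate_gbps_per_pin / 8

        cards = {
            "R9 Fury X (HBM1, 4096-bit)":  (4096, 1.0),  # ~500 MHz DDR, ~1 Gbps per pin (assumed)
            "R9 290X (GDDR5, 512-bit)":    (512, 5.0),   # ~5 Gbps effective (assumed)
            "GTX 980 Ti (GDDR5, 384-bit)": (384, 7.0),   # ~7 Gbps effective (assumed)
        }
        for name, (width, rate) in cards.items():
            print(f"{name}: ~{peak_bandwidth_gb_s(width, rate):.0f} GB/s")
        # Prints roughly 512, 320 and 336 GB/s respectively: the very wide HBM bus buys a lot of
        # bandwidth even at a low clock.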

    It's a pretty safe bet that there are things in Fiji's HBM controller that AMD screwed up.  And it's an equally safe bet that AMD is going to learn from them and apply the lessons to their next HBM controller.  If Fiji results in AMD really getting HBM right on 14/16 nm parts while Nvidia struggles with it--basically what happened for a few years with GDDR5--then Fiji was worthwhile for AMD regardless of today's reviews.

    But also important is that Fiji isn't the first chip of many of an architecture that will be derived from it.  Bulldozer was such a disaster not because one chip was bad, but because many chips that would be derived from it over the course of several years would also be troubled.  The same is true for the original Pentium 4.  Returning to GPUs, one could make a similar case for the GeForce GTX 480, though GPU architectures tend to be shorter lived than CPU architectures.

    But Fiji is not the first chip of a new architecture.  It is likely the last discrete GPU chip AMD will ever build on 28 nm, and so it tells us little about what AMD will release in a year or two.

  • QuizzicalQuizzical Member LegendaryPosts: 25,507
    Originally posted by Mtibbs1989
    Originally posted by Laughing-man
    Originally posted by skeaser

    Originally posted by Thane

    each user has to believe in AMD once, and buy at least one of their cards.

    after that, that user is usualy healed of the AMD idea.

    Pretty much this.

    Originally posted by Laughing-man
    So the underdog can't compete against the evil empire that is cornering the market through sneaky tactics.  Even with all their honest marketing and transparent corporate intentions they just can't compete with $$$$$.  Seems like the oldest tale ever told.

    I don't see how NVidia is the evil empire. They make good stuff. 

     Guess you haven't been paying attention to G-sync and other such things that will essentially make it so if you aren't running Nvidia on Nvidia based products then you are going to have a bad time. Nvidia has been making game makers 'optimize' games for years so they run 'better' on Nvidia systems.  Nvidia now is having you buy a monitor to match the card you're buying, and if you are trying to run AMD with these things it will just slow everything down because the code isn't 'optimized' for AMD.    Nvidia has always had the money to convince developers to make an unfair playing field for those two companies.

    You do know that a company isn't forced to support their competitor's product right? So why should Nvidia boost AMD so that G-sync works for their hardware?

    Nvidia is by default the larger competitor between the two companies. It's only logical to select the larger slice of pie to focus on supporting. 

    Nvidia is not forcing you to buy anything with their GPU's. I have a 980 and I don't need the G-sync. Nor do I see fluctuations often enough to justify G-Sync. Improve your computer and you won't have to rely on these special features...

    G-sync does the same thing as adaptive sync.  The difference is that G-sync adds about $100 to the price of building a monitor and adaptive sync doesn't.  After markups along the way, that's a difference of about $150 at retail.
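
    To make the "does the same thing" point concrete, here is a minimal, purely illustrative Python sketch of the timing difference between a fixed 60 Hz refresh with vsync and a variable-refresh panel (the 144 Hz cap and the example frame times are assumptions, not measurements; real G-sync and adaptive sync logic lives in the display controller and drivers and is far more involved):

        # Toy model: when does a finished frame actually reach the screen?
        MAX_HZ = 144.0                      # assumed upper bound of the panel's variable-refresh range

        def display_times(frame_ready_ms, fixed_hz=None):
            """Return the time (ms) each frame appears on screen under this toy model."""
            shown, last = [], 0.0
            for ready in frame_ready_ms:
                if fixed_hz:                                 # classic vsync: wait for the next fixed tick
                    interval = 1000.0 / fixed_hz
                    t = (int(ready // interval) + 1) * interval
                else:                                        # variable refresh: scan out as soon as the
                    t = max(ready, last + 1000.0 / MAX_HZ)   # frame is ready, capped at the max rate
                    # (below the panel's minimum rate a real panel repeats the old frame; ignored here)
                shown.append(t)
                last = t
            return shown

        frames = [21.0, 40.0, 62.0]         # example "frame finished rendering" timestamps in ms
        print("60 Hz vsync     :", [round(t, 1) for t in display_times(frames, fixed_hz=60.0)])  # [33.3, 50.0, 66.7]
        print("variable refresh:", [round(t, 1) for t in display_times(frames)])                 # [21.0, 40.0, 62.0]

    A frame that misses the 16.7 ms vsync window waits for the next tick and shows up late; a variable-refresh panel displays it the moment it is done, which is the behaviour both G-sync and adaptive sync provide.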

    If Nvidia supported adaptive sync and just kept G-sync support in drivers for several years so as not to pull the rug out from under early adopters, I'd be fine with it.  G-sync monitors would disappear pretty quickly as everyone moved to adaptive sync to save $150.  But that's not what Nvidia is doing.  They're refusing to support adaptive sync.  Who benefits from more monitors doing unnecessarily proprietary stuff and not working properly with video cards?  Certainly not gamers.

    I'm planning on building a new computer in about two months.  I'm going to get new monitors at the same time, and planning on getting three of them.  If I go with Nvidia, that adds about $450 to the price of the monitors as compared to going with AMD.  Is a GeForce GTX 980 Ti a better card than a Radeon R9 Fury X?  There's a case for it, especially at lower resolutions.  Is it $450 worth of better at ultra high resolutions?  Obviously not.  If Nvidia still won't support adaptive sync, getting a Fury X will be an easy and obvious call for me.

    Now, I think ceding the market of people who might upgrade their monitors sometime to AMD is positively suicidal on Nvidia's part.  And that's why I expect them to support adaptive sync eventually.  But they want as many G-sync monitors out there as they can possibly get first, so that people in the future upgrading a video card will think they need to buy Nvidia to work properly with their monitor.

    Who benefits from such vendor lock-in?  Certainly not gamers who want to buy whatever makes sense each generation without their purchase today locking them in to an inferior part a few years down the road.  But pushing this sort of garbage is standard procedure for Nvidia.  See, for example, HairWorks, GPU PhysX, or CUDA.

  • The user and all related content has been deleted.


    Somebody, somewhere has better skills than you have, more experience than you have, is smarter than you, has more friends than you do and can stay online longer. Just pray he's not out to get you.
  • HrimnirHrimnir Member RarePosts: 2,415
    Originally posted by Quizzical

    I think people underestimate just how much of a test part the Fury X is.

    There have, over the course of the years, been a number of video cards that, while commercially available, existed primarily as an experiment.  See, for example, the Radeon HD 4770.  AMD surely lost money on that card directly, as yields were abysmal.  But what AMD learned from it contributed greatly to AMD having markedly superior hardware for the entire duration of the 40 nm process node, about a 2 1/2 year span.  And that paid for the 4770's development many times over.

    The Radeon HD 4770 was hardly a top end, flagship card.  It was a dinky little chip less than 1/4 of the size of Fiji or GM200.  It served primarily to test out TSMC's 40 nm process node.  One could argue that GM207 was a test part to some degree, testing out the Maxwell architecture so that Nvidia could learn how stuff worked and apply it to the rest of the chips of that architecture that would come many months later.

    But test parts are basically never flagship parts.  Besides Fiji, I can't think of another that has been.  The problem is that a small test part for HBM doesn't make a bit of sense.  You don't stick a 1200 mm^2 silicon interposer into a part that sells for $100 at retail and expect to make money.  I don't know if anyone else in the whole history of computing had ever made a single chip with a 4096-bit memory bus like Fiji has.  I'm not aware of any others wider than the 512 bits in the Radeon R9 290X, Radeon HD 2900 XT, GeForce GTX 280/285, Intel Xeon Phi Knight's Corner, and other parts based on those same dies.

    It's a pretty safe bet that there are things in Fiji's HBM controller that AMD screwed up.  And it's an equally safe bet that AMD is going to learn from them and apply the lessons to their next HBM controller.  If Fiji results in AMD really getting HBM right on 14/16 nm parts while Nvidia struggles with it--basically what happened for a few years with GDDR5--then Fiji was worthwhile for AMD regardless of today's reviews.

    But also important is that Fiji isn't the first chip of many of an architecture that will be derived from it.  Bulldozer was such a disaster not because one chip was bad, but because many chips that would be derived from it over the course of several years would also be troubled.  The same is true for the original Pentium 4.  Returning to GPUs, one could make a similar case for the GeForce GTX 480, though GPU architectures tend to be shorter lived than CPU architectures.

    But Fiji is not the first chip of a new architecture.  It is likely the last discrete GPU chip AMD will ever build on 28 nm, and so it tells is little about what AMD will release in a year or two.

    The problem is this could very well put them out of business.  Even worse, by the time they figure out and sort out these potential issues you speak of, Nvidia will likely have Pascal out or be very close to having it out.  And if we're being honest, Nvidia simply doesn't make engineering screwups this epic.  The whole 3.5GB of memory thing was far and away Nvidia's worst screwup, and in reality it made an almost insignificant difference in performance.  And that whole fiasco wasn't an accident; the card was designed that way on purpose. The issue was that they (technically) lied to customers.
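
    For context on why that 3.5GB affair barely showed up in benchmarks, here is the usual back-of-the-envelope arithmetic, sketched in Python (the 7 Gbps GDDR5 rate and the 224-bit / 32-bit split are the widely reported GTX 970 figures, taken here as assumptions rather than from this thread):

        # GTX 970 memory partitions: peak bandwidth = bus width (bits) x data rate (Gbps) / 8.
        DATA_RATE_GBPS = 7.0                       # assumed effective GDDR5 rate per pin

        segments = {
            "fast segment (3.5 GB, 224-bit)": 224,
            "slow segment (0.5 GB,  32-bit)":  32,
        }
        for name, bus_bits in segments.items():
            print(f"{name}: ~{bus_bits * DATA_RATE_GBPS / 8:.0f} GB/s peak")
        # ~196 GB/s vs ~28 GB/s: the last 0.5 GB is far slower, but a game only notices
        # once its working set actually spills past 3.5 GB, which most 2015-era titles rarely did.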

    "The surest way to corrupt a youth is to instruct him to hold in higher esteem those who think alike than those who think differently."

    - Friedrich Nietzsche

  • RidelynnRidelynn Member EpicPosts: 7,383


    Originally posted by Hrimnir
    And if we're being honest, Nvidia just simply doesn't make these epic of engineering screwups.  The whole 3.5gb of memory thing was far and away Nvidias worst screwup and in reality it was something that made an almost insignificant difference in performance.  And that whole fiasco wasn't like it wasn't designed that way on purpose, the issue was that they (technically) lied to customers.

    Are you sure about that?

    First generation Fermi comes to mind immediately as possibly the most epic, and AMD had the performance crown there for a good while.

  • stefanakisgrstefanakisgr Member UncommonPosts: 38
    Originally posted by Ridelynn

     


    Originally posted by Hrimnir
    And if we're being honest, Nvidia just simply doesn't make these epic of engineering screwups.  The whole 3.5gb of memory thing was far and away Nvidias worst screwup and in reality it was something that made an almost insignificant difference in performance.  And that whole fiasco wasn't like it wasn't designed that way on purpose, the issue was that they (technically) lied to customers.

     

    Are you sure about that?

    First generation Fermi comes to mind immediately as possibly the most epic, and AMD had the performance crown there for a good while.

    Indeed. Also the FX 5xxx series, where Nvidia didn't bother to follow the DX specs while lying about it. Both firms lie on occasion, and that's a fact.  AMD was king for a while, and I owned AMD; I like Nvidia's lineup these days and own Nvidia. I will move to AMD again as soon as I feel it's worth it, or when enough time has passed.

    It is very important that there is competition. Prices go down, they drive each other to innovation, etc. Don't stick to one brand just because. If one brand wins, they will be the only winners. Consumers will lose, big time.

  • HrimnirHrimnir Member RarePosts: 2,415
    Originally posted by Ridelynn

    Originally posted by Hrimnir
    And if we're being honest, Nvidia just simply doesn't make these epic of engineering screwups. The whole 3.5gb of memory thing was far and away Nvidias worst screwup and in reality it was something that made an almost insignificant difference in performance. And that whole fiasco wasn't like it wasn't designed that way on purpose, the issue was that they (technically) lied to customers.

    Are you sure about that?

    First generation Fermi comes to mind immediately as possibly the most epic, and AMD had the performance crown there for a good while.

    Can't watch YouTube at work; you're gonna have to be more specific.

    From memory, I remember Fermi being hot, but (again from memory) that was basically because they were making the chips physically massive, with a large number of transistors.

    The point is it's not like they didn't expect it; it wasn't an engineering screwup.  It's like saying that taking a 5 liter V8, adding 4 more cylinders and making it a 7.5 liter V12 that uses an absurd amount of gas is an engineering screwup.  An engineering screwup would be doing that and then not upgrading the fuel pump so it can get enough fuel to the cylinders, gimping the engine because it can't get enough fuel.

    Edit: Ok, Ridelynn, did some reading/research. I officially retract my previous statement. Fermi was a pretty epic engineering screwup.

    Originally posted by stefanakisgr
    Originally posted by Ridelynn

    Originally posted by Hrimnir
    And if we're being honest, Nvidia just simply doesn't make these epic of engineering screwups. The whole 3.5gb of memory thing was far and away Nvidias worst screwup and in reality it was something that made an almost insignificant difference in performance. And that whole fiasco wasn't like it wasn't designed that way on purpose, the issue was that they (technically) lied to customers.

    Are you sure about that?

    First generation Fermi comes to mind immediately as possibly the most epic, and AMD had the performance crown there for a good while.

    Indeed , also fx 5xxx series , where nvidia didnt bother to follow DX specs while lying about it. Both firms lie on occasion , and that's a fact. AMD was king for a while , and I owned AMD , I like Nvidia lineup these days and own nvidia. Will move to AMD again as soon as I feel its worth it. Or when enough time has passed.

    It is very important that there is competition. Prices go down , they drive each other to innovation , etc. Dont stick to one brand just because. If one brand wins , they will be the only winners. Consumers will lose , big time.

    First, I agree 100% with your second point.

    To your first point, I was referring strictly to engineering, i.e. the physical design of the card/chip.  Nvidia has definitely had some questionable decisions and business practices in the past; I won't even try to deny that.  But as far as the actual engineering/design of the hardware goes, they've always been top notch.

    Edit: Ok, Ridelynn, did some reading/research. I officially retract my previous statement. Fermi was a pretty epic engineering screwup.

    "The surest way to corrupt a youth is to instruct him to hold in higher esteem those who think alike than those who think differently."

    - Friedrich Nietzsche

  • HrimnirHrimnir Member RarePosts: 2,415
    Ok, Ridelynn, did some reading/research.  I officially retract my previous statement.  Fermi was a pretty epic engineering screwup.

    "The surest way to corrupt a youth is to instruct him to hold in higher esteem those who think alike than those who think differently."

    - Friedrich Nietzsche

  • daltaniousdaltanious Member UncommonPosts: 2,381
    Originally posted by Thane

    each user has to believe in AMD once, and buy at least one of their cards.

    after that, that user is usualy healed of the AMD idea.

    Not really; my last 3 cards were AMD, with better or the same performance for less money. And I'm not talking about some fancy math stats ... I'm talking about actually playing. My son has Nvidia, which is way noisier and not better if I compare similar cards. Yes, AMD's problem was once slow and bad driver updates ... but that hasn't been true for a long time. Still, this does not mean I'm married to AMD :-), I will decide at the next purchase ... but for sure I have no reason not to consider AMD.

  • JohnP0100JohnP0100 Member UncommonPosts: 401

    I'm not terribly surprised that Quizzical decided to fanboy over AMD; facts be damned!

    What is interesting is that there are people who reported a lot of this before the official announcement and got it right.

    These are the people who you should follow for the next release. They've already proven that they know what they are talking about.

    I'm also not convinced that '14nm will save AMD!' is a good argument when it is still (at best) 12 months away from a product in the store.

    It also makes no sense to believe the 'hype' again just after AMD let everyone down. Remember the 'fastest GPU ever' line from a week ago by AMD? Yeah...

    It shows what PvP games are really all about, and no, it's not about more realism and immersion. It's about cowards hiding behind a screen so they can bully other defenseless players without any risk of direct retaliation like there would be if they acted like asshats in "real life". -Jean-Luc_Picard

    Life itself is a game. So why shouldn't your game be ruined? - justmemyselfandi

  • GdemamiGdemami Member EpicPosts: 12,342


    Originally posted by JohnP0100
    I'm not terribly surprised that Quizzical decided to fanboy over AMD; facts be damned!

    Hehe yeah, he rarely ever bothers with actual facts or data. If there is by chance need for any, he will simply make them up :-)

  • 13lake13lake Member UncommonPosts: 719

    Fury X tested with the wrong drivers: http://forums.hardwarezone.com.sg/hardware-clinic-2/%5Bgpu-review%5D-sapphire-amd-r9-fury-x-rise-5087633-41.html

    Graphs start at post 612; there's a two-digit difference in the full name of the 15.5 drivers between these and the drivers used by every other tester.

    Oh, and the card is out of stock everywhere; it completely sold out worldwide in 24 hours.

  • MothanosMothanos Member UncommonPosts: 1,910

    AMD has 1/10 the budget of Nvidia.

    Lots of benchmarkers used an older driver, and even the newest driver wasn't mature yet.

    DirectX 12 is close, and AMD is showing up to 30% better performance on draw calls than Nvidia.
    How this will translate to better gaming performance is unknown, but frame pacing will be a ton better, and who knows what more.


    I am not a fanboy, but people need to realise that without AMD you will get a monopoly, which doesn't do the industry any good, let alone us as consumers.

    For me personally the aging of AMD GPUs has been better than Nvidia's; my 7950 is still a beast and it overclocks like a legend, 1200 / 1520, and even The Witcher 3 with a few settings turned down gives me a smooth 60 fps 95% of the time.

    It's no secret AMD is having more difficulties with their driver department, so let me say this: imagine AMD with the same budget as Nvidia, and the same driver department, and you have a dead even or better GPU.

    I dislike a ton of things about both corporations, especially the tricks Nvidia has been pulling lately; the overpricing of their GPUs has skyrocketed in past years. And again, what would happen if AMD were to cease to exist?

    A 970 for 700 euro?
    A 980 for 900 euro?
    A 980 Ti for 1200 euro?

    I heard Nvidia is dropping prices to combat AMD. Doesn't this sound like music to your ears?

    Also, the big bang comes with Pascal / the 400 series, as 4K is still a joke on the Fury X / Titan X at high or ultra settings.

    And really, a Fury X with the right drivers, or an insanely overclocked Titan X, is still poop to play on at 4K ultra, dipping to 30 fps in demanding games.

    And man, seeing fanboys go full retard is like looking at apes throwing poop at each other.
    Hilarious to see, but sad at the same time, as you know they lack the brain cells to be intelligent, although sometimes I think apes might have a few more brain cells than these pathetic fanboys.


  • 13lake13lake Member UncommonPosts: 719

    It seems it could be possible that none of the reviews on any review site from before a few hours ago were done on the June 20th 15.15 drivers.

    Possibly because AMD didn't release the driver anywhere, nor to anyone, before a few hours ago today.

    Driver fun :)

  • QuizzicalQuizzical Member LegendaryPosts: 25,507
    Originally posted by Mtibbs1989

    Are you ignoring the fact that people have to buy a Free-Sync enabled monitor as well? 

    http://www.amd.com/en-us/innovations/software-technologies/technologies-gaming/freesync#monitors

    A list of monitors that currently support Free-Sync. The only one currently available is for $599.99. Making Nvidia's list much more enticing as it ranges from $399.99 to $799.99. To say that Free-Sync is going to save the consumer money is a fallacy as it's no better than G-Sync.

    So if AMD slaps "Free" onto their product people are going to simply bait into it? Sheep; nothing less nothing more.

    Wake-up call: You have to buy a monitor that supports Free-Sync.

    I am going to buy some monitors anyway.  I'm currently still using 1280x1024 monitors.  I thought 1920x1080 was a stupid resolution for a computer monitor and not meaningfully better than the 1280x1024 that I had, so I skipped it.  Useful gaming monitors with higher resolutions and prices that aren't outlandish are a recent development.

    But the question is which monitors to get, especially as I'm buying them at the same time as the video card.  If I go with AMD, the monitors are $600 each.  If Nvidia, $750 each.  Multiply by three monitors and that's a $450 difference.

    -----

    For a monitor that is a few years old, of course it won't support FreeSync or G-sync.  But a year from now, if you're making a nice monitor anyway and supporting FreeSync adds nothing to the price tag, why not?  Supporting G-sync would add $150 to the price tag, so that's a good reason why not to add G-sync support.  And if you're looking to buy a new monitor and it's the same price whether you get one with adaptive sync as without it, why avoid it?

  • QuizzicalQuizzical Member LegendaryPosts: 25,507
    Originally posted by Hrimnir

    The problem is this could very well put them out of business.  Even worse is that by the time they figure out and sort out these potential issues you speak of, Nvidia will likely have pascal out or be very close to pascal being out.  And if we're being honest, Nvidia just simply doesn't make these epic of engineering screwups.  The whole 3.5gb of memory thing was far and away Nvidias worst screwup and in reality it was something that made an almost insignificant difference in performance.  And that whole fiasco wasn't like it wasn't designed that way on purpose, the issue was that they (technically) lied to customers.

    Once everyone has moved to 14/16 nm GPUs, whose first effort at HBM on 14/16 nm do you think will work better:

    1)  A company that co-invented HBM and has already released a commercial product on it to give plenty of data on what worked and what didn't and learn from it, or

    2)  A company that has never tried to implement HBM before and is taking its very first shot at it?

    Remember how GDDR5 worked?  AMD launched their first GDDR5 card in 2008, the Radeon HD 4870.  They soon after launched two other dies with GDDR5, the 4890 and 4770, to get some experience with GDDR5, and in the latter, 40 nm.  By the time the 5000 series was ready later in 2009, AMD had GDDR5 controllers down and had good, working controllers.

    Nvidia, meanwhile, launched their first GDDR5 controller on the GeForce GT 240.  They launched several other chips in the GeForce 400 and 500 series with GDDR5 controllers that were various degrees of broken.  Nvidia's first GDDR5 controller that was actually good wouldn't arrive until well into 2012, nearly three full years after AMD.

    And as for engineering screw-ups, the entire GeForce 400 series dwarfs anything AMD's graphics division has done since at least the Radeon HD 2900 XT, and possibly since they bought ATI in the first place.  If a chip with yields so bad that you can't release a fully functional die at all, even for $4,000 Quadro and Tesla cards, isn't a colossal screw-up, then I don't know what is.
