
AMD, lol

Kiyoris Member Rare Posts: 2,130

The new R7 370 uses the Radeon 8770 chip from 2012.

Just lol. How can these cards even be called new? The only new card is the Fiji-based Fury X, an expensive card that is still under NDA until next Wednesday.

Rebrand AMD.


Comments

  • Cleffy Member Rare Posts: 6,414
    Both companies rebrand, and both have been on this process node for some time. Why would they release new cards on the same process node and architecture?
  • Mondo80 Member Uncommon Posts: 194
    It is a $150 card; what are you expecting?
  • zaberfangx Member Uncommon Posts: 1,796
    Pretty much both companies do it. When they have chips left over, they'll end up changing the name to sell you the older chips.
  • Quizzical Member Legendary Posts: 25,531

    It's a respin, not purely a rebrand.  It's kind of like how the GeForce 500 series was awfully similar to the GeForce 400 series (outside of the low end GeForce GT 520, which was new), but better because they fixed some stuff by redoing the parts.  Except that the previous AMD parts weren't horribly broken like the GeForce 400 series.  It's more comparable to going from Trinity to Richland, or likely from Haswell to Devil's Canyon.

  • Ridelynn Member Epic Posts: 7,383

    How many generations did nVidia use the Tesla G92 chip? 4 generations across nearly as many years, and in various configurations at least 14 different SKUs that I can count. It wasn't retired until Fermi released.

    ATI/AMD has never done anything quite that bad - the sheer number of SKUs is the worst crime there, because that's a lot of confusion. Those who live in glass houses should throw no stones. Hawaii has only been out for a couple of years, and with this new release it's on its second generation; Pitcairn/Curaçao XT just hit 3 years, and this will be its third generation.

    Does a re-release/re-spin make it somehow inferior? It's still just as capable as it was when it released, it gets continuous tweaks upon each re-release, and while yeah, it may not be new and shiny, that doesn't make it any less capable, and it stands to illustrate just how versatile and powerful the design was on its initial release -- that goes for when nVidia does it as well as when AMD does it.

  • mbrodie Member Rare Posts: 1,504

    Fury - $549 / Fury X (watercooled) - $649

    http://www.maximumpc.com/amd-announces-fury-and-300-series-graphics-cards/

    What NDA? All the information is there, and on paper they look nice.

     

  • NorseGod Member Epic Posts: 2,654
    AMD is second rate. You get what you pay for.
    To talk about games without the censorship, check out https://www.reddit.com/r/MMORPG/
  • Gestankfaust Member Uncommon Posts: 1,989
    Originally posted by Kiyoris

    The new R7 370 uses the Radeon 8770 chip from 2012.

    Just lol. How can these cards even be called new? The only new card is the Fiji-based Fury X, an expensive card that is still under NDA until next Wednesday.

    Rebrand AMD.

    You are one of those who think they know something... you don't.

     

    Have fun with your post though

    "This may hurt a little, but it's something you'll get used to. Relax....."

  • booniedog96 Member Uncommon Posts: 289
    Originally posted by NorseGod
    AMD is second rate. You get what you pay for.

    By second rate do you mean:

     - because they were the first to break the 1 GHz speed barrier on a CPU?

     - because they introduced multi-core CPUs to the average household?

     - because they were first to use GDDR5 memory?

     - because they were the first to put a GPU on-die? (not integrated graphics; yes, there is a difference)

     - because Mantle paved the way for DX12?

     - because they have successfully implemented HBM, furthering the evolution of the GPU?

     

    Is that what you are referring to as second rate?

  • Leon1e Member Uncommon Posts: 791
    Originally posted by booniedog96
    Originally posted by NorseGod
    AMD is second rate. You get what you pay for.

    By second rate do you mean:

     - because they were the first to break the 1 GHz speed barrier on a CPU?

     - because they introduced multi-core CPUs to the average household?

     - because they were first to use GDDR5 memory?

     - because they were the first to put a GPU on-die? (not integrated graphics; yes, there is a difference)

     - because Mantle paved the way for DX12?

     - because they have successfully implemented HBM, furthering the evolution of the GPU?

     

    Is that what you are referring to as second rate?

    You forgot to mention that they introduced the 64-bit extension to the x86 instruction set, which powers just about every PC and the current generation of consoles in every household. But I like you! Dope.

  • Quizzical Member Legendary Posts: 25,531
    Originally posted by Ridelynn

    How many generations did nVidia use the Tesla G92 chip? 4 generations across nearly as many years, and in various configurations at least 14 different SKUs that I can count. It wasn't retired until Fermi released.

    ATI/AMD has never done anything quite that bad - the sheer number of SKUs is the worst crime there, because that's a lot of confusion. Those who live in glass houses should throw no stones. Hawaii has only been out for a couple of years, and with this new release it's on its second generation; Pitcairn/Curaçao XT just hit 3 years, and this will be its third generation.

    Does a re-release/re-spin make it somehow inferior? It's still just as capable as it was when it released, it gets continuous tweaks upon each re-release, and while yeah, it may not be new and shiny, that doesn't make it any less capable, and it stands to illustrate just how versatile and powerful the design was on its initial release -- that goes for when nVidia does it as well as when AMD does it.

    To be fair to Nvidia, they did have a half-node die shrink in there at some point.  So it was really two generations of G92 (8800 GT, 9800 GTX) and three generations of G92b (9800 GTX+, GTS 150, GTS 250), not counting salvage part, laptop, and Quadro variants.  And furthermore, the launch dates for all of those were within two years.

    -----

    What's really driving the rebrands and respins is the lack of a new process node to move to.  AMD moved to 28 nm around the start of 2012.  It's now mid-2015, and 28 nm is still the state of the art--and likely to remain so for another year or so.

  • Leon1e Member Uncommon Posts: 791
    Fiji on 22nm or 14nm will be a beast though. 
  • Mikeha Member Epic Posts: 9,196

    AMD is also working on innovations in VR.

     

  • Quizzical Member Legendary Posts: 25,531
    Originally posted by Leon1e
    Fiji on 22nm or 14nm will be a beast though. 

    Fiji is a 28 nm chip.  There's never going to be a straight die shrink of it, as by the time a new, better process node is ready, HBM2 will presumably also be ready--and allow more than 4 GB of memory (see the rough sketch at the end of this post).

    GCN is also 3 1/2 years old, which is ancient in GPU architecture years.  It's not merely older than Maxwell; it's also older than Kepler, Maxwell's predecessor.

    The next process node will presumably be 16 nm if it's TSMC or 14 nm if it's Samsung or Global Foundries.  Internet rumors say Samsung, and it's been a while since the reality at Global Foundries had much in common with its roadmaps of a year or two prior.

    I don't expect to see any GPUs built on 22 nm ever, outside of Intel's Ivy Bridge and Haswell integrated graphics.  The other major foundries don't seem to be hitting 22 nm at all, instead going from 28 nm to 20 nm, then 14 or 16.
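    For anyone curious where the 4 GB cap comes from, here is a back-of-envelope sketch in Python. The HBM1 figures line up with the published Fury X specs (four 1 GB stacks, 512 GB/s total); the HBM2 figures are assumptions based on the numbers that have been floated publicly, not anything official.

    ```python
    # Back-of-envelope HBM capacity/bandwidth math.
    # HBM1 numbers match Fiji/Fury X public specs; HBM2 numbers are assumptions.

    def hbm_totals(stacks, dies_per_stack, gbit_per_die, bus_width_bits, gbps_per_pin):
        """Return (total capacity in GB, total bandwidth in GB/s) across all stacks."""
        capacity_gb = stacks * dies_per_stack * gbit_per_die / 8     # gigabits -> gigabytes
        bandwidth_gbs = stacks * bus_width_bits * gbps_per_pin / 8   # Gbit/s -> GB/s
        return capacity_gb, bandwidth_gbs

    # HBM1 on Fiji: 4 stacks, each 4 dies x 2 Gb, 1024-bit interface at 1 Gbps per pin
    print(hbm_totals(4, 4, 2, 1024, 1.0))   # -> (4.0, 512.0): 4 GB, 512 GB/s

    # HBM2 (assumed): 4 stacks, each 8 dies x 8 Gb, 1024-bit at 2 Gbps per pin
    print(hbm_totals(4, 8, 8, 1024, 2.0))   # -> (32.0, 1024.0): 32 GB, 1 TB/s
    ```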

  • Leon1e Member Uncommon Posts: 791
    I assume they left some future-proofing in this tech. After all, it took them years to develop (longer than expected), and the GTX 9xx series went unmatched for over 8 months (if we agree that the 290X was released as competition to the 780 Ti). If I were AMD, I'd be out for blood right now. A reasonable die shrink could deal with the heat. Seeing that the Fury X comes with water cooling as stock, the performance per watt might not be better than what Nvidia has. Can't wait for the actual benchmarks, for sure. On paper the Fury (X) looks damn good.
  • NorseGod Member Epic Posts: 2,654
    Originally posted by booniedog96
    Originally posted by NorseGod
    AMD is second rate. You get what you pay for.

    By second rate do you mean:

     - because they were the first to break the 1 GHz speed barrier on a CPU?

     - because they introduced multi-core CPUs to the average household?

     - because they were first to use GDDR5 memory?

     - because they were the first to put a GPU on-die? (not integrated graphics; yes, there is a difference)

     - because Mantle paved the way for DX12?

     - because they have successfully implemented HBM, furthering the evolution of the GPU?

     

    Is that what you are referring to as second rate?

    But does it work? For how long in comparison?

    Is it compatible with anything else?

    Are all those snappy advances in technology you listed supported in any game? If so, are they bottlenecked anyway and wasted resources?

    When those snappy new technologies are supported, will they be done better by someone else, as usual?

    To talk about games without the censorship, check out https://www.reddit.com/r/MMORPG/
  • goth1c Member Uncommon Posts: 79

    This is such a "flame" topic, I think we got what we got today cause Nvidia and AMD exist and they have to "keep up their game" if not the other one will get in front, if one didnt existed we would all still be playing in "8bit".

       Never the less Nvidia in my opinion is still in front of the AMD, as launched Maxwell as AMD had to rebrand is stuff, think AMD only rebranded this stuff cause of windows 10 and its DX12, the fury is probaly just gona be a "product unfinished" and rushed.

    And honestly what % of the people who buys graphics cards spend money in a 550$+ cards? I am  pretty sure is damn low... I feel disapointed at AMD and pretty sure not the only one, i really wonder how long they can keep like this, cause i am sure they arent making as much as Nvidia...

    I guess this is more a rant and disapointment at AMD as i have waited for their new cards the 300 series, and got nothing new just old stuff overclocked big disapointment didnt wort the wait for sure...

    Just can hope next year they can get their game up so Nvidia as to do the same too.

     

  • Ridelynn Member Epic Posts: 7,383


    Originally posted by goth1c
    This is such a "flame" topic ....   Never the less Nvidia in my opinion is still in front of the AMD, as launched Maxwell as AMD had to rebrand is stuff, think AMD only rebranded this stuff cause of windows 10 and its DX12, the fury is probaly just gona be a "product unfinished" and rushed.And honestly what % of the people who buys graphics cards spend money in a 550$+ cards? I am  pretty sure is damn low... I feel disapointed at AMD and pretty sure not the only one, i really wonder how long they can keep like this, cause i am sure they arent making as much as Nvidia...
     

    You know, when Maxwell first shipped, it was only in the high end products as well, still isn't in a product below the $200 average price point, and they haven't even released replacements for the 700-series for their lower tiers as of yet.

    So about that flame topic...

  • Alpha_Chino Member Uncommon Posts: 36
    Love my HD 7950. Throttle my GPU core and my 3770K says, aight, I got dis.
  • Gdemami Member Epic Posts: 12,342


    Originally posted by Ridelynn
    You know, when Maxwell first shipped, it was only in the high end products as well, still isn't in a product below the $200 average price point, and they haven't even released replacements for the 700-series for their lower tiers as of yet.
    So about that flame topic...

    The first desktop Maxwell GPUs released were the GTX 745 and the 750/750 Ti, hardly high-end products.

    You really have difficulty getting your facts right...

  • Ridelynn Member Epic Posts: 7,383

    You're right, the GM107 - I forgot about that one because, well, who cares about it really.

    I had meant the GM200 series, but you're correct, as always.

  • Quizzical Member Legendary Posts: 25,531
    Originally posted by Ridelynn

     


    Originally posted by goth1c
    This is such a "flame" topic ....

     

    Nevertheless, Nvidia in my opinion is still in front of AMD, as it launched Maxwell while AMD had to rebrand its stuff. I think AMD only rebranded this stuff because of Windows 10 and its DX12; the Fury is probably just going to be an unfinished and rushed product.

    And honestly, what percentage of the people who buy graphics cards spend money on $550+ cards? I am pretty sure it is damn low... I feel disappointed in AMD, and I'm pretty sure I'm not the only one. I really wonder how long they can keep going like this, because I am sure they aren't making as much as Nvidia...
     


     

    You know, when Maxwell first shipped, it was only in the high end products as well, still isn't in a product below the $200 average price point, and they haven't even released replacements for the 700-series for their lower tiers as of yet.

    So about that flame topic...

    Don't forget the GTX 750 and GTX 750 Ti:

    http://www.newegg.com/Product/Product.aspx?Item=N82E16814127788

    Those are Maxwell, too.

    Really, though, what happens when AMD launches a consumer APU with HBM on board that handily beats contemporary $100 video cards?  Give it a couple of years and it's probably coming.  Lower end discrete cards are going away, which is part of why the GTX 750 is Nvidia's lowest end retail Maxwell part for desktops.
