AMD Hawaii to be announced in two days; 512-bit memory bus; faster than Titan for $600?


Comments

  • QuizzicalQuizzical Member LegendaryPosts: 25,507
    Originally posted by Ridelynn

    Well, there's also the difference between nVidia Boost and AMD PowerTune

    To a large extent, Titan cards are all already OCed about as far as they can go, given that Boost works by starting at a low default clock and then scaling upward as far as it can until it hits a cap or a thermal/power limit.

    Sure, you can find cases where you are hitting the cap and just raise the cap, and you can put on premium coolers to help prevent hitting thermal limits, but really that doesn't require special BIOSes or anything - Boost already has it built in. You also have to rely on Boost to give you performance, and that will vary from title to title because you start low and have to build up.

    An overclock doesn't really do much. You can start at a higher base clock, but not hugely higher, because you can run into some bad thermal/power problems, and nearly every title gets accelerated by Boost past the base clock setting anyway. You can lift the caps, but you're still relying on Boost to drive you to them, and assuming you're not going to get throttled on anything before that.

    Overclocks largely get bypassed by the Boost mechanism, so you don't see much benefit; on the flip side, you're almost always able to get maximum performance from your card, because it's in effect automatically overclocking itself in a "safe" manner. You may be able to make it more aggressive, but you're not really going to affect Boost much by itself. If you lift the power/thermal cap you could get into some trouble, and if you raise the base clock too high you will definitely get into trouble, but raising the clock cap is safe because you'll likely get saved by the power/thermal caps (and largely ineffectual, because you're probably already getting saved by one of them).

    AMD PowerTune works differently: you start at a high clock, and if the card senses a thermal/power limit, it throttles you down until you're safe. Every title starts at the base clock and only comes down if required. Here an overclock affects everything, since you already start high and the overclock goes right on top of that. Overclock too far and PowerTune will rein you back in (to a point). You can lift the PowerTune cap and get into some trouble, but aside from that it's pretty safe to OC, and you get nearly the full benefit of it all the time.

    So, even if you could put out a "turbocharged" Titan, it wouldn't matter too much; you'd still be bound by Boost. The best thing you could do is put a better cooler on it, which keeps you from hitting the thermal cap as often, and that doesn't require any fiddling with BIOS settings or clocks or anything else, because it's already part of Boost.
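    For anyone who finds the two strategies easier to read as code, here's a toy sketch. Every number in it (the step size, clocks, and watts-per-MHz figure) is invented for illustration; neither vendor's real algorithm works on fixed steps like this:

```python
# Toy contrast of the two clock-management strategies described above.
# All constants are invented for illustration only.

STEP_MHZ = 13  # one "bin" of clock adjustment

def boost_style(base_mhz, cap_mhz, power_limit_w, w_per_mhz):
    """Start at a low base clock, step upward until a cap or limit is hit."""
    clock = base_mhz
    while (clock + STEP_MHZ <= cap_mhz
           and (clock + STEP_MHZ) * w_per_mhz <= power_limit_w):
        clock += STEP_MHZ
    return clock

def powertune_style(base_mhz, power_limit_w, w_per_mhz):
    """Start at the full base clock, throttle down only if over the limit."""
    clock = base_mhz
    while clock * w_per_mhz > power_limit_w:
        clock -= STEP_MHZ
    return clock

# Same power budget, two starting philosophies:
print(boost_style(837, 1150, 250, 0.24))   # builds up toward the limit
print(powertune_style(1000, 250, 0.24))    # starts high, drops only if forced
```

    The upshot is the same as in the post: with the Boost style, your final clock is whatever the algorithm builds up to, so a manual overclock mostly gets absorbed; with the PowerTune style, a higher starting clock carries through unless the limiter has to intervene.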

    AMD has already moved toward "PowerTune with Boost", clocking cards above the nominal clock speeds when power and heat allow it.

    You know what could make Titan faster?  Rather than capping power at 265 W as they do now, Nvidia could make a card that caps it at 300 W instead.  Or 350 W.  Or 375 W.  50% more power might "only" get you 20% more speed, but 20% is a lot.
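    As a back-of-the-envelope check on that trade-off: dynamic power scales roughly with V² · f, and holding a higher clock needs roughly proportionally more voltage, so power grows roughly with the cube of the clock. This is a crude sketch only (real cards sit at different points on the voltage/frequency curve), but under it, 50% more power buys roughly 14% more clock, the same ballpark as the 20% figure above:

```python
# Crude model: power ~ clock^3, so clock gain ~ cube root of power gain.

def clock_gain(power_ratio):
    """Approximate clock (and thus performance) multiplier for a power multiplier."""
    return power_ratio ** (1 / 3)

for cap_w in (300, 350, 375):
    ratio = cap_w / 265  # versus Titan's 265 W cap
    print(f"{cap_w} W cap: {ratio:.2f}x power -> ~{clock_gain(ratio):.2f}x clock")
```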

  • QuizzicalQuizzical Member LegendaryPosts: 25,507
    Originally posted by 13lake

    As the manufacturing process matures you get better yields, and some very good samples which can OC high and can be turned into Ultra models. The path of GK110 has been long, hard, and rocky, but if TSMC's process has finally matured enough, Nvidia might have been able to get a few thousand perfect+ chips that can be turned into an Ultra version of the Titan.

     

    It all depends on whether the wafers at TSMC disappoint; they have to deliver insanely good yields, or else there would only be a dozen or so Titan Ultra cards, which would be useless.

     

    What I'm getting at is that even though you feel like a turbocharged Titan is the maximum of the GK110 chip, remember that there are still disabled parts of the chip, and that if more than a thousand golden samples can be made, we will get a Titan which is miles better than the current Titan.

    GK110 launched more than a year after Tahiti.  By the time Titan launched, TSMC's 28 nm process node was pretty mature.  Larger dies intrinsically make it more likely that you have a flaw somewhere on the die.
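    The die-size point can be made concrete with the classic Poisson yield model: the fraction of defect-free dies is exp(-D·A) for defect density D and die area A. The die areas below are commonly cited approximate figures (Tahiti ~365 mm², GK110 ~561 mm²), and the defect densities are assumed purely for illustration:

```python
from math import exp

def poisson_yield(area_mm2, defects_per_cm2):
    """Fraction of dies with zero defects under a Poisson defect model."""
    return exp(-defects_per_cm2 * area_mm2 / 100)

# Assumed defect densities; a maturing process moves toward the lower one.
for d in (0.5, 0.2):  # defects per cm^2
    print(f"D={d}: Tahiti ~{poisson_yield(365, d):.0%} "
          f"defect-free, GK110 ~{poisson_yield(561, d):.0%}")
```

    The same defect density always hits the bigger die harder, which is why a large chip like GK110 benefits disproportionately from a mature process.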

  • 13lake13lake Member UncommonPosts: 719

    I love when I'm right; I always predict the future accurately :)

    The stock 290X, set to Uber power/temperature target mode, is tied with the Titan :):):) to within about 1-4 average fps across 1080p, 1440p, and 4K.

     

    http://hardocp.com/article/2013/10/23/amd_radeon_r9_290x_video_card_review

    http://www.anandtech.com/show/7457/the-radeon-r9-290x-review

    http://www.tomshardware.com/reviews/radeon-r9-290x-hawaii-review,3650.html

    http://www.techpowerup.com/reviews/AMD/R9_290X/

    http://www.extremetech.com/gaming/169402-to-slay-a-titan-amds-radeon-r9-290x-piledrives-nvidias-high-end-product-line/2

    http://www.guru3d.com/articles_pages/radeon_r9_290x_crossfire_vs_sli_review_benchmarks,17.html

     

    and here's the usual techpowerup comparison :

    Relative performance and per resolution performance : http://www.techpowerup.com/reviews/AMD/R9_290X/27.html

     

    To be frank, it's even better than I predicted. I predicted this would happen with aftermarket models, not base models; considering the stock model is overpowering the 780 and matching Titan at $550, which is $100 less than the 780 :) what are the aftermarket models gonna do?

    And Quizzical, you were wrong about Nvidia's response. Instead of lowering prices or unshackling the Titan to take back the throne of fastest GPU (which seems like the best choice), they have shot themselves in the foot and are releasing the 780 Ti. I mean, where is it gonna fit?

    Is it gonna be between the 780 and Titan? Then it's still not as good as the 290X, which is fail. Is it gonna be better than Titan? At what price, $300-400 less? Again fail.

    Nvidia is abandoning Titan as if it was Titanic :) which seems poetic given the similarities in name :)

    The ball is in Nvidia's court now. Let's see if they can make a smart decision, or if they're gonna refuse to lower any price on any high-end card, put out a 780 Ti at a bad performance-to-price ratio, and keep all the other cards at even worse performance-to-price ratios :)

     

     

  • AvatarBladeAvatarBlade Member UncommonPosts: 757
    So I have a question for you guys, since you obviously know more than me on this stuff. Do you reckon that the Nvidia 800 series, which will probably be on 20nm, will be worth the wait and the (probably) higher price than the 290X? Or don't you think the performance difference will be big enough to warrant it? Thanks.
  • RidelynnRidelynn Member EpicPosts: 7,383

    We haven't really seen a real response from nVidia yet with regard to the 290X.

    I will admit though - the benchmarks on the card look good. Equal/nearly equal to Titan at lower resolutions, and 10-25% faster at 4k.

    For $450 less - you could almost CF two of them for the same price as a single Titan.

    Sure, Titan runs cooler and is more power efficient. But it's nearly double the cost. That's the barrier nVidia needs to deal with. We haven't seen much regarding the 780 Ti, but given that the 780 is $100 more than a 290X, nVidia really didn't need a new card per se; they need to play with the pricing. The 780 Ti feels like a way for nVidia to provide Titan/Titan+ levels of performance from a die that's going to be cheaper to produce, so they can compete on price -- but I haven't seen much on it yet, so we'll just have to see.

    So that's all interesting in the Red vs Green department. On to the real technical changes:

    4K - it's coming, it will be the next big thing. More pixels than 3x1 display configurations, no bezels to deal with. The 290X may be a bit bleeding edge for it, but it is coming sooner rather than later.

    New Crossfire - no bridge needed. Will we actually need PCIe 3.0 and true x16 lanes now?

    Auto-support for multipanel 4k displays over DisplayPort only - will this help drive DisplayPort adoption, or will we still stick with HDMI for the most part?

    TruAudio - it's interesting. I don't know if it will matter though. Honestly it sounds about as useful as PhysX.

    New PowerTune - now it's almost identical to Boost 2.0. I do like that it's explained/exposed more clearly than Boost, though. It's easy to understand what your "caps" are, and therefore easier to understand what I'm adjusting. This also helps explain why the real power use is so much higher than the 7970's, despite similar TDPs.

    95C steady-state operating temperature. Wow. That is going to make whole-system cooling a bigger deal in systems using these cards. Aftermarket/custom coolers will make a big difference both in the operation of these cards (due to PowerTune) and in total system cooling (due to the higher normal operating temperature).

  • aspekxaspekx Member UncommonPosts: 2,167
    Originally posted by 13lake

    I love when i'm right, i always predict the future accurately :)

     

    will i ever find true love?

    "There are at least two kinds of games.
    One could be called finite, the other infinite.
    A finite game is played for the purpose of winning,
    an infinite game for the purpose of continuing play."
    Finite and Infinite Games, James Carse

  • RidelynnRidelynn Member EpicPosts: 7,383

    I don't think we'll see 20nm stuff anytime soon. A year out at best, and more likely well past that. We just went from 40nm to 28nm in the 600 series (nVidia) / 7000 series (AMD), and we were on 40nm for a while.

  • CaldrinCaldrin Member UncommonPosts: 4,505

    Well, they seem pretty cheap really :) Will have to see how the benchmarks come out.. but that's quite a bit cheaper than Titan..

     

     

  • 13lake13lake Member UncommonPosts: 719

    @AvatarBlade 20nm is tricky, especially TSMC's. Everybody seems to want to just skip it and go straight to 14nm/16nm. Quizzical made a very good post about it; I remember reading it a few weeks back here on the forums.

     

    TSMC's 20nm might be ready in February 2014 at the earliest, and the first to tape out on it seems to be AMD:

     http://www.xbitlabs.com/news/other/display/20131017231002_AMD_to_Tape_Out_First_20nm_14nm_FinFET_Chips_Within_Next_Two_Quarters.html

    If all those dates are correct, the Nvidia 800 series on 20nm might come a few months after February; but if the yields are bad, it will not come before summer, and 20nm 800-series cards will be September-October at the earliest.

     

    Worst case scenario: very late 2014 or early 2015. Best case scenario: May-June.

    If they launch the 800 series Maxwell on 28nm first, it's not gonna be worth buying.

    (Pricing is going to be ridiculously high; that's just Nvidia's style. If we go by past price raises, Nvidia is gonna try to push the 880 at $750, which is just crazy; let's hope they settle for $650. The best option is $550, but we're not that lucky. It also depends on how many problems there are with 20nm; if too many, expect both AMD and Nvidia at $650-$750 for flagships.)

     

    @Ridelynn

     

    In the links I gave they test 2x 290X in CrossFire and conclude that PCI Express 2.0 x16 is plenty for the two cards; heck, even 2.0 x8 is enough. You don't start running into bandwidth problems before 4K monitors, or even 2x 4K (UHD) monitors.

     

    Getting PCI Express 3.0 is a good precaution, but it certainly is not a reason to buy a new motherboard if you don't really need one for anything else except 2x 290X.

    Here's the 290X CrossFire review: http://www.techpowerup.com/reviews/AMD/R9_290X_CrossFire/
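    For reference, the per-lane numbers behind that conclusion come straight from the PCIe specs: gen 2 runs at 5 GT/s with 8b/10b encoding, gen 3 at 8 GT/s with 128b/130b. A quick sketch of the resulting per-direction bandwidth:

```python
# Per-direction bandwidth from the PCIe spec data rates and encodings.

def pcie_gbytes_per_s(gen, lanes):
    """Usable GB/s per direction for a given PCIe generation and lane count."""
    usable_bits_per_lane = {2: 5e9 * 8 / 10, 3: 8e9 * 128 / 130}[gen]
    return usable_bits_per_lane * lanes / 8 / 1e9

for gen, lanes in [(2, 8), (2, 16), (3, 16)]:
    print(f"PCIe {gen}.0 x{lanes}: "
          f"{pcie_gbytes_per_s(gen, lanes):.1f} GB/s per direction")
```

    So 2.0 x16 already gives 8 GB/s each way, and 3.0 x16 roughly doubles that, which is why the bridgeless CrossFire traffic only starts to matter at very high resolutions.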

  • AvatarBladeAvatarBlade Member UncommonPosts: 757
    Thanks 13lake.
  • DocBrodyDocBrody Member UncommonPosts: 1,926
    damn, my GTX 680 feels so outdated now. Guess I'll trash it tomorrow
  • RidelynnRidelynn Member EpicPosts: 7,383


    Originally posted by cura
    Originally posted by Ridelynn I don't think we'll see 20nm stuff anytime soon. A year out at best, and more likely well past that. We just went from 40nm to 28nm in the 600 series (nVidia) / 7000 series (AMD), and we were on 40nm for a while.
    Pretty interesting article on the subject 

    http://www.pcper.com/reviews/Editorial/Next-Gen-Graphics-and-Process-Migration-20-nm-and-Beyond


    Yup, I had read that article.

    Here's what I base my statement on, and I still stand by it:


    TSMC is not planning on opening up their 20 nm HKMG planar based lines until Q1/Q2 2014 with product being delivered in a Q3 timeframe. TSMC is ahead of the bunch so far with actually implementing a 20 nm line.

    If TSMC is way ahead of their competitors (with the exception of Intel, which isn't really a competitor), and they are looking at being able to deliver product in 3Q14 on an optimistic timetable, you aren't going to see any retail products based on that anytime soon. If chips are shipping early 3Q, you may see retail products late 3Q, but if they continue to have problems with the process, have poor yields, have to do design changes, etc., there is a lot of potential for delays.

    nVidia has used TSMC in the past (nearly exclusively), but:

    http://www.extremetech.com/computing/123529-nvidia-deeply-unhappy-with-tsmc-claims-22nm-essentially-worthless

    That page, if what it quotes from nVidia is true, makes a 3Q14 release almost laughable, and it looks entirely possible that nVidia may jump from TSMC to something else to help control costs. And if TSMC is the first to market with 20nm but it's too expensive to practically use, then we're gonna be waiting on GF to get their production up.

    Given this statement (from the URL above)


    Further evidence for the accuracy of NV’s presentation comes, ironically, from the company’s primary GPU competitor. At AMD’s Financial Analyst Day, CEO Rory Read made a point of saying that the company no longer intends to aggressively transition to new process nodes given the diminishing marginal returns from doing so.

    The most likely outcome is that we do get "next generation" sometime mid-late next year, but it's still at 28nm.

  • 13lake13lake Member UncommonPosts: 719
    Who is Nvidia gonna jump to? GlobalFoundries is way, way late and very unstable, as AMD's problems have shown in the past. That leaves what, Intel? :P I don't think Intel is gonna give the cake away, especially not to Nvidia, after all the animosity of the past few years.
  • QuizzicalQuizzical Member LegendaryPosts: 25,507

    Fast card, yes.  But it's too hot and too loud.  I wouldn't want one for those reasons alone.  I thought that the high temperatures could plausibly be in part due to putting the thermal sensor in a hotter spot, but hardware.fr's thermal imaging killed that idea pretty quickly.

    http://www.hardware.fr/articles/910-10/bruit-temperatures.html

    They have the R9 290X topping out at 98.6 C in quiet mode, while Titan doesn't go over 82.1 C.  Ouch.  And neither are you getting a quiet card in exchange for such high temperatures.  AMD says that the card can handle the heat because, well, of course they say it can.  What else are they supposed to say?  Don't buy this card because it's unreliable?

    Fortunately, hot and loud are readily fixed by putting a better cooler on it.  For example, a variant of the cooler for high powered cards used by Sapphire, MSI, Gigabyte, Asus, XFX, HIS... well, you get the idea.  Hopefully the reference card will be short-lived and we'll soon be awash in custom cards with superior coolers.  If you have a $50 case that needs an external exhaust card, then maybe you ought to upgrade the case before looking at a $550 card.

    And that's where the good news comes:  it's about as fast as Titan ($1000) and definitely faster than a GeForce GTX 780 ($650) while costing only $550.  That means a massive shakeup of the price/performance curve, at least at the high end.  A better cooler might add a bit of cost, but that's closer to $5 than $450.

    Furthermore, a better cooler doesn't just mean a cooler and quieter card.  Lower temperatures tend to mean less leakage, and that will likely mean lower power consumption, too.  That could allow higher clock speeds, and thus better performance.
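    To put a rough number on that leakage point: subthreshold leakage grows roughly exponentially with temperature. The baseline wattage and doubling interval in this sketch are assumed, illustrative figures, not measurements of Hawaii:

```python
# Rough illustration of temperature-dependent leakage. The 30 W baseline
# and 20 C doubling interval are assumed figures for illustration only.

def leakage_w(temp_c, base_w=30.0, base_temp_c=60.0, doubling_c=20.0):
    """Leakage power that doubles every `doubling_c` degrees above baseline."""
    return base_w * 2 ** ((temp_c - base_temp_c) / doubling_c)

# Reference cooler at 95 C vs. a good custom cooler at 70 C:
saved = leakage_w(95) - leakage_w(70)
print(f"~{saved:.0f} W less leakage at 70 C than 95 C (under these assumptions)")
```

    The exact constants vary a lot by process and voltage, but the shape of the curve is the point: the hotter the card runs, the more power it wastes just by being hot.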

    As Ridelynn said, we haven't seen Nvidia's response.  And with Nvidia having already announced the GeForce GTX 780 Ti, a response is certainly coming.

    It's pretty obvious what Nvidia can, and likely will, do:  slash prices on the GTX 780 to $500 or so, and then make the GTX 780 Ti basically Titan without as aggressive of low-power binning and with the GPU compute stuff crippled.  Open up the card to aftermarket designs, and let board partners clock it however they want.  That could probably make a card faster than Titan or the R9 290X, and then they could sell it for $600.

    Titan would then become pointless to buy for gaming, but it would retain its point as an entry-level GPU compute card.  Titan would still have about 8x the double-precision performance of a GTX 780 Ti, among other things, and that overwhelms the difference between Titan's $1000 price tag (which I don't expect to see cut) and the $600 or so for the cheaper GTX 780 Ti.

  • 13lake13lake Member UncommonPosts: 719

    The most worrying thing, though, is that 20nm seems like it won't make a big impact and will actually make power issues worse. Considering that the foundries' 14nm/16nm nodes are basically 20nm with better wattage/leakage control but the same density as 20nm, that would make 10nm/7nm the first processes to allow a 50% or more performance boost, and that leaves 5nm and we're done.

     

    That would mean silicon is done.

    If we won't get 2x more transistors by 5nm, or 200% more performance just on the hardware side without software tweaks, what's the point of even bothering with silicon? Shouldn't silicon-germanium nanowires, graphene, or another metamaterial be researched as a replacement for a 2017+ timeframe?
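    For context on why those nodes disappoint: the ideal density gain between node names scales with the square of the linear shrink, but the "14/16nm" FinFET generations largely reused 20nm-class metal pitches, so their real density gain was close to 1x despite the name. A quick sketch of the ideal numbers:

```python
# Ideal (linear-shrink-squared) density gain between node names.
# Real nodes rarely achieve this; the "14/16 nm" FinFET nodes in
# particular kept roughly 20 nm-class metal pitches.

def ideal_density_gain(old_nm, new_nm):
    """Best-case transistor density multiplier for a full linear shrink."""
    return (old_nm / new_nm) ** 2

print(f"28 -> 20 nm: {ideal_density_gain(28, 20):.2f}x ideal")
print(f"20 -> 14 nm: {ideal_density_gain(20, 14):.2f}x on paper, ~1x in practice")
```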

  • ClassicstarClassicstar Member UncommonPosts: 2,697

    The AMD 290X is the fastest single-GPU card and cheaper than Titan.
    Well, for me, I'm going 290X; it's cheap compared to Titan and it BEATS TITAN!!!

    Sure it's hot (I've got a good PC case) and the noise is a bit louder, but so what, I always play with headphones anyway.

    But man, for that price, around 500 euros, and faster than Titan, whether I should buy it or not is a no-brainer to me.

    Hope to build full AMD system RYZEN/VEGA/AM4!!!

    MB:Asus V De Luxe z77
    CPU:Intell Icore7 3770k
    GPU: AMD Fury X(waiting for BIG VEGA 10 or 11 HBM2?(bit unclear now))
    MEMORY:Corsair PLAT.DDR3 1866MHZ 16GB
    PSU:Corsair AX1200i
    OS:Windows 10 64bit

  • AvatarBladeAvatarBlade Member UncommonPosts: 757
    So, do you all think that a couple of months would be a decent estimate for the 290x to start getting custom cooling from different companies?
  • breadm1xbreadm1x Member UncommonPosts: 374

    Already got one in my PC over here with the BF4 bundle; gonna wait until the one without BF4 comes out to CrossFire.

    Was working with 2x GTX 570 in my game rig, so it's about time to upgrade :P

    Couldn't care less about aftermarket coolers, and don't care about the heat and noise either, since I've got an EK waterblock on its way.

    The price I paid for the Limited Edition R9 290X was €529; I estimate the normal one will go for about €480.

    So yeah, you actually PAY for the "free" BF4 you get with the limited edition, since it's just BF4 standard, which is €49.

    The only thing limited about it is the box with BF4 on it :)

     


  • RidelynnRidelynn Member EpicPosts: 7,383


    Originally posted by AvatarBlade
    So, do you all think that a couple of months would be a decent estimate for the 290x to start getting custom cooling from different companies?

    In the last couple of card releases, good aftermarket solutions were available nearly same-day as reference designs.

    The GPUs have changed technically a lot over the years, but in the past few generations the PCB layouts and designs haven't changed all that much, which makes it easy to recycle good cooling solutions.

    Expect to see Vapor-X, TwinFrozr, Windforce, and many other variants out sooner than later.

  • miguksarammiguksaram Member UncommonPosts: 835
    From what I've read, AMD is actually intentionally delaying the release of aftermarket cards for potentially a couple of months. I'm personally reserving judgment on this card until after those release, because this reference design just screams Fermi all over again.
  • AzureblazeAzureblaze Member UncommonPosts: 130

    Heat and noise are among the primary things I look at in a video card or any other component in my computer. I wouldn't get this card for that reason alone; hopefully the aftermarket ones are better, but how much better will be interesting to see.

    I'll stick with my GTX 780 SC with the ACX cooler (EVGA) for now.

  • QuizzicalQuizzical Member LegendaryPosts: 25,507
    Originally posted by AvatarBlade
    So, do you all think that a couple of months would be a decent estimate for the 290x to start getting custom cooling from different companies?

    How quickly that happens depends on a number of factors.  One is when and whether AMD or Nvidia decide to allow custom cards; they don't show up until AMD and Nvidia give the OK.  It would be incredibly stupid for AMD to say no custom cards when the reference card is this bad, so AMD has probably already told board partners that they can ship custom cards when they're ready.

    Another factor is whether there is a shortage of the cards.  If AMD or Nvidia can't deliver enough GPU chips for a while, and your cards are going to sell out as soon as they make it to retail, you might as well go with the reference card to get it to retail faster.  That's part of why reference cards exist in the first place.  Once there are enough GPU chips that having a better cooler makes people willing to buy your version of a card rather than someone else's, you really want to have custom cards out there.  As AMD is launching the cards on a mature process node, it would be shocking if there are shortages akin to what we saw at the launch of the Radeon HD 5870/5850.

    A third factor is how much new design work you have to do to create a custom card.  It's not just sticking a cooler on it.  You have to figure out how long of a PCB you're going to have, with how many layers, where you're going to put VRMs, memory chips, monitor ports, and various other components.  Furthermore, the cooler doesn't just have to cool the GPU chip; VRMs need cooling, too, and memory chips and some other things like to see some airflow.

    This third factor is probably the important one for the Radeon R9 290X.  The R9 280X and R9 270X saw custom cards immediately because board partners could take existing boards built for a Radeon HD 7970 or 7870, stick the new bin of the old GPU chip in, set the clock speeds accordingly, and not need to change much.  But Hawaii is a new GPU chip, meaning different pins to the PCB, different power requirements, different memory requirements, and so forth.  If you use any board from another card as a starting point, you're going to need major changes to it.  Among other things, Hawaii is the first GPU chip with a 512-bit GDDR5 memory bus.

    It takes time to do the custom engineering to bring custom cards to market, and then it takes time to ship those new cards across the ocean.  When cards show up at retail depends greatly on when board partners were able to start on that, and there's only so much you can do until you have GPU chips on hand.  I'd expect to see new cards show up over the course of the next month or two, but that's really just guessing.
