AMD CEO confirms ray tracing GPUs in development

AmazingAvery · Age of Conan Advocate · Member Uncommon · Posts: 7,188
TL;DR quotes:

"Saying that AMD is "deep in development" of their own ray tracing technology and that this development "is concurrent between hardware and software."

“I think ray tracing is an important technology, and it’s something we’re working on as well, both from a hardware and software standpoint”

Tom’s Hardware: How do you think ray tracing, in particular, fits into the future of GPUs, and are you working to address that type of rendering with specific optimizations for Radeon VII?

Lisa Su: I think that ray tracing is an important capability and we view that as important. We are continuing to work on ray tracing both on the hardware side and the software side, and you’ll hear more about our plans as we go through the year.

---
Recent rumours suggested that AMD's 7nm Navi GPU was supposed to have been talked about at CES; however, we heard next to nothing. The sources behind those same rumours came back saying, more or less, that the Radeon VII was a stopgap rushed out because Navi had run into serious hardware problems, which in turn led to the total silence at CES.

Is there a connection between Navi's absence at CES and today's news that AMD is working on ray tracing? Unsure, but with today's interviews in the tech press, AMD is effectively confirming that Nvidia was right to go with ray tracing, simply because they are following suit. We don't yet know what, if any, dedicated hardware AMD will use, but it will be interesting to see. There are even hints at possible ray tracing support on next-gen consoles and an E3 (June) announcement.

Another way to look at it is "we are late to ray tracing and will not be ready until lots of RT-supported games are on the market," which is mighty convenient, IMO. Nevertheless, RT in gaming right now is also in a sad state.

Sources - 
https://www.overclock3d.net/news/gpu_displays/amd_has_ray_tracing_gpus_in_development/1
https://www.tomshardware.com/reviews/amd-7nm-lisa-su-interview,5961-2.html
https://www.pcworld.com/article/3332205/amd/amd-ceo-lisa-su-interview-ryzen-raytracing-radeon.html



Comments

  • Ridelynn · Member Epic · Posts: 7,383
    edited January 2019
    Given AMD's traditionally compute-heavy architecture, and the fact that DX12 ray tracing (DXR) is a software API, there isn't really any technical reason AMD couldn't turn it on now.

    It would probably be nearly worthless right now, but I bet a Vega would make a showing.

    In a couple of years and another architecture jump or two, it's entirely feasible AMD could support this with little to no dedicated hardware required.
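
    For what it's worth, whether a driver exposes DXR at all is just a feature query in the public D3D12 API, regardless of whether the implementation underneath is dedicated hardware or plain compute shaders. A minimal sketch of the check (untested; assumes the Windows 10 1809+ SDK and linking against d3d12.lib):

    ```cpp
    #include <d3d12.h>
    #include <wrl/client.h>
    #include <cstdio>

    using Microsoft::WRL::ComPtr;

    int main() {
        // Create a device on the default adapter (the feature level choice is arbitrary here).
        ComPtr<ID3D12Device> device;
        if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_12_0,
                                     IID_PPV_ARGS(&device)))) {
            std::puts("No D3D12 device available.");
            return 1;
        }

        // OPTIONS5 carries the DXR tier the driver reports.
        D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts5 = {};
        if (SUCCEEDED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                                  &opts5, sizeof(opts5)))) {
            // TIER_1_0 or above means DXR calls will work; the API does not
            // care whether the driver services them with RT cores or compute.
            if (opts5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0)
                std::puts("DXR exposed (tier 1.0+).");
            else
                std::puts("DXR not exposed by this driver.");
        }
        return 0;
    }
    ```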
  • Quizzical · Member Legendary · Posts: 25,501
    What matters is not when AMD officially announces details about Navi, nor even when it officially launches.  What matters is when the cards are widely available for purchase, how good they are, and how much they cost.  Rumors have long put Navi as a mid-2019 product.

    Ray tracing is in DirectX now, and that surely came after Microsoft had extensive consultation with Nvidia, AMD, and even Intel.  Hopefully Navi will have a bit of dedicated silicon for ray tracing in whatever way makes sense.  Getting it right is more important than getting it first, as the ecosystem isn't ready for heavy ray tracing yet.

    Ray tracing won't really be that important until there are games that have to have ray tracing on to work properly.  That's surely several years away.  But it's not going to happen the day after AMD launches their first ray tracing card.  The sooner that both AMD and Nvidia have full lineups out that can handle it--not just a few high-end cards that are way out of the budget of most people--the sooner that clock can start ticking.

    As for rumors of hardware problems, there are always hardware problems, especially for a new architecture, a new process node, or both at once.  If it means an extra metal layer respin, that merits a shrug.  It's if it requires a base layer respin (Fermi may be the only GPU example of this ever) or the architecture just doesn't work how they hoped (see Kepler, Radeon HD 2000 series, Bulldozer, NetBurst) that it's trouble.
  • Quizzical · Member Legendary · Posts: 25,501
    Ridelynn said:
    Given AMD's traditionally compute-heavy architecture, and the fact that DX12 ray tracing (DXR) is a software API, there isn't really any technical reason AMD couldn't turn it on now.

    It would probably be nearly worthless right now, but I bet a Vega would make a showing.

    In a couple of years and another architecture jump or two, it's entirely feasible AMD could support this with little to no dedicated hardware required.
    On the one hand, yes, they surely could have a purely software implementation of ray-tracing.  Nvidia could have done that with Pascal, too.  But from a marketing perspective, it may be better to say "we don't support it" than to have a software implementation that gets walloped by your competitor's hardware implementation.  AMD would prefer not to have legitimate benchmarks out there that show a Radeon VII losing badly to a GeForce RTX 2060, as would probably happen if something were heavy on ray tracing.

    As for AMD's GCN architecture and its derivatives being heavy on compute, they're built to be awesome at compute tasks that fit nicely onto GPUs.  That's why they were so good at mining, for example.  For things that don't fit easily on GPUs, the architecture is considerably less forgiving than Nvidia's Maxwell/Pascal.  One way to put it is that a GeForce GTX 1080 will tend to beat a Radeon RX Vega 64 at compute tasks in which both get crushed by a desktop Core i7 or Ryzen 7 CPU--which is to say, tasks where using a GPU is ridiculous.

    The reason that ray-tracing is so expensive is not that the computations are intrinsically expensive.  Rather, it's that they completely break a bunch of the optimizations that GPUs have to do rasterization efficiently--optimizations that Vega relies on to be good at compute--and in ways that don't suggest alternative optimizations to make ray-tracing efficient.
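
    To make "not intrinsically expensive" concrete, below is the classic Möller-Trumbore ray/triangle test, a CPU-side C++ sketch of my own rather than anyone's shipping code. The hit test is just a few subtractions, cross products, and dot products. The expensive part in practice is that each ray then walks its own path through an acceleration structure such as a BVH, so neighboring threads fetch unrelated memory and diverge, defeating exactly the coherent-access assumptions rasterization-oriented GPUs optimize for.

    ```cpp
    #include <array>
    #include <cmath>
    #include <optional>

    using Vec3 = std::array<float, 3>;

    static Vec3  sub(Vec3 a, Vec3 b)   { return {a[0]-b[0], a[1]-b[1], a[2]-b[2]}; }
    static float dot(Vec3 a, Vec3 b)   { return a[0]*b[0] + a[1]*b[1] + a[2]*b[2]; }
    static Vec3  cross(Vec3 a, Vec3 b) {
        return {a[1]*b[2]-a[2]*b[1], a[2]*b[0]-a[0]*b[2], a[0]*b[1]-a[1]*b[0]};
    }

    // Returns the distance t along the ray if it hits triangle (v0, v1, v2).
    std::optional<float> intersect(Vec3 orig, Vec3 dir, Vec3 v0, Vec3 v1, Vec3 v2) {
        const Vec3 e1 = sub(v1, v0), e2 = sub(v2, v0);
        const Vec3 p = cross(dir, e2);
        const float det = dot(e1, p);
        if (std::fabs(det) < 1e-8f) return std::nullopt;  // ray parallel to triangle
        const float inv = 1.0f / det;
        const Vec3 s = sub(orig, v0);
        const float u = dot(s, p) * inv;                  // first barycentric coordinate
        if (u < 0.0f || u > 1.0f) return std::nullopt;
        const Vec3 q = cross(s, e1);
        const float v = dot(dir, q) * inv;                // second barycentric coordinate
        if (v < 0.0f || u + v > 1.0f) return std::nullopt;
        const float t = dot(e2, q) * inv;                 // hit distance along the ray
        if (t > 0.0f) return t;
        return std::nullopt;
    }
    ```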
  • AmazingAvery · Age of Conan Advocate · Member Uncommon · Posts: 7,188
    Quizzical said:
    What matters is not when AMD officially announces details about Navi, nor even when it officially launches.  What matters is when the cards are widely available for purchase, how good they are, and how much they cost.  Rumors have long put Navi as a mid-2019 product.
    True, availability and performance are big KPIs. It needs to be a bit better for a lower price, or at least have some benefits over its Nvidia counterparts. By the time Navi hits the stores (whenever that will be, with rumours saying June or July), Nvidia will already have made a huge amount of money from the 2060 cards. They have something like a six-month lead, and that is not good.
    Arcturus (post-Navi) was actually planned to release in late 2019, going by roadmap slides from mid-2017 (back then it was called "Next Gen"). It's Navi that's awfully late. Even back then, Navi was supposed to have an early 2018 release, i.e. it was initially planned as a 10nm chip at best, 14nm at worst.

    Ray tracing is in DirectX now, and that surely came after Microsoft had extensive consultation with Nvidia, AMD, and even Intel.  Hopefully Navi will have a bit of dedicated silicon for ray tracing in whatever way makes sense.  Getting it right is more important than getting it first, as the ecosystem isn't ready for heavy ray tracing yet.

    I hope so too. AMD FireRays / Radeon Rays 2.0, which is targeted at developers wanting ray-tracing capabilities on AMD GPUs, CPUs, and APUs via asynchronous compute, is a bit different from Nvidia's RTX ray-tracing technology, which, as we know, sits on the DX Raytracing API. It'll be interesting to see whether Radeon Rays lives long or whether AMD takes a different approach, considering RR is open source (built on the OpenCL 1.2 standard), which gives it leverage to be deployed on non-AMD hardware and on multiple OS environments (consoles?). The thing is, I doubt Navi was ever planned to have RT silicon, and I suspect a course change happened, possibly resulting in the delay we see today. Their competitor showed its hand and capability, so it's time to call.
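
    A small illustration of the portability point (a hedged sketch of my own against the standard OpenCL C API, nothing Radeon Rays-specific): the same enumeration code finds GPUs on AMD, Nvidia, or Intel platforms alike, which is what lets an OpenCL 1.2 library run on non-AMD hardware.

    ```cpp
    #include <CL/cl.h>
    #include <cstdio>
    #include <vector>

    int main() {
        // Enumerate every OpenCL platform (AMD, Nvidia, Intel, ...) on this machine.
        cl_uint numPlatforms = 0;
        clGetPlatformIDs(0, nullptr, &numPlatforms);
        std::vector<cl_platform_id> platforms(numPlatforms);
        clGetPlatformIDs(numPlatforms, platforms.data(), nullptr);

        for (cl_platform_id p : platforms) {
            char name[256] = {};
            clGetPlatformInfo(p, CL_PLATFORM_NAME, sizeof(name), name, nullptr);

            // Count GPU devices; a portable library can build its queues on any of them.
            cl_uint numDevices = 0;
            clGetDeviceIDs(p, CL_DEVICE_TYPE_GPU, 0, nullptr, &numDevices);
            std::printf("%s: %u GPU device(s)\n", name, numDevices);
        }
        return 0;
    }
    ```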

    Ray tracing won't really be that important until there are games that have to have ray tracing on to work properly.  That's surely several years away.  But it's not going to happen the day after AMD launches their first ray tracing card.  The sooner that both AMD and Nvidia have full lineups out that can handle it--not just a few high-end cards that are way out of the budget of most people--the sooner that clock can start ticking.

    I think 2019 will be a big year and a step forward for ray tracing games. I fully agree with you. It sounds like AMD's tactic is to wait until the "ecosystem" is better.



  • Quizzical · Member Legendary · Posts: 25,501
    edited January 2019

    The thing is, I doubt Navi was ever planned to have RT silicon, and I suspect a course change happened, possibly resulting in the delay we see today. Their competitor showed its hand and capability, so it's time to call.

    The final decision on whether Navi would have dedicated ray-tracing hardware surely came long before Nvidia announced Turing.  At minimum, if Navi is going to be a mid-2019 launch, it would probably have needed to tape out before Nvidia announced Turing.  If AMD didn't decide that they needed ray-tracing hardware until Nvidia announced that Turing had it, we might be looking at a 2022 or so launch of it.  Navi better show up before then.

    If it's in DirectX now, then AMD surely knew that Nvidia was working on ray tracing years ago.  Microsoft is not going to ambush their partners with abrupt changes to the API.  Nor will they add something to the core API just because one vendor requests it.  That sort of thing is heavily discussed far ahead of time, as the precise details of the API need to be agreeable to everyone who wants to implement it.  Vendor-specific extensions are a different matter, but if real-time ray-tracing were only available via vendor-specific extensions, it would be dead on arrival.
  • Ridelynn · Member Epic · Posts: 7,383
    Navi is a low-to-mid-range part, supposedly the successor to Polaris, not Vega. Or so I thought. I wouldn't expect Navi to do anything with RT regardless.

  • Ozmodan · Member Epic · Posts: 9,726
    "By the time Navi hits the stores (whenever that will be, with rumours June or July) Nvidia has already made a huge amount of money from the 2060 cards"

    Really?  At the current price point of an RTX 2060, I do not see it flying off the shelves.  Again, the sweet spot for graphics cards is at most $250, and more like $200.  I don't see Nvidia discounting it either.  And it is obvious that the card is not going to do ray tracing well, considering the performance hit.  Most people don't play at 4K, and a $200 card will do great at 1080p or 1440p.  So I do not see much volume for Nvidia's 20XX lineup.
  • Vrika · Member Legendary · Posts: 7,990
    Ozmodan said:
    "By the time Navi hits the stores (whenever that will be, with rumours June or July) Nvidia has already made a huge amount of money from the 2060 cards"

    Really?  At the current price point of an RTX 2060, I do not see it flying off the shelves.  Again, the sweet spot for graphics cards is at most $250, and more like $200.  I don't see Nvidia discounting it either.  And it is obvious that the card is not going to do ray tracing well, considering the performance hit.  Most people don't play at 4K, and a $200 card will do great at 1080p or 1440p.  So I do not see much volume for Nvidia's 20XX lineup.
    The RTX 2060 is priced around the same as the GTX 1070, which is currently the 4th most popular GPU in Steam's hardware survey.

    The sales volume won't be anywhere near what a $200 GPU would have, but it will still be good enough.
  • Quizzical · Member Legendary · Posts: 25,501
    Ozmodan said:
    "By the time Navi hits the stores (whenever that will be, with rumours June or July) Nvidia has already made a huge amount of money from the 2060 cards"

    Really?  At the current price point of an RTX 2060, I do not see it flying off the shelves.  Again, the sweet spot for graphics cards is at most $250, and more like $200.  I don't see Nvidia discounting it either.  And it is obvious that the card is not going to do ray tracing well, considering the performance hit.  Most people don't play at 4K, and a $200 card will do great at 1080p or 1440p.  So I do not see much volume for Nvidia's 20XX lineup.
    Why should AMD care how much money Nvidia makes?  AMD cares how much money AMD makes, and isn't bothered by competitors also making money.  If anything, Nvidia making more money means that there's more money to be made in the markets that AMD can address.
  • Quizzical · Member Legendary · Posts: 25,501
    Ridelynn said:
    Navi is a low-to-mid-range part, supposedly the successor to Polaris, not Vega. Or so I thought. I wouldn't expect Navi to do anything with RT regardless.
    A lot depends on how much space the ray-tracing fixed-function logic takes.  If it adds under 1% to your die size, then they might as well put it in so that they can test it and be able to use the same compute units all up and down their lineup.  If the ray-tracing stuff is 20% of a Turing die, then leaving it out makes more sense.  I'm guessing here, but I suspect that it's a lot closer to 1% than 20%.  Turing's tensor cores seem a lot more likely to take a ton of die space.
  • Ridelynn · Member Epic · Posts: 7,383
    Well... apart from tensor stuff... a 2080 is close to a 1080 Ti in rasterizing loads. I'll make a broad and ignorant assumption that the space allocated to that on the die is roughly equivalent.

    2080 die size: 12nm, 545 mm²
    1080 Ti die size: 16nm, 471 mm²

    Gotta admit I don't know that you can just directly proportion it and get an accurate comparison... but I don't know that you have to. With a die shrink in there and still being a larger die, you can tell the Tensor cores are taking up ~a lot~ of space. I just couldn't tell you if it's 1% or 20%, but I suspect it's a lot more than 20%...
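
    One back-of-envelope way to sanity-check the proportioning, using commonly quoted transistor counts (assumptions on my part: roughly 11.8B transistors for the 1080 Ti's GP102 and roughly 13.6B for the 2080's TU104):

    ```cpp
    #include <cstdio>

    int main() {
        // Transistors per mm^2, from publicly quoted die specs (approximate).
        const double gp102 = 11.8e9 / 471.0;  // GTX 1080 Ti, "16nm"
        const double tu104 = 13.6e9 / 545.0;  // RTX 2080,    "12nm"
        std::printf("GP102: %.1fM/mm^2  TU104: %.1fM/mm^2\n",
                    gp102 / 1e6, tu104 / 1e6);  // ~25.1 vs ~25.0
        return 0;
    }
    ```

    The densities come out essentially identical, which lines up with Quizzical's note further down that TSMC's 12nm is not really a shrink of 16nm, so the extra ~74 mm² really is extra logic rather than process headroom.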
  • AmazingAvery · Age of Conan Advocate · Member Uncommon · Posts: 7,188
    Quizzical said:
    Ozmodan said:
    "By the time Navi hits the stores (whenever that will be, with rumours June or July) Nvidia has already made a huge amount of money from the 2060 cards"

    Really?  At the current price point of an RTX 2060, I do not see it flying off the shelves.  Again, the sweet spot for graphics cards is at most $250, and more like $200.  I don't see Nvidia discounting it either.  And it is obvious that the card is not going to do ray tracing well, considering the performance hit.  Most people don't play at 4K, and a $200 card will do great at 1080p or 1440p.  So I do not see much volume for Nvidia's 20XX lineup.
    Why should AMD care how much money Nvidia makes?  AMD cares how much money AMD makes, and isn't bothered by competitors also making money.  If anything, Nvidia making more money means that there's more money to be made in the markets that AMD can address.
    Looks like you answered your own question :)
    With Nvidia at roughly 70% market share and AMD losing 4% last quarter, it makes sense that they care, for their shareholders and their revenue. Having competitors take in the disposable income of your customers (and potential customers) when you don't have a competing product, and yours is late, isn't good at all.



  • Vrika · Member Legendary · Posts: 7,990
    DMKano said:
    Vrika said:
    Ozmodan said:
    "By the time Navi hits the stores (whenever that will be, with rumours June or July) Nvidia has already made a huge amount of money from the 2060 cards"

    Really?  At the current price point of a GTX 2060, I do not see it flying off the shelves.  Again the sweet spot for graphic cards is at most $250 and more like $200.   I don't see Nvidia discounting it either.  And it is obvious that the card is not going to do ray tracing considering the performance hit.  Most people don't play at 4k, and a $200 card will do great at 1k or 2k.  So I do not see much volume for Nvidia's 20XX lineup.
    RTX 2060 is priced around the same as GTX 1070, which is currently 4th most popular GPU on Steam's hardware survey.

    The sales volume won't be anywhere near to what an $200 GPU would have, but it will still be good enough.

    But rtx2060 delivers closer to 1070ti performance.

    1070ti prices are well above 450

    So ior $350 - if you are going for nvidia cards rtx2060 makes sense. 
    Delivering better performance doesn't mean people suddenly have more disposable income. I'm comparing the RTX 2060's sales potential to the GTX 1070's because they cost about the same.
  • Quizzical · Member Legendary · Posts: 25,501
    Ridelynn said:
    Well... apart from tensor stuff... a 2080 is close to a 1080 Ti in rasterizing loads. I'll make a broad and ignorant assumption that the space allocated to that on the die is roughly equivalent.

    2080 die size: 12nm, 545 mm²
    1080 Ti die size: 16nm, 471 mm²

    Gotta admit I don't know that you can just directly proportion it and get an accurate comparison... but I don't know that you have to. With a die shrink in there and still being a larger die, you can tell the Tensor cores are taking up ~a lot~ of space. I just couldn't tell you if it's 1% or 20%, but I suspect it's a lot more than 20%...
    Tensor cores and ray tracing hardware are two completely independent things.  I could pretty much guarantee that the former takes a ton of space, but the latter probably doesn't.  I'm hoping that Navi has ray tracing but not tensor cores.

    Tensor cores are basically a machine learning ASIC portion of the chip, in the sense that the video decode block that all modern GPUs have is a video decode ASIC.  Building them or something like them makes a ton of sense if you're building a machine learning ASIC.  But it's a completely stupid thing to put into a consumer GPU because it wastes a ton of die space for basically no benefit.

    Also, part of Turing's die space increase is probably because they doubled the amount of register space per shader.  While that only brings Turing up to parity with what GCN has been offering since 2012, the die space must be significant or else Nvidia would have offered it several years ago.
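
    The per-shader arithmetic behind that register claim, using whitepaper figures that I'd treat as assumptions worth double-checking (256 KB of vector registers per GCN CU with 64 shaders, per Pascal SM with 128 FP32 cores, and per Turing SM with 64 FP32 cores):

    ```cpp
    #include <cstdio>

    int main() {
        const int kb = 1024;
        // Bytes of register file per FP32 shader lane.
        const int gcn    = 256 * kb / 64;   // 4096 B/shader, unchanged since 2012
        const int pascal = 256 * kb / 128;  // 2048 B/shader
        const int turing = 256 * kb / 64;   // 4096 B/shader -- doubled vs Pascal
        std::printf("GCN %d B, Pascal %d B, Turing %d B per shader\n",
                    gcn, pascal, turing);
        return 0;
    }
    ```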
  • Quizzical · Member Legendary · Posts: 25,501
    Quizzical said:
    Ozmodan said:
    "By the time Navi hits the stores (whenever that will be, with rumours June or July) Nvidia has already made a huge amount of money from the 2060 cards"

    Really?  At the current price point of an RTX 2060, I do not see it flying off the shelves.  Again, the sweet spot for graphics cards is at most $250, and more like $200.  I don't see Nvidia discounting it either.  And it is obvious that the card is not going to do ray tracing well, considering the performance hit.  Most people don't play at 4K, and a $200 card will do great at 1080p or 1440p.  So I do not see much volume for Nvidia's 20XX lineup.
    Why should AMD care how much money Nvidia makes?  AMD cares how much money AMD makes, and isn't bothered by competitors also making money.  If anything, Nvidia making more money means that there's more money to be made in the markets that AMD can address.
    Looks like you answered your own question :)
    With Nvidia at roughly 70% market share and AMD losing 4% last quarter, it makes sense that they care, for their shareholders and their revenue. Having competitors take in the disposable income of your customers (and potential customers) when you don't have a competing product, and yours is late, isn't good at all.

    Of course AMD would prefer to have a compelling product to address every price point.  But my point is that it's not a dire emergency if they don't for several months.  If your competitor makes a ton of money in that time, oh well.

    Remember that Nvidia had nothing for the over $100 market (and not a very good lineup for the $100 and under market, either) for about six months in late 2009 and early 2010.  That's a much, much worse position than AMD is in now, but it hardly meant the end of Nvidia.  AMD is still competitive in the under $300 market and has the Vega cards (including the upcoming Radeon VII) that aren't a completely terrible option above that.

    For that matter, let's not forget that we're talking about a company that withdrew from the x86 server market entirely for a few years.  That's a much bigger market than the entire GPU market, let alone the $350+ portion of it.  Or that Nvidia withdrew from the cell phone market entirely, which is also an enormous market.
  • Quizzical · Member Legendary · Posts: 25,501
    Also, the transistor density on TSMC's "16 nm" and "12 nm" process nodes is effectively the same.  The latter is an improved version of the former, but not really a die shrink.