
GeForce GTX 1660 Ti launches: Turing at its most efficient

Comments

  • Forgrimm Member EpicPosts: 3,069
    Ridelynn said:
    The web is calling it an AMD price drop on Vega 56, but it seems it was just one SKU from one vendor, and one crappy website that wanted to make a story out of nothing.
    It was actually multiple sites that reported it as a price drop because that's how AMD made it sound in their email announcement to the press: https://www.forbes.com/sites/jasonevangelho/2019/02/21/amd-drops-radeon-vega-56-to-279-to-fight-nvidias-gtx-1660-ti/#38e2f2937a28

    UPDATE 1: AMD framed this as a price drop in their email to press, and that's how most press reported it. However, this tweet from the official AMD account concerns me:

    Our friends at @Newegg had a killer deal for Radeon RX Vega 56 at $279.99 ($249.99 after MIR) just earlier today. As you can imagine, it sold out quickly.

    I'm seeking clarity on whether this was a limited-time sale on one particular Vega 56 model, or indeed a price drop.

    UPDATE 2: A spokesperson for AMD provided a disappointing response, but at least we know where things stand:

    “Radeon RX Vega 56 has been heavily promoted since the holidays and into the new year as partners have been eager to make RX Vega 56 and it's forward looking 8GB of HBM2 available for more gamers. To clarify, the current Radeon RX Vega 56 promotion is not a price drop. Additionally, the RX Vega 56 graphics card will continue to be offered as part of AMD’s Raise the Game: Fully Loaded bundle with three of this year’s blockbuster titles.”

    SlyLoK[Deleted User]
  • Ozmodan Member EpicPosts: 9,726
    edited February 2019
    Forgrimm said:
    Ozmodan said:
    Forgrimm said:
    Ozmodan said:
    Yep, an interesting product choice.  Nvidia is realizing that the ray tracing/DLSS gimmick is exactly that: a gimmick.  It is priced to compete with the RX 590.  I do not understand the 6GB of memory though.  While 6GB is OK for today, what about next year or the year after?   I don't buy a video card every year or two.  I might still recommend the RX 590 just for that reason.
    It's benchmarking at 25% faster than the 590 while having quite a bit less power draw. I don't think the 590's extra 2GB of memory is going to offset that. https://www.techspot.com/review/1797-nvidia-geforce-gtx-1060-ti/
    I have not seen any benchmark with that much difference; most I have seen is 10-15%. I would still take the extra RAM.

    It's right in the link I posted. Across several games it's running on average 24% faster at 1080p and 25% faster at 1440p.
    Like I said, I have run a bunch of them and have not seen one that averages more than 15%.

    Not sold on the Vega card, it runs too hot.

  • Vrika Member LegendaryPosts: 7,990
    Ozmodan said:
    Forgrimm said:
    Ozmodan said:
    Forgrimm said:
    Ozmodan said:
    Yep, an interesting product choice.  Nvidia is realizing that the ray tracing/DLSS gimmick is exactly that: a gimmick.  It is priced to compete with the RX 590.  I do not understand the 6GB of memory though.  While 6GB is OK for today, what about next year or the year after?   I don't buy a video card every year or two.  I might still recommend the RX 590 just for that reason.
    It's benchmarking at 25% faster than the 590 while having quite a bit less power draw. I don't think the 590's extra 2GB of memory is going to offset that. https://www.techspot.com/review/1797-nvidia-geforce-gtx-1060-ti/
    I have not seen any benchmark with that much difference; most I have seen is 10-15%. I would still take the extra RAM.

    It's right in the link I posted. Across several games it's running on average 24% faster at 1080p and 25% faster at 1440p.
    Like I said, I have run a bunch of them and have not seen one that averages more than 15%.

    Not sold on the Vega card, it runs too hot.

    Here you go:

    [PC Gamer GTX 1660 Ti benchmark chart image]
    Source: https://www.pcgamer.com/nvidia-geforce-gtx-1660-ti-review/



    This post is meant to have only one image, but due to MMORPG.com editor problems it may show anywhere between 1 and 4 images.

    Ozmodan
     
  • Ridelynn Member EpicPosts: 7,383
    edited February 2019
    Forgrimm said:
    Ridelynn said:
    The web is calling it an AMD price drop on Vega 56, but it seems it was just one SKU from one vendor, and one crappy website that wanted to make a story out of nothing.
    It was actually multiple sites that reported it as a price drop because that's how AMD made it sound in their email announcement to the press: https://www.forbes.com/sites/jasonevangelho/2019/02/21/amd-drops-radeon-vega-56-to-279-to-fight-nvidias-gtx-1660-ti/#38e2f2937a28

    UPDATE 1: AMD framed this as a price drop in their email to press, and that's how most press reported it. However, this tweet from the official AMD account concerns me:

    Our friends at @Newegg had a killer deal for Radeon RX Vega 56 at $279.99 ($249.99 after MIR) just earlier today. As you can imagine, it sold out quickly.

    I'm seeking clarity on whether this was a limited-time sale on one particular Vega 56 model, or indeed a price drop.

    UPDATE 2: A spokesperson for AMD provided a disappointing response, but at least we know where things stand:

    “Radeon RX Vega 56 has been heavily promoted since the holidays and into the new year as partners have been eager to make RX Vega 56 and it's forward looking 8GB of HBM2 available for more gamers. To clarify, the current Radeon RX Vega 56 promotion is not a price drop. Additionally, the RX Vega 56 graphics card will continue to be offered as part of AMD’s Raise the Game: Fully Loaded bundle with three of this year’s blockbuster titles.”

    Thanks for the quotes. 
  • Quizzical Member LegendaryPosts: 25,499
    Early reviews of the GeForce GTX 1660 Ti should be taken with considerable caution, as Nvidia apparently didn't provide review samples to a lot of major tech sites, so they got to cherry-pick who would have the early reviews.  Depending on how reviews are done, they could be largely measuring performance in Nvidia-sponsored titles that use Nvidia's GameWorks for a heavy processing load.  Such results won't be typical of other games.

    For example, in the PC Gamer review, all but one of the games where a GTX 1660 Ti beats a Radeon RX 590 by 25% or more are GameWorks titles.  If you exclude the GameWorks games, a margin of 15% is much closer to typical.
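
    To make the averaging point concrete, here's a minimal sketch of how excluding a subset of titles shifts the overall margin.  The per-game numbers below are hypothetical placeholders made up for illustration, not the PC Gamer results; only the mechanics matter.

    # Illustrative only: hypothetical per-game 1660 Ti vs RX 590 margins (percent faster).
    # The titles and the GameWorks split are made up for this sketch.
    margins = {
        "gameworks_title_a": 30, "gameworks_title_b": 28, "gameworks_title_c": 27,
        "neutral_title_a": 14, "neutral_title_b": 16, "neutral_title_c": 12,
    }
    gameworks = {"gameworks_title_a", "gameworks_title_b", "gameworks_title_c"}

    overall = sum(margins.values()) / len(margins)
    filtered_vals = [v for k, v in margins.items() if k not in gameworks]
    filtered = sum(filtered_vals) / len(filtered_vals)

    print(f"average margin, all games: {overall:.1f}%")              # pulled up by the sponsored titles
    print(f"average margin, excluding GameWorks: {filtered:.1f}%")   # much closer to a 'typical' game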

    A reputable review would disable sponsored code before running benchmarks.  Of course running benchmarks of code written by Nvidia--and closed source to ensure that AMD can't optimize for it--is going to produce results favorable to Nvidia.  But most games aren't sponsored titles, so those results will be atypical.  The PC Gamer review doesn't even mention GameWorks, so they probably didn't do this.

    Any questions of why PC Gamer got a review sample, while Tech Report and Hard OCP didn't?

    That's why I compared it to a GTX 1070.  Nvidia-written code isn't that likely to greatly favor one Nvidia GPU over another.  If a GTX 1660 Ti is about as fast as a GTX 1070 on average, then it typically falls about where we know that a GTX 1070 falls from older reviews from more reputable sites:  somewhat faster than a Radeon RX 590 and somewhat slower than a Radeon RX Vega 56.
  • Quizzical Member LegendaryPosts: 25,499
    The 1660 doesn't even come close to Vega 54 performance. How is it possible to be on 14nm and have such poor performance? Pretty much 2 generations behind AMD now. I guess that is ok if it says "n|\/d@".
    There are a lot of things wrong with that post.  The one that I'll point out is that there is no "Vega 54" GPU.
    Ridelynn
  • AmazingAvery Age of Conan Advocate Member UncommonPosts: 7,188
    edited February 2019
    Quizzical said:
    A reputable review would disable sponsored code before running benchmarks.  Of course running benchmarks of code written by Nvidia--and closed source to ensure that AMD can't optimize for it--is going to produce results favorable to Nvidia.  But most games aren't sponsored titles, so those results will be atypical.  The PC Gamer review doesn't even mention GameWorks, so they probably didn't do this.
    There are plenty of AMD-sponsored games out there. AMD has its review guide of games too. Sites and reviews that are balanced are the best approach, but at the end of the day you can't ignore Nvidia- or AMD-sponsored games, especially as reviews are meant to look at the performance of games that are typically popular at the time. In the case of the Radeon VII at launch, the press drivers were trash, so that logic is like saying don't bother reviewing at all.  Regarding the PC Gamer comparisons, thanks @Vrika for posting that. It got me interested in comparing other benchmarks. 

    Regarding being reputable - Assassin's Creed Odyssey, Deus Ex: Mankind Divided, Far Cry 5, Forza Horizon 4, Grand Theft Auto V, Hitman, Strange Brigade, and Total War: Warhammer all have AMD sponsorship and/or AMD code support and were included in the PC Gamer round-up, so it's pretty much even, which dispels your point.

    --

    Considering there was barely any mention of performance in the opening comments of the thread, there was a lot of gesticulating and misdirection - something AMD was very good at with the bogus announced price drop of Vega 56 that was a no-show.

    Regarding the naming convention, I'm still scratching my head as to why they called it that, but I did come across this Nvidia reasoning -

    https://imgur.com/a/vVE2nwd

    And from that naming, the 1660 Ti mentions new INT32 and FP16 operational improvements. It is definitely an interesting card: no Founders Edition, and no RTX, DLSS or tensor cores that buyers won't be paying for, but they will get the new Turing architecture, GDDR6 and architectural improvements. There is a massive improvement to the way FP and integer instructions work, now executing simultaneously rather than alternating, which gives big gains to shaders. Turing can co-issue FP16 + FP32 + INT!

    The Turing architecture as a whole brings a 25% performance increase compared to older generations. This GTX 1660 Ti finally brings mesh shaders to the mainstream. It also has all the other Turing goodies such as int+float co-issue, double-rate FP16, scalar instructions, high-throughput + low-latency raw loads/stores, and a faster/larger L1$. Plus variable rate shading, texel footprint query (for vtex / texel shading) and pixel shader barycentrics. All of them are important features. Turing has the best rasterization feature set, and it is a big jump over older GPUs for future rasterization pipelines. However, for some of that we'll need to wait until DirectX and Vulkan adopt mesh shaders and variable rate shading as standard features. If that happens, widespread dev adoption will follow. So this card looks good for the near-term future, especially if console support comes; that would be a big plus too.

    The thing with the 1660 Ti is that there are so many variations, with MSRP cards at $280 ranging up to mass-market cards at $320-$330 with more premium build quality, more fans, some OC boost, and larger heat sinks for temperature headroom when overclocking. Reading plenty of reviews, it is clear that this card can overclock well. That said, the $320-$330 pricing for the premium 1660 Ti cards puts it in range of the RTX 2060, which is simply put a better card and something to consider. The card itself has super low power consumption, with half the TDP of a Vega 56 for example. With the 1660 Ti at $280 MSRP it is priced well against the RX 590 at $279, the RTX 2060 at $349, and the Vega 56 at $399.

    On performance, the GTX 1660 Ti delivers around 34% more performance than the GTX 1060 6GB at 1440p, and a very similar 36% gain at 1080p. Against an RX 590 it is around 25% faster while using roughly 2.5 times less power to achieve that. That is pretty cool and stands up well for Turing. I think AMD will have no choice but to bring down that 590 price, or AMD can sit on its hands and present Navi when ready. Until then the 1660 Ti wins this pricing segment on $/performance at MSRP.

    Below is a 33 game benchmark video.

    Here is what I took from it at 1440p resolution, for those interested (a quick price/performance sketch follows the list) -
    GTX 1660 Ti vs GTX 1060 6GB = 34% faster
    GTX 1660 Ti vs GTX 1070 = same performance @ 30W less power
    GTX 1660 Ti vs RX 590 = 24% faster
    GTX 1660 Ti vs RTX 2060 = 14% slower
    GTX 1660 Ti vs RX Vega 56 = 8% slower (but 22% cheaper)
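
    For anyone who wants to sanity-check the value argument, here is a rough price/performance sketch in Python using the MSRPs and 1440p deltas quoted above. Indexing the 1660 Ti to 100 is my own normalization for illustration, and these are MSRPs, not street prices.

    # Rough value comparison at MSRP. Relative 1440p performance is indexed to the
    # GTX 1660 Ti = 100 using the deltas listed above; prices are the MSRPs
    # mentioned in this thread, not street prices.
    cards = {
        # name: (MSRP in USD, relative 1440p performance)
        "GTX 1660 Ti": (280, 100),
        "RX 590":      (279, 100 / 1.24),   # 1660 Ti ~24% faster
        "RTX 2060":    (349, 100 / 0.86),   # 1660 Ti ~14% slower
        "RX Vega 56":  (399, 100 / 0.92),   # 1660 Ti ~8% slower
    }

    for name, (price, perf) in cards.items():
        print(f"{name:12s} perf per dollar = {perf / price:.3f}")
    # The 1660 Ti comes out ahead on perf/$ at these MSRPs, which is the point above.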

    GeForce GTX 1660 Ti Mega Benchmark + Bogus Vega 56 Pricing Discussion

    Post edited by AmazingAvery on



  • Ridelynn Member EpicPosts: 7,383

    The thing with the 1660 Ti is that there are so many variations, with MSRP cards at $280 ranging up to mass-market cards at $320-$330 with more premium build quality, more fans, some OC boost, and larger heat sinks for temperature headroom when overclocking.
    This is true of nearly every GPU though... I mean, just look at RTX 2060 - on Newegg there are 6 different SKUs just from Gigabyte alone, with a $50 spread between them. And there are at least 5 other manufacturers with similar numbers of various SKUs
    [Deleted User]
  • AmazingAvery Age of Conan Advocate Member UncommonPosts: 7,188
    Ridelynn said:

    The thing with the 1660 Ti is that there are so many variations, with MSRP cards at $280 ranging up to mass-market cards at $320-$330 with more premium build quality, more fans, some OC boost, and larger heat sinks for temperature headroom when overclocking.
    This is true of nearly every GPU though... I mean, just look at RTX 2060 - on Newegg there are 6 different SKUs just from Gigabyte alone, with a $50 spread between them. And there are at least 5 other manufacturers with similar numbers of various SKUs
    True, but so far with the 1660 Ti there are 25 known SKUs, which is pretty nuts! Sure, some are for regional markets, but at the higher-end pricing an RTX 2060 is a better buy.
    Ozmodan



  • Quizzical Member LegendaryPosts: 25,499
    edited February 2019
    Quizzical said:
    A reputable review would disable sponsored code before running benchmarks.  Of course running benchmarks of code written by Nvidia--and closed source to ensure that AMD can't optimize for it--is going to produce results favorable to Nvidia.  But most games aren't sponsored titles, so those results will be atypical.  The PC Gamer review doesn't even mention GameWorks, so they probably didn't do this.
    There are plenty of AMD-sponsored games out there. AMD has its review guide of games too. Sites and reviews that are balanced are the best approach, but at the end of the day you can't ignore Nvidia- or AMD-sponsored games, especially as reviews are meant to look at the performance of games that are typically popular at the time. In the case of the Radeon VII at launch, the press drivers were trash, so that logic is like saying don't bother reviewing at all.  Regarding the PC Gamer comparisons, thanks @Vrika for posting that. It got me interested in comparing other benchmarks. 

    Regarding being reputable - Assassin's Creed Odyssey, Deus Ex: Mankind Divided, Far Cry 5, Forza Horizon 4, Grand Theft Auto V, Hitman, Strange Brigade, and Total War: Warhammer all have AMD sponsorship and/or AMD code support and were included in the PC Gamer round-up, so it's pretty much even, which dispels your point.
    I'm not saying to ignore all sponsored games.  I am saying that any features that involve closed-source code written by one GPU vendor and specifically optimized for that vendor's GPUs and de-optimized for their competitor's GPUs should be disabled if you're going to use benchmarks from a sponsored game.  Otherwise, what you're benchmarking is how GPUs perform in code written by Nvidia for the express purpose of making their GPUs look artificially better in benchmarks.
    Ozmodan[Deleted User]
  • Quizzical Member LegendaryPosts: 25,499
    And from that naming, the 1660 Ti mentions new INT32 and FP16 operational improvements. It is definitely an interesting card: no Founders Edition, and no RTX, DLSS or tensor cores that buyers won't be paying for, but they will get the new Turing architecture, GDDR6 and architectural improvements. There is a massive improvement to the way FP and integer instructions work, now executing simultaneously rather than alternating, which gives big gains to shaders. The Turing architecture as a whole brings a 25% performance increase compared to older generations. This GTX 1660 Ti finally brings mesh shaders to the mainstream. It also has all the other Turing goodies such as int+float co-issue, double-rate FP16, scalar instructions, high-throughput + low-latency raw loads/stores, and a faster/larger L1$. Plus variable rate shading, texel footprint query (for vtex / texel shading) and pixel shader barycentrics. All of them are important features. Turing has the best rasterization feature set, and it is a big jump over older GPUs for future rasterization pipelines.
    I'm skeptical of claims that Turing is a huge advance in instruction scheduling.  If there is one, it sure doesn't seem to show up in game benchmarks.  What architectural improvements there are over Pascal could be explained purely by increasing some cache sizes, especially doubling the registers per shader.

    It's possible that there are indeed some scheduling improvements.  I'd have to try out one of the cards for myself to get some idea of what Nvidia's claims about simultaneous integer and floating-point usage mean--if they mean anything at all.  I'm extremely skeptical that it allows you to do max throughput on integer instructions and also max throughput on floating point instructions simultaneously.

    Part of my skepticism is because even if you could hit max integer throughput and max floating point throughput at the same time, that would be a very strange thing to do.  Compute workloads tend to be either almost exclusively integer or else overwhelmingly floating-point, with integers only really used for memory addressing.  Putting in a ton of work to make it possible to do two things at once that you normally don't want to do at once wouldn't make sense.  If you're going to split instructions that way, why not split it into two halves such that you'll commonly want to use both at once?

    As for FP16, that's probably not useful for gaming.  Vega offers doubled FP16 throughput, too, but didn't look like a huge architectural leap over Polaris.
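
    To illustrate why a 50/50 integer/floating-point split is an odd target, here is a toy Python sketch of the instruction mix in a typical floating-point kernel.  The counts are my own illustration of the argument, not anything measured on Turing or any other GPU.

    # Toy instruction-mix tally for a compute-style loop: evaluating a degree-7
    # polynomial per element with Horner's rule.  Counts are illustrative only.
    # The point: floating-point work dominates, and the integer instructions
    # mostly exist to handle loop counters and address/index arithmetic.
    def poly_kernel(coeffs, xs):
        fp_ops = 0
        int_ops = 0
        out = []
        for i in range(len(xs)):       # integer: loop counter increment + compare
            int_ops += 2
            acc = coeffs[0]
            for c in coeffs[1:]:       # Horner step: acc = acc * x + c
                acc = acc * xs[i] + c
                fp_ops += 2            # floating point: one multiply + one add
            int_ops += 2               # integer: load/store address arithmetic
            out.append(acc)
        return out, fp_ops, int_ops

    _, fp, ints = poly_kernel([1.0] * 8, [0.5, 1.5, 2.5])
    print(f"FP ops: {fp}, integer ops: {ints}")   # FP dominates; integer is bookkeeping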
    Ozmodan
  • Quizzical Member LegendaryPosts: 25,499
    I've also not seen (in 25 years of seeing) so many "reviewers" spend this much time on a mid-level card either, #paidAdverts. You know something weird is going on when you see certain people peel out of the woodwork from nowhere to push the card. n\/d@ has lost so much market share in the sector that they are pulling all the amazing tricks. I know I've seen this before in terms of Age of Conan; I'm sure I am not the only one even more skeptical of n\/@'s future now...
    By mid-range card, you mean the price range that many gamers who can't afford the high end buy, don't you?  AMD has commonly focused more on the mid-range than the high-end precisely because that's the most lucrative part of the consumer graphics market.  The market for $1200 consumer GPUs, or even $700 ones, just isn't very big.
    Ozmodan
  • Ridelynn Member EpicPosts: 7,383
    I don’t think nV has lost any significant market share.

    I also don’t think people are rushing out to upgrade if they already own Maxwell/Pascal cards, which is why nV is in trouble.
    OzmodanAmazingAvery[Deleted User]
  • Ozmodan Member EpicPosts: 9,726
    Ridelynn said:

    The thing with the 1660 Ti is that there are so many variations, with MSRP cards at $280 ranging up to mass-market cards at $320-$330 with more premium build quality, more fans, some OC boost, and larger heat sinks for temperature headroom when overclocking.
    This is true of nearly every GPU though... I mean, just look at RTX 2060 - on Newegg there are 6 different SKUs just from Gigabyte alone, with a $50 spread between them. And there are at least 5 other manufacturers with similar numbers of various SKUs
    True, but so far with the 1660 Ti there are 25 known SKUs, which is pretty nuts! Sure, some are for regional markets, but at the higher-end pricing an RTX 2060 is a better buy.
    Please explain to me how the RTX 2060 is a better buy?  The ray tracing and DLSS on a 2060 are practically useless.  Hence it is not worth one penny more than the 1660.
    [Deleted User]
  • Ridelynn Member EpicPosts: 7,383
    edited February 2019
    I don't know... Seeking Alpha isn't exactly plugged into the gaming community. They may be trying to extrapolate "market share" from revenue, which for AMD may also include custom work (consoles, APUs, etc.) that wouldn't necessarily pull away from nVidia's market share.

    A peek at the Steam Hardware Survey shows market share by GPU vendor flat over the last year, with AMD solidly around 15% and nVidia solidly around 75%.

    https://store.steampowered.com/hwsurvey/Steam-Hardware-Software-Survey-Welcome-to-Steam

    Now I realize the Steam Hardware Survey isn't the end-all be-all of surveys, but it's capturing relative data from actual gaming machines. If AMD were gaining market share versus nVidia, I would think we would see a significant bump in these Steam figures as well.

    For example: you ~can~ see AMD gaining market share versus Intel on the CPU side; Intel drops and AMD gains by about 4% in the same reporting period. I don't know whether the actual market share has swung by 4%, or that I would believe that number as an absolute truth, but it shows the beginning of a significant trend line, and that's what I'm really looking for from these hardware survey numbers.
  • Gdemami Member EpicPosts: 12,342
    Ridelynn said:
    If AMD were gaining market share versus nVidia, I would think we would see a significant bump in these Steam figures as well.
    ...you don't install steam on mining rigs.
  • Connmacart Member UncommonPosts: 723
    My Steam data doesn't match my PC anymore. In fact, I can only recall ever having done one survey at all. So how accurate are the Steam figures to begin with?

    If you have a 10-series card, then either wait for Navi to see if that brings better performance or just skip this generation completely - unless of course you have something like a 1050 and can get something higher end.

    I will be looking at Navi myself to see if replacing my GTX 1070 will be worth it. I'm not expecting that before October, though. I doubt AMD will release both Zen 2 and Navi at the same time, and Zen 2 is scheduled for early summer.
  • Quizzical Member LegendaryPosts: 25,499
    As far as I'm aware, the Steam hardware survey has run once on this computer.  Steam thought it was a laptop.  A laptop with three 27" monitors.  So I don't trust that to be representative even of Steam users.

    As far as discrete desktop GPUs for gaming, AMD has surely gained considerable market share in the last year.  A year ago, miners were buying pretty much everything AMD had outside of the low end and bidding it up far out of the reach of gamers, while being much less aggressive about buying out GeForce cards.  Today, the miners have nearly left the market.  That meant that AMD had very little of the gaming market for $150+ discrete video cards, but today, they have a substantial fraction of it.
  • Ridelynn Member EpicPosts: 7,383
    edited February 2019
    Here's the last thing I'll say about Steam surveys, and then I'll let everyone have their last word.

    I don't care if your Steam survey was inaccurate, or if you only submitted it once 12 years ago, or if you only hit submit while gaming on your ancient Gateway 2000 laptop so you can giggle at uploading bad information. It's still a sizable cross-section of an applicable market, and your personal experience with it doesn't dilute that.

    The Steam hardware survey was accurate enough to pick up a change in market share between AMD and Intel on the CPU side. It seems completely implausible to me that a significant market share jump in GPUs - and you are talking about a doubling on the part of AMD - would go completely undetected in the same period of time.

    Mining on PC is done; from my understanding, in many markets you can't even break even on electricity costs today. Even for the people that did mine - a lot of those cards ended up sold as used and are now in gaming rigs (who else would be buying them used?). That hurt both AMD and nVidia, and I believe it's directly related to the inventory overstock they both have recently discussed in their earnings statements. So I don't believe ~all~ the AMD market share went to just mining rigs and nowhere else... and I don't believe those that did go to mining rigs are still out there mining away. 

    I could believe that it was all mining market share if we were still seeing mining going strong and those cards were still tied up in mining rigs crunching away. But now that the bubble has popped and a lot of that used inventory has started to flush through the system, we still aren't seeing any change at all in market share.

    Now, maybe Steam is wrong... that's certainly plausible, and there's a lot of anecdotal evidence here to suggest that. But I don't believe it would catch the CPU change - as small a fraction as that is, and as infrequently as CPUs get upgraded - while completely missing a GPU shift, given the GPU upgrade cycle and the magnitude of the supposed market share change in GPUs.

    If you believe the market share has changed: the Seeking Alpha article is a good source, but so far it's the only one I've seen linked here that indicates as much, and it's attempting to extrapolate based on revenue, earnings reports, or other investor-related information (or at least that's my understanding; I may be wrong). At least the Steam survey is counting actual physical installations, which I won't claim is entirely accurate but I will say is representative. If you have another source - let's see the numbers.
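
    As a back-of-envelope check on whether a doubling of AMD's share of new GPU sales could hide from an installed-base survey, here's a minimal sketch. Every number in it - the starting share, the size of the jump, the turnover rate - is an assumption picked for illustration, not measured data.

    # Back-of-envelope: would a big swing in new-card sales show up in an
    # installed-base survey like Steam's?  All figures below are assumptions.
    installed_amd_share = 0.15   # assumed AMD share of the surveyed installed base today
    new_sales_share = 0.30       # hypothetical doubling of AMD's share of new-card sales
    annual_turnover = 0.20       # assume ~20% of surveyed GPUs get replaced per year

    after_one_year = (installed_amd_share * (1 - annual_turnover)
                      + new_sales_share * annual_turnover)

    print(f"installed AMD share after one year: {after_one_year:.1%}")
    # ~18% vs 15%: a ~3-point move, similar in size to the CPU-side shift the
    # survey did pick up, so a real doubling shouldn't be invisible.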
    MendelAmazingAvery[Deleted User]
  • AmazingAvery Age of Conan Advocate Member UncommonPosts: 7,188
    edited February 2019
    Ozmodan said:
    Ridelynn said:

    The thing with the 1660 Ti is that there are so many variations, with MSRP cards at $280 ranging up to mass-market cards at $320-$330 with more premium build quality, more fans, some OC boost, and larger heat sinks for temperature headroom when overclocking.
    This is true of nearly every GPU though... I mean, just look at RTX 2060 - on Newegg there are 6 different SKUs just from Gigabyte alone, with a $50 spread between them. And there are at least 5 other manufacturers with similar numbers of various SKUs
    True, but so far with the 1660 Ti there are 25 known SKUs, which is pretty nuts! Sure, some are for regional markets, but at the higher-end pricing an RTX 2060 is a better buy.
    Please explain to me how the RTX 2060 is a better buy?  The ray tracing and DLSS on a 2060 are practically useless.  Hence it is not worth one penny more than the 1660.
    Sure - MSRP on the 1660 Ti is $280 and on the 2060 it is $350. Those are the lower ends, so it varies depending on which card has your eye; i.e. some 1660 Ti models go for $330. Nevertheless there is a basic $70 difference. So that is one piece understood; on the other side, what about performance? Well, at 1440p the 1660 Ti is 14% slower over 33 games (see below, near the end). At 1080p there is a 23% difference according to AnandTech.
     
    https://www.techspot.com/review/1799-geforce-gtx-1660-mega-benchmark/

    So excluding RTX features like DLSS and ray tracing, which I agree are in their infancy but getting better, on performance alone and the price difference it's worth considering. As I mentioned, if you are looking at a higher-end 1660 Ti model at $330 and you compare that to a $350 2060, I'd pay the $20 extra for a 2060 and take the performance boost that comes with it.
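
    Putting that trade-off in numbers, here is a rough cost-per-performance sketch at the prices and the ~14% 1440p gap mentioned above. Indexing the 1660 Ti to 100 is my own normalization for illustration.

    # Rough value check: premium GTX 1660 Ti vs base RTX 2060, using the prices
    # and the ~14% 1440p gap quoted above.  Illustrative normalization only.
    options = {
        "GTX 1660 Ti (MSRP)":    (280, 100),
        "GTX 1660 Ti (premium)": (330, 100),          # same GPU, pricier board
        "RTX 2060 (MSRP)":       (350, 100 / 0.86),   # 1660 Ti ~14% slower at 1440p
    }

    for name, (price, perf) in options.items():
        print(f"{name:24s} dollars per performance point = {price / perf:.2f}")
    # At $330 vs $350, the extra $20 buys roughly 16% more performance, which is
    # why the 2060 looks like the better buy at the top of the 1660 Ti price range.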




    Post edited by AmazingAvery on



  • AmazingAvery Age of Conan Advocate Member UncommonPosts: 7,188
    Makes buying a VII a lot easier. 4096-bit memory bus LOL compared to nvidia's 256-bit LOL. Just delusional people still buy dated tech.

    Vega 54 memory bus = 2048 bits; bandwidth 409.6 GB/s

    Not even a contest anymore.
    Huh?
    There is absolutely no meaningful reason, unless you mine, to get a Radeon VII at MSRP. In AMD's internal review guide numbers, by their own admission, a 2080 will still be faster at the same price.
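
    For context on the bus-width numbers being thrown around: peak memory bandwidth is roughly bus width times effective data rate. Here is a quick sketch; the per-card data rates are the commonly cited spec values and should be treated as assumptions.

    # Peak memory bandwidth (GB/s) ~= bus width (bits) * effective data rate (Gbps) / 8.
    # Data rates below are the commonly cited spec values - treat them as assumptions.
    cards = {
        # name: (bus width in bits, effective data rate in Gbps)
        "Radeon RX Vega 56": (2048, 1.6),   # HBM2
        "Radeon VII":        (4096, 2.0),   # HBM2
        "GTX 1660 Ti":       (192, 12.0),   # GDDR6
        "RTX 2080":          (256, 14.0),   # GDDR6
    }

    for name, (bus_bits, gbps) in cards.items():
        bandwidth_gbs = bus_bits * gbps / 8
        print(f"{name:18s} ~{bandwidth_gbs:.0f} GB/s")
    # A wide bus alone doesn't decide gaming performance, but it does explain the
    # 409.6 GB/s Vega 56 figure and the ~1 TB/s number on the Radeon VII.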



  • Ridelynn Member EpicPosts: 7,383
    I thought the 2060 was faster than a 1660 Ti. The 2060 is at 1070 Ti level, the 1660 Ti is at 1070 level - or so I thought.

    I mean, yeah, that isn't a very wide margin of performance there, but still, it's also not "the same except for RayTracing and DLSS". And there was about a $70 delta between the 1070 and 1070Ti MSRP (street price is a different matter), so the price difference between a 1660Ti and 2060 is historically appropriate (regardless of how otherwise "appropriate" we may feel about it).

    Also, I can think of a few reasons where very large memory bandwidth is a good benefit. But by and large, memory bandwidth isn't the ultimate indicator of gaming performance, which is what I care about, and the intended purpose of most of these GPU cards we are discussing.

    Apart from that, I can think of one debatable reason to buy a Radeon VII at MSRP over a 2080, and one legit purpose. The legit purpose is 4K video editing, where the 16GB of HBM shines. Here is a review; it focuses on gaming performance, but it mentions video editing specifically if you read the article. If you frequently edit a lot of video, and want to buy something cheaper than a $5,000+ Pro card, the VII is a steal.

    The debatable reason would be a protest purchase. The performance is slower than a 2080, but only marginally so, so it's not like you are buying something that is entirely inferior or unsuitable... it's still the same class of performance. Every R7 purchase sends a pretty clear message to nVidia, and supports AMD. 

    I don't necessarily recommend that - I think you should get the best card, regardless of manufacturer, for the amount you have budgeted. But it's a plausible reason.
    [Deleted User]Quizzical
  • Vrika Member LegendaryPosts: 7,990
    Ridelynn said:
    I thought the 2060 was faster than a 1660 Ti. The 2060 is at 1070 Ti level, the 1660 Ti is at 1070 level - or so I thought.

    I mean, yeah, that isn't a very wide margin of performance there, but still, it's also not "the same except for RayTracing and DLSS". And there was about a $70 delta between the 1070 and 1070Ti MSRP (street price is a different matter), so the price difference between a 1660Ti and 2060 is historically appropriate (regardless of how otherwise "appropriate" we may feel about it).

    Also, I can think of a few reasons where very large memory bandwidth is a good benefit. But by and large, memory bandwidth isn't the ultimate indicator of gaming performance, which is what I care about, and the intended purpose of most of these GPU cards we are discussing.

    Apart from that, I can think of one debatable reason to buy a Radeon VII at MSRP over a 2080, and one legit purpose. The legit purpose is 4K video editing, where the 16GB of HBM shines. Here is a review; it focuses on gaming performance, but it mentions video editing specifically if you read the article. If you frequently edit a lot of video, and want to buy something cheaper than a $5,000+ Pro card, the VII is a steal.

    The debatable reason would be a protest purchase. The performance is slower than a 2080, but only marginally so, so it's not like you are buying something that is entirely inferior or unsuitable... it's still the same class of performance. Every R7 purchase sends a pretty clear message to nVidia, and supports AMD. 

    I don't necessarily recommend that - I think you should get the best card, regardless of manufacturer, for the amount you have budgeted. But it's a plausible reason.
    Multimedia encoding of any kind, which includes streaming.

    One other thing: drivers. The AMD Adrenalin platform is hands down the best set of drivers I've used in all of my computing days. 

    From what I've seen of \/d@, they are using the same driver formula from 2000....



    RVII out of stock :|  // RTX 2080 in stock // RTX 2070 price slashed ......
    Newegg lists only two different Radeon VII models; it doesn't look like AMD has any intention of making more than just a few Radeon VII cards.

    There's no reliable data on sales numbers, but if you count Newegg reviews, then Nvidia has sold 70 RTX 2080 Tis for each Radeon VII sold by AMD.
     
  • Ridelynn Member EpicPosts: 7,383
    I've had a suspicion, ever since the day when AMD cranked out way too many Southern Islands cards (and ended up with like 3 generations of rebadges), that they've been extremely conservative with production runs.

    You don't see nearly as many different SKUs from AIBs, and you don't see many overstock sales, and you see an awful lot of "sold out".

    [Deleted User][Deleted User]
  • Quizzical Member LegendaryPosts: 25,499
    Ozmodan said:
    Ridelynn said:

    The thing with the 1660 Ti is that there are so many variations, with MSRP cards at $280 ranging up to mass-market cards at $320-$330 with more premium build quality, more fans, some OC boost, and larger heat sinks for temperature headroom when overclocking.
    This is true of nearly every GPU though... I mean, just look at RTX 2060 - on Newegg there are 6 different SKUs just from Gigabyte alone, with a $50 spread between them. And there are at least 5 other manufacturers with similar numbers of various SKUs
    True, but so far with the 1660 Ti there are 25 known SKUs, which is pretty nuts! Sure, some are for regional markets, but at the higher-end pricing an RTX 2060 is a better buy.
    Please explain to me how the RTX 2060 is a better buy?  The ray tracing and DLSS on a 2060 are practically useless.  Hence it is not worth one penny more than the 1660.
    Sure - MSRP on the 1660 Ti is $280 and on the 2060 it is $350. Those are the lower ends, so it varies depending on which card has your eye; i.e. some 1660 Ti models go for $330. Nevertheless there is a basic $70 difference. So that is one piece understood; on the other side, what about performance? Well, at 1440p the 1660 Ti is 14% slower over 33 games (see below, near the end). At 1080p there is a 23% difference according to AnandTech.
     
    https://www.techspot.com/review/1799-geforce-gtx-1660-mega-benchmark/

    So excluding RTX features like DLSS and ray tracing, which I agree are in their infancy but getting better, on performance alone and the price difference it's worth considering. As I mentioned, if you are looking at a higher-end 1660 Ti model at $330 and you compare that to a $350 2060, I'd pay the $20 extra for a 2060 and take the performance boost that comes with it.
    $350 is in no way a "lower end" card except in a relative sense with much of the new lineup not yet released.  $350 is mid-range to upper mid-range.  $100 is a lower end card.

    DLSS is garbage and will always be garbage.  It's dead, and the only question is whether Nvidia knows it yet.  They probably do, but don't want to let their fanboys in on that secret until they come up with the next gimmick.

    Real-time ray tracing probably has a future.  DLSS doesn't.