RTX2060 announced

RidelynnRidelynn Member EpicPosts: 7,383
$350 MSRP. Supposedly roughly equivalent to 1070Ti.

Comments

  • RidelynnRidelynn Member EpicPosts: 7,383
    edited January 2019
    I honestly wish AMD could do a Ryzen type of turn around on their cards and have something to compete with the 2080+ or at least come close. Having a 1080Ti I have zero reason to be excited with this go around of cards but in the future would love to have some real competition for NVIDIA.

    Unless you have a 1060 or below though there is no reason to upgrade at least from a value perspective at this time.
    I don't think AMD needs anything close to a 2080.

    Right now, AMD has a solution that competes favorably with a 2060. That would be Vega. Problem is nVidia is able to cut the price now: MSRP on 2060 is $350 (we'll see if that stays once they release in a few weeks), versus MSRP on a Vega64 (which is what I think would be roughly equivalent) being $499.  That actually isn't horrible on the part of nVidia - it roughly matches a 1070Ti, which was MSRP $450. I don't have a lot of bad things to say about the 2060 (plenty to say about nVidia's overall 20xx strategy, but not a lot about the 2060 itself).

    That being said, even that price point is on the upper end of where they need to be competitive to actually make a difference. The low and mid tier cards are where the volume occurs. If nothing else, Steam Surveys routinely have the low and mid tier cards in highest volume (nVidia 1060, 970, etc).

    The important price points, in my opinion, occur at <$180, $180-$300, and >$300 (in US dollars). Those, for me, are what I consider low, mid, and high tier brackets. And then you have the halo products that exist outside of that, which fall outside of any reasonable pricing structure (largely because they can).

    Now, maybe because of inflation and everything else, my completely unofficial price bracketing needs to be updated - I won't argue that. But even if we shift it by a few dollars up or down, we would have to shift it by a factor of at least x3 to get to anything close to where a 2080 is a relevant card.

    But on to my point, AMD should have an answer to the 2060: We know they can do that technically now, they just can't hit the same price point (yet). The RX590 competes pretty favorably versus the previous generation 1060 6G, but the new 2060 is considerably higher performance (as well as higher price, so I'm not knocking that too much). But I don't think they need to necessarily concern themselves with performance or pricing beyond that - everything above that point is halo, as far as I'm concerned.

    If AMD can compete against a 2060 in both performance and price, and flesh out the lineup between that and their 590 (or better yet, go ahead and look at retiring the Polaris lineup, which has had a great run, with new Vega+/Navi replacements), I think they will be doing well.

    I do agree with your sentiment that only those with 1060 or less should even consider upgrading... but then again, I don't necessarily see a great reason to upgrade each generation either, unless you are one of those people that has to be on the performance edge with top tier cards all the time. For most people, that's just too expensive to chase though. Quiz's thumbrule for upgrading GPU is a pretty good one to go by: unless it's broke, don't upgrade unless you can get at least double the performance. If you are looking at like tiers, that historically would be roughly every third generation.
  • QuizzicalQuizzical Member LegendaryPosts: 25,501
    edited January 2019
    The problem with using an RX Vega 56 to compete with an RTX 2060 is that it's a much more expensive card to build.  It's a bigger die, burns more power, and uses HBM2.  The goal isn't just to give gamers something to buy.  The goal is to make money, and selling Vega cards too cheaply just doesn't do that.

    The solution for AMD is Navi.  We'll have to wait and see just how good Navi is.  But barring something catastrophic, AMD will be in a much more competitive position after various Navi cards launch than they are today, especially if, as rumored, Nvidia isn't moving to 7 nm until 2020.
  • RidelynnRidelynn Member EpicPosts: 7,383
    Quizzical said:
    The problem with using an RX Vega 56 to compete with an RTX 2060 is that it's a much more expensive card to build.  It's a bigger die, burns more power, and uses HBM2.  The goal isn't just to give gamers something to buy.  The goal is to make money, and selling Vega cards too cheaply just doesn't do that.
    I wasn't implying that AMD just go out and sell Vega for a lower price.
  • gervaise1gervaise1 Member EpicPosts: 6,919
    edited January 2019
    Quizzical said:
    <snip>

    The solution for AMD is Navi.  We'll have to wait and see just how good Navi is.  But barring something catastrophic, AMD will be in a much more competitive position after various Navi cards launch than they are today, especially if, as rumored, Nvidia isn't moving to 7 nm until 2020.
    It will indeed be interesting - and should be good news - to see what the long-awaited Navi brings. Although if rumours are correct, that doesn't seem likely before the second half of 2019.

    As for NVidia, if Navi is H2 2019 I would expect NVidia to engage in some obfuscation: announce new mid-range 12nm Turing cards - that will be available - along with new 7nm cards - that won't be available (until 2020).

    All that said the possible release dates will - to some extent - be out of the hands of both AMD and NVidia.

    They will depend on how well the new processes perform; what the yields are; what demands are placed on the foundries; whether AMD / NVidia want to pay a premium to jump the queue - if that is even an option - and so on. Indeed, AMD themselves will have to decide whether they initially prioritise 7nm GPUs or 7nm CPUs (the latter also being rumoured for H2 2019).

    As it stands we may find ourselves discussing Arcturus rumours before we see Navi!
  • QuizzicalQuizzical Member LegendaryPosts: 25,501
    gervaise1 said:

    As for NVidia, if Navi is H2 2019 I would expect NVidia to engage in some obfuscation: announce new mid-range 12nm Turing cards - that will be available - along with new 7nm cards - that won't be available (until 2020).
    If you're considering a GPU upgrade, Nvidia wants you to buy now.  They don't want to give you any reason at all to think that something better is coming soon and that you should wait.  If you wait too long, Navi will be out, and then you'll be a lot more likely to buy AMD than you are now--especially in the over $300 bracket, where AMD isn't very competitive at the moment.  That would even be the case if Nvidia had new, high end cards launching in a month or two.

    If Navi launches and is clearly superior to Turing--as seems reasonably likely because of the process node--then either after it launches or possibly just before, that's when Nvidia will start making noise about their own 7 nm.  If someone buying at that point would probably buy AMD, then Nvidia will want to give them all sorts of reasons to wait.  But not yet.

    Right now, it's AMD who wants to give you reasons to wait, at least at the high end.  So I expect that AMD will say something or other about Navi this week.
  • OzmodanOzmodan Member EpicPosts: 9,726
    Ridelynn said:
    I honestly wish AMD could do a Ryzen type of turn around on their cards and have something to compete with the 2080+ or at least come close. Having a 1080Ti I have zero reason to be excited with this go around of cards but in the future would love to have some real competition for NVIDIA.

    Unless you have a 1060 or below though there is no reason to upgrade at least from a value perspective at this time.
    I don't think AMD needs anything close to a 2080.

    Right now, AMD has a solution that competes favorably with a 2060. That would be Vega. Problem is nVidia is able to cut the price now: MSRP on 2060 is $350 (we'll see if that stays once they release in a few weeks), versus MSRP on a Vega64 (which is what I think would be roughly equivalent) being $499.  That actually isn't horrible on the part of nVidia - it roughly matches a 1070Ti, which was MSRP $450. I don't have a lot of bad things to say about the 2060 (plenty to say about nVidia's overall 20xx strategy, but not a lot about the 2060 itself).

    That being said, even that price point is on the upper end of where they need to be competitive to actually make a difference. The low and mid tier cards are where the volume occurs. If nothing else, Steam Surveys routinely have the low and mid tier cards in highest volume (nVidia 1060, 970, etc).

    The important price points, in my opinion, occur at <$180, $180-$300, and >$300 (in US dollars). Those, for me, are what I consider low, mid, and high tier brackets. And then you have the halo products that exist outside of that, which fall outside of any reasonable pricing structure (largely because they can).

    Now, maybe because of inflation and everything else, my completely unofficial price bracketing needs to be updated - I won't argue that. But even if we shift it by a few dollars up or down, we would have to shift it by a factor of at least x3 to get to anything close to where a 2080 is a relevant card.

    But on to my point, AMD should have an answer to the 2060: We know they can do that technically now, they just can't hit the same price point (yet). The RX590 competes pretty favorably versus the previous generation 1060 6G, but the new 2060 is considerably higher performance (as well as higher price, so I'm not knocking that too much). But I don't think they need to necessarily concern themselves with performance or pricing beyond that - everything above that point is halo, as far as I'm concerned.

    If AMD can compete against a 2060 in both performance and price, and flesh out the lineup between that and their 590 (or better yet, go ahead and look at retiring the Polaris lineup, which has had a great run, with new Vega+/Navi replacements), I think they will be doing well.

    I do agree with your sentiment that only those with 1060 or less should even consider upgrading... but then again, I don't necessarily see a great reason to upgrade each generation either, unless you are one of those people that has to be on the performance edge with top tier cards all the time. For most people, that's just too expensive to chase though. Quiz's thumbrule for upgrading GPU is a pretty good one to go by: unless it's broke, don't upgrade unless you can get at least double the performance. If you are looking at like tiers, that historically would be roughly every third generation.
    No no no, a Vega 64 is not equivalent to a 2060, it is much closer to a 1080 and priced appropriately.
  • OzmodanOzmodan Member EpicPosts: 9,726
    The 2060 is rather funny: it has ray tracing capabilities, it just does not have enough performance to use them.

  • QuizzicalQuizzical Member LegendaryPosts: 25,501
    Ozmodan said:
    The 2060 is rather funny: it has ray tracing capabilities, it just does not have enough performance to use them.

    The RTX 2080 Ti doesn't have enough performance to do a lot of ray tracing, either.  It has enough performance to do just a little, and the RTX 2060 to do just a little less.

    The real question is whether game developers can do anything cool with just a little bit of ray tracing.  I think that there is some real potential there in any game that wants to make you look at the game world to find something.  Done properly, the one object in the game world that is ray-traced can be made to jump out at you as looking different from everything else.  But I also think that most of the developers who implement ray tracing won't do anything important to gameplay with it, but will mainly use it to kill your performance.
  • RidelynnRidelynn Member EpicPosts: 7,383
    Remember Tessellation in 2010/2011 era? 

    It was supposed to be the BIG thing in DX11. And nVidia performance in tessellation crushed AMD performance at the time. They even named part of the architecture pipeline in Fermi specifically for it - the PolyMorph Engine.

    Sounds vaguely familiar....

    I'm sure tessellation is still around today. I'm sure it's built into a bunch of big game engines. It's just become transparent and non-newsworthy. 

    I think RTX/Raytracing is the new Tessellation - you ~need~ more hardware to run it, and the effect is something less than awe-inspiring. And by the time we have the hardware to run it, everyone will largely be over it and no one will care about it any longer.
  • QuizzicalQuizzical Member LegendaryPosts: 25,501
    edited January 2019
    Ridelynn said:
    Remember Tessellation in 2010/2011 era? 

    It was supposed to be the BIG thing in DX11. And nVidia performance in tessellation crushed AMD performance at the time. They even named part of the architecture pipeline in Fermi specifically for it - the PolyMorph Engine.

    Sounds vaguely familiar....

    I'm sure tessellation is still around today. I'm sure it's built into a bunch of big game engines. It's just become transparent and non-newsworthy. 

    I think RTX/Raytracing is the new Tessellation - you ~need~ more hardware to run it, and the effect is something less than awe-inspiring. And by the time we have the hardware to run it, everyone will largely be over it and no one will care about it any longer.
    Tessellation was supposed to be the big thing in DirectX 10, not DirectX 11.  Then Nvidia saw that ATI had a good implementation of it and they didn't, threw a fit, and convinced Microsoft to remove it from DirectX 10, thus neutering the API and making it pointless.  Microsoft did add it back in DirectX 11, but it was too little, too late.

    Tessellation isn't about making graphics look better in the long run.  It's about making graphics look good for cheaper.  The idea is that you can use high polygon models when close up to look good, low polygon models when far away to save on processing, and interpolate between them smoothly enough for there to never be an obvious pop from one model to another.  This could have been a big deal if it was widespread a decade ago.
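
    As a rough illustration of that idea (a sketch of my own, not code from any engine or graphics API, with made-up distances and factors), the tessellation level can be driven by camera distance so detail scales down smoothly instead of popping between discrete LOD models:

```python
def tessellation_factor(distance, near=5.0, far=100.0, max_factor=64.0, min_factor=1.0):
    """Blend from a high subdivision factor up close to a low one far away."""
    # Clamp the normalized distance to [0, 1], then interpolate linearly.
    t = max(0.0, min(1.0, (distance - near) / (far - near)))
    return max_factor * (1.0 - t) + min_factor * t

# An object moving away from the camera loses detail gradually,
# so there is never an obvious pop from one model to the next.
for d in (5.0, 30.0, 60.0, 100.0):
    print(f"distance {d:6.1f} -> tessellation factor {tessellation_factor(d):5.1f}")
```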

    Once hardware got powerful enough to just use more polygons more of the time, it didn't gain you nearly as much.  Thus, it joined the long list of formerly important performance optimizations that used to be necessary but aren't anymore.  Unfortunately, it wasn't really available to use until it was already past its prime.

    Fermi did put in massively more tessellation hardware than could ever have any reasonable use.  I think this was done for the sake of latency, not throughput, though it showed up in synthetic throughput benchmarks.
  • CleffyCleffy Member RarePosts: 6,414
    I don't think AMD's issue is with hardware or software. It's with their industry support and how media outlets choose to test their equipment. From a pure spec sheet view, AMD's hardware decimates its equivalent card from nVidia. Heck, in a few games like Forza, the decimation is readily apparent. In GPGPU functionality for consumer cards, the AMD GPUs are simply better.

    The media issue is that review sites mostly use the same benchmark suites of popular games in DX11 and DX12. This makes nVidia look better, as they reached out to those developers in order to favor their hardware and proprietary solutions. Looking at games outside of this group paints a different picture. AMD's drivers are much more generalized due to the simplicity of a single architecture over many years. AMD also tends to fare better with console ports, as nearly all console games run on AMD hardware.

    As for developer support: a lot of nVidia proprietary solutions that naturally hurt the performance of AMD hardware are supported by major game engines. nVidia works with major developers to implement their technology into games. Developers themselves are not readily willing to adopt new technology and tend to use simpler methods of reaching a solution.
  • QuizzicalQuizzical Member LegendaryPosts: 25,501
    Cleffy said:
    I don't think AMD's issue is with hardware or software. It's with their industry support and how media outlets choose to test their equipment. From a pure spec sheet view, AMD's hardware decimates its equivalent card from nVidia. Heck, in a few games like Forza, the decimation is readily apparent. In GPGPU functionality for consumer cards, the AMD GPUs are simply better.

    The media issue is that review sites mostly use the same benchmark suites of popular games in DX11 and DX12. This makes nVidia look better, as they reached out to those developers in order to favor their hardware and proprietary solutions. Looking at games outside of this group paints a different picture. AMD's drivers are much more generalized due to the simplicity of a single architecture over many years. AMD also tends to fare better with console ports, as nearly all console games run on AMD hardware.

    As for developer support: a lot of nVidia proprietary solutions that naturally hurt the performance of AMD hardware are supported by major game engines. nVidia works with major developers to implement their technology into games. Developers themselves are not readily willing to adopt new technology and tend to use simpler methods of reaching a solution.
    Yes and no.  A good review will disable all GameWorks features, for example.  Comparing performance in software written by Nvidia just isn't a clean comparison.

    But AMD's GCN and Polaris architectures are a lot more finicky about what you need to do to get good performance than Nvidia's Maxwell and Pascal.  (I exclude more recent architectures because I don't know for certain.)

    For example, AMD has a wavefront size of 64, while Nvidia has a warp size of 32.  That is, when you do any computations, on AMD, 64 "threads" have to do the same thing, while on Nvidia, only 32 do.  If you need 23 to do something, then on Nvidia, it's as expensive as if 32 did, while on AMD, it's as expensive as if you needed 64.  For many compute purposes, it's easy to make everything happen in multiples of 64, but graphics is a lot more pathological than that.
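
    To make that rounding cost concrete, here is a toy sketch (my own illustration, not vendor code or a real GPU model): work is issued in whole warps/wavefronts, so a partially filled group is charged for the full SIMD width.

```python
import math

def lanes_charged(active_threads, simd_width):
    """Threads that need to run are rounded up to whole warps/wavefronts."""
    return math.ceil(active_threads / simd_width) * simd_width

active = 23  # the example from the paragraph above
for vendor, width in (("Nvidia warp", 32), ("AMD wavefront", 64)):
    charged = lanes_charged(active, width)
    print(f"{vendor:14s} (width {width}): {active} threads busy, "
          f"{charged} lanes charged, {charged - active} idle")
```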

    As another example, both AMD and Nvidia have some instructions in exactly 1/4 of their shaders.  On AMD, this makes the operation four times as expensive, as 3/4 of the shaders sit there doing nothing, while the other 1/4 take four passes to do it for all of the threads.  On Nvidia, this only makes the operation twice as expensive, as the GPU can route stuff better.

    There are a number of other ways that this is true, too.  That doesn't mean that AMD GPUs are bad.  But it does mean that they won't perform as well relative to Nvidia as you might guess from the paper specs.  In CPUs, this is commonly called IPC, or instructions per clock.  The concept doesn't fit GPUs in the same way that it does CPUs, but Nvidia GPUs have an easier time taking advantage of the resources that they have available than AMD GPUs do.  
  • AmazingAveryAmazingAvery Age of Conan AdvocateMember UncommonPosts: 7,188
    I found this interesting - The GTX 1080 FE had a launch price of $699 while the RTX 2060 FE has a launch price of $349. According to the extensive TechPowerUp review, the stock RTX 2060 is only 2% behind the GTX 1080 at 1440p - https://tpucdn.com/reviews/NVIDIA/GeForce_RTX_2060_Founders_Edition/images/relative-performance_2560-1440.png
    Which is interesting, given it costs HALF (50%) of the GTX 1080's launch price.
    While that is impressive and interesting to me, the RTX 2060 pricing would have been better at $50 less.

  • RidelynnRidelynn Member EpicPosts: 7,383
    edited January 2019
    AmazingAvery said:
    I found this interesting - The GTX 1080 FE had a launch price of $699 while the RTX 2060 FE has a launch price of $349. According to the extensive TechPowerUp review, the stock RTX 2060 is only 2% behind the GTX 1080 at 1440p - https://tpucdn.com/reviews/NVIDIA/GeForce_RTX_2060_Founders_Edition/images/relative-performance_2560-1440.png
    Which is interesting, given it costs HALF (50%) of the GTX 1080's launch price.
    While that is impressive and interesting to me, the RTX 2060 pricing would have been better at $50 less.

    Yeah, well you expect performance to go up for a given price generation over generation... or for price to go down for a given performance (same thing, just worded the other way).

    I don't know if it will equal a 1080 yet or not. Everything I'm seeing suggests ~maybe~ your 2% is about right... closer to a 1070 Ti is what I am betting, but there isn't a lot of difference between a 1070Ti and a 1080 either. 1070 Ti MSRP was $450.
     
    The difference here is, that usually price will go down by an entire tier (or performance up a tier). With the 2000 series, all the tiers slid up to more expensive brackets. Based on that, in the past, you would have expected 2060 to cost $250 +/-, and to roughly equal a 1070 in performance.

    So yeah, you’re still getting more performance (or equal performance for less money) with a 2060, but it's not nearly as good as what we've seen in the past. An erosion of value.

    Maybe that’s because Moore’s law is dying. Maybe it’s because of lack of competition/dominance in marketshare. Maybe it’s because of Crypto going away and nVidia trying to preserve margins riding on a new set of consumer price expectations. Maybe it’s just hubris. I guess consumers ultimately vote with their wallets and will deliver an answer. I know I haven’t felt the need to throw my money at anything 2000 series, and Vega VII isn’t making my wallet cry either.
  • OzmodanOzmodan Member EpicPosts: 9,726
    edited January 2019
    AmazingAvery said:
    I found this interesting - The GTX 1080 FE had a launch price of $699 while the RTX 2060 FE has a launch price of $349. According to the extensive TechPowerUp review, the stock RTX 2060 is only 2% behind the GTX 1080 at 1440p - https://tpucdn.com/reviews/NVIDIA/GeForce_RTX_2060_Founders_Edition/images/relative-performance_2560-1440.png
    Which is interesting, given it costs HALF (50%) of the GTX 1080's launch price.
    While that is impressive and interesting to me, the RTX 2060 pricing would have been better at $50 less.

    I am going to wait for real tests before believing the 2060 is even close to a 1080. As we all know, the released benchmarks for the 2070 and 2080 did not come close to the actual benchmarks. It is probably closer to a 1070. Secondly, anyone attempting to do ray tracing on this card is in for a rude awakening.
  • VrikaVrika Member LegendaryPosts: 7,990
    Ozmodan said:
    <snip>

    I am going to wait for real tests before believing the 2060 is even close to a 1080. As we all know, the released benchmarks for the 2070 and 2080 did not come close to the actual benchmarks. It is probably closer to a 1070. Secondly, anyone attempting to do ray tracing on this card is in for a rude awakening.
    Those are already real tests.

    In reviews RTX 2060 seems to be equal to or a bit ahead of GTX 1070 Ti, but losing a bit to GTX 1080.
  • RidelynnRidelynn Member EpicPosts: 7,383
    I'll wait until the reviews hit my usual trusted channels before putting too much faith in anything pre-release. Especially from NDAvidia
  • gervaise1gervaise1 Member EpicPosts: 6,919
    Ridelynn said:
    <snip>
    I think RTX/Raytracing is the new Tessellation - you ~need~ more hardware to run it, and the effect is something less than awe-inspiring. And by the time we have the hardware to run it, everyone will largely be over it and no one will care about it any longer.
    Understand the sentiment, but the big difference is that Ray Tracing is not the new kid on the block. It's the old timer that has been around forever. 

    If - to the extent that - their budget allows it, ray tracing is used in manufacturing, design, science (e.g. physics, cosmology), film and media (special effects), etc. And this has been the case for decades. Let me say that again - to the extent that their budget allows it! Let alone the budget of mere consumers.

    Now, if - as and when - the hardware becomes powerful and affordable enough, will it be available in games by default? I suspect that will depend on what impact, if any, it has on how long it takes to develop a game. 
  • OzmodanOzmodan Member EpicPosts: 9,726
    Vrika said:
    <snip>

    Those are already real tests.

    In reviews RTX 2060 seems to be equal to or a bit ahead of GTX 1070 Ti, but losing a bit to GTX 1080.
    ROFL!!!

  • ForgrimmForgrimm Member EpicPosts: 3,069
    edited January 2019
    I got the release email from NewEgg today, they're up on the site. They come with a Battlefield 5 or Anthem gift bundle while supplies last. https://www.newegg.com/GEFORCE-RTX-2060?cm_sp=Homepage-Top2016-_-P1_nvidia%2f19-0079-_-https%3a%2f%2fpromotions.newegg.com%2fnvidia%2f19-0079%2f1920x360.jpg&icid=489668
  • RidelynnRidelynn Member EpicPosts: 7,383
    The 2060 isn’t in a bad spot. But let’s just consider a generation earlier.

    1060 6GB roughly equaled the 980 (non-Ti) in performance.

    980 had an MSRP of $549 at release.
    1060 had an MSRP of $249 at release.

    And that is my issue with the 2000 series strategy in general: erosion of value. Not that value is entirely absent.

    Now I won’t discount that Pascal is a good architecture, and in general represented a decent value across the board — maybe one of the best nVidia has ever had. But before Pascal came out, you could say the same thing about Maxwell, and before that Kepler.

    Turing definitely reverses that trend to the detriment of the consumer, and in a very dramatic and glaring fashion.
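
    For what it's worth, here is that comparison as a small sketch. The MSRPs are the launch prices quoted in this thread, and the performance equivalences (1060 6GB ≈ 980, RTX 2060 ≈ 1070 Ti) are the rough claims made in this discussion, not benchmark results.

```python
# Launch MSRPs quoted in this thread; each pair is "old card -> new card of
# roughly the same performance" per the posts above, not measured benchmarks.
generations = [
    ("GTX 980 -> GTX 1060 6GB", 549, 249),
    ("GTX 1070 Ti -> RTX 2060", 450, 349),
]

for label, old_price, new_price in generations:
    saving = 100 * (1 - new_price / old_price)
    print(f"{label}: roughly the same performance for {saving:.0f}% less money")
# Pascal delivered 980-class performance for about 55% less;
# Turing delivers 1070 Ti-class performance for only about 22% less.
```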
  • RidelynnRidelynn Member EpicPosts: 7,383
    DMKano said:
    <snip>

    The reality is that all the video cards are more expensive now.

    I think the only thing that matters is the price for performance. 

    This is a 350 dollar card that is giving you the same performance as the 450 dollar previous gen.

    Just checked newegg- MSI has 6GB 1060s for $300-350.

    The prices just never fully recovered after the crypto mine hike 

    Right. Which is why I think it important to compare MSRP at launch rather than current prices when trying to evaluate generation over generation.

    nVidia is taking advantage of the price inflation that occurred with crypto. It established a new normal, and there isn’t enough competition to pull it back down. 

    That works to nVidia’s favor - they want higher margins of course. I don’t blame them at all and totally understand it. Doesn’t mean I like it.
  • OzmodanOzmodan Member EpicPosts: 9,726
    DMKano said:
    <snip>


    I don't think that people in the market for a 2060 plan on doing any ray-tracing at all, they are there because of the bang for the buck.

    1070ti - $450-470+
    1080s - $750

    So people are looking at RTX2060s as getting 1070ti performance for $350

    I don't know why that is - ROFL?

    Now consider those who also wanted to buy Anthem or Battlefield V ($50-$60) and you get that for free with an RTX2060 - that's not a bad deal

    I can see why some are going to jump and get RTX2060s - as you get decent performance at 2K and below - and the price compared to other nvidia cards is right.


    Where are you getting your prices? I can buy a 1080 for $500, a 1070 for $350.

  • QuizzicalQuizzical Member LegendaryPosts: 25,501
    edited January 2019
    DMKano said:
    <snip>

    The reality is that all the video cards are more expensive now.

    I think the only thing that matters is the price for performance. 

    This is a 350 dollar card that is giving you the same performance as the 450 dollar previous gen.

    Just checked newegg- MSI has 6GB 1060s for $300-350.

    The prices just never fully recovered after the crypto mine hike 

    If all video cards are more expensive now, then why aren't Radeons?

    https://www.newegg.com/Product/ProductList.aspx?Submit=ENE&IsNodeId=1&N=100007709 601296377

    MSRP is $229 for the 8 GB version.
  • ArglebargleArglebargle Member EpicPosts: 3,482
    Ozmodan said:
    <snip>

    Where are you getting your prices? I can buy a 1080 for $500, a 1070 for $350.

    Yeah, I recently got a 1070ti on sale for $350.  Will probably just skip the 20xx line.

    If you are holding out for the perfect game, the only game you play will be the waiting one.
