makes buying a VII a lot easier. 4096-bit memory bus LOL compared to Nvidia's 256-bit LOL. Just delusional people still buy dated tech.
Vega 56 memory bus = 2048 bits @ 409.6 GB/s bandwidth
Not even a contest anymore.
There's no reason to care about memory bus width for its own sake. What matters is memory bandwidth, capacity, the power it takes to get that, and occasionally space. Bus width is an input into those things, as is memory clock speed, but it's not something that you should care about in isolation.
But if you do want to play that game, then the memory on the Radeon VII is clocked at 1 GHz, while that of the GeForce GTX 1060 is clocked at 2 GHz, and 2 GHz is a lot more than 1 GHz. Right?
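For anyone who wants to check the arithmetic: peak bandwidth is bus width times the per-pin data rate, so a narrow bus on fast memory can match or beat a wide bus on slow memory. A quick sketch (the spec numbers below are from public spec sheets, quoted as assumptions for illustration):

```python
def bandwidth_gbs(bus_bits: int, rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s: bits per transfer * transfers/sec / 8."""
    return bus_bits * rate_gbps / 8

cards = {
    "Radeon VII (HBM2, 1 GHz DDR)":  bandwidth_gbs(4096, 2.0),   # 1024.0
    "Vega 56 (HBM2, 0.8 GHz DDR)":   bandwidth_gbs(2048, 1.6),   # 409.6
    "RTX 2080 (GDDR6, 14 Gbps)":     bandwidth_gbs(256, 14.0),   # 448.0
    "GTX 1060 (GDDR5, 8 Gbps)":      bandwidth_gbs(192, 8.0),    # 192.0
}

for name, bw in cards.items():
    print(f"{name}: {bw:.1f} GB/s")
```

Note that the Vega 56 line reproduces the 409.6 GB/s figure quoted above, and the 2080's narrow-but-fast GDDR6 lands at less than half the Radeon VII's bandwidth while clearly not being half the gaming card.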
makes buying a VII a lot easier. 4096-bit memory bus LOL compared to Nvidia's 256-bit LOL. Just delusional people still buy dated tech.
Vega 56 memory bus = 2048 bits @ 409.6 GB/s bandwidth
Not even a contest anymore.
Huh? There is absolutely no meaningful reason, unless you mine, to get a Radeon VII at MSRP. In AMD's internal review guide numbers, by their own admission, a 2080 will still be faster at the same price.
There are other compute purposes besides mining. If you're doing something where you know that you need 16 GB, then the RTX 2080 (or even 2080 Ti) is a non-starter, but the Radeon VII might be quite nice. Similarly if it's a compute thing that needs a ton of memory bandwidth.
There are also weird corner cases. The reason I bought a Vega 64 over a GTX 1080 Ti is the idle power consumption with three 144 Hz monitors attached. I'm not sure if that is fixed in Turing yet, but if not, the same issue would justify buying a Radeon VII over an RTX 2080 Ti if you're using my monitor setup.
I would agree that for most gamers, an RTX 2080 makes more sense than a Radeon VII. But "absolutely no meaningful reason" greatly overstates the case.
I thought the 2060 was faster than a 1660 Ti. The 2060 was 1070 Ti level, the 1660 is 1070 level - or so I thought.
I mean, yeah, that isn't a very wide margin of performance there, but still, it's also not "the same except for RayTracing and DLSS". And there was about a $70 delta between the 1070 and 1070Ti MSRP (street price is a different matter), so the price difference between a 1660Ti and 2060 is historically appropriate (regardless of how otherwise "appropriate" we may feel about it).
Also, I can think of a few reasons where very large memory bandwidth is a good benefit. But by and large, memory bandwidth isn't the ultimate indicator of gaming performance, which is what I care about, and the intended purpose of most of these GPU cards we are discussing.
Apart from that, I can think of one debatable reason to buy a Radeon VII at MSRP over a 2080, and one legit purpose. The legit purpose is 4K video editing, where the 16 GB of HBM shines. Here is a review; it focuses on gaming performance, but it mentions video editing specifically if you read the article. If you frequently edit a lot of video and want to buy something cheaper than a $5,000+ pro card, the VII is a steal.
The debatable reason would be a protest purchase. The performance is slower than a 2080, but only marginally so, so it's not like you are buying something that is entirely inferior or unsuitable... it's still the same class of performance. Every R7 purchase sends a pretty clear message to nVidia, and supports AMD.
I don't necessarily recommend that - I think you should get the best card, regardless of manufacturer, for the amount you have budgeted. But it's a plausible reason.
Multimedia encoding of any kind, which includes streaming.
One other thing: drivers. AMD's Adrenalin platform is hands down the best set of drivers I've used in all of my computing days.
From what I've seen of Nvidia, they are using the same driver formula from 2000....
RVII out of stock // RTX 2080 in stock // RTX 2070 price slashed ......
Newegg lists only two different Radeon VII models; it doesn't look like AMD has any intention of making more than just a few Radeon VII cards.
There's no reliable sales data, but if you count Newegg reviews, then Nvidia has sold 70 RTX 2080 Tis for each Radeon VII sold by AMD.
It's a soft launch, as AMD wanted to get the cards out there as soon as they possibly could, rather than waiting a couple of months until they had enough inventory to keep them in stock forever. There weren't a whole lot of GTX 1080s available shortly after launch, either, but that didn't mean it was a bad card.
AmazingAvery - Age of Conan Advocate, Uncommon Member, Posts: 7,188
The thing with the 1660 Ti is that there are so many variations, with MSRP cards at $280 ranging up to mass-market cards at $320-$330 that offer more premium build quality, more fans, some factory OC boosting, and larger heat sinks for temperature headroom when overclocking.
This is true of nearly every GPU though... I mean, just look at the RTX 2060 - on Newegg there are 6 different SKUs from Gigabyte alone, with a $50 spread between them. And there are at least 5 other manufacturers with similar numbers of SKUs.
True, but so far with the 1660 Ti there are 25 known SKUs, which is pretty nuts! Sure, some are for regional markets, but at the higher-end pricing an RTX 2060 is a better buy.
Please explain to me how the RTX 2060 is a better buy? The ray tracing and DLSS on a 2060 are practically useless. Hence it is not worth one penny more than the 1660.
Sure - MSRP on the 1660 Ti is $280; on the 2060 it's $350. Those are the lower ends, so it varies by which card has your eye, i.e. some 1660 Ti models go for $330. Nevertheless there is a basic $70 difference. So that is one piece understood; on the other side, what about performance? Well, at 1440p the 1660 Ti is 14% slower over 33 games (see below, near the end). At 1080p there is a 23% difference according to AnandTech.
So excluding RTX features like DLSS and ray tracing - which I agree are in their infancy, but getting better and will keep getting better - on performance alone and the price difference, it's worth considering. As I mentioned, if you are looking at a higher-end 1660 Ti model at $330 and you compare that to a $350 2060, I'd take the $20 extra for a 2060 and the performance boost with it.
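For what it's worth, the thread's own numbers can be turned into a crude perf-per-dollar comparison. This sketch just reuses the figures claimed above (1660 Ti roughly 14% slower at 1440p; $280/$330/$350 price points); it is not a fresh benchmark:

```python
# Perf-per-dollar using the numbers quoted in this thread (assumptions,
# not new measurements): 2060 = 100 relative perf, 1660 Ti = 86 (14% slower).

cards = {
    "GTX 1660 Ti": {"relative_perf": 86.0, "price": 280.0},
    "RTX 2060":    {"relative_perf": 100.0, "price": 350.0},
}

for name, c in cards.items():
    ppd = c["relative_perf"] / c["price"]
    print(f"{name}: {ppd:.3f} perf points per dollar")

# By this crude metric the 1660 Ti at $280 MSRP is the better value
# (0.307 vs 0.286), but a $330 1660 Ti (0.261) flips the comparison,
# which is exactly the argument being made above.
```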
$350 is in no way a "lower end" card except in a relative sense with much of the new lineup not yet released. $350 is mid-range to upper mid-range. $100 is a lower end card.
DLSS is garbage and will always be garbage. It's dead, and the only question is whether Nvidia knows it yet. They probably do, but don't want to let their fanboys in on that secret until they come up with the next gimmick.
Real-time ray tracing probably has a future. DLSS doesn't.
I didn't say that $350 is low end in the sense you're saying. I was meaning this -
1660 Ti range: $280 (low) to $330 (high)
2060 range: $350 and up
- i.e., within the price ranges of the products.
I 100% agree $350 is mid-range, and I'd also add that Nvidia is shifting the pricing segments up this generation, which is not good at all.
As for DLSS, it works near perfectly on Metro Exodus. I appreciate the innovation and implementation. It is a new take on the tech, and it has to start somewhere and then iterate. It's totally a great example of an agile approach: progress over perfection. Get something out the door and iterate on it, which is the same approach game devs will take with it. And at least they are fiscally supporting the tech.
AmazingAvery - Age of Conan Advocate, Uncommon Member, Posts: 7,188
makes buying a VII a lot easier. 4096-bit memory bus LOL compared to Nvidia's 256-bit LOL. Just delusional people still buy dated tech.
Vega 56 memory bus = 2048 bits @ 409.6 GB/s bandwidth
Not even a contest anymore.
Huh? There is absolutely no meaningful reason, unless you mine, to get a Radeon VII at MSRP. In AMD's internal review guide numbers, by their own admission, a 2080 will still be faster at the same price.
There are other compute purposes besides mining. If you're doing something where you know that you need 16 GB, then the RTX 2080 (or even 2080 Ti) is a non-starter, but the Radeon VII might be quite nice. Similarly if it's a compute thing that needs a ton of memory bandwidth.
There are also weird corner cases. The reason I bought a Vega 64 over a GTX 1080 Ti is the idle power consumption with three 144 Hz monitors attached. I'm not sure if that is fixed in Turing yet, but if not, the same issue would justify buying a Radeon VII over an RTX 2080 Ti if you're using my monitor setup.
I would agree that for most gamers, an RTX 2080 makes more sense than a Radeon VII. But "absolutely no meaningful reason" greatly overstates the case.
Sure, there are a few niche use cases. I had no idea about the multi-monitor setup - that is interesting!
For several generations Adobe video software has favoured Nvidia, and it still does. The extra memory is very helpful on the Radeon, but from what I've read the difference in testing is minimal between the two. At places like VideoCopilot you can read how much more favourable Nvidia has been on hardware acceleration since Polaris.
Personally I was hoping the Radeon VII would be great, but it is just too loud and hot, and I'm disappointed, as I expected more, especially with the die shrink. But I'm equally disappointed with the RTX high-end pricing shenanigans, very.
I didn't say that $350 is low end in the sense you're saying. I was meaning this -
1660 Ti range: $280 (low) to $330 (high)
2060 range: $350 and up
- i.e., within the price ranges of the products.
I 100% agree $350 is mid-range, and I'd also add that Nvidia is shifting the pricing segments up this generation, which is not good at all.
As for DLSS, it works near perfectly on Metro Exodus. I appreciate the innovation and implementation. It is a new take on the tech, and it has to start somewhere and then iterate. It's totally a great example of an agile approach: progress over perfection. Get something out the door and iterate on it, which is the same approach game devs will take with it. And at least they are fiscally supporting the tech.
Ah, sorry for misunderstanding your point about "lower end" - meaning the price range of SKUs for a given card.
As for DLSS in Metro Exodus, I'd really want to see a comparison of it with traditional upscaling, akin to what the video linked from this thread did with Battlefield 5:
https://forums.mmorpg.com/discussion/479496/apparently-dlss-is-as-bad-as-we-thought-it-would-be#latest
Depending on the resolution at which a game is rendered for DLSS, it could be tuned for relatively higher frame rates at worse image quality, or for lower frame rates at better image quality. What you have to do is compare it to simple upscaling at the same frame rate, in the same portion of the same game, and then compare the image quality. If DLSS can't beat traditional upscaling in image quality at the same frame rate - in spite of simple upscaling having more samples to work with - then it's garbage.
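A minimal sketch of that comparison methodology, using a synthetic image and PSNR as a stand-in quality metric (everything here is made up for illustration; a real test would compare DLSS output against the game's native frames, and might use SSIM instead):

```python
import math

W, H = 64, 64

def make_ref():
    # Synthetic "frame": a gradient with a diagonal edge, pixel values 0-255.
    return [[(x + y) % 256 if x > y else (x * 2) % 256
             for x in range(W)] for y in range(H)]

def downscale2(img):
    # Naive 2x downscale by point sampling (stands in for rendering
    # at a lower resolution, which is what buys the frame rate).
    return [[img[2 * y][2 * x] for x in range(W // 2)] for y in range(H // 2)]

def upscale2_nearest(img):
    # The "dumb" baseline upscale that a smarter upscaler has to beat.
    return [[img[y // 2][x // 2] for x in range(W)] for y in range(H)]

def psnr(a, b, peak=255.0):
    # Peak signal-to-noise ratio in dB; higher means closer to the reference.
    mse = sum((pa - pb) ** 2 for ra, rb in zip(a, b)
              for pa, pb in zip(ra, rb)) / (W * H)
    return float("inf") if mse == 0 else 10 * math.log10(peak * peak / mse)

ref = make_ref()
upscaled = upscale2_nearest(downscale2(ref))
print(f"nearest-neighbour 2x upscale: {psnr(ref, upscaled):.1f} dB vs reference")
```

Whatever score DLSS posts at the same frame rate has to beat a baseline like this one, or the tensor cores aren't earning their keep.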
https://m.youtube.com/watch?v=5JczNqpqwfI&time_continue=2
RT + DLSS "worked", but the image was much softer, almost blurry, and the colors muted. And there was a 20-50% FPS hit. I thought the standard rasterized image looked much better.
But apart from the FPS, everything else is my subjective opinion and may differ from your own. The tech did work; I didn't notice any shimmering or other obvious glitches.
The thing with the 1660 Ti is that there are so many variations, with MSRP cards at $280 ranging up to mass-market cards at $320-$330 that offer more premium build quality, more fans, some factory OC boosting, and larger heat sinks for temperature headroom when overclocking.
This is true of nearly every GPU though... I mean, just look at the RTX 2060 - on Newegg there are 6 different SKUs from Gigabyte alone, with a $50 spread between them. And there are at least 5 other manufacturers with similar numbers of SKUs.
True, but so far with the 1660 Ti there are 25 known SKUs, which is pretty nuts! Sure, some are for regional markets, but at the higher-end pricing an RTX 2060 is a better buy.
Please explain to me how the RTX 2060 is a better buy? The ray tracing and DLSS on a 2060 are practically useless. Hence it is not worth one penny more than the 1660.
Sure - MSRP on the 1660 Ti is $280; on the 2060 it's $350. Those are the lower ends, so it varies by which card has your eye, i.e. some 1660 Ti models go for $330. Nevertheless there is a basic $70 difference. So that is one piece understood; on the other side, what about performance? Well, at 1440p the 1660 Ti is 14% slower over 33 games (see below, near the end). At 1080p there is a 23% difference according to AnandTech.
So excluding RTX features like DLSS and ray tracing - which I agree are in their infancy, but getting better and will keep getting better - on performance alone and the price difference, it's worth considering. As I mentioned, if you are looking at a higher-end 1660 Ti model at $330 and you compare that to a $350 2060, I'd take the $20 extra for a 2060 and the performance boost with it.
You cannot do ray tracing or DLSS on a 2060, unless of course you want FPS in the single digits. My friend at Microcenter tells me they are selling very few 2060s; there just does not seem to be a market for them. They already canceled a backup order; the stock is just not moving. It is a card that fits few buyers. Few gamers play at 1440p. From what I have seen in benchmarks, your percentages are quite high. I very much doubt there is that much difference.
As to the new Metro, I have seen the game with ray tracing and DLSS turned on with my friend's 2080. First off, it majorly kills performance, and it looked to both of us like the images were blurrier than with it off. So far my friend has not found a game where turning ray tracing on is worth it. He is rather disappointed in Nvidia at this point; he got caught up in their gimmick.
I think both the 1660 Ti and 2060 are very competitive for nVidia. The 1660 Ti forces AMD to move the RX 590, and the 2060 forces AMD to move the Vega 56 to be competitive - at least until they can show Navi off. Sure, I wish the nVidia cards were cheaper, but that's speaking as a consumer. AMD still has the <$250 market wrapped up with the 570/580, but that only lasts so long, and even those cards have been out for a while (being minor bump refreshes of the 470/480; Polaris has been around since June 2016).
If I were an nVidia investor, I would say they are marketed about right, and the only thing I would be concerned about is that the 2060 includes all the additional cost of RTX without being able to deliver the benefits... you could have put out a chip with the same rasterizing performance, minus RTX, and sold it for about the same price with a lot lower production cost.
Rumor is that a 1660 and 1650 will come out later in March/April and hit price points in the upper $100s and lower $200s, and that will fill out nVidia's lineup for this generation.
I think the 2070 is a ridiculous card standing next to the 2060, and it didn't make a lot of sense in the lineup before the 2060 was announced anyway. I think the 2080 is priced to the upper end of what I would ever consider reasonable for a top tier card, and the 2080Ti/Titan are just out of any ballpark I plan to play in.
I think nVidia's overall strategy is to skew the entire GPU cost lineup higher - so that people no longer think of a "budget" GPU as the <$200 market, but rather somewhere north of $300, and the mid and upper tiers significantly higher than that even. I can see that making sense - they have a large majority marketshare so they can push the market around, it's before a major release by AMD, and well before a push by Intel to get into the market. If they can push the price points all higher before the competition can catch up, any resulting "price war" from competition is softened significantly, and margins can stay higher long term.
makes buying a VII a lot easier. 4096-bit memory bus LOL compared to Nvidia's 256-bit LOL. Just delusional people still buy dated tech.
Vega 56 memory bus = 2048 bits @ 409.6 GB/s bandwidth
Not even a contest anymore.
There's no reason to care about memory bus width for its own sake. What matters is memory bandwidth, capacity, the power it takes to get that, and occasionally space. Bus width is an input into those things, as is memory clock speed, but it's not something that you should care about in isolation.
But if you do want to play that game, then the memory on the Radeon VII is clocked at 1 GHz, while that of the GeForce GTX 1060 is clocked at 2 GHz, and 2 GHz is a lot more than 1 GHz. Right?
I do multimedia encoding and also work that includes After Effects from Adobe; it is a huge advantage to have as much bandwidth as possible when rendering.
In your own statement you are looking at 2 GHz on a 256-bit memory bus vs 1 GHz at 4096 bits - you think a faster memory clock with a narrow bit width is better? Kind of a weird thing to think that you can simply increase clocks forever and not widen the memory bus or its speed. Using GDDR6 is Nvidia's answer to the future? That future looks bleak, IMHO.
If you care about memory bandwidth, then look at memory bandwidth. It's sensible to care about bandwidth for a lot of purposes. But don't use bus width as a proxy. Otherwise, you'd end up choosing a Radeon R9 Fury with 512 GB/sec of bandwidth and a 4096-bit memory bus over a GeForce RTX 2080 Ti with 616 GB/sec of bandwidth and a 352-bit memory bus.
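In code, the point is that ranking cards by bus width and ranking them by bandwidth give different orders. The spec numbers below are public figures, listed here as assumptions:

```python
# Rank some cards by bus width and by actual bandwidth, and note that
# the two orderings disagree.  (name, bus width in bits, bandwidth in GB/s)
cards = [
    ("Radeon R9 Fury", 4096, 512.0),
    ("RTX 2080 Ti",     352, 616.0),
    ("Radeon VII",     4096, 1024.0),
    ("GTX 1060",        192, 192.0),
]

by_bus = [name for name, bus, bw in sorted(cards, key=lambda c: -c[1])]
by_bw  = [name for name, bus, bw in sorted(cards, key=lambda c: -c[2])]

print("widest bus first:     ", by_bus)
print("most bandwidth first: ", by_bw)
# The Fury "wins" on bus width but loses to the 2080 Ti on the number
# that actually matters.
```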
The thing with the 1660 Ti is that there are so many variations, with MSRP cards at $280 ranging up to mass-market cards at $320-$330 that offer more premium build quality, more fans, some factory OC boosting, and larger heat sinks for temperature headroom when overclocking.
This is true of nearly every GPU though... I mean, just look at the RTX 2060 - on Newegg there are 6 different SKUs from Gigabyte alone, with a $50 spread between them. And there are at least 5 other manufacturers with similar numbers of SKUs.
True, but so far with the 1660 Ti there are 25 known SKUs, which is pretty nuts! Sure, some are for regional markets, but at the higher-end pricing an RTX 2060 is a better buy.
Please explain to me how the RTX 2060 is a better buy? The ray tracing and DLSS on a 2060 are practically useless. Hence it is not worth one penny more than the 1660.
Sure - MSRP on the 1660 Ti is $280; on the 2060 it's $350. Those are the lower ends, so it varies by which card has your eye, i.e. some 1660 Ti models go for $330. Nevertheless there is a basic $70 difference. So that is one piece understood; on the other side, what about performance? Well, at 1440p the 1660 Ti is 14% slower over 33 games (see below, near the end). At 1080p there is a 23% difference according to AnandTech.
So excluding RTX features like DLSS and ray tracing - which I agree are in their infancy, but getting better and will keep getting better - on performance alone and the price difference, it's worth considering. As I mentioned, if you are looking at a higher-end 1660 Ti model at $330 and you compare that to a $350 2060, I'd take the $20 extra for a 2060 and the performance boost with it.
You cannot do ray tracing or DLSS on a 2060, unless of course you want FPS in the single digits. My friend at Microcenter tells me they are selling very few 2060s; there just does not seem to be a market for them. They already canceled a backup order; the stock is just not moving. It is a card that fits few buyers. Few gamers play at 1440p. From what I have seen in benchmarks, your percentages are quite high. I very much doubt there is that much difference.
As to the new Metro, I have seen the game with ray tracing and DLSS turned on with my friend's 2080. First off, it majorly kills performance, and it looked to both of us like the images were blurrier than with it off. So far my friend has not found a game where turning ray tracing on is worth it. He is rather disappointed in Nvidia at this point; he got caught up in their gimmick.
Ray tracing today is about where rasterization was in the early 90s. It's not just a dumb gimmick, but it's also a long way away from being the solution to everything.
I think that the best way to do ray tracing is to pick much lighter computational loads for it. Don't try to do all the complexity of high end graphics, and then stack ray tracing on top of that. Rather, have much simpler models, with both fewer polygons and fewer models, perhaps about on the order of complexity that you'd have done on a Nintendo 64.
And then go full ray tracing for everything. Have quite a few reflections and proper shadows, and use them in a way that actually matters for gameplay. Maybe a GeForce RTX 2080 Ti would only be able to run the game at 1024x768 to get a good frame rate. So what. Show off what ray tracing can do by making a game that could not be made with rasterization. Make a game that is to ray tracing what StarFox was to rasterization.
Build the game for low resolutions, and make sure that the UI scales well to very low resolutions--like 320x240. Build the game to run in a window rather than assuming the full screen. Have a software version of ray tracing to allow the game to run on non-RTX hardware, but merely at very low frame rates or resolutions (or both).
Needing a top end GPU to run the game at a rather low resolution means that the game would only have a small niche. Game developers don't have much incentive to make a game that hardly anyone can play well. So make it not just a sponsored title in the sense that some other games have, but a fully paid for title with Nvidia funding the development of the entire game. This doesn't need to be an AAA title; you could do a lot with a few million dollars. It would really be a marketing expense for Nvidia, so if you spend $3 million to make a game that only gets a small fraction of that back, so what? Include a few such games for free with purchase of any RTX card.
A few shadows or reflections here and there are barely noticeable other than in their effects on your frame rate. An entire game full of them will jump out at you. If you want people to care about ray tracing, you have to make it matter, and that means that a game needs to go all in on it.
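To gesture at how small such a software fallback could be: here is a toy CPU ray tracer - one sphere, one ground plane, one point light, hard shadows - rendered as ASCII at a deliberately tiny resolution. Everything in it (scene layout, resolution) is made up for illustration:

```python
import math

def dot(a, b): return sum(x * y for x, y in zip(a, b))
def sub(a, b): return tuple(x - y for x, y in zip(a, b))

EYE    = (0.0, 1.0, 0.0)   # camera position
SPHERE = (0.0, 1.0, 3.0)   # sphere centre
RADIUS = 1.0
LIGHT  = (-3.0, 4.0, 0.0)  # point light

def hit_sphere(origin, d):
    """Nearest t > 0 with |origin + t*d - SPHERE| = RADIUS, else None."""
    oc = sub(origin, SPHERE)
    a = dot(d, d)
    b = 2.0 * dot(oc, d)
    c = dot(oc, oc) - RADIUS ** 2
    disc = b * b - 4.0 * a * c
    if disc < 0:
        return None
    t = (-b - math.sqrt(disc)) / (2.0 * a)
    return t if t > 1e-4 else None

def shade(sx, sy):
    d = (sx, sy, 1.0)                      # pinhole camera, unnormalised ray
    if hit_sphere(EYE, d) is not None:
        return "@"                         # sphere
    if sy < 0:                             # ray heads down: ground plane y = 0
        t = -EYE[1] / sy
        p = tuple(e + t * di for e, di in zip(EYE, d))
        ts = hit_sphere(p, sub(LIGHT, p))  # shadow ray toward the light
        return "#" if ts is not None and ts < 1.0 else "."
    return " "                             # sky

rows = ["".join(shade(-1.0 + 2.0 * i / 31, 1.0 - 2.0 * j / 11)
                for i in range(32)) for j in range(12)]
print("\n".join(rows))
```

A real game would obviously need bounces, materials, and acceleration structures, but the core loop really is this small, which is why "low resolution, full ray tracing, software fallback" is plausible at all.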
I was at Microcenter today picking up some parts, and they had a new shipment of Nvidia 1660s come in. I really have to wonder about Nvidia's attempt at marketing this board, as it just makes no sense for anyone to purchase one. For about $30 more you can get a 2060, which is a lot more powerful. Only a fool would pinch pennies and buy one.
Even the sales reps were skeptical about selling them; they certainly were not going to recommend them.
I was at Microcenter today picking up some parts, and they had a new shipment of Nvidia 1660s come in. I really have to wonder about Nvidia's attempt at marketing this board, as it just makes no sense for anyone to purchase one. For about $30 more you can get a 2060, which is a lot more powerful. Only a fool would pinch pennies and buy one.
Even the sales reps were skeptical about selling them; they certainly were not going to recommend them.
The price gap on New Egg right now is $70, not $30. I'd agree that it makes sense to spring for an RTX 2060 if the price difference is only $30, but not so much if it's $70.
Looks like Nvidia is wanting to establish a new price point. We'll have to see if AMD plays along. Why sell a card for $150 if you can be competitive at $210?
The price gap on New Egg right now is $70, not $30. I'd agree that it makes sense to spring for an RTX 2060 if the price difference is only $30, but not so much if it's $70.
You could buy a 2060 for $349 at Microcenter, and the cheapest 1660 was $299; that is not $70. I don't know why people have to quote Newegg all the time; I've never bought any computers or parts from them, ever. They are just not price leaders.
You could buy a 2060 for $349 at Microcenter, and the cheapest 1660 was $299; that is not $70. I don't know why people have to quote Newegg all the time; I've never bought any computers or parts from them, ever. They are just not price leaders.
Microcenter is one store with only 25 locations nationwide; it is not really indicative of what's available to the general population. For most people, the difference will be $70. Even with your Microcenter example, that's a $50 difference, not $30. Also, the 1660s start at $219, and the highest one doesn't go up to $299 - unless you're talking about the 1660 Ti.
You could buy a 2060 for $349 at Microcenter, and the cheapest 1660 was $299; that is not $70. I don't know why people have to quote Newegg all the time; I've never bought any computers or parts from them, ever. They are just not price leaders.
On New Egg, the 2060 was $350 and the 1660 Ti was $280. Probably not coincidentally, that's also the MSRP for both cards.
The reason I mostly quote New Egg is that they have a search function that works, which makes it easy to find prices. When all I want to do is quote prices, I don't want to spend half an hour fighting with Amazon's broken search function to get an answer.
You could buy a 2060 for $349 at Microcenter, and the cheapest 1660 was $299; that is not $70. I don't know why people have to quote Newegg all the time; I've never bought any computers or parts from them, ever. They are just not price leaders.
On New Egg, the 2060 was $350 and the 1660 was $280. Probably not coincidentally, that's also the MSRP for both cards.
The reason I mostly quote New Egg is that they have a search function that works, which makes it easy to find prices. When all I want to do is quote prices, I don't want to spend half an hour fighting with Amazon's broken search function to get an answer.
1660s are at $220 on Newegg.
1660 Tis are at $280.
You are correct. That is what I meant, and I've edited the post as such.
Yep, sorry, I was referring to the 1660 Ti; I have yet to see a plain 1660.
I just do not buy PC parts online; it's too much of a hassle shipping them back if you have issues. I am lucky to have a Microcenter nearby. They match any price on Newegg if need be, and their microprocessors are always cheaper than you can find online.
Comments
But if you do want to play that game, then the memory on the Radeon VII is clocked at 1 GHz, while that of the GeForce GTX 1060 is clocked at 2 GHz, and 2 GHz is a lot more than 1 GHz. Right?
There are also weird corner cases. The reason I bought a Vega 64 over a GTX 1080 Ti is the idle power consumption with three 144 Hz monitors attached. I'm not sure if that is fixed in Turing yet, but if not, the same issue would justify buying a Radeon VII over an RTX 2080 Ti if you're using my monitor setup.
I would agree that for most gamers, an RTX 2080 makes more sense than a Radeon VII. But "absolutely no meaningful reason" greatly overstates the case.
I was meaning this -
1660 Ti range $280 (low) to $330 (high)
2060 range $350 and up
meant within the ranges of the products.
I 100% agree $350 is mid range and I’d also add Nvidia is shifting the pricing distribution segments this gen up, and that is not good at all
As for DLSS it works near perfect on metro exodus. I appreciate the innovation and implementation. It is new take on tech and has to start somewhere then iterate on it. It’s totally a great example of an agile approach, progress over perfection. Get something out the door and iterate on it which is the same approach game devs will have with it. And at least they are fiscally supporting in the tech.
Sure there are a few niche use cases. I had no idea on the multi monitor set up that is interesting!
For several generations Adobe video software favours Nvidia And still does. The extra memory is very helpful on the Radeon but the difference in testing is minimal between the two from what I’ve read. Places like VideoCopilot you can read how much more favorable Nvidia has been since Polaris on hardware acceleration.
Personally I was hoping Radeon VII would be great but it is just to loud and hot and I’m disappointed as expected more especially with the die shrink. But I’m equally disappointed with the RTX high end pricing shenanigans, very.
As for DLSS in Metro Exodus, I'd really want to see a comparison of it with traditional upscaling akin to what the video linked from this thread did with Battlefield 5:
https://forums.mmorpg.com/discussion/479496/apparently-dlss-is-as-bad-as-we-thought-it-would-be#latest
Depending on the resolution at which a game is rendered for DLSS, it could be tuned for relatively higher frame rates at worst image quality or lower frame rates at better image quality. What you have to do is to compare it to simple upscaling at the same frame rate in the same portion of the same game and then compare the image quality. If DLSS can't beat traditional upscaling in image quality at the same frame rate--in spite of simple upscaling having more samples to work with--then it's garbage.
https://m.youtube.com/watch?v=5JczNqpqwfI&time_continue=2
RT + DLSS “worked”, but the image was much softer, almost blurry, and the colors muted. And 20-50% FPS hit. I thought the standard rasterized image looked much better.
But apart from the FPS, everything else is my subjective opinion and may differ from your own. The tech did work; I didn't notice any shimmering or other obvious glitches.
As to the new Metro, I have seen the game with ray tracing and DLSS turned on on my friend's 2080. First off, it majorly kills performance, and it looked to both of us like the images were more blurry than with them off. So far my friend has not found a game where turning ray tracing on is worth it. He is rather disappointed in Nvidia at this point; he got caught up in their gimmick.
If I were an nVidia investor, I would say they are marketed about right, and the only thing I would be concerned about is that the 2060 includes all the additional cost of RTX without being able to deliver the benefits... you could have put out a chip with the same rasterizing performance, minus RTX, and sold it at about the same price with a much lower production cost.
Rumor is that a 1660 and 1650 will come out later in March/April, and hit the upper $100 and lower $200 price points, and that will fill out the lineup for nVidia on this generation.
I think the 2070 is a ridiculous card standing next to the 2060, and it didn't make a lot of sense in the lineup before the 2060 was announced anyway. I think the 2080 is priced to the upper end of what I would ever consider reasonable for a top tier card, and the 2080Ti/Titan are just out of any ballpark I plan to play in.
I think nVidia's overall strategy is to skew the entire GPU cost lineup higher - so that people no longer think of a "budget" GPU as the <$200 market, but rather somewhere north of $300, and the mid and upper tiers significantly higher than that even. I can see that making sense - they have a large majority marketshare so they can push the market around, it's before a major release by AMD, and well before a push by Intel to get into the market. If they can push the price points all higher before the competition can catch up, any resulting "price war" from competition is softened significantly, and margins can stay higher long term.
I think that the best way to do ray tracing is to pick much lighter computational loads for it. Don't try to do all the complexity of high end graphics, and then stack ray tracing on top of that. Rather, have much simpler models, with both fewer polygons and fewer models, perhaps about on the order of complexity that you'd have done on a Nintendo 64.
And then go full ray tracing for everything. Have quite a few reflections and proper shadows, and use them in a way that actually matters for gameplay. Maybe a GeForce RTX 2080 Ti would only be able to run the game at 1024x768 to get a good frame rate. So what. Show off what ray tracing can do by making a game that could not be made with rasterization. Make a game that is to ray tracing what StarFox was to rasterization.
Build the game for low resolutions, and make sure that the UI scales well to very low resolutions--like 320x240. Build the game to run in a window rather than assuming the full screen. Have a software version of ray tracing to allow the game to run on non-RTX hardware, but merely at very low frame rates or resolutions (or both).
Needing a top end GPU to run the game at a rather low resolution means that the game would only have a small niche. Game developers don't have much incentive to make a game that hardly anyone can play well. So make it not just a sponsored title in the sense that some other games have, but a fully paid for title with Nvidia funding the development of the entire game. This doesn't need to be an AAA title; you could do a lot with a few million dollars. It would really be a marketing expense for Nvidia, so if you spend $3 million to make a game that only gets a small fraction of that back, so what? Include a few such games for free with purchase of any RTX card.
A few shadows or reflections here and there are barely noticeable other than in their effects on your frame rate. An entire game full of them will jump out at you. If you want people to care about ray tracing, you have to make it matter, and that means that a game needs to go all in on it.
https://www.reddit.com/r/pcgaming/comments/ah9b2g/q2vkpt_quake_2_realtime_path_tracing_using_nvidia/
*edit -- mistook Doom for Q2, fixed and added links
Even the sales reps were skeptical about selling them, they certainly were not going to recommend them.
"Be water my friend" - Bruce Lee
The reason I mostly quote New Egg is that they have a search function that works, which makes it easy to find prices. When all I want to do is quote prices, I don't want to spend half an hour fighting with Amazon's broken search function to get an answer.
P.S. also a 1650 Ti @ $160, a 1650 @ $130, and either a 1650 SE/1630 Ti/1630 at $110
I just do not buy PC parts online, too much of a hassle if you have issues and shipping them back. I am lucky to have a Microcenter nearby. They match any price on Newegg if need be and their microprocessors are always cheaper than you can find online.