We'll have to wait for the reviews to determine performance levels, and then we can factor that into the "does the price plus performance work for my budget" equation.
And while you're at it, may I recommend not pre-ordering until then.
I want to upgrade from my 970. It's not a bad card, but the memory limitation isn't aging well, and it isn't really aging well for an x70-series card overall.
My concern is that adding new tech in a first-launch series often brings problems or kinks that won't show up until well later and won't be resolved until the next generation. Since I only game at 1080p now and only plan on moving to 1440p "someday," I'm hoping I'll see some deals on older cards this fall.
It kind of annoys me that the x80 series is now becoming the baseline for a new generation launch discussion. That positions the x60 series cards as weak entries in a gaming rig and that's kind of messed up.
Even if they completely botch ray tracing somehow, the rasterization that all the games you'll play in the useful lifetime of the card rely on so heavily will probably still work fine. New architectures sometimes don't have the greatest drivers at launch, but driver updates can keep coming for months after launch as they figure out how to optimize better for the new architecture.
It's not clear how much of a departure this is from previous architectures. It's possible that the architecture is basically Pascal plus some extra stuff, and will still perform like Pascal if you don't use the tensor cores or ray tracing stuff. It's also possible that it's a major redesign of the architecture and will be as different from Pascal as it is from Vega.
Not a hardware guru, but my guess is that ray tracing will go as the PS5/XBOX??? goes. If the PS5 announces it will have a 2060 (for example), ray tracing will be available in just about every AAA game developed.
I think the PS5 specs, when they are announced, will drive my interest in this more. Not because I plan on playing on one, but because it will be a great factor in determining what games have it. Let's face it, there are probably people who knew about this 12 months ago. If a PS5 or XBOX??? is releasing with support for this tech, developers are developing for it now--or will be very soon.
I think the idea that games supporting ray tracing will take 3-4 years to mainstream assumes that everyone learned about this today. That is obviously not the case as some games are going to be coming out with support very soon.
I'm not going to pretend I know what you guys are talking about, but I skimmed through and didn't see anything about Vulkan. I thought it was going to be a big deal.
Here is a bit more info mate - https://developer.nvidia.com/rtx/raytracing "NVIDIA is developing a ray-tracing extension to the Vulkan cross-platform graphics and compute API. Available soon, this extension will enable Vulkan developers to access the full power of RTX. NVIDIA is also contributing the design of this extension to the Khronos Group as an input to potentially bringing a cross-vendor ray-tracing capability to the Vulkan standard."
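For anyone wanting to poke at it once that extension ships, detection is just the normal Vulkan device-extension query. A minimal sketch below, assuming the extension ends up named "VK_NV_ray_tracing" (NVIDIA hadn't published the final name at the time of this thread, so treat the string as a placeholder):

```cpp
// Sketch: detect a ray-tracing device extension via the standard Vulkan
// extension query path. "VK_NV_ray_tracing" is an assumed name here;
// substitute whatever NVIDIA/Khronos actually ship.
#include <vulkan/vulkan.h>
#include <cstring>
#include <iostream>
#include <vector>

int main() {
    VkApplicationInfo app{};
    app.sType = VK_STRUCTURE_TYPE_APPLICATION_INFO;
    app.apiVersion = VK_API_VERSION_1_0;

    VkInstanceCreateInfo ici{};
    ici.sType = VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO;
    ici.pApplicationInfo = &app;

    VkInstance instance;
    if (vkCreateInstance(&ici, nullptr, &instance) != VK_SUCCESS) {
        std::cerr << "No Vulkan instance available\n";
        return 1;
    }

    uint32_t gpuCount = 0;
    vkEnumeratePhysicalDevices(instance, &gpuCount, nullptr);
    std::vector<VkPhysicalDevice> gpus(gpuCount);
    vkEnumeratePhysicalDevices(instance, &gpuCount, gpus.data());

    for (VkPhysicalDevice gpu : gpus) {
        uint32_t extCount = 0;
        vkEnumerateDeviceExtensionProperties(gpu, nullptr, &extCount, nullptr);
        std::vector<VkExtensionProperties> exts(extCount);
        vkEnumerateDeviceExtensionProperties(gpu, nullptr, &extCount, exts.data());

        bool hasRT = false;
        for (const auto& e : exts)
            if (std::strcmp(e.extensionName, "VK_NV_ray_tracing") == 0)  // assumed name
                hasRT = true;

        VkPhysicalDeviceProperties props;
        vkGetPhysicalDeviceProperties(gpu, &props);
        std::cout << props.deviceName << ": ray tracing extension "
                  << (hasRT ? "present" : "absent") << "\n";
    }

    vkDestroyInstance(instance, nullptr);
    return 0;
}
```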
Mutterings that Witcher 3 and FFXV might get ray-tracing features
Sorry, putting ray tracing in The Witcher 3 would slow it to a crawl even on the best GPU.
That is a good point, but it's important to remember that game consoles are very sensitive to the price tag, and therefore, to the die size. They're also fairly sensitive to power consumption, which also limits performance. It's completely acceptable for a high end gaming desktop to put out 300 W under gaming load, but that's wildly unacceptable for a game console.
If the fixed function logic intended for ray tracing takes a ton of die space, allocating that to ray tracing will mean that you have a lot less available for traditional graphics stuff. Wasting die space on stupid stuff burned Microsoft pretty badly on the Xbox One, and they didn't repeat that mistake with the Xbox One X. If they can do ray tracing well while adding minimal additional die space, that could be much more tempting. I don't expect that to be the case, however.
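To put rough numbers on why die size is the killer for a console budget, here's a back-of-envelope dies-per-wafer and yield calculation using the usual circle-packing approximation and a simple Poisson defect model. The wafer cost and defect density are made-up illustrative values, not anything from a foundry or console vendor:

```cpp
// Back-of-envelope: why a ~750 mm^2 GPU die is hard to fit into a console
// bill of materials. Dies-per-wafer uses the standard circle-packing
// approximation; yield uses a simple Poisson defect model. Wafer cost and
// defect density are illustrative guesses, not foundry numbers.
#include <cmath>
#include <cstdio>

int main() {
    const double kPi           = 3.14159265358979323846;
    const double waferDiameter = 300.0;   // mm
    const double waferCostUsd  = 6000.0;  // assumed, illustration only
    const double defectsPerMm2 = 0.001;   // assumed (0.1 defects per cm^2)

    const double dieAreas[] = {300.0, 754.0};  // hypothetical 7 nm part vs the big Turing die

    for (double area : dieAreas) {
        double r = waferDiameter / 2.0;
        double diesPerWafer = (kPi * r * r) / area
                            - (kPi * waferDiameter) / std::sqrt(2.0 * area);
        double yield = std::exp(-area * defectsPerMm2);   // fraction of defect-free dies
        double costPerGoodDie = waferCostUsd / (diesPerWafer * yield);

        std::printf("%6.0f mm^2: ~%3.0f candidate dies/wafer, ~%4.1f%% yield, ~$%6.2f per good die\n",
                    area, diesPerWafer, yield * 100.0, costPerGoodDie);
    }
    return 0;
}
```

Under those made-up inputs the big die costs several times more per good chip than the smaller one, which is the whole problem for a $400-500 box.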
You could make ray tracing run however fast you need to just by having a simpler game world with a lot fewer objects to draw. You might be able to get acceptable performance in a game like Street Fighter, where you have only two characters in a wide open arena, even if you can't do so in an MMORPG.
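Some quick arithmetic on the ray budget helps here. The rays-per-pixel, bounce count, and the ~10 gigarays/s peak are all assumed round numbers (the peak is the marketing-style best case, not sustained throughput in a real scene), so treat this purely as an illustration of how fast the budget fills up with resolution and frame rate:

```cpp
// Rough arithmetic on the ray budget: rays/second required for a given
// resolution, frame rate, rays per pixel, and bounce count, versus an assumed
// ~10 gigarays/s peak for a top-end RTX card. Every number here is an
// illustrative assumption, and the peak is a best case, not sustained
// throughput in a real scene with real shading.
#include <cstdio>

struct Mode { const char* name; double w, h; };

int main() {
    const double peakRaysPerSecond = 10e9;  // assumed marketing-style peak
    const double raysPerPixel      = 4.0;   // e.g. shadows + reflections + AO (assumed)
    const double bounces           = 2.0;   // assumed

    const Mode modes[] = { {"1080p", 1920, 1080}, {"1440p", 2560, 1440}, {"4K", 3840, 2160} };
    const double fpsTargets[] = {30.0, 60.0};

    for (const Mode& m : modes) {
        for (double fps : fpsTargets) {
            double required = m.w * m.h * fps * raysPerPixel * bounces;
            std::printf("%-5s @ %2.0f fps: ~%5.2f gigarays/s needed (%3.0f%% of assumed peak)\n",
                        m.name, fps, required / 1e9, 100.0 * required / peakRaysPerSecond);
        }
    }
    return 0;
}
```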
I do not see ray tracing in consoles in the near future unless Sony and Microsoft postpone their consoles, because the design phase is pretty far along for the next generation. It's going to be rather hard to put one of these 20xx chips in a $400-500 box.
Since ray tracing will affect fps, I do not see developers using it very much at all.
I think it depends on what you mean by near future. If you mean the next console launch, I think that is still extremely possible. Consoles always sell at a loss in the beginning; the life cycle of a console is ~6 years, so they sell at a profit further down the line. Moreover, my guess is that console manufacturers would have known about this months ago. Furthermore, I don't think a different video card is going to change the system all that much. They can probably spin up a new driver in a matter of weeks, if not a month.
Also, if the consoles don't jump on board, ray tracing is probably dead. I fully expect one of the reasons this is announced now is because NVIDIA sees the console launches coming up.
Think about it. If NVIDIA gets 20xx cards into both consoles, they will basically crush AMD for the next half decade, even more so than they are doing now. I mean, NVIDIA is sitting on an overstock of cards and has the best cards on the market. Why would they release now? The fact that this came within weeks of the PS5 leaks I find kind of telling. Or, more to the point, both of these were leaked at about the same time.
I think we will see ray tracing in consoles, and I think it is a very bad thing for AMD. The only thing that I think can save AMD right now is if they can find some way of bringing value with FreeSync. I think there is too much money to be made in TVs that are G-Sync or FreeSync enabled for console gaming. I suspect Sony will want to get some of that money in TV sales, and a way to push that is by sticking one brand in their console.
Sorry, putting ray tracing in The Witcher 3 would slow it to a crawl even on the best GPU.
You could make ray tracing run however fast you need to just by having a simpler game world with a lot fewer objects to draw. You might be able to get acceptable performance in a game like Street Fighter, where you have only two characters in a wide open arena, even if you can't do so in an MMORPG.
The point NVIDIA is making is that you can make it all run fine on the RTX platform, which has the required hardware to do this. That's the whole point of this generation of top-end cards. The in-game feature will be for folks with RTX cards. How well that actually works will show up in the reviews; don't take their word for it at this point in time.
Again, it's not à la GameWorks. There are two extra blocks of processors on the die, RT cores and Tensor cores, that deal with this. I suspect you're not going to get an RT option for legacy hardware, but you wouldn't really want to enable it anyway, which goes back to the point of having RTX cards for that.
We will have to wait for reviews to see performance with and without ray tracing; the only test I've seen mentioned so far will be the Battlefield V beta.
Confirmed Ray Tracing enabled games so far:
Battlefield V
Metro Exodus
Shadow of the Tomb Raider
Other games with planned NVIDIA RTX support so far:
I think you overvalue how much having the consoles matters to PC gaming. AMD has had all the major consoles for a number of years, at least until the Nintendo Switch. Did that lead to AMD dominating GPUs in PCs? One could argue that it's been the opposite, as engineering resources assigned to developing console chips weren't available for other projects--such as creating a substantially new GPU architecture for the first time since GCN arrived around the end of 2011.
Furthermore, there will never again be a game console with a discrete GPU. We already haven't seen a game console launch with a discrete GPU since the Wii U. An integrated GPU saves money in a ton of ways, from needing only one memory pool rather than two to not needing to bother with a PCI Express bus at all to only needing one major heatsink. If the entire console needs to sell for $400, you can't even have a $200 discrete GPU in there, let alone a $500 one.
That gives AMD a considerable advantage in bidding for game consoles, as they can offer high performance x86 cores. Nvidia simply can't do that. Now that both Microsoft and Sony are using x86 cores in their consoles, if the PS5 or next Xbox go with AMD, backward compatibility to support the last two generations of consoles (if you count the PS4 Pro and Xbox One X as a separate generation) is very easy. Go with Nvidia and ARM cores and you've got no backward compatibility at all apart from game streaming.
AMD has also demonstrated a willingness to make custom chips for game consoles. It's possible that Nvidia would do that for the right price, but they haven't. Even the Nintendo Switch, the first significant Nvidia-powered console since the PS3, just uses an off the shelf Tegra X1.
My thoughts after going back over the press releases and the day 1 hype again:
RTX/Ray tracing is neat. But it's not necessarily better than rasterization in all cases, and the choice between rasterizing or ray tracing isn't an either/or binary decision.
I think RTX will be the next PhysX or Gameworks - it has some really neat and impactful situations where it can be used to great effect, but it isn't something where games are going to be written entirely in RTX, just like games aren't just written using PhysX or Gameworks. They are tools developers can use, and a complex system like a complete game often requires dozens of various tools for different things. And I think that just like PhysX and Gameworks, developers are going to have to code assuming that not all gamers will have RTX, therefore its impact will be extremely limited to just optional eye candy. It sure will look great in the screenshots and livestreams, but it won't be something you ~have~ to have in order to play the game.
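In practice that "optional eye candy" pattern is just a capability check with a fallback, the same way PhysX and Gameworks effects get handled today. A minimal sketch, with invented placeholder names rather than any real engine or DXR API:

```cpp
// Sketch of the "optional eye candy" pattern: pick ray-traced reflections only
// when the hardware and the user setting allow it, otherwise fall back to the
// rasterized technique everyone already ships. All names here are invented
// placeholders, not a real engine or DXR API.
#include <iostream>

enum class ReflectionTechnique { ScreenSpace, RayTraced };

struct GpuCaps {
    bool hasRayTracingHardware;   // e.g. an RTX-class GPU with RT cores
};

struct GraphicsSettings {
    bool rayTracedReflectionsEnabled;  // user toggle in the options menu
};

ReflectionTechnique chooseReflections(const GpuCaps& caps, const GraphicsSettings& s) {
    if (caps.hasRayTracingHardware && s.rayTracedReflectionsEnabled)
        return ReflectionTechnique::RayTraced;
    return ReflectionTechnique::ScreenSpace;  // the path every player can run
}

int main() {
    GpuCaps pascalCard{false};
    GpuCaps turingCard{true};
    GraphicsSettings maxedOut{true};

    auto name = [](ReflectionTechnique t) {
        return t == ReflectionTechnique::RayTraced ? "ray-traced" : "screen-space";
    };
    std::cout << "Pascal-class GPU: " << name(chooseReflections(pascalCard, maxedOut)) << " reflections\n";
    std::cout << "Turing-class GPU: " << name(chooseReflections(turingCard, maxedOut)) << " reflections\n";
    return 0;
}
```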
Right now, RTX is "the thing" because nVidia doesn't have anything else to push. There were no head to head benchmarks of Pascal vs Turing - in anything except raytracing. nVidia just did a paper launch of an upgraded product, where the product line they are upgrading from still didn't have any serious competition.
I still think this has more to do with keeping the investors happy than anything else. Gaming is nVidia's biggest cash cow, by far. But Wall Street's motto is "If you aren't growing, you're dying". There isn't a lot of market share left for nVidia to grab in the gaming sector.
nVidia has been investing a lot into data centers, into Automotive/AI, and into SOCs (Workstation/Pro lines are more or less just derivative byproducts of their R&D into gaming).
SOCs finally won a decent contract - Nintendo. It looked like it was going well, until a huge hardware vulnerability was discovered. Automotive/AI has been getting hammered lately: infotainment isn't the big dollars, self-driving is, and that has had the brakes put on it in a big way lately. It's even rumored that Tesla, which was one of nVidia's high-profile clients, is dropping nV to develop their own solution. And the icing on this cake: revenue from mining is finally drying up. There are only so many supercomputers that the world needs right now, and nVidia is already in most of them. And let's not forget the potential trade wars going on between your biggest marketplace and your largest point of manufacture. And your biggest competition is seeing its largest revival in almost two decades, and could potentially be nipping at your heels with new cash flow any day now.
So how do you convince Wall Street that you aren't dying, and there's still plenty of potential for growth? Particularly after being one of the darlings of Wall Street and setting very high expectations, in no small part due to the recent windfall with cryptomining and surge in AI/Self-driving applications? Keep in mind, Wall Street isn't driven by just facts, it's driven by speculation, emotion, and rumor. People drive the market, and that makes the market subject to all the same whims that people are.
You either announce some venture into a new sector that looks underserved or ready for disruption (and nVidia has certainly done its share of that over the past few years), or you launch some buzz at your main market and try to shake that up and get people to migrate upward. Whatever you do, you do it as flashy as possible so it gets a lot of press and gets the attention of a lot of investors. It appears to me that nVidia chose the latter here, and I think that apart from RTX (which I think is going to get used about like PhysX gets used), we aren't going to see a whole lot to really get excited about as gamers. As a gamer, I'm not particularly excited about RTX either, especially not with current adoption/use in games and with its current price tag.
Maybe that is to be expected. Pixel count has more or less stagnated around 1080p, and pushes to get into 1440p, 4K and beyond (where more actual rasterization power could be used) are met with a lot of apathy right now. A lot of folks talk about the benefits of high refresh rate, but consoles still chug along at 30fps in most titles and it doesn't seem to be hurting them. And HDR is a hot mess on PC right now - both hardware and software. With all of that - who really needs a faster rasterizing GPU right now? In the current market, without a big shakeup of some sort that requires drastically increased, or a different type of, computing power, we really are looking at dedicated GPUs going the way of dedicated sound - only for niche enthusiasts, and for nearly everyone else, something integrated being more than sufficient for 99% of anything you'd care to do with it.
I don't think Raytracing is the thing that will save discrete GPUs. In fact, I welcome the elimination of the discrete GPU. But hey, if anyone right now can move the industry, it would be nVidia.
How does GTX 2080 compare to GTX 1080 in terms of games without RTX support? I couldn't understand that from the presentation.
I play on 4K and the GTX1080 doesn't run it comfortably. Would an upgrade solve all my issues and more?
It's likely going to be about as fast or a bit faster than GTX 1080 Ti would be, but you should wait for reviews before making purchase decisions because in terms of performance gains per dollar spent it might be a terrible upgrade.
Wait for the card to launch and reviews to see how much of an upgrade it is.
What "more" are you hoping a 2080 will solve?
Your first question there is the $1,199 question that nVidia casually ignored during their announcement yesterday, and is probably the most appropriate question to be asked right now.
All anyone can do is speculate on that, at least until the review embargo is up in a month or so. nVidia says "Six times more performance", but (intentionally?) doesn't say exactly what that means. I highly, strongly doubt it means that a 2080 will be 6 times faster than a 1080 outside of RTX support....
I suspect:
2080Ti will be slower than Titan V (but costs significantly less)
2080Ti will be faster than Titan xP (and cost about the same)
2080 will be slower than 1080Ti (but costs significantly more)
2070 will be slower than a 1080, and roughly equal to 1070Ti (but costs more)
My guess is 20-25% generation to generation improvement, based almost entirely on higher core counts and higher memory bandwidth with few IPC, tech, or process improvements. I think nVidia will attempt to justify the fact that a 2080/2070 will cost more than a 1080Ti/1080 yet be slower in most games with the fact that it has "OMG RTX!"
We will see though. Those are my guesses and assumptions - given the silence from nVidia, it could be way off base... will have to wait and see how close to the mark I got.
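If those guesses land anywhere close, the price/performance math is easy to run yourself. The relative performance numbers below are placeholders built from the guesses in this thread, and the prices are the Founders Edition / street figures mentioned here or assumed; none of it is benchmark data:

```cpp
// Toy price/performance comparison. "relPerf" is a made-up relative
// performance index (GTX 1080 = 100) reflecting the 20-25% generational guess
// above, and the prices mix figures mentioned in this thread with assumptions.
// None of this is benchmark data.
#include <cstdio>

struct Card { const char* name; double relPerf; double priceUsd; };

int main() {
    const Card cards[] = {
        {"GTX 1080",    100.0,  500.0},   // street price mentioned in-thread
        {"GTX 1080 Ti", 130.0,  700.0},   // assumed
        {"RTX 2070",    105.0,  599.0},   // guess: roughly 1070 Ti / 1080 class
        {"RTX 2080",    125.0,  799.0},   // guess: a bit under a 1080 Ti (price assumed)
        {"RTX 2080 Ti", 160.0, 1199.0},   // guess
    };
    for (const Card& c : cards)
        std::printf("%-12s perf %.0f, $%6.0f -> %.3f perf per dollar\n",
                    c.name, c.relPerf, c.priceUsd, c.relPerf / c.priceUsd);
    return 0;
}
```

Under those assumptions every 20-series card comes out worse in performance per dollar than the 10-series card it replaces, which is exactly the "wait for reviews" argument.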
So as you know, @Quizzical, I'm not nearly the hardware SME you are. I did see where Nvidia officially announced the new GeForce's with Turing.. I'm guessing from your OP these won't provide the jump in fidelity and performance Nvidia wants us to believe?
I'm honestly just kinda hoping it brings the 1080 series down a bit so I can snag one of those for a cheaper price.
I had a look at one supplier which is also an NVIDIA partner. Some of the 1080s had 10-15% off, albeit only a few currently. So hopefully there will be some downward pressure. They also had pre-order prices for the new 2080 cards from multiple assemblers (Asus, MSI, Gigabyte, EVGA, Palit and others) at, say, a third more, that sort of ballpark. Nothing for the 2070 though.
It's a ways off, but it would be nice to see it result in some heavy fall deals (cyber Monday/holiday season stuff) due to the new series announcement.
It just doesn't seem worth the money currently compared to my OC'd 970.
Nvidia isn't going to slash prices to compete with other Nvidia cards. AMD having something in that performance range and being inclined to start a price war could lead to price drops, but I don't see that happening soon.
Really, I think you're going to have to wait for 7 nm to show up before we see any huge price cuts. 754 mm^2 dies aren't cheap to build on recent process nodes. When it's a 300 mm^2 die on a 7 nm process node mature enough to have good yields, that's a lot more likely to lead to the price cuts you want--especially when both vendors have those chips to offer.
That's disappointing. I haven't felt it worth it to spend $500 for a 1080.
For that much cash, I'd prefer to add an extra $100-200 as necessary to get the latest generation.
EDIT- considering the 2070 will only be $599 for a founder's edition, how does the performance compare? If the 1080 prices don't budge, the 2070 seems like a no-brainer unless the 1080 is just significantly better.
I think you overvalue how much having the consoles matters to PC gaming.
Madden is a console-only game. NBA 2K is primarily console. None of the games on the list are PC-only, I think. Most of the games are primarily console games. I know this is an MMO site and a PC-heavy crowd, but consoles drive the AAA market. Yeah, the indie games are much more prevalent on PC, but the AAA titles are very console conscious.
I think the gains will be below what you have listed for the vast majority of the market that does not take advantage of the ray tracing tech. I am not sure how they can stay on the same process node, dedicate die area to ray tracing, and still get 25% gains in FPS on titles that do not support ray tracing.
This is why I don't consider this a normal video card launch. This seems like NVIDIA's attempt to get developers to design for their proprietary system. If games are not designed specifically for ray tracing, the differences in on-screen appearance will be very small.
Furthermore, by not dedicating die area to ray tracing, AMD can keep pushing pure FPS, reduce the performance gap, and possibly overtake NVIDIA in non-ray-traced games. If AMD can do that at a cheaper price, they could be very disruptive to NVIDIA.
This is why I think this has to be part of a larger strategy. If developers don't develop specifically for this, NVIDIA is wasting compute and power consumption. If NVIDIA is able to get this tech into the new consoles, in some way, then AMD will basically be forced to pay the licensing fees or be completely shut out of gaming for the next ~6 years.
I know many people are going to think I am wrong. I very well might be. Yes, I am going out on a limb; I just hope it's sturdy. I see this as closer to VR, where hardware developers are trying to change the way games are designed at a fundamental level. When you do that, early and heavy adoption is needed. Early and heavy adoption is driven by consoles.
I agree that this isn't a typical video card launch. And I agree that it's got to be part of a larger strategy.
My gains also don't take into account ray tracing. They do take into account the increase in shaders, or CUDA cores, or whatever they are calling them this week. That is how nVidia will get an increase in performance even if they are on the same process node.
I don't think AMD, at least with their current leaked roadmaps, will have anything concrete to push against nVidia any time soon - at least in the next 12 months or so.
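To put numbers on the "more shaders on the same node" point above: if the announced core counts and memory speeds hold (I'm going from memory of the spec sheets, so treat them as approximate), the raw ratios alone land right around that guess before any per-core architectural changes:

```cpp
// Back-of-envelope scaling from the announced specs as I understand them
// (approximate figures): CUDA core count and memory bandwidth ratios,
// GTX 1080 vs RTX 2080, ignoring clocks and any per-core changes.
#include <cstdio>

int main() {
    const double cores1080 = 2560, cores2080 = 2944;   // announced CUDA core counts
    const double bw1080GBs = 320,  bw2080GBs = 448;    // 256-bit GDDR5X vs GDDR6

    std::printf("CUDA cores:       +%.0f%%\n", 100.0 * (cores2080 / cores1080 - 1.0));
    std::printf("Memory bandwidth: +%.0f%%\n", 100.0 * (bw2080GBs / bw1080GBs - 1.0));
    // Roughly 15% more shaders and 40% more bandwidth on the same process node
    // is consistent with a 20-25% real-world uplift guess, unless the new SM
    // design changes per-core throughput significantly.
    return 0;
}
```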
This is a compelling read, from a not-necessarily-tech news outlet.
It's still based on a lot of speculation, but makes some salient points rather clearly.
That article is excellent.
It has several good points. The benchmark point is the most important in a pragmatic sense. The TL;DR of "wait until 2019 and 7nm" is what people should take away.
A point in your previous post touched on 1080p, the push to 1440p, and the difficulty in promoting adoption. I've been thinking a lot about that lately. Since I only plan on gaming at those resolutions, top-of-the-line cards aren't even that relevant to me. It changes the entire perspective from which I even look at GPU upgrades now.
Nothing in the 20xx series looks attractive even with the focus on RT. Once benchmarks take that out of the equation, I see a compelling reason to pass over every card and either buy back stock (if I don't want to wait) or wait until 2019.
By late 2019 Intel could possibly start entering the market, or maybe 2020 (which isn't far off). At my gaming resolution AMD is also a strong contender. I'd keep my 970 if it weren't for the 3.5 GB of RAM.
That Forbes article reinforced my suspicion that this is a generation to skip all around. We'll see if benchmarks confirm or shred that thinking.
Yea, I'm convinced right now that, unless the 2070 offers superior performance over the 1080 somehow, you're still getting the best bang for your buck sticking with the 10 series.
I think you overvalue how much having the consoles matters to PC gaming.
Madden is a console-only game. NBA 2K is primarily console. None of the games on the list are PC-only, I think. Most of the games are primarily console games. I know this is an MMO site and a PC-heavy crowd, but consoles drive the AAA market. Yeah, the indie games are much more prevalent on PC, but the AAA titles are very console conscious.
And why, pray tell, do console only games matter to PC gaming? I didn't say that consoles don't matter to a company's bottom line, though Microsoft and Sony can and did negotiate a deal that gives AMD a modest profit for providing the console chips but not an enormous one. And I definitely didn't say that game consoles don't matter to game developers.
My argument isn't that game consoles don't matter. My argument is that having the GPU in game consoles doesn't give your GPUs in PCs a major advantage.
A 754 mm^2 chip can have a whole lot of wasted die space and still offer awesome performance. That's about why the Titan V is the top of the line graphics part, even though for gaming purposes, a lot of die space is wasted on useless stuff like tensor cores and double precision compute.
AMD could build a 754 mm^2 die to offer stronger competition to Nvidia if they wanted to. I think they'd rather put their engineering resources toward parts on 7 nm.
Nvidia has traditionally been willing to build much larger dies than AMD. For that matter, Nvidia has been willing to build much larger dies than just about any other company in the world. By my count, Nvidia has made nine GPU dies of 500 mm^2 or larger: GT200 (2008), GF100 (2010), GF110 (2010), GK110 (2013), GK210 (2014), GM200 (2015), GP100 (2017), GV100 (2018), and now the big Turing (2018, probably). Dates are when you could buy the GPUs, not necessarily when they were announced.
AMD has only made one GPU die of 500 mm^2 or larger in the history of the company: Fiji, which came a full 3 1/2 years after AMD launched Tahiti on the same process node.
Interesting reddit thread. The 2080 Ti in the new Tomb Raider with RT only hits 30-60 FPS. Which could be "good enough" to play, but... yeah. That's the "Ti" which has become our new baseline?
So you see why rasterization won out in the 3D rendering wars twenty years ago, even though they knew all about ray tracing and knew that it looked better.
I think you are undervaluing it.