Third-party Radeon RX 5700 XT cards are finally beginning to arrive. We were able to go hands-on with one of the most interesting: the XFX Radeon RX 5700 XT THICC II. At only $429, it features a beefy heatsink, a slick design, and impressive clocks, but is it enough to compete in today's market? Join us as we find out!
Comments
Ray tracing is not a very mature technology. Few games take full advantage of it. If most of your gameplay includes those games, then an Nvidia card is the winner. If you are not planning to use ray tracing, the 5700 XT probably wins in performance per dollar. One of the games I am getting soon, Mech 5: Mercs, is a ray tracing game. So I am debating between an RTX 2060 Super and a 5700 XT. PP$ definitely favors the 5700 XT, but ray tracing...
4K monitors are ridiculously expensive. While gaming on a TV has disadvantages, a 43" 4K TV would be a pretty good option. For those who prefer a monitor, 1440p is a good choice. In this realm, the 5700 XT does very well, but the 2070 Super and 5700 XT are not that far apart. Price would probably rule in this comparison. My 61-year-old eyes require a little lower res, so I am quite happy with 1080p. I am probably getting a 34" or 35" 2560x1080 monitor when I get my new video card.
And, THICC... *SNIRK*
61!! Represent, man, do what you love. I'll be there in 23 years, provided we don't obliterate the planet before then. All that aside, this card is a little late to the show, and RTX 3000 series cards are less than a year out, with probably one more refresh in the 2000 series to come. AMD always feels a day late and a dollar short.
What the article was referring to is that Nvidia has a handful of very expensive GPUs out that can do a little bit of real-time ray tracing in games. AMD technically could do so, too, but without hardware acceleration, can do much less of it.
I don't see Turing's ray tracing as a forward-looking thing that will give the cards a longer useful lifetime. If real-time ray tracing in games ever becomes common, then most likely:
1) By the time it does, the hardware requirements to do it well will massively overwhelm even a GeForce RTX 2080 Ti, and
2) Hardware acceleration of ray tracing will have changed enough that Turing cards won't have the proper hardware to accelerate it.
That's not to say that ray tracing hardware was a dumb idea. It has to start somewhere if it's ever going to catch on, and if it means a much larger GPU market a decade from now than the mostly integrated market that would have happened otherwise, then Nvidia will probably benefit tremendously from that. They can also benefit by making mistakes now and figuring out how to fix them in time for future generations where ray tracing is actually important, rather than making their big first-time mistakes when it really matters.
But that's a longer term bet, and really not a very compelling reason to buy particular hardware today. For comparison, the hardware tessellator in a Radeon HD 2900 XT wasn't a very good reason to buy that card. But some of the mistakes AMD (or ATI) made and learned from in designing that card would end up being good reasons to buy a Radeon HD 5000 series card a few years later.
On the other hand, DLSS is completely stupid. It manages to be a very expensive way to upscale images that looks worse than what cheap GPUs could do 15 years ago. It's really just Nvidia looking for a marketing excuse to convince their fanboys that they should care about some feature that Nvidia has and AMD doesn't, even though it's completely useless for consumers.
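For context, the kind of cheap scaling I'm talking about is plain bilinear filtering, which GPUs have done in hardware forever. Here's a toy pure-Python sketch of the idea (grayscale only, illustrative, nothing like how an actual driver implements it):

```python
# Toy bilinear upscaling: each output pixel is a weighted blend of the
# four nearest source pixels. Illustrative only, not production code.

def bilinear_upscale(img, new_w, new_h):
    """img: 2D list of floats (rows of grayscale pixels)."""
    old_h, old_w = len(img), len(img[0])
    out = []
    for y in range(new_h):
        # Map the output pixel back into source coordinates.
        sy = y * (old_h - 1) / (new_h - 1)
        y0 = int(sy)
        fy = sy - y0
        y1 = min(y0 + 1, old_h - 1)
        row = []
        for x in range(new_w):
            sx = x * (old_w - 1) / (new_w - 1)
            x0 = int(sx)
            fx = sx - x0
            x1 = min(x0 + 1, old_w - 1)
            # Blend horizontally on the two source rows, then vertically.
            top = img[y0][x0] * (1 - fx) + img[y0][x1] * fx
            bot = img[y1][x0] * (1 - fx) + img[y1][x1] * fx
            row.append(top * (1 - fy) + bot * fy)
        out.append(row)
    return out

# Upscale a 2x2 checker to 4x4.
print(bilinear_upscale([[0.0, 1.0], [1.0, 0.0]], 4, 4))
```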
A bigger consideration is budget. I won't spend more than $350 on a video card, and I am waiting for the holiday season to buy. So, the RTX 2080 Ti is the king? So what? It is way out of my budget. Right now the only things in the latest generations that fit my budget are the stock 5700 and the older model RTX 2060. Those two match up pretty well in PP$. I am SLIGHTLY leaning towards the 2060 because one of my upcoming games, Mech 5: Mercs, supports ray tracing. But it is a minor consideration, since the tech is so new.
I am planning on reevaluating during the holiday season. The one that is under $350 (I may cave in and raise that a bit, but no more than $400) with the highest PP$ will get my money.
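For what it's worth, the PP$ math is nothing fancy - something like this, where the prices are what I'm assuming and the FPS figures are pure placeholders, not benchmark results (substitute averages from whatever reviews you trust):

```python
# Rough performance-per-dollar (PP$) comparison.
# All numbers below are assumptions for illustration, not measurements.
cards = {
    "RTX 2060":   {"price": 349, "avg_fps": 90},   # hypothetical
    "RX 5700":    {"price": 349, "avg_fps": 95},   # hypothetical
    "RX 5700 XT": {"price": 429, "avg_fps": 105},  # hypothetical
}

# Sort by FPS per dollar, best value first.
ranked = sorted(cards.items(),
                key=lambda kv: kv[1]["avg_fps"] / kv[1]["price"],
                reverse=True)
for name, c in ranked:
    print(f"{name}: {c['avg_fps'] / c['price']:.3f} FPS per dollar")
```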
Interestingly, Intel's answer is Xe, which is rumoured to be a big (very big, even) GPU - but fit still won't be an issue. Given their CPU issues, we'll have to see how that plays out!
I think current RTX cards should be thought of as first-generation products for a new technology: nice to have if you've got extra money, but it'll take a couple of generations before the hardware is good enough and there are enough games supporting it that the average gamer should buy a GPU with ray tracing.
Ray tracing is the fix for this. Ray traced lighting effects (shadows, reflections, objects brightening near a light source, etc.) don't require all sorts of fakery; they actually look right. Comparing pictures and not agreeing that the ray traced version looks better than the rasterized version could be taken as a definition of being legally blind. That doesn't mean you could no longer tell the difference between a screenshot and a photograph, but it would certainly be a lot harder.
But that's assuming that you go full ray tracing for everything and drop rasterization entirely, except for stuff like the UI that doesn't exist at some particular spot in the game world. The case for having a little bit of ray tracing here and there is much weaker.
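To make "no fakery" concrete, here's a minimal sketch of a ray traced shadow test - a toy sphere scene in Python, not any real engine's code. The whole idea is just asking whether anything blocks the path from a surface point to the light:

```python
import math

def ray_sphere_hit(origin, direction, center, radius):
    """Nearest positive hit distance along a normalized ray, or None.
    Simplified: ignores the case where the ray starts inside the sphere."""
    oc = [o - c for o, c in zip(origin, center)]
    b = 2 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4 * c  # quadratic with a == 1 (normalized direction)
    if disc < 0:
        return None
    t = (-b - math.sqrt(disc)) / 2
    return t if t > 1e-4 else None

def in_shadow(point, light_pos, spheres):
    """Cast a shadow ray from the surface point toward the light."""
    to_light = [l - p for l, p in zip(light_pos, point)]
    dist = math.sqrt(sum(v * v for v in to_light))
    direction = [v / dist for v in to_light]
    for center, radius in spheres:
        t = ray_sphere_hit(point, direction, center, radius)
        if t is not None and t < dist:  # blocker sits between point and light
            return True
    return False

# Hypothetical scene: one sphere hovering between the point and the light.
spheres = [((0.0, 1.0, 0.0), 0.5)]
print(in_shadow((0.0, 0.0, 0.0), (0.0, 3.0, 0.0), spheres))  # True: shadowed
```

No shadow maps, no lighting baked into the model: if the geometry moves, the answer changes automatically, which is exactly why it "actually looks right."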
Ray tracing isn't a new thing. It has been around for decades. In fact, many games have used ray tracing in some form for decades: any models in a game that have lighting effects baked into the model were probably generated by ray tracing. You can do that if you know you have a very static game world where the sun will always be in a particular spot, and it does make the game look better.
What's new is real-time ray tracing in games. Rather than using ray tracing only to generate your models ahead of time, you ray trace things moving around as you play the game. That makes it possible to get correct lighting effects for moving objects in your game world, not just the static parts.
The problem with ray tracing is that it's very computationally intensive. If you're making a big-budget movie and aren't bothered by it taking an hour to render each frame on a GPU (and for some movies, it was considerably longer than that), then of course you use ray tracing. That will make the movie look a lot better, and everyone knows it. But an hour per frame, or even a second per frame, is a non-starter for gaming.
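Some back-of-the-envelope arithmetic shows why. Assuming (hypothetically) a 4K, 60 FPS target with a sample and bounce count that's still modest by film standards:

```python
# Why real-time ray tracing is hard: count the rays.
# These targets are assumptions for illustration, not measurements.
width, height, fps = 3840, 2160, 60   # 4K at 60 FPS
samples_per_pixel = 16                # movies often use far more
rays_per_sample = 4                   # primary ray plus a few bounces

rays_per_second = width * height * fps * samples_per_pixel * rays_per_sample
print(f"{rays_per_second / 1e9:.1f} billion rays per second")  # ~31.9
```

And every one of those rays has to be intersected against the scene geometry, which is the expensive part.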
Real-time ray tracing becoming ubiquitous and replacing rasterization entirely in games isn't just a couple of generations away, unless perhaps you mean generations of humans and not hardware. Battlezone (1980) didn't herald widespread 3D graphics within a few years, nor did the Virtual Boy (1995) herald a coming era of widespread virtual reality games within a few years. The former would take about 20 years to arrive, and the latter might yet be coming, but isn't here yet. The hardware was nowhere near powerful enough to do what people actually wanted to do when they made the first attempts, and that's the state of real-time ray tracing in games now.