May have missed in the charts but why not include 3080 benchmarks to compare against?
Nvidia themselves said this is only 10-15% better than the 3080, and 8K isn't even really on the horizon at this point.
Guess I am just trying to understand the value of double the price and a score that high.
Hi! That is a great question and I will add an Editor's Note regarding it. Like many gaming sites, our sample was delayed due to allocation and customs issues. That review will be coming but we'll have to approach it out of order, unfortunately.
The short answer is because, just like many other people at launch, we never got one. We have word that some will be inbound, but we had to work with what we had.
I did mention in this review that, by all accounts of everything we're seeing, the 3080 is the better value overall. It is way more GPU than the average user needs. However, if someone was already considering a 2080 Ti, the extra $300 for the 3090 is a solid deal for the gains.
Thanks for reading!
So, who is this for? Ultra rich who don't care about any sort of efficiency?
Would you say it's necessary or a good idea to buy it right now?
This is a "prosumer" GPU like the TITAN series. If you are using your gaming rig to do GPU-intensive work like animation or scientific compute or anything leveraging CUDA, really, this is the GPU for that... or if you have a stupid amount of money burning a hole in your pocket. If the latter is the case, I have several causes I could point you to...
As far as necessity, no - it's not food or water.
A good idea to buy? Only if you were already looking to / had the money to buy a 2080 Ti. However, from all accounts (and we'll know once we have one in for review), the 3080 is still the better value for the dollar. We also have yet to see what the 3070 might do or what AMD has up their sleeves with Big Navi. We could be surprised.
I would say this: wait a couple of months until the hype dies down and see what the field actually looks like. You'll save more money and have less regret in the long run.
The reason I just can't get into new hardware is that it only matters for the games I play. So imagine how disappointed I would be if it ran so-so in my games. Furthermore, I don't NEED more than 30 frames per second, so if I paid $1500 + tax = $1750-1800 just to attain more frames, I would be dumb for doing so.
So for it to be considered an intelligent purchase, I would buy the new high-end stuff because I NEED to, and not just because it looks cool or can jump my frame rates.
The one game that gives me trouble is the one I play the most, Atlas, and there are no testing benchmarks for that game. Point being, it could just be very poor optimization by the team, and a high-end GPU would only give me minimal gains.
Bad memories stick with me. I once went all in on a high-end machine, and it was the first time I changed from Intel to AMD. I was all hyped, and then when I started using it, it was frustration all over the place. So since then, if I upgrade, I'll just buy 2-3 generations in the past and still play anything I want without wasting an added 1000 bucks or more. In reality that is all that matters: that we can play our games. The COST of the PC or CPU or GPU doesn't really factor in at all.
Never forget Three Mile Island, and never trust a government official or company spokesman.
Where do you live that they charge you 20% sales tax?
It's not only about frames. How about people who want smooth gaming at 4K with RTX enabled? This card is a dream, and UE5 is right around the corner. If you are still gaming at 1080p and don't want to push your games to max quality, well, that's on you. Personally, when I buy a AAA title I don't want it to look like a budget game.
Also, it's not that hard to buy a card like this. When people say you need to be a 1%'er, I'm like, really? I'm not rich by any means, but I do have a 2080 Ti sitting in my rig. How did I do it? I put aside some money each month till I could afford it.
I'm glad what you have works for you, but in no way does that change the worth of this card for what it does for gaming.
The Titan naming scheme was dumb. I'm glad to see Nvidia move to *90 naming instead. I just hope that they keep doing that.
Thus far, there are very few 3080 or 3090 cards in the world. That's why Nvidia has had trouble even seeding them to review sites, let alone making them widely available to consumers. That's the problem with relying on a memory type that isn't yet in mass production.
If they wanted to, Nvidia could have waited until they had enough inventory to do a hard launch, or at least only a somewhat soft launch. They decided to press ahead with a paper launch for whatever reasons. I'm guessing that they didn't want to launch the RTX 3070 first and have people say, "Hey, the new generation is slower than the old!" They may also have wanted to keep Navi 2X out of the reviews.
Germany, for example, has a 19% VAT on all purchases in the country. When I first moved there (I've since moved again) it was a bit of a shock. But then again the price they show is the price you pay: you don't have to figure out your taxes and such to see what the actual price would be.
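As a quick illustration of how tax-inclusive pricing works (a minimal sketch; 19% is Germany's standard VAT rate, and the EUR 1,499 sticker price is just an example figure, not an actual listing):

```python
# Tax-inclusive pricing: in Germany the advertised price already contains 19% VAT,
# so the sticker price is exactly what you pay at the till.
# The sticker price below is an example figure, not an actual listing.

VAT_RATE = 0.19
sticker_price = 1499.00  # example gross price in EUR

net_price = sticker_price / (1 + VAT_RATE)  # what the card costs before tax
vat_amount = sticker_price - net_price

print(f"Gross (what you pay): EUR {sticker_price:,.2f}")
print(f"Net before VAT:       EUR {net_price:,.2f}")   # ~EUR 1,259.66
print(f"VAT portion (19%):    EUR {vat_amount:,.2f}")  # ~EUR 239.34
```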
Wizardry, it's great that you are okay playing games at 30fps, but the vast majority of PC gamers want at least 60. You're absolutely right that if you're playing at 1080p, and very arguably even at 1440p, the 3090 just plain isn't worth it compared to the 3080. You're paying something like 114% more for an average of 13% extra performance at 1440p. Even at 4K, I think the biggest gap I've seen was 20%, with an average of 15%. Those seem small, but when that extra 15% puts you over 60fps at 4K, it makes all the difference in the world.
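For a rough sense of those numbers, here is a minimal back-of-the-envelope sketch. It assumes the Founders Edition MSRPs ($699 for the 3080, $1,499 for the 3090) and the review-average uplifts quoted above; the exact percentages vary by outlet and game suite.

```python
# Back-of-the-envelope price vs. performance for the 3090 over the 3080.
# Prices are the Founders Edition MSRPs; the performance uplifts are the rough
# review averages quoted above and vary by game suite and resolution.

price_3080 = 699    # USD, RTX 3080 Founders Edition MSRP
price_3090 = 1499   # USD, RTX 3090 Founders Edition MSRP

avg_uplift = {"1440p": 0.13, "4K": 0.15}  # assumed average fps gain of the 3090

price_premium = price_3090 / price_3080 - 1
print(f"Price premium: {price_premium:.0%}")  # ~114% more expensive

for res, gain in avg_uplift.items():
    extra_dollars_per_percent = (price_3090 - price_3080) / (gain * 100)
    print(f"{res}: +{gain:.0%} performance, "
          f"~${extra_dollars_per_percent:.0f} per extra 1% of fps")
```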
Atlas is a horribly optimized game, same as Ark. The development team doesn't have a lot of incentive to improve the optimization of their games, because people still throw money at them or, if they're blessed with a fluid budget, just build rigs that can power through the poor rendering pipeline they have going.
One bad experience upgrading to a high-end machine, while unfortunate (and I'm sorry you had to deal with frustrating things), isn't a big enough sample size to definitively say that it's not worth it - not for most people, at least. The PC world has come a long way in compatibility between hardware components. These days the only thing you really need to do is make sure your CPU works with the motherboard you're going to buy, unless you're one of the small faction of people who are enthusiast overclockers and such. With RTX starting to gain more prominence and the RTX cards finally being able to deliver an acceptable experience in path tracing, and absolutely bounding forward in rasterization, the day may soon come when buying 2-3 gens behind won't be enough.
At the end of the day, though, you're the consumer. Buy whatever you feel like buying at the price you think it's worth. Others will do the same, and just because you wouldn't doesn't mean it's a poor decision.
It's not a paper launch. In a paper launch there isn't any product available to consumers, and that's not the case here. Running out of stock isn't the same as developing something, making claims about it, then not releasing it. The demand is insanely huge, and there are plenty of people who were able to pick up cards around the world. While it's true that asshats with bots gobbled up the vast majority of online purchases, I know plenty of people back in the States, and here in Korea where I live, who were able to walk into a store and pick one up at launch.
If not strictly a paper launch, it's a very, very soft launch. "But the demand was really high!" is always what vendors say when they have very little stock at launch.
Newegg.com tweeted that RTX 3080 launch traffic exceeded Black Friday morning. This isn't just something that retailers say; they really do have insane demand. The RTX 2000 generation wasn't that much of an improvement, especially if you compare the price/performance ratio, and with AMD failing to produce anything good in the high end, this is the first really good improvement since the GTX 1000 generation was released years ago.
That's not to say the supply isn't really low. But demand is also so high that no matter how good their supply was, they'd still be sold out right now.
Claims that there is unusually high demand for the RTX 3080 are absurd on their face. The market for $700 consumer GPUs isn't very big. It just isn't. It never has been, and unless inflation brings that price point down into the mid-range, it probably never will be. Over the course of the lifetime of the card, Nvidia will sell far fewer RTX 3080 cards than they did, say, GTX 1060s.
It is precisely because the market for $700 consumer GPUs isn't very big that, in two of the last three generations, AMD hasn't even tried to compete in it. In the Radeon RX 400 series, AMD's flagship was Polaris 10, which had a die size of 232 mm^2, making it firmly a mid-range card. In the Radeon RX 5000 series, the flagship was Navi 10, with a die size of 251 mm^2. That AMD was even able to charge $400 for that is fundamentally a statement about the weakness of Nvidia's contemporary mid-range.
In this coming generation, AMD will have a big die, rumored to be the second largest GPU die in the history of the company. So in this coming generation, Nvidia won't even have the high end all to themselves. And you expect us to believe that there is far more demand for a $700 Nvidia GPU when AMD is likely to have a competitive product that splits the market than when Nvidia had the $500+ market all to themselves? Color me skeptical.
This card is just fascinating to me. Of course, most of us will never have access to an 8K screen this gen, but it's a cool boast. What people aren't talking about is that a lot of people with enthusiast setups have multiple monitors. This is the first card to properly allow for a triple-monitor 4K setup. That in itself is pretty cool!
The second thing that is really interesting is that this might be the first GPU I know of that bottlenecks the top-performing CPUs on the market. Devs are going to have to program games to make use of more threads, or we're gonna need to push pretty deep into 5GHz to get a stable 8K/60 in games in the future when the tech becomes commonplace.
Claims that there is unusually high demand for the RTX 3080 are absurd on their face. The market for $700 consumer GPUs isn't very big. It just isn't....
It's true that the market isn't big.
But right now Nvidia would have to be pushing out an absurd number of cards compared to the usual market size for these cards to meet the demand. That kind of number is not realistic.
Imho, no matter what they did, the RTX 3080 would be sold out right now until it gets competition from the RTX 3070, next-gen consoles, and possibly AMD's next-gen GPUs.
The second thing that is really interesting is that this might be the first GPU I know of that bottlenecks the top-performing CPUs on the market. Devs are going to have to program games to make use of more threads, or we're gonna need to push pretty deep into 5GHz to get a stable 8K/60 in games in the future when the tech becomes commonplace.
Increasing resolution increases CPU usage only marginally. RTX 3090 manages to bottleneck high end CPUs on 1080p resolution, but not at 4K or 8K resolutions.
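One way to picture that is a simple min() model: the frame rate you actually get is capped by whichever of the CPU or GPU is slower for the workload, and the CPU's cost per frame barely changes with resolution while the GPU's scales with pixel count. The numbers in this sketch are purely illustrative assumptions, not benchmarks, but they show why a very fast GPU tends to end up CPU-limited at 1080p and GPU-limited at 4K or 8K.

```python
# Toy bottleneck model: delivered fps is roughly min(CPU-limited fps, GPU-limited fps).
# CPU cost per frame is nearly resolution-independent; GPU cost scales with pixel count.
# Every figure below is a made-up, illustrative assumption, not a measurement.

cpu_limited_fps = 160  # assumed: how many frames per second the CPU can prepare

# Assumed fps the GPU could render at each resolution if it were never CPU-bound
gpu_limited_fps = {"1080p": 300, "1440p": 220, "4K": 110, "8K": 35}

for res, gpu_fps in gpu_limited_fps.items():
    delivered = min(cpu_limited_fps, gpu_fps)
    limiter = "CPU" if gpu_fps > cpu_limited_fps else "GPU"
    print(f"{res}: ~{delivered} fps ({limiter}-limited)")
```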
Then why are reviewers showing examples of the card bottlenecking with a 10900K? Why does Nvidia warn those same reviewers that you have to use an i9 or you'll bottleneck? Is everyone lying but you? I have yet to see anyone do any benches lower than 4K...
I was looking to do a complete update on my rig since it was just over 4 years old (not counting the video card) and figured it was reaching the end of its life cycle. Not two hours after I sent an initial contact request to a custom build company... poof, it died. It would start, run for about 5-10 seconds, then power down.
Well that moved up the order window by a bit...
I used to build my own PCs but stopped about 10 years ago. It just became more worth my time and money to let someone else do it.
So, long boring story short, I should have a new shiny rig with an RTX 3080 coming soon. Thankfully, some of the bigger-name PC shops get their own allocations of the cards, and if the stars align properly (I was going to buy a new one anyway, just not quite so soon) you can get one.
Obviously not ideal for most people, it just worked out for me.
But right now Nvidia would have to be pushing out an absurd number of cards compared to the usual market size for these cards to meet the demand. That kind of number is not realistic.
Imho, no matter what they did, the RTX 3080 would be sold out right now until it gets competition from the RTX 3070, next-gen consoles, and possibly AMD's next-gen GPUs.
They absolutely could have had a hard launch just by delaying the launch until they had enough stock to meet demand. For example, let's suppose that the cards finally become widely available at MSRP around the end of this year. Suppose that they had delayed the launch until the end of this year. Don't you think that would have made it a hard launch? Do you really think day one sales under a delayed launch would exceed the total sales over several months after starting with a very soft launch?
I'm not saying that Nvidia should have done that. But they absolutely could have had a hard launch if they really wanted to.
It makes almost no sense to buy a 3090. Wait for the 3080 20GB instead, unless money is no issue at all. I don't have money issues, but I also am not filthy rich and can't just waste money for nothing. This card costs 100% more than the 3080 for only 10% gains. The 3080 20GB will be much better, and I can't wait for it.
Be sure you all go and watch Gamers Nexus' review of this overpriced, barely-better-than-a-3080 piece of hardware that is marketed as doing 8K but can't actually do that in 99% of titles out there. At least not at a remotely acceptable framerate.
"The People's Titan" I think that is a misnomer. Does the average gamer layout $1500.00 for a video card? I think not. This is for those who have more disposable income than the average gamer. Nice card if you have the $$$.
https://www.amazon.com/NVIDIA-Quadro-GV100-Volta-Graphics/dp/B07JBQ4DBV
Or is it cheating to link professional cards?