It's kind of this way with most new GPU series launches.
They start with serious gamers, for whom maxing those extra few settings or frames per second is important and noticeable.
The cards for more casual gamers start releasing steadily afterward.
Core i5 13600KF, BeQuiet Pure Loop FX 360, 32gb DDR5-6000 XPG, WD SN850 NVMe, PNY 3090 XLR8, Asus Prime Z790-A, Lian-Li O11 PCMR case (limited ed 1045/2000), 32" LG Ultragear 4k Monitor, Logitech G560 LightSync Sound, Razer Deathadder V2 and Razer Blackwidow V3 Keyboard
Fuck that, don't support such prices; they're going the Apple route. More people on Earth = more suckers. I would never pay more than $300 for a graphics card. What most don't realize is that the new cards are promoted as 4K cards. Do you know ANYBODY who has a 4K IPS 144 Hz display? I don't; most people still use 1080p, and at that resolution you can still get a freaking 6-year-old GTX 1080 Ti or an RX 5700 XT for $300 and use it for the next 4 years.
What happened to modern budget GPUs? Even the RTX 3060 is going to be over $400; these used to be around $250.
The most I ever paid for a GPU was 220 USD for a GTX 1070 8 GB (used) in Feb 2019, replacing a GTX 750 Ti 2 GB (new), for which I paid 140 USD in November 2015. Prior to that, a 9800 GT 1 GB (new) that I bought for 90 USD in 2011. Prior to that, a GeForce 7600 GT AGP that I bought in 2007, a GeForce 5600 in 2004, a GeForce3 Ti 200 in 2002, and a Matrox G400 in 1999 - all less than 150 USD. Before that I used Mac Quadra and Mac LC computers. Before that, a Commodore 64.
I'm happy to pay $200-300 for a mid-range card. I'm kind of shocked that I'm starting to think the RTX 3070 sounds like a great product at $499!
In 1992 I bought a Vermont Microsystems 3D GPU with 1 MB of RAM. At the time it cost $2,700 in 1992 dollars. My 19" monitor cost $2,300 and weighed what must have been a good 50 lbs.
It was part of a 3D CAD system. Everything was in wireframes, and if you wanted, you could get a photographic image as long as you were prepared to wait the 4-6 hours it took to generate.
They didn't show any ray tracing performance... I wonder why?
I know ray tracing is not all that important yet, but it's pretty impressive in a handful of titles. I'd like to see how AMD does here.
What do they need to show? You can see Big Navi RT on the consoles running at 60 fps.
I'm more interested in whatever AMD's version of DLSS is. I read a while ago that it was called FidelityFX, but I don't think I've ever seen it in action.
What do they need to show? You can see Big Navi RT on the consoles running at 60 fps.
A benchmark. Console devs limit their games based on what the console's hardware can handle, so hitting 60 FPS (or whatever the target framerate is) on a console means the game developer did a good optimization job. It does not tell you how powerful the console hardware is.
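For what it's worth, "a benchmark" here just means summarizing uncapped frame times instead of checking against a fixed target. A minimal sketch in Python - the frame times are made-up numbers, not measurements from any real game:

```python
# Minimal sketch of how a benchmark summarizes uncapped frame times,
# rather than a console-style fixed framerate target.

def summarize(frame_times_ms):
    """Return (average FPS, 1%-low FPS) from a list of frame times in ms."""
    avg_fps = 1000.0 / (sum(frame_times_ms) / len(frame_times_ms))
    worst = sorted(frame_times_ms, reverse=True)          # slowest frames first
    one_percent = worst[: max(1, len(worst) // 100)]      # worst 1% of frames
    low_fps = 1000.0 / (sum(one_percent) / len(one_percent))
    return avg_fps, low_fps

# Hypothetical capture: mostly 7 ms frames with a couple of 20 ms stutters.
times = [7.0] * 98 + [20.0, 20.0]
avg, low = summarize(times)
print(f"avg: {avg:.0f} fps, 1% low: {low:.0f} fps")
```

A capped console game would just show a flat 60 (or 30) regardless of how much headroom the hardware has; the uncapped average and 1%-low numbers are what actually compare hardware.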
What happened to modern budget GPUs? Even the RTX 3060 is going to be over $400; these used to be around $250.
They're coming.
The vendors generally start with their high end, and then work their way down the stack. I'm not sure how low they'll go with their new cards. Nvidia might just decide to leave the GeForce 1600 series as their lineup for the sub-$300 market, or they might offer Ampere replacements. Or maybe they'll offer a $200 Ampere card, but keep the GTX 1650 around for the lower end. I somewhat expect AMD to offer an RDNA2 replacement for the Radeon RX 5500 XT, but they might not, and even if they do, it's likely several months away.
There is still the question of just how low end the vendors are willing to go with discrete cards. After all, excluding Ampere and RDNA2, the "current" lineups of cards that may still be actively produced at the lower end go:
Nvidia:
GeForce GTX 1650 Super
GeForce GT 1030
GeForce GT 710
AMD:
Radeon RX 5500 XT
Radeon RX 550
Note that those are from different generations. Some generations, they decide it's cheaper to keep producing an older card than to build a new card that goes below a particular price threshold. Integrated GPUs are eating up more and more of the low-end market.
I'm more interested in whatever AMD's version of DLSS is. I read a while ago that it was called FidelityFX, but I don't think I've ever seen it in action.
I only used it on Death Stranding and gained like 15 fps.
The new GPUs look good but, ugh... a bit overpriced. The cool thing is that the older series will drop in price!
If you don't want to pay $650 for a video card, then don't. There are plenty of cheaper, lower end cards, and you can readily buy one that fits your budget.
It seems that when AMD vacates the high end and makes their top end card cost $300, people complain that it isn't as fast as Nvidia's $700 card. When AMD makes a high end card and charges accordingly, people complain that it's priced like a high end card and not still $300. High end cards require large GPU dies, and that makes them expensive to buy because they're expensive to build.
I think the pricing is a statement about yields. By performance, the Radeon RX 6800 XT is much closer to the 6900 XT than to the 6800. By price, it's the other way around. The 6800 XT has considerably higher performance per dollar than either of the other cards. That's the one that AMD is implicitly pushing most of the people who will buy any of these cards to choose.
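Back-of-the-envelope, using the announced MSRPs ($579 / $649 / $999) - note the relative-performance figures below are placeholder assumptions for illustration, not benchmark results:

```python
# Rough performance-per-dollar comparison. Prices are the announced MSRPs;
# the relative performance numbers are illustrative guesses, not benchmarks.
cards = {
    "RX 6800":    {"price": 579, "rel_perf": 100},  # baseline
    "RX 6800 XT": {"price": 649, "rel_perf": 115},  # assumed ~15% faster
    "RX 6900 XT": {"price": 999, "rel_perf": 122},  # assumed ~22% faster
}

for name, c in cards.items():
    ppd = c["rel_perf"] / c["price"]  # performance points per dollar
    print(f"{name}: {ppd:.3f} perf/$")
```

Under any assumptions in that ballpark, the 6800 XT comes out on top of the perf/$ ranking, and the 6900 XT at the bottom, which is the point being made.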
The 6900 XT is basically there for people who are willing to pay a large premium for AMD to bin out the very best dies for them. For most gamers, it's way overpriced, and the performance per dollar is pretty terrible. The same is true of the GeForce RTX 3090.
Meanwhile, the 6800 non-XT is a much larger drop in performance than in price as compared to the 6800 XT, at least unless AMD really botched the design and the 6800 XT is going to be heavily memory-bandwidth bottlenecked. Most of the people considering a 6800 should just spend a little more for the 6800 XT (or spend less on something else, or buy Nvidia). That tells me that AMD doesn't have that many dies that they need to get rid of that can fit 6800 specs but not 6800 XT specs.
https://www.servers4less.com/graphics-cards/video-cards/xfx-pv-t42e-ydf3
https://harddiskdirect.com/128-a8-n374-ar-evga-video-card.html
https://www.ebay.com/i/274415179590
(Don't buy those, by the way. It's a joke.)
We don't yet know whether the Big Navi cards will have a hard launch. AMD already has a lot of experience on the process node, however, and neither GDDR6 memory nor a 300 W TDP should be a problem, as AMD has plenty of prior experience with both of those.
That said, depending on when AMD placed the order for mass production, there may or may not be a shortage at launch. There's also the question of how much capacity they can get at TSMC, and how much of that they're willing to devote to Big Navi. In spite of the high prices, because the GPUs are so large, they get far less gross profit per mm^2 than they do out of Zen 3 chips. It's rumored that the reason Nvidia went with Samsung was precisely that they couldn't get enough capacity on TSMC's 7 nm node.
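The profit-per-mm^2 point can be sketched with the standard dies-per-wafer approximation. Every number below (die areas, per-die revenue, the "Big Navi-class" and "Zen 3 CCD-class" sizes) is an illustrative assumption, not AMD's actual figures:

```python
import math

# Rough candidate-dies-per-wafer estimate for a 300 mm wafer, ignoring defects.
# All numbers here are illustrative assumptions, not AMD's actual figures.
def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    """Classic approximation: usable wafer area minus an edge-loss term."""
    d = wafer_diameter_mm
    return int(math.pi * (d / 2) ** 2 / die_area_mm2
               - math.pi * d / math.sqrt(2 * die_area_mm2))

big_gpu_mm2 = 520   # assumed Big Navi-class die
chiplet_mm2 = 80    # assumed Zen 3 CCD-class chiplet

for name, area, revenue in [("GPU die", big_gpu_mm2, 600),
                            ("CPU chiplet", chiplet_mm2, 250)]:
    n = dies_per_wafer(area)
    print(f"{name}: ~{n} candidate dies/wafer, ~${revenue / area:.2f} per mm^2")
```

With those placeholder prices, the small chiplet returns roughly three times the revenue per mm^2 of silicon, before even accounting for yield, which falls off much faster for large dies.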
It seems AMD isn't trying to compete for market share with those prices - $499 vs. $579? I will not think twice: I'll buy an Nvidia 3070 FE today (and I had been supporting AMD since their 486DX100 CPU, in their competition with Intel and on the GPU market).
Don't be fooled. Higher prices are there to please the investors. It's not a problem of the high-end cards - we have always had high-end cards - but today a mid-range card sits at the price point a high-end card did back in the day. They could sell for the old price, but seeing how people behave with iPhones, why not do the same? Well, they do. Also, let's not forget the miners, who brought us higher prices due to the huge demand.
There is a simple rule in business:
Charge as much as people are willing to pay.
I couldn't make those kinds of assessments and judgments unless I knew what the cost to manufacture is. New tech is always expensive, no matter what the field.
We also have to think about recouping R&D and production-line costs. It costs a hell of a lot of money to tool up.
My original comment was more about reminiscing about the past rather than justifying/criticizing the cost of today's GPUs.
I would have gone with AMD, but my home gaming setup, with an Nvidia Shield streaming games to my TV, requires GeForce cards. For that alone they win my business. And I really do like the AMD lineup.
Have you tried Steam Link? It seems to work well, but I'm not a couch gamer, so I haven't used it much.
Ya, that's what you use on the Shield to stream from your PC. I would use a cheap PC in the living room and bypass the Shield entirely, but my wife and I made a rule: no more PCs in the living room.
EDIT: Also, games I don't have on Steam I can still stream to the Shield with the GeForce app.
You can stream anything to a Raspberry Pi using Parsec.