Anyone else sick of hearing about freaking ray tracing? Yes, it's going to be big 3+ years from now!! We get it already!!! Show me real improvements in MMOs that don't involve crap about light and shadows!!!!! (assuming you're playing one of the only ~20 games that use or plan to use this feature, wth!)
Not all games will use ray tracing, so it's kind of useless unless the MMO you play uses it, which it won't for the next 5 years or so. Ray tracing is maybe 0.1% mainstream in games right now. Why not bother us with it 5 years from now, when game devs might actually utilize it? blehhhhhhhh
If I had been at the NVIDIA RTX announcement event, I would have been like, "Can you go 2 minutes without using the words 'ray tracing'??? What if we don't give a crap about it??? What if I want more CUDA CORES, DAMMIT!!!!!!"
OK, rant over.
IMPORTANT: Please keep all replies to my posts about GAMING. Please no negative or backhanded comments directed at me personally. If you are going to post a reply that includes how you feel about me, please don't bother replying & just ignore my post instead. I'm on this forum to talk about GAMING. Thank you.
Comments
Hell, it took YEARS for games to start using multiple CPU cores, and a few more beyond that to start using them properly.
AN' DERE AIN'T NO SUCH FING AS ENUFF DAKKA, YA GROT! Enuff'z more than ya got an' less than too much an' there ain't no such fing as too much dakka. Say dere is, and me Squiggoff'z eatin' tonight!
We are born of the blood. Made men by the blood. Undone by the blood. Our eyes are yet to open. FEAR THE OLD BLOOD.
#IStandWithVic
"We all do the best we can based on life experience, point of view, and our ability to believe in ourselves." - Naropa "We don't see things as they are, we see them as we are." SR Covey
¯\_(ツ)_/¯
I just hope AMD has a bargain answer to the RTX series, or we'll see a price blow-off bigger than the cryptocurrency bubble. $1,200 for a non-Titan GPU, Jesus Christ. I remember when $600 got you the best of the best GPU for its generation. Now $600 gets you mid-tier.
No, it's not inflation. Inflation didn't double prices in the past 5 years.
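Quick back-of-envelope to show it (assuming roughly 2% annual inflation, which is my ballpark, not an official figure):

    # Compound a ~2%/year inflation rate over 5 years (the rate is an assumption).
    annual_rate = 0.02
    years = 5
    multiplier = (1 + annual_rate) ** years
    print(f"5-year price multiplier: ~{multiplier:.2f}x")  # ~1.10x, nowhere near 2x

So inflation explains maybe a 10% bump over those 5 years, not a doubling.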
Good thing few games will require it.
"True friends stab you in the front." | Oscar Wilde
"I need to finish" - Christian Wolff: The Accountant
Just trying to live long enough to play a new, released MMORPG, playing New Worlds atm
Fools find no pleasure in understanding but delight in airing their own opinions. Pvbs 18:2, NIV
Don't just play games, inhabit virtual worlds™
"This is the most intelligent, well qualified and articulate response to a post I have ever seen on these forums. It's a shame most people here won't have the attention span to read past the second line." - Anon
Of course TVs are the furthest along with 4K, but even then most of them are pseudo-4K/HDR.
A person with good sense gets the best 1080p 60Hz display they can, with the best dynamic range and color vibrance possible.
Right now, the most important features that are actually viable on a TV/PC display are HDR, OLED, and DisplayPort. If you can find all of that at 1080p (you won't), everything will look incredible, and your graphics card and other content sources can actually provide viable content for it.
Manufacturers purposely stagger features to sell the new hotness.
Here's another funny one: we don't get the choice of 4K content scaled down to 1080p, because hardware makers don't want general customers to realize how awesome it looks. The software checks your resolution and serves content "specified" for it. Meanwhile, all major film/TV productions are shot at two or four times the delivery resolution and scaled down.
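You can actually see the difference yourself by downscaling a 4K frame to 1080p and comparing it with a native 1080p capture of the same scene (a quick sketch using the Pillow library; the file names are placeholders):

    # Downscale a 4K image to 1080p with a high-quality resampling filter.
    from PIL import Image

    frame_4k = Image.open("screenshot_4k.png")  # placeholder file name
    frame_1080 = frame_4k.resize((1920, 1080), Image.LANCZOS)
    frame_1080.save("screenshot_downscaled_1080.png")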
¯\_(ツ)_/¯
However, if it somehow beats the 1080, it will be the only card of the new series worth buying, since it's priced so similarly to the 1080. If it, say, delivers an extra 20% in benchmarks for only about $100-150 more, that's worth considering if you don't already have a 1080.
EDIT: Also, since the 2070 will cost $600 at launch, I think it's very unlikely that a 2050 would also cost $600.
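To put rough numbers on "worth considering" (the prices and scores below are made up purely to illustrate the math):

    # Hypothetical perf-per-dollar comparison; every number here is an assumption.
    gtx_1080 = {"price": 500, "perf": 100}  # baseline performance index
    rtx_2070 = {"price": 650, "perf": 120}  # +20% perf for +$150

    for name, card in [("GTX 1080", gtx_1080), ("RTX 2070", rtx_2070)]:
        print(f"{name}: {card['perf'] / card['price']:.3f} perf per dollar")

On these made-up numbers the 1080 still wins on perf per dollar, so the 2070 would only make sense if you want the absolute headroom.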
MadFrenchie had what amounted to a typo; Vrika provided the correction. Nothing in that was pro or con any company.
Tracing pics of Ray......yeah, ok......nvm...
Gut Out!
What, me worry?
Smart as a bag of hammers, that's how.
AMD has promised to launch their first 7 nm card by the end of this year, though they've mostly talked about it for compute, and it might not get a Radeon version. I'm not sure where Nvidia is with 7 nm. As soon as 7 nm GPUs are available, they're going to wipe out any 12/14/16 nm ones with comparable performance.
Maybe you won't be able to build an enormous 7 nm GPU at first, but they will be able to build small ones. If a 200 mm^2 die on 12 nm will get whipped by a 100 mm^2 die on 7 nm, why bother creating the former at all? You don't, unless the latter is a long way off.
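Here's a rough sketch of why the die-area math matters so much (the 300 mm wafer size is standard, but treating dies per wafer as simply wafer area divided by die area is a simplification that ignores edge loss and defects):

    # Back-of-envelope: chips per 300 mm wafer scale roughly with 1/die_area,
    # so halving the die area roughly doubles the chips you get per wafer.
    import math

    wafer_area = math.pi * (300 / 2) ** 2  # 300 mm wafer, ~70,686 mm^2

    for node, die_area in [("12 nm", 200), ("7 nm", 100)]:
        print(f"{node}, {die_area} mm^2 die: ~{int(wafer_area // die_area)} dies per wafer")

Half the die area means roughly twice the chips per wafer, before you even count the performance gain from the denser node.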
I remember when Borland brought out professional-level compilers for $50, breaking the grip of Microsoft's $500 compilers. Again, it didn't last.
Things will be priced at whatever they can get away with. One difference today, though: a computer doesn't entirely obsolete itself in 3 years.
The only issue with computers back then was that they were developing too fast and you had to buy new stuff every year; in the past 10 years that hasn't even been an issue. People still rock on with an i5-2500K, for example. I wouldn't be surprised if someone is still gaming daily on a Radeon 7xxx or a GTX 6xx.