
Ray Tracing and how uber it is

time007time007 Member UncommonPosts: 1,062
edited August 2018 in Hardware
Anyone else sick of hearing about freaking ray tracing? Yes, it's going to be big 3+ years from now!! We get it already!!! Show me real improvements in MMOs that don't revolve around light and shadows!!!!! (And that's assuming you're playing one of the only ~20 games that use or plan to use this feature, wth!)

Not all games will use ray tracing, so it's kind of useless unless the MMO you play uses it, which it won't for the next 5 years or so. Ray tracing is like 0.1% mainstream in games nowadays. Why not waste our time on it 5 years from now, when game devs might actually utilize it? Blehhhhhhhh

If I had been at the NVIDIA RTX announcement event, I would have been like, "Can you go two minutes without using the words 'ray tracing'? What if we don't give a crap about it? What if I want more CUDA CORES, DAMMIT!!!!!!"

Ok rant over

IMPORTANT:  Please keep all replies to my posts about GAMING.  Please no negative or backhanded comments directed at me personally.  If you are going to post a reply that includes how you feel about me, please don't bother replying & just ignore my post instead.  I'm on this forum to talk about GAMING.  Thank you.

Comments

  • Solar_ProphetSolar_Prophet Member EpicPosts: 1,960
    Agreed. In ten years or so we'll start seeing some really cool stuff, but there's little point pushing hardware for it now. The software has to catch up first. Given how long the development cycle is these days, most games released in the next five to seven years won't be using it at all. 

    Hell, it took YEARS for games to start using multiple CPU cores, and a few more beyond that to start using them properly. 


  • Octagon7711Octagon7711 Member LegendaryPosts: 9,004
    I'm sure ray tracing and VR will be the norm one day, but not for a while.

    "We all do the best we can based on life experience, point of view, and our ability to believe in ourselves." - Naropa      "We don't see things as they are, we see them as we are."  SR Covey

  • CleffyCleffy Member RarePosts: 6,414
    I doubt any games will seriously push an Nvidia proprietary solution. So far none of them have stuck, and with good reason.
  • centkincentkin Member RarePosts: 1,527
    Well, you won't see anything really use it until it has been on the market for a good while and in a bunch of PCs.  It has to start somewhere, or it will never start at all.  This is more to create a future market than it is to give the current buyers anything special at this time.
  • FlyByKnightFlyByKnight Member EpicPosts: 3,967
    edited August 2018
    People are silly and will always buy unsupported tech and pretend it won't be outdated by the time there's an actual mainstream use case.
    "As far as the forum code of conduct, I would think it's a bit outdated and in need of a refre *CLOSED*" 

    ¯\_(ツ)_/¯
  • SplitStream13SplitStream13 Member UncommonPosts: 253
    edited August 2018
    FlyByKnight said:
    People are silly and will always buy unsupported tech and pretend it won't be outdated by the time there's an actual mainstream use case.
    I mean, if you can sink the money into a $1,200 MSRP card today, you probably won't have issues doing so again 5 years from now when it becomes mainstream, so they won't be pretending; they'll just buy it. 

    I just hope AMD has a bargain answer to the RTX series, or we'll see a price blow-off bigger than the cryptocurrency bubble. $1,200 for a non-Titan GPU, Jesus Christ. I remember when $600 got you the best of the best GPU of its generation. Now $600 gets you mid-tier.

    No, it's not inflation. Inflation didn't double the prices in the past 5 years.
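
    A quick sanity check on that inflation point, as a rough sketch (the 2-3% annual rates below are placeholder assumptions, not actual CPI figures):

        # Rough sketch: cumulative inflation over 5 years at a couple of assumed rates.
        # The 2% and 3% figures are placeholders, not real CPI data.
        for annual_rate in (0.02, 0.03):
            cumulative = (1 + annual_rate) ** 5 - 1
            print(f"{annual_rate:.0%}/year over 5 years -> {cumulative:.1%} cumulative")

        # Prints roughly 10% and 16% -- nowhere near the ~100% increase that would
        # be needed for inflation alone to explain GPU prices doubling.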
  • KyleranKyleran Member LegendaryPosts: 44,070
    Paying $1200 for a video card is just somewhere I will never go....guess it's the middle tiers for me going forward.

    Good thing few games will require it.




  • ScotScot Member LegendaryPosts: 24,459
    The chips are not getting faster, so apart from tweaking them you only have software. Silicon is now so much a part of our infrastructure that even if a faster alternative were found, it would take five years to come in.
  • KajidourdenKajidourden Member EpicPosts: 3,030
    It's impressive tech, but it's kind of like 4K right now: none of the outputs (monitors, TVs, or in this case games) really do it all the way or take advantage of it at all.

    Of course, TVs are the furthest along with 4K, but even then most of them are pseudo-4K/HDR.
  • FlyByKnightFlyByKnight Member EpicPosts: 3,967
    edited August 2018
    4K resolution and 120 Hz were marketing ploys for ignorant people. The content for them isn't even mainstream yet, and by the time it is, there will be cheaper, better iterations.

    A person with good sense gets the best 1080p 60 Hz display with the best dynamic range and color vibrance they can (if possible).

    Right now, the most important features on a TV/PC display that are actually viable are HDR, OLED, and DisplayPort. If you can find all of that at 1080p (you won't), everything will look incredible, and your graphics card and other content peripherals can actually provide viable content for it.

    Manufacturers purposely stagger features to sell the new hotness.

    Here's another funny one: we don't get the option of 4K content being scaled down to 1080p, because hardware people don't want general customers to realize how awesome it looks. The software checks your resolution and serves content "specified" for it. Meanwhile, all major film/TV productions are shot at two or four times the delivery resolution and scaled down (a tiny sketch of that kind of downscale follows below).
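
    For the curious, the "shot high, delivered low" idea above is conceptually just averaging blocks of pixels. A minimal sketch of a 2x box-filter downscale using NumPy (the frame sizes and the plain box filter are illustrative assumptions; real scalers use fancier filters):

        import numpy as np

        def box_downscale_2x(image: np.ndarray) -> np.ndarray:
            """Downscale an HxWx3 image by 2x per axis by averaging each 2x2 pixel block."""
            h, w, c = image.shape
            assert h % 2 == 0 and w % 2 == 0, "expects even dimensions"
            blocks = image.reshape(h // 2, 2, w // 2, 2, c).astype(np.float64)
            return blocks.mean(axis=(1, 3)).astype(image.dtype)

        # Example: a fake 3840x2160 (4K) frame averaged down to 1920x1080 (1080p).
        frame_4k = np.random.randint(0, 256, size=(2160, 3840, 3), dtype=np.uint8)
        frame_1080p = box_downscale_2x(frame_4k)
        print(frame_1080p.shape)  # (1080, 1920, 3)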
    "As far as the forum code of conduct, I would think it's a bit outdated and in need of a refre *CLOSED*" 

    ¯\_(ツ)_/¯
  • MadFrenchieMadFrenchie Member LegendaryPosts: 8,505
    Gorwe said:
    Yes, me too. I get it, they developed something new and are now promoting it. But! Given that a VERY small number of games will even use it AND, let's not forget, the HUGE price at launch (the 2050 will cost $600!)... I don't see this going well for Nvidia in the short term. If AMD/ATi were smart, they'd release something "optimal" for gaming and perhaps even lower their prices by 10% or so, and promptly crush the shit out of RTX.
    Unless the 2050 somehow exceeds the 1080 in performance, I'm not sure why anyone would purchase a 2050.

    However, if it somehow beats out the 1080, it will be the only one of the new series worth buying, since it's so similarly priced to the 1080. If it, say, provides an extra 20% in benchmark performance for only about $100-150 extra, that's worth considering if you don't already have a 1080.
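
    One rough way to frame that trade-off is performance per dollar. A small sketch (the frame rates and prices below are hypothetical placeholders, not real benchmarks or MSRPs):

        def perf_per_dollar(avg_fps: float, price_usd: float) -> float:
            """Average frames per second delivered per dollar spent."""
            return avg_fps / price_usd

        # Hypothetical numbers: an older card versus a new one that is +20% faster
        # for +$150. These are made-up placeholders, not measured results.
        cards = {
            "older card": (100.0, 500.0),
            "newer card": (120.0, 650.0),
        }
        for name, (fps, price) in cards.items():
            print(f"{name}: {perf_per_dollar(fps, price):.3f} fps per dollar")

        # With these placeholders the newer card delivers ~8% less fps per dollar,
        # so the extra 20% performance only makes sense if you value absolute frame
        # rate (or the new features) over raw value for money.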

  • VrikaVrika Member LegendaryPosts: 7,991
    edited August 2018
    Gorwe said:
    No. It's not about the performance right now. Nvidia is on my blacklist right now, a list comprising various elitist scum who gouge people just because they can. I wouldn't even think of purchasing a 2050 even if it had 2x the performance of a 1080, because it sends a SERIOUSLY WRONG MESSAGE to the market. Think of the consequences! The RTX line ought to be completely boycotted.
    How do you even know what 2050 will cost? They only released details for 2070, 2080 and 2080 Ti.

    EDIT: Also, since 2070 will cost $600 at launch, I think it's very unlikely that 2050 would also cost $600.
  • MadFrenchieMadFrenchie Member LegendaryPosts: 8,505
    edited August 2018
    Vrika said:
    How do you even know what 2050 will cost? They only released details for 2070, 2080 and 2080 Ti.

    EDIT: Also, since 2070 will cost $600 at launch, I think it's very unlikely that 2050 would also cost $600.
    Apologies, I confused the 2050 and 2070.  If the 2070 is a substantial increase in performance over a 1080, first-gen woes aside for the new tech, it's worth considering if you're thinking about getting a 1080 like I have been.

  • RidelynnRidelynn Member EpicPosts: 7,383
    Gorwe said:
    Oh look, an apologist. That said, it's a smart move by Nvidia. Why not simply milk the shit out of fanboys while you can? A beautiful business move, if it works. Which it most likely won't, because overall profit will be roughly the same as the 1000 series: it will cost more but sell less. We'll see what happens once 7 nm comes out. Until then, good luck, Nvidia. Milk those fools, yeah?

    (I can already see ATi brewing their version of Ryzen.)
    I don't see how correcting a mistake = an apologist.

    MadFrenchie had what amounted to a typo; Vrika provided the correction. Nothing in that was pro or con any company.
  • GutlardGutlard Member RarePosts: 1,019
    edited August 2018
    *Slowly puts away pics of Ray Liotta, and backs away from thread*

    Tracing pics of Ray......yeah, ok......nvm...  :(

    Gut Out!

  • KajidourdenKajidourden Member EpicPosts: 3,030
    edited August 2018
    Ridelynn said:
    I don't see how correcting a mistake = an apologist.

    MadFrenchie had what amounted to a typo; Vrika provided the correction. Nothing in that was pro or con any company.

    Smart as a bag of hammers, that's how.
  • QuizzicalQuizzical Member LegendaryPosts: 25,507
    Gorwe said:
    Yes, me too. I get it, they developed something new and are now promoting it. But! Given that a VERY small number of games will even use it AND, let's not forget, the HUGE price at launch (the 2050 will cost $600!)... I don't see this going well for Nvidia in the short term. If AMD/ATi were smart, they'd release something "optimal" for gaming and perhaps even lower their prices by 10% or so, and promptly crush the shit out of RTX.
    You are assuming that there is going to be a 2050.

    AMD has promised the launch of their first 7 nm card by the end of this year, though they've mostly talked about it for compute and it might not have a Radeon version.  I'm not sure where Nvidia is with 7 nm.  As soon as 7 nm GPUs are available, they're going to wipe out any 12/14/16 nm ones with comparable performance.

    Maybe they won't be able to build an enormous 7 nm GPU at first, but they will be able to build small ones. If a 200 mm^2 die on 12 nm gets whipped by a 100 mm^2 die on 7 nm, why bother creating the former at all? You only create the former if the latter is a long way off (a rough sketch of that math follows below).
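
    A crude illustration of why die area matters so much here. This is a sketch with assumed round numbers (300 mm wafer, 0.1 defects/cm^2, a simple Poisson yield model), not foundry data:

        import math

        def gross_dies_per_wafer(die_area_mm2: float, wafer_diameter_mm: float = 300.0) -> int:
            """Standard rough estimate of whole dies that fit on a circular wafer."""
            wafer_area = math.pi * (wafer_diameter_mm / 2) ** 2
            edge_loss = math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2)
            return int(wafer_area / die_area_mm2 - edge_loss)

        def good_dies_per_wafer(die_area_mm2: float, defects_per_cm2: float = 0.1) -> float:
            """Gross dies scaled by a simple Poisson yield model."""
            yield_fraction = math.exp(-defects_per_cm2 * die_area_mm2 / 100.0)
            return gross_dies_per_wafer(die_area_mm2) * yield_fraction

        for area in (200.0, 100.0):
            print(f"{area:.0f} mm^2 die: ~{good_dies_per_wafer(area):.0f} good dies per wafer")

        # Halving the die area roughly doubles the good dies per wafer (a bit more,
        # since smaller dies also yield better), which is the economic squeeze on a
        # big 12 nm die once a small 7 nm die can match its performance.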
  • centkincentkin Member RarePosts: 1,527
    Computers used to cost $3000.  It is only a recent phenomenon that decent gaming-capable computers became cheaper than that. 

    I remember when Borland brought out professional-level compilers for $50, breaking the grip of Microsoft's $500 compilers. Again, it didn't last.

    Things will be priced at whatever they can get away with. One difference today, though: a computer doesn't entirely obsolete itself in 3 years.
  • RenoakuRenoaku Member EpicPosts: 3,157
    Personally, I would wait a while unless you have enough money IRL and don't care. Otherwise, wait until your FPS in games drops below about 100. Currently I still get 100+ FPS in all the games I play, with a cap of 165 in Overwatch, so right now I don't need the new RTX card anyway. Based on what I've seen it's supposed to be much faster, but how many games actually use that technology and have it optimized yet?
  • wandericawanderica Member UncommonPosts: 371
    To be fair, real-time ray tracing is a huge achievement, and I'm glad we're finally there, but I agree with you, OP. They are really wearing it out for me right now. I'm less concerned with ray tracing on the new Nvidia hardware at this point, and more looking forward to what Vulkan manages to do with it. Enlisted, a new MMO shooter in development, is supposed to use Vulkan and will support ray tracing. If Vulkan manages to do it well, that might be a path to bring it to consoles, which is what will really take it mainstream.
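
    For anyone wondering what "ray traced" lighting actually computes, here is a deliberately tiny conceptual sketch of a shadow-ray test in plain Python: from a surface point, cast a ray toward the light, and the point is in shadow if anything blocks it before the light. (This is an illustration of the idea only, not Vulkan/RTX API code; the scene setup is made up.)

        import math

        def ray_sphere_hit(origin, direction, center, radius):
            """Nearest positive hit distance of a normalized ray against a sphere, or None."""
            oc = [o - c for o, c in zip(origin, center)]
            b = 2.0 * sum(d * o for d, o in zip(direction, oc))
            c = sum(o * o for o in oc) - radius * radius
            disc = b * b - 4.0 * c  # quadratic with a == 1 (direction is normalized)
            if disc < 0:
                return None
            t = (-b - math.sqrt(disc)) / 2.0
            return t if t > 1e-4 else None

        def in_shadow(point, light_pos, spheres):
            """Shadow ray: the point is lit only if nothing sits between it and the light."""
            to_light = [l - p for l, p in zip(light_pos, point)]
            dist = math.sqrt(sum(v * v for v in to_light))
            direction = [v / dist for v in to_light]
            return any(
                (hit := ray_sphere_hit(point, direction, center, radius)) is not None and hit < dist
                for center, radius in spheres
            )

        # One occluding sphere hovering between the first point and the light.
        occluders = [((0.0, 1.0, 0.0), 0.5)]
        print(in_shadow((0.0, 0.0, 0.0), (0.0, 3.0, 0.0), occluders))  # True  (blocked)
        print(in_shadow((2.0, 0.0, 0.0), (2.0, 3.0, 0.0), occluders))  # False (clear path)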


  • SplitStream13SplitStream13 Member UncommonPosts: 253
    edited August 2018
    centkin said:
    Computers used to cost $3000.  It is only a recent phenomenon that decent gaming-capable computers became cheaper than that. 

    I remember when Borland brought out professional-level compilers for $50, breaking the grip of Microsoft's $500 compilers. Again, it didn't last.

    Things will be priced at whatever they can get away with. One difference today, though: a computer doesn't entirely obsolete itself in 3 years.
    What? I'm 28 years old. In my childhood a decent gaming computer was ~$500-600. What did you pay $3,000 for? The electricity bill for the next 10 years? 

    The only issue with computers back then was that they were developing too fast and you had to buy new stuff every year; in the past 10 years that hasn't even been an issue. People still rock an i5-2500K, for example. I wouldn't be surprised if someone is still daily gaming on a Radeon 7xxx or GTX 6xx.
  • HellscreamHellscream Member UncommonPosts: 98
    Years ago, when new electronics came out (computers, cell phones, etc.), it was worth getting the next one because it was a lot better. Nowadays it's a few extra features and more of a side-grade than an actual upgrade. Why? Because they are pushing things out a lot faster now than years ago, just to milk your wallet.
  • OzmodanOzmodan Member EpicPosts: 9,726
    MadFrenchie said:
    Apologies, I confused the 2050 and 2070.  If the 2070 is a substantial increase in performance over a 1080, first-gen woes aside for the new tech, it's worth considering if you're thinking about getting a 1080 like I have been.
    From what I have seen, the new 2070 will be very close to a 1080, not the substantial increase you described. You would be better off getting a marked-down 1080; they are going for under $500 now.
  • MadFrenchieMadFrenchie Member LegendaryPosts: 8,505
    Ozmodan said:
    From what I have seen, the new 2070 will be very close to a 1080, not the substantial increase you described. You would be better off getting a marked-down 1080; they are going for under $500 now.
    Yeah, if it's roughly the same, saving the cash and grabbing a 1080 is a no-brainer, IMO. Something like a 20% performance increase over the 1080 would seem worth the extra $100-150 to me, but I had my doubts that would be the case.

  • RidelynnRidelynn Member EpicPosts: 7,383
    SplitStream13 said:
    What? I'm 28 years old. In my childhood a decent gaming computer was ~$500-600. What did you pay $3,000 for? The electricity bill for the next 10 years?
    You've got to go back a bit farther than that, before there was such a thing as a "gaming computer." There were basically three brands. Commodore was cheap, but no one used it for much apart from hobbyist projects. The Macintosh was out: insanely great, insanely niche, and insanely expensive. And the PC, still pricey, ran Lotus 1-2-3 and WordPerfect, and also a game or two on the side. It wasn't hard to drop a lot more than $3k by the time you added the monitor, the dual side-by-side floppy drives, the dot-matrix printer, the cassette tape storage, the upgrade to 64 KB of RAM, the upgrade from monochrome to four colors, etc.