AMD ATi RX 7800 "XT"

Gorwe Member Posts: 1,609
First off, it was announced at Gamescom, as per schedule, alongside its overpriced little baby brother, the 7700 XT. Find out more here: https://www.eurogamer.net/digitalfoundry-2023-amd-announces-rx-7800-xt-7700-xt-graphics-cards and on the official site: https://www.amd.com/en/products/graphics/amd-radeon-rx-7800-xt . Now, why did I put the XT moniker in quotes? Because it is NOT, in all honesty, an XT product. It is just your run of the mill RX 7800. Here are some comparisons for your pleasure:

7800 "XT": https://www.techpowerup.com/gpu-specs/radeon-rx-7800-xt.c3839
6800 XT: https://www.techpowerup.com/gpu-specs/radeon-rx-6800-xt.c3694 (this is a REAL XT product)
6800: https://www.techpowerup.com/gpu-specs/radeon-rx-6800.c3713 (just compare 7800 and this ... /facepalm)

But also do notice the launch prices:

7800: $500 (amazing, finally!)
6800 XT: $650
6800: $580

So, the good news is that it's affordable, and that it's most likely going to be the flagship for the FSR3 launch. The bad news is that we are being conned. This is NOT an XT product by any possible metric. But it is a bloody good product nonetheless.

What do you think?

Comments

  • Vrika Member Legendary Posts: 7,988
    Gorwe said:
    Because it is NOT, in all honesty, an XT product. It is just your run of the mill RX 7800...
    Does the XT even mean anything? My opinion is that AMD is free to add XT to all their product names if they want since it's not like it actually means anything.
     
  • Nanfoodle Member Legendary Posts: 10,900
    I just don't get excited about this kind of stuff till I see some real-world benchmarks, not just ideal situations on cherry-picked games.
  • Gorwe Member Posts: 1,609
    Vrika said:
    Gorwe said:
    Because it is NOT, in all honesty, an XT product. It is just your run of the mill RX 7800...
    Does the XT even mean anything? My opinion is that AMD is free to add XT to all their product names if they want since it's not like it actually means anything.
    Honestly, it does, because it comes down to honesty. Being sold an "XT" product but actually getting a non-XT product ... you understand. But maybe they are moving away from the "1234 / 1234 XT" nomenclature to the "1234 XT / 1234 XTX" nomenclature?

    Nanfoodle said:
    I just don't get excited about this kind of stuff till I see some real-world benchmarks, not just ideal situations on cherry-picked games.
    Honestly, I haven't even looked at those. The performance is almost random (imo), so cherry-picked results on systems custom-built to give those exact results don't mean that much. But yeah, even as a vanilla 7800, at a $500 MSRP, it's going to be interesting seeing how it performs in the market.

    Note: it's not just $500 of value, it's more like $500 (+$60) because you get Starfield for free. Now, whether a buyer cares about SF or not doesn't really matter, because it adds even more value on top of already quite high value. The value you are getting is at least $650, at a $500 MSRP. Amazing imo, they priced it perfectly.
  • Nanfoodle Member Legendary Posts: 10,900
    Gorwe said:
    Vrika said:
    Gorwe said:
    Because it is NOT, in all honesty, an XT product. It is just your run of the mill RX 7800...
    Does the XT even mean anything? My opinion is that AMD is free to add XT to all their product names if they want since it's not like it actually means anything.
    Honestly, it does, because it comes down to honesty. Being sold an "XT" product but actually getting a non-XT product ... you understand. But maybe they are moving away from the "1234 / 1234 XT" nomenclature to the "1234 XT / 1234 XTX" nomenclature?

    Nanfoodle said:
    I just don't get excited about this kind of stuff till I see some real-world benchmarks, not just ideal situations on cherry-picked games.
    Honestly, I haven't even looked at those. The performance is almost random (imo), so cherry-picked results on systems custom-built to give those exact results don't mean that much. But yeah, even as a vanilla 7800, at a $500 MSRP, it's going to be interesting seeing how it performs in the market.

    Note: it's not just $500 of value, it's more like $500 (+$60) because you get Starfield for free. Now, whether a buyer cares about SF or not doesn't really matter, because it adds even more value on top of already quite high value. The value you are getting is at least $650, at a $500 MSRP. Amazing imo, they priced it perfectly.
    My guess is the standard 4070 will still be the better option now, but in the long run the 7800 will age better because of the wider bus and more VRAM.
    Gorwe, dragonlee66
  • Gorwe Member Posts: 1,609
    Nanfoodle said:
    Gorwe said:
    Vrika said:
    Gorwe said:
    Because it is NOT, in all honesty, an XT product. It is just your run of the mill RX 7800...
    Does the XT even mean anything? My opinion is that AMD is free to add XT to all their product names if they want since it's not like it actually means anything.
    Honestly, it does, because it comes down to honesty. Being sold an "XT" product but actually getting a non-XT product ... you understand. But maybe they are moving away from the "1234 / 1234 XT" nomenclature to the "1234 XT / 1234 XTX" nomenclature?

    Nanfoodle said:
    I just don't get excited about this kind of stuff till I see some real-world benchmarks, not just ideal situations on cherry-picked games.
    Honestly, I haven't even looked at those. The performance is almost random (imo), so cherry-picked results on systems custom-built to give those exact results don't mean that much. But yeah, even as a vanilla 7800, at a $500 MSRP, it's going to be interesting seeing how it performs in the market.

    Note: it's not just $500 of value, it's more like $500 (+$60) because you get Starfield for free. Now, whether a buyer cares about SF or not doesn't really matter, because it adds even more value on top of already quite high value. The value you are getting is at least $650, at a $500 MSRP. Amazing imo, they priced it perfectly.
    My guess is the standard 4070 will still be the better option now, but in the long run the 7800 will age better because of the wider bus and more VRAM.
    Provided there is no XTX version. But overall, yes.
  • finefluff Member Rare Posts: 561
    edited August 2023
    From their slide it seems to have performance similar to or better than the RTX 4070. It costs $100 less, has 16 GB of VRAM, and comes with Starfield for free. It's tempting tbh.

    https://www.tomshardware.com/news/amd-rx-7800-xt-rx-7700-xt-announced
    ValdemarJ
  • Quizzical Member Legendary Posts: 25,498
    Did you have the same complaint about the low end Radeon RX 5300 XT, or even the 6500 XT?  All that the "XT" really means is that if there are XT and non-XT cards with the same number, the XT version is better.  I think it would be clearer if AMD just had the tens digit in the number vary, but maybe that's why I'm not a marketing person.
  • Quizzical Member Legendary Posts: 25,498
    The 7700 XT is probably just a low volume product that gives AMD a way to get rid of a relative handful of defective dies.  Between waiting this long to launch and having smaller chiplet dies, the yields are probably excellent.
  • Quizzical Member Legendary Posts: 25,498
    The Radeon RX 7800 XT is basically 2/3 of a Radeon RX 7900 XTX in most ways, but for 1/2 of the price.  As such, we have a pretty good guess on performance:  it will typically be faster than an RTX 4060 Ti, but slower than an RTX 4070, except that it could beat the latter when the extra memory capacity and bandwidth make a huge difference.
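    To sanity-check that "2/3 of an XTX for 1/2 the price" framing, here is a rough back-of-envelope sketch. The compute-unit counts, bus widths, memory sizes, and launch MSRPs below are assumptions taken from the public spec listings linked earlier in the thread, so treat the ratios as approximate:

        # Rough ratio check: how much of an RX 7900 XTX do you get with an RX 7800 XT?
        # All figures are assumptions pulled from public spec listings, not measurements.
        specs = {
            "compute units":     (60, 96),    # 7800 XT vs 7900 XTX
            "memory bus (bits)": (256, 384),
            "VRAM (GB)":         (16, 24),
            "launch MSRP (USD)": (499, 999),
        }

        for name, (rx7800xt, rx7900xtx) in specs.items():
            print(f"{name:18}: {rx7800xt / rx7900xtx:.2f} of the 7900 XTX")

    The hardware ratios land around 0.62-0.67 while the price ratio is about 0.50, which is where the guess above comes from.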
  • Gorwe Member Posts: 1,609
    edited August 2023
    Quizzical said:
    Did you have the same complaint about the low end Radeon RX 5300 XT, or even the 6500 XT?  All that the "XT" really means is that if there are XT and non-XT cards with the same number, the XT version is better.  I think it would be clearer if AMD just had the tens digit in the number vary, but maybe that's why I'm not a marketing person.
    Yes, but the XT in the 7800 "XT" no longer means that, even if it did just a generation ago (6800 and 6800 XT?). It's not about performance (at all), but more about clarity and PR. This is con-man PR (i.e. create mega-confusion) and it doesn't instill confidence in them. I like clarity, what can I say?

    Regardless, I think this is a good, future-proof product for 1440p. It will probably only get better when Sapphire (et al.) get their hands on it. Which reminds me.

    Sapphire and such should really start using color coding, like how Weller uses that famous teal hue of theirs, or Uni-T and red, or Fluke and amber. Zotac used to use mostly black + amber too, but that's way back. It would be so awesome to have a sapphire-blue Sapphire AMD card.
    Asm0deus
  • Vrika Member Legendary Posts: 7,988
    Gorwe said:
    Quizzical said:
    Did you have the same complaint about the low end Radeon RX 5300 XT, or even the 6500 XT?  All that the "XT" really means is that if there are XT and non-XT cards with the same number, the XT version is better.  I think it would be clearer if AMD just had the tens digit in the number vary, but maybe that's why I'm not a marketing person.
    Yes, but the XT in the 7800 "XT" no longer means that, even if it did just a generation ago (6800 and 6800 XT?). It's not about performance (at all), but more about clarity and PR. This is con-man PR (i.e. create mega-confusion) and it doesn't instill confidence in them. I like clarity, what can I say?
    The XT still means that if there is a non-XT version, it's less powerful. I fail to see how the lack of a non-XT version would cause anyone to buy the wrong product.

    If we take the 7800 XT, for example, which GPU do you think buyers will accidentally buy because there's an XT in that name?

    The XT may be meaningless and unnecessary (albeit perhaps cool-looking), but it's not liable to confuse anyone.
    ValdemarJ
     
  • mklinic Member Rare Posts: 2,014
    I won't claim to be a hardware enthusiast, so sorry if this was answered in some way, but does "XT" have a concrete definition such that it can be said this "XT" card is not actually "XT"? If not, then it just seems like the complaint is about something that can be applied arbitrarily anyhow.

    -mklinic

    "Do something right, no one remembers.
    Do something wrong, no one forgets"
    -from No One Remembers by In Strict Confidence

  • Nanfoodle Member Legendary Posts: 10,900
    I went with a 4070 in the end for three reasons. First, Nvidia just does drivers better; no one comes close. Second, AI. Nvidia has always said they are more about software, and they have already started to use AI for their cards; I think this is just the start. IMO this will be a game changer in the end. Third, the power requirement. I'm running games at 1440p at high to ultra settings and not breaking 100w. Long run, that's a lot of savings and it's eco-friendly.
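    As a rough illustration of the "long run savings" point, here is a tiny sketch. The 60w draw gap, the hours per day, and the electricity price are all assumptions, so plug in your own numbers:

        # Hypothetical yearly electricity savings from a ~60 W lower average GPU draw.
        # Every input is an assumption; adjust to your own usage and local rates.
        watt_gap      = 60      # assumed average draw difference while gaming
        hours_per_day = 3       # assumed gaming time per day
        price_per_kwh = 0.15    # assumed electricity price in USD

        kwh_per_year = watt_gap / 1000 * hours_per_day * 365
        print(f"~{kwh_per_year:.0f} kWh/year, roughly ${kwh_per_year * price_per_kwh:.0f}/year")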
  • OG_Solareus Member Rare Posts: 1,041
    AMD leads in rasterization in the mid-range class, and they always have. AMD drivers have been solid since the HD series.

    Nvidia is milking their consumers; always have, always will.
    Ridelynn
  • Asm0deus Member Epic Posts: 4,617
    edited August 2023
    Vrika said:
    Gorwe said:
    Quizzical said:
    Did you have the same complaint about the low end Radeon RX 5300 XT, or even the 6500 XT?  All that the "XT" really means is that if there are XT and non-XT cards with the same number, the XT version is better.  I think it would be clearer if AMD just had the tens digit in the number vary, but maybe that's why I'm not a marketing person.
    Yes, but the XT in the 7800 "XT" no longer means that, even if it did just a generation ago (6800 and 6800 XT?). It's not about performance (at all), but more about clarity and PR. This is con-man PR (i.e. create mega-confusion) and it doesn't instill confidence in them. I like clarity, what can I say?
    The XT still means that if there is a non-XT version, it's less powerful. I fail to see how the lack of a non-XT version would cause anyone to buy the wrong product.

    If we take the 7800 XT, for example, which GPU do you think buyers will accidentally buy because there's an XT in that name?

    The XT may be meaningless and unnecessary (albeit perhaps cool-looking), but it's not liable to confuse anyone.

    He is just saying it's a bit shady, as they got people to think XT means one particular thing and then switched it up to something else.

    Kind of like how, back in the day, you'd say "I have a dope sound system in my car, it's 100w of window-vibrating power, and I have added a capacitor for the system," bla bla....

    However..... back in the day 100w meant watts RMS, but that was too easy, so marketing came up with a bunch of BS nonsense to confuse and con people into thinking they were buying 500w RMS when really they were buying 500w peak or PP, etc.

    That's a bit of the crap they pull sometimes with the naming conventions on GPUs.

    Kind of like the 4080 12GB / 4070 Ti rebranding thing.

    At least that is how I am reading what Gorwe is saying.
    Gorwe

    Brenics ~ Just to point out I do believe Chris Roberts is going down as the man who cheated backers and took down crowdfunding for gaming.





  • Quizzical Member Legendary Posts: 25,498
    Nanfoodle said:
    I went with a 4070 in the end for three reasons. First, Nvidia just does drivers better; no one comes close. Second, AI. Nvidia has always said they are more about software, and they have already started to use AI for their cards; I think this is just the start. IMO this will be a game changer in the end. Third, the power requirement. I'm running games at 1440p at high to ultra settings and not breaking 100w. Long run, that's a lot of savings and it's eco-friendly.
    If you're running TensorFlow a lot on your GPU, then fine.  That would be a good reason to go Nvidia, even if it's unusual for consumer use.  But if you're not doing something like that, then I think that being better at running something that you know that you'll never run is a really dumb reason to pick a card.

    If an RTX 4070 isn't breaking 100 W, then you're not pushing it very hard.  And that's fine:  it's good that GPUs can clock down to save power when you only need 1/4 of the performance that it offers.  But that doesn't make it into a sub-100 W card in demanding games.
  • Gorwe Member Posts: 1,609
    edited August 2023
    Quizzical said:
    Nanfoodle said:
    I went with a 4070 in the end for three reasons. First, Nvidia just does drivers better; no one comes close. Second, AI. Nvidia has always said they are more about software, and they have already started to use AI for their cards; I think this is just the start. IMO this will be a game changer in the end. Third, the power requirement. I'm running games at 1440p at high to ultra settings and not breaking 100w. Long run, that's a lot of savings and it's eco-friendly.
    If you're running TensorFlow a lot on your GPU, then fine.  That would be a good reason to go Nvidia, even if it's unusual for consumer use.  But if you're not doing something like that, then I think that being better at running something that you know that you'll never run is a really dumb reason to pick a card.

    If an RTX 4070 isn't breaking 100 W, then you're not pushing it very hard.  And that's fine:  it's good that GPUs can clock down to save power when you only need 1/4 of the performance that it offers.  But that doesn't make it into a sub-100 W card in demanding games.
    An upgrade from a Vega 56 to a 7800 (no longer calling this XT lol) seems very enticing. Even the physical size fits. Now the only conundrum is my PSU. It's old and 650W (iirc) ... will it run a 7800? Ofc, I'd upgrade my CPU too so the bottleneck isn't so bad (still AM4).
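    For the 650W PSU question, a crude headroom estimate is easy to sketch. The 263w figure is AMD's listed typical board power for the 7800 XT; the CPU and rest-of-system numbers are just assumptions, and transient spikes plus the age of the unit argue for extra margin:

        # Crude power-budget estimate for a 650 W PSU driving an RX 7800 XT on AM4.
        # All inputs are rough assumptions; transient spikes can briefly exceed them.
        gpu_board_power_w = 263   # AMD's listed typical board power for the 7800 XT
        cpu_peak_w        = 150   # generous ceiling for a typical AM4 gaming CPU
        rest_of_system_w  = 75    # motherboard, RAM, drives, fans, peripherals

        psu_w  = 650
        load_w = gpu_board_power_w + cpu_peak_w + rest_of_system_w
        print(f"Estimated peak load: {load_w} W ({load_w / psu_w:.0%} of a {psu_w} W unit)")

    On paper that leaves headroom, but the real capacity and transient behavior of an old unit are the bigger unknowns.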
  • Nanfoodle Member Legendary Posts: 10,900
    Quizzical said:
    Nanfoodle said:
    I went with a 4070 in the end for three reasons. First, Nvidia just does drivers better; no one comes close. Second, AI. Nvidia has always said they are more about software, and they have already started to use AI for their cards; I think this is just the start. IMO this will be a game changer in the end. Third, the power requirement. I'm running games at 1440p at high to ultra settings and not breaking 100w. Long run, that's a lot of savings and it's eco-friendly.
    If you're running TensorFlow a lot on your GPU, then fine.  That would be a good reason to go Nvidia, even if it's unusual for consumer use.  But if you're not doing something like that, then I think that being better at running something that you know that you'll never run is a really dumb reason to pick a card.

    If an RTX 4070 isn't breaking 100 W, then you're not pushing it very hard.  And that's fine:  it's good that GPUs can clock down to save power when you only need 1/4 of the performance that it offers.  But that doesn't make it into a sub-100 W card in demanding games.
    That's the point: 1440p at high and some ultra settings at around 100w kills AMD's power consumption.
  • Iselin Member Legendary Posts: 18,719
    Nanfoodle said:
    Quizzical said:
    Nanfoodle said:
    I went with a 4070 in the end for three reasons. First, Nvidia just does drivers better; no one comes close. Second, AI. Nvidia has always said they are more about software, and they have already started to use AI for their cards; I think this is just the start. IMO this will be a game changer in the end. Third, the power requirement. I'm running games at 1440p at high to ultra settings and not breaking 100w. Long run, that's a lot of savings and it's eco-friendly.
    If you're running TensorFlow a lot on your GPU, then fine.  That would be a good reason to go Nvidia, even if it's unusual for consumer use.  But if you're not doing something like that, then I think that being better at running something that you know that you'll never run is a really dumb reason to pick a card.

    If an RTX 4070 isn't breaking 100 W, then you're not pushing it very hard.  And that's fine:  it's good that GPUs can clock down to save power when you only need 1/4 of the performance that it offers.  But that doesn't make it into a sub-100 W card in demanding games.
    That's the point: 1440p at high and some ultra settings at around 100w kills AMD's power consumption.
    I'm glad you're happy with your new card but "kills AMD power consumption" seems like a grandiose and unwarranted statement unless you have the comparison testing data to back it up.

    Something like this: https://www.videocardbenchmark.net/power_performance.html

    "Social media gives legions of idiots the right to speak when they once only spoke at a bar after a glass of wine, without harming the community ... but now they have the same right to speak as a Nobel Prize winner. It's the invasion of the idiots”

    ― Umberto Eco

    “Microtransactions? In a single player role-playing game? Are you nuts?” 
    ― CD PROJEKT RED

  • Nanfoodle Member Legendary Posts: 10,900
    edited August 2023
    Iselin said:
    Nanfoodle said:
    Quizzical said:
    Nanfoodle said:
    I went with a 4070 in the end for three reasons. First, Nvidia just does drivers better; no one comes close. Second, AI. Nvidia has always said they are more about software, and they have already started to use AI for their cards; I think this is just the start. IMO this will be a game changer in the end. Third, the power requirement. I'm running games at 1440p at high to ultra settings and not breaking 100w. Long run, that's a lot of savings and it's eco-friendly.
    If you're running TensorFlow a lot on your GPU, then fine.  That would be a good reason to go Nvidia, even if it's unusual for consumer use.  But if you're not doing something like that, then I think that being better at running something that you know that you'll never run is a really dumb reason to pick a card.

    If an RTX 4070 isn't breaking 100 W, then you're not pushing it very hard.  And that's fine:  it's good that GPUs can clock down to save power when you only need 1/4 of the performance that it offers.  But that doesn't make it into a sub-100 W card in demanding games.
    That's the point: 1440p at high and some ultra settings at around 100w kills AMD's power consumption.
    I'm glad you're happy with your new card but "kills AMD power consumption" seems like a grandiose and unwarranted statement unless you have the comparison testing data to back it up.

    Something like this: https://www.videocardbenchmark.net/power_performance.html

    LOL, that site does not even have the 7800 XT on it, LOL, you're killing me. Go look it up: the RTX 4070 at 1440p runs about 100w, up to 200w on heavy loads. The 7800 XT is 263w at full load. So at full load that's about a 25% savings on power, but in the real world the 4070 normally sits at 100w if you are not pushing 4K. Most reviews expect the 7800 XT to be around the 160w mark at 1440p. That would be about a 30-40% power savings on your GPU by going with the 4070.
    Iselin
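    For what it's worth, those percentages roughly check out if you take the quoted wattages at face value (they are claimed estimates, not measured numbers):

        # Quick arithmetic check of the quoted power-savings percentages.
        # The wattages are the rough figures from the post above, not measurements.
        def savings(lower_w, higher_w):
            return (higher_w - lower_w) / higher_w

        print(f"Full load, ~200 W vs 263 W: {savings(200, 263):.0%} less power")
        print(f"1440p estimate, ~100 W vs ~160 W: {savings(100, 160):.0%} less power")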
  • Vrika Member Legendary Posts: 7,988
    Nanfoodle said:
    Iselin said:
    Nanfoodle said:
    Quizzical said:
    Nanfoodle said:
    I went with a 4070 in the end for three reasons. First, Nvidia just does drivers better; no one comes close. Second, AI. Nvidia has always said they are more about software, and they have already started to use AI for their cards; I think this is just the start. IMO this will be a game changer in the end. Third, the power requirement. I'm running games at 1440p at high to ultra settings and not breaking 100w. Long run, that's a lot of savings and it's eco-friendly.
    If you're running TensorFlow a lot on your GPU, then fine.  That would be a good reason to go Nvidia, even if it's unusual for consumer use.  But if you're not doing something like that, then I think that being better at running something that you know that you'll never run is a really dumb reason to pick a card.

    If an RTX 4070 isn't breaking 100 W, then you're not pushing it very hard.  And that's fine:  it's good that GPUs can clock down to save power when you only need 1/4 of the performance that it offers.  But that doesn't make it into a sub-100 W card in demanding games.
    That's the point: 1440p at high and some ultra settings at around 100w kills AMD's power consumption.
    I'm glad you're happy with your new card but "kills AMD power consumption" seems like a grandiose and unwarranted statement unless you have the comparison testing data to back it up.

    Something like this: https://www.videocardbenchmark.net/power_performance.html

    LOL, that site does not even have the 7800 XT on it, LOL, you're killing me. Go look it up: the RTX 4070 at 1440p runs about 100w, up to 200w on heavy loads. The 7800 XT is 263w at full load. So at full load that's about a 25% savings on power, but in the real world the 4070 normally sits at 100w if you are not pushing 4K. Most reviews expect the 7800 XT to be around the 160w mark at 1440p. That would be about a 30-40% power savings on your GPU by going with the 4070.
    The 7800 XT isn't released yet. At this point there are no reliable performance measurements or reviews available; those will come later.
     
  • Nanfoodle Member Legendary Posts: 10,900
    Vrika said:
    Nanfoodle said:
    Iselin said:
    Nanfoodle said:
    Quizzical said:
    Nanfoodle said:
    I went with a 4070 in the end for three reasons. First, Nvidia just does drivers better; no one comes close. Second, AI. Nvidia has always said they are more about software, and they have already started to use AI for their cards; I think this is just the start. IMO this will be a game changer in the end. Third, the power requirement. I'm running games at 1440p at high to ultra settings and not breaking 100w. Long run, that's a lot of savings and it's eco-friendly.
    If you're running TensorFlow a lot on your GPU, then fine.  That would be a good reason to go Nvidia, even if it's unusual for consumer use.  But if you're not doing something like that, then I think that being better at running something that you know that you'll never run is a really dumb reason to pick a card.

    If an RTX 4070 isn't breaking 100 W, then you're not pushing it very hard.  And that's fine:  it's good that GPUs can clock down to save power when you only need 1/4 of the performance that it offers.  But that doesn't make it into a sub-100 W card in demanding games.
    That's the point: 1440p at high and some ultra settings at around 100w kills AMD's power consumption.
    I'm glad you're happy with your new card but "kills AMD power consumption" seems like a grandiose and unwarranted statement unless you have the comparison testing data to back it up.

    Something like this: https://www.videocardbenchmark.net/power_performance.html

    LOL, that site does not even have the 7800 XT on it, LOL, you're killing me. Go look it up: the RTX 4070 at 1440p runs about 100w, up to 200w on heavy loads. The 7800 XT is 263w at full load. So at full load that's about a 25% savings on power, but in the real world the 4070 normally sits at 100w if you are not pushing 4K. Most reviews expect the 7800 XT to be around the 160w mark at 1440p. That would be about a 30-40% power savings on your GPU by going with the 4070.
    The 7800 XT isn't released yet. At this point there are no reliable performance measurements or reviews available; those will come later.
    You're right, the max power load given to us by AMD does not mean anything. The 7800 XT looks like a great deal, $100 less. My only point is it's not all roses. There are some things to consider when you pick between the 7800 XT and the 4070.
  • Iselin Member Legendary Posts: 18,719
    edited August 2023
    Nanfoodle said:
    Iselin said:
    Nanfoodle said:
    Quizzical said:
    Nanfoodle said:
    I went with a 4070 in the end for three reasons. First, Nvidia just does drivers better; no one comes close. Second, AI. Nvidia has always said they are more about software, and they have already started to use AI for their cards; I think this is just the start. IMO this will be a game changer in the end. Third, the power requirement. I'm running games at 1440p at high to ultra settings and not breaking 100w. Long run, that's a lot of savings and it's eco-friendly.
    If you're running TensorFlow a lot on your GPU, then fine.  That would be a good reason to go Nvidia, even if it's unusual for consumer use.  But if you're not doing something like that, then I think that being better at running something that you know that you'll never run is a really dumb reason to pick a card.

    If an RTX 4070 isn't breaking 100 W, then you're not pushing it very hard.  And that's fine:  it's good that GPUs can clock down to save power when you only need 1/4 of the performance that it offers.  But that doesn't make it into a sub-100 W card in demanding games.
    That's the point: 1440p at high and some ultra settings at around 100w kills AMD's power consumption.
    I'm glad you're happy with your new card but "kills AMD power consumption" seems like a grandiose and unwarranted statement unless you have the comparison testing data to back it up.

    Something like this: https://www.videocardbenchmark.net/power_performance.html

    LOL, that site does not even have the 7800 XT on it, LOL, you're killing me. Go look it up: the RTX 4070 at 1440p runs about 100w, up to 200w on heavy loads. The 7800 XT is 263w at full load. So at full load that's about a 25% savings on power, but in the real world the 4070 normally sits at 100w if you are not pushing 4K. Most reviews expect the 7800 XT to be around the 160w mark at 1440p. That would be about a 30-40% power savings on your GPU by going with the 4070.
    Feel free to link any site you think is better with real world professional performance measurements instead of the cherry-picked BS you're pulling out your ass... LOL right back at you.
    "Social media gives legions of idiots the right to speak when they once only spoke at a bar after a glass of wine, without harming the community ... but now they have the same right to speak as a Nobel Prize winner. It's the invasion of the idiots”

    ― Umberto Eco

    “Microtransactions? In a single player role-playing game? Are you nuts?” 
    ― CD PROJEKT RED

  • Nanfoodle Member Legendary Posts: 10,900
    edited August 2023
    Iselin said:
    Nanfoodle said:
    Iselin said:
    Nanfoodle said:
    Quizzical said:
    Nanfoodle said:
    I went with a 4070 in the end for three reasons. First, Nvidia just does drivers better; no one comes close. Second, AI. Nvidia has always said they are more about software, and they have already started to use AI for their cards; I think this is just the start. IMO this will be a game changer in the end. Third, the power requirement. I'm running games at 1440p at high to ultra settings and not breaking 100w. Long run, that's a lot of savings and it's eco-friendly.
    If you're running TensorFlow a lot on your GPU, then fine.  That would be a good reason to go Nvidia, even if it's unusual for consumer use.  But if you're not doing something like that, then I think that being better at running something that you know that you'll never run is a really dumb reason to pick a card.

    If an RTX 4070 isn't breaking 100 W, then you're not pushing it very hard.  And that's fine:  it's good that GPUs can clock down to save power when you only need 1/4 of the performance that it offers.  But that doesn't make it into a sub-100 W card in demanding games.
    That's the point: 1440p at high and some ultra settings at around 100w kills AMD's power consumption.
    I'm glad you're happy with your new card but "kills AMD power consumption" seems like a grandiose and unwarranted statement unless you have the comparison testing data to back it up.

    Something like this: https://www.videocardbenchmark.net/power_performance.html

    LOL, that site does not even have the 7800 XT on it, LOL, you're killing me. Go look it up: the RTX 4070 at 1440p runs about 100w, up to 200w on heavy loads. The 7800 XT is 263w at full load. So at full load that's about a 25% savings on power, but in the real world the 4070 normally sits at 100w if you are not pushing 4K. Most reviews expect the 7800 XT to be around the 160w mark at 1440p. That would be about a 30-40% power savings on your GPU by going with the 4070.
    Feel free to link any site you think is better with real world professional performance measurements instead of the cherry-picked BS you're pulling out your ass... LOL right back at you.

    Here, let me help you with a simple Google search. Everything I said is all over the internet. I don't get the rudeness when you're the one who posted info that had nothing to do with the 7800 XT vs the 4070. That's the card AMD has decided to mark as its competitor. I'm not knocking the 7800 XT, just pointing out the pros and cons. I have already said further up in the thread where I think the 7800 XT will win over the 4070, but that did not garner any negative response =) That makes me think you're just defending blindly for whatever reason.

    AMD Radeon RX 7800 XT Specs | TechPowerUp GPU Database

    NVIDIA GeForce RTX 4070 Specs | TechPowerUp GPU Database
  • Iselin Member Legendary Posts: 18,719
    Nanfoodle said:
    Iselin said:
    Nanfoodle said:
    Iselin said:
    Nanfoodle said:
    Quizzical said:
    Nanfoodle said:
    I went with a 4070 in the end for three reasons. First, Nvidia just does drivers better; no one comes close. Second, AI. Nvidia has always said they are more about software, and they have already started to use AI for their cards; I think this is just the start. IMO this will be a game changer in the end. Third, the power requirement. I'm running games at 1440p at high to ultra settings and not breaking 100w. Long run, that's a lot of savings and it's eco-friendly.
    If you're running TensorFlow a lot on your GPU, then fine.  That would be a good reason to go Nvidia, even if it's unusual for consumer use.  But if you're not doing something like that, then I think that being better at running something that you know that you'll never run is a really dumb reason to pick a card.

    If an RTX 4070 isn't breaking 100 W, then you're not pushing it very hard.  And that's fine:  it's good that GPUs can clock down to save power when you only need 1/4 of the performance that it offers.  But that doesn't make it into a sub-100 W card in demanding games.
    That's the point: 1440p at high and some ultra settings at around 100w kills AMD's power consumption.
    I'm glad you're happy with your new card but "kills AMD power consumption" seems like a grandiose and unwarranted statement unless you have the comparison testing data to back it up.

    Something like this: https://www.videocardbenchmark.net/power_performance.html

    LOL, that site does not even have the 7800 XT on it, LOL, you're killing me. Go look it up: the RTX 4070 at 1440p runs about 100w, up to 200w on heavy loads. The 7800 XT is 263w at full load. So at full load that's about a 25% savings on power, but in the real world the 4070 normally sits at 100w if you are not pushing 4K. Most reviews expect the 7800 XT to be around the 160w mark at 1440p. That would be about a 30-40% power savings on your GPU by going with the 4070.
    Feel free to link any site you think is better with real world professional performance measurements instead of the cherry-picked BS you're pulling out your ass... LOL right back at you.

    Here, let me help you with a simple Google search. Everything I said is all over the internet. I don't get the rudeness when you're the one who posted info that had nothing to do with the 7800 XT vs the 4070. That's the card AMD has decided to mark as its competitor. I'm not knocking the 7800 XT, just pointing out the pros and cons. I have already said further up in the thread where I think the 7800 XT will win over the 4070, but that did not garner any negative response =) That makes me think you're just defending blindly for whatever reason.

    AMD Radeon RX 7800 XT Specs | TechPowerUp GPU Database

    NVIDIA GeForce RTX 4070 Specs | TechPowerUp GPU Database
    Not defending anything, just criticizing your assertion that the 4070 "kills AMD power consumption" with zero real-world data to back it up, because the card has not even been released.


    "Social media gives legions of idiots the right to speak when they once only spoke at a bar after a glass of wine, without harming the community ... but now they have the same right to speak as a Nobel Prize winner. It's the invasion of the idiots”

    ― Umberto Eco

    “Microtransactions? In a single player role-playing game? Are you nuts?” 
    ― CD PROJEKT RED
