Nvidia launches GeForce RTX 3090 Ti video card/space heater

Quizzical Member Legendary Posts: 25,499
Way back in 2020, Nvidia launched the GeForce RTX 3090.  Well, sort of.  It was based on GDDR6X memory that was still only sampling, so it was a very soft launch.  But they were eventually able to get the memory that they wanted, a little after the miners started buying everything.

Anyway, with a name like Ampere, you know that it was going to use a lot of them.  And so it did:  the card was rated at 350 W.  But apparently that isn't enough, as Nvidia doesn't have anything really new to offer yet, more than a year and a half later.  So instead, they're today launching a GeForce RTX 3090 Ti.  The RTX 3090 disabled part of the chip for the sake of yields, but the 3090 Ti is fully functional.  It's also clocked higher.

You know what happens when you take a card and clock it higher, don't you?  Power goes up.  The RTX 3090 Ti is rated at 450 W, or 29% more than the RTX 3090.  For all that extra power, it offers around 10% more performance than the RTX 3090.  It also uses the new 16-pin PCI-E connector that is rated at up to 600 W.

And, of course, board partners just have to offer factory overclocked versions of the card that clock it even higher.  EVGA and Galax both have custom cards that have not one but two of the 16-pin, 600 W connectors, as if inviting you to overclock it far enough to burn more than 675 W.  Don't forget that the PCI Express slot can also deliver 75 W.  Or maybe that's just for safety as they expect you to come close to that limit.  Galax apparently says that their card is rated at 516 W.
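For anyone who wants to check the arithmetic, here's a quick sketch of the power-delivery ceilings implied by those connectors. The wattage figures are the ones quoted above; the helper function is purely illustrative:

```python
# Rough power-delivery budget for an RTX 3090 Ti class card.
# Figures from the post: 16-pin connector up to 600 W, PCIe slot 75 W.
PCIE_SLOT_W = 75
SIXTEEN_PIN_W = 600

def max_board_power(num_16pin_connectors: int) -> int:
    """Upper bound on deliverable power for a given connector count."""
    return PCIE_SLOT_W + num_16pin_connectors * SIXTEEN_PIN_W

print(max_board_power(1))  # reference design ceiling: 675 W
print(max_board_power(2))  # dual-connector EVGA/Galax boards: 1275 W

# TDP jump from the RTX 3090 (350 W) to the RTX 3090 Ti (450 W):
print(round((450 - 350) / 350 * 100))  # 29 (% more power)
```

So a single-connector card tops out at 675 W of deliverable power, which is why two connectors look like an invitation to go past that.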

Rumors say that the next generation GeForce RTX 4090 or whatever Nvidia decides to call it could burn as much as 600 W.  Internet rumors are often wrong, but after seeing the H100 listed at 700 W, I'm not expecting Nvidia to dial back the power usage unless these new cards don't sell.  Speaking of which, if you want to buy one, they're $2000 each.  Which is, of course, another reason why they might not sell as well as Nvidia hopes.

This, of course, raises the question of just how much heat you want a video card to kick out.  The power usage isn't really a problem for household wiring, or at least not yet--and not unless you do something like putting two of them in SLI in an otherwise very high powered rig.  A typical microwave will draw more power than a typical gaming desktop built around even a GeForce RTX 3090 Ti and Intel's new Core i9-12900KS.
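As a sanity check on that comparison, here's a back-of-the-envelope tally. The component wattages below are my own ballpark assumptions (especially the CPU turbo figure and the catch-all for everything else), not numbers from any spec sheet:

```python
# Ballpark comparison: gaming desktop vs. a typical microwave.
MICROWAVE_W = 1100   # typical household microwave (assumed)
GPU_W = 450          # RTX 3090 Ti rated board power
CPU_W = 241          # Core i9-12900KS maximum turbo power (assumed)
REST_W = 120         # motherboard, RAM, drives, fans, PSU losses (assumed)

desktop_w = GPU_W + CPU_W + REST_W
print(desktop_w, desktop_w < MICROWAVE_W)  # 811 True
```

Even with both the GPU and CPU flat out, the whole box stays comfortably under what a microwave pulls.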

But you don't keep your microwave, oven, or dryer running all day, and they can heat up the house quite a bit when they do run.  Maybe that's okay during a cold winter, but outside of that, I don't want a video card to use more than 300 W or so.

Comments

  • Mendel Member Legendary Posts: 5,609
    Space heater!  :D




    Logic, my dear, merely enables one to be wrong with great authority.

  • olepi Member Epic Posts: 3,053
    I live at high altitude where the air is thin and dry. Plus, I have no air conditioning. So I've had heating problems. One of my motherboards overheated to the point that it literally melted the power connectors.

    It can get to over 80 F in the house, and my machine doesn't like that at all.

    No thanks, I have to throttle my graphics card as it is.

    ------------
    2024: 47 years on the Net.


  • Vrika Member Legendary Posts: 7,989
    I think if you've got $2000 to spend on the fastest GPU available, then also paying the air conditioning's running costs won't be a problem for you.

    It's not like the card would be a good purchase from a practical point of view anyway. It's clearly not meant to be. It's an extra-expensive product for the enthusiasts who want the fastest speed at any price, and it should be treated as such.
  • Quizzical Member Legendary Posts: 25,499
    It's not just the cost of cooling.  I find it annoying if one room is a lot hotter than another, and cranking up the central air conditioning to make them all cooler doesn't fix that.  It's possible to adjust it some by selectively closing vents in some rooms, but that's also a nuisance to do and undo.
  • Vrika Member Legendary Posts: 7,989
    edited March 2022
    Quizzical said:
    It's not just the cost of cooling.  I find it annoying if one room is a lot hotter than another, and cranking up the central air conditioning to make them all cooler doesn't fix that.  It's possible to adjust it some by selectively closing vents in some rooms, but that's also a nuisance to do and undo.
    There are a lot of people without that problem.

    Just like there are a lot of people who can't get the largest TVs due to lack of room, people who can't buy garden furniture because they lack a yard, etc. Not all products are a fit for all homes, nor should they be.

    The increased price and power requirements are a problem when you've got products like the RTX 3070 Ti, which should go into a normal gaming computer but costs $800 and uses nearly 300 W. But the RTX 3090 Ti is clearly not a mainstream product. It's not intended to fit everyone.
  • Quizzical Member Legendary Posts: 25,499
    Well sure, it's a niche product that has its niche.  I'm not part of that niche, and not only because of the price tag.  If power consumption keeps going up and up, then my future video card purchases might just be the best 300 W or so card that I can find, even if it's far from the high end.
  • IceAge Member Epic Posts: 3,203
    Quizzical said:
    Way back in 2020, Nvidia launched the GeForce RTX 3090.  Well, sort of.  It was based on GDDR6X memory that was still only sampling, so it was a very soft launch.  But they were eventually able to get the memory that they wanted, a little after the miners started buying everything.

    Anyway, with a name like Ampere, you know that it was going to use a lot of them.  And so it did:  the card was rated at 350 W.  But apparently that isn't enough, as Nvidia doesn't have anything really new to offer yet, more than a year and a half later.  So instead, they're today launching a GeForce RTX 3090 Ti.  The RTX 3090 disabled part of the chip for the sake of yields, but the 3090 Ti is fully functional.  It's also clocked higher.

    You know what happens when you take a card and clock it higher, don't you?  Power goes up.  The RTX 3090 Ti is rated at 450 W, or 29% more than the RTX 3090.  For all that extra power, it offers around 10% more performance than the RTX 3090.  It also uses the new 16-pin PCI-E connector that is rated at up to 600 W.

    And, of course, board partners just have to offer factory overclocked versions of the card that clock it even higher.  EVGA and Galax both have custom cards that have not one but two of the 16-pin, 600 W connectors, as if inviting you to overclock it far enough to burn more than 675 W.  Don't forget that the PCI Express slot can also deliver 75 W.  Or maybe that's just for safety as they expect you to come close to that limit.  Galax apparently says that their card is rated at 516 W.

    Rumors say that the next generation GeForce RTX 4090 or whatever Nvidia decides to call it could burn as much as 600 W.  Internet rumors are often wrong, but after seeing the H100 listed at 700 W, I'm not expecting Nvidia to dial back the power usage unless these new cards don't sell.  Speaking of which, if you want to buy one, they're $2000 each.  Which is, of course, another reason why they might not sell as well as Nvidia hopes.

    This, of course, raises the question of just how much heat you want a video card to kick out.  The power usage isn't really a problem for household wiring, or at least not yet--and not unless you do something like putting two of them in SLI in an otherwise very high powered rig.  A typical microwave will draw more power than a typical gaming desktop built around even a GeForce RTX 3090 Ti and Intel's new Core i9-12900KS.

    But you don't keep your microwave, oven, or dryer running all day, and they can heat up the house quite a bit when they do run.  Maybe that's okay during a cold winter, but outside of that, I don't want a video card to use more than 300 W or so.
    TL;DR

    I hate Nvidia, so here I am, QQ-ing about something I'll never buy because it runs hot and is expensive... even though it's the best card available... in the world.

    Reporter: What's behind Blizzard success, and how do you make your gamers happy?
    Blizzard Boss: Making gamers happy is not my concern, making money.. yes!

  • Ridelynn Member Epic Posts: 7,383
    edited March 2022
    I have two thoughts on this:

    The first is, this is obviously a halo product. nVidia doesn't expect to sell a ton of these (although they may, if the miners get involved). As such, if they are hot and loud, well, only a few people will actually experience that. And in the chase for more FPS / better graphics -- the people willing to shell out $2K+ for just a GPU are also willing to overlook a lot of shortcomings. Like the fact that it's hot, and loud, and has a 3.5-slot cooler that will need support in a giant full-sized EATX case.

    The second is an extension of the first. Something like this will not go mainstream. Regular people will start to push back if the product is too loud, too big, too unwieldy. They may play some TDP games like they do with their Max-Q laptop products (which would let you get some impressive benchmarks until you saturate an entirely inadequate thermal solution) -- but yeah, nothing this power-hungry will take off mainstream without a lot of blowback on a lot of different consumer fronts.

    Not everyone buys the Lamborghini. Just imagine trying to crawl in and out of one every day with a bag of groceries.
  • Quizzical Member Legendary Posts: 25,499
    Ridelynn said:
    I have two thoughts on this:

    The first is, this is obviously a halo product. nVidia doesn't expect to sell a ton of these (although they may, if the miners get involved). As such, if they are hot and loud, well, only a few people will actually experience that. And in the chase for more FPS / better graphics -- the people willing to shell out $2K+ for just a GPU are also willing to overlook a lot of shortcomings. Like the fact that it's hot, and loud, and has a 3.5-slot cooler that will need support in a giant full-sized EATX case.

    The second is an extension of the first. Something like this will not go mainstream. Regular people will start to push back if the product is too loud, too big, too unwieldy. They may play some TDP games like they do with their Max-Q laptop products (which would let you get some impressive benchmarks until you saturate an entirely inadequate thermal solution) -- but yeah, nothing this power-hungry will take off mainstream without a lot of blowback on a lot of different consumer fronts.

    Not everyone buys the Lamborghini. Just imagine trying to crawl in and out of one every day with a bag of groceries.
    What I worry about is that we're headed for a future where the high end cards from all vendors are 500-600 W, the mid-range is 300 W, and even the low end is around 150 W.  For example, the Radeon RX 590 was hardly a high end card, but 225 W would have been very much in the high end not very long ago.
  • Cleffy Member Rare Posts: 6,414
    I use my RTX 3090 as a space heater. Very convenient dual-purpose device.
  • Ridelynn Member Epic Posts: 7,383
    edited March 2022
    Quizzical said:
    Ridelynn said:
    I have two thoughts on this:

    The first is, this is obviously a halo product. nVidia doesn't expect to sell a ton of these (although they may, if the miners get involved). As such, if they are hot and loud, well, only a few people will actually experience that. And in the chase for more FPS / better graphics -- the people willing to shell out $2K+ for just a GPU are also willing to overlook a lot of shortcomings. Like the fact that it's hot, and loud, and has a 3.5-slot cooler that will need support in a giant full-sized EATX case.

    The second is an extension of the first. Something like this will not go mainstream. Regular people will start to push back if the product is too loud, too big, too unwieldy. They may play some TDP games like they do with their Max-Q laptop products (which would let you get some impressive benchmarks until you saturate an entirely inadequate thermal solution) -- but yeah, nothing this power-hungry will take off mainstream without a lot of blowback on a lot of different consumer fronts.

    Not everyone buys the Lamborghini. Just imagine trying to crawl in and out of one every day with a bag of groceries.
    What I worry about is that we're headed for a future where the high end cards from all vendors are 500-600 W, the mid-range is 300 W, and even the low end is around 150 W.  For example, the Radeon RX 590 was hardly a high end card, but 225 W would have been very much in the high end not very long ago.
    I'm kinda OK with that. If the high-end cards are gonna be $2k -- that wasn't something I could afford anyway.

    The "high end" for me is still the $500 +/- bracket, and those usually come in right around 200 W. So it doesn't look like much has changed in my reality -- just that a new tier of super-halo cards exists that doesn't really affect me.

    The low end had already been more or less overtaken by IGPs; we haven't seen much under 125 W that's been worthwhile in a while, and only a very few select products come in without needing a PCIe power connector (<= 75 W).
  • Grymmoire Member Uncommon Posts: 191
    Two new card names of the future: the Sahara and the Mojave; game better while heating up to 900 square feet of your room!

  • Jamar870 Member Uncommon Posts: 573
    Well, the future for such hardware isn't good: we can't sustain continued increases in power while demands on the grid keep growing as we move away from carbon-based energy sources. Does anyone think the grid will be up to all the demands of the future, especially EVs and the move to electric heating and cooking?

  • Quizzical Member Legendary Posts: 25,499
    Grymmoire said:
    Two new card names of the future: The Sahara and Mohave; game better while heating up to 900 square feet of your room!

    Hardware vendors have been naming architectures for their high power consumption or heat output for quite some time.  AMD seems to have started the trend with Polaris and Vega:  stars have very high power outputs, after all.  Then came Intel Rocket Lake and Nvidia Ampere.  I'd like to think that it's not a coincidence that those architectures were all known for not being very efficient.