Intel unveils Alder Lake, coming November 4

Quizzical · Member Legendary · Posts: 25,499
A few days ago, Intel announced that their next generation high end mainstream consumer desktop CPUs will launch next week.  This is a very limited launch, with six SKUs coming, and they're probably all just different bins of the same die.  They'll compete against the Ryzen 5000 series desktop parts.

That was a lot of adjectives on the type of CPUs, which merits further explanation.  These are desktop parts only, not laptop or server, though laptop and server versions of the same chips are probably coming later.  They're mainstream consumer desktop parts, as opposed to the HEDT parts akin to AMD's Threadripper or Intel's previous generations of Core-X parts.  It's not a full lineup of mainstream consumer desktop parts, but only the top end of it.  Listed prices range from $264 to $589.

It's really three pairs of parts, with each pair offering a version with an integrated GPU (ending in K) and one without (ending in KF).  The KF parts almost certainly have integrated GPUs physically present, too; this is just Intel's way of salvaging dies where the integrated GPU is defective, by disabling it and selling the chip as CPU-only.  Not having an integrated GPU is fine in a gaming desktop that is going to have a discrete video card, but not fine in a laptop.  Intel's list price says that giving up the integrated GPU will save you $25.

Alder Lake will be the first parts on Intel's newly renamed "Intel 7" process node, formerly known as 10 nm Enhanced SuperFin, and even more formerly known as 10++ nm.  Isn't it wonderful how a more mature process node that doesn't shrink anything as compared to the less mature version now gets a smaller number in its name?  This is kind of like TSMC going from 16 nm to 12 nm, or Samsung from 14 to 12, or Samsung from 10 to 8, or TSMC from 7 to 6, or... well, you see why Intel felt they had to do it.

The top end part has 16 cores and 24 threads.  No, that's not a typo, even though 24 is not a multiple of 16.  Intel is launching its first mainstream part with heterogeneous x86 cores (the low-volume Lakefield chips tried this in 2020, but only in a handful of laptops).  There are trade-offs between power and performance, and the "P" cores are designed for high performance at the expense of high power consumption, while the "E" cores are designed for low power consumption at the expense of performance.  The P cores have hyperthreading, for two threads per core.  The E cores don't, so they only have one thread per core.
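The core-and-thread arithmetic works out like this (a trivial sketch; the 8P + 8E split for the top SKU is from Intel's announced specs):

```python
def total_threads(p_cores, e_cores):
    """P cores have hyperthreading (2 threads each); E cores don't (1 each)."""
    return p_cores * 2 + e_cores * 1

# Top-end Alder Lake: 8 P cores + 8 E cores = 16 cores
print(total_threads(8, 8))  # -> 24 threads from 16 cores
```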

Intel is basically promising that the P cores will reclaim the single-threaded performance crown from AMD.  As compared to Rocket Lake, IPC goes up by about 15%, while the max turbo clock speed declines only slightly.  Rocket Lake was commonly about even with AMD's Zen 3 cores in single-threaded performance, so this probably does get Intel back the single-threaded performance lead.

Meanwhile, Intel is claiming that the E cores will have about the same IPC as Sky Lake, which launched way back in 2015.  Well, technically Intel said the much more recent Comet Lake, but Comet Lake is just Sky Lake Refresh Refresh Refresh Refresh, so Comet Lake cores are basically the same as Sky Lake cores.  Also, the E cores will clock lower than the original Sky Lake cores.  Even so, if Intel's claims are accurate, they're still decently fast cores, so they're nothing like the old in-order (pre-Silvermont) Atom cores.

So why move to heterogeneous x86 cores for a desktop part?  There are two reasons that I can see, both of which are dumb.  One or more of them may not have been a motive for Intel, but if there is actually a good reason to go this route, no one seems to know what it is.

One reason is to reduce power consumption when idle or otherwise running a very light workload.  Cell phones have had mixed cores like this for quite a few years.  ARM calls it big.LITTLE.  The idea is that when you need a lot of performance, you fire up the big cores and get a lot of performance from them.  When you don't need so much performance, you shut down the big cores and run everything on the little cores, which are more efficient.  Most likely, you shut down all but one of the little cores while you're at it.

So why is this a dumb idea in desktops?  In a cell phone, saving a tenth of a watt matters a lot.  In a desktop, it really doesn't.  That desktop may already be cranking out something like 100 W at idle, and reducing that by a fraction of a watt really doesn't matter.  Even if you did care about that fraction of a watt, there are easier ways to get it.  Alder Lake is eventually going to go into laptops, too, which are more power-sensitive than desktops.  But laptops still use massively more power than cell phones, so even there, that small fraction of a watt doesn't make much of a difference.

A second reason is that in workloads that scale well to many cores, more cores clocked lower will tend to beat fewer cores clocked higher.  If you want to get the best possible performance inside of a given die size or power budget, having a bunch of the "E" cores will beat far fewer "P" cores.  One problem with this is that many slow cores means that the CPU will feel slow when you're only using a few cores but want them to clock high.  If Intel's claims are accurate, tasks running on the E cores will tend to be slower than if they were running on a Core i7-6700K from 2015.  Another problem is that embarrassingly parallel workloads can often be shifted to a GPU, which will blow away any CPU in existence in any efficiency metric you can imagine.
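To see why many small cores win on parallel throughput, here's a toy model. The ratios used (one P core costing the die area of four E cores, an E core delivering 60% of a P core's throughput) are illustrative assumptions for the sake of the arithmetic, not Intel's figures:

```python
# Illustrative only: the area and per-core throughput ratios here are
# rough assumptions, not measured numbers.
P_AREA, E_AREA = 4.0, 1.0   # assume one P core costs the area of ~4 E cores
P_PERF, E_PERF = 1.0, 0.6   # assume an E core gives ~60% of a P core's throughput

def best_throughput(area_budget):
    """Aggregate throughput of an all-P vs an all-E design in a fixed die
    area, assuming the workload scales perfectly to any number of cores."""
    all_p = (area_budget // P_AREA) * P_PERF
    all_e = (area_budget // E_AREA) * E_PERF
    return all_p, all_e

p, e = best_throughput(16)   # an area budget worth 16 E cores
print(p, e)                  # 4 P cores vs 16 E cores: 4.0 vs 9.6
```

Under these assumptions the all-E design more than doubles aggregate throughput, but any single thread only runs at 60% speed, which is exactly the trade-off described above.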

The solution to this is to have both the P cores and E cores available.  The problem is, how do you decide whether to use the P cores or E cores?  You don't, but Windows does, and Intel is basically saying that anything besides Windows 11 will choke on this and do it all wrong.  Hope you weren't planning on buying this and running Linux on it.  If you want something to run fast and Windows decides to stick it on the E cores, then you just lost more than 40% of your performance.  If all of the cores are the same, then the OS can't screw this up, at least other than by deciding not to schedule threads at all.

Intel says that it's possible to disable the E cores in the BIOS.  I expect that in some communities, disabling the E cores will be regarded as a common performance optimization.  The OS can't put stuff on the E cores when it shouldn't if it can't put stuff on the E cores at all.
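If you're on Linux and don't trust the scheduler, you can also pin a process to the P cores yourself. This is a sketch only: the logical-CPU numbering below (0-15 for the P-core hyperthread siblings, 16-23 for the E cores) is an assumption about how an 8P+8E part might enumerate, so verify it with `lscpu --extended` before relying on it:

```python
import os

# Hypothetical layout for an 8P+8E part: logical CPUs 0-15 are the
# P-core hyperthread siblings and 16-23 are the E cores. This numbering
# is an assumption -- check it on real hardware first.
P_CORE_CPUS = set(range(16))

def pin_to_p_cores(pid=0):
    """Restrict a process (pid 0 = the caller) to the P cores, so the
    scheduler can't migrate its threads onto the E cores."""
    available = os.sched_getaffinity(pid)
    target = P_CORE_CPUS & available
    if target:                  # fall back gracefully on smaller machines
        os.sched_setaffinity(pid, target)
    return os.sched_getaffinity(pid)

print(pin_to_p_cores())
```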

Comments

  • Quizzical · Member Legendary · Posts: 25,499
    But Alder Lake isn't just about new cores.  This is going to be the first launched part to support DDR5 memory.  Alder Lake will support both DDR4 and DDR5, but a given motherboard will only support one or the other.  It officially supports DDR4 memory up to 3200 MHz, but DDR5 memory up to 4800 MHz.  4800 MHz is the slowest speed of DDR5 listed on Newegg right now.  I'm not sure if that will change.
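A side note on units: those "MHz" figures are really transfer rates (MT/s), and each DIMM has a 64-bit (8-byte) data path, so peak theoretical bandwidth is easy to compute:

```python
def peak_bandwidth_gbs(mega_transfers, bus_bytes=8):
    """Theoretical peak bandwidth of one 64-bit DIMM, in GB/s."""
    return mega_transfers * bus_bytes / 1000

print(peak_bandwidth_gbs(3200))  # DDR4-3200 -> 25.6 GB/s per DIMM
print(peak_bandwidth_gbs(4800))  # DDR5-4800 -> 38.4 GB/s per DIMM
```

So DDR5-4800 offers 50% more peak bandwidth per DIMM than DDR4-3200, before even counting DDR5's split into two independent 32-bit subchannels.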

    DDR5 is generally better than DDR4.  It's not just about the clock speed and memory bandwidth.  It runs at a lower voltage, so it should use less power at a given clock speed.  Latency (as measured in nanoseconds, not clock cycles) stays about the same as with DDR4.  There are some other tricks that DDR5 offers, too.  The problem is the price tag:  at current Newegg prices, DDR5 costs about twice as much per GB as DDR4, and it's expected that there will be a price premium for DDR5 for a couple of years before prices drop to match DDR4, though memory prices can be unpredictable.  DDR5 also doesn't come in modules smaller than 8 GB, so you'll need at least 16 GB of it.

    Unfortunately, that memory support of "up to" 4800 MHz has some serious caveats.  If a motherboard has two DDR5 memory slots, then Alder Lake supports running it at up to 4800 MHz.  If the motherboard has four memory slots, then Alder Lake only supports up to 4400 MHz, even if you're only using two of the four slots.  If you actually use all four slots, then it only supports up to 4000 MHz.  And the 4000 MHz assumes that it's single-rank memory.  Make that 3600 MHz if it's dual-rank, in which case, you might want to think about just using DDR4 instead to save money.
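Those caveats boil down to a small lookup table. Here's one way to encode them (speeds in MT/s, taken straight from the limits described above):

```python
def max_ddr5_speed(board_slots, dimms_used, dual_rank=False):
    """Officially supported DDR5 speed (MT/s) on Alder Lake, per the
    slot-count, population, and rank caveats described above."""
    if board_slots == 2:
        return 4800
    if dimms_used <= 2:                     # 4-slot board, only 2 populated
        return 4400
    return 3600 if dual_rank else 4000      # all 4 slots populated

print(max_ddr5_speed(2, 2))                  # 4800
print(max_ddr5_speed(4, 2))                  # 4400
print(max_ddr5_speed(4, 4, dual_rank=True))  # 3600
```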

    Alder Lake also brings support for PCI Express 5.0.  It has a PCI-E 5.0 x16 connection coming off of the CPU socket, intended for a video card.  There aren't any PCI-E 5.0 video cards yet, but there will be, even if 5.0 won't offer much of an advantage over 4.0 for gaming.  PCI Express lanes from the chipset are only PCI-E 4.0, however.  So don't get too excited about the possibility of PCI-E 5.0 SSDs.  Alder Lake won't support them, other than at PCI-E 4.0 speeds.

    Also new to Alder Lake is honesty about the power usage.  Well, maybe.  We'll have to see when the parts actually launch.  With Rocket Lake, you had a nominally 125 W part that could easily use double that under heavy workloads.  With Alder Lake, Intel lists an official turbo power.  It's 241 W for the top end part.  For comparison, the entire Ryzen 5000 series is 142 W, at least if you're looking at the X parts rather than the G parts that came later.  So yes, Intel officially needs 99 W more than AMD for a mix of 8 P and 8 E cores as compared to 16 of AMD's high-performance Zen 3 cores in the Ryzen 9 5950X.  At least Intel's 16 cores, eight of which are supposedly efficient, do come in a little lower in power than the 280 W for AMD's 64-core Ryzen Threadripper 3990X.  At least assuming that Intel is honest, which they might not be.  That sort of power burn isn't cheap to cool, either.

    So how does Alder Lake perform?  We should find out next week when reviews go up.  I do expect that the Core i9-12900K will typically offer the highest frame rates in games, reclaiming that title from AMD's Ryzen 5000 series.  Of course, the latter is plenty fast, too, so you'll sometimes be looking at the difference between 260 and 280 frames per second and then have to decide whether to care about that.  Or perhaps rather, whether you care about it more than the corresponding difference of 100 W.

    There's also one other considerable caveat about Alder Lake's mixed cores.  Intel says that Denuvo DRM checks your hardware specs to make sure that you're only running a game on one computer.  If the game sometimes starts on a P core and sometimes on an E core, then Denuvo thinks that they're different CPUs.  In some cases, it will then lock you out of the game.  Intel says that they're working on fixing this.  You can decide whether you want to blame Intel or Denuvo for this, and really, I'd blame Denuvo.  But it's apparently a problem on Alder Lake, though it's not like Denuvo just works flawlessly on older CPUs.
  • olepi · Member Epic · Posts: 3,053
    "They'll compete against the Ryzen 5000 series desktop parts."

    The Ryzen 5000 is a medium part, out for over a year.

    ------------
    2024: 47 years on the Net.


  • Quizzical · Member Legendary · Posts: 25,499
    olepi said:
    "They'll compete against the Ryzen 5000 series desktop parts."

    The Ryzen 5000 is a medium part, out for over a year.
    At least for now, it's the latest AMD part in the same segment.  Unless AMD is going to beat Intel to the punch and launch something new next week.  Perhaps later in Alder Lake's lifetime, it will be competing against some other, newer AMD part.

    AMD has talked about some Zen 3+ parts that are basically Zen 3 with triple the L3 cache.  I'm skeptical that this will be as big of an improvement as AMD seems to claim.  It could easily end up like the old Broadwell parts with the Crystalwell cache:  the cache is really useful now and then, but often makes no difference, and adds so much to the cost that not very many chips get produced before they get discontinued.  Zen 4 is coming, but likely further away.

    But part of my point is that the Ryzen 5000 series X series parts currently dominate the market segment that the initial Alder Lake parts are built for, as opposed to laptops, servers, HEDT, low end desktops, cell phones, or whatever else uses CPUs.
  • olepi · Member Epic · Posts: 3,053
    Quizzical said:
    olepi said:
    "They'll compete against the Ryzen 5000 series desktop parts."

    The Ryzen 5000 is a medium part, out for over a year.
    ...

    But part of my point is that the Ryzen 5000 series X series parts currently dominate the market segment that the initial Alder Lake parts are built for, as opposed to laptops, servers, HEDT, low end desktops, cell phones, or whatever else uses CPUs.
    I'm glad that Intel can do this, I don't like depending on fabs in other countries.



  • Ridelynn · Member Epic · Posts: 7,383
    I’m waiting for some real world data before drawing any conclusions. I have to say a lot of this marketing around AL smells of desperation, but I am hoping more is real than not
  • olepi · Member Epic · Posts: 3,053
    Ruh-roh, it looks like Intel is up to its old tricks again. Their benchmark against the Ryzen 5950X was done with the AMD part's power throttled at 105W, while the Intel part ran at 241W.

    And the benchmark was done using Windows-11, before the AMD fix was in.



  • laserit · Member Legendary · Posts: 7,591
    One thing I will say after playing with a Ryzen for almost a year is that they sure do run hot. ;)

    "Be water my friend" - Bruce Lee

  • Ridelynn · Member Epic · Posts: 7,383
    They don't compete against Ryzen this time. They crush them.
    Hopefully AMD has something in their bags to counter it. Competition is good for us consumers, especially in these times of high component costs.

    My 12700k is coming ;)
    Can't wait to see real world benchmarks from third parties. I would love for this to be true, but I'm not gonna take Intel's word for it
  • Quizzical · Member Legendary · Posts: 25,499
    laserit said:
    One thing I will say after playing with a Ryzen for almost a year is that they sure do run hot. ;)
    If a 142 W Ryzen chip runs too hot for your liking, then you might not want a 241 W Core i9-12900K.  Power consumption of high end parts has been slowly increasing over time.  Enough decades of that have passed for it to be a considerable problem today.
  • Quizzical · Member Legendary · Posts: 25,499
    Torval said:
    Quizzical said:
    laserit said:
    One thing I will say after playing with a Ryzen for almost a year is that they sure do run hot. ;)
    If a 142 W Ryzen chip runs too hot for your liking, then you might not want a 241 W Core i9-12900K.  Power consumption of high end parts has been slowly increasing over time.  Enough decades of that have passed for it to be a considerable problem today.
    Agreed. That Anand article I posted goes over this a bit and discusses their cooling design. I think they see it as a primary concern in real world use. Anand also pointed out that the cooling design shown is only really for high end i9 chips (those are the only ones being made at first anyway) and one concern is whether lower end chips will get the thermal paste treatment instead of the metal layer. Those won't be 241W chips though either (supposedly).
    Cooling processors isn't just about getting the heat off of the processor.  Pairing a 241 W CPU with a 350 W GPU can mean that playing games on the computer heats up your room enough to be a nuisance.  As power usage has tended to go up over the decades, there's little reason to believe that it's going to stop here.  Do you want a future where it's commonly accepted that a decent gaming computer will crank out at least 1000 W under common gaming loads?  I don't.
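Essentially all of the electrical power a computer draws ends up as heat in the room, so the comparison to heating is direct. One watt is about 3.412 BTU/hr:

```python
def heat_output_btu_per_hr(watts):
    """A computer drawing `watts` dumps that power into the room as heat;
    1 W is about 3.412 BTU/hr."""
    return watts * 3.412

# 241 W CPU + 350 W GPU (before counting the rest of the system):
print(heat_output_btu_per_hr(241 + 350))
```

That's roughly 2,000 BTU/hr for the CPU and GPU alone, in the same ballpark as a small space heater on its low setting.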
  • Ridelynn · Member Epic · Posts: 7,383
    The new 16-pin PCIe 5.0 power connector (12 power pins plus 4 sense pins; apparently different from the 12-pin connector nVidia introduced on the FE cards recently, and not really related to PCIe 5.0 signaling) supports up to 600W of additional power, allowing GPU cards to scale up to 675W on a single cable.
  • Asm0deus · Member Epic · Posts: 4,618
    edited November 2021
    Not very interesting for me as a consumer that doesn't change his whole system for the newest bling tbh.

    I am more interested in GPUs tbh as CPUs are relatively cheap now in the grand scheme of things when looking at a gaming PC.

    When I say cheap I mean it used to be that my CPU cost more than the GPU I put with it when building, but now... most of your budget kind of has to go into the GPU alone due to the scalping problem....

    Brenics ~ Just to point out I do believe Chris Roberts is going down as the man who cheated backers and took down crowdfunding for gaming.





  • Ridelynn · Member Epic · Posts: 7,383
    edited November 2021
    Quizzical said:
    Torval said:
    Quizzical said:
    laserit said:
    One thing I will say after playing with a Ryzen for almost a year is that they sure do run hot. ;)
    If a 142 W Ryzen chip runs too hot for your liking, then you might not want a 241 W Core i9-12900K.  Power consumption of high end parts has been slowly increasing over time.  Enough decades of that have passed for it to be a considerable problem today.
    Agreed. That Anand article I posted goes over this a bit and discusses their cooling design. I think they see it as a primary concern in real world use. Anand also pointed out that the cooling design shown is only really for high end i9 chips (those are the only ones being made at first anyway) and one concern is whether lower end chips will get the thermal paste treatment instead of the metal layer. Those won't be 241W chips though either (supposedly).
    Cooling processors isn't just about getting the heat off of the processor.  Pairing a 241 W CPU with a 350 W GPU can mean that playing games on the computer heats up your room enough to be a nuisance.  As power usage has tended to go up over the decades, there's little reason to believe that it's going to stop here.  Do you want a future where it's commonly accepted that a decent gaming computer will crank out at least 1000 W under common gaming loads?  I don't.
    Hmm. Kinda.

    GPUs have often flirted with 300W on the high end. We still are, more or less; this is the first time we're seeing an effort to really push past that, with new 300W+ TDPs being targeted. Time will tell if they really do push hard past that, or if it's mostly just marketing and bragging rights for a few cards aimed at people with more money than sense and people chasing world records. You could, especially with AIO coolers on the GPU and new triple-slot coolers, push past that 300W barrier; today's high end cards can do that, especially with overclocking, but I'd still question whether that's a great idea, and that's only on the very top end SKUs that cost a lot of money.

    CPUs have had a pretty long but slow trip up to the low 100W range with Prescott, then dropped back down with Core to the ~65W range, which is still where most mainstream desktop CPUs sit. Only in the HEDT and heavy-overclock regions do we see it really creep back up over 100W on the CPU side. Right now, I think the only reason we see Intel pushing 200W+ is because they have to in order to gain the marketing material and "beat" Ryzen. If those chips are pulling 200W+ on a continual basis, it's a huge problem; I'd expect them to be much lower than that typically, and only boost up there for some specific benchmarking applications.... 

    But you are right, there is definitely an unwritten rule about total power in a computer package. 

    If you think about it, most electric space heaters are 600/1200W (Low/High)... and a computer that requires the same will put out very nearly the same amount of heat as the space heater will. 

    My computer setup can draw up to 700W from the wall if I'm benchmarking/stressing it, but more often it sits around 400-450 while gaming. That right there is enough that, over time, it definitely will start to affect the temperature in the room.  I think, for me, that's about as far as I'm willing to go for performance -- much beyond that and I would need to crank up the AC or put the computer in a different room than I'm in, and that's more engineering that I want to do to play a video game. I already can't really run a window unit and my computer on the same electrical circuit without tripping a breaker.

    Consoles are similarly constrained -- if you want it to fit inside your typical home theater, you have some size constraints and thermal constraints ... or at least, before it starts to sound like a jet engine: and that will get you dinged just as badly as being slow will.

  • laserit · Member Legendary · Posts: 7,591
    Quizzical said:
    laserit said:
    One thing I will say after playing with a Ryzen for almost a year is that they sure do run hot. ;)
    If a 142 W Ryzen chip runs too hot for your liking, then you might not want a 241 W Core i9-12900K.  Power consumption of high end parts has been slowly increasing over time.  Enough decades of that have passed for it to be a considerable problem today.
    It depends on whether that power is being used efficiently, excess heat may indicate issues. 

    I'm just very surprised at the package temp, especially running at stock in a very large and well ventilated case.

    I reported during the NW beta that I was having issues, with my rig shutting down. Figured it was my EVGA 3090

    What I've since discovered is that my CPU was getting to hot, in part because of the GPU. 

    So far managing temps with this Ryzen has been challenging. Something that's never been an issue for me.

    Maybe one of my other components are faulty. I think I'll purchase a new liquid cooling loop to set my mind at ease.

    Performance is awesome though.



  • Ridelynn · Member Epic · Posts: 7,383
    edited November 2021
    laserit said:

    So far managing temps with this Ryzen has been challenging. Something that's never been an issue for me.

    Maybe one of my other components are faulty. I think I'll purchase a new liquid cooling loop to set my mind at ease.

    Performance is awesome though.



    My Ryzens run warmer than I'm used to at low loads, but I haven't seen anything close to overheating when stressed. I'm used to a CPU at idle running fairly close to ambient - maybe 30-40C depending on the cooler. I have at home a 5900, a 5700 and a 3700, and all of them run around 50C at idle.

    But under full load, any of them will only get up to about 65-75C; well under any temp that's dangerous. 

    I've just come to accept that's the way Ryzens are engineered. It doesn't hurt the chip, it's not overheating or even coming close to it, and everything appears to be working just fine regardless. 

    I've heard you can get idle temps back down if you go tweaking the voltage curve / undervolt; but since I don't have any other problem with the chips, the performance is spot on, the power draw isn't skewed high, and the fans and such aren't cranking into overdrive, I haven't felt the need to go monkey with it.
  • olepi · Member Epic · Posts: 3,053
    Intel basically cheated and overclocked their chip to use over twice as much power as the AMD chip. In that configuration, with the AMD part not overclocked, Intel can win.

    But that is against a Zen 3 chip built on TSMC's 7nm process. AMD is scheduled to release Zen 4 chips on the 5nm process next year. Zen 4 is faster than Zen 3, and 5nm is faster and uses less power than 7nm.

    So Intel had better be ready to release the successor to their new chip next year.



  • laserit · Member Legendary · Posts: 7,591
    Ridelynn said:
    laserit said:

    So far managing temps with this Ryzen has been challenging. Something that's never been an issue for me.

    Maybe one of my other components are faulty. I think I'll purchase a new liquid cooling loop to set my mind at ease.

    Performance is awesome though.



    My Ryzens run warmer than I'm used to at low loads, but I haven't seen anything close to overheating when stressed. I'm used to a CPU at idle running fairly close to ambient - maybe 30-40C depending on the cooler. I have at home a 5900, a 5700 and a 3700, and all of them run around 50C at idle.

    But under full load, any of them will only get up to about 65-75C; well under any temp that's dangerous. 

    I've just come to accept that's the way Ryzens are engineered. It doesn't hurt the chip, it's not overheating or even coming close to it, and everything appears to be working just fine regardless. 

    I've heard you can get idle temps back down if you go tweaking the voltage curve / undervolt; but since I don't have any other problem with the chips, the performance is spot on, the power draw isn't skewed high, and the fans and such aren't cranking into overdrive, I haven't felt the need to go monkey with it.
    I've been able to drop about 6-7 degrees by moving my 3090 further down away from the CPU and cooling lines. The heat is crazy when it's all being pushed. I've had the system shut down because of CPU overheating at stock.

    I need to isolate the thermals from the GPU. The heat is crazy off the top of that thing. It was affecting cooling line temperatures big time.

    Already have some designs in my head to directly move the gpu thermals out the back.

    Idle temps go down into the high 30’s. After everything is hot from working hard, it will idle in the mid 40’s and when it’s giving it, I’ve got things down to the high 80’s at stock. It’s no longer tripping but I find the temps uncomfortably high for stock.

    Maybe my 3090 is faulty? Throwing out too much heat?




  • olepi · Member Epic · Posts: 3,053
    "Maybe my 3090 is faulty? Throwing out too much heat?"

    My AMD 6800 XT GPU will overheat with several games, including NW, if I don't limit the framerate. On the beta test for Ship of Heroes, it got to 200 FPS (!), and was over 100C.

    Limiting FPS to 70 or so solves that problem. Are you limiting frames per second?



  • laserit · Member Legendary · Posts: 7,591
    olepi said:
    "Maybe my 3090 is faulty? Throwing out too much heat?"

    My AMD 6800 XT GPU will overheat with several games, including NW, if I don't limit the framerate. On the beta test for Ship of Heroes, it got to 200 FPS (!), and was over 100C.

    Limiting FPS to 70 or so solves that problem. Are you limiting frames per second?
    I can keep all temperatures in the green if I throttle things down. Thing is that's not why I purchased a 3090. I purchased one because I have a couple applications where I really need the horsepower.

    I'm just surprised at how hot the system runs at stock with everyday run of the mill pc games.

    Just so different from my usual experience.



     


  • Quizzical · Member Legendary · Posts: 25,499
    laserit said:
    Ridelynn said:
    laserit said:
    So far managing temps with this Ryzen has been challenging. Something that's never been an issue for me.

    Maybe one of my other components are faulty. I think I'll purchase a new liquid cooling loop to set my mind at ease.

    Performance is awesome though.

    My Ryzens run warmer than I'm used to at low loads, but I havnen't seen anything close to overheat when stressed. I'm used to a CPU at idle running fairly close to ambient - maybe 30-40C depending on the cooler. I have at home a 5900, a 5700 and a 3700, and all of them run around 50C at idle.

    But under full load, any of them will only get up to about 65-75C; well under any temp that's dangerous. 

    I've just come to accept that's the way Ryzen's are engineered. It doesn't hurt the chip, it's not overheating or even coming close to it, and everything appears to be working just fine regardless. 

    I've heard you can get idle temps back down if you go tweaking the voltage curve / undervolt; but since I don't have any other problem with the chips, the performance is spot on, the power draw isn't skewed high, and the fans and such aren't cranking into overdrive, I haven't felt the need to go monkey with it.
    I've been able to drop about 6-7 degrees by moving my 3090 further down away from the CPU and cooling lines. The heat is crazy when it's all being pushed. I've had the system shut down because of CPU overheating at stock.

    I need to isolate the thermals from the GPU. The heat is crazy off the top of that thing. It was affecting cooling line temperatures big time.

    Already have some designs in my head to directly move the gpu thermals out the back.

    Idle temps go down into the high 30’s. After everything is hot from working hard, it will idle in the mid 40’s and when it’s giving it, I’ve got things down to the high 80’s at stock. It’s no longer tripping but I find the temps uncomfortably high for stock.

    Maybe my 3090 is faulty? Throwing out too much heat?
    Apart from sub-ambient coolers, which are rather exotic for everyday use, the job of a CPU or GPU cooler is to reduce the temperature difference between the CPU or GPU and the surrounding ambient air temperature.  If you're blowing very hot air at the cooler, then a good cooler that keeps the temperature difference down to something reasonable may well still allow the chip to overheat.

    The reason that the RTX 3090 throws off so much heat is that it's rated as a 350 W video card, and that's a lot of heat.  The most efficient way to cool a video card in isolation is to spray heat off in all directions, which may be good for the video card, but is bad for most other things in the computer.  If you've got enough general case airflow, it might well be fine, but it will take a lot more case airflow to properly handle a system with an RTX 3090 than a 200 W video card.
  • Quizzical · Member Legendary · Posts: 25,499
    laserit said:
    olepi said:
    "Maybe my 3090 is faulty? Throwing out too much heat?"

    My AMD 6800 XT GPU will overheat with several games, including NW, if I don't limit the framerate. On the beta test for Ship of Heroes, it got to 200 FPS (!), and was over 100C.

    Limiting FPS to 70 or so solves that problem. Are you limiting frames per second?
    I can keep all temperatures in the green if I throttle things down. Thing is that's not why I purchased a 3090. I purchased one because I have a couple applications where I really need the horsepower.

    I'm just surprised at how hot the system runs at stock with everyday run of the mill pc games.

    Just so different from my usual experience. 
    AMD's GPU drivers let you limit the GPU turbo, and you can make different choices for different games.  I'm not sure if Nvidia's drivers offer something analogous, but you might want to check.  A lot of times, once one GPU vendor offers something, the other feels the need to offer the same thing to compete.

    The last 10% of the clock speed uses a lot more than 10% of the power, as it ramps up voltage to increase the clock speed, too, so throttling back the clock speeds can save a lot of power with only a modest hit to performance.  With the ability to choose independent clock speeds for different games, you can readily throttle it back in games where the extra performance doesn't matter while letting the card go all out in games where you need the performance.
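The savings can be sketched with the usual dynamic-power approximation, P ∝ f·V². The assumption below that giving up the last 10% of clock also lets voltage drop by 10% is illustrative; the real V/f curve varies per chip:

```python
def dynamic_power(freq, volts, base_power=1.0):
    """Dynamic CPU/GPU power scales roughly as frequency * voltage^2
    (normalized so freq=1.0, volts=1.0 gives base_power)."""
    return base_power * freq * volts ** 2

full = dynamic_power(1.0, 1.0)
# Illustrative assumption: backing off the last 10% of clock lets
# voltage drop by 10% as well.
throttled = dynamic_power(0.9, 0.9)
print(f"{throttled / full:.0%} of the power for 90% of the clock")
```

So under these assumptions, giving up 10% of the clock cuts dynamic power by roughly a quarter, which is why turbo caps and frame-rate limiters save so much.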

    You could also look into improving your case airflow.  What case do you have?  It might be possible to add fans or increase the RPM on the fans that you already have.  Alternatively, it's possible that a dead case fan is the underlying problem.  Make sure that your fans are all spinning, which is obvious just from looking at them while the system is powered on.  Some fans will stop under low loads, especially power supply fans, so you may need to stress the system some to get all the fans to start.
  • Ridelynn Member EpicPosts: 7,383
    edited November 2021
    I use a driver-based FPS limiter on my nVidia card. It works well in conjunction with G-Sync. New World didn't melt my 3080 at any rate.

    It's also possible to adjust the power cap like AMD allows, but it's not nearly so straightforward. Limiting the FPS to your refresh rate accomplishes close to the same thing in most situations.

    That said - I have seen open-air coolers on GPUs affect air coolers on CPUs pretty significantly, especially if your case ventilation is struggling. With an AIO cooler the effect should be minor, and if you have the AIO fans set to pull in cool outside air it should be minimal at most.
  • laserit Member LegendaryPosts: 7,591
    Quizzical said:
    laserit said:
    olepi said:
    "Maybe my 3090 is faulty? Throwing out too much heat?"

    My AMD 6800 XT GPU will overheat with several games, including NW, if I don't limit the framerate. On the beta test for Ship of Heroes, it got to 200 FPS (!), and was over 100C.

    Limiting FPS to 70 or so solves that problem. Are you limiting frames per second?
    I can keep all temperatures in the green if I throttle things down. Thing is that's not why I purchased a 3090. I purchased one because I have a couple applications where I really need the horsepower.

    I'm just surprised at how hot the system runs at stock with everyday run of the mill pc games.

    Just so different from my usual experience. 
    AMD's GPU drivers let you limit the GPU turbo, and you can make different choices for different games.  I'm not sure if Nvidia's drivers offer something analogous, but you might want to check.  A lot of times, once one GPU vendor offers something, the other feels the need to offer the same thing to compete.

    The last 10% of the clock speed uses a lot more than 10% of the power, as it ramps up voltage to increase the clock speed, too, so throttling back the clock speeds can save a lot of power with only a modest hit to performance.  With the ability to choose independent clock speeds for different games, you can readily throttle it back in games where the extra performance doesn't matter while letting the card go all out in games where you need the performance.

    You could also look into improving your case airflow.  What case do you have?  It might be possible to add fans or increase the RPM on the fans that you already have.  Alternatively, it's possible that a dead case fan is the underlying problem.  Make sure that your fans are all spinning, which is obvious just from looking at them while the system is powered on.  Some fans will stop under low loads, especially power supply fans, so you may need to stress the system some to get all the fans to start.
    The case is the easy part for me. Made some pretty cool server racks (the metal parts) back in the day, before everything got so corporate ;)

    I can put an external loop in and it should take care of the problem even when I want to crank things up for extended periods.  I just never had to do it with a stock system with no OC before.


    "Be water my friend" - Bruce Lee

  • olepi Member EpicPosts: 3,053
    edited November 2021
    Quizzical said:

    Apart from sub-ambient coolers, which are rather exotic for everyday use, the job of a CPU or GPU cooler is to reduce the temperature difference between the CPU or GPU and the surrounding ambient air temperature.  If you're blowing very hot air at the cooler, then a good cooler that keeps the temperature difference down to something reasonable may well still allow the chip to overheat.



    This is the problem I have. I live in the mountains where it is cool most of the year. So I don't have any air conditioning. The hottest it might get is 80°F in the summer for a month or two. It does cool down at night. The air is also kind of thin above 8,000 feet, and dry.

    80°F doesn't sound very warm, but that means the ambient air the cooler has to work with is already 80°F. It is hard to cool down my system with that. I'm always fighting heat problems in the summer.

    Of course, in the winter, it is a different story.
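    As a back-of-the-envelope sketch of that point (the thermal resistance and power draw here are just illustrative assumptions, not specs for any particular cooler or card):

```python
# Back-of-the-envelope: steady-state chip temperature is roughly
# ambient + thermal_resistance * power.  The 0.18 C/W figure and the
# 350 W load are illustrative assumptions, not measured values.

def chip_temp_c(ambient_c, power_w, theta_c_per_w=0.18):
    """Approximate steady-state chip temperature for an air cooler."""
    return ambient_c + theta_c_per_w * power_w

cool_room = chip_temp_c(ambient_c=20, power_w=350)   # ~68 F room
warm_room = chip_temp_c(ambient_c=27, power_w=350)   # ~80 F room
print(f"20 C ambient: {cool_room:.0f} C at the chip")
print(f"27 C ambient: {warm_room:.0f} C at the chip")
```

    The cooler only controls the difference above ambient, so every degree of warmer intake air shows up one-for-one at the chip.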


    ------------
    2024: 47 years on the Net.


  • laserit Member LegendaryPosts: 7,591
    olepi said:
    Quizzical said:

    Apart from sub-ambient coolers, which are rather exotic for everyday use, the job of a CPU or GPU cooler is to reduce the temperature difference between the CPU or GPU and the surrounding ambient air temperature.  If you're blowing very hot air at the cooler, then a good cooler that keeps the temperature difference down to something reasonable may well still allow the chip to overheat.



    This is the problem I have. I live in the mountains where it is cool most of the year. So I don't have any air conditioning. The hottest it might get is 80°F in the summer for a month or two. It does cool down at night. The air is also kind of thin above 8,000 feet, and dry.

    80°F doesn't sound very warm, but that means the ambient air the cooler has to work with is already 80°F. It is hard to cool down my system with that. I'm always fighting heat problems in the summer.

    Of course, in the winter, it is a different story.

    You have to think of the airflow like a stream with back eddies. You want to move all the air you can. It doesn't have to be cold air; 80 degrees is okay, since it's a lot cooler than what it's moving. It just needs to move, and the fewer the dead pockets the better. What matters is the CFM (cubic feet per minute), and that you don't just have one clean stream that goes in one side and out the other. The best situation is when all the air moves. You can see this with smoke.
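    As a rough idea of how much CFM it takes to carry heat out of a case (the air-density and heat-capacity figures are textbook sea-level values; the 500 W load and 10 C rise are just illustrative assumptions):

```python
# How much airflow does it take to carry heat out of a case?
# Uses Q = m_dot * c_p * dT with sea-level air density ~1.2 kg/m^3
# and c_p ~1005 J/(kg*K).  At 8,000 feet the air is thinner, so the
# same CFM carries noticeably less heat.  All figures illustrative.

def cfm_needed(watts, delta_t_c, air_density=1.2, cp=1005.0):
    """CFM of airflow needed to remove `watts` with a `delta_t_c` rise."""
    m3_per_s = watts / (air_density * cp * delta_t_c)  # volumetric flow
    return m3_per_s * 2118.88  # convert m^3/s to cubic feet per minute

# A ~500 W system, letting case air run 10 C above intake:
print(f"{cfm_needed(500, 10):.0f} CFM")
```

    That is well within what a few decent case fans can deliver - provided the air actually sweeps the whole case instead of short-circuiting past the dead pockets.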

    "Be water my friend" - Bruce Lee
