On my gaming laptop I'm running anywhere between 100-130 watts. Both my CPU and GPU are running at 178F or 82C. Just curious what others are using, because I've seen some systems draw as much as 1kW but run a lot cooler than my own.
Comments
Watt usage has nothing to do with temperature. Temperature has more to do with cooling (either air or water) and how much dust has built up on the parts in your system. And generally, the older the building you live in, the more dust your system will collect. I'm living in a house that's over a century old and I need to clean the dust out of my PC every couple of months; if I don't, I'll notice the temperature of my PC go from average (CPU ~50C when fully loaded) to hot (60-65C for the CPU).
All of that, when stressing the computer to 100%, clocks in at just under 400W.
Power isn't the only thing that factors into temperature.
The change in internal energy of a system equals the inputs minus the outputs. Power is going in, some sort of work is coming out, and heat is coming out as a waste product (and entropy, but that's beyond the scope of this).
That means temperature depends on both the power input and the heat removal capacity. The higher the power, the higher the temperature; the better the heat removal, the lower the temperature. The final temperature will be whatever equilibrium the power input and the heat removal reach for a given state.
You could have a video card that uses 1W of power, and gets to well above 82C, depending on how the installation and heat sink are set up. Alternatively, it's possible for a 120,000,000W system to run at -271.3C
It's just a matter of applying the first law of thermodynamics (conservation of energy): energy cannot be created or destroyed, only transformed.
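To make that concrete, here's a minimal sketch of the equilibrium idea using a simple lumped thermal-resistance model (temperature = ambient + power × thermal resistance). The resistance values and ambient temperature below are made-up numbers, purely for illustration:

```python
# Minimal sketch of the equilibrium idea above, using a lumped
# thermal-resistance model: T_equilibrium = T_ambient + P * R_th,
# where R_th (degrees C per watt) captures how good the cooling is.
# All numbers below are hypothetical, just for illustration.

def equilibrium_temp(power_w: float, r_th_c_per_w: float, ambient_c: float = 25.0) -> float:
    """Steady-state temperature for a given power input and cooling quality."""
    return ambient_c + power_w * r_th_c_per_w

# Same 120 W of power, two very different coolers:
print(equilibrium_temp(120, 0.5))   # cramped laptop-style cooling -> 85.0 C
print(equilibrium_temp(120, 0.15))  # big desktop tower cooler     -> 43.0 C

# Tiny power with terrible cooling can still run hot:
print(equilibrium_temp(1, 60.0))    # 1 W with almost no heat sink -> 85.0 C
```

Same power in, wildly different temperatures out, which is the whole point.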
I can't remember if I had another card when I last measured it with a Kill A Watt, but the CPU was overclocked then as well, so it's probably around 350 W now.
Power isn't free. Most people have to pay an electric bill. Even if you don't, you're paying for it one way or another (it's tacked onto the rent, the dorm fee, your parents' bill, whatever).
The difference between 200 and 500 W may not sound like a lot in computer terms, but you turn out the lights when you leave a room, right? Going with your GIF, incandescents are only about 65 W, CFLs are about 18 W, and LEDs are down around 8 W, and we still worry about conserving that small amount of energy. Your computer is equivalent to lighting up a whole lot of light bulbs.
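For a sense of scale, here's the back-of-the-envelope math. The bulb wattages are the ones quoted above; the 4 h/day of use and the $0.12/kWh rate are assumptions:

```python
# Rough arithmetic behind the light-bulb comparison above.
# The 300 W delta is just the difference between a 200 W and a 500 W system.

extra_watts = 500 - 200
bulbs = {"incandescent": 65, "CFL": 18, "LED": 8}

for kind, w in bulbs.items():
    print(f"{extra_watts} W is about {extra_watts / w:.0f} {kind} bulbs")

# What that costs over a year of 4 h/day use at an assumed $0.12 per kWh:
kwh_per_year = extra_watts / 1000 * 4 * 365
print(f"~{kwh_per_year:.0f} kWh/year, ~${kwh_per_year * 0.12:.0f}/year")
```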
More Power means more heat.
More heat means more HVAC. For a lot of people with high-powered computers (500+ W of actual draw, not just PSU rating), that's the same heat output as a space heater, and enough to make a dent in the temperature of a normal-sized room. Maybe if you live in an Arctic climate that's a bonus; most people don't.
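A quick sanity check on the space-heater comparison, converting watts to BTU/hr; the heater and air-conditioner figures are typical values, assumed here for illustration:

```python
# Every watt of electrical draw ends up as a watt of heat in the room,
# and 1 W = 3.412 BTU/hr.

def watts_to_btu_per_hr(watts: float) -> float:
    return watts * 3.412

pc_heat = watts_to_btu_per_hr(500)             # ~1,706 BTU/hr
space_heater_low = watts_to_btu_per_hr(750)    # a heater on its low setting
small_window_ac = 5000                         # BTU/hr, a small window AC unit

print(f"500 W PC:     {pc_heat:.0f} BTU/hr")
print(f"750 W heater: {space_heater_low:.0f} BTU/hr")
print(f"The PC is {pc_heat / small_window_ac:.0%} of a small AC's capacity")
```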
More heat also means shorter silicon life. It accelerates breakdown and causes ICs to wear out more quickly.
Most ICs today are really constrained by power. There's a reason Intel shifted gears from the Pentium 4 to the Core architecture: they hit a power limit. The way to faster chips isn't just throwing more cores at the problem or cranking up the frequency; it's making your chips more efficient. Get the power per unit area (or volume) down, and you can pack more performance into the same area. That is where the bulk of speed increases in computing has come from in the past decade, and why die shrinks are such a big deal.
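A rough illustration of why efficiency wins, using the standard CMOS dynamic-power relation P ≈ C·V²·f; the 20% scaling figures are illustrative assumptions, not real process data:

```python
# CMOS dynamic power scales roughly as switched capacitance times
# voltage squared times frequency.

def dynamic_power(c: float, v: float, f: float) -> float:
    return c * v**2 * f

baseline = dynamic_power(c=1.0, v=1.0, f=1.0)

# Cranking frequency 20% costs ~20% more power (and usually needs more voltage too):
faster = dynamic_power(c=1.0, v=1.0, f=1.2)

# A shrink that drops capacitance and voltage ~20% each cuts power roughly in half,
# leaving headroom for more cores or higher clocks in the same power envelope:
shrunk = dynamic_power(c=0.8, v=0.8, f=1.0)

print(f"+20% clock: {faster / baseline:.2f}x power")   # 1.20x
print(f"20% shrink: {shrunk / baseline:.2f}x power")   # 0.51x
```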
More heat means more cooling required.
More cooling also means more noise. You can remove heat passively, but that runs into the next point - most solutions remove heat mechanically. That usually means a fan somewhere, and either more fans or faster fans, both of which add to the noise level. It also means more energy spent on cooling. For most systems a few fans aren't significant, but it's still power that isn't contributing directly to your computing, and it can start to add up once you get into the 6+ fans, water pumps, and whatever-else level of cooling.
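Some rough numbers on that cooling overhead; the per-fan and pump wattages are typical spec-sheet values, assumed here for illustration:

```python
# How much power the cooling itself consumes in a heavily cooled build.

fans = 6
watts_per_fan = 2.5        # a typical 120 mm case fan at full speed
pump_watts = 18            # a typical water-loop pump

cooling_watts = fans * watts_per_fan + pump_watts
print(f"Cooling overhead: ~{cooling_watts:.0f} W")          # ~33 W

# Small next to a 500 W system, but it's power (and heat, and noise)
# that does no computing at all.
print(f"That's {cooling_watts / 500:.0%} of a 500 W build's draw")
```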
And more cooling means more size, which means larger form factors are required. This goes hand in hand with the previous point - you can trade size for noise (and vice versa), but the size goes up rather quickly. This is the biggest challenge in SFF/laptop computing.
Now, I have seen some pretty ingenious geothermal heat sinks that can be near silent and avoid the HVAC issue, but you're looking at a lot of expense, effort, and other limitations to get to that level of cooling.
So there are a few reasons to care about Power.
3x Asus MG279Q (monitors)
Core i7-4790K
Radeon R9 Fury X
Seasonic Snow Silent 750 W
Think that sounds low for a Fury X? Well, the card didn't think Elsword was too demanding, so while the clock speed bounced around, it was generally near 330 MHz, compared to a nominal stock clock of 1050 MHz. For comparison, idle at the desktop is 300 MHz and measures 113 W at the wall.
It's possible that only two of the three monitors are included in the measurement, as two have battery backup, while the third uses a UPS port that doesn't get battery backup.
It always makes me chuckle when I hear people recommend those big 1kW+ PSUs for everyday use.
Oh, and I had an MSI laptop at some point. MSI techs need to go back to "thermal paste school." That may be your heat issue; worth getting it checked out.