How much heat does a computer give off?

I recently relocated my computers to a back room. The room already had minor issues with air conditioning (it is the furthest room from the central air unit, so the rest of the house ends up substantially cooler than this room). The room is now noticeably hotter, and it is hard to keep it cool without freezing the rest of the house. I have been closing vents in other parts of the house to compensate (trying to push more air to the back of the house and less to the front).

It got me wondering, though: how much heat are these computers throwing off? Could my insanely high electric bills be in part due to the fact that the air conditioner is running more to compensate for the heat from the computers? I was even thinking about ducting the exhaust from the computers out the window to see what effect it would have.

The amount of heat given off by a computer depends on several things:

Type and speed of the CPU
Type, size, and efficiency of the CPU cooler
Cleanliness of the CPU cooler
How many and what type of PCI cards you have installed
Case design and the number/type of case fans fitted
Ambient air temperature in the room
How hard the CPU is working (i.e. what process it is running)
Type of graphics card
Make and design of the motherboard

Mender–you’ve got me thinking. I saw a show on TV where a professor dropped a pen onto the table top and asked, “Where did the kinetic energy go?” The answer was that it was turned into heat within the table top.

So–I wonder if “all” the power being consumed by the PSU is ultimately turned into heat? That is, making no allowance for the creation of light and so forth, since the light ultimately gets absorbed and turned into heat too?

For instance, is the cleanliness of the CPU cooler irrelevant to the total, affecting only the amount of time it takes for all the generated heat to be transferred?

I ask because I was surprised at how much heat blows out the back of my computer, and if I calculate the number of watts consumed and equate it to 100-watt light bulbs, it seems pretty close.

Years ago I read an article about super-insulated homes and the fact that in deep winter they could be heated by the occupants’ body heat, normal lighting, plus five dogs. Today I’m thinking one computer could heat the whole place?

Well, I’ve just bought a thermometer today and have it tied with some string next to the rear exhaust vent on my computer, so I’ll tell you what mine is giving off soon. For specs, see sig.

Right this minute, after sticking the thermometer on the fan, it’s reading 29 degrees C (84 F).

Will give a reading of the rest of the room temp in a minute.

Depending on what’s in your computer and the peripherals, I would say 450-600 watts. Remember that each average human also gives off about 100 watts of waste heat.


My room is at 26 degrees C after playing CS: Source for 1 hour using 1 GPU. The exhaust fan, however, was blowing out air at 32 degrees C.

I have two computers: an Athlon XP 2400 (soon to be switched back to an overclocked Athlon XP 2500) and an overclocked Athlon 64 3000. I have 11 hard drives, most of which are in one machine or the other at any given time, and 5 optical drives that I currently use. Both are in Antec tower cases, though I’m guessing that doesn’t really matter. I have several different PCI cards, so all slots may be full at any given time. I’m kind of thinking that the hard drives might be throwing off the most heat when the computer is idle. I have noticed when changing stuff around, at times when I didn’t have proper case cooling (temporarily), the hard drives got pretty hot, even when idle. I’m sure the CPU is a major contributor too.

So the question is: might ducting the exhaust fans out the window make enough difference that I might see lower electric bills? They are, after all, basically heaters that run all the time. I even have a small blower motor that might serve to duct the air out.

I cannot stand central AC; I would NEVER be caught dead using it. Not only is it bloody noisy, it makes your house stink (carpets), the ducts require frequent cleaning and maintenance as bacteria accumulate (among other health risks), and it means a much higher electric bill. I use 2 AC units in separate rooms, including the computer room.

The computer generates a lot of heat, enough to keep me warm in the winter and too warm in the summer :smiley:

Just bought a portable unit (15,000 BTU) which I use to keep my computer room at 22 C.
Three machines can really warm a small room. Running DVD Rebuilder, one of my machines’ CPUs is currently reading 52 C (the max value I ever record); the exhaust reads 28 C,
the hard drives 32 C (idle) and 34 C (rebuilding), system zone 1 39 C, zone 2 42 C.

These are about the best figures I can get for a fully working machine, so air con is necessary.

Well, I got interested and have Googled for a few days on this issue. Indeed, my guess was correct: all the power consumed by the computer is ultimately turned into heat. The following web site has quite a few interesting facts. If I ever design my own house, the computer will be near an outside wall, and I will try to vent it to the outside air in the summer and vent it indoors in the winter. Maybe using water cooling and running the heat exchangers outside would be simpler in some systems? Cooling–good subject!

Watts (W) are also used to express heat output and cooling. One watt is equal to 3.412 Btu/hr. For example, if you use 100 watts of power, you generate 341.2 Btu/hr.

Air conditioning capacity is also measured in Btu/hr or watts. Large air conditioning systems are rated in tons. One ton of air conditioning is a unit of cooling equal to 12,000 Btu/hr or 3517 watts.
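The two conversions quoted above are easy to wrap in a couple of helper functions (a minimal Python sketch; the function names are my own):

```python
# Conversion constants taken from the figures quoted above.
BTU_PER_HR_PER_WATT = 3.412   # 1 watt = 3.412 Btu/hr
BTU_PER_HR_PER_TON = 12_000   # 1 ton of cooling = 12,000 Btu/hr


def watts_to_btu_per_hr(watts):
    """Heat output in Btu/hr produced by a given electrical load."""
    return watts * BTU_PER_HR_PER_WATT


def btu_per_hr_to_tons(btu_per_hr):
    """Cooling capacity in tons needed to remove a given heat load."""
    return btu_per_hr / BTU_PER_HR_PER_TON


print(watts_to_btu_per_hr(100))    # ~341.2 Btu/hr, as in the example
print(btu_per_hr_to_tons(12_000))  # 1.0 ton
```

So a room full of computers drawing 500 W dumps roughly 1,700 Btu/hr into the room, about a seventh of a ton of extra cooling load for the air conditioner.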

Ultimately this is always the case for electricity: it all ends up as heat.

Computers generally come with a 450-550 W power supply, drawing between 250-450 W sustained for a standard to high-end system, including losses in the power supply itself.
Generally, though, the system itself is about 200 W, plus an extra 200-450 W for your monitor (LCD approx 200 W, CRT approx 400 W).

Laptops, of course, use energy-efficient parts and generally lower-end gear, so they use about 50% of the power used by a desktop.

So looking at two desktop machines (or a desktop machine + a CRT monitor), approx 500W, that’s about 1/5 the heat output of a 2400W heater. Leaving them on for 24hrs is approximately equivalent to leaving your 2400W fan heater on for 5 hrs a day.
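The heater comparison is just an energy balance; here is a quick sketch using the figures from this post (500 W of computer gear vs. a 2400 W fan heater):

```python
# Daily energy from ~500 W of computer gear left on around the clock,
# expressed as equivalent run-time of a 2400 W fan heater.
pc_watts = 500
heater_watts = 2400
hours_per_day = 24

daily_energy_wh = pc_watts * hours_per_day             # 12,000 Wh per day
equivalent_heater_hours = daily_energy_wh / heater_watts

print(daily_energy_wh)           # 12000
print(equivalent_heater_hours)   # 5.0 hours of heater time per day
```

The same arithmetic works for any load: halve the wattage (say, for a laptop) and you halve the equivalent heater hours.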

debro–good post. What I “think” I have learned over the past 2-3 days is more universal==ultimately “everything” turns to “heat.” I got almost that “oceanic” feeling of coming into contact with the supernatural as I contemplated the kinetic devolution of the universe. “I reached out and touched the face of God–he was an ice cube.”

200 Watts for a monitor seemed high to me as I thought I had read a TV was about 50 watts, so I googled and confirmed what you posted here:

But more interesting: for some time I have daydreamed about getting some exercise while sitting in front of this beast, powering it all with a bicycle-driven generator, like that scene from “Soylent Green.” Turns out only Lance Armstrong on steroids could do it?

The doors of perception squeak open, then slowly close. /// bobbo.

:eek::eek::eek: how big is your monitor??!!

my 19" lcd is rated at 60W max and my old 17" crt was rated around 120W max

Dude! How dated is that link!
These days, it’s not unusual for a CPU to suck 70W and the video card to dissipate the same -> that’s 140W without taking into consideration HDs, optical drives, RAM, sound cards, misc PCI devices, or even just good old cheap Chinese PSU losses :wink:
It also doesn’t take into account high-end video cards with high clock rates, or dual cores!

I have a Dell 15" LCD sitting beside my 19" CRT.
The Dell 15" LCD is rated at 1.5A @ 100-240V, which is anywhere between 150-360W by my calculation :wink: I doubt that it requires 360W, as that would be more than my PC :wink:

The 19" I can’t pick up and flip over easily :frowning: (but I managed to pull it out anyway).
It’s rated at 2A @ 100-240V, which is anywhere between 200-480W.

Taking the best-case scenarios, that means the 15" LCD draws 150W and the 19" CRT draws 200W.
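For what it’s worth, the nameplate arithmetic being used here is just amps times the voltage range (a throwaway Python sketch with a made-up helper name; as later posts point out, this wildly overstates the real draw, since nameplate current is a worst-case figure):

```python
def nameplate_watt_range(amps, v_min=100, v_max=240):
    """Apparent power range implied by a nameplate rating.

    This is an upper bound, not the real draw: the nameplate amps are
    a worst-case maximum, so multiplying across the full voltage range
    overstates actual consumption considerably.
    """
    return amps * v_min, amps * v_max


print(nameplate_watt_range(1.5))  # (150.0, 360.0) - the 15" LCD
print(nameplate_watt_range(2.0))  # (200.0, 480.0) - the 19" CRT
```

Compare those ranges with the measured/manual figures quoted later in the thread (26-45 W for typical LCDs) to see how misleading nameplates can be.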

debro–I was just confirming what YOU SAID!!! ((and basically just about the TV/monitor, which is still valid??)) So, if you negate my (old) reference, you are negating YOURSELF!! Besides, I don’t think the science/mathematics changes over the years–only the efficiency of the units should improve, though maybe there’s a net gain in watts consumed/heat given off/cooling required as the units get bigger?

Your most recent post puts me in mind of something, though: how much power do we <<really>> need for our computers? I’ve run several different web-based calculators, and my Thermaltake 420-watt unit is 100 watts over what is needed. I think it does “ok” on the 12-volt rail, as I do have 8 hard drives. Now, it does ok until you add in the start-up surge (a 50% kicker)–that puts me over. Then add a factor for higher-than-standard room heat and over one year of age, and basically the calculators all tell me I need about a 700-800 watt unit???

Bottom line, though, is that my system runs quietly and reliably, so something is “off?”

Anyway, for this discussion, it all turns to heat, which is kind of a neat universal concept–and totally boring to any physicist who fully understood this in the 8th grade? But being confused by what everyone else already knows is what forums are all about?? GREAT POSTING, Debro–we need your info!! /// bobbo.

I don’t know how you calculated that, but a typical 17" LCD sucks :slight_smile: only 40W.

At the moment it’s 27 degrees C outside the room, 31 inside. Agreed, Oleh. My manual for my 17" says max power consumption is 45 watts.

My 19" Benq is rated at 40W.

70W (max) is confirmed for newer 19" LCD monitors.
I guess this is a lesson for us all. Buy Dell, be doomed to excessive power bills for the life of the product. Or perhaps older gear is just incredibly energy-inefficient. Upgrade often :wink: to save the environment? :confused:
But wait! The manual also specifies a power requirement of just 26W (max), while the nameplate specifies 1.5A @ 100-240V = 150-360W.

When you calculate the power requirements for a PC, you have to understand how the PC will be used. This is where the whole “it’s complicated” part comes in.

You are generally only using a fraction of the PC at any time.
When you are playing video games, you generally aren’t using all your HDs.
If you are using all your HDs and optical drives, you probably aren’t playing 3D games and maxing out your CPU, video card, RAM, etc.

To top it off, motors (HDs, optical drives, fans, floppy drives) have a larger power draw when they start up. Fans are also often speed-controlled, so they draw more power while changing speed.

So your actual requirement could be anywhere between 30-85% of the sum of your equipment’s ratings.
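That 30-85% rule of thumb can be sketched as a quick estimate (the component wattages below are illustrative assumptions, not measurements from any real machine):

```python
# Illustrative per-component ratings in watts (assumed values only).
ratings = {
    "cpu": 70,
    "video_card": 70,
    "hard_drives": 8 * 10,     # e.g. 8 drives at ~10 W each
    "optical_drives": 2 * 20,
    "motherboard_and_ram": 40,
    "fans_and_misc": 20,
}

total_rated = sum(ratings.values())

# Actual draw is only a fraction of the summed ratings, since the
# whole machine never runs flat out at once (30-85% as noted above).
low_estimate = 0.30 * total_rated
high_estimate = 0.85 * total_rated

print(total_rated)                   # peak if everything maxed at once
print(low_estimate, high_estimate)   # realistic envelope in watts
```

Even the high end of that envelope usually lands well under the PSU’s sticker rating, which is why most systems run fine on supplies smaller than the online calculators suggest.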

The other problem is the actual power ratings of the devices: under what circumstances did the manufacturer measure the power? Is it the same way you will use it?

Just as a comparison, a desktop machine will thrash an HD for a few seconds, followed by long periods of inactivity; a server may thrash the HD 24/7, or even just 8/5. Does the manufacturer specify the maximum instantaneous power used, or do they fudge the results and take an average of the power over a longer period?

And after all that, it turns out that cheap PSU vendors rate their PSUs for PEAK power output (exceed it and it’ll blow up), while quality vendors rate their PSUs for sustained power and also specify a peak output on the nameplate.

So if you aren’t confused yet … you are doing well :wink:

To top it off, as electrical devices age, materials evaporate or deteriorate and they don’t operate as efficiently as they did when new. Components draw more power, and the PSU can’t supply as much power… so one day, after everything has been working for a long time, the whole computer just stops working, because the now power-hungry PC is drawing more than the deteriorated PSU can deliver.