On Mon, Mar 07, 2005 at 01:33:56PM -0800, Brian Mury wrote:
> What happens to all the energy being used by my computer? It has to go
> somewhere. It ends up as heat. Your light example is an easy one - a
> lightbulb generates heat and light. The light gets absorbed by objects
> (mostly the dark coloured ones) as heat. My motor in my hard drive is
> using just enough energy to replace that which is lost to friction -
> which generates heat. The energy used to move electrons inside a
> circuit ends up as heat.
>
> An electric heater is about the only thing that is 100% efficient. If
> you are heating with electricity, and turn on any electric device, you
> should, in theory, break even. If you are heating with anything else,
> turning on any electric device means you are, in theory, actually
> using less energy (of course, the cost will depend on the relative
> costs of electricity and whatever you use for heat).

An electric heater is not 100% efficient, and the components in it are
*made* to be efficient for the purposes of heating. (I'm not even sure
what 100% efficient means when converting electricity to heat - I would
think it would mean 100% electricity in = 100% heat out - which is not
the case at all for any sort of electrical heater that I am familiar
with.)

Computers, on the other hand, are specifically designed to *not* get
hot. Running at the speeds that they do, over tiny little conduits,
this is a losing battle - nonetheless, it is a battle that *is* being
waged by hardware designers around the world every single day. The
electricity is being controlled, and as much of it as possible *leaves*
the computer through your electrical socket without being converted to
heat.

One might as well argue that a traditional electric stove could heat a
house. It's ridiculous. The device is not designed to heat a house, so
even if you put dozens of them in a room, it wouldn't have the desired
effect.
Ask somebody who knows - I'm in Ottawa, Canada, and we have snow
falling in March. I have a foot of snow in my driveway to clear when I
get home. Although I do leave the stove door open after I shut it off
to release the heat, nothing convinces me that I'm "breaking even" by
doing this. I'm recovering a fraction of the cost required to heat my
stove by allowing the heat to escape directed towards the air in the
house (as opposed to directed into the walls, and so on).

Your argument is pretty flimsy. Of course - perhaps that is your point?
Perhaps you are trying to convince people that leaving a computer on
when it isn't being used is ridiculous, and you are using sarcasm and a
ridiculous position to make your case?

Cheers,
mark

P.S. I've heated my house with electricity before - in the last house I
lived in, the electricity bill was $350/month in the coldest month. How
many computers do you think I would need running 24/7 to heat my house
when it is -30 C outside (-22 F for you Americans :-) )? I'm thinking
at least 200, but I'm too lazy to calculate. At 200 computers, the cost
to run each computer would have to be $1.75/month or less to break
even. I think people who live in a warm climate, and who notice even
slightly more heat in their house, are under the impression that
computers are actually heating devices. :-)

-- 
mark@xxxxxxxxx / markm@xxxxxx / markm@xxxxxxxxxx
Neighbourhood Coder
Ottawa, Ontario, Canada
One ring to rule them all, one ring to find them, one ring to bring
them all and in the darkness bind them...
http://mark.mielke.cc/
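For anyone who wants to finish the "too lazy to calculate" estimate in
the P.S., here is a back-of-envelope sketch. Only the $350/month bill
and the 200-computer guess come from the message; the electricity rate
and the per-machine power draw are assumptions plugged in purely for
illustration.

```python
# Back-of-envelope estimate of the P.S. arithmetic.
# BILL and the 200-computer figure are from the email;
# RATE and WATTS_PER_PC are assumed values, not from the email.
BILL = 350.0         # $/month, coldest month (from the email)
RATE = 0.08          # $/kWh -- assumed electricity rate
WATTS_PER_PC = 150   # average continuous draw per computer -- assumed
HOURS = 30 * 24      # hours in a month

kwh_per_month = BILL / RATE                # energy the heating bill buys
heat_watts = kwh_per_month * 1000 / HOURS  # average heating power, in W
n_computers = heat_watts / WATTS_PER_PC    # PCs needed to match it

print(f"average heating power: {heat_watts:.0f} W")
print(f"computers needed to match it: {n_computers:.1f}")
print(f"break-even cost at 200 PCs: ${BILL / 200:.2f}/month each")  # $1.75
```

With these assumed numbers the answer comes out far below 200 machines,
but the $1.75/month break-even figure follows directly from the email's
own $350 and 200; change RATE and WATTS_PER_PC to taste.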