I'm trying to see how much electricity is required to power 'x' number of computers. I know it's a vague question because some computers draw more than others (e.g. different chipsets, HDDs, video cards, PSUs, etc.).
So, let's just assume it's a mum-and-dad Dell computer with average, run-of-the-mill parts. Nothing fancy. 20" LCDs.
This is to help calculate the generator power required to keep around 'x' computers running in a LAN. The real figure is in the hundreds, but I'm assuming I can just figure out the base draw for one machine and then multiply it by the number of seats.
I understand this doesn't include:
- Switches
- Servers
- Cooling (fans), etc.
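The per-seat approach above can be sketched as a quick back-of-the-envelope calculation. The wattage figures and safety margin below are assumptions for illustration only; measure your own machines (e.g. with a Kill A Watt) and substitute the real numbers:

```python
# Back-of-the-envelope generator sizing: per-seat draw times seat count,
# plus headroom. All wattage figures are illustrative assumptions.

PC_WATTS = 150        # assumed average desktop draw under typical load
LCD_WATTS = 40        # assumed 20" LCD at default brightness
SAFETY_MARGIN = 1.25  # assumed headroom for startup surge and estimation error

def generator_watts(seats: int) -> float:
    """Estimated continuous generator capacity for `seats` PC + LCD pairs."""
    per_seat = PC_WATTS + LCD_WATTS
    return seats * per_seat * SAFETY_MARGIN

if __name__ == "__main__":
    for seats in (50, 100, 200):
        print(f"{seats} seats: ~{generator_watts(seats) / 1000:.1f} kW")
```

Remember this deliberately excludes the switches, servers, and cooling listed above, so treat it as a floor, not a final generator spec.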
Best Answer
I did some stats on this a while ago, FWIW, using the handy-dandy Kill A Watt.
Typical Developer Dell PC
(2.13 GHz Core 2 Duo, 2 GB RAM, 10k RPM 74 GB main hard drive, 7200 RPM 500 GB data drive, Radeon X1550 video)
Standard Developer ThinkPad T60 Laptop
(2.0 GHz Core 2 Duo, 100 GB HDD, ATI X1400 video)
LCDs
It turns out that with LCDs, the default brightness level has a lot to do with how much power they draw. I almost immediately turn any LCD I own down to 50% brightness, just because my eyes are overwhelmed if I don't...
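If you want to fold brightness into the estimate, a simple sketch is to assume the panel's draw scales roughly linearly between a backlight-minimum floor and its full-brightness figure. Both endpoint wattages here are assumptions, not measurements:

```python
# Rough LCD draw vs. brightness setting: linear interpolation between an
# assumed minimum-backlight floor and an assumed 100%-brightness draw.

LCD_MIN_WATTS = 15.0  # assumed draw with backlight at minimum
LCD_MAX_WATTS = 45.0  # assumed draw at 100% brightness

def lcd_watts(brightness: float) -> float:
    """Estimated draw for a brightness setting in the range 0.0 to 1.0."""
    brightness = min(max(brightness, 0.0), 1.0)  # clamp to valid range
    return LCD_MIN_WATTS + brightness * (LCD_MAX_WATTS - LCD_MIN_WATTS)
```

So under these assumptions, dropping from 100% to 50% brightness would cut each monitor's draw by a third, which adds up quickly across hundreds of seats.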