How to calculate cooling cost per server

physical-environment server-room

I'm attempting to figure out the power/cooling draw for some old servers that I'd virtualized.

Starting with the info in this question, I've taken the 880 W the old boxes appeared to be drawing (according to the APC logs) and converted it to roughly 3,000 BTU/hr, or 1 ton-hour of cooling every 4 hours.
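For reference, here's a quick sanity check of that conversion in Python, using the standard factors of 3.412 BTU/hr per watt and 12,000 BTU/hr per ton of cooling:

```python
# Quick check of the conversion above.
# Standard factors: 1 W = 3.412 BTU/hr, 1 ton of cooling = 12,000 BTU/hr.
server_draw_w = 880                      # from the APC logs

btu_per_hr = server_draw_w * 3.412       # ~3,003 BTU/hr
tons_continuous = btu_per_hr / 12_000    # ~0.25 ton of continuous cooling
hours_per_ton = 12_000 / btu_per_hr      # ~4.0 hours per ton-hour

print(f"{btu_per_hr:.0f} BTU/hr, {tons_continuous:.2f} tons continuous, "
      f"1 ton-hour every {hours_per_ton:.1f} h")
```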

At this point I'm scratching my head as to how to figure the cost of that cooling. I'm sure it depends on what I'm using to cool with. I'd imagine ambient temperature counts for something as well, but I'm not sure whether it's significant.

[edit] What I'm missing, I believe, is any idea of what 1 ton of cooling costs. Is that a sensible number to shoot for, or am I barking up the wrong tree? [edit]

In any case, any pointers on what info to gather next, what to do with it, or what's wrong with the above figuring (if applicable) are most welcome.

Best Answer

  1. Convert BTU/hr back to watts.
  2. Convert watts to kilowatts.
  3. Multiply the kilowatts by the hours of operation to get kWh, then multiply by whatever you currently pay the electric company per kWh.
  4. $$$ Profit $$$

You should end up with two sets of numbers: the electricity used by the systems (880 W in your case) and the electricity used to cool them (convert tons to BTU to watts to kWh). Add the two together, as in the sketch below.
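Here's a minimal sketch of those steps in Python. It assumes the simplification implied above, namely that the cooling gear draws roughly as much electricity as the heat it removes, and uses a hypothetical rate of $0.10/kWh; swap in your own draw, duty cycle, and rate.

```python
# Rough monthly cost estimate following the steps above.
# Assumptions (adjust to your own numbers):
#   - the servers draw a steady 880 W
#   - the cooling draws about as much electricity as the heat it removes
#   - a hypothetical rate of $0.10 per kWh
server_kw = 880 / 1000           # steps 1-2: watts -> kilowatts
cooling_kw = server_kw           # electricity to remove that heat (assumed 1:1)
rate_per_kwh = 0.10              # check your electric bill
hours_per_month = 24 * 30        # servers running around the clock

kwh_per_month = (server_kw + cooling_kw) * hours_per_month
cost_per_month = kwh_per_month * rate_per_kwh     # step 3

print(f"~{kwh_per_month:.0f} kWh/month, about ${cost_per_month:.2f}/month")
```

With those assumed numbers this works out to roughly 1,270 kWh and about $127 a month; the real figure scales directly with your actual rate and how efficient your cooling is.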

You have to account not only for the electricity used to power the systems, but also for the electricity used to cool them.

You computed the cooling you need, not necessarily the cooling you provide; these can be two separate numbers. If your ambient temperature is around 70°F, the two are close enough.