Electronic – Why does a 240 volt outlet save money on electricity cost compared to a 120 volt outlet

I live in the US, where most outlets run at 120 volts. I have heard that devices that support the 240 volt standard may cost less to run in electricity.

Can someone explain how simply binding two 110 V circuits together into a 240 V circuit creates a savings in money or electrical power?

I vaguely know of AC "phases". When a 240 V circuit is created out of two 110 V circuits, what happens w.r.t. the phases in the circuits?

Best Answer

Don't believe everything you hear. In the US (and probably most other places too), you get billed for energy used. Watts are Watts, whether you consume 1.1 kW by drawing 10 A from 110 V or 5 A from 220 V makes no difference to how much energy you use or what you are charged.
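
To make that concrete, here is a small Python sketch of the billing arithmetic, assuming a made-up rate of $0.15 per kWh:

```python
# Billing is per kilowatt-hour of energy, so how the power is split
# between volts and amps is irrelevant.
RATE = 0.15                          # dollars per kWh (assumed for illustration)

for volts, amps in [(110, 10), (220, 5)]:
    power_kw = volts * amps / 1000   # P = V * I
    cost_per_hour = power_kw * RATE
    print(f"{volts} V at {amps} A -> {power_kw:.1f} kW, ${cost_per_hour:.3f}/hour")

# Both cases print 1.1 kW and the identical cost per hour.
```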

Your house is run from a center-tapped transformer secondary. Across the ends, this secondary produces 220 V. That means from the center tap to each end is 110 V. All three wires are run into your house on big fat cables. The center tap is grounded close to where it enters the house, which should also be close to the breaker panel. This becomes the neutral for the 110 V circuits. The other side for all 110 V circuits is one of the secondary ends, which are switched by the breakers. 220 V circuits, like for an electric dryer, water heater, or range, will be across the two ends with a breaker that can disconnect each end. While there are two breakers for such circuits, they will be ganged together in a single unit. Sometimes this is obvious as they look like two breakers with a bar across the two switches so that they are always turned on or off together. If one pops, it will also turn off the other.
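
Since you asked about phases: here is a sketch, assuming ideal 60 Hz sine waves, of what the two ends of the center-tapped secondary look like. Relative to the grounded center tap, the two hot legs are the same sine wave inverted (180° apart), so their difference is twice the leg voltage:

```python
import math

V_LEG = 110                # RMS volts from the center tap to each end
F = 60                     # line frequency in Hz

def leg_a(t):
    # One end of the secondary, measured from the grounded center tap
    return V_LEG * math.sqrt(2) * math.sin(2 * math.pi * F * t)

def leg_b(t):
    # The other end of the same winding: the same wave, inverted
    return -leg_a(t)

t = 1 / (4 * F)                       # a quarter cycle in, where the sine peaks
print(round(leg_a(t), 1))             # about +155.6 V
print(round(leg_b(t), 1))             # about -155.6 V
print(round(leg_a(t) - leg_b(t), 1))  # about 311.1 V peak = 220 V RMS end to end
```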

The reason for going into all this detail is so that you can understand the difference between 110 V and 220 V circuits. Both are fed by the same transformer secondary. The primary can't tell which of the two halves of the secondary the power was drawn from, or if it was drawn as 220 V by using both together.

There is one advantage to 220 V circuits, which is why they exist at all. With 220 V you only need half the current to get the same power as with 110 V. Normally that doesn't matter much. But with high power appliances, the current required at 110 V would make the wires uneconomically thick.
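
For example, take a hypothetical 5.5 kW appliance (the exact number doesn't matter):

```python
# Same power drawn at two voltages: the current halves at 220 V.
POWER = 5500                 # W, a hypothetical high-power appliance

for volts in (110, 220):
    amps = POWER / volts     # I = P / V
    print(f"{volts} V -> {amps:.0f} A")

# 110 V -> 50 A, 220 V -> 25 A
```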

At really high currents, you might actually notice the power lost in the wires between the breaker panel and the appliance. Those wires have some finite resistance, and the power lost in them goes with the square of the current thru them. Therefore, appliances that require high current not only require expensive and thick wires, but cause more waste in those wires. This is why for high current devices only, you go thru the trouble of using 220 V. That requires half the current, and therefore would cause 1/4 of the power dissipation in wires of the same size.
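
Continuing with the same hypothetical 5.5 kW appliance, and assuming a made-up round-trip wire resistance of 0.1 Ω:

```python
# Power wasted in the branch wiring is I^2 * R.
R_WIRE = 0.1                 # ohms round trip (assumed for illustration)
POWER = 5500                 # W, same hypothetical appliance

for volts in (110, 220):
    amps = POWER / volts
    loss = amps**2 * R_WIRE  # P_loss = I^2 * R
    print(f"{volts} V: {amps:.0f} A, {loss:.0f} W lost in the wires, "
          f"{100 * loss / POWER:.1f}% of the load")

# 110 V: 50 A, 250 W lost (4.5%); 220 V: 25 A, 62 W lost (1.1%).
# Half the current, one quarter the wire loss.
```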

In most installations however, wires are sized by the current they carry, regardless of the open-circuit voltage at the end. The limiting factor is the amount of heat a wire is allowed to dissipate per unit length. The electrical code specifies the maximum current for each wire gauge, based on all those factors and with a considerable amount of margin built in. You don't want houses almost catching fire at full load. For example, #10 copper wire is rated for 30 A if I remember right. So in reality, 220 V circuits allow for thinner wire, which costs less and is easier to work with.
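
As a rough sketch of how that sizing works (the ampacity numbers below approximate the common NEC 60 °C copper column and are illustrative only, don't use them for real wiring):

```python
# AWG gauge -> approximate max continuous amps (illustrative values only)
AMPACITY = {14: 15, 12: 20, 10: 30, 8: 40, 6: 55}

def thinnest_gauge(power, volts):
    """Return the thinnest (highest AWG number) wire rated for the load."""
    amps = power / volts
    candidates = [g for g, a in AMPACITY.items() if a >= amps]
    return max(candidates) if candidates else None

for volts in (110, 220):
    print(f"{volts} V -> #{thinnest_gauge(5500, volts)} AWG")

# 110 V needs 50 A and thus heavy #6 wire; 220 V needs 25 A and gets by with #10.
```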

So technically, devices that run from a higher voltage cause less wasted power in the wires. That could be the basis of the claim that 220 V devices cost less to operate, but it's far fetched. The power lost in the wires is a tiny fraction of the power used by the device, so this is at best a very small advantage. There are other advantages to using a lower voltage like 110 V, such as reduced shock hazard. This is why most things run from 110 V except those that need unusually high power.