Electronics – High voltages vs. energy loss

Tags: energy, voltage

How do the high voltages of electricity coming from power plants also help in saving energy?

Best Answer

The power cables have some resistance. The power lost in the wires is P = R*I^2, where R is the resistance of the wires and I is the current passing through them. The power delivered to the load is P = V*I. From this you can see that if you double the voltage, you need only half the current to deliver the same power. But if you pass half the current through the same wires, you lose only a quarter of the power.
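A minimal sketch of that scaling in Python (the resistance and current values are made up, and the function name is mine):

```python
def wire_loss(resistance_ohm, current_a):
    """Power dissipated in the wires: P = R * I^2."""
    return resistance_ohm * current_a ** 2

R = 10.0             # wire resistance in ohms, illustrative
I_full = 454.0       # current at the lower voltage, amps
I_half = I_full / 2  # doubling the voltage halves the current

# Half the current gives a quarter of the loss:
print(wire_loss(R, I_half) / wire_loss(R, I_full))  # 0.25
```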

An example with made up exaggerated numbers:

Wire resistance is 10 Ohm (not a very long power line), and the power required is 100 kW. At 220 V, the required current is 100kW/220V ≈ 454A. The power lost in the wire is 10Ohm*(454A)^2 ≈ 2.061MW, and the voltage dropped across the wire is 10Ohm*454A = 4540V. So the power plant has to generate 4760V for the load to receive 220V. Power generated at the plant: 2.061MW + 0.1MW = 2.161MW. Efficiency is 0.1MW/2.161MW*100% ≈ 4.6%.
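The same arithmetic in code (the small differences from the figures above come from the current being rounded to 454 A there):

```python
R = 10.0        # wire resistance, ohms (illustrative)
P_load = 100e3  # power required at the load, watts
V_load = 220.0  # voltage at the load, volts

I = P_load / V_load                      # required current, ~454.5 A
P_wire = R * I ** 2                      # power lost in the wire, ~2.07 MW
V_drop = R * I                           # voltage dropped across the wire, ~4545 V
V_plant = V_load + V_drop                # voltage the plant must generate
efficiency = P_load / (P_load + P_wire)  # ~4.6%

print(f"I = {I:.1f} A, wire loss = {P_wire / 1e6:.3f} MW, "
      f"plant voltage = {V_plant:.0f} V, efficiency = {efficiency:.1%}")
```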

Now suppose the plant transmits at 330kV (and the voltage is stepped down to 220V very close to the load). The current needed is only 100kW/330kV ≈ 0.3A. The power lost in the wires is 10Ohm*(0.3A)^2 ≈ 0.9W, so the efficiency (ignoring power lost in the transformer) is 100000W/100000.9W*100% ≈ 99.9991%. Transformers are very efficient, about 98% for the big ones used in power distribution.
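Both cases side by side, with the same assumed 10 Ohm line and 100 kW load (transformer losses ignored, as above):

```python
R = 10.0        # wire resistance, ohms (illustrative)
P_load = 100e3  # power required at the load, watts

for V in (220.0, 330e3):                     # transmission voltage, volts
    I = P_load / V                           # current on the line
    P_wire = R * I ** 2                      # I^2 * R loss in the wires
    efficiency = P_load / (P_load + P_wire)  # fraction of generated power delivered
    print(f"{V:>9.0f} V: I = {I:8.3f} A, loss = {P_wire:12.1f} W, "
          f"efficiency = {efficiency:.4%}")
```

At 330 kV the loop prints a loss under one watt and an efficiency of about 99.9991%, matching the figures above.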

Now scale that up to, say, a power plant that generates 1GW and you will see why Tesla won the War of the Currents (hint: there were no efficient DC-DC converters back then).