# High voltage power lines


Why are high-voltage power lines used? Electricity is generated at 11,000 V, so is that a potential at the generating station, or a potential difference across the ends of the power line?

And if it is a potential, then I think we cannot apply the formula power = voltage × current.

I don't understand whether it is the potential difference or the potential.

There are essentially three voltages (potential differences) to consider:

(1) The voltage at the source end of the transmission line

(2) The voltage at the load end of the transmission line

(3) The difference which is the voltage drop along (between the ends of) the transmission line

For example, there may be 11kV at the source and, say, 10.9kV at the load. Assume a current of 10A (I do not know if these numbers are realistic).

For simplicity, assume there is no reactive power. Then, the power delivered by the source is

$$P_s = 11\,\mathrm{kV} \cdot 10\,\mathrm{A} = 110\,\mathrm{kW}$$

The power delivered to the load is

$$P_l = 10.9\,\mathrm{kV} \cdot 10\,\mathrm{A} = 109\,\mathrm{kW}$$

Thus, the power dissipated by the transmission line is

$$P_s - P_l = 1\,\mathrm{kW} = (11\,\mathrm{kV} - 10.9\,\mathrm{kV}) \cdot 10\,\mathrm{A}$$
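The bookkeeping above can be checked with a few lines of Python, using the example figures from the answer (remember the 10 A current is just an assumed round number):

```python
# Example figures from the answer: 11 kV at the source end,
# 10.9 kV at the load end, 10 A of line current.
V_source = 11_000.0  # V, voltage at the source end of the line
V_load = 10_900.0    # V, voltage at the load end of the line
I = 10.0             # A, current through the line

P_source = V_source * I           # power delivered by the source
P_load = V_load * I               # power delivered to the load
P_line = (V_source - V_load) * I  # power dissipated in the line

print(P_source, P_load, P_line)   # 110000.0 109000.0 1000.0
```

Note that \$P = VI\$ is applied three times, each time with the voltage and the power referring to the same two terminals.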

In each case, \$P = VI\$ holds. This is an elementary analysis just to give the basic idea of how to apply the power formula.

Here's a simple schematic where the resistance of the transmission line is modelled as a series resistor \$R_T\$ (in this example, \$R_T = 10\,\Omega\$).

Clearly, by Ohm's law, the voltage across the load \$V_L\$ is less than the voltage \$V\$ at the source since

$$V_L = V - I\cdot R_T = 11\,\mathrm{kV} - 10\,\mathrm{A} \cdot 10\,\Omega = 10.9\,\mathrm{kV}$$
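The Ohm's-law step can be sketched the same way; here \$R_T = 10\,\Omega\$ is the line resistance implied by a 100 V drop at 10 A, and the line loss can equivalently be computed as \$I^2 R_T\$:

```python
V = 11_000.0   # V, source voltage
I = 10.0       # A, line current
R_T = 10.0     # ohm, transmission-line resistance (modelled as a series resistor)

V_L = V - I * R_T    # voltage at the load end of the line
P_line = I**2 * R_T  # line loss, equivalently (V - V_L) * I

print(V_L, P_line)   # 10900.0 1000.0
```

The \$I^2 R_T\$ form makes the practical point explicit: for a fixed power delivered, raising the transmission voltage lowers the current, and the line loss falls with the square of that current.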

So, as stated earlier, there are three voltages to consider: the source voltage \$V\$, the load voltage \$V_L\$, and the voltage drop across the transmission line (end to end), \$V_T = V - V_L\$.