Electronic – Joule heating – transmitting power at higher voltages reduces resistive loss

heat, ohms-law, power, resistance

Electrical engineering hopeful here.

Can someone explain with math how transmitting power at higher voltages reduces resistive loss?

I know Joule's Law

Power dissipated in a resistance is P = I^2 * R

Let's say we have two identical direct current power lines – each 1,000 feet long with a resistance of 2 ohms.

One wire will run at 1,000 volts, and the other at 10,000 volts. If both have a load of 500 watts, how can it be shown that the higher-voltage line will experience less heating loss?

My botched attempt to solve this – I know half of every step I make is probably wrong:

wire 1: 1,000v
load = 500w, so 500w/1000v = .5a = appliance amps
1000v/.5a = 2,000 ohms
2000 + 2 = 2002 ohm serial resistance.
1000v/2002r = .49a circuit amps (less current)

wire 2: 10,000v
load = 500w, so 500w/10,000v = .05 ohms = appliance 2 resistance.
.05 + 2 = 2.05 ohm serial circuit resistance.
10,000/2.05 ohm = 4878a * 10,000v = 48,780,000 watts

Best Answer

Wire 1:

V = 1000V

P = 500W

I = P/V = 500W / 1000V = 0.5A

R_wire = 2 Ohm

Loss in wire 1 = I^2 * R = (0.5A)^2 * 2 Ohm = 0.5W

Wire 2:

V = 10000V

P = 500W

I = P/V = 500W / 10000V = 0.05A

R_wire = 2 Ohm

Loss in wire 2 = I^2 * R = (0.05A)^2 * 2 Ohm = 0.005W
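The two loss figures above can be reproduced with a short script. This is a minimal sketch using the numbers from the question (500 W load, 2-ohm wire); the function name is my own:

```python
# Approximate line loss: the load sets the current (I = P / V),
# and the wire then dissipates I^2 * R_wire.
def line_loss(load_w, volts, r_wire):
    i = load_w / volts       # load current in amps
    return i ** 2 * r_wire   # power burned in the wire, in watts

print(line_loss(500, 1_000, 2))    # wire 1: ~0.5 W
print(line_loss(500, 10_000, 2))   # wire 2: ~0.005 W
```

Note the scaling: ten times the voltage means one-tenth the current, and since loss goes as the square of current, one-hundredth the loss.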

UPDATE:

To work out the exact loss, assuming voltage is measured at start of wire:

(Schematic: voltage source V1 in series with wire resistance R1 and load resistance R2.)

Total current, I = V1 / (R1 + R2)

Load power, P2 = 500W = I^2 * R2 = (V1^2 R2) / (R1 + R2)^2

This equation can be rearranged into quadratic form and solved for R2. Once R2 is found, the loss in the wire is I^2 * R1.

To find it using the P = VI formula, calculate V2 = I * R2; the loss is then (V1 - V2) * I.
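As a sketch of that exact calculation (assuming the wire-1 numbers from the question: V1 = 1000 V, R1 = 2 ohm, 500 W load), solving the quadratic for R2 and checking the loss both ways:

```python
import math

V1, R1, P2 = 1000.0, 2.0, 500.0   # source volts, wire ohms, load watts

# From P2 = V1^2 * R2 / (R1 + R2)^2, rearranged into a quadratic in R2:
#   P2*R2^2 + (2*P2*R1 - V1^2)*R2 + P2*R1^2 = 0
a = P2
b = 2 * P2 * R1 - V1 ** 2
c = P2 * R1 ** 2
# Take the larger root: that's the physical load resistance (the smaller
# root would put nearly all the voltage drop across the wire instead).
R2 = (-b + math.sqrt(b ** 2 - 4 * a * c)) / (2 * a)

I = V1 / (R1 + R2)        # total current
loss_i2r = I ** 2 * R1    # loss via I^2 * R1
V2 = I * R2               # voltage actually reaching the load
loss_vi = (V1 - V2) * I   # same loss via (V1 - V2) * I

print(round(loss_i2r, 4))   # ~0.501 W, slightly above the 0.5 W estimate
```

The exact loss comes out a hair above the approximate 0.5 W, because the wire drop forces slightly more current to deliver the full 500 W to the load.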