Using Reactive vs Real Power to Manage Distribution System Voltages

reactive-power, real-power

This is my first post. The introduction of photovoltaics (PV) on the electrical distribution system changes real power flows, resulting in voltage fluctuations. Those voltage fluctuations can be mitigated in two ways:

a) real power

b) reactive power

My questions are below. I am hoping that somebody can provide the calculations in their responses, and explain those calculations if they are not simple.

1) Why is reactive power generally more effective than real power in returning voltage to system norms? (Assuming this is correct, please provide the calculation demonstrating this to be true)

2) When and why does this general rule stop being the case? Why do long feeders with different X/R ratios require more real power for voltage control? (Assuming this is correct, please provide the calculation demonstrating this to be true)

Best Answer

I'll offer an intuitive explanation instead. The calculations would seem to depend on specifics – maybe if you sketch a circuit that illustrates your situation it would be easier.
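That said, one widely quoted first-order result (my addition here, not a full derivation for your specific circuit) comes from modeling the feeder as a source behind a series impedance \(R + jX\). Injecting real power \(\Delta P\) and reactive power \(\Delta Q\) at the far end shifts the voltage there by approximately

$$\Delta V \approx \frac{R\,\Delta P + X\,\Delta Q}{V}$$

So on an inductance-dominated line (\(X \gg R\)) a kvar of reactive power moves the voltage roughly \(X/R\) times as much as a kW of real power, which speaks to your question 1. On a resistive line (low X/R, often cited for long distribution feeders with small conductors) the weighting flips and real power dominates, which speaks to your question 2.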

You have a single-phase power system from the utility. You want to dump PV energy onto the power line but also regulate it to standard line voltage.

How do you regulate voltage? Well, you can either dissipate energy by dumping power into a resistor, or store the energy in a reactive load. Clearly energy storage will be more efficient than dissipating the excess as heat, but the problem is that it takes time to store energy (e.g. to charge a capacitor).

So if there is a sharp transient you may not be able to store the excess energy quickly enough. On the other hand, you can start dissipating energy in the time it takes to switch a transistor.
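To put a rough number on that storage-speed limit (my own illustrative figures): the energy held in a capacitor is

$$E = \tfrac{1}{2}\,C V^2$$

so a bank that can absorb at most power \(P\) needs at least \(t \approx E_{\text{transient}}/P\) to soak up a transient carrying energy \(E_{\text{transient}}\), whereas a resistor starts dissipating within a single switching interval.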

If there is a transient and the X/R ratio is low (resistive), then the energy can be stored. If it is high (reactive), then the line's own energy storage will create a DC offset, so you need to dissipate real power.
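If it helps to see the X/R dependence numerically, here is a minimal Python sketch using the same \(\Delta V \approx (R\,\Delta P + X\,\Delta Q)/V\) approximation from above. The impedance values are invented purely for illustration, not taken from any real feeder:

```python
# Sketch: compare how far 1 kW of real power vs 1 kvar of reactive power
# moves the voltage at the end of a feeder, for two illustrative X/R ratios.
# Uses the first-order approximation dV ~= (R*dP + X*dQ) / V.
# All impedance values below are made up for illustration only.

V = 240.0  # nominal line voltage (V)

# (R, X) in ohms -- assumed values, chosen only to give round X/R ratios
feeders = {
    "X-dominated line (X/R = 5)":   (0.10, 0.50),
    "R-dominated line (X/R = 0.5)": (1.00, 0.50),
}

dP = 1000.0  # 1 kW real power injection
dQ = 1000.0  # 1 kvar reactive power injection

for name, (R, X) in feeders.items():
    dV_from_P = R * dP / V  # voltage shift if only real power is injected
    dV_from_Q = X * dQ / V  # voltage shift if only reactive power is injected
    print(f"{name}: 1 kW -> {dV_from_P:.2f} V, 1 kvar -> {dV_from_Q:.2f} V")
```

With these made-up numbers, 1 kvar moves the voltage about five times as much as 1 kW on the X-dominated line, while on the R-dominated line 1 kW is about twice as effective as 1 kvar. That flip is exactly what your second question is getting at.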