Electronic – Power factor correction consequences


From what I know so far (might be wrong), an ideal or unity power factor means that all the energy is consumed by resistive components and converted to heat, light, etc. In this case, apparent power and real power are equal.
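As a quick numeric sketch of that statement (all values are made up for illustration): for a purely resistive load, the phase angle between voltage and current is zero, so apparent power (V·I) and real power (V·I·cos φ) come out identical.

```python
import math

# Hypothetical 230 V RMS supply driving a purely resistive 23-ohm load.
v_rms = 230.0
r = 23.0

i_rms = v_rms / r                           # 10 A through the resistor
apparent_power = v_rms * i_rms              # S = V * I, in volt-amperes
real_power = v_rms * i_rms * math.cos(0.0)  # P = V * I * cos(phi); phi = 0 for a resistor

print(apparent_power)  # 2300.0 VA
print(real_power)      # 2300.0 W -> power factor = P / S = 1.0
```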

On the other hand, if reactive power exists in the system, we have stored energy that is not used. That is why power factor correction is used to get rid of the reactive power (stored power that just sits there and does nothing). Power factor correction introduces other storage units that exchange current back and forth with the existing storage elements, so less current is drawn from the power supply.
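A common concrete form of this is placing a capacitor across an inductive load. The sketch below sizes such a capacitor using the standard relation Qc = P·(tan φ1 − tan φ2); the load, voltages, and target power factor are all illustrative assumptions, not values from the question.

```python
import math

# Illustrative numbers: a 10 kW inductive load at power factor 0.70,
# to be corrected to 0.95, on a 400 V RMS, 50 Hz supply.
p = 10_000.0           # real power in watts
v_rms = 400.0
f = 50.0

phi1 = math.acos(0.70)  # phase angle before correction
phi2 = math.acos(0.95)  # target phase angle after correction

# Reactive power the capacitor must cancel: Qc = P * (tan(phi1) - tan(phi2))
q_c = p * (math.tan(phi1) - math.tan(phi2))

# Capacitance supplying Qc at this voltage and frequency: C = Qc / (2*pi*f*V^2)
c = q_c / (2 * math.pi * f * v_rms**2)

print(round(q_c))      # about 6915 VAR to cancel
print(round(c * 1e6))  # about 138 microfarads
```

The capacitor stores and returns the energy the inductance sloshes each cycle, so that current circulates locally instead of being drawn from the supply.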

My question is whether this statement, which I conclude from power factor correction, is correct: "By correcting the power factor, less energy is used in the circuit and hence less current enters the circuit. This would result in less power loss since the current has decreased."

If this statement is not true, then what is the point of power factor correction?

Best Answer

By correcting the power factor, less energy is used in the circuit and hence less current is entering the circuit. This would result in a lower electricity bill and less power lost since the current has decreased

No. Assuming a fixed voltage (very low impedance) supply, like the power grid, low power factor causes more current than drawing the same power at unity power factor. However, you get billed for the real power consumed, not the "reactive" power.
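The current difference is easy to see from I = P / (V · PF). With made-up numbers for a fixed-voltage supply:

```python
# Same real (billed) power drawn at two power factors from a fixed 230 V RMS supply.
p = 2300.0      # watts of real power consumed
v_rms = 230.0

i_unity = p / (v_rms * 1.0)  # current at power factor 1.0
i_low   = p / (v_rms * 0.5)  # current at power factor 0.5

print(i_unity)  # 10.0 A
print(i_low)    # 20.0 A -- twice the current for the same billed power
```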

Power companies don't like low power factors because the higher current for delivering the same power causes waste and stresses on their system, and reduces their capacity to deliver real power.

A typical small scale user, like a house, isn't going to save money by presenting a better power factor. Again, you get billed for real power used. Large industrial customers get billed for real power, but there are penalties added for low power factor. The power factor is monitored, and they get charged extra based on the worst power factor during the month, averaged over a minute or hour or something. Again, this does not apply to ordinary residential customers.

if reactive power exists in the system we have stored energy that is not used

Not the way it seems you are thinking of it. At low power factor, you draw more energy than you use during part of the power cycle, then give it back during another part. The total energy drawn per cycle is still the same as with unity power factor, but a lot more energy gets sloshed back and forth. The sloshed energy averages to zero, so doesn't cost you anything, but all that sloshing causes inefficiencies and other problems for the power company.
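The sloshing can be shown by integrating instantaneous power over one cycle. In this sketch (illustrative peak values, power factor 0.5), the load draws energy during part of the cycle and returns some of it during another part, yet the net average still equals Vrms·Irms·cos φ:

```python
import math

# One 50 Hz cycle sampled finely: v(t) = Vp*sin(wt), i(t) = Ip*sin(wt - phi).
v_peak, i_peak = 325.0, 14.1   # illustrative peak voltage and current
phi = math.radians(60.0)       # power factor cos(60 deg) = 0.5
n = 100_000
w = 2 * math.pi * 50.0
period = 0.02                  # one 20 ms cycle
dt = period / n

energy_in, energy_back = 0.0, 0.0
for k in range(n):
    t = k * dt
    p_inst = v_peak * math.sin(w * t) * i_peak * math.sin(w * t - phi)
    if p_inst >= 0:
        energy_in += p_inst * dt    # energy drawn from the supply
    else:
        energy_back -= p_inst * dt  # energy pushed back into the supply

avg_power = (energy_in - energy_back) / period
expected = 0.5 * v_peak * i_peak * math.cos(phi)  # Vrms * Irms * cos(phi)
print(round(avg_power), round(expected))  # the sloshed energy cancels out
```

Only the net average shows up on the meter; `energy_back` is the "sloshing" the power company dislikes.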

if you do power factor correction, you would have less real power because the current has decreased. Therefore, you end up paying less

No. Power factor correction doesn't cause you to draw less real power. It minimizes the power you cause to slosh back and forth that you end up not using, but you still use the same power in the ideal case.

A little bit of the extra sloshing power will be lost in the wires on your side of the electric meter, so you get billed a tiny amount less. However, some schemes for doing power factor correction take a little power themselves to run, which costs you a little more. That's generally quite small too though.
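That "tiny amount" is just I²R loss in the customer-side wiring. With an assumed (purely illustrative) 0.1 Ω of wiring resistance:

```python
# Illustrative I^2*R loss in the customer's own wiring for the same 2300 W load.
r_wiring = 0.1   # ohms of wiring resistance on the customer's side (assumed)
v_rms = 230.0
p = 2300.0

for pf in (1.0, 0.7):
    i = p / (v_rms * pf)          # supply current at this power factor
    loss = i**2 * r_wiring        # watts dissipated in the wiring
    print(pf, round(loss, 1))     # 1.0 -> 10.0 W, 0.7 -> 20.4 W
```

Roughly double the wiring loss at PF 0.7, but still only tens of watts next to a 2300 W load, which is why residential power factor correction rarely pays for itself.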