The capacitor in a power supply following the bridge rectifier is not causing a phase shift in the current. Instead, it is contributing to harmonic distortion of the current waveform, which is a different method of creating a poor power factor.
Without the capacitor, conduction through the rectifier is fairly continuous. With the capacitor added, conduction occurs only in short pulses at the peaks of the voltage waveform.
Putting an inductor in parallel with such a load would solve nothing: you would now have a lagging current component PLUS a harmonic distortion component.
However, putting an inductor in series with such a load can have benefits. It will filter out the higher-frequency components of the harmonic distortion by forcing the rectifier diodes to switch on and off more slowly. Note that it will still cause a lag in the current relative to the voltage, but you can probably find a "sweet spot" value for the inductance that balances the two effects to create the best overall power factor.
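The distinction between distortion and phase shift can be made concrete numerically. Below is a minimal sketch, assuming a 50 Hz, 230 V RMS supply and an idealized rectifier-capacitor load that conducts only while the voltage is within 5% of its peak; all component values are illustrative, not from the original discussion.

```python
import numpy as np

# One cycle of a 50 Hz, 230 V RMS mains waveform (illustrative values).
t = np.linspace(0, 1 / 50, 100_000, endpoint=False)
v_peak = 230 * np.sqrt(2)
v = v_peak * np.sin(2 * np.pi * 50 * t)

# Idealized rectifier-capacitor load: current flows only in short pulses
# while the instantaneous voltage is within 5% of its peak.
i_pulsed = np.where(np.abs(v) > 0.95 * v_peak, np.sign(v) * 10.0, 0.0)

# A plain resistive load for comparison: current tracks the voltage.
i_resistive = v / 23.0

def power_factor(v, i):
    """PF = real power / apparent power, from sampled waveforms."""
    p_real = np.mean(v * i)
    s_apparent = np.sqrt(np.mean(v ** 2)) * np.sqrt(np.mean(i ** 2))
    return p_real / s_apparent

print(power_factor(v, i_resistive))  # ~1.0: no distortion, no phase shift
print(power_factor(v, i_pulsed))     # well below 1, despite zero phase shift
```

Even though the pulsed current is perfectly in phase with the voltage, its harmonic content inflates the RMS current and drags the power factor down, which is exactly the distortion mechanism described above.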
Correcting the power factor means less energy is used in the circuit, and hence less current enters the circuit. This would result in a lower electricity bill and less power lost, since the current has decreased.
No. Assuming a fixed voltage (very low impedance) supply, like the power grid, low power factor causes more current than drawing the same power at unity power factor. However, you get billed for the real power consumed, not the "reactive" power.
Power companies don't like low power factors because the higher current for delivering the same power causes waste and stresses on their system, and reduces their capacity to deliver real power.
A typical small scale user, like a house, isn't going to save money by presenting a better power factor. Again, you get billed for real power used. Large industrial customers get billed for real power, but there are penalties added for low power factor. The power factor is monitored, and they get charged extra based on the worst power factor during the month, averaged over a minute or hour or something. Again, this does not apply to ordinary residential customers.
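A quick sketch of why utilities care: delivering the same real power at a lower power factor requires proportionally more current, and resistive losses in the wiring grow with the square of that current. The supply voltage, load power, and line resistance below are assumed for illustration.

```python
# Same real power delivered at two power factors (illustrative numbers).
P = 10_000.0   # real power drawn by the load, watts
V = 230.0      # supply RMS voltage
R_line = 0.1   # assumed resistance of the distribution wiring, ohms

for pf in (1.0, 0.8):
    I = P / (V * pf)          # RMS line current
    loss = I ** 2 * R_line    # power wasted in the wires
    print(f"PF {pf}: {I:.1f} A, line loss {loss:.0f} W")
```

At PF 0.8 the current is 25% higher and the line loss about 56% higher, for exactly the same billed real power, which is why large customers get penalized for it.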
If reactive power exists in the system, we have stored energy that is not used.
Not the way it seems you are thinking of it. At low power factor, you draw more energy than you use during part of the power cycle, then give it back during another part. The total energy drawn per cycle is still the same as with unity power factor, but a lot more energy gets sloshed back and forth. The sloshed energy averages to zero, so doesn't cost you anything, but all that sloshing causes inefficiencies and other problems for the power company.
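The sloshing can be shown numerically. Here is a sketch assuming a 230 V, 10 A load with the current lagging by 60 degrees (values chosen for illustration): the instantaneous power goes negative for part of each cycle, meaning energy flows back to the grid, yet the average, which is what you are billed for, is just Vrms · Irms · cos φ.

```python
import numpy as np

f = 50.0
t = np.linspace(0, 1 / f, 100_000, endpoint=False)  # one full cycle
phi = np.deg2rad(60)                                # current lags by 60 degrees

v = np.sqrt(2) * 230 * np.sin(2 * np.pi * f * t)
i = np.sqrt(2) * 10 * np.sin(2 * np.pi * f * t - phi)

p = v * i                  # instantaneous power
print(p.min() < 0)         # True: energy flows back to the grid part of the time
print(np.mean(p))          # average (real) power ~ 230 * 10 * cos(60 deg) = 1150 W
```

The negative stretches of p are the energy being "given back"; they cancel in the average, so only the 1150 W of real power shows up on the meter, but all of that back-and-forth current still has to flow through the wires.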
If you do power factor correction, you would draw less real power because the current has decreased. Therefore, you end up paying less.
No. Power factor correction doesn't cause you to draw less real power. It minimizes the power you cause to slosh back and forth that you end up not using, but you still use the same power in the ideal case.
A little bit of the extra sloshing power will be lost in the wires on your side of the electric meter, so you get billed a tiny amount less. However, some schemes for doing power factor correction take a little power themselves to run, which costs you a little more. That's generally quite small too though.
The power factor (to be precise, the displacement power factor) isn't simply set to 0.8 or 0.9.
A generator is rated by the VA it can produce. If a resistor were placed across its terminals, the DPF and PF would both be 1. It is the load that draws reactive current, and the phase shift depends on the load. This can be compensated by adding reactive components so that the supply sees a DPF closer to unity, but that doesn't change the fact that the load itself still draws current at a non-unity DPF.
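That compensation can be sketched with phasors. The load values below are assumed for illustration (R = 10 Ω in series with L = 50 mH on 230 V, 50 Hz): a parallel capacitor sized to cancel the load's reactive power makes the supply current purely real (DPF ≈ 1), while the current through the load itself is unchanged.

```python
import math

V = 230.0                     # supply RMS voltage (phase reference)
w = 2 * math.pi * 50          # angular frequency, rad/s
R, L = 10.0, 0.05             # assumed series R-L load

Z_load = complex(R, w * L)
I_load = V / Z_load           # load current phasor; lags the voltage
dpf_load = R / abs(Z_load)    # displacement power factor of the load alone

# Size a parallel capacitor to cancel the load's reactive power:
# Q_load = V^2 * X / |Z|^2  and  Q_cap = V^2 * w * C.
X = w * L
C = X / (w * abs(Z_load) ** 2)

I_cap = 1j * w * C * V        # capacitor current leads the voltage by 90 degrees
I_supply = I_load + I_cap     # what the generator actually sees

print(dpf_load)               # well below 1
print(abs(I_supply.imag))     # ~0: supply current is in phase with the voltage
print(abs(I_load))            # unchanged: the load still draws reactive current
```

The supply-side DPF is corrected to unity and the supply current shrinks, but I_load is exactly what it was: the reactive current now circulates locally between the load and the capacitor instead of flowing all the way back through the generator.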