Electronic – Measuring leakage current in a capacitor

capacitor, leakage-current

I've recently been hired in a lab at my university to continue research on supercapacitors, specifically leakage resistance. The former student who set up the current tests was a chemist, so the professor wants me to verify all of the testing procedures.

In my research about measuring capacitor leakage, manufacturers of electrometers (and other high-input-impedance instruments) say that you must apply a constant voltage to the capacitor to measure Rleak, and that the measured current will subsequently decay over time. My question is: why should a constant voltage be applied, and why would the measurement decay? Wouldn't Rleak be constant? It seems to me a better test would be to fully charge the cap, then disconnect the power supply and measure how much current is leaking as a function of time.
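For what it's worth, the disconnect-and-watch approach described above is the self-discharge method: with the supply removed, the capacitor discharges through its own leakage resistance, so the voltage follows V(t) = V0·exp(−t/(R·C)) and Rleak can be backed out from two voltage readings. A minimal sketch (the capacitance and voltage numbers here are made up for illustration):

```python
import math

def leak_resistance_from_decay(v0, v1, t, c):
    """Estimate leakage resistance from self-discharge.

    Assumes the capacitor discharges only through its own leakage
    resistance, so V(t) = v0 * exp(-t / (R * C)).
    v0, v1 -- voltage (V) at the start and after t seconds
    t      -- elapsed time (s)
    c      -- capacitance (F)
    """
    # Solve v1 = v0 * exp(-t / (R * c)) for R
    return t / (c * math.log(v0 / v1))

# Hypothetical example: a 1 F supercap sags from 2.7 V to 2.5 V over one hour.
r_leak = leak_resistance_from_decay(2.7, 2.5, 3600, 1.0)
print(f"R_leak ~ {r_leak:.0f} ohms")  # on the order of 47 kOhm
```

One practical caveat: whatever voltmeter is left across the cap adds its own input resistance in parallel with Rleak, which is exactly why the manufacturers push high-input-impedance electrometers for this measurement.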

Any thoughts on this?

Best Answer

I suppose I will answer my own question. A constant voltage is applied because you want to observe how the circuit behaves once the capacitor is fully charged. Theoretically there should be no current once the cap is fully charged, but with a sufficiently accurate instrument you will observe that a small current still flows. This small current is the leakage current, and the insulation resistance can then be calculated using Ohm's law.
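The Ohm's law step above is trivial but worth writing down; a sketch with made-up readings (the actual applied voltage and measured current would come from your test setup):

```python
def leak_resistance(v_applied, i_leak):
    """Insulation (leakage) resistance via Ohm's law, from the steady
    residual current measured while a constant voltage holds the
    capacitor fully charged: R = V / I."""
    return v_applied / i_leak

# Hypothetical reading: 2.7 V held across the cap, electrometer shows 50 uA.
print(leak_resistance(2.7, 50e-6))  # 54000.0 ohms
```

Note that the measured current takes a while to settle: right after charging, dielectric absorption makes the current larger than the true leakage, so the datasheet-style procedure is to hold the voltage and wait for the reading to stabilize before applying R = V/I.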