Energy Absorption By Series Resistance During Cap Charging

Tags: energy, inrush-current, resistors, thermistor

This question formed while trying to size a NTC thermistor to limit the inrush current to a capacitor. Assume the schematic below (with a fixed resistance):

[Schematic: \$V_{in}\$ charging capacitor \$C\$ through switch SW1 and a fixed series resistor]

The total energy \$E\$ delivered to the capacitor by the inrush current is given by:

$$ E = \frac{1}{2} C V_{in}^2 $$

where:

- \$ C \$ is the downstream capacitance, in farads
- \$ V_{in} \$ is the input voltage, in volts
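As a quick numeric sanity check of that formula (the component values below are my own assumptions, not from the question):

```python
# Energy stored in a capacitor charged to V_in: E = 1/2 * C * V^2
def cap_energy(c_farads, v_volts):
    return 0.5 * c_farads * v_volts**2

# Assumed example: a 1000 uF bulk capacitor on a 24 V input
print(cap_energy(1000e-6, 24.0))  # 0.288 J
```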

Many sources (http://powerelectronics.com/community/how-do-you-choose-right-type-ntc-thermistor-limit-inrush-current-capacitive-applications, http://www.ametherm.com/inrush-current/) use this equation to work out how much energy the NTC thermistor will absorb at turn on (when SW1 is closed).

This is where I am confused. I thought this equation gives the energy that flows *through* the thermistor and is delivered to the capacitor. The NTC thermistor should absorb some additional energy on top of this, given by the integral of the product of the voltage drop across the thermistor and the current through it over the turn-on period (both of which change dynamically).

Assuming those sources are correct, can anyone explain why the energy stored in the capacitor equals the energy dissipated in the resistor?

Best Answer

The energy the battery supplies is the integral of \$V_{in} I(t)\$; since \$V_{in}\$ is constant, this is just \$V_{in}\$ times the total charge that left the battery during the charging time:

\$E_{battery} = V_{in} \cdot \int{I \cdot dt} = QV_{in}\$
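This can be checked numerically for the exponential charging current \$I(t) = \frac{V_{in}}{R} e^{-t/RC}\$ of a series RC circuit (the component values are assumptions for illustration):

```python
import math

def battery_energy(v_in, r, c, steps=200_000):
    # Integrate V_in * I(t) dt for I(t) = (V_in/R) * exp(-t/(R*C)),
    # out to 10 time constants, using the midpoint rule.
    tau = r * c
    dt = 10 * tau / steps
    e = 0.0
    for k in range(steps):
        t = (k + 0.5) * dt
        e += v_in * (v_in / r) * math.exp(-t / tau) * dt
    return e

v_in, r, c = 24.0, 10.0, 1000e-6
q = c * v_in                 # total charge delivered, Q = C * V_in
print(battery_energy(v_in, r, c), q * v_in)  # both come out to about 0.576 J
```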

All of that charge ended up on the capacitor (series elements carry the same current), and the energy in the capacitor can also be written in terms of charge, using \$Q = CV\$:

\$E_{capacitor} = \frac{1}{2}CV_{in}^2 = \frac{1}{2}QV_{in}\$

Subtracting the two gives the energy lost in the resistance during charging:

\$E_{battery} - E_{capacitor} = E_{resistance} = QV_{in} - \frac{1}{2}QV_{in} = \frac{1}{2}QV_{in}\$

which is the same as the energy stored in the capacitor, regardless of the value of the resistance.
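The resistance-independence can be verified directly by integrating \$I(t)^2 R\$ for several resistor values (a minimal sketch; the 24 V / 1000 µF values are assumptions):

```python
import math

def resistor_energy(v_in, r, c, steps=200_000):
    # Integrate I(t)^2 * R for the exponential charging current
    # I(t) = (V_in/R) * exp(-t/(R*C)), out to 10 time constants.
    tau = r * c
    dt = 10 * tau / steps
    e = 0.0
    for k in range(steps):
        t = (k + 0.5) * dt
        i = (v_in / r) * math.exp(-t / tau)
        e += i * i * r * dt
    return e

v_in, c = 24.0, 1000e-6
half_qv = 0.5 * (c * v_in) * v_in   # 0.288 J, the capacitor's stored energy
for r in (1.0, 10.0, 100.0):
    print(r, resistor_energy(v_in, r, c))  # about 0.288 J for every R
```

Changing \$R\$ only stretches or compresses the charging time; the total dissipated energy stays at \$\frac{1}{2}QV_{in}\$.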