Electronic – Charging an ideal capacitor through an ideal diode, forever

capacitor, diodes, ideal

Some context for why I am asking the question:

I have an application in which I need to charge a supercapacitor from a voltage source whose voltage equals the cap's rated voltage. I therefore considered placing a diode in series with the voltage source to reduce the voltage seen by the capacitor. This had me thinking about how such a real circuit would behave, and how its ideal counterpart would.

The ideal problem

In the following schematic please assume all circuit components to exhibit ideal behaviour:

(Schematic: charging an ideal capacitor through a diode)

Assuming the voltage source starts at 0 V and is then set to 3.3 V, the capacitor will begin charging. The capacitor voltage will equal the source voltage less the diode's forward voltage drop, Vfwd. Vfwd decreases as the charging current decreases, a relationship described by the Shockley diode equation:
\$I = I_S \left( e^{V_D / (n V_T)} - 1 \right)\$

where

  • \$I\$ is the diode current,
  • \$I_S\$ is the reverse-bias saturation current (or scale current),
  • \$V_D\$ is the voltage across the diode,
  • \$V_T\$ is the thermal voltage, \$kT/q\$ (Boltzmann constant times temperature, divided by the electron charge),
  • \$n\$ is the ideality factor, also known as the quality factor or emission coefficient.
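As a quick numerical check, the equation can be evaluated directly (the saturation current, ideality factor, and thermal voltage below are assumed values for illustration, not taken from any specific diode):

```python
import math

def shockley_current(v_d, i_s=1e-12, n=1.0, v_t=0.02585):
    """Diode current per the Shockley equation; parameter values are illustrative."""
    return i_s * (math.exp(v_d / (n * v_t)) - 1.0)

# Any positive V_D gives a non-zero forward current, however small:
print(shockley_current(0.1))  # tens of picoamps
print(shockley_current(0.6))  # milliamp range
```

Note how a 0.5 V change in \$V_D\$ spans roughly eight orders of magnitude of current, which is why the last fraction of the charge takes so disproportionately long.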

This equation implies that some current flows for any non-zero voltage across the diode. Left indefinitely, the voltage on the ideal capacitor should therefore approach equilibrium with the voltage source.

So before I move on to the real problem – in the ideal circuit above, what would the voltage on the capacitor, Vc, tend to if left forever?

Best Answer

The Shockley diode equation yields non-zero current for non-zero voltage, therefore the capacitor will charge up as long as there is a difference between capacitor and source voltage (\$V_C < V_1\$), i.e. \$V_C\$ will get arbitrarily close to \$V_1\$ if you just wait long enough.
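A crude forward-Euler simulation of \$C \, dV_C/dt = I_S (e^{(V_1 - V_C)/(n V_T)} - 1)\$ illustrates how slowly that limit is approached (diode parameters and the 1 mF capacitance are assumed purely for illustration; the sweep starts 0.5 V below the source to skip the initial inrush, which would otherwise overflow the exponential):

```python
import math

I_S, N, V_T = 1e-12, 1.0, 0.02585  # assumed diode parameters
V_SRC, C = 3.3, 1e-3               # 3.3 V source, assumed 1 mF capacitor

def charge(v_c0, t_end, dt=1e-3):
    """Forward-Euler integration of C * dVc/dt = I_S * (exp((V1 - Vc)/(n*V_T)) - 1)."""
    v_c = v_c0
    for _ in range(int(t_end / dt)):
        i = I_S * (math.exp((V_SRC - v_c) / (N * V_T)) - 1.0)
        v_c += i * dt / C
    return v_c

# Starting 0.5 V below the source, the remaining gap keeps shrinking but
# never reaches zero -- roughly a fixed amount per decade of elapsed time:
for t in (1.0, 10.0, 100.0):
    print(f"t = {t:6.1f} s, gap = {V_SRC - charge(V_SRC - 0.5, t):.4f} V")
```

The decade-per-fixed-voltage behaviour is the logarithmic tail of the exponential diode law: each additional \$n V_T \ln 10\$ of closed gap costs roughly ten times the waiting time.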