Electronic – Discharging a capacitor through a resistor and an LED in series

capacitor, diodes, theory

Suppose you have a capacitor of capacitance \$C\$ and initial voltage \$U_0\$, a resistor \$R\$, and an LED with threshold voltage \$U_S\$ (\$U_0 > U_S\$), all connected in series.

Now I want to calculate the duration \$\tau\$ for which the LED is lit.

My intuition was that the effect of the LED should be small in this case, so I can just use the usual formula for capacitor discharge, \$U(t) = U_0 e^{-\frac{t}{RC}}\$. My guess is then that the LED stays lit until the voltage drops to the value \$U_S\$, i.e. I have to solve the equation \$U(\tau) = U_S\$, which by elementary algebra gives \$\tau = -RC \ln\left(\frac{U_S}{U_0}\right)\$.
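Just to put numbers on that naive estimate, here is a minimal sketch of the calculation; the component values (\$C\$, \$R\$, \$U_0\$, \$U_S\$) are made-up illustrative numbers, not values from the question:

```python
import math

# Illustrative values only (not from the question).
C = 100e-6   # capacitance in farads
R = 1e3      # resistance in ohms
U0 = 5.0     # initial capacitor voltage in volts
US = 2.0     # assumed LED threshold voltage in volts

# Naive estimate: pure RC discharge, LED considered lit until U(t) falls to US.
tau = -R * C * math.log(US / U0)
print(f"tau = {tau * 1e3:.1f} ms  ({tau / (R * C):.2f} time constants)")
```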

However, I am not sure whether my intuition is correct or how to justify it. So: is there a good, simple argument for why the approximation above (or a similar, correct one) is valid?

My second question is about how to derive this (or a similar correct approximation) from first principles.

My idea was to set up a differential equation as follows:

$$
C \frac{dU(t)}{dt} = -I(t)
$$

And substitute for \$I(t)\$ the formula for the current through a diode in series with a resistor that I found at http://en.wikipedia.org/wiki/Diode_modelling#Explicit_solution, which involves the Lambert W function. However, it then gets pretty complicated, and I don't know how to solve this differential equation or how to make reasonable approximations (ideally with error bounds).
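(In case it helps, the full nonlinear model can at least be integrated numerically rather than solved in closed form. The sketch below is only an illustration: it uses the Lambert-W expression from that Wikipedia page for a Shockley diode in series with a resistor, and the saturation current \$I_S\$, ideality factor \$n\$ and component values are assumed placeholder numbers, not measured LED parameters.)

```python
import numpy as np
from scipy.integrate import solve_ivp
from scipy.special import lambertw

# Assumed, illustrative values (not from the question).
C, R, U0 = 100e-6, 1e3, 5.0          # farads, ohms, volts
I_sat, n, V_T = 1e-18, 2.0, 0.025    # Shockley parameters for the LED (assumed)

def branch_current(U):
    """Current through the resistor + diode branch for a voltage U across it,
    using the explicit Lambert-W solution of the Shockley diode equation."""
    nVT = n * V_T
    w = lambertw((I_sat * R / nVT) * np.exp((U + I_sat * R) / nVT)).real
    return nVT / R * w - I_sat

def dU_dt(t, U):
    # C dU/dt = -I(t): the capacitor discharges through the branch.
    return -branch_current(U[0]) / C

sol = solve_ivp(dU_dt, (0.0, 1.0), [U0], max_step=1e-3)

# Pick an arbitrary current below which the LED is considered dark.
I_off = 1e-3
I = np.array([branch_current(u) for u in sol.y[0]])
t_off = sol.t[np.argmax(I < I_off)]
print(f"current falls below {I_off * 1e3:.0f} mA at t = {t_off * 1e3:.1f} ms")
```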

PS: I have found this paper, http://www.uncg.edu/phy/hellen/HellenAJPAug03.pdf, which discusses the problem in the case where only a diode is present, but it doesn't take a resistor in series into account.

Edit: If I assume, as an approximation, that the diode drops the voltage \$U_S\$ the whole time, then after solving the corresponding differential equation I end up with \$U(t) = U_S + (U_0 - U_S) e^{-\frac{t}{RC}}\$, which seems to make no sense, because \$U_S\$ is a lower bound (which was actually already in the assumption…). So it would be great if someone could really clear up this mess…
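For reference, writing out that constant-drop approximation explicitly (nothing beyond Kirchhoff's voltage law and the standard RC solution is used): the loop equation \$U(t) = R\,I(t) + U_S\$ together with \$C \frac{dU(t)}{dt} = -I(t)\$ gives

$$
\frac{dU}{dt} = -\frac{U - U_S}{RC}, \qquad U(0) = U_0
\quad\Longrightarrow\quad
U(t) = U_S + (U_0 - U_S)\, e^{-t/RC}.
$$

That the solution only decays toward \$U_S\$ is exactly what the assumption builds in; the practical question is then at what time the current \$I(t) = \frac{U_0 - U_S}{R} e^{-t/RC}\$ has become too small for the LED to be visibly lit, which is the point the answer below makes.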

Best Answer

I didn't read your whole question, which seemed to go out of its way to make a simple thing complicated. As I understand it, you have a capacitor, resistor, and LED all in series, and you want to know how things decay if the capacitor starts out initially charged up.

To a first approximation, you can consider the LED a voltage source. That means the current will decay just as if the LED wasn't there and the cap was charged up to the LED voltage less than what it really was (\$U_0 - U_S\$ in your notation). This is now a simple R-C system, which follows a basic exponential decay with a time constant of RC, which it seems you already understand. The question of when the LED goes "off" then comes down to what current you consider low enough for the brightness to count as off. This can vary a lot with the efficiency of the LED, the ambient light level, and how obvious "on" is supposed to be. For example, if the cap is initially charged so that the initial current is 20 mA (a common maximum for LEDs) and you consider 1 mA the "off" level, then the on time will be the 95% decay time, which is 3.0 time constants.
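As a quick check of that example in code (only the 20 mA and 1 mA figures come from the paragraph above; the RC value is an arbitrary placeholder):

```python
import math

# Example from the answer: initial current 20 mA, "off" threshold 1 mA.
I0, I_off = 20e-3, 1e-3
n_tau = math.log(I0 / I_off)       # on-time measured in RC time constants
print(f"on-time = {n_tau:.2f} time constants")   # about 3.0

# With a placeholder time constant of RC = 0.1 s this would be:
RC = 0.1
print(f"on-time = {n_tau * RC:.2f} s")
```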

As I said, this is the basic first approximation where the LED has a fixed voltage across it. That will be largely true, but of course its voltage will drop somewhat with current. For practical purposes, this is a small effect compared to the slop in deciding what current level "off" really is, unless that current is small, like less than a milliamp.
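To put a rough number on "somewhat": assuming the LED follows the usual exponential diode law with thermal voltage \$V_T \approx 25\ \text{mV}\$ and an ideality factor \$n\$ between roughly 1 and 2, the change in forward voltage between the 20 mA and 1 mA points of the example above is about

$$
\Delta U_{LED} = n V_T \ln\frac{20\ \text{mA}}{1\ \text{mA}} \approx n \cdot 25\ \text{mV} \cdot 3.0 \approx 75\ \text{mV} \text{ to } 150\ \text{mV},
$$

which is only a few percent of a typical LED forward drop of a couple of volts, so treating the LED voltage as fixed is a reasonable first cut.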