Equation for a capacitor charged by a decaying current source (solar cell)


I have a solar cell charging a capacitor, and I want to find how long it takes for the capacitor to charge to within, say, 0.01% of its final value (the open-circuit voltage). I'm trying to derive an equation or relation that lets me calculate the required time. The normal capacitor charging equation obviously won't apply, since this isn't an ideal constant-voltage source: it can only source a current limited by the short-circuit current. The best description I have for the solar cell's behaviour here is a decaying current source: the open-circuit voltage remains constant while the current decays from a fixed short-circuit value to zero.

I did some digging on the internet and found that a linear approximation holds while charging up to 60 or 70% of the open-circuit voltage, i.e.,

$$I = C\frac{dV}{dt}$$

$$dt = \frac{C\,dV}{I}$$
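
As a quick sanity check (with illustrative numbers of my own, not measured values): treating the current as constant at \$I_{SC}\$, charging a \$100\,\mu F\$ capacitor from 0 V to \$0.6\,V_{OC} = 0.36\,V\$ with \$I_{SC} = 30\,mA\$ would take

$$t \approx \frac{C\,\Delta V}{I_{SC}} = \frac{100\,\mu F \times 0.36\,V}{30\,mA} = 1.2\,ms$$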

However, this relation fails as the voltage across the capacitor approaches the open-circuit voltage of the solar cell.

If I had an equation for how the current decays, I could simply have used

$$V = \frac{1}{C}\int i\,dt$$

Is there a specific relation here? Can one be derived? My real goal is to choose a capacitor value given the required charge time.

Best Answer

As a first approximation, model the solar cell with a Thevenin equivalent circuit determined by the open-circuit voltage, \$V_{OC}\$, and short-circuit current, \$I_{SC}\$, of the cell at a given illumination.

The Thevenin resistance is \$R_{th} = V_{OC}/I_{SC}\$, so for that fixed illumination the time constant is:

$$\tau = \dfrac{V_{OC}}{I_{SC}}C $$
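
With this model the capacitor voltage follows the standard first-order RC step response:

$$V(t) = V_{OC}\left(1 - e^{-t/\tau}\right)$$

Setting \$V(t) = \chi V_{OC}\$ and solving for \$t\$ inverts this.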

The time required to charge to some fraction \$\chi\$ of the final voltage is:

$$t = -\tau \ln(1 - \chi)$$
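
For the 0.01% target, \$-\ln(1-\chi) = -\ln(10^{-4}) \approx 9.2\$, i.e. about nine time constants.

To turn this into a design procedure, here is a minimal sketch; the cell parameters and function names below are illustrative assumptions, not values from the question:

```python
import math

# Example cell parameters under the fixed-illumination Thevenin model above.
# These numbers are placeholders, not values from the question.
V_OC = 0.6     # open-circuit voltage, volts
I_SC = 0.03    # short-circuit current, amps
CHI = 0.9999   # charge to within 0.01% of V_OC

R_TH = V_OC / I_SC  # Thevenin resistance of the cell model, ohms


def charge_time(capacitance):
    """Seconds to reach CHI * V_OC for a given capacitance in farads."""
    tau = R_TH * capacitance
    return -tau * math.log(1.0 - CHI)


def required_capacitance(target_time):
    """Capacitance (farads) that reaches CHI * V_OC in target_time seconds."""
    return -target_time / (R_TH * math.log(1.0 - CHI))


print(charge_time(100e-6))        # ~0.018 s for a 100 uF capacitor
print(required_capacitance(5.0))  # ~0.027 F for a 5 s charge target
```

Bear in mind this is only as good as the Thevenin approximation: a real cell's I-V curve is nonlinear near \$V_{OC}\$, so treat the result as a first-pass estimate.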