I read that the time for a capacitor to charge to (effectively) full voltage from a constant-voltage source is \$5\tau = 5RC\$, which comes from the exponential charging equation \$v(t) = V_s(1 - e^{-t/RC})\$: after \$5\tau\$ the capacitor has reached about 99.3% of the supply voltage.
In another book I read that if you charge a capacitor with a constant current, the voltage increases linearly with time. Is this true, and if so, what is the formula for calculating it? Would a complete charge to the target voltage be possible with constant current?
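As a quick numerical check of the \$5\tau\$ rule (a sketch with assumed example values of R and C, not from the books quoted above):

```python
import math

R = 1_000.0   # 1 kΩ (assumed example value)
C = 1e-6      # 1 µF (assumed example value)
tau = R * C
Vs = 5.0      # supply voltage (assumed example value)

# Exponential charging from a constant-voltage source:
# v(t) = Vs * (1 - e^(-t / RC))
def v(t):
    return Vs * (1 - math.exp(-t / (R * C)))

fraction_charged = v(5 * tau) / Vs
print(f"After 5*tau the capacitor is at {fraction_charged:.4f} of Vs")
# → After 5*tau the capacitor is at 0.9933 of Vs
```

So "fully charged" at \$5\tau\$ is a practical convention; mathematically the exponential never quite reaches \$V_s\$.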
Best Answer
Normally I would let you go and look as this is not a hard question to solve, but as I am feeling generous here is how we get there:
From fundamentals, we know that \$Q =CV\$
If we take the derivative with respect to time (remembering that current is the rate of change of charge, \$i = \frac {dq} {dt}\$, and that C is constant) we obtain
\$i = C\frac {dv} {dt}\$
Rearranging, we find that \$\frac {i} {C} = \frac {dv} {dt}\$
Since \$\frac {i} {C}\$ is constant, so is \$\frac {dv} {dt}\$: charging a capacitor from a constant current source yields a linear voltage ramp (up to the compliance voltage of the current source).
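To see the linearity without writing down the closed-form answer, here is a sketch that numerically integrates \$\frac {dv} {dt} = \frac {i} {C}\$ with forward Euler steps (the current and capacitance values are assumptions for illustration):

```python
# Numerically integrate dv/dt = i/C for a constant current
# and confirm the resulting voltage ramp is linear.
I = 1e-3      # 1 mA constant charging current (assumed example value)
C = 1e-6      # 1 µF (assumed example value)
dt = 1e-6     # 1 µs time step

v = 0.0
samples = []
for _ in range(1000):          # simulate 1 ms total
    v += (I / C) * dt          # dv = (i/C) dt
    samples.append(v)

# Slope is I/C = 1000 V/s, so after 1 ms the voltage is 1 V,
# and the halfway sample is half of that: the ramp is linear.
print(samples[-1], samples[499])
# → 1.0 V at 1 ms, 0.5 V at 0.5 ms (within floating-point error)
```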
I will leave finding the solution in terms of time versus some voltage to you.