I want to measure capacitance with a microcontroller. Since measuring the capacitor voltage starting from 0 V has various drawbacks, this resource gets it right by timing how long the capacitor takes to charge from one voltage to another.

What baffles me is the choice of the lower threshold – they measure the time the capacitor takes to charge from 0.17 Vcc to 0.5 Vcc, and say that this takes 0.5 RC.

I can't get this 0.17 value.

As far as I understand, to get 0.5 RC you have to charge the capacitor between the voltages that correspond to the time points 0.2 RC and 0.7 RC.

$$\frac{t}{RC} = -\ln\!\left(1 - \frac{V_t}{V_{cc}}\right)$$

If I try to calculate the t/RC value at the 0.17 Vcc threshold they use, I get:

$$-\ln(1 - 0.17) = 0.186329578$$

That is quite far from 0.2.

If I try to calculate the Vt/Vcc ratio at t/RC = 0.2, I get:

$$\frac{V_t}{V_{cc}} = 1 - e^{-t/RC}$$

$$1 - e^{-0.2} = 0.181269247$$

So the lower threshold should be 0.18 Vcc, not 0.17 Vcc. Am I correct, or am I missing something?
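Both computations can be checked numerically; a quick Python sketch (mine, not from the original post):

```python
import math

# t/RC implied by the 0.17*Vcc threshold they use
t_over_rc = -math.log(1 - 0.17)
print(t_over_rc)   # ~0.186330

# Vt/Vcc actually reached at t = 0.2*RC
v_over_vcc = 1 - math.exp(-0.2)
print(v_over_vcc)  # ~0.181269
```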

## Best Answer

First find at what time we reach 0.5 Vcc:

\$ 1 - e^{-t/R C} = 0.5 \$

That's for \$t = 0.693147\,RC\$. If we subtract 0.5 \$RC\$ from that, we get

\$ V = \left(1 - e^{-(0.693147 - 0.5)}\right) V_{CC} = 0.175639\ V_{CC}\$

So the voltage varies from 0.1756 \$V_{CC}\$ to 0.5 \$V_{CC}\$ in 1/2 \$R C\$.

(I usually don't work with 6 significant digits, but here the discussion is between 0.17 and 0.18, and then it's relevant.)
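The same derivation in a few lines of Python (my sketch, not from the answer itself), working in units of RC:

```python
import math

# Time (in units of RC) to reach 0.5*Vcc: 1 - exp(-t) = 0.5  =>  t = ln(2)
t_half = math.log(2)                    # ~0.693147

# Threshold voltage that lies exactly 0.5*RC earlier on the charging curve
v_low = 1 - math.exp(-(t_half - 0.5))   # ~0.175639, as a fraction of Vcc

print(t_half, v_low)
```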

Edit: Note that you can use a current source instead of a voltage source plus resistor to charge the capacitor.

The blue line shows charging through a resistor, the purple line charging at constant current. The shallower the curve, the larger the time difference for the same voltage difference, and hence the larger the error. That's why, with the exponential curve, you'd better not measure above 80 % of \$V_{CC}\$. At constant current you have constant precision over the full measurement range.
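The 80 % rule of thumb can be made quantitative: from \$t/RC = -\ln(1 - v)\$, the timing sensitivity is \$dt/dv = RC/(1 - v)\$, which blows up as \$v\$ approaches \$V_{CC}\$, whereas for constant-current charging it is constant. A small numerical sketch of mine (not from the answer):

```python
import math

def dt_dv(v):
    """Timing sensitivity dt/dv for RC charging, in units of RC per Vcc.

    From t/RC = -ln(1 - v):  dt/dv = 1 / (1 - v).
    """
    return 1.0 / (1.0 - v)

# A fixed threshold-voltage error turns into a growing timing error near Vcc
for v in (0.5, 0.8, 0.95):
    print(f"v = {v:.2f}*Vcc  ->  dt/dv = {dt_dv(v):.1f} RC per Vcc")
```

At 0.5 Vcc the sensitivity is 2 RC per Vcc; at 0.95 Vcc it is already 20 RC per Vcc, which is why measuring high on the exponential curve is a bad idea.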