I am measuring the voltage of a 24 V battery using the ADC of an Arduino Uno. I use a voltage divider to bring the voltage below 5 V (the maximum the ADC can accept).

I noticed that as long as the ratio of R1 to R2 stays the same, it is better to use higher resistance values in the divider to avoid slowly discharging the battery. I tried resistors up to the megaohm range. What do I sacrifice by using higher and higher resistances? My intuition tells me this cannot go on forever, because a high enough resistance would be equivalent to an open circuit.

From reading similar questions, it seems I might be sacrificing accuracy, but I am not sure, as those questions typically have a different setup.

If it is accuracy, is there a formula for it? When does the standard voltage divider formula break down? Gigaohms, teraohms? How do I calculate that?

P.S.

I am aware of this question; however, the answer there seems very convoluted, comparing load and no-load scenarios (whereas I don't have any load at all). If that is the answer, I do not understand it.

## Best Answer

A voltage divider breaks down when it stops being a voltage divider. When does that happen? When the current drawn by the load becomes the same order of magnitude as the current through the divider resistors. (In your case the ADC input itself is the load, even if you don't think of it as one.) But this may not be the only factor in choosing the divider values.
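To make that concrete, here is a short Python sketch comparing the ideal divider output with the output when a finite load, such as an ADC input, sits in parallel with R2. The resistor values and the 100 MΩ load resistance are illustrative assumptions, not figures from the question:

```python
def divider_vout(vin, r1, r2, r_load=float("inf")):
    """Output of a divider Vin -> R1 -> output node -> R2 -> GND,
    with r_load connected from the output node to GND (inf = unloaded)."""
    r2_eff = r2 if r_load == float("inf") else (r2 * r_load) / (r2 + r_load)
    return vin * r2_eff / (r1 + r2_eff)

VIN = 24.0      # battery voltage
R_LOAD = 100e6  # assumed DC input resistance of the load (illustrative)

ideal = divider_vout(VIN, 47e3, 10e3)          # same 4.7:1 ratio, no load
low   = divider_vout(VIN, 47e3, 10e3, R_LOAD)  # kilo-ohm divider, loaded
high  = divider_vout(VIN, 4.7e6, 1e6, R_LOAD)  # mega-ohm divider, loaded

print(f"ideal: {ideal:.4f} V, 47k/10k: {low:.4f} V, 4.7M/1M: {high:.4f} V")
```

Both dividers have the same ratio and the same ideal output, but the megaohm version sags by almost 1 % here, because its divider current is no longer large compared with the load current.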

For the specific case of the Arduino ADC, and according to this link: Input impedance of Arduino Uno analog pins?, the recommended source impedance of anything connected to an Arduino ADC input is 10 kOhm max. The source impedance of a divider, as seen from its output node, is R1 in parallel with R2, so the divider resistors should also be on the order of tens of kOhm.
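As a sketch of that sizing check, assuming the 10 kOhm guideline above (the 47 kOhm / 10 kOhm pair is an illustrative candidate, not a prescribed value):

```python
def source_impedance(r1, r2):
    """Thevenin (source) impedance of a divider as seen from its output node."""
    return (r1 * r2) / (r1 + r2)

r1, r2 = 47e3, 10e3
z = source_impedance(r1, r2)   # parallel combination of R1 and R2
vout = 24.0 * r2 / (r1 + r2)   # unloaded divider output for a 24 V battery

print(f"source impedance: {z:.0f} ohm, divider output: {vout:.2f} V")
```

This pair gives roughly 8.2 kOhm of source impedance (under the 10 kOhm recommendation) and about 4.21 V at the ADC pin (under the 5 V limit), while drawing well under half a milliamp from the battery.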