The offset with the zener is of little use. The best zeners have a 1 % tolerance, which is 100 mV for a 10 V zener. On a 10-bit ADC that's a 20-count error; on a 12-bit ADC, an 82-count error. You could trim the error away if you can measure the voltage accurately enough, but there are other factors. The BZX84-A10 has an 8 mV/°C temperature coefficient, giving a 2-count error per °C of temperature change on the 10-bit ADC, and 7 counts on the 12-bit. It looks better suited as a thermometer than as a voltmeter. A 10 V zener also requires a higher-voltage power supply.
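The count errors above can be reproduced with a short sketch. This is my own illustration, not from the answer; it assumes a 5 V full-scale ADC range (the same supply the ADS1000 answer below uses):

```python
# Sketch (assumption: 5 V ADC full-scale range): how many ADC counts a
# given voltage error corresponds to.
def counts(error_v, full_scale_v=5.0, bits=10):
    """Number of ADC counts corresponding to a voltage error."""
    lsb = full_scale_v / (2**bits - 1)   # volts per count
    return error_v / lsb

tolerance_error = 0.01 * 10.0            # 1 % of a 10 V zener = 100 mV
print(round(counts(tolerance_error, bits=10)))  # ≈ 20 counts
print(round(counts(tolerance_error, bits=12)))  # ≈ 82 counts
```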
The resistive divider will do much better. Resistors have a tolerance too, but at 25 ¢ a 0.1 % resistor is still affordable. (Better than 0.1 % gets expensive quickly: a 0.05 % resistor costs almost a dollar.) At 10-bit resolution that gives you a 1-count error, 4 counts at 12-bit. Temperature coefficient will be less of a problem if both resistors are from the same series and placed close to each other: since the divider is ratiometric, resistance changes will cancel each other out.
The numbers indicate that resolution beyond 10 bits is of little use; component tolerances and drift will make the extra bits unreliable. A few extra bits may still help to increase noise immunity, though, by averaging a series of measurements, or by using a sigma-delta ADC, which averages the input signal anyway.
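The averaging idea can be sketched like this. With uncorrelated noise, averaging N samples reduces the noise by a factor of √N, so 16 averages buy roughly 2 extra usable bits. `adc_read()` here is a hypothetical stand-in for your ADC driver, not a real API:

```python
import random

def adc_read():
    # Hypothetical noisy 12-bit reading around a true value of 2048.
    return 2048 + random.randint(-8, 8)

def averaged_read(n=16):
    # Average n readings; random noise shrinks by roughly sqrt(n).
    return sum(adc_read() for _ in range(n)) / n

print(averaged_read())  # close to 2048, with reduced noise
```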
There's also something more philosophical: we always want better, but why on earth would you want to know a 12 V battery's voltage to a precision of better than 10 mV? You'll have a hard time getting the required accuracy, and you'll always be uncertain about that last digit.
The ADS1000 is a low cost 12-bit ADC which will operate from a single 5 V supply.
The dynamic range is the ratio of the maximum voltage to the minimum voltage that the ADC can convert. The maximum voltage is 5 V. Since it is a 12-bit converter, it has a resolution of \$2^{12}-1 = 4095\$ steps. Thus the minimum voltage, for which the ADC would have only the least significant bit set, is 5 V / 4095 ≈ 1.22 mV. So the dynamic range of your ADC is 5 V / 1.22 mV = 4095, or 72.2 dB. In general, the dynamic range is only a function of the number of bits, not the maximum input voltage. But I calculated using voltages just to show you the details.
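A small sketch of that arithmetic, showing both routes to the same number (the 5 V figure is the ADS1000's supply from the answer; everything else follows from the bit count):

```python
import math

# Dynamic range of an N-bit ADC, computed from voltages and
# directly from the bit count.
bits = 12
v_max = 5.0
v_min = v_max / (2**bits - 1)            # one LSB, ≈ 1.22 mV
print(20 * math.log10(v_max / v_min))    # ≈ 72.2 dB
print(20 * math.log10(2**bits - 1))      # same value: depends only on bits
```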
The output of a 10 bit ADC is a number between 0 and 1023.
But very few applications require measuring a voltage between 0 and 1023 V.
Luckily (for people who don't happen to want to measure voltages between 0 and 1023 V), we can arrange for the ADC output to be scaled to measure essentially any range we want. Typically we measure between 0 and some value \$V_{ref}\$, but other ranges are possible (for example \$-V_{ref}\$ to \$V_{ref}\$). The datasheet for your ADC will tell you what the conversion range for your device is, and may also allow you to adjust the range, for example by providing an external reference voltage on one of the device's pins.
For the case of 0 to \$V_{ref}\$, assuming an ideal linear ADC, the actual input voltage can be recovered from the ADC reading by
$$V_{in}=V_{ref}\frac{d}{2^n-1}$$
where \$d\$ is the ADC reading, and \$n\$ is the number of bits produced by the ADC.
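The formula above translates directly into code. A minimal sketch, assuming the ideal linear ADC described:

```python
def adc_to_voltage(d, v_ref, n):
    # Recover the input voltage from ADC reading d on an ideal linear
    # n-bit converter spanning 0 to v_ref.
    return v_ref * d / (2**n - 1)

print(adc_to_voltage(1023, v_ref=5.0, n=10))  # full scale: 5.0 V
print(adc_to_voltage(512, v_ref=5.0, n=10))   # mid-scale: ≈ 2.50 V
```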
This isn't quite right. It should say the ADC provides more resolution, or better precision, if the analog signal uses the whole voltage range.
For example, say your ADC is set up to give 10 bits of resolution for input voltages between 0 and 5 V, but your input signal actually only varies between 0 and 1 V. Then you'll only read output values between 0 and 205. So you have only 8 effective bits of resolution in your signal range. And the voltage resolution is only 4.89 mV per count.
If you reduced the ADC reference voltage to 1.023 V (a convenient value), then you'd have very nearly the full 10 bits of resolution over your signal range, and the resolution in terms of voltage would be 1.0 mV per count.
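The comparison in the last two paragraphs can be checked numerically. A sketch, using the same 0 to 1 V signal and the two reference voltages from the example:

```python
import math

def max_count(v_signal, v_ref, bits=10):
    # Highest ADC reading produced by v_signal with the given reference.
    return round(v_signal / v_ref * (2**bits - 1))

for v_ref in (5.0, 1.023):
    top = max_count(1.0, v_ref)
    eff_bits = round(math.log2(top + 1), 1)
    print(v_ref, top, eff_bits)
# 5 V reference:     ~205 counts, ~7.7 effective bits
# 1.023 V reference: 1000 counts, ~10 effective bits
```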