The "1 digit" means that the least significant displayed digit can be off by +/- 1; at this resolution, 1 digit means +/- 0.001V. "10 digits" means that a displayed 79.999V could actually be as low as 79.989V (not even counting the 0.03%!).
So in your range the 10-digit specification means your error is +/- (0.03% of reading + 0.01V). For a measurement of 79.999V that gives an absolute maximum error of +/- (79.999*0.03% + 10*0.001V) = +/- 0.034V.
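The arithmetic above can be sketched as a small script. This is only an illustration of the "percent of reading + digits" formula, using the 0.03% + 10 digit spec and the 0.001V resolution from the example; the variable names are my own:

```python
# "Percent of reading + digits" error model (sketch, values from the example).
reading = 79.999                  # displayed value in volts
gain_error = reading * 0.0003     # 0.03% of the reading
digit_error = 10 * 0.001          # 10 counts of the 0.001 V last digit
total_error = gain_error + digit_error

print(round(total_error, 3))      # 0.034 -> true value is 79.999 V +/- 0.034 V
```

Note that the "digits" term dominates here only at small readings; at full scale the percentage term contributes the larger share.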
+/- 30 digits indicates the "absolute error" of the displayed value: whatever value you read on the display, you have to add/subtract up to 30 counts of the last digit to find the range the actual value lies in.
For example, if your multimeter shows 10.00V:
- add 30 ticks to find the upper limit 10.30V
- subtract 30 ticks to find the lower limit 09.70V
So your actual value will be between 9.70V and 10.30V
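The steps above can be written out directly. A minimal sketch, assuming a range where one "digit" (tick) equals 0.01V, as in the 10.00V example:

```python
# +/- 30 digit bounds on a 10.00 V reading (sketch; 0.01 V per digit assumed).
reading = 10.00
resolution = 0.01                 # value of one count of the last digit
digits = 30

lower = reading - digits * resolution
upper = reading + digits * resolution
print(round(lower, 2), round(upper, 2))   # 9.7 10.3
```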
This also illustrates why you have to choose the measuring range that best fits the measured value. If you choose too high a range, the absolute error will render your measurement completely useless. Look at this example:
Your display reads 00.10V:
- add 30 => 0.40V
- subtract 30 => -0.20V
This is an entirely useless result: the actual value could be anywhere from -200mV to +400mV, purely because of the poorly chosen range and its absolute error. In such a case, switch to a better (lower) range.
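To put a number on how bad this is, the same +/- 30 digit calculation can be expressed as a fraction of the reading. A sketch, again assuming 0.01V per digit on this range:

```python
# The same +/- 30 digits applied to a small 0.10 V reading (sketch).
reading = 0.10
uncertainty = 30 * 0.01           # 0.30 V either way

print(round(reading - uncertainty, 2), round(reading + uncertainty, 2))  # -0.2 0.4
print(round(uncertainty / reading * 100))  # 300 -> the error is 300% of the reading
```

On a lower range with 0.001V resolution, the same +/- 30 digits would only be +/- 0.03V, which is why switching ranges helps so much.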
You didn't ask about it, but your relative error is 0.2%. Back to the 9.70V - 10.30V example:
- The lower limit becomes 9.70V - 0.2% = 9.68V
- The upper limit becomes 10.30V + 0.2% = 10.32V
So when you read 10.00V on your multimeter, the actual value will range somewhere between 9.68V and 10.32V.
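Combining both terms from the example above in one sketch (worst case: the 0.2% relative error applied on top of the +/- 30 digit bounds; variable names are my own):

```python
# Worst-case combination of +/- 30 digits (0.01 V each) and 0.2% gain error.
reading = 10.00
digit_err = 30 * 0.01             # 0.30 V absolute term

lower = (reading - digit_err) * (1 - 0.002)   # 9.70 V, then minus 0.2%
upper = (reading + digit_err) * (1 + 0.002)   # 10.30 V, then plus 0.2%
print(round(lower, 2), round(upper, 2))       # 9.68 10.32
```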
You measure the voltage of a "precision voltage reference". They are sold pre-calibrated to the correct voltage.
If you have a cheap 3.5-digit multimeter, you can use a cheap 0.1% reference: https://www.adafruit.com/products/2200. Note that it is calibrated at the factory.