The display is not the limiting factor (as you point out). Fundamentally, there is a cut-off. If the display were the most expensive part of the meter, then most meters would probably read up to 9999 on all digits. However, there are a couple of factors that make the meter ranges what they are.
There is a limit to how high a range goes - the count is a design decision by the meter maker. A 20... meter isn't the only type available either; I personally have a 6000-count meter.
Logarithmically, you get the most bang for your buck at 2000 counts. See the image of the number line below. Doubling the counts from 1000 to 2000 is the cheapest step: a reading just over 1000 of the base unit can stay on the lower, higher-resolution range, and you also get to use one more digit (better for marketing).
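A quick sketch of why that first doubling matters, assuming (as a simplification) that a meter's ranges are just its full-scale count shifted by powers of ten - `best_range` is a hypothetical helper for illustration, not how any particular meter firmware works:

```python
def best_range(value, counts):
    """Pick the smallest decade-scaled full scale that still fits `value`
    on a meter with the given count, and the resulting resolution."""
    full_scale = counts / 1000.0   # e.g. 2000 counts -> 2 V, 20 V, 200 V ...
    scale = 1.0
    while value > full_scale * scale:
        scale *= 10.0              # auto-range up until the value fits
    resolution = full_scale * scale / counts   # volts per displayed count
    return full_scale * scale, resolution

# Measuring 1.8 V: the 1000-count meter is forced up a whole decade,
# while 2000 counts is just enough to keep the extra digit.
for counts in (1000, 2000, 6000):
    fs, res = best_range(1.8, counts)
    print(f"{counts}-count meter: {fs:g} V range, {res * 1000:.3g} mV/digit")
```

The 1000-count meter ends up on a 10 V range at 10 mV per digit; the 2000-count meter stays on a 2 V range at 1 mV per digit - ten times the resolution for only double the counts.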

There's another reason for not dropping the least significant digit. Part of the gotcha with auto-ranging meters is that you can easily forget that the meter has different measurement ranges, and that the measurement accuracy changes with each range. For example, compare the following displays:
193.00 k
.1930 M
193.0 k <----- Would you notice the missing digit as easily?
Personally, I see it as a more obvious way of conveying information about the measurement. It doesn't detract much from the readability in my opinion.
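To make the digit-count point concrete, here is a small sketch (Python; `render` is a hypothetical formatter, and it assumes the displayed decimal places simply follow from full scale divided by counts) of the same 193 k measurement on adjacent ranges:

```python
import math

def render(value, full_scale, counts, unit):
    """Format `value` the way a `counts`-count DMM would on a range with
    the given full scale (value and full scale in the same display unit)."""
    step = full_scale / counts  # smallest displayable step on this range
    # Number of decimal places needed to show one step (guard against
    # floating-point log rounding with a small epsilon):
    decimals = max(0, -math.floor(math.log10(step) + 1e-9))
    return f"{value:.{decimals}f} {unit}"

print(render(193.0, 200.0, 20000, "k"))   # 20000-count meter, 200 k range
print(render(0.193, 2.0, 20000, "M"))     # same value, 2 M range
print(render(193.0, 200.0, 2000, "k"))    # 2000-count meter loses a digit
```

The trailing digit count is the only on-screen clue that the resolution (and accuracy) changed between ranges.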
I expect your multimeter is rather more accurate than the grid voltage is stable. Grid voltage can vary widely from the supposed nominal value. At least 10% should be assumed off the top, and more should not be surprising. 220V +-10% is 198-242V, so you're within spec. There are places in the world where I would trust the power less than others, and frankly eastern Europe is in the second tier. It's not as bad as some places where outages are common, but more than 10% variation shouldn't surprise anyone.
Then there is the whole EU standardization attempt. Various countries had somewhat different mains voltages - 220V, 240V, etc. The EU wanted to standardize this. But instead of actually changing the voltages, they just re-defined the nominal value with enough slop that everyone was already within spec!
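The arithmetic behind "enough slop" is easy to check. The commonly cited harmonized figure is 230 V (the tolerances were initially asymmetric per country before settling at roughly +-10%; I'm assuming the symmetric +-10% band here):

```python
def band(nominal, minus_pct, plus_pct):
    """Allowed voltage window for a nominal value and percent tolerances."""
    lo = nominal * (100 - minus_pct) / 100
    hi = nominal * (100 + plus_pct) / 100
    return lo, hi

print(band(220, 10, 10))   # the 198-242 V window from the answer above
lo, hi = band(230, 10, 10)
print(lo, hi)              # 207-253 V: wide enough that both the old
                           # 220 V and 240 V systems were already "230 V"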
Best Answer
Most DMMs are of the integrating type (the dual-slope converter was one of the first methods developed that was suitable for high resolution and accuracy).
An advantage of integrating is that the integration period can be designed to be a multiple of both the 50Hz and 60Hz line periods. For example, 300 msec is 18 cycles of 60Hz and 15 cycles of 50Hz. This acts as a natural notch filter at the mains frequency, so hum caused by mains noise is cancelled out and the reading doesn't jump around as much.
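You can see the notch numerically: averaging a mains-frequency sine over a window that holds a whole number of its cycles gives (essentially) zero, regardless of whether the hum is 50Hz or 60Hz. A sketch, using simple midpoint-rule integration:

```python
import math

def mean_over_window(freq_hz, window_s, n=100_000):
    """Average of a unit-amplitude sine of `freq_hz` over the integration
    window - what an integrating converter sees of superimposed hum."""
    dt = window_s / n
    total = sum(math.sin(2 * math.pi * freq_hz * (i + 0.5) * dt)
                for i in range(n)) * dt
    return total / window_s

# 300 ms holds integer cycles of both mains frequencies -> hum cancels:
for f in (50, 60):
    print(f"{f} Hz hum over 300 ms: {mean_over_window(f, 0.3):.2e}")

# A window with a fractional number of cycles does NOT cancel:
print(f"50 Hz hum over 270 ms: {mean_over_window(50, 0.27):.4f}")
```

The 270 ms case (13.5 cycles of 50 Hz) leaves a residual of a couple of percent of the hum amplitude, which is exactly the kind of reading jitter the 300 ms choice avoids.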
Integrating converters can also be built with high resolution and pretty good (< 0.1%) linearity from cheap parts: to a first order, all the errors cancel out except for the reference voltage. You can get 0.1% linearity and accuracy even with a 5% film capacitor, 5% resistors, and a crude RC clock, while still refreshing fast enough for visual purposes (a few Hz).
For UI reasons you really don't want to update the display too fast - 2-5Hz is about right. If the display rate is too fast, it could jump back and forth between (say) 201 and 101 and look almost like 301. If it is too slow, you don't get to see the reading stabilize, or how much apparent noise there is in it.
More modern high-resolution converters are often made using sigma-delta techniques, which eliminate at least one second-order effect (capacitor dielectric absorption) from the integrating converter error budget (improving linearity*). They can be used in a DMM by low-pass filtering and decimating the result to an appropriate display rate (or just averaging over a suitable time). You'll also see low-end voltmeters and ammeters that use the 10- or 12-bit successive-approximation converter built into a micro, with some averaging added, to get a sort-of acceptable reading with a crummy (low) input impedance.
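A minimal behavioral sketch of the sigma-delta idea (a first-order modulator with ideal elements - real converters are higher-order and decimate with proper filters rather than the plain bitstream average used here):

```python
def sigma_delta(vin, vref=1.0, n=4096):
    """First-order sigma-delta modulator: the integrator accumulates the
    input-minus-feedback error, so the 1-bit stream's average density is
    forced to track vin/vref. Averaging the bits recovers the input."""
    integ, ones = 0.0, 0
    for _ in range(n):
        bit = 1 if integ >= 0 else -1   # 1-bit quantizer
        ones += bit > 0
        integ += vin - bit * vref       # feedback keeps integrator bounded
    # Crude decimation: average of the +/-1 bitstream, rescaled by vref.
    return (2 * ones / n - 1) * vref

print(f"{sigma_delta(0.3):.4f}")   # converges toward 0.3
```

Because the integrator stays bounded, the error of the bitstream average falls roughly as 1/n, so averaging longer (the "suitable time" above) directly buys resolution.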
*Although most users can't see nonlinearity without a better instrument to compare the DMM against, they can simply flip the leads on a stable DC source and see that (say) a reading of +10.00V becomes -9.98V with the leads reversed. Of course, these days such an effect could just be fiddled out with a microcontroller.
Modern higher-end DMMs like my Agilent have options for reading speed, here is an example table showing capabilities:
NPLC refers to the number of power-line cycles over which the reading is integrated.
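The NPLC setting maps to an integration time in a simple way - aperture = NPLC / line frequency. A sketch (the NPLC values below are typical bench-DMM options, used here only as examples):

```python
def integration_time_ms(nplc, line_freq_hz):
    """Integration aperture in milliseconds for a given NPLC setting."""
    return nplc / line_freq_hz * 1000

# Typical selectable settings: small NPLC = fast but noisy readings,
# large NPLC = slow but quiet, with mains hum rejected at NPLC >= 1.
for nplc in (0.02, 0.2, 1, 10, 100):
    print(f"NPLC {nplc:>6}: {integration_time_ms(nplc, 60):8.3f} ms at 60 Hz,"
          f" {integration_time_ms(nplc, 50):8.1f} ms at 50 Hz")
```

Note that only integer NPLC values give the mains-notch effect described above; the sub-cycle settings trade normal-mode rejection for reading speed.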