An autoranging multimeter does exactly what you would do manually with a non-autoranging meter. Suppose you have a 3 1/2 digit meter, so 1999 is your maximum reading.
The meter starts at the highest(*) range; if the reading is below the 199.9 threshold it switches one decade lower, and repeats this until the reading is between 200 and 1999. That goes very fast because it doesn't have to display anything during this procedure, so it appears to get the right range first time.
Or, if it includes enough logic, it can take the first measurement on the highest range and then directly select the lower range that is most appropriate for that input level.
For example:
1st reading, at the 1.999 MΩ range: < 199.9
2nd reading, at the 199.9 kΩ range: < 199.9
3rd reading, at the 19.99 kΩ range: > 199.9
So this is the range we want.
Actual measurement: 472
That value is between 200 and 1999, so we get the best resolution possible. One decade lower and the reading would overflow. So the resistance is 4.72 kΩ.
Note that during the first readings it doesn't really measure the actual resistance; it just checks whether the reading is above or below 199.9 counts.
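The stepping procedure above can be sketched in a few lines of Python. The range list and the count arithmetic are illustrative only; a real meter works on raw ADC counts, and the `autorange` helper is hypothetical:

```python
# Sketch of sequential autoranging for a 3 1/2-digit meter
# (display counts 0..1999). Full-scale values per resistance range
# are assumed for illustration, stepping down one decade at a time.

RANGES_OHMS = [1_999_000, 199_900, 19_990, 1_999, 199.9]  # highest range first

def autorange(resistance_ohms):
    """Step down from the highest range until the reading fills the
    200..1999 count window, then return (full_scale, display counts)."""
    for full_scale in RANGES_OHMS:
        counts = resistance_ohms * 1999 / full_scale
        if counts >= 199.9:          # reading fills the display: stop here
            return full_scale, round(counts)
    # Below even the lowest range: report what we have on that range.
    return RANGES_OHMS[-1], round(resistance_ohms * 1999 / RANGES_OHMS[-1])
```

For the worked example, `autorange(4720)` settles on the 19.99 kΩ range with 472 counts, i.e. 4.72 kΩ.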
Alternatively the multimeter may have a set of comparators that all work simultaneously, each checking against the threshold of one range. You get the result faster, but this requires more hardware and will probably only be found in more expensive meters.
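A software analogue of that comparator bank, purely for illustration (the threshold list reuses the same hypothetical resistance ranges and is not from any real meter):

```python
# Sketch of the parallel-comparator idea: every threshold is evaluated
# at once (in hardware, one comparator per range) and the lowest range
# the input does not overflow is selected in a single step.

THRESHOLDS_OHMS = [199.9, 1_999, 19_990, 199_900, 1_999_000]  # full scale, low to high

def pick_range(resistance_ohms):
    # All comparisons happen "simultaneously" here, like the comparator bank.
    fits = [resistance_ohms <= t for t in THRESHOLDS_OHMS]
    for full_scale, ok in zip(THRESHOLDS_OHMS, fits):
        if ok:
            return full_scale
    return THRESHOLDS_OHMS[-1]  # input overflows every range: stay on the highest
```

For 4.72 kΩ, `pick_range(4720)` selects the 19.99 kΩ range in one step instead of three.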
(*) Not the lowest, as "Mary" aka TS suggested. Those as old as I am have worked with analog multimeters. If you would start measuring at the most sensitive range the needle would hit the right stop hard. You could hear it say "Ouch". Switch to the next position, again "bang!". If you care for your multimeter as a good housefather ("bonus pater familias") you start at the least sensitive range.
The range switch on the front of the multimeter shows the maximum current that can be measured on that range. The range switch is pointing at the "200m" DC Amps range in the picture. Therefore, the full-scale readout for this range will be about[1] 200 milliamps. If more than 200 mA of current passes through the multimeter on this range, the multimeter will display an over-range indicator instead of the measured current. This means that a display of "2.0" or "2,0" indicates a measured current of 2 milliamps, not 2 amps.
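In code, interpreting the display on a given range amounts to applying that range's display unit. The range labels and unit map below are assumptions for illustration; check your own meter's manual:

```python
# On the "200m" range a 3 1/2-digit meter reads directly in milliamps,
# so a display of "2.0" (or "2,0") means 2.0 mA, not 2 A. The range
# labels and their display units are assumed, not from a specific meter.

DISPLAY_UNIT_AMPS = {"200u": 1e-6, "20m": 1e-3, "200m": 1e-3, "10A": 1.0}

def reading_in_amps(display, range_label):
    # Accept both decimal point and decimal comma, as on European meters.
    return float(display.replace(",", ".")) * DISPLAY_UNIT_AMPS[range_label]
```

Here `reading_in_amps("2,0", "200m")` gives 0.002 A, i.e. 2 mA.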
Also, notice that there are two sockets on your multimeter for measuring current. One is for the milliamp and microamp ranges, while the other is for the amp range. The milliamp range is fused (notice how only the 5 A plug is labelled "unfused"). Passing more than 200 mA of current through the milliamp socket will blow the fuse to protect the multimeter. That is another way you can tell that you are measuring 2 mA and not 2 A.
[1] The exact full-scale range of many multimeters is not exactly the same as the number printed on the range switch. For example, on a Fluke 77 multimeter, the 300 mV range actually measures up to 303.1 mV before displaying an over-range indicator.
The reason the range is so large is that few people care what the range is.
The 'continuity beep' is a 'nice to have' feature, not an essential one. It doesn't do anything that can't be done on the ohms range, other than saving you from having to read the display.
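As a sketch, the whole feature is just a threshold comparison on the ohms reading. The 50 Ω threshold here is an assumption; the loose specification is exactly the point:

```python
# Minimal sketch of a continuity beep: sound when the measured resistance
# falls below some threshold. The 50-ohm value is assumed for illustration;
# real meters vary widely, which is why few datasheets pin it down.

CONTINUITY_THRESHOLD_OHMS = 50

def continuity_beep(resistance_ohms):
    return resistance_ohms < CONTINUITY_THRESHOLD_OHMS
```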
When a large enough segment of the market starts demanding continuity beep thresholds with a tolerance of x%, then the manufacturers will start specifying and providing them.
Test is a significant part of the cost of making equipment. If a feature doesn't need testing, because it's not accurately specified, then that feature can be added very cheaply.
The reason the ABC company might add a continuity beep function is that the XYZ company has one on theirs, and ABC doesn't want to lose a sale when customers hold the data sheets side by side and compare them. "1% on DC voltage, check; at least 1000 V, check; continuity beep, check!" If nobody demands a standard, it just doesn't get standardised.