A computer or other appliance power supply is not a resistive load; it is a reactive load. The current it draws has a phase relationship to the incoming voltage, which is itself an alternating (AC) voltage. An AC voltage inherently averages to essentially zero over a full cycle. What is measured for power computation is the "effective", or Root Mean Square (RMS), voltage across, and current through, the appliance's power feed.
Therefore measuring the resistance across its power supply leads will not provide meaningful results.
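To make the "averages to zero vs. RMS" point concrete, here is a small numeric sketch (a 230 V line is assumed purely as an example):

```python
import math

def rms(samples):
    """Root Mean Square: square each sample, average, take the square root."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

# A pure 230 V RMS sine wave peaks at 230 * sqrt(2) ≈ 325 V, yet its
# plain average over a full cycle is essentially zero.
n = 1000
peak = 230.0 * math.sqrt(2)
sine = [peak * math.sin(2 * math.pi * k / n) for k in range(n)]

print(abs(round(sum(sine) / n, 6)))  # average over a cycle: ≈ 0.0
print(round(rms(sine), 1))           # RMS: ≈ 230.0
```

This is why a plain average is useless for power work, while the RMS value corresponds to the DC voltage that would deliver the same power into a resistive load.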
At a simplistic level, voltage measurement could be done with an RMS voltmeter across the supply leads. See this EE.SE answer for more details.
Current measurement would need an RMS current meter either inserted into the power line in series, or using a clamp-type non-invasive current sensor.
Low-cost AC power line meters use a basic rectifier circuit and internal computation to indicate power consumption. These are designed for specific power line types (e.g. 110 V 60 Hz, or 230 V 50 Hz), and will lose accuracy if used on a different line frequency, if they work at all.
The above does not take power factor into account, another element that affects the actual power consumption calculation.
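The effect of power factor can be shown with a short worked example (the voltage, current, and power-factor figures below are assumed purely for illustration):

```python
# Apparent power S = Vrms * Irms; real power P = S * power factor.
# For non-sinusoidal currents (typical of switch-mode supplies) the power
# factor includes distortion as well as phase shift, but P = S * PF still holds.
v_rms = 230.0       # volts (assumed)
i_rms = 0.5         # amps (assumed)
power_factor = 0.6  # assumed, plausible for an uncorrected supply

apparent_va = v_rms * i_rms              # what Vrms * Irms alone suggests
real_watts = apparent_va * power_factor  # what the appliance actually consumes
print(apparent_va, real_watts)
```

Multiplying RMS voltage by RMS current alone overstates the consumption; here the real power is only 60 % of the apparent power.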
The proposed multimeter approach will yield nothing except possibly a damaged multimeter and the risk of electrocution if you are not qualified to work with mains voltages.
There are commercially available power meter devices that plug into your wall socket, with the appliance plugged into the device, and log or display power consumption. That would be the recommended way to go.
To measure current with a basic multimeter, you have to break the circuit and insert the ammeter in series. That is, you must make a series connection: 5V -> circuit board -> ammeter -> ground. If you used the ammeter in parallel, which your description suggests, then the low-resistance ammeter would draw as much current as the power supply could provide. That certainly could overheat the probe, although ammeters usually contain a fuse that should prevent this.
It's also possible you accidentally shorted power to ground with the probe somehow, or if a second power supply is involved (quite likely for microprocessors), that you created a connection between the power supplies.
By contrast, voltage is measured in parallel, and voltmeters are high resistance so they do not (usually) interfere with a circuit's operation.
It is often easier to place a "shunt" resistor in series with your load and measure the voltage across it as an indirect way of measuring current, because you can do this without breaking a circuit on a PCB.
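The shunt technique is just Ohm's law; a tiny sketch with hypothetical values:

```python
# Current through the shunt = voltage drop across it / shunt resistance.
# All values here are hypothetical examples.
shunt_ohms = 0.1   # small, so the voltage drop barely disturbs the circuit
drop_mv = 25.0     # millivolts measured across the shunt with a voltmeter
current_a = (drop_mv / 1000.0) / shunt_ohms
print(current_a)   # 0.25 A flowing through the load
```

The shunt value is a trade-off: larger resistance gives a bigger, easier-to-measure drop, but steals more voltage from the load.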
Best Answer
The multimeter does exactly the same as what you would do manually with a non-autoranging meter. Suppose you have a 3 1/2 digit meter, so 1999 is your maximum reading.
The multimeter starts at the highest(*) range, and if the reading is less than the 199.9 threshold it switches one decade lower, repeating until the reading is between 200 and 1999 counts. This goes very fast because nothing has to be displayed during the procedure, so it appears to find the right range on the first try.
Or, if it includes enough logic, it can take the first measurement on the highest range, and then directly select the lower range that is most appropriate for that voltage level.
For example:
1st reading, at 1.999 MΩ range: < 199.9
2nd reading, at 199.9 kΩ range: < 199.9
3rd reading, at 19.99 kΩ range: > 199.9
So this is the range we want.
Do actual measurement: 472
That value is between 200 and 1999, so this is the best resolution possible; one decade lower and the reading would overflow. So the resistance is 4.72 kΩ.
Note that during the first readings it doesn't really measure the actual resistance, it just checks if it's higher or lower than 199.9.
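The decade-stepping procedure above can be sketched in a few lines (the range list and count arithmetic are my assumptions for illustration, not any particular meter's firmware):

```python
def autorange_counts(r_ohms):
    """Decade-step down from the highest range until the reading
    reaches at least 200 counts (out of 1999 on a 3 1/2 digit meter)."""
    # Full-scale resistance per range, highest first (assumed example set).
    ranges = [1_999_000.0, 199_900.0, 19_990.0, 1_999.0, 199.9]
    for full_scale in ranges:
        counts = r_ohms / (full_scale / 1999)  # reading in ADC counts
        if counts >= 199.9:                    # >= 200 counts: best resolution
            return full_scale, round(counts)
    # Smaller than every threshold: the most sensitive range is the best we have.
    return ranges[-1], round(r_ohms / (ranges[-1] / 1999))

full_scale, counts = autorange_counts(4720)  # the 4.72 kΩ example above
print(full_scale, counts)                    # settles on the 19.99 kΩ range, 472 counts
```

Note the early ranges only need the comparison against 199.9, matching the observation that the meter isn't really measuring during the stepping phase.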
Alternatively, the multimeter may have a set of comparators that all work simultaneously, each checking the threshold for one range. You get the result faster, but this requires more hardware and will probably only be found in more expensive meters.
(*) Not the lowest, as "Mary" aka TS suggested. Those as old as I am have worked with analog multimeters. If you started measuring at the most sensitive range, the needle would hit the right-hand stop hard. You could hear it say "Ouch". Switch to the next position, again "bang!". If you care for your multimeter as a good housefather ("bonus pater familias"), you start at the least sensitive range.