As an example, I have an AC/DC "wall wart" that specifies an input voltage of 100–230 volts and an input current of 0.3 amps. Since my mains supply is 230 V:

Does that mean that it uses 230*0.3=69 watts?

If I were to use it in a 110v country, would it use 110*0.3=33 watts instead?
Best Answer
Short answer: no.
First of all, multiplying voltage by current gives power only for DC. With AC you must use the RMS values of voltage and current, and the phase shift between current and voltage (the power factor) adds a further corrective term, so the real power can be well below the simple volt-ampere product.
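As a rough numeric illustration of that correction for sinusoidal waveforms, real power is the volt-ampere product scaled by cos(φ). The 0.6 power factor below is purely an assumed value for illustration, not a figure from the adapter's label:

```python
# Real power vs. the naive V*I product (apparent power), sinusoidal case.
v_rms = 230.0       # mains RMS voltage, volts
i_rms = 0.3         # rated input current, amps (a maximum, not the actual draw)
power_factor = 0.6  # cos(phi); assumed value for illustration only

apparent_power = v_rms * i_rms              # volt-amperes (VA)
real_power = apparent_power * power_factor  # watts actually consumed

print(f"Apparent power: {apparent_power:.0f} VA")
print(f"Real power:     {real_power:.1f} W")
```

So even before considering the switching-supply behaviour below, the 69 figure is volt-amperes, not watts.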
Anyway, your power supply is almost certainly a switching supply (the wide 100–230 V input range is a strong hint), so its power consumption is practically the same at either mains voltage and depends only on the load it drives. If you read the tech specs carefully you'll find the maximum output current; that times the output voltage gives you the maximum output power (which is DC now!). The efficiency of a switching PSU is quite high, so the input power is only slightly larger than that.
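The estimate described above can be sketched in a couple of lines. The output ratings and the 85% efficiency are assumed values for illustration; read the real ones off your adapter's label:

```python
# Worst-case input power estimated from the output rating (assumed values).
v_out = 12.0       # rated output DC voltage, volts (assumed)
i_out_max = 2.0    # maximum rated output current, amps (assumed)
efficiency = 0.85  # typical switching-PSU efficiency (assumed)

p_out_max = v_out * i_out_max      # maximum DC output power, watts
p_in_max = p_out_max / efficiency  # worst-case input power, watts

print(f"Max output power: {p_out_max:.0f} W")
print(f"Worst-case input: {p_in_max:.1f} W")
```

The actual input power is usually lower still, because the load rarely draws the maximum rated current.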
If you want to dig deeper into AC power computation, Wikipedia has everything you need:
AC Power
Switching power factor
Long story short, the problem is that a switching PSU contains a full-bridge rectifier feeding large electrolytic capacitors, so current is drawn from the mains only while the instantaneous input voltage exceeds the capacitor voltage. This produces short, very high current spikes, which makes computing the consumed power a bit tricky.
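A toy simulation makes those spikes visible: sample one mains cycle and count how often the instantaneous voltage exceeds the capacitor voltage. The 90%-of-peak capacitor voltage is an assumed figure for a lightly loaded supply, not something from the original specs:

```python
import math

# Toy model: the rectifier diodes conduct only while |v_in| exceeds the
# reservoir-capacitor voltage. All values are assumed for illustration.
v_peak = 230.0 * math.sqrt(2)  # ~325 V peak for 230 V RMS mains
v_cap = 0.9 * v_peak           # capacitor sits near the peak under light load

samples = 1000
conducting = 0
for n in range(samples):
    v_in = v_peak * math.sin(2 * math.pi * n / samples)  # one full mains cycle
    if abs(v_in) > v_cap:
        conducting += 1

print(f"Diodes conduct for roughly {100 * conducting / samples:.0f}% of the cycle")
```

All the charge has to flow in during that small conduction window, so the current peaks are several times the average current, and a simple RMS-voltage-times-rated-current product badly overestimates the real power.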