Q1. They do exist, but they are (much) more expensive. In most cases you can measure the battery voltage once and then assume that voltage stays fixed while you measure current. Or you can buy a meter with a dual display, but those typically aren't cheap, and most can't measure an arbitrary pair of quantities - for example, some can only measure voltage and frequency at the same time. Instead, I'd suggest keeping the nice meter for accurately measuring most things and buying one of the cheap Mastech £3 meters to measure the second quantity. They actually do pretty well, most within 0.5% accuracy, and it's no big deal if you damage one.
Q2. I've heard good reviews of this one; it's designed for model aircraft.
You should be able to compute the scaling factor. The shunt resistor converts the current to a voltage according to Ohm's law. From there, you should know what gain you have into the A/D and what range the A/D is using.
For 1%, you probably do need to do some calibration. A large enough known voltage source with a known resistor will give you a known current. You can make the current as accurate as the resistor and your ability to measure the voltage across it. With a 1/2 % resistor and any reasonable voltmeter (it has to be good to 1/2 % minimum), you can know the current to 1%, then store that and the zero reading in EEPROM and correct from those on the fly each reading. Be aware that some of this might drift with temperature, so you want to calibrate at your center temperature or specify a narrow range.
Added:
Component values and amplifier offsets vary over temperature. I was assuming a two point calibration, which can always be mathematically reduced to
OUT = IN*m + b
where m is the gain adjustment and b the offset adjustment. Since both gain and offset are functions of temperature, any one set of m and b values is only valid at the particular temperature the measurements were made to derive them. If this calibration temperature is in the middle of your usage range, then the actual temperature will never be more than 1/2 the range off of the temperature the unit was calibrated at. This may be good enough and not require temperature compensation. If instead you set m and b to calibrate the unit at one end of the temperature range, then the actual temperature at usage time could be the full range off from the calibration temperature, making the worst case error higher.
Since you mentioned an A/D, you will have the measured values in digital form. This allows the calibration equation above to be applied digitally. It also means the m and b values have to be stored in non-volatile memory somehow. The obvious answer is the EEPROM of the same processor receiving the A/D readings. Calibrating digitally and storing the calibration constants in EEPROM is cheaper and better than ancient methods like trimpots. Trimpots cost real money, take board space, and themselves drift with time and temperature. On the other hand, most microcontrollers come with non-volatile memory, and usually have enough code space left over to perform the calibration computation at no additional cost. Even if not, stepping up to the next larger micro is usually a smaller cost increment than adding a trimpot.
As for AC measurements, why do you need them? Current shunts work at DC, so you should be able to calibrate the system at DC unless you have deliberately AC-coupled the signal for some reason.
I would use a precision voltage reference such as the LT1019.