Electronic – AC adapter: Input power vs Output power

Tags: ac, ac-dc, adapter

When I look at, let's say an iPad AC adapter, the input/output rating on the charger states:

Input: 100-240V 0.45A (AC)
Output: 5.1V 2.1A (DC)

I know that the input and output ratings are maximum values. The AC voltage in my country is 230V. Through a simple calculation, I can deduce the following (correct me if I am wrong):

Input power: 103.5W
Output power: 10.71W
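
A minimal Python sketch of that arithmetic, using the label values and my 230V mains (nothing here beyond multiplying the nameplate numbers):

    # Nameplate ratings from the adapter label
    V_IN_RMS = 230.0   # my local mains voltage (V AC, RMS)
    I_IN_MAX = 0.45    # maximum rated input current (A)
    V_OUT = 5.1        # rated output voltage (V DC)
    I_OUT = 2.1        # rated output current (A)

    input_product = V_IN_RMS * I_IN_MAX   # 103.5 - naive V x I product at the input
    output_power = V_OUT * I_OUT          # 10.71 - output power (W)

    print(f"Input V x I:  {input_product:.2f}")
    print(f"Output power: {output_power:.2f} W")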

Now, my question is this: what is the true power drawn from the wall socket? Is it 103.5W or 10.71W? If it is 103.5W, then I presume the iPad adapter is 10% efficient?

Best Answer

Undoubtedly the charger you have is a switching regulator. I say this because that is what all modern chargers appear to be, and the input voltage range is wide enough to make this assumption valid. It's not a big charger - a roughly 10 watt output means it is small in my book - and more than likely it will be based around the following: -

  • Raw AC voltage is rectified to DC (the peak will be about 338V DC on a 240 Vac input and about 140V DC on a 100 Vac input - see the sketch after this list)
  • This gets smoothed by a capacitor - probably in the region of 220 uF (rated at 450V)
  • A switching circuit then converts this high DC voltage down to 5.1 Vdc
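
The peak figures in the first bullet are just the RMS-to-peak conversion for a sine wave (multiply by the square root of 2, ignoring diode drops and ripple). A minimal Python sketch:

    import math

    def peak_dc(v_rms: float) -> float:
        # Peak of the rectified mains, ignoring diode drops and ripple
        return v_rms * math.sqrt(2)

    for v_rms in (100, 240):
        print(f"{v_rms} Vac RMS -> about {peak_dc(v_rms):.0f} V DC peak")
    # 100 Vac -> ~141 V and 240 Vac -> ~339 V, matching the rounded
    # 140 V / 338 V figures above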

As a thought experiment, imagine you connected the AC input of the charger to a 140V DC supply and the charger's output to a load resistor that took 2.1 Adc (a 10.71 watt load). If the power conversion efficiency in the charger was 80%, you would expect to see about 14 watts taken from the 140V input DC supply. This means an input current of about 100 mA.

Input power = 140 Vdc x 0.1 Adc = 14 watt

Output power = 5.1 Vdc x 2.1 Adc = 10.7 watt
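
Here is that thought experiment as a few lines of Python (the 80% efficiency is, as above, just an assumed round number for a small switcher):

    V_OUT, I_OUT = 5.1, 2.1   # charger output rating
    EFFICIENCY = 0.80         # assumed conversion efficiency
    V_IN_DC = 140.0           # hypothetical DC input (~ rectified 100 Vac)

    p_out = V_OUT * I_OUT     # 10.71 W delivered to the load
    p_in = p_out / EFFICIENCY # ~13.4 W, call it 14 W from the 140 V supply
    i_in = p_in / V_IN_DC     # ~0.096 A, i.e. about 100 mA

    print(f"Output power:  {p_out:.2f} W")
    print(f"Input power:   {p_in:.1f} W")
    print(f"Input current: {i_in * 1000:.0f} mA")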

So, when you connect it to an AC supply of 100V AC RMS, why might a current as high as 0.45 A flow? To understand this you have to recognize that this device's AC input current (as measured through an RMS-measuring ammeter) is not representative of real power into the device. Unlike DC circuits (where it would be representative of real power), AC circuits like this can draw very non-linear (non-sinusoidal) currents whose RMS value can be quite high compared to the "useful" current.

This means you can't assume that the input power to the device is Vac x Iac. The device (more than likely) has a bridge rectifier and smoothing capacitor, and the current drawn will be almost like a spike of a few milliseconds near each voltage peak, i.e. every 10ms on a 50Hz supply. Without doing the maths, this could mean that the input RMS current is twice the "useful" current: -

[Figure: full-wave rectifier with smoothing capacitor - the input current is drawn as narrow spikes near each peak of the mains voltage waveform]

This would take your "useful" input current of 100 mA up to an RMS reading of 200 mA - a bit closer to the stated 450 mA.
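
To put a rough number on that effect, here is a toy model (my own simplification, not a simulation of the real rectifier): treat the input current as a rectangular pulse that flows for a fraction d of each cycle. Holding the average ("useful") current fixed, the RMS works out to i_avg / sqrt(d), so a 25% conduction window already doubles the ammeter reading:

    import math

    def rms_of_pulse(i_avg: float, duty: float) -> float:
        # A rectangular pulse of duty d needs height i_avg / d to keep the
        # same average, and its RMS is then i_avg / sqrt(d).
        return i_avg / math.sqrt(duty)

    i_useful = 0.100  # the ~100 mA "useful" current from the thought experiment
    for duty in (1.0, 0.5, 0.25, 0.1):
        rms_ma = rms_of_pulse(i_useful, duty) * 1000
        print(f"conduction {duty:4.0%}: RMS = {rms_ma:.0f} mA")
    # conduction 25%: RMS = 200 mA - the "twice the useful current" case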

Also to be considered is inrush current - the manufacturer's rating of 0.45A may include some measure of inrush current in this figure, but we don't really know.

Remember also that the 0.45A rating will be most valid when the input AC voltage is at 100 VAC: for a roughly constant input power, the lower the input voltage, the higher the input current. Multiplying the rating by 230V (as per your calculation) therefore overstates the input power even further.