A basic question about amps, watts, and charging batteries

Tags: amperage, basic, batteries, battery-charging, watts

Why do we speak of watts (and not amps) when we refer, for example, to the "7.5W Wireless Charger for iPhones," but when we talk about cord-based USB charging, the specs typically refer to amps (and not watts)? For example, one page explains that the "USB spec is 500 mA maximum current. The charger provided with your iPhone can provide up to 1000 mA."

The difference between amps, watts, and volts is something I have struggled to understand since high school, which was a long time ago for me. I've re-read the standard references like Wikipedia, but I'm still not getting it.

Can someone help me understand these concepts by demonstrating how I would apply them to compare the performance of a 7.5W wireless iPhone charger and a 1000 mA corded USB charger?

Thanks in advance for your help and patience.

Best Answer

The reason for the difference is that it's a bit of an apples-and-oranges comparison.

With a wireless charger it's (hopefully!) safe to assume that the whole chain is reasonably efficient. In the end, what you care about is how much energy is stuffed into your battery, not how many electrons have passed through, so the volts and amps don't matter as much; and for the power being transferred wirelessly, they aren't even in play.
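To see why the individual volts and amps matter less than the power, here's a minimal Python sketch. The voltage/current pairs are made up for illustration, not actual wireless-charging specs; the point is that very different combinations all multiply out to the same 7.5 W:

    # Different volt/amp combinations that all deliver roughly 7.5 W.
    # These pairs are illustrative, not real charger specifications.
    for volts, amps in [(5.0, 1.5), (9.0, 0.833), (12.0, 0.625)]:
        watts = volts * amps
        print(f"{volts:4.1f} V x {amps:5.3f} A = {watts:.1f} W")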

For USB, you can fully specify the power by specifying the current, because USB is (or should be) a fixed 5 volt system and watts = volts * amps. So a 1000 mA charger will supply up to 1000 mA at 5 V, which means up to 5000 mW (or 5 W, but I didn't want to make you do all the conversions until the end).
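Putting that together, here's a short Python sketch comparing the two chargers from the question. The battery capacity and efficiency figures below are assumptions for illustration, not specs from the question:

    # Power: watts = volts * amps
    usb_watts = 5.0 * 1.0        # 5 V * 1000 mA (= 1.0 A) = 5 W corded USB
    wireless_watts = 7.5         # the wireless charger is specified in watts directly

    # Rough charge-time estimate. Both numbers below are assumptions:
    battery_wh = 12.0            # assumed battery capacity in watt-hours
    efficiency = 0.8             # assumed fraction of charger output reaching the battery

    for name, watts in [("corded USB", usb_watts), ("wireless", wireless_watts)]:
        hours = battery_wh / (watts * efficiency)
        print(f"{name}: {watts:.1f} W, roughly {hours:.1f} h to charge")

Under those assumptions the 7.5 W wireless charger comes out ahead of the 5 W corded one, simply because it pushes more energy per second into the battery.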

I'm not sure how to help you with the amps/watts/volts problem -- watts = volts * amps if you're talking electricity, but because power is a much more fundamental physical quantity than anything electric, watts also come out of force * speed, or torque * rotational speed, or any number of other combinations of physical quantities that multiply out to power.
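For instance, the same unit falls out of an electrical and a mechanical calculation. A trivial sketch with arbitrary numbers, just to show the units line up:

    # Watts measure power regardless of where the power comes from.
    electrical_watts = 5.0 * 1.0    # 5 V * 1 A = 5 W
    mechanical_watts = 10.0 * 0.5   # 10 N * 0.5 m/s = 5 W
    print(electrical_watts, mechanical_watts)  # both print 5.0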