Electrical – Why does the phone charger consume more electricity (watts) when delivering less amps (slower charging)?

charger, current, voltage, watts

As far as I understand, charging speed is measured in amps. For example, if I use the Ampere app, it tells me the phone is charging at 2.5 amps, or 1.5 amps. I hope I got that right. Is amps the only measurement I should look at when it comes to charging speed?

Also, I'm really confused about this. Here is the writing on my charger.

Output: 5 V = 2.5 A / 9 V = 2 A … 12.5 V = 1.5 A

Power = Voltage × Amps. So, when delivering…

5 V × 2.5 A = 12.5 watts
9 V × 2 A = 18 watts
12.5 V × 1.5 A = 18.75 watts
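The arithmetic above can be checked with a short sketch (the three voltage/current pairs are taken straight from the charger label):

```python
# Power (watts) = voltage (volts) x current (amps), for each charger mode
modes = [(5.0, 2.5), (9.0, 2.0), (12.5, 1.5)]
watts = [volts * amps for volts, amps in modes]
print(watts)  # [12.5, 18.0, 18.75]
```

Note that the highest-power mode (18.75 W) is the one with the *lowest* current on the label, which is exactly the confusion in the question.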

I've looked at other phone chargers and haven't seen anything like this. They usually list only one rating, like 5 V = 1 A.

I think the reason behind these three different ratings is quick-charge technology. I've heard that this technology "decides" how fast to charge based on temperature and battery level. Is that the case?

Also, since electricity bills are based on how much energy (watts over time) you use, why does the charger use more watts when charging slower and fewer watts while charging faster? Why does it raise the voltage while delivering fewer amps, thus causing more electricity consumption?

This really doesn't make any sense to me, so if someone could explain these ratings, I would be grateful.
(I know that phone chargers don't consume that much electricity, money-wise. It's just that I'm trying to understand how electricity works, but the more I try to understand, the more confused I get.)

Best Answer

First, these are OUTPUT ratings. The charger doesn't consume the stated power; it can POTENTIALLY DELIVER it. The device connected to the charger determines how much to draw, depending on its design, the negotiated mode, and the battery's state of charge (a nearly full battery draws less and less current as it charges up). The charging process is not driven by the stated ratings; they just define the limit on what the device can take as INPUT.

Second, charging "speed" is not measured by input amperes. It is determined by the output of the actual Li-Ion charger circuit inside the smartphone. That internal charger CONVERTS the input power into battery charging current with roughly 95% efficiency, and the available current scales in inverse proportion to the input voltage. So the 12.5 V mode with a 1.5 A maximum source gets converted down to the 4.2 V charging voltage typical of a Li-Ion cell, roughly a 3:1 step-down, potentially yielding about 4 to 4.5 A of battery charging current (1.5 A × 3), which delivers a very "fast charge" if the battery can handle it. So the phone is not charging slower with a 1.5 A input; it is charging faster (at 4+ A) and therefore draws more power overall.
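The step-down conversion described above can be sketched numerically. This is a simplified model using the answer's own figures (4.2 V battery voltage, an assumed 95% converter efficiency), not a real converter design:

```python
def battery_charge_current(v_in, i_in, v_batt=4.2, efficiency=0.95):
    """Max battery charging current a step-down converter can supply.

    Power in (minus losses) equals power out: v_in * i_in * eff = v_batt * i_batt.
    v_batt=4.2 V is a typical Li-Ion charging voltage; efficiency is assumed.
    """
    return v_in * i_in * efficiency / v_batt

# The 12.5 V / 1.5 A mode yields far more battery current than 5 V / 2.5 A:
print(round(battery_charge_current(12.5, 1.5), 2))  # ~4.24 A
print(round(battery_charge_current(5.0, 2.5), 2))   # ~2.83 A
```

This is why the "low amp" 12.5 V mode is actually the fastest: the amps on the label are measured on the cable, not at the battery.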

Third, your charger has three possible modes of operation. The switch between modes is negotiated between the device and the charger at connection time, using one of the available "fast charge" protocols (USB PD, Qualcomm QC, etc.), so you are right about that part. Except that it is not based on temperature or battery level; the mode is negotiated once and stays. The power delivered in each mode has nothing to do with the other modes; they are standard, (nearly) INDEPENDENT operating conditions. So your question rests on a few false premises.
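The negotiation can be illustrated with a toy sketch. This is NOT the real PD/QC handshake (those are wire-level protocols with many more steps); it only shows the idea that both sides agree once on the highest-power mode they share. The device's supported set here is a made-up example:

```python
# Modes as (volts, amps) pairs, as advertised on the charger label
charger_modes = {(5.0, 2.5), (9.0, 2.0), (12.5, 1.5)}

# Hypothetical phone that supports only the first two modes
device_modes = {(5.0, 2.5), (9.0, 2.0)}

# Negotiate once at connect time: pick the common mode with the most power
common = charger_modes & device_modes
best = max(common, key=lambda m: m[0] * m[1])
print(best)  # (9.0, 2.0) -> 18 W, the best this pair can agree on
```

After this one-time agreement the charger holds the chosen voltage, and any throttling for temperature or battery level happens inside the phone, not by renegotiating the mode.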