Connecting a higher-current power supply to a lithium-ion charger will damage the battery. Why?

battery-charging, lithium-ion, safety

I am not asking how the battery gets damaged, because that answer is straightforward.

What I am asking is why lithium-ion chargers allow batteries to be damaged by excessive charge current in the first place. My understanding is that all lithium-ion chargers already support current limiting features in response to battery temperature (e.g. as part of "JEITA compliance"):

JEITA guidelines for charging Li-ion batteries in single-cell handheld applications

So why don't the chargers also enforce the maximum charge current, regardless of the wattage of the input power supply?

UPDATE

Putting the question a bit better: considering the huge number of handheld devices that integrate the charger and the battery, why don't charger ICs offer the ability to set a maximum charge current to be enforced? Why don't device manufacturers want to protect their batteries in this way?

[Graph: charge current vs battery lifetime]

Best Answer

Short:

Important: Note that the reduction in maximum charging voltage with battery temperature is only relevant when the battery is in the final-stage CV (constant-voltage) charging mode. For the majority of a charge cycle the battery is in CC (constant-current) mode and the charging voltage is below Vmax, so the charger's CC limit has to be correct for the battery - altering the voltage setpoint of CV mode will not help at this point.
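The JEITA-style behaviour described above can be sketched as a lookup from cell temperature to a (current limit, voltage limit) pair. This is an illustrative sketch only: the function name is made up and the temperature breakpoints and derating values are typical figures assumed for the example, not the official JEITA numbers.

```python
def jeita_limits(temp_c, capacity_ah=2.0):
    """Return (max_charge_current_A, max_charge_voltage_V) for a single
    Li-ion cell at the given temperature, or (0, 0) when charging must
    be suspended entirely. Breakpoints are illustrative, not official."""
    if temp_c < 0 or temp_c > 60:
        return (0.0, 0.0)                   # too cold / too hot: no charging
    if temp_c < 10:
        return (0.5 * capacity_ah, 4.20)    # cold: derate CC current to C/2
    if temp_c <= 45:
        return (1.0 * capacity_ah, 4.20)    # normal window: full 1C, full Vmax
    return (1.0 * capacity_ah, 4.10)        # warm: derate the CV voltage
```

Note that only the last branch touches the voltage limit, which is exactly why voltage derating does nothing for a battery that is being overcurrented in CC mode.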

Systems which allow the charger to determine battery capacity generally do not damage the battery by overcharging. Where damage does happen, it will generally be because the charger cannot determine battery capacity and assumptions have been made which are then violated by subsequent user or supplier actions, so that a battery of lower capacity than the designer assumed is installed.

I have seen computers which, without damage, charge not only Li-ion batteries of different capacities but also batteries of different terminal voltages in the same machine. I do not know how common this capability is, but it is both sensible and impressive.


Longer:

Lithium Ion / Lithium Polymer batteries are usually charged in two stages - first a constant current (CC) mode where the current is by design limited by the charger and then a constant voltage (CV) mode where the current is limited by the battery.
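The two-stage behaviour can be sketched as a simple mode decision the charger makes from the measured cell voltage and current. This is a hypothetical sketch (the function name and the 4.20 V / C-20 termination figures are assumed typical values, not from the answer above): in CC mode the charger regulates current, in CV mode the battery itself tapers the current, and charging terminates once the taper falls below a threshold.

```python
def charger_setpoint(v_batt, i_batt, capacity_ah, c_rate=1.0,
                     v_max=4.20, term_frac=0.05):
    """Decide the charging mode for a single Li-ion cell.

    Returns (mode, current_A):
      CC   - cell below v_max: charger sources the full C-rate current
      CV   - cell at v_max: charger holds voltage, battery limits current
      DONE - CV current has tapered below ~C/20: terminate charging
    """
    i_cc = c_rate * capacity_ah        # e.g. 1C on a 2.0 Ah cell -> 2.0 A
    i_term = term_frac * capacity_ah   # e.g. C/20 on 2.0 Ah -> 0.1 A
    if v_batt < v_max:
        return ("CC", i_cc)
    if i_batt > i_term:
        return ("CV", i_batt)          # current is set by the battery here
    return ("DONE", 0.0)
```

The key point for this question is the first branch: `i_cc` is a number the charger must get right for the installed battery, and nothing about the input power supply's wattage appears in that calculation.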

The maximum current allowed in CC mode for a given battery is set by the battery manufacturer. This is typically 1C, but in some cases may be as little as C/2, and I have seen 2C quoted. I suspect that 2C may require special magic, as, if it were realistically achievable without compromising cycle life or capacity, more manufacturers would offer it.
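The C-rate arithmetic above is just current = C-rate × capacity. A minimal sketch (hypothetical helper name) for a 2000 mAh cell:

```python
def c_rate_current(capacity_mah, c_rate):
    """Charge current in amps for a given C-rate.
    E.g. 1C on a 2000 mAh cell is 2.0 A."""
    return capacity_mah / 1000.0 * c_rate

# For a 2000 mAh cell:
c_rate_current(2000, 1.0)   # 1C  -> 2.0 A (typical)
c_rate_current(2000, 0.5)   # C/2 -> 1.0 A (conservative)
c_rate_current(2000, 2.0)   # 2C  -> 4.0 A (rarely specified)
```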

If a charger "knows" what the capacity C of the installed battery is, then it can limit CC to the required C-rate. Intelligent battery-charger arrangements can and do do this. It is possible to buy batteries of different capacities for the same computer or other equipment. If the equipment is unable to determine battery capacity, it will necessarily treat the battery as having the designed capacity. Larger batteries will then charge at below the design C-rate, and smaller-capacity batteries will be charged above it and will probably be damaged.
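The failure mode in that last sentence is easy to quantify: when the charger assumes the design capacity, the C-rate actually seen by the installed battery scales inversely with its true capacity. A minimal sketch (hypothetical function name, design figures assumed for the example):

```python
def applied_c_rate(design_capacity_mah, actual_capacity_mah,
                   design_c_rate=1.0):
    """C-rate actually experienced by the installed battery when the
    charger assumes the design capacity. Values above the battery's own
    rated C-rate indicate likely damage."""
    i_charge_a = design_c_rate * design_capacity_mah / 1000.0
    return i_charge_a / (actual_capacity_mah / 1000.0)

# Charger designed for a 2000 mAh pack at 1C (so it always sources 2 A):
applied_c_rate(2000, 3000)   # larger pack:  ~0.67C - safe, just slower
applied_c_rate(2000, 1000)   # smaller pack:  2C    - probably damaging
```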


In very simple terms:
(Very non-technically put - mainly just remember - "don't do it, it's bad" :-) ).

As well as the thermal effects mentioned in the cited article, maximum charge rates relate to the ability to properly "put the lithium where it belongs" in the battery structure. Excessive charge rates can end up with pure metallic lithium where it ought not be, with capacity loss at best and vent-with-flame at worst. Among other things, Li-ion battery lifetime is limited by the structure being mechanically flexed as Li is moved around the cell. LiFePO4 cells avoid this issue by having a permanent olivine structure which the Li is moved in and out of, with a resultant reduction in available energy storage capacity due to the inactive material.
