Electrical – Why does it take so long to trickle charge an AGM battery to the point where it matches the self discharge rate

batteries, battery-charging

I did an experiment where I used a "regular" smart charger, set to 12A max in AGM mode, to charge a 12V 100Ah rated AGM battery to what it told me was 100%. The charger then automatically went into trickle charge mode at 13.5V. I immediately took the battery off that charger and put it on a laboratory power supply unit (lab PSU), manually set to 13.5V (and allowed to go up to 1.5A max).

What I am wondering is: why does it take so long for that battery to "level off" at about 1/2 watt of charge power going in? For example, when I used the lab PSU, it started at about 3 watts of charge power, then over the course of about a week it slowly went down to about 1/2 a watt and didn't go much lower.

So what I am thinking is that it is very difficult to get those last few percent of charge into the battery, and that many modern smart chargers don't bother, perhaps stopping at 90-95% charged.

So does anyone know what exactly is happening in the battery, and why those last few percent take "forever and a day" to go into the battery and "stick"?

Just as a related comment… it is interesting to see what charge power a particular battery will "bottom out" at. 1/2 watt at 13.5V is only 37mA. I think many smart chargers may be programmed to stop at 1% of the Ah rating of the battery, so in this case stopping at 1% of 100Ah = 1A. However, at that point the battery still seems to want to take more charge.
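For anyone following along, here is the simple arithmetic behind those two numbers as a small Python sketch (the 1%-of-Ah cutoff is my guess about how some chargers are programmed, not a documented spec):

```python
# Quick arithmetic behind the numbers above (values taken from the post).
V_FLOAT = 13.5          # float/trickle voltage, volts
P_BOTTOM = 0.5          # "bottomed out" charge power, watts
CAPACITY_AH = 100       # battery's rated capacity, amp-hours

# Current corresponding to 1/2 watt at 13.5 V
i_bottom = P_BOTTOM / V_FLOAT
print(f"0.5 W at {V_FLOAT} V = {i_bottom * 1000:.0f} mA")   # ~37 mA

# A charger that terminates at 1% of the Ah rating would stop here instead:
i_cutoff = 0.01 * CAPACITY_AH
print(f"1% of {CAPACITY_AH} Ah = {i_cutoff:.1f} A")          # 1.0 A
```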

One experiment I want to do very soon: after charging the battery for a week until it is at what I think is a true 100%, drain a fixed amount of energy out of it (let's say 1/2 kWh) using a reasonable load (such as a 150W incandescent lamp) and a Kill-A-Watt meter, then measure how much AC wall power it takes to get it back to that true 100% state of charge about a week later, using that same Kill-A-Watt meter. I suspect running the initial charger for a few hours plus running the lab PSU 24/7 for about a week will add up to maybe 3x to 4x the energy drained (I am guessing 1.5 to 2 kWh). If that is correct, then something seems "out of whack" that it takes that much more power to get the battery fully charged.
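If those guesses hold, the ratios work out as in this quick sketch (the 1.5 to 2 kWh wall-energy figures are guesses, and the run time ignores any inverter or meter losses):

```python
# Rough charge-efficiency estimate for the proposed experiment.
# The 1.5-2 kWh wall-energy figures are guesses, not measurements.
E_OUT_KWH = 0.5                  # energy drained from the battery (kWh)
LOAD_W = 150                     # incandescent lamp used as the load (W)
E_WALL_GUESS_KWH = (1.5, 2.0)    # guessed AC wall energy to recharge (kWh)

run_time_h = E_OUT_KWH * 1000 / LOAD_W
print(f"Discharge run time: about {run_time_h:.1f} hours at {LOAD_W} W")

for e_in in E_WALL_GUESS_KWH:
    print(f"{e_in} kWh in / {E_OUT_KWH} kWh out = {e_in / E_OUT_KWH:.0f}x "
          f"(round-trip efficiency ~{E_OUT_KWH / e_in * 100:.0f}%)")
```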

Also, let's assume the first charger really does stop at a true 90 to 95% state of charge. At that point, the wall power consumed is probably much less than 3x to 4x the energy we got out of the battery, and I will record this number. It might take as much (or more) wall power to trickle charge for a week than it does to get the battery to "100%" by the smart charger's "definition". This should be a very interesting experiment.

UPDATE

I've had the battery on the lab PSU for about a week now and it is very close to drawing 1/2 watt (37mA at 13.5V); it is around 41mA. It must be very difficult to get that last little bit of charge in there. However, I suspect keeping the battery at a true 100% SoC is healthier than what a typical smart charger (which is much quicker) calls "100%". It would be interesting to do a capacity drain test using both charge methods (smart charger only vs. smart charger for bulk + 1 week on the lab PSU). I wonder if the capacity will be 5% to 10% more using the lab PSU to top it off.

The last few mA drop VERY slowly. It looks like 37mA might be the lowest it will go without reducing the voltage; 13.5V is where it stayed. That must be very close to the battery's self-discharge rate, but I wonder if the self-discharge rate is a function of how much the battery is charged. For example, if a battery has 1.2 kWh of capacity and the self-discharge rate is a constant 1/2 watt, then in 2400 hours it should be totally dead, but that cannot be right because that is only 100 days (24/7). The self-discharge likely tapers off as the SoC drops.
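That back-of-the-envelope check in Python, assuming (unrealistically) a constant 1/2 watt of self-discharge:

```python
# Self-discharge "sanity check" from the paragraph above.
CAPACITY_WH = 1200      # ~1.2 kWh of stored energy
SELF_DISCHARGE_W = 0.5  # assumed constant self-discharge power (watts)

hours = CAPACITY_WH / SELF_DISCHARGE_W
print(f"{hours:.0f} hours = {hours / 24:.0f} days")   # 2400 hours ~= 100 days

# In practice the self-discharge rate falls as the state of charge drops,
# so the battery would not actually be flat after 100 days.
```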

Here is a pic of four large 6V 230Ah rated AGM batteries in a series/parallel configuration (so 12V 460Ah rated). 124mA at 13.5V works out to only about 62mA through each battery (at about 6.75V each), which is LESS than 1/2 watt of charge power going into each battery. The total charge power is about 1.67 watts, or about 0.42 watts per battery.
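The split works out as follows, assuming the two series strings share the current equally:

```python
# Current/power split for the 2-series x 2-parallel bank of 6 V batteries.
V_BANK = 13.5         # float voltage across the 12 V bank
I_TOTAL = 0.124       # total charge current into the bank (A)
PARALLEL_STRINGS = 2  # two series strings of two 6 V batteries each
BATTERIES = 4

i_per_battery = I_TOTAL / PARALLEL_STRINGS   # each battery in a string carries this
v_per_battery = V_BANK / 2                   # two 6 V batteries share 13.5 V
p_total = V_BANK * I_TOTAL
p_per_battery = p_total / BATTERIES

print(f"Per-battery current: {i_per_battery * 1000:.0f} mA at ~{v_per_battery} V")
print(f"Total charge power: {p_total:.3f} W, about {p_per_battery:.2f} W per battery")
```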

For the benefit of those of you wondering, that laboratory DC power supply is Instek brand, model GPS-3030DD. It has a range of 0-30V (actually about 31.4) and current limiting from 0 to 3A (actually about 3.1). It is rated at a maximum of about 90 watts output but can be coaxed into about 100 watts. I usually run it at 1.5A max (about 50% of its max rating) and I usually blow a fan across the heat sink if the charge current is 1A or more.

[Photos: the four-battery AGM bank and the Instek lab PSU]

Best Answer

What I am wondering is why does it take so long for that battery to "level off" at about 1/2 watt of charge power going in?

As the battery charges, its voltage rises, so the charger has to raise its output voltage to put more charge in. However, if the voltage goes too high, the water in the electrolyte will break down into hydrogen and oxygen. A sealed battery can withstand a bit of this, but if the gas pressure gets too high it will vent and lose electrolyte.

To prevent excessive gassing, the maximum charging voltage is limited to ~2.3V/cell. The battery has some internal resistance, which drops voltage in proportion to current (Ohm's Law), so once the terminal voltage reaches the maximum allowed, the charging current must fall as the battery's internal voltage approaches that limit. The result is an exponential drop in charging current over time, similar to charging a capacitor through a resistor.

At the same time the battery also self-discharges, due to various unavoidable chemical reactions. This is why the current never quite drops to zero. The combination of charge current reduction and self-discharge results in an exponential curve that levels off at a current above zero. At that point the battery is already at 100% charge, and any further 'charging' current is just maintaining the full charge state.
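As a rough illustration (not a measurement), that taper can be modeled as an exponential decay toward a self-discharge floor. The starting current, floor, and time constant below are made-up values chosen only so the curve levels off after about a week, similar to the battery in the question:

```python
import math

# Illustrative model: charge current decays exponentially toward a floor
# set by self-discharge. These are not measured battery parameters.
I_START = 0.222     # ~3 W at 13.5 V (A)
I_FLOAT = 0.037     # ~0.5 W at 13.5 V, the self-discharge "floor" (A)
TAU_HOURS = 36.0    # assumed time constant of the exponential taper

def charge_current(t_hours: float) -> float:
    """Charge current (A) after t_hours at a fixed float voltage."""
    return I_FLOAT + (I_START - I_FLOAT) * math.exp(-t_hours / TAU_HOURS)

for day in range(0, 8):
    i = charge_current(day * 24)
    print(f"day {day}: {i * 1000:5.1f} mA  ({i * 13.5:.2f} W)")
```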

if 2 identical batteries have 2 different bottom out charge rates (let's say one is 37mA and the other is 40mA at 13.5V), what (if anything) can we conclude from that?

We can conclude that one battery has a lower self-discharge rate than the other. This could be caused by fewer impurities, a different temperature, a different battery size, or a different type of battery (e.g. calcium-lead vs. antimony-lead). In a 100Ah battery the difference between 37mA and 40mA is only 3mA, which is insignificant compared to its capacity.

since it takes too long to bottom out, does that mean that most (if not all) smart chargers don't do it for speed reasons, and if so, does that mean that a battery charged with a smart charger is not really 100% charged?

Yes. The time taken to put in that last 1% (or 5%, or even 10%) probably isn't worth it.

However 'smart' chargers often have a multi-stage charging strategy that raises the voltage a little higher to speed up the 'absorption' stage. This may cause some gas production (which in a sealed battery recombines once the charging cycle is finished), but must not be continued for too long or the battery will start to lose electrolyte. Once the absorption stage is over the charger either shuts off or drops to a lower 'float' voltage which charges very slowly (if at all). With this technique the battery can get very close to full charge in a reasonable time period.
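For illustration only, here is a skeleton of that bulk/absorption/float sequence in Python. The voltages and the 1A cutoff are common rule-of-thumb values for a 12V AGM battery, not settings from any particular charger:

```python
# Simplified sketch of a multi-stage (bulk -> absorption -> float) strategy.
# Thresholds are typical rule-of-thumb values for a 12 V AGM, not a spec.
BULK_CURRENT_A = 12.0       # current limit while bulk charging
ABSORPTION_V = 14.4         # elevated absorption voltage (~2.4 V/cell)
FLOAT_V = 13.5              # float voltage (~2.25 V/cell)
ABSORPTION_END_A = 1.0      # drop to float when current tapers below this

def charger_step(stage: str, battery_v: float, charge_a: float):
    """Return (next stage, voltage setpoint, current limit) for the next step."""
    if stage == "bulk" and battery_v >= ABSORPTION_V:
        stage = "absorption"
    elif stage == "absorption" and charge_a <= ABSORPTION_END_A:
        stage = "float"
    if stage == "float":
        # Hold a lower voltage indefinitely; the current settles near the
        # battery's self-discharge rate, as discussed above.
        return stage, FLOAT_V, BULK_CURRENT_A
    return stage, ABSORPTION_V, BULK_CURRENT_A

print(charger_step("bulk", 12.6, 12.0))        # constant current, voltage still rising
print(charger_step("absorption", 14.4, 0.8))   # current has tapered -> drop to float
```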