Electronic – How to calculate the charging time for a battery from photovoltaic panels

battery-charging, solar-cell

I'm trying to develop a spreadsheet that will work out a power budget for a photovoltaic solar-powered installation. There are 2 panels, which I think output 35 watts each. There are 2 × 200 Ah lead-acid batteries.

I know roughly how much current my installation will draw and how long it will draw it for per day.

I need a formula for the time it will take to recharge the batteries given the current drawn. Alternatively, I'd like to be able to work out (given the panel output), what's the maximum time per day (on average) that I can run my installation without depleting batteries.

I don't know much about battery chemistry or charging so I don't really know how to work this out. Can someone help?

Best Answer

When dealing with battery power and current, you MUST include battery voltage. I'm going to assume that your batteries are 12 volt units.

Furthermore, you need to specify the state of charge (SOC) for your batteries. I'll assume that you will treat your batteries gently and never discharge them by more than 20% - that is, the batteries will never drop below 80% of full charge.

This paper suggests that, for a particular set of lead-acid batteries, the incremental charge efficiency above an 80% SOC runs about 50%. Using this as a guideline, the maximum possible effective charge rate for your system at full sunlight will be $$C = \frac{2\times 35}{12} \times 0.5 = 2.9\ \text{Ah/hr}$$ However.

PV cells have a roughly constant current over a fairly broad voltage range, and maximum power output occurs at the point where the voltage starts to drop off. Depending on your charge circuit, this may cause problems. For instance, if the PV develops 35 watts at an output of 18 volts, this implies a current of ~2 amps. If the PV cells are used as a current source for the batteries, then 2 arrays will supply a total of 4 amps peak, and an efficiency of 50% will give an effective charge rate of $$C = 2\times 2 \times 0.5 = 2\ \text{Ah/hr}$$ rather than 2.9. The discrepancy comes from the assumption in my first equation that all panel power is converted to battery current (2×35/12), which is not true for a simple charger. If the PV cells feed a DC-DC converter with a variable output, this can in principle be compensated for, but now you have to factor in the efficiency of the converter, which may well be in the 80% to 90% range.

Assuming a DC-DC converter in the charge circuit with an 85% efficiency, $$C = \frac{2\times 35}{12} \times 0.85 \times 0.5 = 2.47\ \text{Ah/hr}$$
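The three charge-rate estimates above can be collected into a short calculation, which would drop straight into a spreadsheet. This is only a sketch using the numbers assumed in this answer (35 W panels, 12 V batteries, an assumed 18 V panel operating point, 50% charge efficiency, 85% converter efficiency), not measured values:

```python
# Assumed figures from the answer above - replace with your own data.
PANELS = 2
PANEL_W = 35.0      # rated output per panel, watts
BATT_V = 12.0       # nominal battery voltage
CHARGE_EFF = 0.5    # incremental charge efficiency above 80% SOC
PANEL_V = 18.0      # assumed panel maximum-power voltage
CONV_EFF = 0.85     # assumed DC-DC converter efficiency

# Ideal power model: all panel power converted to charge current.
c_power = PANELS * PANEL_W / BATT_V * CHARGE_EFF        # ~2.9 Ah/hr

# Current-source model: panels drive the battery at their MPP current.
# (The answer rounds 35/18 = ~1.94 A up to 2 A; this keeps the exact value.)
c_current = PANELS * (PANEL_W / PANEL_V) * CHARGE_EFF   # ~1.9 Ah/hr

# DC-DC converter model: power model derated by converter efficiency.
c_converter = c_power * CONV_EFF                        # ~2.5 Ah/hr

print(f"power model:    {c_power:.2f} Ah/hr")
print(f"current source: {c_current:.2f} Ah/hr")
print(f"with converter: {c_converter:.2f} Ah/hr")
```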

You will, of course, have to provide your own estimate of full-sunlight/day equivalent for the PV array. This will have to take into account misalignment of the array due to seasonal shifts in sun elevation (since I assume a 70-watt array is too small to merit a tracking installation), along with estimates of the effect of bad weather.
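To sketch the budget arithmetic the question actually asks for, here is how an effective charge rate (the ~2.47 Ah/hr figure from this answer) combines with a sun-hours estimate to give a maximum daily runtime and a recharge time. The sun-hours, load current, and deficit values below are placeholders you must replace with your own estimates, not recommendations:

```python
# Placeholder inputs - substitute your own estimates.
CHARGE_RATE_AH_PER_HR = 2.47   # effective charge rate (answer's estimate)
SUN_HOURS_PER_DAY = 4.0        # full-sunlight-equivalent hours (placeholder)
LOAD_CURRENT_A = 5.0           # installation draw while running (placeholder)

# Ah returned to the batteries per average day.
daily_charge_ah = CHARGE_RATE_AH_PER_HR * SUN_HOURS_PER_DAY

# Maximum average runtime per day without depleting the batteries:
# daily Ah drawn must not exceed daily Ah returned.
max_runtime_hr = daily_charge_ah / LOAD_CURRENT_A

# Conversely, the time to recharge a given deficit, e.g. 20% of one
# 200 Ah battery (placeholder).
deficit_ah = 40.0
recharge_days = deficit_ah / daily_charge_ah

print(f"daily charge:  {daily_charge_ah:.1f} Ah")
print(f"max runtime:   {max_runtime_hr:.1f} hr/day at {LOAD_CURRENT_A:.0f} A")
print(f"recharge time: {recharge_days:.1f} days for {deficit_ah:.0f} Ah deficit")
```

The same two divisions are what your spreadsheet would implement, with each input in its own cell so the "grain of salt" factors are easy to vary.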

The uncertainties associated with weather, PV performance, charge efficiencies, etc., mean that you MUST take all of the above numbers with a grain of salt. Quite specifically, a number like 2.47 is misleadingly precise, and if you use such numbers without constant awareness of just how imprecise they really are (despite the apparent precision of 3 significant figures), you will get yourself in deep trouble.

ETA - Examples of why these equations aren't precise. I've used 12 volts as a power conversion voltage, while actual lead-acid charge voltages usually run 13 to 14 volts. Meanwhile, I don't actually know the operating point of the PV power number, and PV cells actually put out slightly higher current at lower voltages. Depending on device type, PVs do not necessarily have a linear response to different sunlight levels, so calculating effective response during cloudy periods is not simple if you want high precision.
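A quick way to see the size of the charge-voltage effect mentioned above is to evaluate the same simple model at 12 V versus a realistic 13-14 V lead-acid charge voltage. The 70 W array and 50% charge efficiency are the answer's assumptions:

```python
# Sensitivity of the power-model charge rate to the assumed
# charge voltage (answer's assumptions: 70 W array, 50% efficiency).
TOTAL_W = 70.0      # two 35 W panels
CHARGE_EFF = 0.5

for volts in (12.0, 13.0, 14.0):
    c = TOTAL_W / volts * CHARGE_EFF
    print(f"{volts:4.1f} V -> {c:.2f} Ah/hr")
```

The spread from ~2.9 down to ~2.5 Ah/hr, roughly 15%, comes from the voltage assumption alone, before weather or PV tolerances are considered.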