How to compute the charge of a required battery if I know average power usage of device

batteries, battery-charging, power, solar-cell

I have a device that draws 0.6 watts on average. I want to size a battery that will keep this device running for 17 hours straight. I know that charge (Q) = current × time, and my device has an operating voltage of 5 volts, so my calculation is:

(0.6 W / 5 V) × 17 h = 2.04 Ah ≈ 2000 mAh

I also want to charge the battery from a solar panel during 7 hours of sunlight, so I wanted to compute the wattage the panel needs in order to charge the battery to full capacity in 7 hours:

(2000 mAh / 7 h) × 3.7 V ≈ 1.06 W

(the battery is a 3.7 volt cell)
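As a sanity check, here is the same arithmetic as a quick Python sketch (numbers taken directly from the question):

```python
# Battery capacity needed to run the device for 17 h
power_w = 0.6       # average device power draw
device_v = 5.0      # device operating voltage
runtime_h = 17

current_a = power_w / device_v                   # 0.12 A
capacity_mah = current_a * runtime_h * 1000      # ~2040 mAh
print(f"required capacity: {capacity_mah:.0f} mAh")

# Solar panel wattage to push that charge back in 7 h of sun
battery_v = 3.7
charge_h = 7
panel_w = (capacity_mah / 1000 / charge_h) * battery_v
print(f"required panel power: {panel_w:.2f} W")
```

Note this ignores charging losses, so the real panel would need to be somewhat larger.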

Can someone please verify that my math is correct?

Best Answer

Your question makes a couple of wrong assumptions, and in any case it's much easier to think in terms of power (W) and energy (Wh) here.

If your device uses 0.6 W for 17 hours, its total energy usage is 0.6 W × 17 h = 10.2 Wh. With a small (20%) buffer, you'll want something that stores about 12.24 Wh for a real-life project.

Now, if your battery is Li-Ion based (which is where I assume the 3.7 V comes from), you'll want 12.24 Wh / 3.7 V ≈ 3300 mAh of battery capacity.
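The energy-first calculation above can be sketched in a few lines of Python:

```python
# Energy-based battery sizing, per the reasoning above
power_w = 0.6      # average device power
runtime_h = 17     # required runtime
buffer = 1.2       # 20% safety margin
cell_v = 3.7       # nominal Li-Ion cell voltage

energy_wh = power_w * runtime_h        # 10.2 Wh
buffered_wh = energy_wh * buffer       # 12.24 Wh
capacity_mah = buffered_wh / cell_v * 1000
print(f"battery capacity: {capacity_mah:.0f} mAh")   # ~3300 mAh
```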

The charging question is a little more complicated. Li-Ion batteries do not charge linearly: you can charge them quickly from empty, but you MUST slow down as they approach full, or you risk overheating and damaging the battery.

Given that, you may choose to use an overrated battery (so charging stays in the fast region because the battery never gets near full) or to oversize your solar array (so you charge faster while the battery is mostly empty).
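To make the panel-sizing trade-off concrete, here is a rough sketch; the 85% charge efficiency and the 0.8 derating for the slow constant-voltage taper near full are assumptions for illustration, not figures from the answer:

```python
# Rough solar panel sizing with losses (efficiency figures are assumed)
energy_wh = 12.24     # buffered energy budget (0.6 W * 17 h * 1.2)
sun_h = 7             # hours of usable sunlight
charge_eff = 0.85     # assumed Li-Ion charge efficiency
taper_factor = 0.8    # assumed derating for the CC/CV taper near full

panel_w = energy_wh / (sun_h * charge_eff * taper_factor)
print(f"panel power with losses: {panel_w:.2f} W")
```

Under these assumptions the panel needs roughly 2.5 W rather than the ~1 W from the ideal calculation, which is why oversizing one side or the other matters in practice.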