I have virtually no understanding of how batteries work and charge, and have various questions on them

batteries, battery-charging, power, watts

I'm doing research into portable power generators for a university class and need help understanding how charging a device works. From my research, I've found that certain energy-harvesting methods can yield an output of about 0.5 mW of electricity. If you were to try to charge a device with this output, what would happen? For example's sake, let's say we were trying to charge my phone's battery. The battery is a 3200 mAh, 3.8 V Li-Ion battery, and the default charger is rated at 5.3 V, 2 A.

On such a device, how does the amount of power being supplied affect charging? Is there a minimum number of watts needed before the device will charge at all? And what do the battery and charger voltage ratings mean?

Best Answer

Different chemistries require different charging techniques, but in general, as long as your source voltage is higher than your battery voltage, charging will happen. If the source voltage is lower than the battery voltage, charge will instead be drawn out of the battery. That's why your charger is 5.3 V: your battery sits at about 4.2 V when fully charged.

So all we need is to maintain voltage. What does current do?

You'll notice that your battery is rated at 3200 mAh, which is 3.2 Ah. What this means is that if you fully discharge this battery over the period of one hour, the current drawn will be 3.2 A. Similarly, if you want to fully charge this battery from empty (around 2.7 V) and do it in one hour, you'll need to supply 3.2 A.
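To make that capacity-to-time relationship concrete, here's a minimal Python sketch (the 1.6 A figure is just an illustrative half-rate current, not something from the question, and it ignores losses and the usual CC/CV taper):

```python
capacity_ah = 3.2  # battery capacity: 3200 mAh = 3.2 Ah

def hours_to_charge(current_a):
    """Idealised charge time in hours: capacity divided by charge current."""
    return capacity_ah / current_a

print(hours_to_charge(3.2))  # 1.0 hour at 3.2 A
print(hours_to_charge(1.6))  # 2.0 hours at half the current
```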

So, what happens if you can only generate 0.5 mW? Well, we still need to keep the voltage above the battery's 4.2 V to charge, so let's take your charger's 5.3 V. At 5.3 V, 0.5 mW lets us supply about 0.094 mA of current, which is 0.000094 A.
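That current figure is just P = V × I rearranged to I = P / V; a quick sketch of that step, using the numbers above:

```python
power_w = 0.5e-3   # harvested power: 0.5 mW
voltage_v = 5.3    # charger output voltage

current_a = power_w / voltage_v      # I = P / V
print(f"{current_a * 1000:.3f} mA")  # ~0.094 mA
```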

So, plugging the numbers into Google:

3.2 amp hours / 0.094 milliamp = 3.88355924 years

So, using your charger, given 0.5 mW of power and assuming no other losses, it would take almost 4 years to charge that battery.
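Here's the whole estimate in one small script, under the same idealised assumptions (a constant 0.5 mW and no conversion or self-discharge losses):

```python
capacity_ah = 3.2    # 3200 mAh battery
power_w = 0.5e-3     # 0.5 mW harvested
voltage_v = 5.3      # charger voltage

current_a = power_w / voltage_v  # ~0.000094 A available for charging
hours = capacity_ah / current_a  # idealised charge time in hours
years = hours / (24 * 365.25)    # convert to years

print(f"{current_a * 1000:.3f} mA -> {years:.2f} years")  # ~0.094 mA -> ~3.9 years
```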