Electrical – Linear voltage regulator with a solar panel

solar-cell voltage-regulator

I'm wondering how to reason about the losses in a linear voltage regulator connected to a solar panel.

The OC voltage of the panel is around 22V, the maximum power voltage is around 18V, and I'd down-regulate it to 12V for a battery pack.

So far, I've got this:

  • The more current I draw from the panel, the lower its voltage will be. Since the battery can sink as much current as I can provide (it's a tiny panel, let's say 0.5A), the voltage will probably drop significantly. The current will be maximal in this phase but the voltage drop across the LDO will be minimal (let's say 0.5A * 2V = 1W dissipation); see the sketch after this list.
  • As the battery gets charged, it will draw less and less current, and the solar panel voltage will rise. When it rises above the LDO voltage, the LDO will start dissipating more power as heat, however, at that point the current will be minimal.
  • When the battery is charged at a steady voltage (let's say it's 12V), the voltage on the input of the LDO will be near the OC voltage, but the current through it and into the battery will be negligible.
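For concreteness, here is the ballpark arithmetic behind those bullets (the voltages and currents are just my guesses above, not measurements):

```python
# Linear regulator dissipation is simply (Vin - Vout) * Iout.
def ldo_dissipation(v_in, v_out, i_out):
    return (v_in - v_out) * i_out

# Bulk charge: panel dragged down near the battery voltage, full current.
print(ldo_dissipation(v_in=14.0, v_out=12.0, i_out=0.5))    # ~1 W

# Tapering: panel voltage recovering toward 18 V, current falling.
print(ldo_dissipation(v_in=18.0, v_out=12.0, i_out=0.2))    # ~1.2 W

# Fully charged: near the 22 V OC voltage, negligible current.
print(ldo_dissipation(v_in=22.0, v_out=12.0, i_out=0.01))   # ~0.1 W
```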

Does this make sense? Am I missing something?

Edit: I'm not asking for exact calculations, just confirmation that this line of reasoning is OK.

Best Answer

The more current I draw from the panel, the lower its voltage will be. Since the battery can sink as much current as I can provide (it's a tiny panel, let's say 0.5A), the voltage will probably drop significantly. The current will be maximal in this phase but the voltage drop across the LDO will be minimal (let's say 0.5A * 2V = 1W dissipation).

It also depends on how much sunlight hits the panel (notice the different amounts of power in the graph below; it's a different panel, but the same principles apply), which determines the power the panel can provide. Assuming you are giving your panel the full amount of light (usually 1000W/m^2), you'll get some efficiency rating of that amount (usually 10-15%). For a 200cm^2 panel (0.02m^2) this would be 20W of sunlight and anywhere from 2W to 3W of electrical power. (Plug in the numbers for yours; they are usually on a nameplate on the back.)
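As a rough sketch of that estimate (assuming the typical 1000W/m^2 irradiance and 10-15% efficiency figures above; check the nameplate for real numbers):

```python
# Back-of-the-envelope panel output: irradiance * area * efficiency.
# 1000 W/m^2 and 10-15 % are typical assumed figures, not the specs of
# any particular panel.
IRRADIANCE = 1000.0   # W/m^2, "full sun"
AREA       = 0.02     # m^2 (200 cm^2)

for efficiency in (0.10, 0.15):
    p_sunlight   = IRRADIANCE * AREA
    p_electrical = p_sunlight * efficiency
    print(f"{efficiency:.0%}: {p_sunlight:.0f} W of sunlight -> "
          f"about {p_electrical:.1f} W electrical")
```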

The more current you draw (i.e. the heavier the load), the lower the panel voltage and the less power the cell can provide past the knee of its curve. Take a look at the graph below: for each light level there is an I-V curve and a power curve in the same color. The power (and efficiency) goes to near zero if you're drawing too much current.
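The graph itself is not reproduced here, but a crude single-diode model (the parameters below are invented for illustration, not taken from the graph) shows the same shape: the current is nearly flat up to the knee, after which voltage and power collapse if the load demands more current than the panel can supply.

```python
import math

# Crude single-diode model of a small "12 V" panel. All parameters are
# made-up illustrative values.
I_PH  = 0.5      # A, photocurrent (roughly the short-circuit current)
I_0   = 1e-9     # A, diode saturation current per cell
N     = 1.2      # diode ideality factor
V_T   = 0.0257   # V, thermal voltage at ~25 C
CELLS = 36       # series cells, giving an OC voltage around 22 V

def panel_current(v_panel):
    """Output current at a given terminal voltage (clamped at zero)."""
    v_cell = v_panel / CELLS
    i = I_PH - I_0 * (math.exp(v_cell / (N * V_T)) - 1.0)
    return max(i, 0.0)

for v in (0, 6, 12, 15, 18, 20, 21, 22):
    i = panel_current(v)
    print(f"V={v:>2} V  I={i:.3f} A  P={v * i:5.2f} W")
# Power peaks near the knee (a few volts below the OC voltage) and falls
# toward zero when the load pulls the voltage all the way down, i.e.
# when it tries to draw more current than the panel can source.
```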

Using an LDO can work, assuming your cell is providing more power than your load is sinking, which means you'll need a much larger panel. You'll be operating at whatever voltage the LDO input settles at rather than at the panel's maximum-power point, so your efficiency will not be optimal. Typically batteries are charged with constant current first, then constant voltage.
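Here is a sketch of that sizing check; the panel output, operating voltage, and charge current below are assumed numbers purely for illustration:

```python
# Feasibility check: can the panel's best-case output cover the power
# going into the battery plus the power the LDO burns as heat?
P_PANEL_MAX = 8.0    # W, optimistic panel output in full sun (assumed)
V_BATT      = 12.0   # V, regulator output / battery voltage
V_PANEL     = 18.0   # V, panel voltage while charging (assumed)
I_CHARGE    = 0.4    # A, desired constant-current charge rate (assumed)

p_into_battery = V_BATT * I_CHARGE
p_lost_in_ldo  = (V_PANEL - V_BATT) * I_CHARGE
p_required     = p_into_battery + p_lost_in_ldo

print(f"battery: {p_into_battery:.1f} W, LDO loss: {p_lost_in_ldo:.1f} W, "
      f"total: {p_required:.1f} W, panel best case: {P_PANEL_MAX:.1f} W")
# If p_required exceeds what the panel can deliver, the panel voltage
# collapses below the LDO's dropout and the charge current falls,
# hence the advice to oversize the panel.
```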

[Graph: I-V and power curves of a solar panel at several light levels; not reproduced here.]

As the battery gets charged, it will draw less and less current, and the solar panel voltage will rise. When it rises above the LDO voltage, the LDO will start dissipating more power as heat, however, at that point the current will be minimal.

Correct, but the current won't always be minimal; it will drop from its nominal charging current to the 'float charge'. So at some point your LDO could be dissipating a lot of heat, depending on the voltage of your cell.
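To illustrate that with some made-up snapshots of a charge cycle (the voltages and currents below are invented, not measured), the drop across the LDO grows while the current is still substantial, so dissipation can actually peak mid-charge:

```python
V_BATT = 12.0  # regulator output / battery voltage

snapshots = [
    # (panel / LDO input voltage, charge current): invented values
    (14.0, 0.50),   # bulk charge: panel dragged down, full current
    (17.0, 0.35),   # battery filling up, panel voltage recovering
    (20.0, 0.15),   # taper charge
    (22.0, 0.02),   # float: near open-circuit voltage, trickle current
]

for v_in, i in snapshots:
    p_ldo = (v_in - V_BATT) * i
    print(f"Vin={v_in:4.1f} V  I={i:.2f} A  LDO dissipation={p_ldo:.2f} W")
# Note the middle rows: the drop across the LDO is already several volts
# while the current is still a few hundred mA, so the worst heating can
# occur in the middle of the charge, not at either end.
```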

When the battery is charged at a steady voltage (let's say it's 12V), the voltage on the input of the LDO will be near the OC voltage, but the current through it and into the battery will be negligible.

The voltage on the input of the LDO will depend on what part of the I-V curve the panel is on, which is determined by the amount of sunlight, the panel itself, and how much current you're drawing. It would be best to plan on the LDO dissipating most of the power passing through it and to put a hefty heatsink on it.

While it's true that near the float charge the LDO will be dissipating minimal power, the system should be sized for the worst-case condition so the LDO doesn't burn up. If the max voltage on the cell is 36V and the max current draw is 1A, then that is (36V - 12V) * 1A = 24W into the LDO.
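Putting that worst-case sizing into a sketch (36V and 1A are the hypothetical worst-case figures above; the second line uses the asker's rough 22V / 0.5A panel):

```python
# Worst-case LDO dissipation: assume the maximum input voltage coincides
# with the maximum charge current, and size the heatsink for that.
def worst_case_dissipation(v_in_max, v_out, i_max):
    return (v_in_max - v_out) * i_max

print(worst_case_dissipation(36.0, 12.0, 1.0))   # 24.0 W: needs a serious heatsink
print(worst_case_dissipation(22.0, 12.0, 0.5))   # 5.0 W: still needs a heatsink
```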