I am a novice. I want to correctly size a solar panel and battery for the following application. I will be powering two 6 watt CFL lights and a 40 watt incandescent light. I am planning on using a 400 watt inverter from Northern Tool. I need the inverter because of the 40 watt incandescent bulb, which will be used as a heat source. The 40 watt bulb will be on 24 hours a day when the temperature is below 32 F. The two 6 watt CFL lights will run a maximum of 5 hours a day year round. I have done research on the internet, but so far some of the information I have obtained is contradictory. I live in southwestern Pennsylvania. What I need to know is the size of the solar panel in watts and the size of the battery in Ah.
Assume the power input to the bulb is 10 Watts.
Assume for now 100% efficiency from battery output to bulb input.
How efficiently the battery stores the energy supplied to it will vary with battery chemistry and how well the charger is designed. Best case, using a Lithium battery of some sort, over 90% efficiency may be achievable. Lower or much lower efficiency is often achieved in practice.
Efficiency of energy provided at the battery terminals compared with energy out of the PV panel will depend on the interface design and will also vary with battery state of charge.
Power output from the panel at any moment (Wp) compared to the maximum power the panel can make under ideal conditions (Wmpp) will vary with insolation level (sunshine level), panel condition, atmospheric conditions and more.
SO overall, a say 100 Watt panel will produce 100 Watts in full sunshine when new. Most continental US locations get the equivalent of 2 to 3 hours of full sunshine per day in winter and 5 to 6 hours of equivalent full sunshine in summer.
ie you get roughly 200 to 600 Watt-hours per day depending on season.
With the very best interface equipment (MPPT, intelligent battery sizing to minimise resistive losses, ...) you may get 95%+ of this energy at the battery terminals and, as above, 90%+ of this actually stored into the battery.
So PV Watts rating x 0.95 x 0.9 x hours_equivalent_per_day = Watt-hours available. Say 85%. Using 80% would be safer and still very optimistic in many cases.
At the start I assumed 100% battery out to bulb in power.
Regardless of load type (which is usually LED in this context), if you want constant brightness as battery voltage varies, or constant "bulb" input, there will be some conversion losses. 90% from battery to bulb or LED would usually be excellent.
So overall PV "nameplate rating" watt-hours to 'bulb' input watt-hours is at best about 75%. Usually less.
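The efficiency chain above can be sketched in a few lines of Python. The three factors are the best-case figures quoted in the text; real systems will often do worse:

```python
# Best-case efficiency chain from PV nameplate rating to bulb input.
# All three factors are the assumed best-case values from the discussion above.
pv_to_battery_terminals = 0.95   # MPPT + intelligent sizing to minimise resistive losses
battery_storage         = 0.90   # charge/discharge round trip, good Lithium chemistry
battery_to_bulb         = 0.90   # conversion for constant "bulb" input

overall = pv_to_battery_terminals * battery_storage * battery_to_bulb
print(f"Overall PV-to-bulb efficiency: {overall:.1%}")   # ~77%, ie "at best about 75%"
```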
When the sun is providing energy, some gains can be had by running the bulb from the panel without battery storage. This gain is useful but still a small part of the total energy needed via the battery. I'll ignore it in the following and it can be factored in later if needed.
From the above:
Watt hours available = (Panel Watts rated) x 75% x Sunshine hours.
Watt hours wanted = Load_Watts x 24.
Rearranging the above -
Panel Watts needed = Load Watts x 24 / (0.75 x Sunshine hours )
= Load_Watts x 32 / Sunshine_Hours
So eg a 10 Watt load in winter with 2 sunshine hours/day (= hours of equivalent full sunshine):
Panel Watts needed = 10 x 32 / 2 = 160 Watts !!!
10 Watt load in Summer with 6 sunshine hours/day.
Panel watts needed = 10 x 32 / 6 = 53 Watts.
In practice higher Watts will be needed.
Average sunshine hours per day can be found at the wonderful Gaisma site here - this example is for Houston.
Top line is insolation in kWh/m^2/day = sunshine hours/day = hours of equivalent full sunshine. I = January, II = February etc.
2.34 hours/day in January.
5.98 hours/day in July
These are means over many years; any given year, and any day within a month, may vary widely from this. That's weather for you :-)
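Feeding the monthly Gaisma figures into the sizing rule gives the required panel size month by month. Only the January and July values for Houston appear above; the rest would come from the site for your own location:

```python
# Panel size per month from Gaisma insolation figures, using the
# Load_Watts x 32 / Sunshine_Hours rule derived above.
# Only Jan and Jul for Houston are quoted in the text; fill in the rest
# from the Gaisma table for your own location.
houston_hours = {"Jan": 2.34, "Jul": 5.98}

for month, hours in houston_hours.items():
    watts = 10 * 32 / hours
    print(f"{month}: {watts:.0f} W panel for a 10 W continuous load")
# Jan: 137 W, Jul: 54 W
```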
More later ...
Your answer would be approximately correct only if your battery voltage was ~= 220V.
As that is probably not what you had in mind, your answer is probably wrong.
IF you are using a lower voltage battery then you need to use energy rather than current capacity to calculate equivalent amounts at different voltages.
A 420 Watt motor requires 420 Wh (Watt Hours) of energy per hour. That's the easy part :-).
The Wh capacity of a battery is V_battery x Ah_capacity_battery, ie battery energy capacity in Wh = V x Ah.
If voltage is changed energy will be lost in the process.
Call the efficiency factor Zbm = Z_battery_motor, where Zbm < 1.
eg if Zbm = 0.85 then 15% of the energy is lost during conversion.
A figure of Zbm = 0.8 is an OK starting point for conversion from say 12V or 24V to 220V.
Motor energy requirement per hour = Wm = Vbat x Ibat x Zbm, so
Ibat = Wm / Vbat / Zbm
For a 24 V battery Ibat = Wm/Vb/Zbm = 420/24/.8 ~= 22A.
To run the system for one hour you'd need a nominally 22Ah battery. BUT batteries are usually rated at the 10h rate or some other period of some hours, so at the 1h rate they will have much lower capacity. You would need to look at specifications for the battery you had in mind but a factor of say 1.5 is probably not overly pessimistic. So Ah required from 1h operation ~= 1.5 x 22 = 33Ah.
If you want to not go below 50% capacity (which is very wise for lead acid batteries) you need double that so say 66Ah.
So your originally stated 40 Ah battery would run a 420W motor at 220V for about 60 minutes x 40/66 ~= 36 minutes IF it was a 24V battery. A 12V battery with the same assumptions would run the motor for about 18 minutes.
A lot depends on assumptions made.
Above I have used:
Up converter efficiency low voltage to 220V = 80%
Battery 1h Ah rate = Specification sheet rate / 1.5
Discharge depth = 50%
12V or 24V system.
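Putting those four assumptions together, the runtime estimate above can be sketched as:

```python
# Runtime estimate for the 420 W motor example, using the assumptions
# listed above: 80% up-converter efficiency, 1 h capacity = sheet rate / 1.5,
# 50% maximum discharge depth. All four figures are assumptions, not specs.
def runtime_minutes(ah_rated, v_bat, w_motor=420, z_bm=0.8,
                    one_hour_factor=1.5, max_dod=0.5):
    i_bat = w_motor / v_bat / z_bm               # battery current in A
    ah_usable = ah_rated / one_hour_factor * max_dod
    return 60 * ah_usable / i_bat

print(runtime_minutes(40, 24))   # ~36 minutes on a 24 V battery
print(runtime_minutes(40, 12))   # ~18 minutes on a 12 V battery
```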