I have a boat that uses two 13 V, 100 W lights. The output from my alternator runs at 13.8–14.5 V, and it is rated for 90 A. These lights cost $270/pair and seem to burn out quickly: they are factory rated for only 25 hours at 13 V. A technician at a bulb supply store read a chart and told me that running the bulbs at 14 V reduces their lifespan by 60%, so these expensive bulbs now last only 10 hours. They are intended for docking the boat only, not continuous use. The same tech who told me why I was burning the lights up so quickly also said that if I reduced the voltage in that circuit, I would get a huge increase in the lifespan of those same bulbs. Therefore: I would like to reduce the voltage in the circuit to 12 V and still be able to draw the 100 W to run the lights as needed. If the solution is impractical, I will look for other lights, but cutting a new hole in the hull of the boat (for new lights) is my last resort.
Electronic – Need to extend bulb life on 13 VDC, 100 W headlights
resistors
Related Solutions
Incandescent bulbs have a high PTC (positive temperature coefficient) of resistance: the filament goes from about 300 °K cold to 2500–3000 °K at operating temperature.
Measure the string resistance to obtain the cold R per bulb; I expect it to be 1/8th to 1/10th of the 14.1 Ω hot value.
Due to this huge PTC range, the bulbs tend to operate at roughly constant current: adding or removing a few bulbs changes the voltage across each bulb and the power it dissipates, but the current stays about the same.
Thus when you add a linear resistor to the string, the overall power remains roughly the same for small changes, so removing or adding a small power resistor does not affect the total P: as each bulb's voltage drops a bit, it cools down, its resistance falls, and the current stays about the same. It acts like an inrush current limiter (ICL), which operates at around 160 °C and is intended only for limiting surge currents at start-up. These were once used on 100 W bulbs to extend their life, but cost as much to make as the bulb.
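Taking the figures above at face value (a 14.1 Ω hot resistance, and a cold resistance of one tenth of that; assumed for illustration, not measured), the implied turn-on surge can be sketched in a few lines:

```python
# Inrush estimate from the PTC behaviour described above.
# R_HOT is the 14.1 ohm hot figure from the text; the 10:1
# cold/hot ratio is the upper end of the 1/8-1/10 estimate.
V = 13.0
R_HOT = 14.1
R_COLD = R_HOT / 10.0

i_hot = V / R_HOT        # steady-state current, ~0.92 A
i_inrush = V / R_COLD    # turn-on surge, 10x higher with a cold filament
print(f"hot current ≈ {i_hot:.2f} A, inrush ≈ {i_inrush:.1f} A "
      f"({i_inrush / i_hot:.0f}x)")
```

This 8–10x surge at every switch-on is exactly what an ICL is meant to soften.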
Using a diode, however, switches off the voltage for half of each cycle and cuts the power by 50%. If this reduced power is acceptable, consider reversing the diode in one string to balance the load current between the two half-cycles.
The suggested diode method blocks current in one direction of the AC circuit. That reduces the effective power by 50%, because for half of each cycle the current is blocked as effectively as if the wire were disconnected. As commented, that is not a 50% reduction of the effective (RMS) voltage; it is less, because in a resistor power is proportional to the square of the voltage.
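That voltage-versus-power point is easy to verify numerically. A quick check (sampling one cycle of a unit sine wave in Python) shows that half-wave rectification drops the RMS voltage only to about 70.7% while halving the power:

```python
import math

# RMS of a half-wave-rectified sine vs. the full sine: the diode
# blocks the negative half, which halves the power into a resistive
# load but only reduces the RMS voltage to 1/sqrt(2).
N = 100_000
full = [math.sin(2 * math.pi * k / N) for k in range(N)]
half = [max(s, 0.0) for s in full]          # diode blocks one polarity

rms = lambda xs: math.sqrt(sum(x * x for x in xs) / len(xs))
v_ratio = rms(half) / rms(full)             # ~0.707
p_ratio = v_ratio ** 2                      # ~0.5
print(f"RMS voltage ratio ≈ {v_ratio:.3f}, power ratio ≈ {p_ratio:.3f}")
```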
You probably do not see anything more than some loss of brightness, because the mains frequency is probably 60 Hz where you live and every bulb still gets current for half of every 1/60th of a second. The filaments in the bulbs do not cool down that fast.
To keep the brightness intact you should remove half of the bulbs from the string if you add a diode. It is difficult to predict how the lifetime of the bulbs suffers from the changed shape of the current pulses.
The diode must withstand the peak AC voltage and the load current.
The diode does not get hot, because it works like a switch. That is true if the diode is rated for the load current and can be cooled freely by the air.
A series resistor does not block the current entirely: both halves of the AC cycle get through, but part of the total power is dissipated in the resistor instead of the bulbs.
The resistor generates heat. If the resistor is selected properly using Ohm's law, it dissipates exactly as much electrical energy as heat as the bulbs it replaces. If the resistor has about the same dimensions as one bulb (i.e. the same ability to shed heat to the air) but dissipates ten bulbs' worth of power, it will certainly get hot, because its temperature rise over ambient must be roughly ten times that of a single bulb.
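As an illustration of that Ohm's-law sizing (with hypothetical string numbers: 2.5 V, 0.17 A miniature bulbs, chosen only for the example), a resistor standing in for ten removed bulbs would be:

```python
# Sizing a resistor to substitute for removed series bulbs, per the
# Ohm's-law argument above. V_BULB and I_STRING are hypothetical.
V_BULB = 2.5      # volts dropped per bulb
I_STRING = 0.17   # amps through the series string
N_REMOVED = 10

r_sub = N_REMOVED * V_BULB / I_STRING   # ~147 ohm substitute resistor
p_sub = N_REMOVED * V_BULB * I_STRING   # ~4.25 W, ten bulbs' worth of heat
print(f"R ≈ {r_sub:.0f} ohm, dissipating ≈ {p_sub:.2f} W")
```

A resistor the size of one bulb dissipating that 4.25 W (versus 0.425 W per bulb) is what forces the ten-fold temperature rise.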
This text is valid only for incandescent bulbs, which have a filament. LED lights work completely differently.
Best Answer
The lights apparently draw about 8A at 13V if they are rated 100W @13V.
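The numbers behind that estimate, and the voltage that has to be shed, can be checked in a few lines (taking 14.4 V as an assumed mid-range alternator output; the question gives 13.8–14.5 V):

```python
# Per-lamp and total current for two 100 W / 13 V lamps, and the
# voltage to drop to reach the 12 V target from the question.
P_LAMP = 100.0   # watts at rated voltage
V_RATED = 13.0   # volts, lamp rating
V_ALT = 14.4     # volts, assumed mid-range alternator output
V_TARGET = 12.0  # volts, desired lamp voltage

i_lamp = P_LAMP / V_RATED    # ~7.7 A per lamp
i_total = 2 * i_lamp         # ~15.4 A for the pair
v_drop = V_ALT - V_TARGET    # 2.4 V to get rid of
print(f"per lamp ≈ {i_lamp:.1f} A, total ≈ {i_total:.1f} A, "
      f"drop ≈ {v_drop:.1f} V")
```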
You could drop the voltage slightly by using two or three silicon diodes in series. The total current for two will be about 16A, with a surge of perhaps 150A on turn-on. You will need a fairly hefty diode to withstand that. To take one example, consider two Vishay VS-40EPF06PBF diodes in series.
They will typically drop about 1V each at 16A if you keep them reasonably cool.
It looks like they have an excellent chance of withstanding the surge current too (though perhaps not the fault current when a lamp fails).
If you're going to drop 2V at 16A, that's 32W to get rid of. If you can find a fairly large piece of aluminum to bolt them to, you'll be fine.
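A quick sketch of the two-diode option's numbers (again assuming a 14.4 V alternator output, and the ~1 V-per-diode drop at 16 A cited above):

```python
# Heatsink load and resulting lamp voltage for two series diodes,
# each dropping about 1 V at the combined lamp current.
V_DIODE = 1.0    # volts per diode at ~16 A (typical figure from the text)
N_DIODES = 2
I_TOTAL = 16.0   # amps, both lamps
V_ALT = 14.4     # volts, assumed alternator output

v_drop = N_DIODES * V_DIODE      # 2 V total
p_heat = v_drop * I_TOTAL        # 32 W into the heatsink
v_lamps = V_ALT - v_drop         # ~12.4 V at the lamps
print(f"lamp voltage ≈ {v_lamps:.1f} V, heatsink load ≈ {p_heat:.0f} W")
```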
A cheaper and simpler solution is to use a 50 W chassis-mount resistor. A 0.15 Ω resistor such as a 9-1625984-3 from TE is less than $4. At 16 A it will drop 2.4 V and dissipate about 38 W. Again, you'll need a fairly large heatsink or it will die, but the resistor can be allowed to run hotter than the diodes (100°C is okay).
Personally, I'd probably opt for two resistors (one per lamp) if that was convenient for the wiring, and still use the 50 W size. Of course they're going to be double the value each, and the granularity of value choice is a bit better (0.25 or 0.3 Ω, so the drop will be 2 V or 2.4 V each).
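The drops and dissipations for those resistor options can be tabulated directly (using the round 8 A per-lamp and 16 A total figures from above):

```python
# Voltage drop and dissipation for the resistor options discussed:
# one shared 0.15 ohm resistor, or one 0.25 / 0.30 ohm resistor per lamp.
i_lamp = 8.0    # amps per lamp (rounded, as in the answer)
i_pair = 16.0   # amps for both lamps through a shared resistor

options = {
    "0.15 ohm shared":   (0.15, i_pair),
    "0.25 ohm per lamp": (0.25, i_lamp),
    "0.30 ohm per lamp": (0.30, i_lamp),
}
results = {}
for name, (r, i) in options.items():
    v = r * i                   # Ohm's law: drop across the resistor
    results[name] = (v, v * i)  # (volts dropped, watts dissipated)
    print(f"{name}: drop {v:.1f} V, dissipates {v * i:.1f} W")
```

The per-lamp resistors each run at only 16–19 W, comfortably inside the 50 W rating, which is part of why splitting the load is attractive.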
Note that if your lamps are the halogen type, running them below rated voltage may result in darkening of the bulb because the halogen cycle won't operate properly; in this case, though, you're mostly trying to get them back down to rated voltage.