I assume that you are asking how to take the 15.27V to 12.64V declining supply and to produce an efficient steady 12V out.
Solution Summary:
Use a proper battery box - the voltage drop you are seeing is completely unacceptable.
For testing, the drop across the battery contacts forms part of the load, which is OK, BUT it obscures the true battery voltage (because of battery-to-contact voltage drops) and so causes measurement problems.
Proper charging
Your stated charging voltage of 15 Volt is too low. You need just under 18V to fully charge 12 x NimH in series. If you really only have 15V then the batteries will never fully charge and you will not get the full 2000 mAh/cell.
As you state that the starting loaded battery voltage is > 15V the charger MUST be > 15V max - measure it and determine how it changes with load.
Use a constant current load for testing. See below for details.
An LM317 and one resistor will provide this.
Check battery capacity is genuine.
Many batteries do not provide the claimed capacity.
Use a Buck converter.
A buck converter can add maybe 10 to 15% more run time after everything else is sorted out but is pointless until you fix the other major issues.
The most efficient way of converting a varying battery voltage to a lower output voltage is to use a buck converter.
BUT your figures suggest that there is "summat aglae" - something is very wrong indeed.
A 30 ohm load at 15V or less should draw 500 mA max.
2000 mAh / 500 mA = 4 hours.
Discharge rate is 500 mA / 2000 mAh = C/4.
At that level a NimH cell should be at an average of 1.15V or so and fairly flat across most of its range.
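As a quick sanity check, the load figures above can be sketched in Python (the values are the ones assumed in this answer, not measurements):

```python
# Expected load current, run time and C-rate for a 12-cell NiMH pack
# feeding a 30 ohm resistive load, using the answer's assumed figures.
capacity_mah = 2000.0    # claimed capacity per cell
v_pack = 15.0            # upper-bound pack voltage under load
r_load = 30.0            # ohms

i_load_ma = v_pack / r_load * 1000.0   # worst-case (highest) load current, mA
run_time_h = capacity_mah / i_load_ma  # idealised run time, hours
c_rate = i_load_ma / capacity_mah      # discharge rate as a fraction of C

print(f"I_load = {i_load_ma:.0f} mA")          # 500 mA
print(f"run time = {run_time_h:.1f} h")        # 4.0 h
print(f"discharge rate = C/{1 / c_rate:.0f}")  # C/4
```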
The end point you measure depends very much on where you measure the voltage. As you have such a terrible battery holder (dropping about 5V at 500 mA !!!) you should consider this part of the load and measure the voltage at the battery.
If you consider end point to be 1V/cell you should get about 4 hours down to 12V measured at the battery. As the battery contacts are interleaved with the batteries it is hard to tell which is load and which is holder loss and which is battery voltage etc.
The battery holder is a major issue - fix or replace it!
You report
15.27 / 12 = 1.27 V/cell initial
14.39 /12 = 1.2 V/cell at 1 hour
12.64 /12 = 1.05 V/cell at 2 hours
Regardless of the reasonableness of these with time, let's look at the conversion efficiency they represent with a linear regulator.
12/15.27 = 78.6%
12/14.39 = 83.4%
12/12.64 = 94.9%
You can easily get better than 78% with a buck regulator.
You can get better than 83% with only moderate care.
You can get better than 95% only with much effort and care.
So, the start, mid, and end point voltages are such that a buck regulator will help.
BUT if the buck regulator gives 95% out and you would otherwise average 85% it will extend time by only about 95/85 = about 12% more.
Whereas, fixing your battery box and wiring will double the run time.
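The efficiency and run-time comparison above can be reproduced in a few lines (assuming a 95% buck converter versus the ~85% average linear efficiency, as in the text):

```python
# Linear-regulator efficiency (Vout/Vin) at the three reported pack
# voltages, plus the run-time gain a ~95% efficient buck converter
# gives over the ~85% average linear efficiency.
v_out = 12.0
readings = {"start": 15.27, "1 hour": 14.39, "2 hours": 12.64}

for label, v_in in readings.items():
    print(f"{label}: {v_out / v_in * 100:.1f}% efficient")  # 78.6 / 83.4 / 94.9

buck_eff, linear_avg_eff = 0.95, 0.85
gain = buck_eff / linear_avg_eff - 1.0
print(f"buck adds about {gain * 100:.0f}% run time")  # ~12%
```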
That's assuming that the batteries really are giving 2000 mAh. That's something you need to check.
A constant current load can very easily be arranged using an LM317.
Connect an R = V/I = 1.25V/500 mA =~ 2.5 ohm resistor from Vadj to Vout.
Voltage into Vin
Output from Vadj (NOT from Vout).
You now have a constant current load that allows much more consistent testing.
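A minimal sketch of the LM317 sizing above (1.25 V is the LM317's nominal reference between OUT and ADJ; `lm317_cc_resistor` is just an illustrative helper, not a library function):

```python
V_REF = 1.25  # volts, LM317 nominal reference between OUT and ADJ

def lm317_cc_resistor(i_target_a: float) -> float:
    """Resistor (ohms) from Vout to Vadj that programs the desired current."""
    return V_REF / i_target_a

r = lm317_cc_resistor(0.5)   # 500 mA test load
p_r = V_REF * 0.5            # power dissipated in the programming resistor
print(f"R = {r:.1f} ohm, P_R = {p_r:.3f} W")  # 2.5 ohm, 0.625 W
```

Note that the LM317 itself dissipates roughly (Vin − 1.25V) × I, so it will need a heatsink at these currents.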
Note that charging 12 batteries in series will require attention to balancing. Without this you are almost certain to get imbalance problems so that one or two cells fully discharge first under load leading to early failure.
Your reported charging voltage is too low.
NimH cells need about 1.45V each to fully charge.
12 x 1.45V = 17.4V, say 18V.
If your voltage source used for charging does not reach 18V open circuit then your cells are not getting a full charge.
Your reported voltage of 15V/12 = 1.25V/cell is too low.
Change this to 18V and you may get 2 x the result.
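The charge-voltage arithmetic can be sketched as follows (`min_charge_voltage` is an illustrative helper; 1.45 V/cell is the end-of-charge figure used in this answer):

```python
V_END_OF_CHARGE = 1.45  # volts per NiMH cell, as used in this answer

def min_charge_voltage(n_cells: int) -> float:
    """Minimum open-circuit charger voltage for n series NiMH cells."""
    return n_cells * V_END_OF_CHARGE

v_needed = min_charge_voltage(12)
v_per_cell_actual = 15.0 / 12    # what the reported 15 V supply provides
print(f"needed: {v_needed:.1f} V (use ~18 V); "
      f"15 V gives only {v_per_cell_actual:.2f} V/cell")
```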
As a test, charge the batteries in a commercial charger and see what results you get.
See Smart charging circuit for NiMH battery pack where the answer states
In such cases a very reasonable charging strategy is to terminate
charge at 1.45V per cell.
It is reasonable to believe he is referring to "at the cell".
It is worth noting that the BQ2002PN is a FAST charger. You need to ensure it will not burn out your cells. A good charger will switch between slow and fast. For in-circuit charging, design the charge rate to exceed the discharge rate of the application's load, with some margin. It is more than acceptable to use a fixed supply voltage and a resistor to provide a minimal charge rate. Assuming it is not too high, a trickle small enough to avoid excessive self-heating works.
Before low self-discharge cells existed, we made a +12V supply with a diode drop and a resistor as a slow charger for a dozen parallel cells (i.e. "hot and ready", not literally hot). It is cheaper than a smart charger, and we could keep the charge rate very low: lower than most chargers' slow rate, which is still higher (sized for larger capacities) than needed, and wasteful.
In fact I have several Maha and LaCrosse chargers (nice chargers) for NiMH, but they are too smart. When more than one cell is used in series on a load (typically 3 or more), they discharge unevenly; one cell drops below the charger's undervoltage threshold and is flagged as a failed cell. But putting it on a 5 mA source for a minute kicks it back up, and then it works in the smart chargers.
You should refer to Ada's page on Minty Boost. LiPos are currently the popular choice for in-circuit charging, and there are plenty of chargers for them, such as SFE's Lipo Charge Boost. There are plenty of examples out there.
Best Answer
Yes, it's possible.
But it can be done more simply than you propose.
Using the LT1618 you need
2 x NimH cells to meet the minimum Vin spec,
3 cells to meet the maximum switch and duty cycle spec and
4 cells would be wise and still may be marginal with worst case component specs.
5 or 6 cells are really needed to be sure of an in spec design!
Surely not?
See below.
When designing, it is essential to use the worst-case values of data sheet specifications so that any in-spec component will operate as intended. If the design is a one-off and you are prepared to pick amongst a batch of ICs, you may be able to use typical or best-case values, but this is unwise and can lead to problems with other parameters.
An example here is the LT1618 internal transistor switch current rating.
This is rated at 1.5 / 2.1 / 2.8 A minimum/typical/maximum. In many cases you may find that the switch will handle 2.1A acceptably well, but SOME devices may have a 1.5A limit and still be "in spec", so 1.5A should be used for design.
Similarly, NimH cells may provide from about 1V to 1.3V out, so operation with say 4 cells from 4V to 5.2V should be possible. In fact a fully charged NimH cell will have a voltage of about 1.35V when first used, and may approach 1.4V immediately out of the charger. While the 1.4V output will be present only very briefly, it means that e.g. 4 cells can supply ~= 5.6V at no load, and use of a device with an e.g. 5V upper limit may cause failures for reasons which would not be apparent. That does not apply here, but it is the sort of thing that a designer must be aware of.
[As another example, Alkaline cells have a potential of about 1.65V when new, so a 4 cell nominal 6V alkaline battery can measure 6.6V at the terminals when new].
The LT1618 claims to be a constant voltage, constant current boost converter.
For this to be true the load would have to assume a single specific value as
Rload = V_constant/ I_constant.
What it in fact does is implement a traditional CV/CC power supply: output voltage is no more than Vcv when Iload is less than Icc, and Iout is never more than Icc, so voltage out decreases if necessary to limit Iout to <= Icc.
When driving an LED it is usually adequate to simply control Icc and allow Vout to assume whatever value is needed to achieve this. If the LED is removed or goes open circuit Vout would notionally rise indefinitely and a simple secondary voltage control can be used to limit Vout to some value somewhat above what would be adequate to provide Icc in the intended load.
eg in your case, if I_LED_desired is say 0.25A and VLED_nominal = 12V at this current, then limiting Vout to ~= 15V max with no load is adequate, and ILED will be limited to the desired 0.25A when the LED is connected. It is 'desirable' that any output capacitor which is charged to Vout_max when the LED is removed does not contain enough energy to damage the LED when it is connected.
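The CV/CC behaviour described above can be modelled for an ideal supply and a resistive load (a simplification; a real LED is not resistive, but the crossover logic is the same):

```python
def cvcc_operating_point(v_cv: float, i_cc: float, r_load: float):
    """(Vout, Iout) for a resistive load on an ideal CV/CC supply."""
    if r_load >= v_cv / i_cc:      # light load: voltage limit governs
        return v_cv, v_cv / r_load
    return i_cc * r_load, i_cc     # heavy load: current limit governs

# With the 15 V / 0.25 A limits from the example above:
print(cvcc_operating_point(15.0, 0.25, 100.0))  # CV region: (15.0, 0.15)
print(cvcc_operating_point(15.0, 0.25, 40.0))   # CC region: (10.0, 0.25)
```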
Use of the LT1618 is entirely acceptable for this purpose if desired, with Vcv being set to slightly above V_LED_max at the target current. The LT1618 has an operating range of Vin = 1.6V to 18V. A NimH cell has a useful minimum output voltage of about 1V, so as 2 x 1V = 2V > 1.6V, 2 cells is notionally enough.
The IC's switch maximum current limit is 1.5A minimum.
Desired output power >= 12V x 0.3A = 3.6W.
Input power is Pout / efficiency
say use 3.6W/80% to start = 4.5W.
The switch on:off ratio is roughly proportional to Vout:Vin.
At 100% duty cycle a 2V minimum supply could give 1.5A x 2V in = 3W, so 2 cells is not enough.
3 cells gives 3V x 1.5A = 4.5W at 100% duty cycle.
At 3V in, 12V out ton:toff of switch = 12:3 at 100% efficiency = 12/15 = 80% on duty cycle.
Max switch duty cycle = 88% min (page 2) so 3 cells is about enough but somewhat marginal.
4 cells ensures that switch is well inside duty cycle limit, Iin peak can be well inside Iswitch_max and efficiency will be higher at higher Vin.
So a good choice is 4 x NimH cells with operating voltage of 4 x (1V to 1.3V) = 4V - 5.2V.
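The cell-count feasibility check above amounts to comparing best-case input power against the requirement (figures as assumed in this answer: 1.5 A worst-case switch limit, 1 V/cell end-of-discharge, 4.5 W input needed):

```python
I_SW_MIN = 1.5     # A, LT1618 worst-case (minimum) switch current limit
V_CELL_MIN = 1.0   # V, useful end-of-discharge NiMH cell voltage
P_IN_REQ = 4.5     # W, 3.6 W out at an assumed 80% efficiency

for n_cells in (2, 3, 4):
    p_max = n_cells * V_CELL_MIN * I_SW_MIN  # input power at 100% duty cycle
    verdict = "enough" if p_max >= P_IN_REQ else "NOT enough"
    print(f"{n_cells} cells: {p_max:.1f} W max -> {verdict}")
# 2 cells: 3.0 W (NOT enough), 3 cells: 4.5 W (marginal), 4 cells: 6.0 W
```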
So with 4 NimH cells.
Vbatmin = 4V say.
VLED = 12.5V max say
ILED_max desired = 0.3A.
Power_LED max = 12.5 x 0.3A = 3.75 Watts.
Efficiency is uncertain, but a look through the various example circuits and efficiency curves suggests that 75% efficiency MAY result and that higher is a bonus.
Use 75% for design.
Vin = 4V, Vout = 12V (close enough to make figures tidy)
Vin effective for duty cycle calculations = Vin x efficiency = 4 x 75% = 3V.
Ton:Toff limiting = 12V:3V = 12/15 on = 80%.
Power in = Power out/efficiency = 3.75W /75% = 5W.
Iin = Pin/Vin/dc = 5W/4V /0.8 = 1.56A
Ipeak with linear ramp = 2 x Iavg = 3.12A.
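The worked numbers above can be reproduced under the same assumptions (75% efficiency, the answer's own duty-cycle approximation, and a linear inductor current ramp from zero):

```python
eff = 0.75
v_in = 4.0
p_out = 12.5 * 0.3                   # 3.75 W LED power

v_in_eff = v_in * eff                # 3 V "effective" input voltage
duty = 12.0 / (12.0 + v_in_eff)      # ton fraction, per the text: 12/15 = 80%
p_in = p_out / eff                   # 5 W
i_in_avg = p_in / v_in / duty        # ~1.56 A average input current
i_peak = 2.0 * i_in_avg              # ~3.12 A peak with a linear ramp from 0

print(f"duty = {duty * 100:.0f}%, Iin = {i_in_avg:.2f} A, "
      f"Ipeak = {i_peak:.2f} A")
```

The peak exceeding the 1.5 A worst-case switch limit is exactly the marginality discussed below.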
We have (potential) problems - if not actual.
The worst case assumptions have ganged up to produce a load which exceeds the switch current IF I_inductor falls to 0 on each switching cycle.
A look at the photo of the oscilloscope trace at the bottom left of page 13 indicates that the converter operated with I_inductor always well above zero in a circuit the same as the one intended.
Will it work? - Probably yes, but it's marginal.
An IC with somewhat more switch capability and ability to CC regulate would be preferred BUT the LT1618 is a nice IC if it does meet the need. Reducing I_LED to say 0.2A and devices which do not fall in the worst case spec would make it safer.
OR use even more cells - probably 6.
Who would have thought!
E&OE!
ie I may have made a major blunder somewhere. I hope not. Do point it out if I have.