It seems that, at a fundamental level, the panel puts out less power than the thing you are trying to power wants. That means you can't "convert" your way out of this. You can convert one combination of volts and amps into a different one, but the volts x amps product of the output can never be higher than the volts x amps product of the input.
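To put rough numbers on the point (these are made-up values for illustration, not your actual panel or device specs):

```python
# Power conservation through an ideal DC-DC converter, with invented
# example numbers: a small panel giving 6 V at 0.3 A, and a device
# that wants 5 V at 1 A.
panel_v, panel_a = 6.0, 0.3
p_in = panel_v * panel_a          # about 1.8 W available

device_v, device_a = 5.0, 1.0
p_needed = device_v * device_a    # about 5 W demanded

# Even a lossless converter only rearranges volts and amps; the
# output V*I product is capped by the input V*I product.
max_out_a = p_in / device_v       # best case current at 5 V out
print(p_in, p_needed, max_out_a)
```

With these assumed numbers the converter can deliver at most about 0.36 A at 5 V, no matter how it is designed, because only 1.8 W is coming in.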
The first thing you need is proper specs for the thing you are trying to power. Apparently it expects around 5 V. So give it 5 V and measure the current. You can't design a circuit to meet a current requirement if you don't know what that current requirement is. That really should have been obvious.
It sounds like your switcher can deliver sufficient output current when given sufficient input voltage. The problem is that your panel can't supply the necessary power, so the switcher keeps trying to draw more current from the panel. That causes the panel voltage to drop, so the switcher draws even more current, which causes the panel voltage to drop even more, etc. Rather quickly, the panel voltage collapses. It then produces even less power than it could if managed properly.
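The runaway described above can be seen in a toy model. Everything here is a crude sketch with invented numbers: the panel is modeled as a fixed 0.3 A current source, and the switcher as a constant-power load with no undervoltage lockout.

```python
# Toy simulation of input-voltage collapse: a constant-power load
# drawing more than a weak panel can supply. All numbers invented.
v = 12.0          # start near the panel's operating voltage
c = 100e-6        # input capacitance, farads
i_panel = 0.3     # current the panel can supply, amps
p_load = 5.0      # power the switcher insists on drawing, watts
dt = 1e-5         # simulation time step, seconds

for _ in range(5000):
    i_load = p_load / v            # lower voltage -> higher current draw
    v += (i_panel - i_load) * dt / c
    if v < 1.0:                    # input has collapsed
        break
print(f"panel voltage collapsed to {v:.2f} V")
```

Because the load current rises as the voltage falls, the shortfall grows at every step and the collapse accelerates, exactly the feedback loop described above.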
So what to do? One possibility is to add another panel in parallel. With enough input current capability, the system will at least work in steady state in full sun. However, when the insolation goes away, the voltage will collapse again, and may not be able to recover when the insolation returns.
What you need is a circuit that disables the switcher altogether when the panel voltage is too low. If the switcher doesn't have a shutdown input, get one that does. This could be done with an external transistor, but at your apparent level it is better to drive a shutdown input that is meant for the purpose. Many buck switchers have shutdown (or enable) inputs, so this is not an onerous requirement.
Derive the shutdown signal from a comparator with hysteresis. Find a reasonable maximum-power voltage for the panel under a bit less than optimum illumination, then set the comparator's off threshold a bit below that. Set the comparator's on threshold a bit below the open-circuit output voltage at medium illumination.
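The comparator's behavior can be sketched in software form. The threshold numbers below are illustrative guesses for a nominal "12 V" panel (open-circuit around 20 V, maximum-power point around 16 V); pick yours from the panel datasheet as described above.

```python
# Hysteresis comparator logic, sketched as a function. Thresholds
# are assumed example values, not measured ones.
V_ON = 18.0    # a bit below open-circuit voltage at medium light
V_OFF = 14.0   # a bit below the maximum-power voltage in dim light

def update_enable(panel_v, enabled):
    """Return the new switcher-enable state for this panel voltage."""
    if not enabled and panel_v >= V_ON:
        return True       # cap has recharged: let the switcher run
    if enabled and panel_v <= V_OFF:
        return False      # cap is depleted: shut the switcher down
    return enabled        # inside the hysteresis band: hold state
```

The gap between the two thresholds is what prevents the switcher from chattering on and off when the panel voltage hovers near a single trip point.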
Now connect a big capacitor across the panel. This should be 10s of mF at least, rated for 25 V or more.
What will happen now is that the panel will charge up the capacitor to the comparator on threshold. The buck switcher then makes 5 V and your device charges. The current drawn by the buck switcher will exceed what the panel is producing. That capacitor supplies the remaining current, but discharges in the process. After some time, the capacitor voltage drops to the comparator off threshold. The buck switcher turns off, stops charging the device, and stops drawing input current. The solar panel current now charges the capacitor, and the cycle repeats.
The larger the capacitor, the longer the device will be charged at a time. Depending on how the charger in the device works, it may need some minimum on time to do any useful charging. A larger capacitor will lengthen that on time. It will also lengthen the off time, but that shouldn't bother the device.
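The on and off times fall out of simple charge arithmetic. All numbers here are illustrative assumptions (capacitor size, thresholds, and currents are invented, not measured):

```python
# Rough burst timing from the capacitor size and current mismatch.
C = 0.047                  # 47 mF input capacitor (assumed)
V_ON, V_OFF = 18.0, 14.0   # comparator thresholds, volts (assumed)
i_panel = 0.3              # panel current in full sun, amps (assumed)
i_switcher = 0.8           # switcher input current while on, amps (assumed)

# While on, the cap supplies the shortfall and droops from V_ON to V_OFF:
t_on = C * (V_ON - V_OFF) / (i_switcher - i_panel)
# While off, the panel alone recharges it from V_OFF back to V_ON:
t_off = C * (V_ON - V_OFF) / i_panel
print(f"on for ~{t_on:.2f} s, off for ~{t_off:.2f} s")
```

Note both times scale linearly with C, which is why a bigger capacitor lengthens the on time and the off time together.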
Overall, you still aren't getting more power out than in. However, the output power now comes in bursts of high power with gaps in between. The device charges during those bursts in the way it is intended to work. Obviously the overall charging will take longer, but that again comes down to basic physics: you are limited by the available input power.
Best Answer
Well, MPPT would deliver maximum power to the batteries, as that is what it stands for: maximum power point tracking. But MPPT requires intelligence, and intelligence requires power. Depending on the efficiency of your design (the processor and its software, or the power consumption of the hardware if you're planning to attempt that route), the overhead may reduce the power delivered to the batteries by enough that you would be better off using one of the other options. Also, MPPT by itself means there is no control of the power flowing into the cells. Not controlling the input to a battery is a bad idea. Do not do that. If you are using batteries in anything in life, make sure you know they are being charged and discharged in a safe manner.
A DC-DC converter to take the power down to around 4.5 V sounds simple, but you've got to work out how to convince the DC-DC to keep delivering power to the cells to charge them up when they're at 3 V or less. So you end up making a constant-current charger. This is a type of DC-DC converter, but it's current-controlled rather than voltage-controlled, at least until the cell voltage reaches a certain level; then you'll need constant voltage to avoid damaging the cells. This would be the best way to charge the batteries if you had a nice steady source of power. But your solar panels are not going to give lots of current at an ideal voltage.
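The CC/CV handover is just a mode decision based on cell voltage. A minimal sketch, with assumed limits (use the numbers from your cell's datasheet, not these):

```python
# CC/CV setpoint selection for a single Li-ion cell. The current
# limit is an invented example; 4.2 V is the typical Li-ion maximum.
I_LIMIT = 0.5   # constant-current phase limit, amps (assumed)
V_LIMIT = 4.2   # constant-voltage limit, volts

def charge_setpoint(cell_v):
    """Pick which quantity the DC-DC should regulate right now."""
    if cell_v < V_LIMIT:
        return ("CC", I_LIMIT)   # cell is low: limit the current into it
    return ("CV", V_LIMIT)       # cell is nearly full: hold the voltage
```

In hardware this is usually two feedback loops (one current, one voltage) where whichever loop hits its limit first takes over, rather than an explicit if/else, but the effect is the same.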
What you could do (and what I would want to do in an ideal world) is: use MPPT to get power from the solar panel and store it in a capacitor bank in the useful voltage range. Capacitors are a lot more resilient to varying charge currents and voltages, and so can handle the inconsistent nature of the energy flow from the MPPT. Then use a constant-current, constant-voltage charger for the batteries. This will get you the most out of the solar cell, and charge the 18650s as well as possible. This method is also the most expensive and most complicated.
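The "intelligence" an MPPT stage needs can be as small as a perturb-and-observe loop. A minimal sketch, with a made-up panel power curve (the function and numbers are invented for the demo, not from any real panel):

```python
def po_direction(prev_v, prev_p, v, p):
    """Perturb and observe: keep stepping the operating voltage in
    whichever direction raised the panel power; reverse if it fell."""
    if p >= prev_p:
        return 1 if v >= prev_v else -1
    return -1 if v >= prev_v else 1

# Toy panel power curve peaking at 17 V (invented for the demo):
def toy_power(v):
    return 80.0 - (v - 17.0) ** 2

v, step = 12.0, 0.5
prev_v, prev_p = v, toy_power(v)
v += step
for _ in range(100):
    p = toy_power(v)
    direction = po_direction(prev_v, prev_p, v, p)
    prev_v, prev_p = v, p
    v += direction * step
# P&O never settles exactly; it dithers within one step of the MPP.
```

In a real design this loop runs on a small microcontroller that nudges the converter's duty cycle instead of a variable, and that processor's own consumption is exactly the overhead mentioned above.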