What you describe is workable - it's essentially a buck converter - but your exact circuit is not specified and there are many ways to go from a somewhat correct general concept to an incorrect circuit.
The design note 188 that you reference is about 15 years old and its time has passed. While it is a marvel of improvisation, you can get better results for less cost using modern ICs.
I'm aware that you only cited it as an example, and that you wanted a switch mode version, but using a dedicated switch mode converter IC would be a good idea at minimum, and using a LiIon charger IC would be an even better one.
An LTC1541 (datasheet here) costs about $3 in 1's. It's quite a useful IC if you need its features. With a 5 µA typical supply current it could not fight its way out of a wet paper bag, and a slew rate of 8 mV/µs means that you have to be clever to use it; the characteristics of the device can greatly affect circuit performance.
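To put that slew rate in perspective, here is a back-of-the-envelope sketch of how slowly the output can move at 8 mV/µs (the voltage swings used are illustrative assumptions, not datasheet values):

```python
# What an 8 mV/us slew rate means in practice.
# The swing values below are illustrative assumptions.

SLEW_RATE_V_PER_US = 0.008  # 8 mV/us, as quoted above

def slew_time_us(delta_v):
    """Time for the output to traverse delta_v volts at the slew limit."""
    return delta_v / SLEW_RATE_V_PER_US

# Moving the output by just 1 V takes 125 us:
print(slew_time_us(1.0))   # 125.0
# A full 0..4.2 V swing (one Li-Ion cell) takes over half a millisecond:
print(slew_time_us(4.2))   # 525.0
```

That kind of response time is fine for a slow charge-control loop, but it is why you "have to be clever" anywhere fast transients matter.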
AN188 loosely uses the term "float voltage". A Lithium Ion battery must not be "floated" in the sense the term is usually used. They intend it to mean "hold at CV while charge current ramps down due to battery chemistry action" - but the applied voltage should be removed once current has fallen to some preselected percentage of Imax.
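The termination rule described above can be sketched in a few lines. The 10% of Imax cut-off and the 0.5 A CC-phase current are assumptions for illustration, not values from AN188:

```python
# Hedged sketch of the charge-termination rule: hold CV, watch the
# charge current taper, and remove the applied voltage once the current
# falls below a preselected fraction of Imax.
# Both constants below are assumed example values.

I_MAX_A = 0.5              # programmed CC-phase current (assumed)
TERMINATE_FRACTION = 0.10  # terminate at 10% of Imax (assumed)

def should_terminate(measured_current_a):
    """True once the taper current drops below the termination threshold."""
    return measured_current_a < TERMINATE_FRACTION * I_MAX_A

print(should_terminate(0.30))  # False - still tapering
print(should_terminate(0.04))  # True  - remove the applied voltage
```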
Once you have terminated charging there will usually be no need to restart charging until some load current has been drawn. This is because an unloaded LiIon cell will maintain a close-to-full state of charge over considerable periods. If it is going to stand unloaded for weeks or months then testing battery voltage for a possible restart of charging may be in order - but the load from the voltage measuring circuit can easily exceed the self-discharge rate.
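Rough numbers make the last point concrete. All the values here are illustrative assumptions (cell capacity, self-discharge rate, and a cheap permanently-connected divider):

```python
# Comparing the drain of a naive voltage-monitoring divider against a
# cell's own self-discharge. All values are illustrative assumptions.

CELL_V = 4.0                      # resting cell voltage (assumed)
CELL_CAPACITY_MAH = 2000.0        # cell capacity (assumed)
SELF_DISCHARGE_PER_MONTH = 0.03   # ~3%/month, typical order of magnitude

divider_ohms = 10e3               # cheap 10k total divider (assumed)
divider_ua = CELL_V / divider_ohms * 1e6

# Charge lost to self-discharge per month, as an average current:
self_discharge_ua = (CELL_CAPACITY_MAH * SELF_DISCHARGE_PER_MONTH
                     * 1000 / (30 * 24))

print(f"divider drain:  {divider_ua:.0f} uA")        # 400 uA
print(f"self-discharge: {self_discharge_ua:.0f} uA") # ~83 uA average
```

With these (assumed) numbers the measuring circuit discharges the cell several times faster than the cell discharges itself, which is why monitoring circuits for standby use need to be switched or very high impedance.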
If you are trying to produce a low cost design with a high element of DIY in it you could try the venerable MC34063 switching regulator IC. These are about as cheap as you can buy an SMPS IC for, and can implement almost any topology. As you have noted, to make a constant current driver you compare the drop across the current sense resistor with a reference voltage. The MC34063 has an unfortunately high reference voltage (about 1.25 V), so dropping that much across the sense resistor directly would waste power; instead you can use an opamp or comparator. This adds cost and complexity, but an e.g. LM358 dual opamp is almost as cheap as the MC34063. Use the opamp to scale the current sense resistor's drop up to the reference voltage. Figure 6 in this application note gives the general idea - in that case the opamp inverts and it's a Ćuk converter, but the principle is the same.
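The scaling step works out like this. The 1 A target current and 0.1 Ω sense resistor are assumptions for illustration; the 1.25 V threshold is the MC34063's nominal feedback reference:

```python
# Pick a small sense resistor so the drop (and dissipation) stays low,
# then amplify that drop up to the MC34063's feedback threshold with
# one half of an LM358. Target current and R_sense are assumed values.

V_REF = 1.25    # MC34063 feedback comparator threshold (V)
I_TARGET = 1.0  # desired constant current (A), assumed
R_SENSE = 0.1   # sense resistor (ohm), assumed

v_sense = I_TARGET * R_SENSE     # 0.1 V across the sense resistor
gain = V_REF / v_sense           # opamp gain needed: 12.5
p_sense = I_TARGET**2 * R_SENSE  # only 0.1 W dissipated in the sense R

# For a non-inverting LM358 stage, gain = 1 + Rf/Rg,
# so e.g. Rf = 115k with Rg = 10k gives 12.5.
print(gain, p_sense)
```

Sensing 1.25 V directly would instead burn 1.25 W at 1 A, which is the cost-and-heat problem the opamp solves.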
In small signal mode (that is, with the output current staying within regulation as the load varies) U4B is operating as an integrator. Interestingly, U4A (the other half of the amplifier, providing the current set reference) has no such diodes across its inputs.
The two primary reasons for diodes across an input are:

1. Preventing the input differential mode voltage from being violated
2. Ensuring the loop can respond quickly.
Since I can see no mechanism by which the input differential mode voltage could be violated here, the other reason (as you suspected) is the likely culprit.
An inspection of the small signal response and the large signal response (i.e. in slew rate limit) is informative:

Note that the time axes for small signal and large signal differ by an order of magnitude (0.5 µs for small signal, 10 µs for large signal). Admittedly, the voltage axis is also much smaller for the small signal response, but this is the sort of response we would want to see.
For the integrator to maintain proper response when it is in small signal mode (and more particularly, when entering and exiting small signal mode), we do not want the inputs to deviate far from each other, so that the dominant loop response is set by C12. Quite possibly the diodes also help it remain stable, although I have not looked carefully into that; integrator stability, or the lack thereof, has caused many a hair-pulling day.
I cannot see any other reason for the diodes (although that does not mean another reason does not exist).
[Update]
As the supply can only be set into constant current mode with no load attached (see page 5 of the manual), the transition we can see is from constant current to constant voltage mode.
When the output of the supply exceeds the programmed maximum voltage, U1 engages and takes control of the loop, which effectively removes U4B from the control loop via the operation of the CR4 and CR5 analogue OR gate.
When this happens, there will be transients at the U4B output, which can easily destabilise the integrator. Any perturbations will feed back, but the diodes clamp them to within a diode drop of Vout (which is where R22 and R23 are attached), preventing serious output disturbances from affecting the performance of the voltage error amplifier.
If you set your power supply to constant current mode, the voltage will change depending on the load. Pure Ohm's law: V = IR. If I is constant and you change R, V must change.
Obviously within the limits of how much V is available.
So when you set your current limit on the supply and watch the meters while you increase the load, at first V will be constant and the current needle will ramp up to the limit. After that the current needle will stay put and the voltage needle will drop.
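The behaviour described above can be sketched in code: an ideal bench supply obeys whichever limit binds first. The 12 V / 1 A setpoints are example values:

```python
# Ideal CC/CV bench supply into a resistive load.
# Setpoints below are example values.

V_SET = 12.0  # voltage limit (V), assumed
I_SET = 1.0   # current limit (A), assumed

def supply_output(r_load):
    """Return (V, I) delivered by an ideal CC/CV supply into r_load ohms."""
    if r_load >= V_SET / I_SET:      # light load: CV mode, I = V/R
        return V_SET, V_SET / r_load
    else:                            # heavy load: CC mode, V = I*R
        return I_SET * r_load, I_SET

print(supply_output(24.0))  # (12.0, 0.5) - voltage needle steady
print(supply_output(6.0))   # (6.0, 1.0)  - current at limit, voltage drops
```

The crossover happens at R = V_SET / I_SET (12 Ω here): above that the voltage needle holds, below it the current needle holds and the voltage falls.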