If you want to test capacitors fully, you will need what is called an ESR (equivalent series resistance) meter.
Electrolytics have an ESR which increases through general use (age and heat are main factors).
SMPS (switch-mode power supplies) are quite sensitive to ESR. The ripple voltage on the output is calculated as \$V_{ripple} = I\times ESR\$, so as the ESR of your caps increases, so does the amount of voltage ripple. I can't say with certainty what problems ripple voltage causes, so I've included an extract from Wikipedia.
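To make the \$V_{ripple} = I\times ESR\$ relation concrete, here is a small worked example. The formula is from the answer above; the current and ESR values are my own illustrative numbers, not from the original.

```python
# ESR contribution to SMPS output ripple: V_ripple = I * ESR.
# The numeric values below are illustrative assumptions only.

def ripple_voltage(ripple_current_a, esr_ohms):
    """Ripple voltage (V) produced by ripple current (A) across ESR (ohms)."""
    return ripple_current_a * esr_ohms

i_ripple = 2.0   # A of ripple current (assumed)
esr_new = 0.02   # ohms, a healthy low-ESR electrolytic (assumed)
esr_aged = 0.5   # ohms, the same cap after years of heat stress (assumed)

print(ripple_voltage(i_ripple, esr_new))   # 0.04
print(ripple_voltage(i_ripple, esr_aged))  # 1.0
```

The same 2 A of ripple current that caused a harmless 40 mV of ripple on a fresh capacitor produces a full volt once the ESR has aged, which is why supplies start misbehaving as the caps degrade.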
Effects of ripple
Ripple is undesirable in many electronic applications for a variety of reasons:
1. The ripple frequency and its harmonics are within the audio band and will therefore be audible on equipment such as radio receivers, equipment for playing recordings and professional studio equipment.
2. The ripple frequency is within the television video bandwidth. Analogue TV receivers will exhibit a pattern of moving wavy lines if too much ripple is present.
3. The presence of ripple can reduce the resolution of electronic test and measurement instruments. On an oscilloscope it will manifest itself as a visible pattern on screen.
4. Within digital circuits, it reduces the threshold, as does any form of supply rail noise, at which logic circuits give incorrect outputs and data is corrupted.
5. High-amplitude ripple currents shorten the life of electrolytic capacitors.
Now to answer your actual question. As long as the capacitance and voltage rating match those of your current capacitors, that's all you need. Personally I'd recommend Panasonic: every time I replace an aluminium electrolytic, I fit a Panasonic capacitor.
The backlight of your monitor shouldn't make any difference to the capacitors you need on your power supply.
The answer is, of course, neither extreme, but somewhere between the two.
You can tell the state of charge of a cell from voltage
If a cell measures 1.5v off load, it is fully charged. You should not charge this cell further. If it's 1.5v on charge, on a low current, then it's pretty much fully charged.
If a cell measures 0.9v off load, it is fully discharged. You should not discharge this cell further. In fact, you should have stopped discharging when it was 1v, on load.
By all means use a multimeter to measure the terminal voltage of a cell that's finished charging. It will tell you there's no point in charging the cell any more. It won't tell you whether the cell now contains 100%, or 80% of its original capacity, only a discharge test will tell you that.
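The thresholds above can be boiled down to a trivial decision rule. This is my own illustration, not from the answer; the 1.5 V and 0.9 V off-load limits come from the text, and everything in between is deliberately reported as unknown, for the reason given in the next section.

```python
# Sketch: classify a NiMH/NiCd cell purely from off-load terminal voltage.
# Thresholds (1.5 V = full, 0.9 V = empty) are from the answer above;
# mid-range voltages are intentionally inconclusive.

def cell_state(off_load_voltage):
    if off_load_voltage >= 1.5:
        return "fully charged - stop charging"
    if off_load_voltage <= 0.9:
        return "fully discharged - stop discharging"
    return "unknown (mid-range voltage is not a reliable gauge)"

print(cell_state(1.52))
print(cell_state(0.85))
print(cell_state(1.2))
```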
You cannot tell the state of charge of a cell from voltage
If a cell measures 1.2v, on or off load, you cannot tell its state of charge anywhere between 10% and 90%. If you stop or start charging or discharging, you'll find the terminal voltage wanders by tens of mV in the first few seconds, then by further tens of mV over the next few minutes, in a way that's not modelled by the terminal voltage depending on charge alone. Combined with the flatness of the discharge curve, that dependence on history pretty much destroys any predictive power that measuring voltage alone might claim.
This is why practical battery metering systems use a gas-gauge IC (coulomb counter) to track the state of charge directly.
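The idea behind coulomb counting is simply integrating current over time. Here is a minimal sketch of that bookkeeping (my own illustration of the principle, not a driver for any real gas-gauge IC); the function name and sample values are assumptions.

```python
# Minimal coulomb-counting sketch: track remaining charge by integrating
# current samples over time, as battery "fuel gauge" ICs do internally.

def coulomb_count(initial_mah, samples_ma, dt_s):
    """samples_ma: current draw in mA (positive = discharge), one per dt_s seconds."""
    remaining = initial_mah
    for i_ma in samples_ma:
        remaining -= i_ma * dt_s / 3600.0   # mA * s -> mAh
    return remaining

# Draw 500 mA for one hour (3600 one-second samples) from a 2000 mAh cell:
print(coulomb_count(2000.0, [500.0] * 3600, 1.0))  # ~1500.0 mAh remaining
```

Real gauges also correct for temperature, self-discharge and coulombic efficiency, but the core of the method is just this running integral, which is why it stays accurate where a voltage reading on the flat part of the curve tells you almost nothing.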
What you've observed is called "dielectric absorption" or the "recovery voltage phenomenon".
It's caused by a kind of inertia of the dipoles (ions) in the electrolyte during charging and discharging.
From Wikipedia:
Further:
From a Mouser note
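The "inertia" described above is commonly modelled by a small, slow RC branch in parallel with the main capacitance. The following toy simulation is my own illustration of that standard two-branch model (all component values are assumptions, not from the answer): the main capacitance C1 is briefly shorted, then charge bleeds back from the slow branch (R2 in series with C2) and the terminal voltage "recovers".

```python
# Toy simulation of dielectric absorption using the two-branch model:
# C1 (main capacitance) in parallel with R2 + C2 (slow absorption branch).
# All component values are illustrative assumptions.

def recovery_voltage(v_initial, c1, c2, r2, t_recover, dt=0.001):
    """Terminal voltage on C1 after shorting it and waiting t_recover seconds."""
    v1, v2 = 0.0, v_initial          # C1 just shorted; C2 still holds charge
    for _ in range(int(t_recover / dt)):
        i = (v2 - v1) / r2           # current from the slow branch into C1
        v1 += i * dt / c1            # simple Euler integration
        v2 -= i * dt / c2
    return v1

# 10 V cap, 100 uF main capacitance, 5 uF absorption branch, 100 kohm:
v = recovery_voltage(10.0, 100e-6, 5e-6, 100e3, t_recover=5.0)
print(round(v, 3))   # a fraction of a volt reappears at the terminals
```

At equilibrium the recovered voltage approaches \$V_{initial}\cdot C_2/(C_1+C_2)\$, roughly 0.48 V for these values, which matches the small but very real voltage you can measure on a "discharged" capacitor some minutes later.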