The impact of running a 3.2V device on a 1.2V battery would be that the battery would go flat very rapidly, and the device would not function.
There are two values you need to consider: the voltage and the amperage (or current). The voltage is measured in volts (V) and the current is measured in amps (A or mA; 1A = 1,000mA).
The voltage of the device and the power source (battery) have to match. Too much voltage from the power source and you will destroy (or seriously damage) the device. Too little and it just won't operate.
The power source has to provide at least as much current as the device requires. The device will never draw more current than it needs, so it is perfectly safe to use a power source with a higher current rating without damaging the device. However, using a power source with a lower current rating than the device requires risks damaging the power source - in the case of a battery it could rupture and cause a fire.
Batteries can be connected together in series to increase the voltage (+ of one battery connected to - of the next; + of that one to the - of the next etc), or in parallel to increase the current (all the + linked together and all the - linked together) or you can do a combination of the two to increase both the current and the voltage.
So, three batteries at 1.2V each connected in series would give 3.6V - a little over the rated voltage of the device, but it may be allowable - you should check the manual or data sheet for the device.
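The series and parallel rules above can be sketched as a small calculation (these helper functions are hypothetical, written just for illustration):

```python
def series_voltage(cell_voltages):
    """Cells in series: voltages add; the capacity stays that of one cell."""
    return sum(cell_voltages)

def parallel_capacity_mah(cell_capacities_mah):
    """Cells in parallel: capacities add; the voltage stays that of one cell."""
    return sum(cell_capacities_mah)

# Three 1.2V cells in series give 3.6V - a little over the device's 3.2V rating.
print(round(series_voltage([1.2, 1.2, 1.2]), 1))  # 3.6

# Two 800mAh cells in parallel give 1600mAh at the same voltage.
print(parallel_capacity_mah([800, 800]))  # 1600
```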
Batteries don't have a current rating as such, but instead have a "mAh" rating. That's "milliamp-hours" - the current the battery can supply multiplied by the number of hours it can supply it for.
So, an 800mAh battery can give 800mA over the course of an hour before it goes flat. Or it could give 400mA over 2 hours, or 200mA over 4 hours, etc. The more current that is drawn the quicker it will go flat.
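That relationship is just capacity divided by load. As a sketch (ignoring real-world effects such as capacity dropping at high discharge rates):

```python
def runtime_hours(capacity_mah, load_ma):
    """Ideal runtime: capacity in mAh divided by the current drawn in mA."""
    return capacity_mah / load_ma

# An 800mAh battery at various loads:
print(runtime_hours(800, 800))  # 1.0
print(runtime_hours(800, 400))  # 2.0
print(runtime_hours(800, 200))  # 4.0
```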
To better understand voltage and current I like to try and get people to visualize a pipe with water flowing through it
The voltage is akin to the water pressure: the higher the pressure, the harder the water is pushed along the pipe.
The current is akin to the flow rate - the amount of water passing through the pipe each second. A wider pipe offers less resistance, so more water flows for the same pressure.
The power, or wattage (which is the voltage multiplied by the current), is akin to the pressure multiplied by the flow rate - the more of both you have, the more kick the water has as it squirts out the end.
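Since wattage is just voltage times current, the arithmetic is one line (the figures below are made up for illustration):

```python
def power_watts(volts, amps):
    """Power (W) = voltage (V) multiplied by current (A)."""
    return volts * amps

# A 3.6V pack supplying 0.5A delivers 1.8W.
print(power_watts(3.6, 0.5))  # 1.8
```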
As far as chargers are concerned, that depends on the chemistry of the battery you buy.
There are three major chemistries that fall into two groups:
Ni-MH - Nickel Metal Hydride. These are the run-of-the-mill AA and AAA rechargeable batteries you buy in the shop. Your normal AA or AAA battery charger charges these easily. Most will have a charge current and time printed on them, such as "16hr at 220 mA".
Li-Ion and Li-Pol - Lithium Ion and Lithium Polymer. These are the kind you get in things like your mobile phone. They are much harder to charge up and require special electronics to manage them. They must not be allowed to go completely flat or you won't be able to charge them up again. However, they are much more powerful than the Ni-MH ones.
The Cycle Use voltage is the voltage the battery needs to become completely full, but it should not be held at that voltage, because the cells don't really like that. The Standby Use voltage is the voltage the battery can be kept at once it is full, to keep it in good shape "on the shelf" until you need it.
You can easily charge the battery at 13.5V, but it will not become completely full - or if it does, it will take a very long time. There is nothing really stopping you from doing that, but you will not get the full range you could or should.
Whether that is compensated for by more charge/discharge cycles depends on the battery chemistry, but as far as I know the balance is always worse with a lower storage voltage. Seeing your voltages I'm guessing lead-acid or LiFePO4, in which case I'm fairly sure you'll get more total lifetime range when charging them according to specification.
I'm assuming you only want to charge NiMH and/or NiCd based cells, because that's what your old charger supports. If it supported lithium-based cells it would need more complex electronics.
The more expensive chargers often use a higher charging current for faster charging but this means that this current has to be switched off or lowered when the cells are full.
Cheap chargers like your old charger take much longer (10 - 14 hours) to fully charge the cells. These chargers simply charge with a small current, which allows continuous charging, meaning the current does not need to be switched off.
When NiMH and NiCd cells are fully charged but you keep charging them, they get warm. That's not a problem at a low current (as is the case with the simple chargers), but it is a problem at a large current - the cells will overheat. So fast chargers need electronics to detect that a cell is full and stop fast-charging it.
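A rough charge-time estimate for such a slow charger can be sketched like this. The ~70% charge efficiency is a common rule of thumb for NiMH cells, not a figure from the charger itself, and the 2500mAh capacity is just an example:

```python
def slow_charge_hours(capacity_mah, charge_ma, efficiency=0.7):
    """Estimate slow-charge time: more charge than the rated capacity must be
    put in, because some energy is lost as heat (assumed ~70% efficient)."""
    return capacity_mah / (charge_ma * efficiency)

# A 2500mAh cell on a 220mA trickle charger comes out to roughly 16 hours,
# in line with markings like "16hr at 220 mA".
print(round(slow_charge_hours(2500, 220), 1))
```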
If you can wait for your cells to charge, by all means get a slow / cheap charger.
If you cannot wait, get a fast charger.
You could also consider buying more cells and alternating between sets using the slow / cheap charger - for the same money as a fast charger.
As for cell lifetime: in my experience slow charging puts less stress on the cells, so it should make them last longer.