The impact of running a 3.2V device on a 1.2V battery is that the device won't function - the voltage is far too low - and any current it does draw will just flatten the battery without achieving anything.
There are two values you need to consider: the voltage and the current (or amperage). Voltage is measured in volts (V) and current in amps (A) or milliamps (mA) - 1A = 1,000mA.
The voltage of the device and the power source (battery) have to match. Too much voltage from the power source and you will destroy (or seriously damage) the device. Too little and it just won't operate.
The power source has to provide at least as much current as the device requires. The device will never draw more current than it needs, so it is perfectly safe to use a power source with a higher current rating without damaging the device. However, using a power source with a lower current rating than the device requires risks damaging the power source - in the case of a battery it could overheat, rupture, and even start a fire.
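To make the matching rules above concrete, here's a tiny Python sketch. The helper name and the 5% voltage tolerance are my own assumptions for illustration, not from any standard:

```python
def source_is_suitable(source_v, source_max_a, device_v, device_a, tolerance=0.05):
    """Check a power source against a device: voltage must match (within an
    assumed 5% tolerance) and the source's current rating must be at least
    what the device draws. A higher current rating is always safe."""
    voltage_ok = abs(source_v - device_v) <= device_v * tolerance
    current_ok = source_max_a >= device_a
    return voltage_ok and current_ok

# A 3.2 V, 500 mA device on a 1.2 V battery: voltage far too low.
print(source_is_suitable(1.2, 0.8, 3.2, 0.5))   # False
# The same device on a 3.2 V, 1 A supply: fine (higher rating is safe).
print(source_is_suitable(3.2, 1.0, 3.2, 0.5))   # True
```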
Batteries can be connected together in series to increase the voltage (+ of one battery connected to - of the next; + of that one to the - of the next etc), or in parallel to increase the current (all the + linked together and all the - linked together) or you can do a combination of the two to increase both the current and the voltage.
So, three batteries at 1.2V each connected in series would give 3.6V - a little over the rated voltage of the device, but it may be allowable - you should check the manual or data sheet for the device.
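As a quick sanity check, the series/parallel arithmetic can be sketched in Python (a hypothetical helper that just multiplies the per-cell figures, assuming identical, well-matched cells):

```python
def pack_totals(cell_v, cell_mah, n_series, n_parallel):
    """Series multiplies voltage; parallel multiplies capacity.
    Idealised: assumes all cells are identical and well matched."""
    return cell_v * n_series, cell_mah * n_parallel

# Three 1.2 V, 800 mAh cells in series: ~3.6 V, capacity still 800 mAh.
volts, mah = pack_totals(1.2, 800, n_series=3, n_parallel=1)
print(round(volts, 1), mah)  # 3.6 800
```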
Batteries don't have a current rating as such, but instead have a "mAh" rating. That's "milliamp-hours" - the amount of current the battery can deliver for one hour before it goes flat.
So, an 800mAh battery can give 800mA over the course of an hour before it goes flat. Or it could give 400mA over 2 hours, or 200mA over 4 hours, etc. The more current that is drawn the quicker it will go flat.
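That relationship is just capacity divided by load. A quick Python sketch (an idealised upper bound - real cells deliver less usable capacity at heavy loads):

```python
def runtime_hours(capacity_mah, load_ma):
    """Idealised runtime: mAh capacity divided by the mA drawn.
    Treat this as an upper bound; heavy loads reduce usable capacity."""
    return capacity_mah / load_ma

print(runtime_hours(800, 400))  # 2.0 hours
print(runtime_hours(800, 200))  # 4.0 hours
```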
To better understand voltage and current, I like to get people to visualize water flowing through a pipe.
The voltage is akin to the water pressure: the more pressure there is, the more "push" there is behind the water.
The current is akin to the flow rate - the number of liters per hour flowing through the pipe.
The power (or wattage, which is the voltage multiplied by the current) is akin to the pressure multiplied by the flow rate - how much kick the water has as it squirts out the end.
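Since power is just the product of voltage and current, the arithmetic is a one-liner (a trivial sketch for illustration):

```python
def power_watts(volts, amps):
    """Power (W) = voltage (V) x current (A)."""
    return volts * amps

# A 3.2 V device drawing 500 mA dissipates 1.6 W.
print(power_watts(3.2, 0.5))  # 1.6
```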
As far as chargers are concerned, that depends on the chemistry of the battery you buy.
There are three major chemistries that fall into two groups:
Ni-MH - Nickel Metal Hydride. These are the run-of-the-mill AA and AAA rechargeable batteries you buy in the shop. Your normal AA or AAA battery charger charges these easily. Most will have a charge current and time printed on them, such as "16hr at 220 mA".
Li-Ion and Li-Pol - Lithium Ion and Lithium Polymer. These are the kind you get in things like your mobile phone. They are much harder to charge up and require special electronics to manage them. They must not be allowed to go completely flat or you won't be able to charge them up again. However, they are much more powerful than the Ni-MH ones.
See Smart charging circuit for NiMH battery pack, where the answer states:
In such cases a very reasonable charging strategy is to terminate
charge at 1.45V per cell.
It is reasonable to believe he means 1.45V measured "at the cell".
It is worth noting that the BQ2002PN is a FAST charger - you need to ensure it will not burn out your cells. A good charger will switch between slow and fast rates. For in-circuit charging, design the charge rate to exceed the discharge rate of the application's load, and allow for margins. It is more than acceptable to use a fixed supply voltage and a series resistance to supply a minimal charge rate. Assuming it is not too high, a trickle small enough not to cause self-heating works.
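For the fixed-supply-plus-resistor approach, the resistor value comes straight from Ohm's law. A hedged Python sketch - the 0.7 V diode drop and the example figures are assumptions, and you should always check the cell's datasheet for a safe trickle rate:

```python
def trickle_resistor_ohms(v_supply, v_batt, i_charge_ma, diode_drop=0.7):
    """Series resistor for a crude fixed-supply trickle charger:
    R = (Vsupply - Vdiode - Vbatt) / Icharge.
    Keep Icharge well below C/10 so self-heating stays negligible."""
    return (v_supply - diode_drop - v_batt) * 1000.0 / i_charge_ma

# Example (assumed figures): 12 V supply, one diode drop, three NiMH
# cells at ~4.2 V on charge, 20 mA trickle -> roughly 355 ohms.
print(round(trickle_resistor_ohms(12.0, 4.2, 20.0)))  # 355
```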
Before low-self-discharge cells existed, we made a slow charger for a dozen parallel cells from +12V with a diode drop and a resistor (i.e. "hot and ready", not really hot). It is cheaper than a smart charger, and we could keep the charge rate very low - lower than most chargers' slow rate, which is still higher than needed (sized for larger capacities) and wasteful.
In fact I have several Maha and LaCrosse chargers (nice chargers) for NiMH, but they are too smart. When using more than one cell in series on a load (typically 3 or more), the cells discharge unevenly, and one drops below the under-voltage threshold, so the charger considers it a failed cell. But putting it on a 5mA source for a minute kicks it back up, and then it works on the smart chargers again.
You should refer to Ada's page on the Minty Boost. LiPos are the in thing for in-circuit charging, and there are plenty of chargers for them, such as SFE's Lipo Charge Boost. There are plenty of examples out there.
You don't need to short the +5 to ground; only the bottom end of R2 needs to be connected to +5. That's going to be tricky to do with an N-channel MOSFET, because to do that you need a voltage higher than 5V for the MOSFET gate.
Given a 12V (or 10V) supply in addition to the 5V supply, it's possible.
simulate this circuit – Schematic created using CircuitLab
The 12V doesn't need to be strong; you could use the +10V from a MAX232 or build your own capacitive booster.
A simple voltage booster can be made like this. It doesn't do a very good job - the voltage is highly load-dependent - but it's probably enough if you just want to turn a MOSFET on or off.
simulate this circuit
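To get a feel for what such a booster can deliver, here's a rough back-of-envelope estimate in Python. The doubler formula and the droop term are textbook approximations, not measurements of this exact circuit, and the component values are assumptions:

```python
def doubler_vout(v_supply, v_diode=0.7, i_load_a=0.0, f_hz=10e3, c_f=1e-6):
    """Approximate output of a diode-capacitor voltage doubler driven by
    a rail-to-rail square wave:
        Vout ~ 2*Vsupply - 2*Vf - Iload/(f*C)
    The load term is exactly why the output is so load-dependent."""
    return 2 * v_supply - 2 * v_diode - i_load_a / (f_hz * c_f)

print(doubler_vout(5.0))                  # unloaded: about 8.6 V
print(doubler_vout(5.0, i_load_a=0.01))   # a 10 mA load drags it to ~7.6 V
```

Even heavily loaded, that is still comfortably above 5V, which is all you need to switch an N-channel MOSFET gate.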