By "iPhone charger" I assume you mean a 5 V, 500 mA power source. Note that some USB outlets provide only 100 mA until the device actually speaks the USB protocol and requests more than that, although dedicated wall chargers supply their full current all the time.
You can use a boost converter to take the 5 V input to a 15 V output. Either build one yourself using a switching controller, an inductor, some capacitors, and a few diodes/transistors, or buy a ready-made module from vendors like CUI, Murata, or RECOM. Note that the current available at 15 V will be about 150 mA, given 500 mA in at 5 V and 90% efficiency.
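The 150 mA figure follows from conservation of power through the converter. A quick sketch of that arithmetic, using the 500 mA input and 90% efficiency assumptions from above:

```python
# Rough output-current estimate for a 5 V -> 15 V boost converter.
# Input current and efficiency are the illustrative values assumed above.
V_IN, I_IN = 5.0, 0.5      # USB source: 5 V at 500 mA
V_OUT = 15.0               # boost converter output
EFFICIENCY = 0.9           # typical for a small off-the-shelf module

p_in = V_IN * I_IN                 # 2.5 W available at the input
p_out = p_in * EFFICIENCY          # ~2.25 W delivered at the output
i_out = p_out / V_OUT              # output current at 15 V

print(f"Output current: {i_out * 1000:.0f} mA")  # → 150 mA
```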
Then, use a charge circuit for NiMH batteries. It's OK to parallel the charge circuit putting current in with the device drawing power out of the batteries, as long as the charge circuit is properly current limited and follows the battery chemistry's charging profile.
Actually, given that 1.2 V NiMH cells have a maximum charge voltage of between 1.4 V and 1.5 V, 15 V may not be enough once you account for the likely drop in the charge circuit. You may need to go to 18 V. If you're ambitious, you could control the output voltage of the boost converter based on the current charge level of the batteries. That would take some careful design tuning, but would make the whole system more efficient.
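To see why 15 V is marginal, assume a 12 V pack means ten 1.2 V cells in series (a common configuration; the exact cell count and the 1 V drop in the charge circuit are assumptions for illustration):

```python
# Why 15 V may be marginal for charging a 12 V (10-cell) NiMH stack.
# Cell count and circuit drop are illustrative assumptions, not specs.
CELLS = 10                  # 10 x 1.2 V nominal = 12 V pack
V_CELL_MAX = 1.5            # per-cell voltage near end of charge
V_DROP = 1.0                # assumed drop across the charge circuit

v_needed = CELLS * V_CELL_MAX + V_DROP
print(f"Supply needed: {v_needed:.1f} V")  # → 16.0 V, more than 15 V
```

Since the required headroom exceeds 15 V, stepping the boost converter up to 18 V gives comfortable margin.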
Stacking NiMH cells to get to 12 V seems bad, though. I'd rather use something safe like a LiFePO4 battery, and charge that from the 5 V source (using a direct charge circuit, or a buck converter controlled as a charger for higher efficiency). Then use a boost converter from the battery to generate the 12 V output needed. Again, such boost converters can be hand-built or bought off the shelf. The charger would connect to the battery. The on/off switch for the whole device would connect/disconnect the battery from the boost converter that generates the 12 V.
In both cases, the device could run on charger power alone, assuming the total draw at 12 V is less than 150 mA (meaning a total power draw of about 2 W or less).
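The 150 mA / 2 W budget above can be checked the same way: divide the usable input power by the 12 V load voltage (input current and efficiency are the same assumed values as before):

```python
# Check whether the device can run from charger power alone.
# 500 mA input and 90% efficiency are assumed values, as above.
V_IN, I_IN, EFFICIENCY = 5.0, 0.5, 0.9
V_LOAD = 12.0

p_available = V_IN * I_IN * EFFICIENCY   # ~2.25 W after conversion losses
i_max_at_12v = p_available / V_LOAD      # theoretical ceiling at 12 V

print(f"Max 12 V load current: {i_max_at_12v * 1000:.0f} mA")  # → 188 mA
```

The theoretical ceiling is about 188 mA; budgeting for 150 mA (roughly 2 W) leaves a sensible margin for converter tolerances.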
You could merge the two sets of three batteries and charge the resulting six cells in series from a 7.2 V source (assuming the cells are balanced), but the pack of two batteries needs to be charged separately from a 2.4 V source.
Nevertheless, if you can remove the batteries, it is strongly recommended to charge each cell separately so they stay balanced and the risk of corrosion is practically zero.
Best Answer
A battery's charge current is not determined by the basic battery size (e.g., AA). The battery's capacity specification (in Ah or mAh) needs to be considered: a simple AA battery can have a wide range of capacity values, some low cost and low capacity, others high cost and high capacity. The best way to start a charger design is to consult the manufacturer's specifications; most quality battery manufacturers provide them. The specifications often give the required charge current, voltage, safety limits, and other charging parameters.
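Manufacturers commonly express charge current as a fraction of capacity (a "C rate"). As a rough sketch of how a datasheet figure translates into charger settings, using an assumed 2000 mAh cell and a gentle C/10 rate (always use your actual cell's datasheet values):

```python
# Translate a capacity spec into a charge current via a C rate.
# The 2000 mAh capacity and C/10 rate are illustrative assumptions.
capacity_mah = 2000          # e.g. a typical high-capacity AA NiMH cell
c_rate = 0.1                 # gentle overnight charge at C/10

i_charge_ma = capacity_mah * c_rate
t_hours = capacity_mah / i_charge_ma * 1.4   # ~40% extra for charge losses

print(f"Charge at {i_charge_ma:.0f} mA for about {t_hours:.0f} h")
# → Charge at 200 mA for about 14 h
```

This is why two AA cells of the same physical size can need very different chargers: the capacity, not the form factor, sets the current.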
In regard to building a 20 V assembly: did you consider that battery packs in this range already exist? Manufacturers such as Black &amp; Decker have battery packs available for many of their portable products, and 20 V units are quite common. If you were to use one of these packs, the charger could also be a standard Black &amp; Decker part. If the large interface connector is a problem for your project, you may be able to remove or modify it as needed. Starting with a known matched pair of battery and charger can save you a lot of time and effort, not to mention you would know they work well together.
Note that many of the newer B&D battery products use Li battery types, though some other brands still use NiMH battery types. Some commercial battery packs are actually made up from stacks of several AA batteries.
Even if you choose not to use a commercial battery pack and charger for your project, looking at the manufacturer's specifications (and other people's hacks) of such systems can give you a lot of good ideas for making your own.