Why don't all devices use this? It adds cost and complexity. Is there any other reason for not doing something?
Seriously, I'd say that there are plenty of options and implementations for this. Having two equal batteries doesn't make much sense, so often the second is used for emergency or limp-home power. For instance, your PC has a RAM-retaining battery on the motherboard for when you lose power. A laptop often gives a "Low battery" warning, at which time you're welcome to reduce power however you can.
I think that your statement that 'batteries work best if they are used until they are completely drained, and then recharged.' is a little broad. This is more the case for Nickel-based (NiCd and, to a lesser extent, NiMH) chemistries. Lithium Ion cells don't suffer this memory problem. In fact, their lifetime improves if you avoid deep discharges. See this page from BatteryUniversity.com for reference.
There are a couple of options for doing more intelligent power management in your own devices.
The simplest is an ORing diode on the power supply. If all you want is a hot-swappable power supply and you have a bit of leeway on your input voltage, you can connect the backup battery to the anode of a diode and connect the cathode to your main battery rail. When the main battery's voltage dips to about 0.7 V below the backup (or the main battery is removed), the backup kicks in. Be careful of leakage current into the backup battery; it might overcharge it.
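To make the switchover condition concrete, here's a minimal sketch of the ORing-diode behavior, assuming a standard ~0.7 V silicon diode drop; the voltages are illustrative, not from any particular design:

```python
# Sketch of ORing-diode source selection. Assumes a silicon diode
# with a ~0.7 V forward drop; a Schottky diode would cut this to ~0.3 V.
V_DIODE = 0.7

def active_source(v_main, v_backup):
    """Return which battery actually supplies the load.

    The backup feeds the rail through the diode, so it only conducts
    once its voltage exceeds the main rail by the diode drop.
    """
    if v_backup - V_DIODE > v_main:
        return "backup"
    return "main"

print(active_source(3.7, 3.6))  # healthy main battery -> main
print(active_source(2.8, 3.6))  # main sagging -> backup takes over
print(active_source(0.0, 3.6))  # main removed -> backup
```

Note the downside this illustrates: the backup only takes over after the rail has already sagged a full diode drop, which is why the mux ICs below are worth the extra parts when your load is voltage-sensitive.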
Alternatively, you can use a power mux IC like the TPS110. This lets you select the input independently of the input voltages (or dependently, if you prefer), instead of always using the higher supply.
Finally, Linear Technology incorporates what they call "PowerPath" controllers into their battery charging ICs. I've used their LTC4011 which seamlessly transitions between battery and external power, and charges the battery while running off of the external power.
I think you can fix it as follows:
Put a pulldown on sysoff, maybe 100k. Just make sure it is strong enough to overpower the internal 5M pullup. Also keep sysoff connected to your IO expander, configured as an output driven low. When you want to power down, have the expander drive the pin high, which will overpower the 100k pulldown and turn the system off.
Once the system is off, your rails will all collapse, so the IO expander will be dead, and the 100k pulldown will ensure that sysoff is low again so that startup will be possible the next time the charger is attached. You will need to make sure the IO pin never goes high during bootup or normal operation; for example, program it low before you program it to be an output.
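A quick sanity check that the 100k pulldown really does overpower the 5M internal pullup when nothing is driving the pin. The 3.3 V rail here is an assumption; plug in your own rail voltage and typical logic-low threshold:

```python
# Resistor-divider check: sysoff voltage when only the internal 5M pullup
# and the external 100k pulldown are fighting (IO expander not driving).
# V_RAIL = 3.3 V is an assumed rail; adjust for your system.
V_RAIL = 3.3
R_PULLUP = 5e6      # internal pullup to the rail
R_PULLDOWN = 100e3  # external pulldown to ground

v_sysoff = V_RAIL * R_PULLDOWN / (R_PULLUP + R_PULLDOWN)
print(f"sysoff idles at {v_sysoff:.3f} V")  # well under any logic-low threshold
```

At roughly 0.065 V the pin is a solid logic low, so the pulldown wins by a wide margin and you could even use a weaker (higher-value) resistor if leakage current matters.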
I think you're laboring under a false assumption. Where there are power strips that turn outlets off to save power based on a 'control' outlet, the 'control' outlet is always live. It will always be consuming its 'phantom power'/'vampire power' (cue the Twilight fangirls!). All of the outlets that are slaved to the 'control' outlet will be off until the 'control' outlet has enough current flowing through it to trip the circuit. As others have suggested, this is the only reasonable way to do it - you can't be continually switching something on and off just to test whether it's on.
I have such a power strip for my home entertainment system - older solutions were to run power through your receiver via a built-in pass-through. When the receiver is off, the pass-through is off; when it's on, it's on. I can't say for certain whether it's saving me any money, but it definitely means fewer absurd 'off' lights on the electronics (seriously, a light that's on when the device is off? Crazy).
If you want to measure this 'phantom'/'vampire' power, get a Kill A Watt. It measures the power draw of whatever is plugged into it. I haven't used one yet, but many people rave about them, and Lady Ada hacked one to put an XBee unit in it for datalogging purposes, so they're at least popular.