mAh (or mA·h) is not how many milliamperes a battery can deliver in an hour. That would be mA/h. Current, measured in amperes, is already a rate: specifically, one ampere is one coulomb per second. So, if current is like speed, then mA/h is like acceleration, and mAh is like distance.
Rather, mAh is a unit of charge. It is what you get when you multiply current by time. Multiplying by time cancels the "per time" part of the ampere, and you get back to charge.
If an ampere is a coulomb per second, then:
$$ \require{cancel} 1~\mathrm{mAh} = 1\cdot10^{-3}~\mathrm{\frac{C}{s}h} $$
and by dimensional analysis:
$$ \require{cancel}
\frac{1\cdot10^{-3}~\mathrm{C\cancel{h}}}{\cancel{\mathrm{s}}}
\frac{60\cancel{\mathrm{s}}}{1\cancel{\mathrm{min}}}
\frac{60\cancel{\mathrm{min}}}{1\cancel{\mathrm{h}}}
= 3.6~\mathrm{C}$$
For example, if you draw 1 mA for 1 hour from a battery, you have used 1 mA · 1 h = 1 mAh of charge. If you draw 2 mA for 5 hours, you have used 2 mA · 5 h = 10 mAh.
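These relationships are simple enough to script. Here is a minimal Python sketch (the function names are my own) that mirrors the conversion and the examples above:

```python
def mah_to_coulombs(mah):
    """Convert milliampere-hours to coulombs: 1 mAh = 1e-3 C/s * 3600 s."""
    return mah * 1e-3 * 3600

def charge_used_mah(current_ma, hours):
    """Charge drawn (in mAh) is current (mA) multiplied by time (h)."""
    return current_ma * hours

print(mah_to_coulombs(1))      # about 3.6 C, matching the derivation above
print(charge_used_mah(2, 5))   # 10 mAh
```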
You can approximate how long a battery will last by dividing its total charge (in mAh) by your nominal load current (in mA). Say you have a 1800 mAh battery, and you connect it to a 20 mA load:
$$ \require{cancel}
\frac{1800~\mathrm{mA\cdot h}}{20~\mathrm{mA}} =
\frac{1800\cancel{\mathrm{mA}}\cdot\mathrm{h}}{20\cancel{\mathrm{mA}}} =
90~\mathrm{h}
$$
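The same estimate as a one-line Python function (a sketch; it inherits all the caveats listed below):

```python
def runtime_hours(capacity_mah, load_ma):
    """Rough battery life: capacity (mAh) divided by nominal load (mA)."""
    return capacity_mah / load_ma

print(runtime_hours(1800, 20))  # 90.0 hours
```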
This is an approximation because:
- The charge capacity (the number measured in mAh) is determined by measuring how much charge can be removed from the battery before the voltage drops to some arbitrarily selected level at which the battery is considered "discharged". This may or may not be the threshold at which your circuit stops functioning. Battery manufacturers, wanting their batteries to look as good as possible, typically select a very low threshold voltage.
- Even considering only the charge available down to some voltage threshold, the actual charge available from the battery depends on temperature and on the rate at which you discharge it. Lower temperatures slow the chemical reaction in the battery, making it harder to extract charge. Higher discharge rates increase losses in the battery, decreasing the voltage and thus hitting the "discharged" threshold sooner.
- The electric potential difference provided by the chemicals in the battery is actually constant; what makes the voltage decrease is the depletion of the chemicals around the electrodes and degradation of the electrodes and electrolyte. This is why battery voltage can recover after a period without use. So the point at which the threshold voltage is reached can be quite complex to determine.
If you can find a good datasheet for your battery, it may give some insight into the parameters under which these calculations were made.
A 3.5 V, 1000 mAh battery can supply 3.5 watt-hours. That means the battery can supply 500 watts for 0.007 hours. So, with perfect efficiency in all circuitry, including stepping up the battery's 3.5 V to the 12-24 V the motor needs, you could theoretically supply full power to the motor for about 25 seconds.
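That arithmetic, written out as a small Python sketch (function names are mine):

```python
def energy_wh(voltage_v, capacity_mah):
    """Stored energy in watt-hours: voltage times capacity in Ah."""
    return voltage_v * capacity_mah / 1000

def runtime_seconds(energy, load_w):
    """How long the stored energy lasts at a given load, in seconds."""
    return energy / load_w * 3600

e = energy_wh(3.5, 1000)
print(e)                        # 3.5 Wh
print(runtime_seconds(e, 500))  # about 25 seconds at 500 W
```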
Of course, it's very unlikely the battery could tolerate being drained in 25 seconds. The fastest draining batteries I've seen can tolerate being fully drained in 72 seconds with only moderate loss of lifespan.
You're going to have to design a circuit to step up the 3.5 volts from the battery to at least the 12V that seems to be the motor's minimum specification.
An 11.1V 30-60C LiPO battery (such as those used for drones) seems like a reasonable match for your application, and could probably run the motor at 1/2 voltage (just under 12V) and 1/2 duty (pulsed). The one I linked to stores about 24 watt hours and could probably run the motor at 1/3 speed or so for about 5 minutes. It's designed to tolerate complete discharge from a fully charged state in as little as one minute.
Update: Since this is a quadcopter application, you should be looking at using a brushless motor with an electronic speed controller. Otherwise, it will be very difficult to adjust the rotation rate of the motor quickly enough to provide a stable platform. You should look at the design of a quadcopter that's similar in size and weight to your requirements and look at what batteries, motors, speed controllers, and props it uses.
Best Answer
Did you read the battery life specs for your smartphone? Did you believe them? Calculating battery life for a smartphone is easier than doing it for a robot. There are many ways to calculate this, and @geometrikal gave a reasonable summary of it. But there is a problem with this approach. Your calculations are only as accurate as your data, and your data is terrible. I posit that while you can do these calculations, the results will be meaningless to the point that you're better off not trying (very hard).
Let's just look at your main drive motors. Some things that can affect the current draw of these motors are: speed, weight, floor surface (dirt, tile, carpet), acceleration, braking, etc. Can you accurately predict the usage of your robot and figure out how much power your motor will require? Probably not.
Now look at the arm motors. Same thing applies here. Can you predict how the arm will be used? How much current will the arm require when picking up something heavy vs. something light?
How about your CPU? The power consumption of the CPU depends on what the software is doing. Doing lots of complex calculations with massive memory accesses will consume a lot of current, while an idle CPU will consume much less. Many CPUs also have ways to achieve lower power modes by reducing the clock rate, going into a sleep mode, and turning off various peripherals. Have you mapped out how your software is going to work? Does your OS support various power-down modes, and if so, which ones?
Then there is your power system. What is the efficiency of your power supplies at different loads? A typical SMPS efficiency can vary from 60% to 95% depending on the design and what load it is at. If the load is constant then the efficiency of the power supply and the wiring will be different than if the load is pulsed (a.k.a. PWM-ing the motors). Have you worked all this out?
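To see how much the supply efficiency alone moves the numbers, here is a small sketch (the voltages, load, and efficiencies are hypothetical, just to illustrate the spread):

```python
def battery_current_a(load_power_w, efficiency, battery_v):
    """Current drawn from the battery to deliver load_power_w through a
    converter with the given efficiency (0..1)."""
    return load_power_w / (efficiency * battery_v)

# The same 10 W load from a 7.4 V pack, at the two efficiency extremes:
print(battery_current_a(10, 0.60, 7.4))  # roughly 2.25 A at 60% efficiency
print(battery_current_a(10, 0.95, 7.4))  # roughly 1.42 A at 95% efficiency
```

A ~60% swing in battery current from the converter alone, before the load itself varies at all.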
The accuracy of this data is going to directly affect the accuracy of your battery life estimates. The problem is that your accuracy is going to be terrible. There might be a 2x to 20x difference between your low and high estimates.
Here is what I recommend doing:
1. Go through the exercise with worst case and reasonable numbers. Don't worry about getting it super accurate (since it won't be anyway). Basically all you are doing is seeing if the size of battery is "somewhere near correct". Then, if possible, choose the next larger battery size!
2. Once the robot is built, build something like a robot course. This is a basic set of operations/movements/etc that the robot can do over and over, exactly the same way each time. Hopefully this course will approximate what you think will be a typical use for the robot. This course does two things: it tells you what you can expect, but more importantly it gives you a way to judge whether any power improvements you made really worked!
Note: The battery life figures that you get from step 2 are only estimates. Even those are only as accurate as your test course. It won't be super accurate for real-world uses, but it will be a whole lot more accurate than what you did for step 1, and more accurate than what you might have gotten if you had spent weeks calculating everything out.
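The "worst case and reasonable numbers" exercise from step 1 can be as simple as the sketch below. All the currents and the capacity here are made-up placeholders; the point is only to see how wide the bracket comes out:

```python
# Per-subsystem current guesses in mA: (nominal, worst case).
loads_ma = {
    "drive motors": (200, 1500),
    "arm motors":   (50, 600),
    "cpu + logic":  (80, 250),
}
capacity_mah = 2200  # hypothetical battery

nominal = sum(lo for lo, hi in loads_ma.values())   # 330 mA
worst   = sum(hi for lo, hi in loads_ma.values())   # 2350 mA

print(capacity_mah / nominal)  # optimistic estimate, hours
print(capacity_mah / worst)    # pessimistic estimate, hours
```

Even with these mild placeholder numbers the two estimates differ by about 7x, which is exactly why step 2's measured course matters more than the spreadsheet.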