Battery x is rated at 5 volts, 2500 mAh.
Device y draws 5 volts, 5 mA. This means the battery would last 500 hours, correct?
Correct.
Now, device z draws 5 volts, 3000 mA. This means the battery would last under an hour, correct?
Correct.
Now, circuit d draws 5 volts, 2500 mA. The battery would last 1 hour, correct?
Correct.
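To make that arithmetic concrete, here is a minimal sketch of runtime = capacity / current, using only the figures from the examples above (it ignores real-world derating):

```python
# Battery runtime approximation: t (hours) = capacity (mAh) / current draw (mA).
# Values are the ones from the examples above; real batteries derate at high draw.

def runtime_hours(capacity_mah: float, draw_ma: float) -> float:
    """Ideal runtime in hours for a constant current draw."""
    return capacity_mah / draw_ma

battery_mah = 2500
for name, draw_ma in [("device y", 5), ("device z", 3000), ("circuit d", 2500)]:
    print(f"{name}: {runtime_hours(battery_mah, draw_ma):.2f} h")
# device y: 500.00 h, device z: 0.83 h, circuit d: 1.00 h
```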
Now take an AC-to-DC adapter rated at 12 volts, 1 A.
What does that amp rating refer to in this case?
That's the maximum the supply is capable of providing before it overheats and melts.
Is it used to work out the maximum resistance that a plugged-in device could have?
No, it's purely a maximum rating.
Would this adapter work as the battery in a circuit?
In effect, yes. It can be seen as a fixed voltage source.
So batteries don't output a fixed amperage; the current changes depending on the voltage/resistance of the device.
That is correct. A battery is a (reasonably) fixed voltage, but the current is dependent on the load (Ohm's Law). The mAh of a battery is how much charge is stored in it. A power supply doesn't have this concept, as the charge is essentially infinite. A battery at 2500 mAh has enough charge to supply 2500 mA for 1 hour, or, for other currents, as an approximation:
$$
t = \frac{mAh}{mA}
$$
As @PeterBennet mentions, it's rated at a pre-defined discharge time. Discharging faster than that rated time can result in less apparent capacity.
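As a rough illustration of "the current depends on the load", here is a sketch that assumes a purely resistive load (real devices usually aren't), combining Ohm's Law with the runtime formula above:

```python
# With a fixed-voltage source, the load sets the current (I = V / R),
# and that current then sets the approximate runtime. Resistive load assumed.

def load_current_ma(volts: float, load_ohms: float) -> float:
    return volts / load_ohms * 1000  # Ohm's law, converted to mA

volts = 5.0
capacity_mah = 2500
for load_ohms in (1000, 10, 2):
    i_ma = load_current_ma(volts, load_ohms)
    print(f"{load_ohms} ohm load: {i_ma:.0f} mA -> ~{capacity_mah / i_ma:.1f} h")
# 1000 ohm: 5 mA -> ~500 h, 10 ohm: 500 mA -> ~5 h, 2 ohm: 2500 mA -> ~1 h
```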
Now wattage. Why are some devices rated in watts?
For example, my PSU is 600 watts.
So it outputs 600 watts total over a bunch of 12 V, 5 V and 3.3 V cables?
Its input is 220 V, so the amperage it draws is about 2.7 A?
Or is that the amperage it outputs?
Again, it's the upper limit. It can output 600 watts in total over the different outputs before it overheats. For a supply with multiple voltages it's often easier to specify the total limit using a voltage-agnostic value, like Watts. If you just specify current, then you have to specify it for each voltage, and if some current may be shared between voltages (e.g., cascading 3.3V off 5V) then it gets more confusing. Simpler just to give a total power rating.
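A quick sanity check on those numbers, as a sketch that ignores the supply's efficiency (which in reality makes the input draw somewhat higher) and uses a made-up per-rail split just to show the shared budget idea:

```python
# Input current at full load: I = P / V (efficiency ignored).
rated_watts = 600
mains_volts = 220
print(f"Max input current: {rated_watts / mains_volts:.1f} A")  # ~2.7 A

# The 600 W is a total output budget shared across rails.
# Hypothetical per-rail allocation, not real PSU specs:
rails = {12.0: 480, 5.0: 100, 3.3: 20}  # volts -> watts
assert sum(rails.values()) <= rated_watts
for volts, watts in rails.items():
    print(f"{volts} V rail: up to {watts / volts:.1f} A for {watts} W")
```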
It of course varies from chemistry to chemistry, but also from one manufacturing process to the next.
A good battery manufacturer knows what they are doing and can give you very detailed specifications for the allowable maximum rating, advised rating and absolute peak rating. For LiPo there are many cheap manufacturers whose cells get relabelled and resold, and then, well, you can't be sure, and you might be best off sticking to 0.5C, because a cheap manufacturing process in the lithium trade can create unequal surfaces, and high currents may then cause very annoying aberrations on the "plates" of the cell. Which is bad for its life span to the order of NO!
If it's a very doubtful cell, I'd say don't even go over 0.25C.
But with increased experience and insight into how to make a good lithium-based cell, there are procedures that create cells that handle 5C charging with only a 10% decrease in usable cycle life, which in many products is very desirable.
So, TL;DR:
If you know the battery's specification, because it comes from a reliable source with a datasheet showing many charge and discharge graphs and such, you can go by whatever it says. If you don't, at least stick to below 1C, and if possible below 0.5C.
(Very good factories even include a graph showing the effect of charge and discharge current on expected average cell lifetime in cycles.)
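To translate those C figures into actual currents, here is a short sketch; the capacity is only an example, and the safe rate for a real cell comes from its datasheet, not from this:

```python
# C-rate to current: an nC rate on a cell of capacity Q mAh means n * Q mA.
capacity_mah = 2500  # example capacity only
for c_rate in (0.25, 0.5, 1.0, 5.0):
    print(f"{c_rate}C on a {capacity_mah} mAh cell = {c_rate * capacity_mah:.0f} mA")
# 0.25C = 625 mA, 0.5C = 1250 mA, 1C = 2500 mA, 5C = 12500 mA
```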
Best Answer
There is a charge controller chip inside the phone that determines how much current to put into the battery. Generally, lithium-ion batteries are charged with a constant current until the cell voltage reaches a specific level, at which point the charge controller switches over to constant-voltage charging until the current drawn by the cell tapers off to a small cutoff value. It's a bit difficult to think about in terms of resistances, as the cell itself has chemical reactions going on inside and the charge controller is built up with many transistors.
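Here is a rough, hypothetical sketch of that CC/CV decision logic; real controller ICs do this in analog hardware or firmware and add pre-charge, timers and temperature checks, all omitted here, and the setpoints below are assumed example values:

```python
# Very simplified CC/CV charging logic, with hypothetical setpoints.

CHARGE_CURRENT_MA = 1000   # constant-current setpoint (assumed)
CV_VOLTAGE = 4.2           # per-cell constant-voltage setpoint
TERMINATION_MA = 50        # stop when the taper current falls this low (assumed)

def charge_step(cell_voltage: float, cell_current_ma: float) -> str:
    """Return which phase the charger should be in for the measured cell state."""
    if cell_voltage < CV_VOLTAGE:
        return "constant-current"   # push CHARGE_CURRENT_MA into the cell
    if cell_current_ma > TERMINATION_MA:
        return "constant-voltage"   # hold CV_VOLTAGE and let the current taper off
    return "done"                   # taper current low enough; stop charging
```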
One thing to note about ratings: the rating on the power supply is generally the nominal voltage and maximum current. It does not supply the current on the label at all times. It's quite easy to see why this is: when nothing is connected, there is no path for the current to flow so the current is zero.
Charge controllers generally regulate the flow of current into the cell in one of two ways. Depending on the design of the charge controller, the controller IC can use a transistor to act either as a switch or as a variable resistance. Linear charge controllers work like super fancy variable resistors, changing the resistance between the charger input and the battery terminal so that a specific amount of current flows. The current is usually measured with a current sense resistor, a resistor with a small value (generally 0.01 to 0.5 ohms) that generates a small voltage in proportion to the current. The measured current is then used in an analog feedback loop to control the transistor. This drive transistor dissipates the difference in voltage between the charger input and the cell as heat, P = (Vcharger - Vcell) * Icell. Linear charge controllers are generally small and cheap, but inefficient. This dissipated power can result in quite a bit of extra heat that has to be dissipated somewhere. Linear charge controllers also must have a higher input voltage than the desired cell charge voltage. Lithium-ion batteries generally charge to around 4.2 volts per cell, so a single cell with a 5 V power supply leaves the charge controller around 800 mV to work with.
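Plugging the 5 V / 4.2 V example into the dissipation formula above (the charge current is an assumed illustrative value):

```python
# Linear charger dissipation: P = (Vcharger - Vcell) * Icell, per the formula above.
v_charger = 5.0   # USB supply voltage
v_cell = 4.2      # typical Li-ion charge voltage
i_cell = 1.0      # charge current in amps (assumed for illustration)
p_dissipated = (v_charger - v_cell) * i_cell
print(f"Pass transistor dissipates {p_dissipated:.2f} W")  # 0.80 W at 1 A
```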
Another design of charge controller is a switching controller. These controllers use a DC to DC converter to move charge into the cell. A DC to DC converter uses two switches (generally a transistor and a diode) and some form of energy storage (generally an inductor and several capacitors) to efficiently change the input voltage. A step-down converter (also known as a buck converter) works by alternately storing up and draining energy in the inductor at a high frequency (100s of kHz to a few MHz). Since the transistors are either fully on or fully off most of the time, less power is dissipated, making the converter more efficient. It is also possible to design a converter that can draw power from a supply with lower voltage than the cell voltage. Aside from the DC to DC converter, the operation of a switching charge controller is essentially the same as a linear charge controller: it measures the cell current and voltage and generates a control signal to adjust the duty cycle of the switching transistor to change the current flowing into the battery. Switching charge controllers are more complex and more expensive, but more efficient than linear charge controllers.
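For the buck (step-down) case, the ideal relationship between input, output and duty cycle is simple enough to show in a few lines; this assumes an ideal converter with losses ignored, and the values are examples only:

```python
# Ideal buck converter: Vout = D * Vin, so D = Vout / Vin, and the input current
# is roughly Iout * D (power balance, losses ignored).
v_in, v_out, i_out = 5.0, 4.2, 1.0   # example values only
duty = v_out / v_in
i_in = i_out * duty                   # ideal: input power equals output power
print(f"Duty cycle ~{duty:.0%}, input current ~{i_in:.2f} A")
```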
Now, as for how much current the charge controller can draw to charge the battery, this is generally determined by the software running on the phone. When you connect the phone to your computer's USB port, it can only draw a limited amount of power before it has to ask the computer for permission to draw more. Cell phone chargers generally advertise their current limit via a resistor connected between the USB data lines. This resistor is detected and measured and the corresponding current limit is then passed along to the charge controller so it knows how much current it can safely draw to charge the battery.
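A toy sketch of that "detect, look up, hand the limit to the charge controller" flow; the resistance-to-current mapping below is entirely hypothetical and not the values any real charger standard uses:

```python
# Hypothetical mapping from a detected data-line configuration to a current limit,
# purely to illustrate the lookup step described above.
HYPOTHETICAL_LIMITS_MA = {
    "no_resistor_detected": 500,  # e.g. a plain data port: stay conservative
    "resistor_a": 1000,
    "resistor_b": 2000,
}

def negotiate_current_limit(detected: str) -> int:
    """Return the current limit (mA) to hand to the charge controller."""
    return HYPOTHETICAL_LIMITS_MA.get(detected, 500)  # default to a safe low limit
```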
As far as sharing power with the battery charger, the phone will certainly draw additional power above and beyond what goes into the battery. In fact, depending on how the phone is configured, it can draw more power when plugged in to a charger than it would if it were running off its internal battery, using this current to provide a brighter display, a longer backlight-on time before standby, higher CPU performance, etc.