Given this number of questions all in one blob, I'll only discuss each one briefly, but this could get you started:
When connecting the rails together for a +/- voltage, will there be a reverse current through one supply? If yes, will a standard linear supply design (transformer/rectifier/LM317) handle that with no problems? Considering electrolytic caps and such.
A negative supply will need to "sink" current rather than source it. But I wouldn't call this a "reverse" current; it's the correct direction for current to flow in a negative supply.
An LM317 is a positive linear regulator. It can only source current, not sink current. For your negative supply you will need a negative regulator, like an LM337.
You always need to connect your electrolytic capacitors so that their + terminals are connected to a more positive voltage than their - terminals.
Also, if yes above: how can I home-hack a µC based ammeter that can understand current in both directions?
You will need an analog circuit to translate the sense-resistor voltage into the range of your A/D input. This is a big enough topic for a whole question of its own.
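As a rough illustration of what that analog stage has to do, here is a sketch of the µC-side math, assuming a hypothetical front end that amplifies the shunt voltage by some gain and offsets it to mid-scale so that current in either direction lands inside the ADC range (all component values below are assumptions, not a recommendation):

```python
# Sketch: recovering bidirectional current from an ADC reading, assuming a
# hypothetical sense circuit that amplifies the shunt voltage by GAIN and
# biases it to mid-scale (so zero current reads half of full scale).
ADC_BITS = 10          # e.g. a typical 10-bit microcontroller ADC (assumed)
V_REF = 5.0            # ADC reference voltage (assumed)
R_SHUNT = 0.1          # shunt resistance in ohms
GAIN = 10.0            # analog front-end gain (assumed)

def adc_to_current(adc_counts: int) -> float:
    """Convert a raw ADC reading back to shunt current in amps."""
    full_scale = 2 ** ADC_BITS
    v_adc = adc_counts * V_REF / full_scale   # voltage seen at the ADC pin
    v_shunt = (v_adc - V_REF / 2) / GAIN      # remove mid-scale offset, undo gain
    return v_shunt / R_SHUNT                  # Ohm's law

print(adc_to_current(512))   # mid-scale reading -> 0 A
print(adc_to_current(0))     # bottom of range -> maximum negative current
```

With these assumed values the measurable range is about ±2.5 A; the real design question is how to build that offset-and-gain stage accurately, which is the "whole question of its own" mentioned above.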
For a transformer with this pinout: pin 1-2 = 12 V, pin 3-4 = 12 V, what happens if we measure voltage between pins 1 and 3, or 2 and 4, with a multimeter? Is a 0 volt reading guaranteed, or does something else happen?
If you are still talking about two unconnected secondary coils wound on the same core as the primary, there is no guarantee about the DC voltage between them except by what you wire up external to the transformer. The AC voltage will be in phase for the two secondaries, because each just follows the input on the primary. Of course, by observing the "dot" markings on the two coils, you could arrange for the two outputs to be 180 degrees out of phase.
My current design has a 0.1 ohm shunt resistor (the supply is 1.2-13 volts or so, at up to 1.5 A). Is this a decent value? Lower is of course better for the output voltage, but the opamp has to amplify it more. I don't want to lose too much precision due to amplified noise!
The resistor value won't much affect the output voltage, provided you take your feedback for your regulator after the shunt resistor.
For 1.5 A through 0.1 Ohms, your resistor will be burning 225 mW. This is somewhat excessive for this function, and will cause the resistor value to drift due to self-heating. So you can either lose precision due to noise by reducing the resistor or lose accuracy due to thermal effects by keeping it large. I'd expect you could drop the resistor to 0.01 Ohms and still get good precision in your current measurement (10 bit at least) but that will depend on good analog design.
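The tradeoff described above is easy to check numerically. This is a quick sketch, using the values from the question (0.1 Ω, 1.5 A) and the suggested 0.01 Ω alternative:

```python
# Quick check of the shunt-resistor tradeoff: worst-case power dissipated
# (which drives self-heating and thermal drift) vs. the full-scale sense
# voltage available to the op-amp stage (which drives noise sensitivity).
def shunt_tradeoff(r_shunt: float, i_max: float = 1.5):
    p_diss = i_max ** 2 * r_shunt   # worst-case dissipation, watts
    v_sense = i_max * r_shunt       # full-scale sense voltage, volts
    return p_diss, v_sense

for r in (0.1, 0.01):
    p, v = shunt_tradeoff(r)
    print(f"R = {r} ohm: {p*1000:.1f} mW dissipated, {v*1000:.0f} mV full scale")
```

Dropping from 0.1 Ω to 0.01 Ω cuts dissipation from 225 mW to 22.5 mW, but also shrinks the full-scale signal from 150 mV to 15 mV, so the analog stage needs ten times the gain for the same ADC range.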
If you really want exceptional accuracy in this application you may want to look into "bulk metal foil" resistors from Vishay, which are much more stable than carbon film and other types w.r.t. thermal drift (and a bunch of other effects).
Assuming all voltages ultimately come from the same mains jack in one room, is it ever dangerous to put a multimeter set to voltage between ANY two places? For example, between the DC PSU's output and earth ground, or between earth ground and a wall's live wire, etc?
If you are using correctly-rated probes and a correctly-rated meter, you should be able to probe mains without damaging the instrument or injuring yourself. With incorrectly-rated equipment you could start a fire or electrocute yourself. Some meters (like hardware-store models) labelled for 100-200-400 V are not actually designed safely, so stick to reputable brands (Fluke, Keithley, Agilent, ...).
Building an AC-DC power supply inherently means working with mains, so if you don't know how to keep yourself safe while doing that, you might want to consider alternative projects until you get more experience.
Best Answer
I think you are focusing on a small fact about power conversion efficiency and ignoring all the more important factors (this is partly why Tesla's AC system beat Edison's DC system).
Edit:
Copper is expensive: the London Metal Exchange lists one tonne of copper (cash) at $7050, and that hasn't yet been processed into cable.
By comparison, the London Metal Exchange lists one tonne of steel billet at $430, i.e. copper is roughly 16x more expensive than steel.
So how thick must the 50 A 12 V cable be so as not to waste more energy as heat than AC? (Remember, most of the benefit cited is efficiency, so it seems reasonable to require that the resistive losses in the DC cable are no worse than with 230 V AC.)
Let's compare AC mains cable rated at, say, 15 A (Europe, 230 V) with 12 V 50 A DC cable.
Power loss = I^2 R
15^2 Rac = 50^2 Rdc
Rdc = Rac x 15^2/50^2 = Rac x 0.09,
i.e. Rdc must be ~11x lower resistance than Rac to achieve a similar power loss
The DC cable would need over 11x greater cross-sectional area than the AC cable to reach the same power loss as heat. Put another way, DC house wiring would need ~11x more copper for the losses in the DC cables to be no worse than with AC mains.
Further, while the 230 V 15 A AC cable can carry about 3.45 kW, the 12 V 50 A DC cable carries only 600 W for the same losses.
Yes: for 11x more copper (to maintain comparable energy losses to AC), the DC cable carries roughly 1/6th of the power. Even at the scale of a house, the proposed low-voltage DC cabling is not viable versus 230 V AC.
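The derivation above can be reproduced in a few lines, starting from equal I²R loss in both cables:

```python
# Re-deriving the AC-vs-DC cable comparison: for equal I^2*R loss, the DC
# cable's resistance (and hence copper cross-section) must scale with the
# square of the current ratio.
V_AC, I_AC = 230.0, 15.0   # European mains circuit
V_DC, I_DC = 12.0, 50.0    # proposed low-voltage DC circuit

r_ratio = (I_AC / I_DC) ** 2                  # Rdc / Rac for equal loss
area_ratio = 1 / r_ratio                      # copper cross-section ratio
power_ratio = (V_AC * I_AC) / (V_DC * I_DC)   # deliverable power ratio

print(f"Rdc/Rac = {r_ratio:.2f}")                       # 0.09
print(f"copper needed: {area_ratio:.1f}x")              # 11.1x
print(f"AC carries {power_ratio:.2f}x more power")      # 5.75x
```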
The significant and important economic, practical, political and transition costs seem to dwarf any notional benefit.