The output impedance of the potentiometer (with both ends of the track connected to low impedances) varies with the wiper position, and is greatest at mid-travel. There it is the parallel combination of the two half-track resistances, which works out to one-fourth of the potentiometer's total resistance.
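For reference, with the wiper a fraction \$x\$ of the way along a track of total resistance \$R\$, the two track sections are \$xR\$ and \$(1-x)R\$ in parallel:

$$ Z_\mathrm{out} = xR \parallel (1-x)R = \frac{xR \cdot (1-x)R}{xR + (1-x)R} = x(1-x)R $$

This peaks at \$x = 1/2\$, where \$Z_\mathrm{out} = R/4\$.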
Each ADC datasheet gives a specification for the maximum driving (source) impedance. If the pot's output impedance is greater than this figure, accuracy will suffer. Some ADCs care more than others about source impedance. Some datasheets give an equivalent circuit for the ADC's input; you can use it to estimate the error you are likely to see, either by hand or with a SPICE simulator.
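As a rough sketch of that kind of analysis (the component values below are hypothetical placeholders, not from any particular datasheet), many SAR ADC inputs can be modelled as a switch resistance charging a sampling capacitor from the source impedance:

```python
import math

# Hypothetical values; substitute the figures from your ADC's datasheet.
R_source = 2.5e3    # pot output impedance at mid-travel: 10 kOhm pot / 4
R_switch = 1.0e3    # ADC internal sampling-switch resistance (ohms)
C_sample = 10e-12   # ADC sampling capacitor (farads)
t_acq    = 500e-9   # acquisition/sampling time (seconds)

# First-order model: the sampling cap charges toward the input voltage
# with time constant tau, so the fraction left unsettled is exp(-t/tau).
tau = (R_source + R_switch) * C_sample
unsettled = math.exp(-t_acq / tau)

# Compare against 1/2 LSB of a 12-bit conversion.
half_lsb = 0.5 / 2**12
print(f"tau = {tau * 1e9:.1f} ns, unsettled fraction = {unsettled:.2e}")
print("settles within 1/2 LSB" if unsettled < half_lsb else "source too slow")
```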
In particular, watch out for some PIC microcontrollers, which have a surprisingly low input impedance and a correspondingly low maximum driving impedance spec.
The low-pass filter can be there for one of several reasons:
1) Anti-aliasing: removing frequencies above the Nyquist frequency before sampling. But a hand-operated potentiometer just won't generate enough high-frequency content for aliasing to be a problem.
2) The capacitor is right at the ADC input pin and is intended to reduce the driving impedance, at least at high frequencies. I've had mixed success with this method. It doesn't work if there is DC leakage at the ADC input.
3) The capacitor is across the potentiometer's wiper. The common failure mode for potentiometers is that the track gets dirty and makes intermittent contact. You have experienced this problem if you've ever had an old radio that made loud crackling sounds when you touched the volume knob. With a capacitor across the wiper, the wiper voltage doesn't change if it loses contact with the track intermittently. (This trick only works for DC signals.) A rough way to size the capacitor for cases 2 and 3 is sketched below.
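For cases 2 and 3, one quick sanity check (using a hypothetical 10 kΩ pot and 100 nF capacitor; the worst-case source impedance is the R/4 figure from above) is the resulting RC corner frequency:

```python
import math

R_pot = 10e3       # hypothetical 10 kOhm potentiometer
R_src = R_pot / 4  # worst-case wiper impedance (mid-travel, from above)
C = 100e-9         # candidate capacitor at the wiper (100 nF)

# Single-pole RC corner frequency: f_c = 1 / (2*pi*R*C)
f_c = 1 / (2 * math.pi * R_src * C)
print(f"corner frequency = {f_c:.0f} Hz")  # ~637 Hz

# A hand-turned pot produces well under ~10 Hz of "signal", so this corner
# passes the control movement while shunting HF pickup and briefly holding
# the wiper voltage through intermittent contact.
```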
Remember, op-amps are pretty cheap. I would recommend always buffering the signal before the ADC unless your production volumes are high enough that the cost saving of omitting the buffer is worth it.
Looks like you are thinking many good thoughts; none of your points are counterproductive when coupled with careful layout.
As for other techniques, maybe it's worth taking a look at the bigger picture. Here are some ideas for that.
When using 24-bit resolution (that is a LOT of resolution) for bio signals, much of that range is often really spent absorbing common-mode voltage and DC electrode offset; the signal itself may only need maybe 5-8 bits of resolution. There are other ways of achieving both high common-mode rejection (instrumentation/differential amplifiers) and DC offset suppression (right-leg drive), etc.
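To put rough numbers on that bit budget (the amplitudes below are illustrative, ECG-ish figures, not from any specific system):

```python
import math

# Illustrative, roughly ECG-like figures; substitute your own.
full_scale = 0.6    # range sized to absorb +/-300 mV of electrode offset (V)
signal_pp  = 1e-3   # the signal you actually care about, peak-to-peak (V)
adc_bits   = 24

lsb = full_scale / 2**adc_bits
offset_bits = math.log2(full_scale / signal_pp)  # range spent on DC offset
signal_bits = adc_bits - offset_bits             # range left for the signal
print(f"LSB = {lsb * 1e9:.1f} nV")
print(f"~{offset_bits:.1f} bits absorb offset, ~{signal_bits:.1f} span the signal")
# With the DC offset removed up front (e.g. by an instrumentation amp and
# right-leg drive), a much smaller input range would do the same job.
```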
Bio signals are often very low frequency, so low-pass filtering at the inputs can reduce the noise pickup.
Typical bio signals have a very high impedance nature, which means very low currents, and that makes the cable sensitive to noise. Shortening the cable, using shielded cable, or bringing the impedance down (by using an active electrode instead of a passive one) can all improve the situation.
As a final note: Think about where you want the return currents to run. That is the key to low voltage/low current design. Make sure you are not sharing the signals return current path with any other currents.
The datasheet says "Low input voltage noise: 2.2 nV/√Hz". The parameter is really just input voltage noise, and they are saying that it's low to make the part sound more awesome.
The noise is usually modelled as having a constant spectral density. Thus, the spectral density of noise power would be specified in \$\mathrm{W/Hz}\$. The higher the bandwidth (for an ADC, this depends on your sampling rate), the more spectrum there is with noise in it, so the more total noise power there is in your measurement.
However, this parameter is voltage noise. Power is proportional to the square of voltage:
$$ P = \frac{E^2}{R} $$
So if \$R\$ is constant (here it's the input impedance of the ADC, which is approximately constant), and you want just the voltage component of the noise (because the ADC measures voltage, not power), then you take the square root of the noise power density and are left with a quantity in units of \$\mathrm{V/\sqrt{Hz}}\$.
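Spelled out in units (using \$P = E^2/R\$, so \$\mathrm{W \cdot \Omega} = \mathrm{V^2}\$):

$$ \sqrt{\frac{\mathrm{W}}{\mathrm{Hz}} \cdot \Omega} = \sqrt{\frac{\mathrm{V^2}}{\mathrm{Hz}}} = \frac{\mathrm{V}}{\sqrt{\mathrm{Hz}}} $$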
With this parameter you can estimate how much voltage noise there will be, and thus how much noise there will be in your measurements. Say your input bandwidth is 24 kHz: take the square root of the bandwidth and multiply it by the input voltage noise spectral density to get the RMS noise voltage:
$$ \require{cancel} \frac{2.2\:\mathrm{nV}}{\cancel{\sqrt{\mathrm{Hz}}}} \sqrt{24000}\cancel{\sqrt{\mathrm{Hz}}} \approx 341\:\mathrm{nV_{(RMS)}}$$
This means that the measurements you get from the ADC will look as if noise with an RMS amplitude of 341 nV were added to your signal, given a 24 kHz input bandwidth. More bandwidth means more noise; less bandwidth, less noise.
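To judge whether that matters, compare it to the size of one code step. Assuming, purely for illustration (the part being quoted isn't specified here), a 24-bit converter with a 5 V full-scale range:

```python
rms_noise = 341e-9   # RMS noise voltage from the calculation above (V)
full_scale = 5.0     # hypothetical full-scale range (V)
bits = 24

lsb = full_scale / 2**bits   # ~298 nV per code at 24 bits over 5 V
print(f"LSB = {lsb * 1e9:.0f} nV, noise = {rms_noise / lsb:.2f} LSB RMS")
# ~1.1 LSB RMS: the bottom bit or so of every sample is noise at this
# bandwidth, and halving the bandwidth cuts the noise by sqrt(2).
```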
Further reading: ADC Input Noise: The Good, The Bad, and The Ugly. Is No Noise Good Noise?