So you need to measure supply current at 10 ksamp/s from 100µA to 30mA, which is a 300:1 range.
That by itself sounds doable enough. Even a 10 bit A/D built into a microcontroller is enough resolution if the signal is amplified properly. 10 kHz sample rate is also quite doable. In fact, I'd want to sample faster than that and do a little low pass filtering and decimation in the micro. 100 kHz sample rate isn't even pushing it for something like a PIC 24H. At 40 MIPS that would leave 400 instructions/sample. That is much more than needed for a little low pass filtering and background bookkeeping, so that checks out fine too.
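A minimal sketch of that filter-and-decimate step, in C. The boxcar length, data types, and calling convention here are my assumptions, not anything from a specific part; a real design might use a better filter, but even this fits easily in a 400-instruction budget:

```c
#include <stdint.h>

#define DECIM 10  /* 100 kHz ADC rate -> 10 kHz output rate */

/* Boxcar (moving-average) low pass and decimate-by-10.  Call once per
   ADC sample, e.g. from the 100 kHz ADC interrupt.  Returns 1 when a
   new 10 kHz output sample is ready in *out. */
int lp_decimate(uint16_t adc_sample, uint32_t *out)
{
    static uint32_t acc = 0;  /* running sum of the current block */
    static uint8_t  n   = 0;  /* samples accumulated so far */

    acc += adc_sample;
    if (++n < DECIM)
        return 0;
    *out = acc;  /* sum of 10 samples; scale downstream as needed */
    acc = 0;
    n = 0;
    return 1;
}
```

Leaving the output as the raw 10-sample sum avoids a divide in the interrupt and buys an extra bit or two of effective resolution.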
The real question is what the power feed looks like and to what extent you can break into it. Are the units under test powered with LDOs? That would be useful, since a small current sense resistor before the LDO wouldn't affect the unit under test's supply voltage at all. You'd have to subtract off the LDO current, but that is doable. By putting the current sense before the LDO, you can afford to have it drop a little more voltage since the LDO will make sure the UUT still sees the same supply voltage. This of course assumes there is enough input voltage headroom to play with.
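Subtracting off the LDO current can be as simple as a one-point calibration: measure once with the UUT disconnected (or held in reset), then subtract that baseline from every later reading. A sketch, assuming readings already converted to µA; the function names and the clamp-to-zero behavior are my invention:

```c
#include <stdint.h>

static int32_t ldo_quiescent_uA;  /* LDO's own draw, measured once */

/* Call with the UUT disconnected to record the LDO baseline. */
void calibrate_baseline(uint32_t measured_uA)
{
    ldo_quiescent_uA = (int32_t)measured_uA;
}

/* Net UUT current: total measured minus the LDO baseline,
   clamped so measurement noise can't produce negative current. */
int32_t uut_current_uA(uint32_t measured_uA)
{
    int32_t i = (int32_t)measured_uA - ldo_quiescent_uA;
    return (i > 0) ? i : 0;
}
```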
If you have to put the current sense directly in line with the UUT, then you have to carefully consider voltage drop versus sensitivity, and therefore ultimately signal to noise ratio. Maybe 1Ω is reasonable. That would only drop 30mV max, which wouldn't affect most devices much at all. You'd need a differential amplifier and an overall gain of 100 so that 0-30mA results in 0-3.0V, which is just about the right target for a processor running at 3.3V. Various folks make such diff amps, or specifically high side current sense amps. If this is a one off, I'd start with Analog Devices. A 10x diff amp with 1 MHz gain-bandwidth shouldn't be hard to find. That would need to be followed by an ordinary 10x amp before the micro, again with 1 MHz gain-bandwidth being adequate. You could try doing the whole thing with a single 100x diff amp, but the gain-bandwidth product would need to be at least 10 MHz, so the choices will be more limited.
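With that front end (1Ω sense, ×100 total gain, 10 bit ADC referenced to 3.3V), converting a raw reading back to current is a single scale factor: full scale (code 1023) corresponds to 3.3V / 100 / 1Ω = 33mA. A sketch in integer math suitable for the micro:

```c
#include <stdint.h>

/* Assumed front end: 1 ohm sense, x100 gain, 10-bit ADC, 3.3 V ref.
   Full scale (code 1023) is then 3.3 V / 100 / 1 ohm = 33 mA. */
uint32_t adc_to_microamps(uint16_t adc_code)
{
    /* 33 mA full scale = 33000 uA; 1023 * 33000 fits in 32 bits */
    return ((uint32_t)adc_code * 33000u) / 1023u;
}
```

Note the top tenth of the ADC range (30-33mA) is spare headroom above the 30mA maximum, which is a nice place for overcurrent detection to live.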
I don't know why everyone else is focusing on the carrier frequency. The data throughput is a question about the modulating frequency (i.e., baud rate).
The Doppler effect applies to the modulation as well. If the transmitter is moving toward the receiver, the baud rate is increased by a factor of about 1 + v/c = 1.000000092 for the speed in question; if it is moving away, the factor is 0.999999908 and the baud rate is reduced.
There is also relativistic time dilation, which is based on what fraction of the speed of light the relative velocity is. If it takes Δt seconds to transmit some number of symbols, they will arrive at the receiver in Δt' seconds:
$$\Delta t' = \frac{\Delta t}{\sqrt{1 - \frac{v^2}{c^2}}}$$
So yes, the data throughput from the receiver's point of view is reduced by a very tiny fraction at the speed you're talking about.
For a speed of 27.778 m/s, that fraction is about 4.28669×10⁻¹⁵, giving a ratio of 0.9999999999999957.
The 2.4 GHz carrier for the radio is generated by a dedicated voltage controlled oscillator, locked to a low frequency reference with a PLL for stability. The data to be transmitted is not actually synthesized at 2.4 GHz: it is generated by a digital-to-analog converter at several MSa/s, and a mixer translates the output of the DAC up to the required RF channel frequency. Dedicated signal processing logic translates the actual packet data into baseband modulated samples that get sent to the DAC. The processor only provides the packet data at the beginning of the transmit chain; the rest is handled in dedicated digital and analog hardware.

The receive chain is similar to the transmit chain, except operating in the other direction and with a few additional components for tracking the carrier.