I'd recommend looking at 433MHz or 868MHz (ISM band) transmitters.
TI/Chipcon have the CC1101, a low-cost Sub-1GHz transceiver. There's also an 8051 system-on-chip variant, the CC1110, which could remove the need for your AVR (see also the CC430).
Silicon Labs have the Si403x or the system-on-chip Si4010.
All of these chips support data rates up to 128 kbit/s, which is more than enough to get a short packet on the air quickly. The hardware CRC generation and checking will help weed out bad packets.
To stop nodes jabbering all over each other, you will need to invent some kind of MAC (Media Access Control). You could use carrier sense to wait for silence, with an exponential backoff (CSMA). Or, you could coordinate the timings of your nodes and assign a slot to each node. You may also benefit from having a master node transmitting a timer beacon to prevent clock drift.
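The carrier-sense-with-backoff idea can be sketched as follows. This is only an illustration, not a radio driver: `channel_busy` and `send` are hypothetical placeholders for an RSSI/CCA-based carrier sense and the actual transmit routine on whatever transceiver you pick.

```python
import random
import time

def transmit_with_csma(channel_busy, send, max_attempts=8, slot_ms=1):
    """Carrier sense with binary exponential backoff (CSMA sketch).

    `channel_busy` and `send` are hypothetical callbacks standing in
    for the radio's clear-channel assessment and TX routine.
    """
    for attempt in range(max_attempts):
        if not channel_busy():
            send()
            return True
        # Channel busy: wait a random number of slots, doubling the
        # contention window after every failed attempt so colliding
        # nodes spread out in time.
        window = 2 ** (attempt + 1)
        time.sleep(random.randrange(window) * slot_ms / 1000.0)
    return False  # gave up; caller can retry later or report failure
```

Doubling the window on each failure is what keeps a burst of simultaneous transmitters from colliding forever: the busier the channel looks, the more the nodes spread their retries out.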
In theory, ZigBee supports up to 2^16 nodes on a single PAN. ZigBee is built on 802.15.4, which provides a robust MAC layer and mechanisms for network management (joining, leaving, etc). However, an off-the-shelf module like the XBee may struggle with 1000 nodes and it certainly won't be cheap. For a volume ZigBee deployment, consider the TI CC2531 or Ember EM250/EM260. ZigBee usually runs at 2.4GHz, which will not provide such good penetration of terrain as 433/868MHz.
> This suggests it's impossible to perfectly determine the instantaneous signal even in a theoretical channel with zero noise.
I'd turn this around and say it's impossible to instantaneously determine the signal.
> one has to assume that the signal is changing slowly (band-limited). How is this overcome in practice?
In practice, our message signals are bandlimited, so this is not a difficulty. In fact, our message signals generally have much less bandwidth than the carrier.
To approach your theoretical question of whether a band limit is strictly required: imagine trying to modulate a 1 Hz carrier with a 1 MHz signal; the result would be unusable. So in fact there must be some kind of limit.
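A toy numeric sketch of the "message bandwidth much less than carrier" point: modulate a 1 kHz message onto a 100 kHz carrier, then recover it by rectifying and averaging over one carrier cycle. The envelope detector works precisely because the message barely changes within a single carrier cycle. All names and parameter values here are my own choices for illustration.

```python
import math

def am_envelope_demo(fc=100_000.0, fm=1_000.0, fs=1_000_000.0, n=2000):
    """AM envelope-detection sketch: message at fm, carrier at fc,
    with fm << fc. Returns (message, recovered envelope)."""
    cycle = int(fs / fc)  # samples per carrier cycle (10 here)
    # Bandlimited message, kept positive so the envelope is the message.
    msg = [0.5 + 0.5 * math.sin(2 * math.pi * fm * k / fs)
           for k in range(n)]
    am = [m * math.cos(2 * math.pi * fc * k / fs)
          for k, m in enumerate(msg)]
    rect = [abs(x) for x in am]  # rectify
    # Moving average over one carrier cycle acts as the low-pass filter;
    # it scales the message by the mean of |cos|, roughly 2/pi.
    env = [sum(rect[k:k + cycle]) / cycle for k in range(n - cycle)]
    return msg, env
```

If you swapped the frequencies (fc=1 Hz, fm=1 MHz), the "average over one carrier cycle" step would smear a million message cycles into one number, which is the unusable case described above.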
One method is by actively steering the antenna (mechanically or electronically) to place a "null" in the direction of the jammer, reducing its signal strength significantly, while affecting the desired signal minimally, if at all.
Also, assuming the jammer signal strength isn't so strong that it saturates the receiver front end, advanced DSP techniques can be used to estimate and cancel the effects of the jamming signal. The communications protocol itself can be designed to optimize the ability to do this. The problem for the jammer is to mimic the desired signal closely enough to confuse the anti-jam algorithm.
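One classic estimate-and-cancel technique is an adaptive filter driven by a reference copy of the interference (e.g. from an auxiliary antenna pointed at the jammer). The sketch below uses the LMS algorithm; the function name and parameters are my own, and a real system would work on complex baseband samples with many more taps.

```python
def lms_cancel(received, reference, taps=4, mu=0.05):
    """LMS adaptive interference cancellation (illustrative sketch).

    `reference` is a correlated copy of the jammer; the filter learns
    to reproduce the jammer as it appears in `received`, and the
    residual e[n] = d[n] - y[n] is the cleaned signal.
    """
    w = [0.0] * taps           # filter weights
    buf = [0.0] * taps         # delay line for the reference input
    out = []
    for x, d in zip(reference, received):
        buf = [x] + buf[:-1]                           # shift in new sample
        y = sum(wi * bi for wi, bi in zip(w, buf))     # jammer estimate
        e = d - y                                      # residual
        w = [wi + 2 * mu * e * bi                      # LMS weight update
             for wi, bi in zip(w, buf)]
        out.append(e)
    return out
```

The filter converges as long as the jammer stays correlated with the reference; this is why the jammer's counter-move, as noted above, is to mimic the desired signal closely enough that cancelling it would also cancel the message.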