Electronic – Tradeoffs between transmitting and listening with low-power 2.4GHz RF

On a number of 2.4GHz chips, listening for a signal is actually more expensive than transmitting. If one is trying to minimize power consumption with such devices, to what extent should one strive to minimize transmissions, and to what extent should one be willing to make extra transmissions in an effort to minimize listening time? Is there an "etiquette" when using low-power transmissions in an unlicensed band?

For example, suppose one wishes to have two devices remain 'in sync' with each other, but one doesn't need them to exchange much data. Both devices have agreed on a time, +/- 1ms each, when one will send the other a packet. Assume that transmitting a packet takes 100us; assume further that powering the transmitter for 100us costs as much as powering the receiver for 100us, so power may be measured in terms of active transmit+listen time.

One approach would be to have a receiver switch on for 2.1ms, and have the sender send one packet. Total active time would be 2.2ms. Another approach would be to have the transmitter blindly send five packets at 0.5ms intervals, and have the receiver listen for 0.6ms. Total active time would be only 1.1ms, a 50% savings. On the other hand, the latter approach would generate five times as much radio traffic as the former approach, thus creating more potential conflicts with other devices' packets.
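
For concreteness, the same arithmetic in a few lines of Python; nothing here goes beyond the figures already stated above:

```python
# Back-of-the-envelope comparison of the two schemes, using the numbers above.
PACKET = 100e-6   # 100 us per packet
WINDOW = 2e-3     # 2 ms of total timing uncertainty (+/- 1 ms)

tx1, rx1 = PACKET, WINDOW + PACKET   # scheme 1: one packet, listen 2.1 ms
tx2, rx2 = 5 * PACKET, 0.6e-3        # scheme 2: five packets, listen 0.6 ms

print(f"scheme 1: active {(tx1 + rx1) * 1e3:.1f} ms, airtime {tx1 * 1e6:.0f} us")
print(f"scheme 2: active {(tx2 + rx2) * 1e3:.1f} ms, airtime {tx2 * 1e6:.0f} us")
# scheme 2 halves the active time (1.1 ms vs 2.2 ms) but occupies the
# channel five times as long (500 us vs 100 us)
```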

Should devices try to minimize unnecessary transmissions, or is it acceptable to give priority to minimizing battery usage?

EDIT/clarification

My question is not how best to minimize power usage, but to what extent one should factor in the negative externalities (social costs) of unnecessary transmissions. If many devices in the 2.4GHz band blindly transmit much more data than needed, in the hope that some of it will get through, that will greatly impair the utility of that band. On the other hand, it would be silly for the designer of a device to accept an extra 10% power drain so that it causes only 0.1 percentage points of congestion in the 2.4GHz band instead of 0.2, if the designers of other devices that use more bandwidth don't make similar efforts.

My intuition is that avoiding 10% congestion at a frequency when one could get by almost as well causing only 5% would be worthwhile, even if it increased power consumption slightly, but that reducing congestion from 0.01% to 0.00001%, even though that is a thousand-fold reduction, would not be worth even a 1% increase in power draw. On the other hand, I don't know where the trade-offs should lie, or on what time scales congestion should be measured. A device which transmits solidly for one minute each hour, or one which transmits for 2us every 120us, would seem more "obnoxious" than one which transmits for 100us every 5ms, even though the first two have a 1/60 duty cycle and the latter a 1/50 duty cycle.
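
To make that comparison concrete, here is the same duty-cycle arithmetic in a few lines of Python; the "continuous block" column is just my own rough proxy for how long each pattern monopolizes the channel at a stretch:

```python
# The three transmit patterns mentioned above, compared by duty cycle and by
# the length of each continuous burst.
patterns = [
    ("1 min every hour",  60.0,   3600.0),
    ("2 us every 120 us", 2e-6,   120e-6),
    ("100 us every 5 ms", 100e-6, 5e-3),
]
for name, on_time, period in patterns:
    print(f"{name:18s} duty cycle 1/{period / on_time:.0f}, "
          f"continuous block {on_time:g} s")
```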

Best Answer

Your question is unanswerable without the costs of the various tradeoffs. You give the relative cost of transmitting versus receiving, but have provided no guidance on what using the RF link is worth in your situation. You have also not taken data errors into account. An error means a message is lost, which may force retries, with their associated costs, or mean a loss of functionality, with its own associated costs.

This whole thing is a cost/benefit analysis. Only you can make judgements on some of the costs, especially when they aren't the same physical quantity. For example, we have no way of knowing how much you value not interfering with other RF communication versus battery life.

This is the kind of thing where Monte Carlo analysis can be useful. You set up a bunch of probabilities, write simulations that model all the various interactions, then measure your various costs as a function of parameters you can tweak.
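
For illustration, here is a minimal sketch of such a simulation in Python, using the numbers from the question. The interference model (Poisson-arriving bursts with a made-up rate and length) and the exact burst timing are assumptions you would replace with figures for your actual environment:

```python
"""
Monte Carlo sketch of the two sync schemes from the question.

Illustrative only: interference is modelled as Poisson-arriving busy bursts
of fixed length (rate and length below are made-up parameters), the clock
offset is uniform in +/-1 ms, and "energy" is simply transmit + receive
on-time, as the question stipulates.
"""
import random

PACKET = 100e-6        # packet on-air time (from the question)
OFFSET = 1e-3          # +/- clock uncertainty (from the question)

def make_interference(rate=200.0, burst_len=500e-6, start=-2e-3, stop=4e-3):
    """Other users' traffic: busy intervals arriving as a Poisson process."""
    bursts, t = [], start
    while True:
        t += random.expovariate(rate)
        if t > stop:
            return bursts
        bursts.append((t, t + burst_len))

def collides(s, e, bursts):
    """True if the interval [s, e) overlaps any busy interval."""
    return any(bs < e and s < be for bs, be in bursts)

def trial_single():
    """Scheme 1: one packet, receiver listens 2.1 ms (window [0, 2.1 ms])."""
    delta = random.uniform(-OFFSET, OFFSET)      # relative clock error
    bursts = make_interference()
    s = OFFSET + delta                           # packet start in receiver's frame
    ok = not collides(s, s + PACKET, bursts)
    return ok, PACKET + (2 * OFFSET + PACKET), PACKET   # success, energy, airtime

def trial_burst():
    """Scheme 2: five packets at 0.5 ms spacing, receiver listens 0.6 ms."""
    delta = random.uniform(-OFFSET, OFFSET)
    bursts = make_interference()
    # Receiver listens during [0, 0.6 ms].  Starting the burst 0.75 ms before
    # the nominal time puts one packet fully inside that window for any
    # offset within +/-1 ms.
    ok = False
    for i in range(5):
        s = -0.75e-3 + delta + i * 0.5e-3
        if 0.0 <= s and s + PACKET <= 0.6e-3 and not collides(s, s + PACKET, bursts):
            ok = True
    return ok, 5 * PACKET + 0.6e-3, 5 * PACKET

def summarize(trial, n=20000):
    wins = energy = airtime = 0.0
    for _ in range(n):
        ok, e, a = trial()
        wins += ok
        energy += e
        airtime += a
    return wins / n, energy / n, airtime / n

if __name__ == "__main__":
    for name, trial in (("single", trial_single), ("burst", trial_burst)):
        p, e, a = summarize(trial)
        print(f"{name:6s}  P(sync)={p:.3f}  energy={e*1e3:.2f} ms  airtime={a*1e6:.0f} us")
```

Sweeping the interference rate (and your own cost weights for energy, missed syncs, and channel occupancy) then shows where the blind-burst scheme stops paying for itself relative to the single-packet scheme.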