I've heard it is sometimes recommended to "slow down" a digital line by putting a series resistor in it, say a 100 ohm resistor between the output of one chip and the input of another chip (assume standard CMOS logic and a fairly slow signalling rate, say 1-10 MHz). The claimed benefits include reduced EMI, reduced crosstalk between lines, and reduced ground bounce or supply-voltage dips.
What is puzzling about this is that the total amount of energy used to switch the input would seem to be quite a bit higher if there is a resistor. The input of the driven chip looks roughly like a 3-5 pF capacitor, and charging that through a resistor takes both the energy stored in the input capacitance (on the order of 5 pF × (3 V)²) and the energy dissipated in the resistor during switching (say 10 ns × (3 V)² / 100 ohm). A back-of-the-envelope calculation shows that the energy dissipated in the resistor is an order of magnitude greater than the energy stored in the input capacitance. How does having to drive a signal that much harder reduce noise?
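For concreteness, here is the same back-of-the-envelope arithmetic as a small Python sanity check; the 5 pF, 3 V, 100 ohm and 10 ns figures are just the rough values assumed above, not measured numbers:

```python
# Rough sanity check of the energy figures in the question
# (assumed values: C_in = 5 pF, swing V = 3 V, R = 100 ohm, ~10 ns switching interval).
C = 5e-12     # driven input capacitance, farads
V = 3.0       # logic swing, volts
R = 100.0     # series resistor, ohms
t = 10e-9     # assumed switching interval, seconds

e_stored = 0.5 * C * V**2    # energy actually stored on the input capacitance per edge
e_supply = C * V**2          # total energy drawn from the supply per rising edge
e_res_est = t * V**2 / R     # the question's rough estimate of resistor dissipation

print(f"stored in C_in:           {e_stored * 1e12:.1f} pJ")
print(f"drawn from supply:        {e_supply * 1e12:.1f} pJ")
print(f"rough resistor estimate:  {e_res_est * 1e12:.0f} pJ")
print(f"resistor / supply energy: {e_res_est / e_supply:.0f}x")
```

With those numbers the rough resistor figure comes out around 900 pJ against roughly 20-45 pJ for the capacitance, i.e. the order-of-magnitude gap described above.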