Electronic – Observing effect of household appliances on powerline communication

bandwidth, frequency, power line communication, spectrum analyzer

I have recently been working with power-line communication (PLC) modems. I placed two PLC modems in different parts of my home and measured the data transfer rate between them. I observed that when I turned on an appliance, the data rate dropped significantly, which I attribute to interference caused by switching the appliance on.

The following figure, which I also used in a recently published poster, shows the impact of different appliances on the data rate between the two end nodes.
(figure: effect of different appliances on the data rate between the two nodes)

The experiment involved turning on an appliance for 60 seconds and then turning it off for 60 seconds.
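The 60 s on / 60 s off schedule lends itself to a simple analysis: bucket the throughput samples by phase and compare the means. A minimal sketch (the sample data below is fabricated for illustration; real samples would come from the modems' reported rates):

```python
# Sketch: summarising throughput samples against a 60 s on / 60 s off
# appliance schedule. Sample data is synthetic, for illustration only.

def phase(t, period=60.0):
    """Return 'on' during the first half of each 120 s cycle, else 'off'."""
    return "on" if (t % (2 * period)) < period else "off"

def mean_rate_by_phase(samples, period=60.0):
    """samples: list of (time_s, rate_mbps). Returns mean rate per phase."""
    buckets = {"on": [], "off": []}
    for t, r in samples:
        buckets[phase(t, period)].append(r)
    return {k: sum(v) / len(v) for k, v in buckets.items() if v}

# Synthetic log: 90 Mbps with the appliance off, 60 Mbps with it on.
samples = [(t, 90.0 if phase(t) == "off" else 60.0) for t in range(0, 240, 5)]
print(mean_rate_by_phase(samples))  # {'on': 60.0, 'off': 90.0}
```

Averaging over many cycles this way also washes out unrelated rate fluctuations on the link.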

I then tried to understand why this was happening, so I went to the PLC modem manual and studied it a bit. The modems use the HomePlug AV standard and therefore communicate in the 2-28 MHz band. I was able to observe this on my spectrum analyzer, as shown below (Y axis in dBm, X axis in Hz). The spectrum when the PLC link is active is clearly visible in the 2-28 MHz range.

(figure: spectrum analyzer trace showing the PLC signal in the 2-28 MHz band)

However, when I next put an appliance in parallel (emulating the home setup in which the PLC rate dropped when an appliance was turned on), I could see no change in the spectrum. I was expecting to see a distinct change on the spectrum analyzer, perhaps some frequencies being chopped off. My reasoning was that two things could be affecting the PLC rate:

  1. Attenuation caused by appliance switching
  2. Interference caused at particular frequencies by appliance switching
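One way to try to separate these two effects from analyzer data is to export an averaged trace with the appliance off and another with it on, then compare them bin by bin: a roughly uniform level shift suggests attenuation, while isolated outlier bins suggest narrowband interference. A minimal sketch with synthetic traces (the threshold and trace values are illustrative assumptions, not measurements):

```python
# Sketch: comparing two averaged spectrum traces (appliance off vs on)
# to separate a broadband level shift (attenuation) from narrow
# features (interference). Traces here are synthetic; real ones would
# be exported from the spectrum analyzer.

def compare_traces(off_dbm, on_dbm, notch_threshold_db=6.0):
    diff = [on - off for on, off in zip(on_dbm, off_dbm)]
    broadband_shift = sum(diff) / len(diff)   # overall attenuation (or gain)
    # Bins deviating strongly from the broadband shift look like
    # narrowband interference (or notches).
    narrow = [i for i, d in enumerate(diff)
              if abs(d - broadband_shift) > notch_threshold_db]
    return broadband_shift, narrow

off = [-60.0] * 10                  # trace with appliance off
on = [-63.0] * 10                   # ~3 dB broadband drop with appliance on
on[4] = -45.0                       # plus a narrowband interferer at bin 4
shift, bins = compare_traces(off, on)
print(round(shift, 1), bins)
```

Trace averaging (or max hold) on the analyzer before exporting helps, since a single sweep may miss a brief disturbance entirely.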

However, I am not able to quantify either of the two from my spectrum analysis. So my question is: how can I find out what causes the rate to drop? Previous studies on narrowband PLC attribute this to interference caused by operating appliances, as shown in the figure below.

(figure: appliance interference reported in previous narrowband PLC studies)

However, I am unable to see such a change on my spectrum analyzer. So my questions would be:

  1. Is this in principle a correct way to measure the desired phenomenon, i.e. the drop in rate versus interference?
  2. Am I doing something wrong, given that my spectrum analysis does not show such effects?
  3. Is there a simpler way to measure this phenomenon?
  4. Is the frequency domain analysis sufficient?

To sense the powerline, I am using a 60 kHz high-pass filter (HPF) with 50 ohm matching impedance. Other thoughts on the experiment are also welcome.
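One point worth checking with this coupling setup: a swept spectrum analyzer dwells only briefly on each frequency, so brief impulsive noise (e.g. triac or relay switching synchronised to the mains cycle) can be invisible in a sweep even though it corrupts packets. A time-domain capture through the same HPF coupler, analysed with a short-time FFT, would reveal it. A sketch on a synthetic signal (the sample rate, burst timing, and amplitudes are assumptions for illustration):

```python
# Sketch: short-time FFT of a time-domain capture, to expose impulsive
# noise a swept analyzer can miss. Signal is synthetic: background noise
# plus a 0.2 ms burst within one 20 ms mains cycle.

import numpy as np

fs = 1_000_000                                  # 1 MS/s, illustrative
t = np.arange(0, 0.02, 1 / fs)                  # one 20 ms (50 Hz) cycle
sig = 0.01 * np.random.randn(t.size)            # background noise
burst = (t > 0.005) & (t < 0.0052)              # 0.2 ms impulsive burst
sig[burst] += np.sin(2 * np.pi * 100_000 * t[burst])

win = 1024                                      # ~1 ms analysis frames
frames = sig[: sig.size // win * win].reshape(-1, win)
power = np.abs(np.fft.rfft(frames, axis=1)) ** 2
frame_energy = power.sum(axis=1)
print(int(frame_energy.argmax()))               # frame holding the burst
```

On a real capture, plotting `power` as a spectrogram would show both where in the mains cycle the disturbance occurs and which PLC subcarriers it hits.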

Possibly related: How can power line devices be affected by a power supply?

Best Answer

For a reduction in data rate due to interference as slight as that shown in your table, you are very unlikely to see any deterioration of the spectrum whatsoever.

You ask what would be a good way to quantify this effect.

Do exactly what you did in the table: plot the effective data rate with an appliance on versus with it off. The table already tells you a lot, so well done on that; it is useful to see. You can presume that the reduction in effective data rate is due to retransmission of corrupted packets. You could also assume that when the error rate is higher, the packet length is shortened so that retries of smaller packets are more likely to be received correctly. This also reduces the overall data rate, but the reduction would be greater if the packet length were kept large, because a longer packet has a higher probability of containing an error.
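That retransmission argument can be made quantitative with a simple ARQ model: with bit error rate p and packet length L bits, the packet error rate is 1 - (1 - p)^L, and goodput scales with the fraction of packets that get through. A sketch with illustrative (not measured) numbers:

```python
# Sketch of the retransmission argument: goodput under a simple ARQ
# model. Rates, BERs, and overhead are illustrative assumptions.

def goodput(raw_rate_mbps, ber, packet_bits, overhead_bits=200):
    per = 1 - (1 - ber) ** packet_bits          # packet error rate
    payload_fraction = (packet_bits - overhead_bits) / packet_bits
    # Expected successful payload per transmitted packet:
    return raw_rate_mbps * payload_fraction * (1 - per)

quiet = goodput(100.0, 1e-6, 8000)   # appliance off: low BER
noisy = goodput(100.0, 1e-4, 8000)   # appliance on: higher BER
short = goodput(100.0, 1e-4, 2000)   # shorter packets under the same noise
print(round(quiet, 1), round(noisy, 1), round(short, 1))
```

The numbers illustrate both points: a modest BER increase that is invisible on a spectrum trace can halve goodput, and shortening packets under noise recovers some of it despite the higher header overhead.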