Electronics – Why do most PWM LED displays and backlights operate at a relatively low frequency?

display, frequency, led, pwm

All around me are a bunch of electronics: digital clocks, my laptop, a refrigerator, a dimmable flashlight, and more.
What they all have in common is perceptible flicker of their displays due to PWM, especially when I make quick eye movements (i.e. normal everyday vision).

I've played around with PWM and LEDs before; flicker becomes comfortably imperceptible at around 1000 Hz, which is trivial for even a modest microcontroller to generate.
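
To give a sense of how little effort that takes, here is a minimal bit-banged PWM sketch in C running at 1 kHz. It assumes a generic MCU; gpio_write(), delay_us() and LED_PIN are hypothetical placeholders for whatever the target's HAL actually provides, not any particular vendor's API.

    #include <stdint.h>

    /* Hypothetical HAL calls standing in for whatever the target MCU provides. */
    extern void gpio_write(int pin, int level);
    extern void delay_us(uint32_t microseconds);

    #define LED_PIN        5      /* arbitrary pin chosen for illustration */
    #define PWM_PERIOD_US  1000U  /* 1000 us period = 1 kHz, above the flicker threshold */

    /* Bit-banged PWM: hold the pin high for duty% of each 1 ms period. */
    void led_pwm_loop(uint8_t duty_percent)
    {
        uint32_t on_time  = (PWM_PERIOD_US * duty_percent) / 100U;
        uint32_t off_time = PWM_PERIOD_US - on_time;

        for (;;) {
            gpio_write(LED_PIN, 1);
            delay_us(on_time);
            gpio_write(LED_PIN, 0);
            delay_us(off_time);
        }
    }

A hardware PWM peripheral, which nearly every modern microcontroller has, would do the same job with no CPU involvement at all.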

I realize some devices may be governed by mains frequency, but as far as I know, a lot of my electronics use filtered DC power.

Why can't every LED display be designed so that no flickering occurs?
I can think of a few reasons for our current situation:

  • We have a bunch of lazy engineers
  • Cost reasons — maybe they're using some absolute worst microcontroller to save a few pennies
  • Efficiency — I know PWM is more efficient than constant current, and I guess the higher the frequency the closer it is to constant current (can I assume that?), but I'd be surprised if there was a major difference between 100Hz and 1000Hz.
  • I am literally the only person bothered by this.

Thoughts, anyone? I do hope I'm not the only one.

Best Answer

It's not really PWM, rather it's multiplexing of the displays. I won't go over the advantages of multiplexing in detail here, but the point isn't power efficiency; it's a reduction in the cost and complexity of the drive components. A few cheap parts can drive a 4-digit LED display (32 segments, counting decimal points) with only 12 port pins (on a single-sided PCB if necessary).
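
To make the pin count concrete: 8 shared segment lines plus 4 digit-select lines is 12 pins driving 4 × 8 = 32 LEDs. Below is a minimal sketch of the scanning idea in C; write_segment_port(), digit_enable() and delay_us() are hypothetical stand-ins for the real port I/O.

    #include <stdint.h>

    /* Hypothetical HAL calls standing in for real port I/O on the target part. */
    extern void write_segment_port(uint8_t pattern); /* 8 shared segment lines (a..g + DP) */
    extern void digit_enable(uint8_t digit, int on); /* 4 digit-select lines               */
    extern void delay_us(uint32_t microseconds);

    #define NUM_DIGITS 4

    /* One segment bitmap per digit, filled in elsewhere by the application. */
    volatile uint8_t display_buffer[NUM_DIGITS];

    /* Light each digit in turn; scanned fast enough, all four appear lit. */
    void scan_display_once(void)
    {
        for (uint8_t d = 0; d < NUM_DIGITS; d++) {
            uint8_t prev = (d == 0) ? (NUM_DIGITS - 1) : (uint8_t)(d - 1);

            digit_enable(prev, 0);                  /* blank the previous digit to avoid ghosting */
            write_segment_port(display_buffer[d]);  /* put this digit's pattern on the shared bus */
            digit_enable(d, 1);                     /* only one digit is driven at any instant    */
            delay_us(1000);                         /* ~1 ms per digit -> ~250 Hz display refresh */
        }
    }

Only one digit is ever lit at a time; because the scan repeats quickly, persistence of vision makes all four appear continuously lit.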

Most of this kind of product will be using an 8-bit processor rather than some 32-bit part, usually at a relatively low clock frequency such as 4 or 8 MHz. They generally won't have a hardware display controller, so an ISR does the work. If other higher-priority tasks are running, the digit brightness might be visibly modulated due to jitter in the multiplexing timing - some level of that would be deemed acceptable even if not entirely imperceptible. Same thing with flicker in the display. Even so, the micro might be spending more than 20% of its bandwidth just controlling the display. A faster clock would mean more power consumption in the micro, more EMI and more cost.

For an 8-digit display multiplexed at 200 Hz, a new digit must be handled about every 625 µs (1 / (200 Hz × 8 digits)), with perhaps ±30 µs of timing jitter (that would be a pretty high-quality display for an application without vibration). If there is a lot of vibration, maybe 5x faster.
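
As a rough illustration (not the specific design described above), this is what an ISR-driven scan for that 8-digit, 200 Hz case might look like in C, with the timer assumed to fire every 625 µs; write_segment_port() and digit_enable() are again hypothetical placeholders.

    #include <stdint.h>

    /* Hypothetical HAL calls; real code would write port registers directly. */
    extern void write_segment_port(uint8_t pattern);
    extern void digit_enable(uint8_t digit, int on);

    #define NUM_DIGITS 8

    volatile uint8_t display_buffer[NUM_DIGITS];  /* segment patterns, updated by main-line code */
    static uint8_t current_digit = 0;

    /* Hooked to a hardware timer set to fire every 625 us:
     * 1 / (200 Hz refresh * 8 digits) = 625 us per digit slot.
     * Jitter in when this runs shows up as digit-to-digit brightness variation. */
    void display_timer_isr(void)
    {
        digit_enable(current_digit, 0);                     /* blank the digit that was showing */
        current_digit = (uint8_t)((current_digit + 1U) % NUM_DIGITS);
        write_segment_port(display_buffer[current_digit]);  /* new pattern on the shared bus    */
        digit_enable(current_digit, 1);                     /* light the next digit             */
    }

Each interrupt is short, but on a slow 8-bit core the entry/exit overhead, port writes and any segment decoding add up, which is how display refresh alone can eat a significant share of the CPU.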

Although a designer could propose using, say, a small FPGA to totally eliminate all timing constraints, and a 6-layer board to deal with the EMI, that would likely be their final act at a consumer-product company. The prevailing attitude is that a 5-cent cost reduction, multiplied over the production volume, would pay for another engineer.

Mains-powered digital LED clocks are a special case: some use a clever biplexing scheme that powers the display directly from an unregulated centre-tapped transformer secondary, so the multiplex frequency is tied to the mains frequency.