Advice on safe discharge circuit for HV capacitors (400 V)

dc/dc converter, high voltage, safety

In designing a relatively high-voltage DC-DC converter, my output capacitor happens to be a 100 µF electrolytic charged to 400 V nominal. Efficiency is a design constraint, so I would like to know of any common design practices that handle the discharging of these capacitors.

This system will be debugged and characterized, so I'd hate for someone to get a nasty shock. Is a resistor the best way? To get the quiescent power draw low enough, I have to use ~1 MΩ.
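For a sense of scale, here is a rough back-of-the-envelope sketch of the trade-off the question implies, using the question's own values (100 µF, 400 V, ~1 MΩ). The 50 V "safe to touch" threshold is an assumption for illustration, not a requirement stated in the question:

    import math

    V = 400.0      # nominal capacitor voltage (V)
    C = 100e-6     # output capacitance (F)
    R = 1e6        # candidate bleeder resistance (ohms)
    V_SAFE = 50.0  # assumed "safe" threshold (V) -- adjust to your own criterion

    # Continuous dissipation in the bleeder while the supply is running
    p_bleed = V**2 / R                   # 0.16 W

    # RC time constant and time to decay from V to V_SAFE after power-off
    tau = R * C                          # 100 s
    t_safe = tau * math.log(V / V_SAFE)  # ~208 s, roughly 3.5 minutes

    print(f"bleeder dissipation: {p_bleed:.2f} W")
    print(f"time constant:       {tau:.0f} s")
    print(f"time to {V_SAFE:.0f} V:      {t_safe:.0f} s")

So a 1 MΩ bleeder only costs about 0.16 W continuously, but it takes on the order of three and a half minutes to pull the capacitor down to 50 V after power-off; a smaller resistor bleeds faster at the price of more quiescent loss.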

Best Answer

Back in the Dark Ages, when Men Were Men and vacuum tubes ruled the world, this kind of thing was incredibly common. 300 VDC power supplies were everywhere. 500 VDC was not at all uncommon in higher-power amplifiers.

Standard procedure back then was to put bleeder resistors directly across the big electrolytic capacitors. Typical values were several megohms. With the juice on, the bleeders were ALWAYS dissipating a small amount of power, but when the power was removed, the bleeder would drain the capacitor charge in a reasonably short time.
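If you want to pick the bleeder from a target discharge time rather than the other way around, the same RC math can be turned around. A minimal sketch, where the 50 V threshold and the 30 s target are illustrative assumptions:

    import math

    def bleeder_for_target(C, V0, V_safe, t_target):
        """Resistance that lets C decay from V0 to V_safe in t_target seconds
        (ideal RC discharge; ignores leakage and any external load)."""
        return t_target / (C * math.log(V0 / V_safe))

    # Example: 100 uF at 400 V, assumed 50 V threshold, 30 s target
    R = bleeder_for_target(100e-6, 400.0, 50.0, 30.0)  # ~144 kOhm
    P = 400.0**2 / R                                    # ~1.1 W continuous
    print(f"R = {R/1e3:.0f} kOhm, continuous dissipation = {P:.2f} W")

That makes the trade-off the asker is worried about explicit: shortening the bleed time from minutes to tens of seconds pushes the standing dissipation from a fraction of a watt to over a watt, and vice versa.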

Occasionally, you'd run into equipment where the manufacturer had cut costs to the very bone by eliminating the bleeders. Those units were NASTY. They could HURT you. I got a nasty little bite off of a guitar amp that had been sitting cold for a couple of days: it didn't have bleeders, I didn't notice, and the capacitors were in GOOD condition. Luckily, it only got a knuckle; I figured out what was happening and did the old "screwdriver across the caps" trick to drain them manually, while muttering mildly unprintable things about the ancestry and probable destination of the individual who'd decided that leaving the bleeders off was a good idea.