Electronic – Charging a high voltage capacitor with a power supply: how exactly?

Tags: capacitor, capacitor charging, high voltage

I have a semi-serious hobby project with a couple of friends where we need to charge a pulse-operated capacitor rated at around 4 kV, with 1500 nF capacitance.

For this I need a high voltage supply and the correct configuration to do the charging. I've done this before at around 300 V, but simulating the process in Multisim is nontrivial because I don't know the inner workings of kV-level sources, so I have some questions:

If I were to use an off-the-shelf high voltage source with around 10 kV maximum voltage and 20 W of power, how will the system react when the source is connected, possibly through an appropriate resistor?
Someone suggested that the excessive 10 kV would immediately break the capacitor, since the whole voltage drop would be seen across the insulation inside the capacitor. I think this is incorrect and that the voltage drop will mostly happen inside the HV supply, but I'm not entirely sure.
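For intuition, here is a minimal first-order sketch in Python (all component values are assumptions for illustration, not a model of any particular supply). It shows the textbook RC behavior: at the instant of connection, the full source voltage is dropped across the series resistance, not across the capacitor's insulation, and it moves over to the capacitor as charge builds up.

```python
# Minimal RC charging sketch: ideal 10 kV source behind a series resistor.
# R is a hypothetical limiting resistor; the supply's internal impedance
# would add to it in practice.
import math

V_SRC = 10_000.0  # assumed source voltage, V
R = 100_000.0     # hypothetical series limiting resistance, ohms
C = 1.5e-6        # capacitor from the question (1500 nF), F

tau = R * C       # RC time constant: 0.15 s with these values

for t in [0.0, tau, 2 * tau, 5 * tau]:
    v_cap = V_SRC * (1 - math.exp(-t / tau))  # voltage across the capacitor
    v_res = V_SRC - v_cap                     # voltage across the resistor
    print(f"t = {t:5.2f} s: V_cap = {v_cap:7.1f} V, V_res = {v_res:7.1f} V")
```

Note that nothing in this model stops V_cap from heading toward the full 10 kV eventually, which is exactly the problem discussed below.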

Let's use this cheap unit as our HV source example: Electrostatic Precipitator Power Supply With Output 300W 30KV

  1. Will an excess voltage (measured at the source) break the capacitor before actually charging it?

  2. If not, how does a typical HV source react to the situation? If connected straight to a capacitor (with effectively zero resistance), will the source just see the connection as a short circuit and promptly break or blow a fuse? Or will it slowly ramp the voltage up until the external capacitor is at the same voltage as the source target (or until the capacitor breaks)?

Best Answer

I have a semi-serious hobby project with a couple of friends where we need to charge a pulse-operated capacitor rated at around 4 kV, with 1500 nF capacitance.

You didn't say exactly what voltage you want to charge the capacitor to, but generally speaking you shouldn't go all the way up to the rating if you want the device to be reliable; a common rule of thumb is to stay around 80% of the rated voltage (roughly 3.2 kV here).

For this I need a high voltage supply

You can get a high voltage module from distributors like Digikey or Mouser.

https://www.digikey.com/product-detail/en/xp-power/G30/1470-4014-ND/6802063

https://www.xppower.com/portals/0/pdfs/SF_G_Series.pdf

If I were to use an off-the-shelf high voltage source with around 10 kV maximum voltage and 20 W of power, how will the system react when the source is connected, possibly through an appropriate resistor?

Someone suggested that the excessive 10 kV would immediately break the capacitor, since the whole voltage drop would be seen across the insulation inside the capacitor.

If you just put a resistor in series with the capacitor, the resistor only limits the charging current; the capacitor will still charge toward the full source voltage and will likely break. If you instead put the capacitor across the lower leg of a voltage divider, you can hold its steady-state voltage low enough that it won't break.
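To make the contrast concrete, here is a rough sketch with hypothetical resistor values (a sizing illustration under ideal-source assumptions, not a vetted design): a divider pins the steady-state capacitor voltage at the ratio you choose, at the cost of continuous dissipation in the resistors.

```python
# Voltage divider sizing sketch (assumed values): the capacitor sits across R2.
V_SRC = 10_000.0    # assumed source voltage, V
V_TARGET = 3_500.0  # stay below the 4 kV rating, V
C = 1.5e-6          # capacitor from the question, F
R1 = 13e6           # hypothetical upper divider resistor, ohms

# Choose R2 so the divider ratio gives the target voltage.
R2 = R1 * V_TARGET / (V_SRC - V_TARGET)  # -> 7 Mohm here

v_cap = V_SRC * R2 / (R1 + R2)  # steady-state capacitor voltage
i_div = V_SRC / (R1 + R2)       # continuous divider current
p_div = V_SRC * i_div           # power burned in the divider

r_th = R1 * R2 / (R1 + R2)      # Thevenin resistance seen by the capacitor
tau = r_th * C                  # charging time constant, ~6.8 s here

print(f"R2 = {R2/1e6:.1f} Mohm, V_cap = {v_cap:.0f} V")
print(f"divider current = {i_div*1e3:.2f} mA, dissipation = {p_div:.1f} W")
print(f"time constant = {tau:.1f} s")
```

The roughly 5 W burned in the divider with these values also has to fit within the supply's power budget, which is one reason large divider resistances are used.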

Will an excess voltage (measured at the source) break the capacitor before actually charging it?

No. The capacitor will break only once it charges up past its rating. Therefore there must be something that limits the final voltage across the capacitor, and for this a voltage divider is a good option.

If not, how does a typical HV source react to the situation? If connected straight to a capacitor (with effectively zero resistance), will the source just see the connection as a short circuit and promptly break or blow a fuse? Or will it slowly ramp the voltage up until the external capacitor is at the same voltage as the source target (or until the capacitor breaks)?

That question is impossible to answer in general without knowing the ratings of the components inside the power supply and what fuses (if any) are used. But I would venture to say that the output will probably just ramp up: nearly all DC power supplies already include some capacitance at their output, and you would just be putting a little more in parallel with whatever is already there.
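As a rough sanity check on the "slow ramp" expectation, an idealized energy estimate using the 20 W figure from the question bounds how quickly the capacitor could possibly charge:

```python
# Back-of-the-envelope charge-time bound (idealized, lossless charging assumed).
C = 1.5e-6   # external capacitor, F
V = 4_000.0  # charging to the full rating for a worst-case number, V
P = 20.0     # assumed available supply power, W

energy = 0.5 * C * V**2  # energy stored in the capacitor, J
t_min = energy / P       # lower bound on charge time at constant 20 W

print(f"stored energy = {energy:.1f} J")       # -> 12.0 J
print(f"minimum charge time = {t_min:.2f} s")  # -> 0.60 s
```

In a real circuit, losses in any series or divider resistance make this slower; a well-behaved current-limited supply simply ramps its output over that kind of timescale.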