Electronics – Why do MOSFET drivers need so much current?

gate-driving, mosfet

I'm using a MOSFET driver to drive a 6-MOSFET H-bridge from a 40 V supply. At 25 kHz, the driver gets very hot, almost too hot to touch. The MOSFETs have a gate charge of 350 nC. Why does it take so much current to switch the MOSFETs? If the average switching current can be calculated as Q × f (gate charge times switching frequency, which yields about 9 mA per MOSFET), why does the driver become so warm? Does the power dissipation in the driver depend on the peak current or the average current? It seems that the average current should be the same regardless of the gate resistor, because the same charge has to be delivered at the same frequency.
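For reference, a minimal sketch of that per-MOSFET average-current arithmetic, using only the values quoted above (nothing here is measured):

```python
# Average gate-drive current for one MOSFET: I_avg = Qg * f_sw
q_gate = 350e-9   # gate charge per MOSFET, C (value quoted in the question)
f_sw = 25e3       # switching frequency, Hz

i_avg_per_fet = q_gate * f_sw
print(f"Average gate current per MOSFET: {i_avg_per_fet * 1e3:.2f} mA")  # ~8.75 mA
```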

I'm using an Allegro A4935 MOSFET driver and IRFS7530TRL7PP MOSFETs.
Here is the schematic:
(schematic image)

Best Answer

A current equal to the total gate charge times the switching frequency is drawn from the supply. With 6 FETs, this is 6 × 350 nC × 25 kHz ≈ 53 mA, which at 40 V corresponds to a power of 53 mA × 40 V ≈ 2.1 W. This power is dissipated between the driver IC and the external gate resistors. That total power can be calculated without knowing the exact waveform shape.
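A quick sketch of that arithmetic, assuming (as stated above) that all six gates are charged from the 40 V supply:

```python
# Total average gate-drive current and power for six MOSFETs
n_fets = 6
q_gate = 350e-9    # gate charge per MOSFET, C
f_sw = 25e3        # switching frequency, Hz
v_supply = 40.0    # supply the gate charge is drawn from, V

i_total = n_fets * q_gate * f_sw   # ~52.5 mA
p_total = i_total * v_supply       # ~2.1 W
print(f"Total average gate current: {i_total * 1e3:.1f} mA")
print(f"Total gate-drive power:     {p_total:.2f} W")
```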

To know how much ends up in the gate resistors versus the IC, you need the driver's on-resistance; the power is then shared in proportion to the resistances. The A4935 has a gate-driver resistance of about 10 Ω and you have roughly 5 Ω gate resistors, so about 67 % of the power is dissipated in the IC and 33 % in the six gate resistors. The IC therefore dissipates about 1.4 W, and each resistor about 0.1 W.
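A sketch of that resistive split, assuming the roughly 10 Ω internal driver resistance and 5 Ω external gate resistors estimated above:

```python
# Gate-drive power splits in proportion to resistance in the charge/discharge path
p_total = 2.1     # total gate-drive power, W (from the previous calculation)
r_driver = 10.0   # assumed internal driver on-resistance, ohm
r_gate = 5.0      # external gate resistor per MOSFET, ohm
n_fets = 6

p_ic = p_total * r_driver / (r_driver + r_gate)        # ~1.4 W in the driver IC
p_resistors = p_total * r_gate / (r_driver + r_gate)   # ~0.7 W total in gate resistors
p_per_resistor = p_resistors / n_fets                  # ~0.12 W each
print(f"Driver IC dissipation: {p_ic:.2f} W")
print(f"Each gate resistor:    {p_per_resistor:.2f} W")
```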

'Too hot to touch' is about 70–80 °C for a plastic package. Depending on your PCB heat sinking, it is quite possible that 1.4 W produces that much temperature rise above a 25 °C ambient.
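As a rough sanity check, here is a junction-temperature estimate assuming a hypothetical junction-to-ambient thermal resistance of 40 °C/W; that value is an assumption only, and the real figure depends on the A4935 package and your PCB copper area:

```python
# Rough temperature estimate: T = T_ambient + P * R_theta_JA
p_driver = 1.4      # power dissipated in the driver IC, W
t_ambient = 25.0    # ambient temperature, °C
r_theta_ja = 40.0   # assumed junction-to-ambient thermal resistance, °C/W (PCB-dependent)

t_junction = t_ambient + p_driver * r_theta_ja
print(f"Estimated driver temperature: {t_junction:.0f} °C")  # ~81 °C with these assumptions
```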