The purpose of 100 kΩ resistor between gate and source in MOSFETs

Tags: current, mosfet, mosfet-driver, resistors, voltage

I've seen many circuits where there's a 100 kΩ resistor between the gate and source of the MOSFET.

Can someone tell me what the reason is and how important the value is?

Best Answer

The idea is to turn the MOSFET off reliably and reasonably quickly if the connection to the gate should ever go high impedance for any reason (during reset on an MCU GPIO, a disconnected connector, etc.).

If you have a series resistor from the driver, it (plus the driver's source impedance) forms a voltage divider with the pulldown, so generally you want the gate pulldown resistor to be much higher than that series resistance so it doesn't drag the 'on' gate voltage down too much.
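A quick sketch of that divider, with illustrative component values (the 100 Ω series resistor and 50 Ω driver impedance are assumptions, not from the answer):

```python
# Gate voltage divider: series gate resistor + driver output impedance
# on top, gate-source pulldown on the bottom.
def gate_on_voltage(v_drive, r_series, r_driver, r_pulldown):
    """Steady-state gate voltage with the driver output high."""
    return v_drive * r_pulldown / (r_driver + r_series + r_pulldown)

# Assumed values: 5 V drive, 100 ohm series, 50 ohm driver, 100 kohm pulldown.
v = gate_on_voltage(5.0, 100.0, 50.0, 100e3)
print(f"{v:.4f} V")  # the 100 kohm pulldown sags the gate voltage only ~0.15%
```

With the pulldown a thousand times larger than the series resistance, the sag is negligible; shrink the pulldown toward the series value and the 'on' voltage drops noticeably.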

The resistor should also be low enough that any drain-gate leakage current flowing through it won't develop too much Vgs (so the MOSFET reliably remains off).

That leaves quite a bit of latitude, but anything from roughly 10-20 kΩ up to a few hundred kΩ is pretty good.
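The upper bound can be sanity-checked against leakage. This sketch assumes a 1 µA worst-case drain-gate leakage and a 2 V gate threshold (both illustrative values, not from the answer):

```python
# How much Vgs does leakage develop across the pulldown when the
# driver is disconnected?  We want it well below the gate threshold.
def vgs_from_leakage(i_leak, r_pulldown):
    return i_leak * r_pulldown

VGS_TH = 2.0   # assumed gate threshold voltage
I_LEAK = 1e-6  # assumed worst-case drain-gate leakage, 1 uA

for r in (10e3, 100e3, 1e6):
    vgs = vgs_from_leakage(I_LEAK, r)
    margin_ok = vgs < 0.5 * VGS_TH  # arbitrary 2x safety margin
    print(f"{r/1e3:>6.0f} kohm -> Vgs = {vgs*1e3:.0f} mV, off margin ok: {margin_ok}")
```

At 100 kΩ the leakage only produces 100 mV, comfortably below threshold; by 1 MΩ the margin is gone, which is why the answer tops out at a few hundred kΩ.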

The main reason you don't want to allow the MOSFET to partially turn on (other than turning it on might have some bad effect like starting a motor) is that if the load is heavy and the MOSFET turns partially on it can be damaged or destroyed due to heating. The MOSFET dissipates almost no power when 'off' (leakage current x voltage) and usually very little when 'on' (load current squared times Rds(on)), but there is a worst-case condition when the MOSFET resistance equals the load resistance.

For example: supply voltage 12 V, load resistance 2 Ω, Rds(on) 10 mΩ, leakage 1 µA.

With the MOSFET off, the dissipation is 12 µW, utterly negligible.

With the MOSFET on, dissipation is about 0.36 W; probably no heatsink required.

With the MOSFET worst-case partially on (2 Ω), the current would be 3 A and the MOSFET dissipation would be 18 W.
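The three operating points above can be reproduced directly from the stated values:

```python
# Dissipation at the three operating points from the example:
# 12 V supply, 2 ohm load, 10 mohm Rds(on), 1 uA leakage.
V, R_LOAD, RDS_ON, I_LEAK = 12.0, 2.0, 0.010, 1e-6

p_off = I_LEAK * V                   # off: leakage current x voltage
i_on = V / (R_LOAD + RDS_ON)
p_on = i_on**2 * RDS_ON              # on: load current squared x Rds(on)
i_worst = V / (R_LOAD + R_LOAD)      # worst case: MOSFET resistance = load resistance
p_worst = i_worst**2 * R_LOAD

print(f"off:   {p_off*1e6:.0f} uW")               # 12 uW
print(f"on:    {p_on:.2f} W")                     # ~0.36 W
print(f"worst: {p_worst:.0f} W at {i_worst:.0f} A")  # 18 W at 3 A
```

Note the 50x jump from fully on to half on: the divider/leakage sizing rules above exist precisely to keep the MOSFET out of that middle region.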