Electronic – Output capacitance of MOSFET

capacitance, gate-driving, mosfet, switching, transistors

I have read in several application notes that there is a region called the Miller plateau, where the current into Cgd is so large that almost no current enters Cgs, holding the gate voltage almost constant.

The Miller voltage is said to be the Vgs at which there is a rapid change in Vds for a given load ZL and drain current Id.
So if, for some gate voltage, the drain current is large and the load impedance is large too, Vds has to drop very low to accommodate that drain current.

My question is as follows: Does the rate at which the drain voltage falls depend on the output capacitance Cds of the MOSFET?
If so how does the output capacitance of the MOSFET discharge?

PS: No application note seems to consider the output capacitance of the MOSFET, for the switching characteristics.

Best Answer

The Miller Plateau is not a property of the FET itself, but of the FET in combination with its circuit (mostly the load).

This explanation is somewhat simplified and ignores some non-idealities of FETs.

FETs have capacitance between gate and source (mostly constant) and between gate and drain (in high-voltage LDMOS-type FETs, this is large when VGD is high and small when VD >> VG).

Large FETs also have a high gm (change in drain current with respect to change in VGS). If you have a FET with a 'perfect' current source load, then the drain voltage won't change if VGS is lower than the value required to conduct the load current; yet if VGS is just a little higher, the drain voltage will fall (to typically a few hundred mV). Thus, a small change in VGS leads to a large change in VDS. The drain-gate capacitor's voltage has to change during this time, and the current to do this comes from the gate driver.
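This plateau behavior can be sketched numerically: if essentially all of the gate driver current flows through the gate-drain capacitance, then dVDS/dt ≈ Igate / CGD. The component values below are illustrative assumptions, not taken from any real datasheet.

```python
# Sketch: estimate the VDS slew rate and plateau duration during the Miller
# plateau, assuming a constant gate-driver current and a fixed (average)
# gate-drain capacitance. All values are illustrative assumptions.

def miller_slew_rate(i_gate, c_gd):
    """dVDS/dt (V/s) while the full gate driver current flows through CGD."""
    return i_gate / c_gd

def plateau_duration(v_ds_swing, i_gate, c_gd):
    """Time (s) for VDS to traverse v_ds_swing at that slew rate."""
    return v_ds_swing * c_gd / i_gate

i_gate = 0.5       # A, assumed gate driver current
c_gd = 200e-12     # F, assumed average gate-drain capacitance
v_swing = 48.0     # V, assumed drain voltage swing

print(miller_slew_rate(i_gate, c_gd))             # V/s
print(plateau_duration(v_swing, i_gate, c_gd))    # s
```

Note that a real CGD is strongly nonlinear with VDG (as discussed below), so a datasheet's gate-charge curve is a better estimate of plateau duration than a single-capacitance figure.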

Once the driver brings the gate voltage to just about the value required to conduct the load current, VGS remains constant and VDS begins to fall. All the available gate driver current flows into the CDG capacitor (and then into the drain of the FET, together with the load current).

Ideally (with a small gate driver current, constant ILOAD, ideal FET and no other parasitics), VGS would remain constant as VDS falls.

However some non-idealities will affect the behavior in reality.

The FET's drain-source (and drain-bulk) capacitance also has to discharge -- this current is also conducted by the FET.

FETs also have a non-zero output impedance -- it takes slightly higher VGS to support a certain drain current as VDS falls.

CDG is also non-linear and changes value at different VDG.

Thus, the Miller Plateau is not perfectly flat and doesn't occur at precisely the VGS required to support ILOAD.

The total drain current is ILOAD + CDS·dVDS/dt + CDG·dVDG/dt. Thus a slightly higher VGS than expected (from DC measurements) is required.
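The drain-current bookkeeping above can be illustrated with a quick calculation: the FET conducts the load current plus the displacement currents of the drain-source and gate-drain capacitances. All numbers are illustrative assumptions.

```python
# Sketch: total drain current during the plateau = load current plus the
# displacement currents of CDS and CDG. Values are illustrative assumptions.

def total_drain_current(i_load, c_ds, c_dg, dvds_dt, dvdg_dt):
    """Current (A) the FET must conduct while VDS slews."""
    return i_load + c_ds * dvds_dt + c_dg * dvdg_dt

i_load = 10.0     # A, assumed load current
c_ds = 500e-12    # F, assumed drain-source capacitance
c_dg = 200e-12    # F, assumed gate-drain capacitance
dvds_dt = 2e9     # V/s, magnitude of the VDS slew rate
dvdg_dt = 2e9     # V/s, VDG tracks VDS while VGS is held constant

print(total_drain_current(i_load, c_ds, c_dg, dvds_dt, dvdg_dt))  # A
```

With these numbers the capacitive terms add about 1.4 A on top of a 10 A load, which is why VGS must sit slightly above the DC value for ILOAD.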

Output impedance of the FET means that a slightly higher VGS is required as VDS falls. This increasing VGS also takes some of the available gate driver current to incrementally charge the CGS capacitance.

A load resistance (as opposed to a constant current) will require noticeably different VGS (to support the increasing load current) as drain voltage falls.
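To see how much VGS moves with a resistive load, a simple square-law model can be used: Id = (k/2)(VGS − Vth)², with the drain current set by the supply, the load resistor, and the falling VDS. The threshold voltage and transconductance parameter below are assumed device values, not from the text.

```python
# Sketch: square-law estimate of the VGS needed as VDS falls with a resistive
# load. Vth and k are assumed (hypothetical) device parameters.
import math

def vgs_required(i_d, vth=3.0, k=2.0):
    """Invert the square law i_d = k/2 * (vgs - vth)**2 to get vgs."""
    return vth + math.sqrt(2.0 * i_d / k)

v_supply = 48.0   # V, assumed supply
r_load = 4.8      # ohm, assumed load resistor

for v_ds in (40.0, 20.0, 1.0):
    i_d = (v_supply - v_ds) / r_load   # load current grows as VDS falls
    print(v_ds, i_d, vgs_required(i_d))
```

Unlike the constant-current case, the required VGS climbs steadily as VDS falls, so the "plateau" is visibly sloped rather than flat.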

In practice these non-idealities are usually quite small and a Miller Plateau is readily observable at close to the expected VGS.

For the specific question about whether the rate at which VDS falls depends on the output capacitance -- yes, it does, but usually the effect is small; compare ILOAD with CDS·dVDS/dt.

In high-power, fast-switching circuits (e.g. DC-DC converters) where switching times are in the ns range, CDS capacitance can have a noticeable effect on the drain voltage slew rate.
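The comparison suggested above is easy to run: compute CDS·dVDS/dt for a slow edge and a fast edge and set it next to the load current. The values below are illustrative assumptions.

```python
# Sketch: is the CDS displacement current significant next to ILOAD?
# Compare a slow (1 us) edge with a fast (10 ns) edge. Values are
# illustrative assumptions.

def cds_displacement_current(c_ds, dv, dt):
    """Average CDS current (A) for a drain swing dv over edge time dt."""
    return c_ds * dv / dt

c_ds = 500e-12   # F, assumed drain-source capacitance
dv = 48.0        # V, assumed drain voltage swing

slow = cds_displacement_current(c_ds, dv, 1e-6)    # 1 us edge
fast = cds_displacement_current(c_ds, dv, 10e-9)   # 10 ns edge
print(slow, fast)   # A
```

With these numbers the slow edge draws tens of mA (negligible next to an amp-scale load), while the 10 ns edge draws a couple of amps, which is exactly the regime where CDS visibly affects the slew rate.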