Electronic – Use MOSFET series resistor instead of shunt for current measurement

current-measurement, efficiency, mosfet, shunt

If I have a setup where an N-channel MOSFET drives a heavy inductive load (say, 25 A peak), can I use the MOSFET's internal resistance to measure that current the same way I would use a shunt resistor?

The value of the equivalent series resistance when the MOSFET is fully on (i.e. Rds-on) is usually very low and easy to find in the MOSFET's datasheet. I know it is not ideal, and will strongly depend on the FET's temperature as well as the current (so I have a loop there). Still, is there any serious impediment to this approach?

The reason I want to do this is that I have a system where I need to minimize component count as well as avoid any extra losses (i.e. shunts) when driving the inductive load, but I can compute some corrections/linearization on a temperature-sensing micro-controller if needed.
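For what it's worth, the correction described above is straightforward to compute. The sketch below is only an illustration with made-up numbers: the Rds-on value, the linear temperature coefficient, and the function name are all hypothetical, and a real part's Rds-on-vs-temperature curve from the datasheet is not exactly linear.

```python
def estimate_current(vds_v, temp_c,
                     rds_on_25c=0.004,  # hypothetical: 4 mOhm at 25 degC
                     tempco=0.004):     # hypothetical: ~0.4 %/degC linear tempco
    """Estimate drain current from measured Vds, using a
    temperature-corrected Rds-on (linear approximation)."""
    rds_on = rds_on_25c * (1.0 + tempco * (temp_c - 25.0))
    return vds_v / rds_on

# e.g. 100 mV measured across the FET with the die at 80 degC
i_est = estimate_current(0.100, 80.0)   # roughly 20.5 A with these numbers
```

Note that this still assumes the 25 °C value is known for *this* device, which is the point the answer below disputes.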

I am almost 100% sure I saw a LiPo battery manager IC that appeared to do something similar, but I am unable to find it. As I recall, it estimated the charging current using something like what I just described. But maybe I am mistaken.

Best Answer

equivalent series resistance ... easy to find in the MOSFET's datasheet

No. This seems to be your main misconception.

The datasheet tells you the guaranteed maximum, but not what the resistance will actually be in any particular device. Sometimes datasheets also show a typical value, which is usually significantly less than the maximum. And of course any one device might be lower than typical too, but you don't know by how much.

Then as others have said, RDSON has a strong dependence on temperature.

With calibration to the particular device, and maybe some correction for measured or assumed temperature, you might be able to detect very basic current thresholds, like "too high, shut down now". But anything you'd call a "current measurement" isn't really going to work.