We have a MOSFET that drives a load. There is a sense resistor in the load's current path, and the voltage across it triggers a circuit that can disconnect the load when we detect an overcurrent condition.
But it wasn't that easy. Many kinds of loads draw large current pulses when first connected (input capacitors charging, etc.), and these transient overcurrents can falsely trigger the latch, so we fed the current-sense voltage into a capacitor to smooth them out. It looks like this:
The issue is that higher currents will kill the MOSFET (and other existing circuitry) exponentially faster, while higher sense voltages charge the capacitor only sub-linearly faster, since the RC charging curve flattens as it approaches its final value.
If we tune the value for a reasonable current/time trip point, a higher current takes too long to trip; if we tune for a high current, it trips too aggressively at lower currents.
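This tuning conflict can be made concrete with a small numerical sketch. All component values below are illustrative assumptions (a 1 mΩ sense resistor and a 25 mV latch threshold are not from the actual design), and the device damage curve is modeled as constant I²t anchored at the 200 A / 40 µs survival figure. Tuning the RC filter to trip correctly at 50 A leaves it far too slow at 200 A:

```python
import math

# Illustrative assumptions, not values from the actual design.
R_SENSE = 1e-3                 # 1 mOhm sense resistor
V_TH    = 25e-3                # latch threshold at the capacitor, 25 mV
K_I2T   = 200.0**2 * 40e-6     # I^2*t damage budget: 200 A survivable for 40 us

def allowed_time(i_load):
    """Survival time at constant current, assuming a constant-I^2*t damage curve."""
    return K_I2T / i_load**2

def rc_trip_time(i_load, tau):
    """Time for the RC-filtered sense voltage to reach V_TH at constant current."""
    v_final = i_load * R_SENSE
    if v_final <= V_TH:
        return math.inf        # the filter never reaches the threshold
    return -tau * math.log(1.0 - V_TH / v_final)

# Tune the RC time constant so the trip is exactly right at 50 A...
tau = allowed_time(50.0) / math.log(2.0)   # V_TH is half of 50 A's sense voltage

# ...and the same filter is now much too slow at 200 A:
print(f"200 A: trips after {rc_trip_time(200.0, tau) * 1e6:.0f} us, "
      f"but the device only survives {allowed_time(200.0) * 1e6:.0f} us")
```

With these numbers the 200 A trip lands around 123 µs, roughly three times the 40 µs budget, which is exactly the "a higher current takes too long" failure mode.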
We have been thinking about using a thermistor instead of a capacitor (it has the right response curve), but they all seem too slow for our case (at 200 A we have about 40 µs to shut off). Now there's an idea of using an op-amp integrator to measure current × time…
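For what it's worth, the integrator idea can be sketched as a leaky integrator: a plain op-amp integrator of the sense voltage accumulates ∫i·dt (charge, not true I²t — that would need a squaring stage in front), and a bleed resistor across the integration capacitor lets normal operating current leak away instead of eventually tripping. The time constant and trip level below are assumptions for illustration:

```python
# Leaky integrator emulating an op-amp integrator with a bleed resistor.
# All values are illustrative assumptions.
TAU_LEAK = 100e-6    # bleed time constant, s
Q_TRIP   = 4e-3      # trip threshold on the accumulated charge, A*s
DT       = 1e-6      # simulation time step, s

def trip_step(current_trace):
    """Step index at which the integrator crosses Q_TRIP, or None if it never does."""
    acc = 0.0
    for n, i in enumerate(current_trace):
        acc += (i - acc / TAU_LEAK) * DT   # d(acc)/dt = i - acc/tau
        if acc >= Q_TRIP:
            return n
    return None

inrush   = [300.0] * 10 + [20.0] * 500    # 10 us inrush spike, then nominal load
overload = [100.0] * 200                  # sustained 5x overload

print(trip_step(inrush))     # inrush rides through without tripping
print(trip_step(overload))   # sustained overload trips after tens of microseconds
```

The point of the leak is that the trip time automatically shortens as the overload grows, which is the current × time behavior the RC filter alone can't give.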
Does any of that make sense? Is there an industry-standard practice for this sort of case? It can't really be this complicated; it feels like it should be a common issue… isn't it?
Our issue is not with the components (whether the op-amp can drive the MOSFET at X amps); it is about triggering on 'too high for too long' rather than on 'high current but short and okay' situations. I've removed the device names accordingly.
Best Answer
The underlying problem description suggests that a low-tech solution may be possible instead of the high-tech path. This is not to say that the integrator-based approach mentioned above won't work, of course.
Basically, how about two current-limiting mechanisms in series:
- a fast one that trips immediately whenever the current exceeds the absolute maximum rating, and
- a slower, "slow-fuse" one that trips at the normal operating limit only after the current has stayed high for a while (e.g. via the filtered sense voltage you already have).
This way, impulse currents such as capacitor charging at start-up would get through, as long as they stay under abs-max, while the "slow-fuse" overcurrent protection would remain in place for normal operating conditions.
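A rough behavioral sketch of the two-stage scheme, with all thresholds and time constants assumed for illustration: a plain comparator trips on the first sample above abs-max, while a leaky-integral "slow fuse" handles sustained overcurrent below it.

```python
# Two protection stages in series; all values are illustrative assumptions.
I_ABS_MAX = 250.0    # absolute-maximum comparator threshold, A
TAU_SLOW  = 100e-6   # slow-fuse leak time constant, s
Q_SLOW    = 4e-3     # slow-fuse trip level on accumulated charge, A*s
DT        = 1e-6     # simulation time step, s

def shutdown_step(current_trace):
    """Index at which either stage disconnects the load, or None."""
    acc = 0.0
    for n, i in enumerate(current_trace):
        if i >= I_ABS_MAX:                  # fast stage: straight to the latch
            return n
        acc += (i - acc / TAU_SLOW) * DT    # slow stage: leaky current*time integral
        if acc >= Q_SLOW:
            return n
    return None

print(shutdown_step([400.0] * 5))                  # hard short: fast stage, first sample
print(shutdown_step([200.0] * 10 + [20.0] * 500))  # inrush under abs-max rides through
print(shutdown_step([120.0] * 200))                # sustained overload: slow stage trips
```

The fast stage bounds the worst case (nothing survives above abs-max for even one sample), so the slow stage can be tuned generously for inrush without risking the MOSFET.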