I have been experimenting with simulating current-limiting circuits. I am trying to limit current to ~500 mA from a fixed 4.8 V source. I have started with a circuit like the one found on this Wikipedia page …
I have simulated this circuit in CircuitLab; the results are shown below. The circuit on the left uses a simple series resistor to do the current limiting, while the circuit on the right is based on the Wikipedia circuit. I tweaked the values of R_bias and R_load to common resistor values that prevent more than 480 mA being drawn from the source when the load is 0 Ω. I also set the hFE of the transistors to 65 to match multimeter measurements of some power transistors I have to hand. The values adjacent to the ammeters are the simulated values.
If I now make the load 10 Ω, it becomes clear why a current-limiting circuit is superior to a series resistor. The current-limiting circuit drops its effective resistance, allowing more current through than the series resistor does.
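To make the comparison concrete, here is a rough first-order sketch of the two circuits. The component values are assumptions for illustration, not taken from my actual CircuitLab schematic: the series resistor is sized so the short-circuit current equals the limit, the limiter is modelled as just its sense resistor in series until the limit engages, the sense transistor's B-E turn-on voltage is taken as 0.65 V, and the pass transistor's saturation drop and base current are ignored.

```python
V_SUPPLY = 4.8   # fixed source voltage (V)
I_LIMIT = 0.48   # target current limit (A)
V_BE = 0.65      # assumed B-E turn-on voltage of the sense transistor (V)

# Series resistor sized so the short-circuit current equals the limit.
R_SERIES = V_SUPPLY / I_LIMIT   # = 10 ohms

# In the Wikipedia limiter the sense resistor sets the limit: the sense
# transistor steals base drive once I * R_sense reaches about V_BE.
R_SENSE = V_BE / I_LIMIT        # ~1.35 ohms

def i_series(r_load):
    """Load current with a plain series resistor."""
    return V_SUPPLY / (R_SERIES + r_load)

def i_limiter(r_load):
    """Idealized limiter: only R_sense in series until the limit engages."""
    return min(I_LIMIT, V_SUPPLY / (R_SENSE + r_load))

for r_load in (0.0, 10.0):
    print(f"R_load = {r_load:4.1f} ohm: "
          f"series = {i_series(r_load) * 1000:5.1f} mA, "
          f"limiter = {i_limiter(r_load) * 1000:5.1f} mA")
```

With a 10 Ω load this gives roughly 240 mA through the series resistor but about 423 mA through the limiter, both safely under the 480 mA limit, which matches the qualitative behaviour I see in the simulation.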
However, the current-limiting circuit is still providing some series resistance in this case. An ideal current limiter would have no resistance at all until the load attempted to draw more current than the limit. Is there a way to tune R_bias and R_load to get closer to this ideal, and/or are there circuit tweaks that would help?