I'm designing a circuit that will allow me to do transistor testing (similar to a curve tracer). One part of the circuit is a power supply that will apply different voltages to the collector / emitter.
**I have posted a question regarding the same circuit; however, this is a completely different question.**
I need to monitor the current flowing into the transistor, so I'm using high-side sensing because:
- I want to detect possible shorts at the power supply (I > 250 mA).
- I have a few AD8418 current sense amplifiers that fit the job.
The AD8418 has a gain of 20 V/V, so with a stable 1 Ω shunt I can cover the current range I'm interested in.
Some online reading and searching brought to my attention that "high" values such as 1 Ω are rarely used for current sensing. However, using a much lower-value resistor results in a much smaller output voltage, which would need additional amplification before being sampled by an ADC. I'm afraid that another stage will add noise, offset error, and gain error.
Should I go ahead and use a "stable" 1 Ω resistor, or try another, more complex solution?
Best Answer
This is mostly a question of power dissipation. If you're measuring 10 A, you're going to dissipate 100 watts in your resistor, but for you, 1 Ω may be perfectly fine. Let's run the numbers: at your 250 mA limit the shunt drops 0.25 V and dissipates only 62.5 mW, and the AD8418's 20 V/V gain turns that into a 5 V full-scale output.
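A quick sanity check of those numbers (the 20 V/V gain and 250 mA limit come from the question; the 10 mΩ comparison value is just an illustration of a "conventional" low-value shunt):

```python
# Compare a 1 ohm shunt against an illustrative 10 mOhm shunt
# for the 0-250 mA range in the question. AD8418 gain is 20 V/V.
GAIN = 20.0    # AD8418 fixed gain, V/V
I_MAX = 0.250  # full-scale current, A

for r_shunt in (1.0, 0.010):
    v_sense = I_MAX * r_shunt       # voltage across the shunt at full scale
    p_shunt = I_MAX ** 2 * r_shunt  # power dissipated in the shunt
    v_out = v_sense * GAIN          # amplifier output (ignoring saturation)
    print(f"R = {r_shunt * 1000:6.1f} mOhm: "
          f"Vsense = {v_sense * 1000:6.1f} mV, "
          f"P = {p_shunt * 1000:7.3f} mW, "
          f"Vout = {v_out:5.2f} V")
```

The 1 Ω case dissipates well under 100 mW and already produces an ADC-friendly output, while the 10 mΩ case yields only 50 mV at the amplifier output and would need another gain stage.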
Sure, an extra stage might add those errors. But the AD8418 isn't perfect either. The key to a good design is to calculate what kind of accuracy you need. The offset and gain error can be ignored: that's just a multiplication and an addition in the microcontroller. The extra noise can be filtered out, if it's a problem.
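That "multiplication and addition in the microcontroller" is a one-line correction once you have two calibration points. A minimal sketch (all ADC codes below are hypothetical placeholders you would replace with measured values):

```python
# Two-point calibration: record the raw ADC code at two known currents,
# then correct every sample with a gain and an offset.
ADC_AT_0MA = 132     # raw ADC code with no load (hypothetical)
ADC_AT_200MA = 3410  # raw ADC code at a known 200 mA load (hypothetical)

# Derived calibration constants: current = gain * code + offset
gain = 0.200 / (ADC_AT_200MA - ADC_AT_0MA)  # amps per ADC code
offset = -gain * ADC_AT_0MA                 # amps

def adc_to_amps(code: int) -> float:
    """Convert a raw ADC code to current, correcting gain and offset error."""
    return gain * code + offset

# The two calibration points map back to their known currents:
print(adc_to_amps(ADC_AT_0MA))    # should be 0 A
print(adc_to_amps(ADC_AT_200MA))  # should be 0.2 A
```

This removes the amplifier's fixed gain and offset errors; only their drift and the noise remain, and the noise can be handled with an RC filter or averaging in software.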
It seems to me that using 1 Ω is perfectly reasonable.