Measuring 230V AC current with low sample rate

Tags: ac, adc, current-measurement

I have a resistive load on the order of 1 kΩ on a 230 V AC circuit. The load's resistance varies a little over time – the current draw may drift between roughly 0.2 and 0.25 A, slowly (over minutes).

I have an ADC connected to the 1-Wire bus of an SBC. A single sample takes on the order of 1–10 ms, but for technical reasons I can only obtain a few samples per second (and preferably even fewer) – not nearly enough to trace the extrema of the AC sine wave in software.

I need to measure the load current – not very precisely, within some 10% is fine. Simplicity of the circuit is preferred over speed or accuracy. I'd also prefer to keep at least a rudimentary pretense of safety in the circuit.

How can I approach designing a circuit for this measurement – something that converts the AC current draw to a ~0–5 V level for the ADC, smoothed enough that readings track the RMS value rather than random momentary points on the sine?

Best Answer

A rudimentary pretense of safety can be achieved with a current transformer, which also galvanically isolates the measurement circuit from the mains. Together with an appropriate burden resistor, it gives you a signal reasonably accurately proportional to the current drawn.
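As a rough sanity check, the burden resistor can be sized from the CT turns ratio and the desired voltage swing at the rectifier input. The numbers below (a hypothetical 1000:1 CT and a 1 V peak target) are assumptions for illustration, not values from the answer:

```python
import math

# Assumed values -- substitute your own CT ratio and target voltage.
CT_RATIO = 1000          # hypothetical 1000:1 current transformer
I_PRIMARY_RMS = 0.25     # worst-case load current from the question, A
V_PEAK_TARGET = 1.0      # desired peak voltage across the burden, V

# Peak secondary current for a sinusoidal primary current.
i_secondary_peak = I_PRIMARY_RMS * math.sqrt(2) / CT_RATIO

# Burden resistor that produces the target peak voltage.
r_burden = V_PEAK_TARGET / i_secondary_peak
print(f"burden resistor = {r_burden:.0f} ohm")
```

This comes out near 2.8 kΩ, so a standard 2.7 kΩ part would be a reasonable choice under these assumptions.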

Then I'd consider using an op-amp configured as a precision rectifier:

(schematic: op-amp precision full-wave rectifier)

That one is a full-wave rectifier; here is a half-wave rectifier circuit:

(schematic: op-amp precision half-wave rectifier)

The output follows the rectified value of the input. Adding a capacitor and a parallel bleed resistor at the output gives you (approximately) the peak value of the current – a slow-moving DC signal that follows the envelope of the AC waveform. For the resistive load mentioned in the question, the RMS value is the peak value divided by \$\sqrt2\$.
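Putting the chain together, the software side reduces to one conversion. This is a sketch under the same assumptions as above (hypothetical 1000:1 CT, ~2.8 kΩ burden, unity rectifier gain, and an RC filter that roughly holds the peak):

```python
import math

# Assumed circuit constants (hypothetical -- match them to your build).
CT_RATIO = 1000       # current-transformer turns ratio
R_BURDEN = 2800.0     # burden resistor, ohms
RECT_GAIN = 1.0       # precision-rectifier gain

def rms_current_from_adc(v_adc: float) -> float:
    """Convert the peak-held ADC voltage back to primary RMS current.

    The filter capacitor holds roughly the peak of the rectified
    burden voltage; for a resistive (sinusoidal) load,
    RMS = peak / sqrt(2).
    """
    i_secondary_peak = v_adc / (R_BURDEN * RECT_GAIN)
    i_primary_peak = i_secondary_peak * CT_RATIO
    return i_primary_peak / math.sqrt(2)

# 1.0 V held at the ADC corresponds to roughly 0.25 A RMS load current.
print(f"{rms_current_from_adc(1.0):.3f} A")
```

For the smoothing, pick the RC time constant well above one mains cycle (20 ms at 50 Hz) but well below the minutes-scale drift – somewhere around a second keeps the reading steady while still tracking the load.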