Clamp multimeter RMS current measurement, AC versus DC confusion

Tags: current-measurement, rms

This question isn't about a specific meter, but for reference I am using an Amprobe AMP-220, manufactured by Beha-Amprobe. I believe I am an experienced engineer, but only in the digital world, not really in the power and AC field.

I am apparently confused about the current measurement, specifically the difference between the three measurement functions:

  • AC A
  • DC A
  • DC+AC A

The circuit under test is a 1200 VA / 60 Hz transformer with two 37 Vrms secondary windings sharing one wire (a center tap), also written 37 V/0/37 V.

[Schematic: one 37 Vrms secondary winding driving a 1.6 Ω power resistor]

My calculation of the expected current is

$$I = \frac{U_{rms}}{R} = \frac{37\ \text{V}_{rms}}{1.6\ \Omega} = 23.125\ \text{A}$$

I do see this current, but on an unexpected meter setting: I was planning to use the AC setting, yet the expected value appears on the DC+AC setting.

The clamp meter is properly aligned around the wire. Can someone please explain to me, in layman's terms, why I measured these values:

  • AC setting shows 9.7A
  • DC setting shows 20.5A
  • DC+AC setting shows 22.95A (closest to my calculation)

Thank you for clarifying this. While my circuit currently has a high-power resistor to test the newly purchased clamp meter, the goal is to replace it with a lead-acid battery and measure the RMS charging current. Which clamp-meter setting is correct for measuring this charging current?

Best Answer

This answer was given before the OP revealed that he was using rectification on the output.

It's possible that there is some form of rectification in your secondary circuit and that the two components of your current are actually 20.5 amps DC and 9.7 amps AC. If you do an RMS (root-sum-square) combination of the DC and AC currents, you get 22.68 amps RMS:

$$I_{RMS} = \sqrt{I_{DC}^2 + I_{AC}^2}$$
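This hypothesis is easy to sanity-check numerically. The sketch below (Python with NumPy) assumes a full-wave rectifier between one 37 Vrms winding and the 1.6 Ω load; that rectifier is only the guess made above, not something stated in the original question. It computes the three quantities the meter's settings respond to: the mean (DC setting), the RMS of the ripple (AC setting), and the true RMS (DC+AC setting).

```python
import numpy as np

# Assumed scenario: the 37 Vrms winding feeds the 1.6 ohm load through a
# full-wave rectifier (this is only the hypothesis made in the answer).
t = np.linspace(0, 1/60, 100_000, endpoint=False)   # one 60 Hz cycle
i_peak = 37 * np.sqrt(2) / 1.6                      # ~32.7 A peak
i = np.abs(i_peak * np.sin(2 * np.pi * 60 * t))     # full-wave rectified current

i_dc = i.mean()                          # average value: what the DC setting reads
i_ac = np.sqrt(np.mean((i - i_dc)**2))   # RMS of the ripple: the AC setting
i_rms = np.sqrt(np.mean(i**2))           # true RMS: the DC+AC setting

print(f"DC component : {i_dc:5.2f} A")   # ~20.8 A (meter read 20.5 A)
print(f"AC component : {i_ac:5.2f} A")   # ~10.1 A (meter read 9.7 A)
print(f"DC+AC (RMS)  : {i_rms:5.2f} A")  # ~23.1 A (meter read 22.95 A)

# Cross-check of the combination formula using the actual meter readings:
print(f"sqrt(20.5^2 + 9.7^2) = {np.hypot(20.5, 9.7):.2f} A")  # 22.68 A
```

The simulated split (roughly 20.8 A DC, 10.1 A AC, 23.1 A total) lands close to the three readings in the question, which is consistent with the rectification guess.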