Finding the accuracy of a current probe's measurement

Tags: accuracy, current, oscilloscope, probe

As part of power supply testing, we are measuring the current profiles of various regulators.

Startup profiles include current bursts of varying amplitudes and widths.

I want to understand the accuracy of the current probe in measuring these currents, i.e. the minimum current that can be measured accurately to within 1% to 3%.

For example, in my case I have a TCP0030A current probe from Tektronix.

It has two ranges: 5 A and 30 A.

1. Does the accuracy differ between the two ranges, or is it the same?

It has several specifications, such as:

- DC gain accuracy: <3% (1% typical). I am not sure what this means exactly.

- Displayed RMS noise: <75 µA RMS.

- Sensitivity: 1 mA. Does this relate directly to the minimum current that can be measured with this probe?

If I want to find the accuracy of a measurement, how can I derive the total error from all of the above error sources?

Best Answer

The sensitivity (1 mA) is the smallest current the instrument can consistently sense, i.e. you are guaranteed to get some reading for anything above 1 mA.

The gain accuracy gives your measurement accuracy as a percentage of the reading itself, so if you are sensing 1 A you can expect ±3%, typically ±1%.

The figures above hold for a properly calibrated, degaussed, warmed-up, and auto-zeroed probe.
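
To answer the "how do I combine the error sources" part: a common approach is to treat the gain error as a term proportional to the reading and the displayed noise as an absolute floor, then add them in the worst case (or root-sum-square them, depending on your error-budget convention). Below is a minimal worst-case sketch using the spec limits quoted above; the combination rule, the function names, and the 5% example target are illustrative assumptions, not Tektronix figures.

```python
# Minimal worst-case error-budget sketch for the TCP0030A specs quoted
# above. The linear worst-case combination rule and the example target
# are assumptions, not manufacturer figures.

GAIN_ERROR = 0.03   # DC gain accuracy: <3% of reading (1% typical)
RMS_NOISE  = 75e-6  # displayed noise: <75 uA RMS, an absolute error floor

def worst_case_error(current_a: float) -> float:
    """Worst-case absolute error: the gain term scales with the reading,
    while the noise term is fixed and dominates at small currents."""
    return GAIN_ERROR * current_a + RMS_NOISE

def min_current_for(target: float) -> float:
    """Smallest current whose worst-case relative error stays within
    `target` (e.g. 0.05 for 5%): solve gain*I + noise <= target*I."""
    if target <= GAIN_ERROR:
        raise ValueError("target cannot beat the gain error alone")
    return RMS_NOISE / (target - GAIN_ERROR)

for amps in (1.0, 0.1, 0.01, 0.001):
    err = worst_case_error(amps)
    print(f"{amps*1e3:8.1f} mA reading -> +/-{err*1e3:7.3f} mA "
          f"({100*err/amps:5.2f}% of reading)")

print(f"Minimum current for 5% worst-case accuracy: "
      f"{min_current_for(0.05)*1e3:.2f} mA")
```

Note the consequence for your 1% to 3% target: against the worst-case 3% gain spec, no current can be guaranteed better than 3%, no matter how large. The target is only reachable against the 1% typical figure (rerun the sketch with GAIN_ERROR = 0.01, which gives roughly 3.75 mA as the minimum current for a 3% total error).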