Electronic – How to correctly find the RMS on an ARM7 LPC microcontroller

ac, adc, arm, microcontroller, voltage

I am trying to find the RMS of an input sine wave using the ARM7 LPC2119 microcontroller. I know the theory behind finding the RMS of such a waveform, but I am having trouble implementing it in code, especially the sampling at equally spaced mid-ordinates. I have already used timer interrupts and capture inputs to find the frequency of a sine wave, so I am familiar with the basics of programming the ARM7 LPC2119.


I am testing this functionality using a 3 V peak-to-peak sine wave with a frequency of 50 Hz. I have shifted the waveform upwards by 1.5 V so as to avoid any negative voltage reaching the ADC pins of the microcontroller.

To sample at equally spaced intervals I am using timer interrupts, in a very similar way to this example. Instead of switching on an LED, I am doing an ADC conversion.
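For reference, here is a minimal sketch of what that timer ISR could look like, assuming a Keil-style toolchain (the __irq keyword and the lpc21xx.h register names are toolchain assumptions, the buffer size is arbitrary, and ADCR is assumed to be configured elsewhere):

#include <lpc21xx.h>   //Keil register definitions for the LPC21xx family

#define N_SAMPLES 20   //arbitrary buffer size for one batch of readings

volatile unsigned short samples[N_SAMPLES];
volatile unsigned int sampleIdx = 0;

__irq void timer0ISR(void)
{
    //assumes ADCR (channel select, CLKDIV, PDN) was set up by the init code
    ADCR |= (1u << 24);                       //START = 001: begin a conversion now
    while (!(ADDR & (1u << 31)))              //wait for the DONE bit
        ;
    samples[sampleIdx] = (ADDR >> 6) & 0x3FF; //10-bit result sits in bits 15:6
    if (++sampleIdx >= N_SAMPLES)
        sampleIdx = 0;

    T0IR = 1;                                 //clear the MR0 match interrupt flag
    VICVectAddr = 0;                          //acknowledge the VIC
}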

Before using the following formula:

$$V_\text{RMS} = \sqrt{\frac{1}{n}\sum_{i=1}^{n} v_i^2}$$
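Once a buffer of samples has been collected, that formula translates almost directly into code. A minimal sketch, assuming a 3.3 V ADC reference and the 1.5 V offset described above (adjust both to the actual hardware):

#include <math.h>

double computeRms(const volatile unsigned short *samples, int n)
{
    double sumSq = 0.0;
    int i;
    for (i = 0; i < n; i++) {
        //raw 10-bit count -> volts, then remove the 1.5 V DC offset
        double v = (samples[i] * 3.3 / 1023.0) - 1.5;
        sumSq += v * v;
    }
    return sqrt(sumSq / n);  //square root of the mean of the squares
}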

I need to make sure that I am sampling at fixed intervals, and to test this I have set my timer to provide a 20 ms delay before entering the ADC function, as follows:

T0CTCR = 0x0;            //Timer mode: count on every rising PCLK edge
T0PR = 60000-1;          //Increment T0TC once every 60000 clock cycles
                         //60000 clock cycles @ 60 MHz = 1 ms

T0MR0 = 20-1;            //Zero-indexed count, hence subtracting 1: match every 20 ms
T0MCR = (1<<0) | (1<<1); //Set bit0 & bit1 to Interrupt & Reset TC on MR0
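With PCLK = 60 MHz, these settings give an interrupt period of (T0PR + 1) × (T0MR0 + 1) / PCLK = 60000 × 20 / 60 000 000 = 20 ms, i.e. the intended sampling interval.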

Once this is done, I am enabling my timer interrupt:

VICVectAddr0 = (unsigned)timer0ISR; //Pointer to the interrupt function (ISR)
VICVectCntl0 = (1<<5) | 4;          //Bit 5 = 1 enables the vectored IRQ slot; source 4 = Timer0
VICIntEnable = (1<<4);              //Enable the Timer0 interrupt
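For completeness, the timer still has to be started (not shown above); on the LPC2119 that is the usual reset-then-enable sequence on T0TCR:

T0TCR = 0x02; //reset the timer and prescale counters
T0TCR = 0x01; //take the timer out of reset and enable it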

The above code gives a 20 ms delay. With a sine wave at 50 Hz, this should give the same value on every reading, since the period of such a sine wave is also 20 ms. However, this is not the case, and I can see no visible pattern in my results.

The ADC function I am using works well on DC voltages, and I can also confirm that the timer is indeed giving a 20 ms delay before entering the ADC function.

Am I missing something obvious here or could there be some other variables I am not taking into consideration? Any ideas would be appreciated.

Best Answer

The above code gives a 20 ms delay. With a sine wave at 50 Hz, this should give the same value on every reading, since the period of such a sine wave is also 20 ms. However, this is not the case, and I can see no visible pattern in my results.

Hmm...

Am I missing something obvious here or could there be some other variables I am not taking into consideration? Any ideas would be appreciated.

A 20 ms inter-sample time is no good for 50 Hz measurements, and yes: because the AC frequency is not constant (and neither is your clock), the measurements drift in and out of phase, seemingly at random.

Look at this sine wave sampled at nearly the same period:

[Figure: a sine wave sampled at almost exactly its own period; the sampled (blue) waveform undulates slowly up and down]

It produces aliasing, i.e. the blue output should be constant, but slight differences between the sampling frequency and the sine wave's frequency cause the blue waveform to undulate up and down at the "difference" (beat) frequency.
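The effect is easy to reproduce numerically. A small host-side C sketch (the 20.1 ms sample period is just an arbitrary stand-in for a slightly-off sample clock) prints a slowly undulating sequence instead of a constant one:

#include <stdio.h>
#include <math.h>

#define PI 3.14159265358979

int main(void)
{
    const double f = 50.0;     //signal frequency in Hz
    const double Ts = 0.0201;  //sample period: 20.1 ms instead of 20 ms
    int n;
    for (n = 0; n < 50; n++) {
        //1.5 V amplitude sine sitting on a 1.5 V offset, as in the question
        double v = 1.5 + 1.5 * sin(2.0 * PI * f * n * Ts);
        printf("%2d: %.3f V\n", n, v);
    }
    return 0;
}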

If you want to measure RMS properly by sampling, you should be sampling at around 1 kHz or greater, especially if you are trying to measure the RMS of an AC current (harmonic distortion can be significant).
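To see why a higher rate fixes this, the same kind of sketch sampling at 1 kHz over exactly one 20 ms cycle returns the expected 1.5 V / sqrt(2) ≈ 1.06 V, regardless of the phase at which sampling starts:

#include <stdio.h>
#include <math.h>

#define PI 3.14159265358979

int main(void)
{
    const double f = 50.0;    //signal frequency in Hz
    const double Ts = 0.001;  //1 kHz sampling
    const int N = 20;         //20 samples span exactly one 50 Hz cycle
    double sumSq = 0.0;
    int n;
    for (n = 0; n < N; n++) {
        double v = 1.5 * sin(2.0 * PI * f * n * Ts);  //offset already removed
        sumSq += v * v;
    }
    printf("RMS = %.4f V\n", sqrt(sumSq / N));  //prints ~1.0607
    return 0;
}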