I am trying to make a low-pass IIR filter on a dsPIC (dsPIC33FJ12MC202) using the mikroC compiler. I generated the filter coefficients with the Filter Design Tool. My filter specifications are as follows:
LPF, Wp = 4 kHz, Ws = 5 kHz, Ap = 0 dB, As = 60 dB, sampling rate about 20 kHz
I am taking analog input from channel zero, filtering it, and sending the filtered value via UART. The code looks something like this:
int adcValue;
char result[6];

const unsigned int BUFFER_SIZE  = 8;
const unsigned int FILTER_ORDER = 3;

const unsigned int COEFF_B[FILTER_ORDER+1] = {0x17D9, 0x478B, 0x478B, 0x17D9};
const unsigned int COEFF_A[FILTER_ORDER+1] = {0x8000, 0xAF27, 0x383C, 0xF802};
const unsigned int SCALE_B = 1;
const unsigned int SCALE_A = 0;

unsigned int inext;

ydata unsigned int input[BUFFER_SIZE];
ydata unsigned int output[BUFFER_SIZE];

void filter(unsigned int adcValue)
{
    unsigned int CurrentValue;

    input[inext] = adcValue;
    CurrentValue = IIR_Radix(SCALE_B,
                             SCALE_A,
                             COEFF_B,         // b coefficients of the filter
                             COEFF_A,         // a coefficients of the filter
                             FILTER_ORDER+1,  // Filter order + 1
                             input,           // Input buffer
                             BUFFER_SIZE,     // Input buffer length
                             output,          // Output buffer
                             inext);          // Current sample
    //CurrentValue = 2048;
    output[inext] = CurrentValue;
    inext = (inext + 1) & (BUFFER_SIZE - 1);  // inext = (inext + 1) mod BUFFER_SIZE

    // Send the filtered value via UART
    WordToStr(CurrentValue, result);
    strcat(result, "\n\r");
    UART1_Write_Text(result);
}

void main()
{
    inext = 0;                           // Initialize buffer index
    Vector_Set(input, BUFFER_SIZE, 0);   // Clear input buffer
    Vector_Set(output, BUFFER_SIZE, 0);  // Clear output buffer

    // Using R12, R13 for Rx, Tx
    PPS_Mapping(12, _INPUT, _U1RX);
    PPS_Mapping(13, _OUTPUT, _U1TX);
    UART1_Init(115200);
    ADC1_Init();

    while (1)
    {
        adcValue = ADC1_Read(0);
        Delay_us(50);
        filter(adcValue);
    }
}
The IIR_Radix function is provided by the mikroC compiler, so I expect the filter to pass the input largely unchanged below 4 kHz and to attenuate it above 4 kHz.
Currently I am simulating the circuit in Proteus. When I apply a sine wave generator (amplitude 5 V, frequency 1000 Hz) to the ADC channel, I get an output whose maximum value is 95 (e.g. 0, 22, 69, 95, 69, 22, 0, …) on the Virtual Terminal. But 1000 Hz is in the passband, so the output should match the input, something like (0, 25, 499, 1023, 499, 25, 0, …). Why am I getting attenuated output even for passband frequencies?
I am analyzing the output on the Virtual Terminal inside the Proteus simulation (screenshot at right).
Best Answer
You say that the sampling frequency is "about 20 kHz" (perhaps exactly 22.1 kHz, typical for audio?). So the period of a 1 kHz sine wave should be about 20 samples.
But your data has a period of 6 samples. That isn't 1 kHz; it's more like 3.3 kHz. That is getting somewhat close to the corner frequency, although not close enough to explain attenuation by a factor of 5.
Looking at the screenshot, though, every second value is zero. So your actual frequency is just under fs/2, and that is solidly in the stopband.