Analog video chrominance decoding – PAL/NTSC

color · fpga · ntsc · programmable-logic · video

I'm implementing an analog video decoder on an FPGA.
I'm running into some difficulties with chrominance decoding and would appreciate your help.
These are the steps I'm following:

  1. I generate an NTSC color-bar signal from a TV pattern generator and acquire it through an RF board with the following parameters: local oscillator = 500 MHz, sampling frequency = 54 MHz (4 × 13.5 MHz), BW = 6 MHz, ADC resolution = 12 bits.


  2. After inverting the signal, I apply a digital low-pass FIR and a band-pass FIR to separate luma and chroma.


  3. I generate the chroma subcarrier (3.579545 MHz for NTSC) locally with a DDS. It actually produces a SIN and a COS output, each coded on 16 bits.


  4. Using a Xilinx multiplier IP, I multiply the chrominance signal by the COS output (V), and with a second multiplier I multiply the same chrominance by the SIN output (U), so the two products are demodulated on axes 90° apart.


  5. After that, I apply two LPFs with cutoff frequencies of 500 kHz for U and 1.3 MHz for V. I then get the red and blue signals shown below (a software sketch of this whole chain follows the list).



Up to this stage, everything looks OK.
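For reference, here is a minimal NumPy model of steps 2–5. It is a sketch, not the actual Xilinx IP configuration: the filter lengths and the 3 MHz luma cutoff are illustrative assumptions, and the local carrier is free-running rather than locked to the colorburst, which is exactly the situation discussed in the answer below.

```python
import numpy as np
from scipy import signal

FS = 54e6               # sampling frequency, 4 x 13.5 MHz
FSC = 3.579545e6        # NTSC chroma subcarrier

def demodulate_line(composite):
    """composite: one line of inverted, digitized composite video."""
    n = np.arange(len(composite))

    # Step 2: low-pass for luma, band-pass around fsc for chroma
    luma_taps = signal.firwin(63, 3.0e6, fs=FS)
    chroma_taps = signal.firwin(63, [2.3e6, 4.2e6], pass_zero=False, fs=FS)
    luma = signal.lfilter(luma_taps, 1.0, composite)
    chroma = signal.lfilter(chroma_taps, 1.0, composite)

    # Step 3: local quadrature subcarrier (free-running DDS model)
    phase = 2 * np.pi * FSC * n / FS
    cos_lo, sin_lo = np.cos(phase), np.sin(phase)

    # Step 4: mix the chroma against both carrier phases
    v_mixed = chroma * cos_lo
    u_mixed = chroma * sin_lo

    # Step 5: LPFs remove the 2*fsc mixing products; the factor of 2
    # restores the amplitude lost in the mixing
    u = 2 * signal.lfilter(signal.firwin(63, 0.5e6, fs=FS), 1.0, u_mixed)
    v = 2 * signal.lfilter(signal.firwin(63, 1.3e6, fs=FS), 1.0, v_mixed)
    return luma, u, v
```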

PROBLEM:
I expect the same output on all video lines, but that is only the case for 4 or 5 successive lines; then it changes and no longer represents the red and blue levels. This is an extract of what I get:


Any explanation, please?
Thanks in advance.

Best Answer

You need to adjust the DDS-generated subcarrier so that it matches the actual colorburst reference in the sampled video. Compare the two in frequency and phase, and use the result to tune the DDS phase increment until the locally generated carrier tracks the burst. A free-running DDS is never exactly on frequency, so its phase error relative to the burst accumulates from line to line; that is why your demodulated axes look right for a few lines and then rotate away. There are many ways to implement the comparison, and many video technology books describe the digital decoding of sampled composite video; one possible structure is sketched below.
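A hedged NumPy sketch of one such structure: correlate the gated colorburst against the local sin/cos, take the atan2 of the correlations as the phase error, and trim the DDS phase increment with a small PI loop once per line. The 32-bit accumulator width, the loop gains, and the burst-gate handling are all assumptions for illustration, not a reference design.

```python
import numpy as np

FS = 54e6                                         # sampling frequency
FSC = 3.579545e6                                  # NTSC subcarrier
ACC_BITS = 32
NOMINAL_INC = round(FSC / FS * 2**ACC_BITS)       # nominal DDS phase increment

class BurstLockedDDS:
    def __init__(self, kp=0.05, ki=0.005):
        self.acc = 0                 # phase accumulator
        self.inc = NOMINAL_INC       # current phase increment
        self.kp, self.ki = kp, ki    # PI loop gains (illustrative values)
        self.integrator = 0.0

    def sample(self):
        """Advance one sample; return (sin, cos) of the local subcarrier."""
        self.acc = (self.acc + self.inc) & (2**ACC_BITS - 1)
        theta = 2 * np.pi * self.acc / 2**ACC_BITS
        return np.sin(theta), np.cos(theta)

    def lock_to_burst(self, burst, lo_sin, lo_cos):
        """Call once per line with the gated back-porch burst samples and
        the local carrier generated over the same gate."""
        i = np.dot(burst, lo_cos)
        q = np.dot(burst, lo_sin)
        # Phase of the burst relative to the local carrier. The NTSC burst
        # nominally sits at 180 degrees on the -(B-Y) axis, so the loop
        # drives the error toward that reference rather than toward zero.
        err = np.arctan2(q, i) - np.pi
        err = (err + np.pi) % (2 * np.pi) - np.pi   # wrap to [-pi, pi)
        self.integrator += self.ki * err
        trim = (self.kp * err + self.integrator) / (2 * np.pi)
        # The loop sign depends on your mixer conventions; flip if it diverges.
        self.inc = NOMINAL_INC + int(round(trim * NOMINAL_INC))
```

In an FPGA you would do the correlations with simple multiply-accumulate blocks over the burst gate and replace the atan2 with a CORDIC or even a sign-based (bang-bang) phase detector; the loop structure stays the same.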