How to best “synchronize” image producer/consumer to handle different FPS

clock, fpga, video

I'm building a system that among other things has an imaging sensor spitting out digital video @ 30 fps and a display system that displays to a monitor @ 60 fps. The system currently utilizes a frame buffer controller that I've implemented in an FPGA and off chip memory. The reader (monitor) reads from the buffer at will and the writer (camera) pumps data into the buffer at will (it's a fully asynchronous controller). There's enough memory to store a few frames of data. The system works fine. I can display the video feed from the camera on a monitor without a hitch.

The issue I'm facing is that the reader and writer have different clock rates/resolutions and, therefore, different frame rates. What this implies is that at some point the monitor displays data from the current imaging frame but, because it consumes data faster than the camera produces it, "outruns" the camera and starts displaying data from a stale camera frame. This is OK provided that the camera doesn't move quickly, or that the same scene is imaged over and over. If, however, the camera moves rapidly or something in the scene moves rapidly (e.g. faster than the camera sampling rate), then it's clear that the current frame will be substantially different from the previous frame. That means the monitor can display portions of both frames at the same time, which manifests as tearing: a seam "moving" down the screen.

So, getting to my question: I'm wondering what's the best way to solve this problem. The monitor and camera have two independent clocks that are asynchronous to each other. I control (I generate) the pixel clock for the monitor, but the camera pixel clock is 100% asynchronous to the system (it comes out of a deserializer IC). A few more details: the FPGA is clocked at 100 MHz from an oscillator. I use a DCM inside to generate the 40 MHz pixel clock for the monitor and have several other pieces of the system running at 100 MHz. The camera pixel clock is about 13 MHz (but can go up to 27 MHz). The camera frame rate is normally fixed at 30 fps, but can drop depending on exposure control.
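Because the camera domain is fully asynchronous, any status flag (for instance a "frame complete" signal) has to be brought across the clock boundary safely before the monitor-side logic can act on it. A minimal two-flip-flop synchronizer sketch in Verilog (module and signal names are illustrative, not from my actual design):

```verilog
// Two-flip-flop synchronizer: brings a single-bit level signal generated
// in the camera pixel-clock domain safely into the monitor pixel-clock
// domain, resolving metastability in the first stage.
module flag_sync (
    input  wire dst_clk,    // destination (monitor pixel) clock
    input  wire async_flag, // level signal from the camera clock domain
    output wire sync_flag   // same signal, safe to use in dst_clk domain
);
    reg [1:0] sync_ff = 2'b00;

    always @(posedge dst_clk)
        sync_ff <= {sync_ff[0], async_flag};

    assign sync_flag = sync_ff[1];
endmodule
```

Note this is only safe for levels (or slow toggles): a single-cycle pulse in the 13 MHz camera domain could be missed, so a "frame done" event is usually carried as a toggle that the destination domain edge-detects.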

I could do something like always display the same frame twice, so that the monitor runs at 60 fps but shows 30 distinct frames per second. That would work OK, but if I ever change the camera frame rate to something arbitrary that doesn't divide nicely into 60, I'm out of luck. Maybe more importantly, if I change the monitor fps, this works poorly. As a more robust extension of this idea I could display only "whole frames": in other words, don't start reading a new image frame until it's entirely written (or, more robustly, until you know the reader won't overrun the camera on that frame). This would mean that not every image frame gets the same number of display frames, but maybe that's OK.
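For what it's worth, that "whole frames" idea boils down to a small piece of gating logic. A hypothetical sketch, assuming the buffer holds four frames addressed by a 2-bit frame index and that the writer's "last completed frame" index has already been synchronized into the reader's clock domain:

```verilog
// Frame-gating sketch (hypothetical names): the reader re-reads its
// current frame until the writer has completed at least the next one,
// so a frame that is still being written is never scanned out.
module frame_gate (
    input  wire       rd_clk,
    input  wire       rd_frame_done,    // pulses when the reader finishes a frame
    input  wire [1:0] wr_last_done,     // index of last fully written frame,
                                        // already synchronized into rd_clk
    output reg  [1:0] rd_frame = 2'd0   // frame index the reader scans out
);
    always @(posedge rd_clk)
        if (rd_frame_done && rd_frame != wr_last_done)
            rd_frame <= rd_frame + 2'd1; // advance only into completed frames
endmodule
```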

Any suggestions? How do commercial systems typically handle this problem? In particular, how do USB cameras handle it? That seems like a pretty appropriate analog. Presumably they stream data into a buffer on the host side and the display software has to process the data and hand it to the graphics controller (I'm thinking of something like a call to glutSwapBuffers).

Best Answer

In programming, this technique is called double buffering. You have two memory buffers between the camera and the monitor. While the camera fills one of them, the other is displayed on the monitor (for however long that takes: 1, 2, or more monitor frames). When the first buffer is full (the whole frame has been written by the camera), the two buffers are swapped: the second buffer is now filled by the camera and the first one is displayed on the monitor.

This way, some camera frames will be displayed for 2 monitor frames, some for only 1 (if the camera runs faster than half the monitor frame rate), and some for 3 (if the camera runs slower than half the monitor frame rate), but the synchronization is provided automatically.
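A minimal sketch of the swap logic in Verilog, with illustrative names; it assumes the camera's "frame done" event has already been synchronized into the reader's clock domain (e.g. with the two-flip-flop scheme above), and that the swap is committed only during the reader's vertical blanking so a displayed frame is never torn:

```verilog
// Double-buffer swap sketch: two frame buffers, selected by one bit.
// The camera writes buffer wr_sel; the monitor reads buffer ~wr_sel.
module dbuf_swap (
    input  wire rd_clk,        // monitor pixel clock
    input  wire wr_frame_done, // camera finished a frame (synchronized to rd_clk)
    input  wire rd_vblank,     // reader is in vertical blanking
    output reg  wr_sel = 1'b0, // buffer the camera writes next
    output wire rd_sel         // buffer the monitor reads
);
    reg pending = 1'b0;        // a completed frame is waiting to be shown

    always @(posedge rd_clk) begin
        if (pending && rd_vblank) begin
            wr_sel  <= ~wr_sel;  // hand the fresh buffer to the reader
            pending <= 1'b0;
        end
        if (wr_frame_done)
            pending <= 1'b1;     // later assignment wins: a new completion
                                 // in the swap cycle keeps 'pending' set
    end

    assign rd_sel = ~wr_sel;
endmodule
```

In a complete design, wr_sel would in turn be synchronized back into the camera clock domain before steering the write address, for the same metastability reasons.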

I hope this explanation is clear enough and that I understood the problem correctly.