How to convert RGB to YUV420P for the ffmpeg encoder

ffmpeg, rgb, video-encoding, yuv

I want to make an .avi video file from bitmap images using C++ code.
I wrote the following code:

// Get RGB pixel data from the BMP file
uint8_t* rgb24Data = new uint8_t[3 * imgWidth * imgHeight];
hBitmap = (HBITMAP) LoadImage(NULL, _T("myfile.bmp"), IMAGE_BITMAP, 0, 0, LR_LOADFROMFILE);
GetDIBits(hdc, hBitmap, 0, imgHeight, rgb24Data, (BITMAPINFO*)&bmi, DIB_RGB_COLORS);

/* Allocate the encoded raw picture. */
AVPicture dst_picture;
avpicture_alloc(&dst_picture, AV_PIX_FMT_YUV420P, imgWidth, imgHeight);

/* Convert rgb24Data to YUV420P and store it in dst_picture.data */
RGB24toYUV420P(imgWidth, imgHeight, rgb24Data, dst_picture.data); // How to implement this function?

// code to encode the frame in dst_picture goes here

My problem is how to implement the RGB24toYUV420P() function. It should convert the RGB24 data in rgb24Data to YUV420P and store the result in dst_picture.data so it can be passed to the FFmpeg encoder.

Best Answer

You can use libswscale from FFmpeg like this:

extern "C" {                     // libswscale is a C library, so wrap the header when compiling as C++
#include <libswscale/swscale.h>
}

SwsContext *ctx = sws_getContext(imgWidth, imgHeight, AV_PIX_FMT_RGB24,
                                 imgWidth, imgHeight, AV_PIX_FMT_YUV420P,
                                 SWS_BILINEAR, NULL, NULL, NULL); // sizes match, so no real scaling happens
const uint8_t *inData[1] = { rgb24Data };   // RGB24 is packed, so there is only one input plane
int inLinesize[1] = { 3 * imgWidth };       // RGB24 stride: 3 bytes per pixel
sws_scale(ctx, inData, inLinesize, 0, imgHeight,
          dst_picture.data, dst_picture.linesize);
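
One caveat worth checking with the GetDIBits call in the question: a 24-bit Windows DIB typically stores its pixels in blue-green-red order (and bottom-up when biHeight is positive). If the encoded colors look swapped, a hedged variant of the same call, assuming the same imgWidth and imgHeight, is to tell swscale the input is BGR:

// Same conversion, but with the source declared as packed BGR instead of RGB.
SwsContext *ctx = sws_getContext(imgWidth, imgHeight, AV_PIX_FMT_BGR24,
                                 imgWidth, imgHeight, AV_PIX_FMT_YUV420P,
                                 SWS_BILINEAR, NULL, NULL, NULL);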

Note that you should create the SwsContext only once, not once per frame, and free it with sws_freeContext() when you are finished.
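
To make that lifecycle concrete, here is a minimal sketch (not part of the original answer) of how the pieces fit around an encoding loop, assuming the same imgWidth, imgHeight, rgb24Data, and dst_picture variables from the question; the loop itself is only indicated by comments:

extern "C" {
#include <libswscale/swscale.h>
}

// Create the converter once, before the encoding loop.
SwsContext *sws = sws_getContext(imgWidth, imgHeight, AV_PIX_FMT_RGB24,
                                 imgWidth, imgHeight, AV_PIX_FMT_YUV420P,
                                 SWS_BILINEAR, NULL, NULL, NULL);
if (!sws) { /* handle the error: unsupported format or out of memory */ }

// Inside the loop, reuse it for every frame: rgb24Data holds the current
// frame's packed RGB24 pixels, dst_picture receives the YUV420P planes.
const uint8_t *srcData[1]     = { rgb24Data };
const int      srcLinesize[1] = { 3 * imgWidth };
sws_scale(sws, srcData, srcLinesize, 0, imgHeight,
          dst_picture.data, dst_picture.linesize);
// ...encode dst_picture here, once per frame...

// Release the converter once, after the last frame has been encoded.
sws_freeContext(sws);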
