Live video streaming application on Android

android, live-streaming, video

I am trying to build an application that streams live video from an Android device to a server.

Using the MediaRecorder class, I am able to capture video as 3GP with the H.263 codec.
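
For context, this kind of capture setup usually looks something like the sketch below: MediaRecorder is pointed at a socket file descriptor instead of a file, and whatever reads the other end of that socket forwards the 3GP bytes to the server. Class and variable names here are illustrative only, not from any particular project.

```java
import android.hardware.Camera;
import android.media.MediaRecorder;
import android.net.LocalSocket;
import android.view.Surface;

import java.io.IOException;

// Rough sketch of MediaRecorder-based capture: record 3GP/H.263 into a local
// socket rather than a file, so another thread can read the bytes and push
// them to the server.
public class RecorderSketch {

    public MediaRecorder startCapture(Camera camera, Surface previewSurface,
                                      LocalSocket sender) throws IOException {
        camera.unlock();  // hand the camera over to MediaRecorder

        MediaRecorder recorder = new MediaRecorder();
        recorder.setCamera(camera);
        recorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
        recorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);
        recorder.setVideoEncoder(MediaRecorder.VideoEncoder.H263);
        recorder.setPreviewDisplay(previewSurface);

        // Write into the socket instead of a file; the far end of the socket
        // reads the bytestream and forwards it to the server.
        recorder.setOutputFile(sender.getFileDescriptor());

        recorder.prepare();
        recorder.start();
        return recorder;
    }
}
```

One caveat with this route: when MediaRecorder writes to a non-seekable descriptor it never gets to finalize the container headers, so the reading side typically has to parse the raw encoder bytestream itself; that is the "hacky" business the answer below refers to.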

However, when I run my application and stream the media, there is a 2–3 second delay on the server side.

Why am I getting this delay? Are there any internal buffers that I need to flush? Are there other ways of streaming video apart from using the MediaRecorder class?

Best Answer

If you're set on RTMP streaming from Android, the best solution is MediaCodec + FFmpeg + librtmp. This avoids any hacky "detect the NAL unit within the bytestream" business but requires Android 4.3 (API 18). Skate where the puck is going...
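
To make that concrete, here is a minimal sketch of the MediaCodec half of such a pipeline on Android 4.3+ (API 18): the camera renders into the encoder's input Surface, and each encoded H.264 buffer is handed to a native sender that would be backed by FFmpeg/librtmp. The RtmpSender interface is a placeholder for that JNI bridge, not a real API; format changes, codec reconfiguration, and end-of-stream handling are omitted.

```java
import android.media.MediaCodec;
import android.media.MediaCodecInfo;
import android.media.MediaFormat;
import android.view.Surface;

import java.io.IOException;
import java.nio.ByteBuffer;

// Sketch of a Surface-input H.264 encoder (API 18+). Encoded access units are
// handed to a hypothetical RtmpSender that wraps FFmpeg/librtmp over JNI.
public class StreamEncoder {
    private MediaCodec encoder;
    private final MediaCodec.BufferInfo bufferInfo = new MediaCodec.BufferInfo();

    public Surface prepare(int width, int height) throws IOException {
        MediaFormat format = MediaFormat.createVideoFormat("video/avc", width, height);
        format.setInteger(MediaFormat.KEY_COLOR_FORMAT,
                MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface);
        format.setInteger(MediaFormat.KEY_BIT_RATE, 1000000);
        format.setInteger(MediaFormat.KEY_FRAME_RATE, 30);
        format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 2);

        encoder = MediaCodec.createEncoderByType("video/avc");
        encoder.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
        // Render camera frames into this Surface (e.g. via OpenGL ES).
        Surface inputSurface = encoder.createInputSurface();
        encoder.start();
        return inputSurface;
    }

    // Call repeatedly from a dedicated thread to pull encoded frames.
    public void drain(RtmpSender sender) {
        int index = encoder.dequeueOutputBuffer(bufferInfo, 10000 /* microseconds */);
        while (index >= 0) {
            ByteBuffer encoded = encoder.getOutputBuffers()[index];
            encoded.position(bufferInfo.offset);
            encoded.limit(bufferInfo.offset + bufferInfo.size);
            // Buffers flagged BUFFER_FLAG_CODEC_CONFIG carry the SPS/PPS,
            // which the RTMP muxer also needs.
            sender.send(encoded, bufferInfo.presentationTimeUs);
            encoder.releaseOutputBuffer(index, false);
            index = encoder.dequeueOutputBuffer(bufferInfo, 0);
        }
        // Negative return codes (try-again / format-changed) are ignored here.
    }

    // Placeholder for the JNI bridge into FFmpeg/librtmp; not a real API.
    public interface RtmpSender {
        void send(ByteBuffer accessUnit, long presentationTimeUs);
    }
}
```

Rendering the camera into createInputSurface() avoids copying raw frames through the app, which is the main reason this path needs API 18.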

I've developed an open source SDK that demonstrates RTMP streaming with FFmpeg + librtmp as pre-built shared libraries. The SDK is focused on HLS streaming, but RTMP support is present.

If you'd like help building FFmpeg yourself for Android (with or without librtmp), check out my guide.