iPhone – What’s the best way of live-streaming the iPhone camera to a media server?

Tags: avfoundation, ffmpeg, ios4, iphone, video-streaming

According to this post, What Techniques Are Best To Live Stream iPhone Video Camera Data To a Computer?, it is possible to get compressed data from the iPhone camera, but as I've been reading in the AVFoundation reference, you only get uncompressed data.

So the questions are:

1) How can I get compressed frames and audio from the iPhone's camera?

2) Is encoding uncompressed frames with ffmpeg's API fast enough for real-time streaming?

Any help will be really appreciated.

Thanks.

Best Answer

You most likely already know....

1) How can I get compressed frames and audio from the iPhone's camera?

You cannot do this. The AVFoundation API has prevented this from every angle. I even tried named pipes and some other sneaky Unix foo. No such luck. You have no choice but to write it to a file. In your linked post, a user suggests setting up a callback to deliver encoded frames. As far as I am aware, this is not possible for H.264 streams. The capture delegate will deliver images encoded in a specific pixel format. It is the movie writers and AVAssetWriter that do the encoding.
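To make the point concrete, here is a minimal sketch (class and variable names are my own, not from the original post) of what the capture delegate actually hands you: a raw `CVPixelBuffer` in whatever pixel format you requested, never an H.264 frame.

```swift
import AVFoundation

// Sketch: a video-data-output delegate. Every callback delivers an
// *uncompressed* sample buffer; there is no output setting here that
// yields H.264. (Names like FrameReceiver are illustrative only.)
final class FrameReceiver: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        // Each frame arrives as a raw pixel buffer, not an encoded frame.
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
        // Inspect the pixel format, or hand the buffer to your own encoder.
        _ = CVPixelBufferGetPixelFormatType(pixelBuffer)
    }
}

let videoOutput = AVCaptureVideoDataOutput()
// You can pick the pixel format (e.g. BGRA), but the delegate still
// receives uncompressed pixels either way.
videoOutput.videoSettings = [
    kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA
]
```

The only knob `videoSettings` gives you is the pixel format of the uncompressed frames; compression happens later, in `AVAssetWriter` (or a movie file output), which is exactly why you cannot intercept encoded frames at this stage.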

2) Is encoding uncompressed frames with ffmpeg's API fast enough for real-time streaming?

Yes, it is. However, you will have to use libx264, which gets you into GPL territory. That is not exactly compatible with the App Store.

I would suggest using AVFoundation and AVAssetWriter for efficiency reasons.
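A minimal sketch of that suggestion (file path and dimensions are placeholder values, not from the original answer): configure an `AVAssetWriter` with H.264 output settings and a real-time input, and let the hardware encoder do the work instead of libx264.

```swift
import AVFoundation

// Sketch, assuming a 1280x720 stream and a temporary output file.
let outputURL = URL(fileURLWithPath: NSTemporaryDirectory())
    .appendingPathComponent("segment.mp4")

let writer = try AVAssetWriter(outputURL: outputURL, fileType: .mp4)

// Ask AVAssetWriter for H.264; encoding is hardware-accelerated.
let settings: [String: Any] = [
    AVVideoCodecKey: AVVideoCodecType.h264,
    AVVideoWidthKey: 1280,
    AVVideoHeightKey: 720
]
let input = AVAssetWriterInput(mediaType: .video, outputSettings: settings)
input.expectsMediaDataInRealTime = true  // important for live capture
writer.add(input)

// In the capture delegate you would then call input.append(sampleBuffer)
// while input.isReadyForMoreMediaData is true.
```

The trade-off is that `AVAssetWriter` writes to a file rather than handing you encoded frames, so for streaming you would read the written segments back and push them to your server yourself.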
