You can use AVCaptureVideoDataOutput and a sampleBufferDelegate to capture compressed frames, then you just need to stream them over the network. AVFoundation provides an API to encode frames to local video files, but doesn't provide any for streaming to the network. Your best bet is to find a library that streams raw frames over the network. I'd start with ffmpeg; I believe libavformat supports RTSP; look at the ffserver code.
Note that you should configure AVCaptureVideoDataOutput
to give you compressed frames, so you avoid having to compress raw video frames without the benefit of hardware encoding.
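A minimal capture setup along these lines might look like the following Swift sketch. The `FrameStreamer` class name is illustrative, and the delegate callback body is left as a stub; `AVCaptureVideoDataOutput` itself hands you sample buffers on the queue you specify, and what you do with them (mux, compress, send) is up to you:

```swift
import AVFoundation

// Sketch: a capture session with a video-data output that delivers
// frames to a delegate on a background queue. FrameStreamer is a
// hypothetical class that would forward frames to the network.
final class FrameStreamer: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    let session = AVCaptureSession()
    private let queue = DispatchQueue(label: "video.frames")

    func start() throws {
        guard let device = AVCaptureDevice.default(for: .video) else { return }
        session.addInput(try AVCaptureDeviceInput(device: device))

        let output = AVCaptureVideoDataOutput()
        output.alwaysDiscardsLateVideoFrames = true   // drop frames rather than queue them
        output.setSampleBufferDelegate(self, queue: queue)
        session.addOutput(output)
        session.startRunning()
    }

    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        // Hand each frame to your muxer/streamer here
        // (e.g. feed it to libavformat for RTSP output).
    }
}
```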
I posted this on the Apple developer forum; we are carrying on a lively (excuse the pun) discussion. This was in answer to someone who brought up a similar notion.
Correct me if I am wrong (and give us an example if you disagree), but creating an MPEG-TS from the raw H.264 you get out of AVCaptureVideoDataOutput is not an easy task unless you transcode using x264 or something similar. Let's assume for a minute that you could easily get MPEG-TS files; then it would be a simple matter of listing them in an m3u8 playlist, launching a little web server, and serving them.
As far as I know, and there are many, many apps that do it, using localhost tunnels on the device is not grounds for rejection. So maybe you could somehow generate HLS from the device, though I question the performance you would get.
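For reference, the m3u8 playlist such a local server would hand out is just a text file listing the segments. A hypothetical live playlist with three 10-second segments (file names are made up) looks roughly like this:

```
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:10
#EXT-X-MEDIA-SEQUENCE:0
#EXTINF:10.0,
segment0.ts
#EXTINF:10.0,
segment1.ts
#EXTINF:10.0,
segment2.ts
```

The hard part is not this playlist; it is producing valid .ts segments from the raw H.264 in the first place.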
So on to technique number 2
Still using AVCaptureVideoDataOutput, you capture the frames, wrap them in some neat little protocol (JSON, or perhaps something more esoteric like bencode), open a socket, and send them to your server.
Ahh... good luck. You had better have a nice, robust network, because sending uncompressed frames even over Wi-Fi is going to require serious bandwidth.
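To put a rough number on that (assuming 720p frames in 32-bit BGRA at 30 fps, before any protocol overhead):

```swift
// Back-of-the-envelope bandwidth for uncompressed 1280x720 BGRA at 30 fps.
let bytesPerFrame = 1280 * 720 * 4          // 3,686,400 bytes ≈ 3.7 MB per frame
let bytesPerSecond = bytesPerFrame * 30     // 110,592,000 bytes
print(Double(bytesPerSecond) / 1_000_000)   // ≈ 110.6 MB/s
```

That is close to a gigabit per second of payload, which no Wi-Fi link of the era will sustain reliably.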
So on to technique number 3.
You write a new movie using AVAssetWriter and read back from the temp file using standard C functions. This is fine, but what you have is raw H.264: the MP4 is not complete, so it does not have a moov atom yet. Now comes the fun part: regenerating that header. Good luck.
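You can see the problem by walking the top-level boxes of the temp file. This sketch (the file path is illustrative, and 64-bit box sizes are not handled) reads the 4-byte big-endian size plus 4-byte type code that starts each MP4/QuickTime atom; on a still-being-written recording you will typically find `ftyp` and `mdat` but no `moov`:

```swift
import Foundation

// Walk the top-level atoms of an MP4/MOV file. Each atom begins with a
// 4-byte big-endian size followed by a 4-byte ASCII type code.
func topLevelAtoms(of url: URL) throws -> [String] {
    let data = try Data(contentsOf: url)
    var atoms: [String] = []
    var offset = 0
    while offset + 8 <= data.count {
        let size = data[offset..<offset + 4].reduce(0) { ($0 << 8) | Int($1) }
        let type = String(bytes: data[offset + 4..<offset + 8], encoding: .ascii) ?? "????"
        atoms.append(type)
        guard size >= 8 else { break }   // extended/zero sizes not handled in this sketch
        offset += size
    }
    return atoms
}

// On a half-written recording, topLevelAtoms(of:) returns something like
// ["ftyp", "mdat"] — no "moov", so no standard player can open the file.
```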
So on to technique 4, which actually seems to have some merit.
We create not one but two AVAssetWriters and manage them using a GCD dispatch_queue. Since an AVAssetWriter can only be used once after instantiation, we start the first one on a timer; after a predetermined period, say 10 seconds, we start the second while tearing the first one down. Now we have a series of .mov files with complete moov atoms, each containing compressed H.264 video. We can send these to the server and assemble them into one complete video stream. Alternatively, we could use a simple streamer that takes the .mov files, wraps them in the RTMP protocol using librtmp, and sends them to a media server.
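A sketch of that rotation logic might look like this. The segment length, file naming, and the `makeWriter(_:)` helper are all assumptions; real code also has to configure inputs, call `startSession(atSourceTime:)`, and stop appending samples to a writer before finishing it:

```swift
import AVFoundation

// Sketch: rotate between AVAssetWriters every `segmentSeconds`, producing
// a series of complete, self-contained .mov segments on a serial queue.
final class SegmentedRecorder {
    private let queue = DispatchQueue(label: "segment.rotation")
    private var active: AVAssetWriter?
    private var segmentIndex = 0
    private let segmentSeconds = 10.0

    func start() {
        queue.async { self.rotate() }
    }

    private func rotate() {
        let finished = active
        segmentIndex += 1
        let url = URL(fileURLWithPath: NSTemporaryDirectory())
            .appendingPathComponent("segment\(segmentIndex).mov")
        active = try? makeWriter(url)        // fresh writer takes over
        active?.startWriting()

        finished?.finishWriting {            // old segment gets its moov atom written
            // Upload the finished segment to the server here.
        }
        queue.asyncAfter(deadline: .now() + segmentSeconds) { self.rotate() }
    }

    private func makeWriter(_ url: URL) throws -> AVAssetWriter {
        // Assumed helper: in real code, also create and attach an
        // AVAssetWriterInput configured for H.264 video.
        try AVAssetWriter(outputURL: url, fileType: .mov)
    }
}
```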
Could we just send each individual .mov file to another Apple device, getting device-to-device communication? That question has been misinterpreted many, many times. Locating another iPhone on the same subnet over Wi-Fi is pretty easy and could be done. Locating another device over TCP on a cellular connection is almost magical; if it can be done at all, it is only possible on cell networks that use addressable IPs, and not all common carriers do.
Say you could; you then have an additional issue, because none of the AVFoundation video players will be able to handle the transition between that many separate movie files. You would have to write your own streaming player, probably based on ffmpeg decoding. (That does work rather well.)
Best Answer
You most likely already know....
You cannot do this. The AVFoundation API has prevented this from every angle. I even tried named pipes and some other sneaky Unix foo. No such luck. You have no choice but to write to a file. In your linked post, a user suggests setting up the callback to deliver encoded frames. As far as I am aware, this is not possible for H.264 streams. The capture delegate delivers images encoded in a specific pixel format; it is the movie writers and AVAssetWriter that do the encoding.
Yes, it is. However, you will have to use libx264, which gets you into GPL territory. That is not exactly compatible with the App Store.
I would suggest using AVFoundation and AVAssetWriter for efficiency reasons.