It's certainly possible to develop on a Windows machine; in fact, my first application was developed exclusively on the old Dell Precision I had at the time :)
There are three routes:
- Install OSx86 (aka iATKOS / Kalyway) on a second partition/disk and dual boot.
- Run Mac OS X Server under VMWare (Mac OS X 10.7 (Lion) onwards, read the update below).
- Use Delphi XE4 with the MacinCloud service. This is a commercial toolset, but component and library support is growing.
The first route requires modifying (or using a pre-modified) image of Leopard that can be installed on a regular PC. This is not as hard as you might think, although your success/effort ratio will depend on how closely your PC's hardware matches Apple's. For example, if you're running a Core 2 Duo on an Intel motherboard with an NVIDIA graphics card, you're laughing. If you're running an AMD machine or something without SSE3, it gets a little more involved.
If you purchase (or already own) a copy of Leopard, then this is a gray area, since the Leopard EULA states that you may only run it on an "Apple-labeled" machine. As many point out, if you stick an Apple sticker on your PC, you're probably covered.
The second option is more costly. The EULA for the workstation version of Leopard prevents it from being run under emulation, and as a result there's no support for it in VMware. Leopard Server, however, CAN be run under emulation and used for desktop purposes. Leopard Server and VMware are expensive, though.
If you're interested in option 1, I would suggest starting at InsanelyMac and reading the OSx86 sections.
I do think you should consider whether the time you'll invest will be worth the money you save, though. It was for me, because I enjoy tinkering with this type of stuff, and I started during the early iPhone betas, months before the App Store became available.
Alternatively, you could pick up a low-spec Mac Mini from eBay. You don't need much horsepower to run the SDK and you can always sell it on later if you decide to stop development or buy a better Mac.
Update: You cannot create a Mac OS X client virtual machine for OS X 10.6 and earlier; Apple does not allow these client OSes to be virtualized. From Mac OS X 10.7 (Lion) onwards, Apple changed its licensing agreement with regard to virtualization. Source: VMware Knowledge Base
There isn't a built-in way to do this, as far as I know. As you say, HTTP Live Streaming is for delivering video to the iPhone, not capturing from it.
The way I'm doing it is to implement an AVCaptureSession, which has a delegate with a callback that's run on every frame. That callback sends each frame over the network to the server, which has a custom setup to receive it.
Here's the flow: https://developer.apple.com/library/content/documentation/AudioVideo/Conceptual/AVFoundationPG/Articles/04_MediaCapture.html#//apple_ref/doc/uid/TP40010188-CH5-SW2
And here's some code:
// make input device
NSError *deviceError = nil;
AVCaptureDevice *cameraDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
AVCaptureDeviceInput *inputDevice = [AVCaptureDeviceInput deviceInputWithDevice:cameraDevice error:&deviceError];
if (!inputDevice) {
    NSLog(@"Couldn't create capture input: %@", deviceError);
    return;
}
// make output device and receive its sample buffers on the main queue
// (use a dedicated serial queue instead if the per-frame work is heavy)
AVCaptureVideoDataOutput *outputDevice = [[AVCaptureVideoDataOutput alloc] init];
[outputDevice setSampleBufferDelegate:self queue:dispatch_get_main_queue()];
// initialize capture session -- keep a strong reference to it (e.g., an ivar),
// otherwise the session is deallocated and capture stops
AVCaptureSession *captureSession = [[AVCaptureSession alloc] init];
[captureSession addInput:inputDevice];
[captureSession addOutput:outputDevice];
// make preview layer and add it so that the camera's view is displayed on screen
AVCaptureVideoPreviewLayer *previewLayer = [AVCaptureVideoPreviewLayer layerWithSession:captureSession];
previewLayer.frame = view.bounds;
[view.layer addSublayer:previewLayer];
// go!
[captureSession startRunning];
Then the output device's delegate (here, self) has to implement the callback:
-(void) captureOutput:(AVCaptureOutput*)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection*)connection
{
CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer( sampleBuffer );
CGSize imageSize = CVImageBufferGetEncodedSize( imageBuffer );
// also in the 'mediaSpecific' dict of the sampleBuffer
NSLog( @"frame captured at %.fx%.f", imageSize.width, imageSize.height );
}
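The callback above only logs the frame size; the "send it to the server" part is up to you. Here's a minimal sketch of one way to do it, assuming iOS 5+ (for Core Image) and a hypothetical http://example.com/upload endpoint. One JPEG and one HTTP request per frame is wasteful, but it shows the idea:

#import <CoreImage/CoreImage.h>

// Inside captureOutput:didOutputSampleBuffer:fromConnection:
CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
CIImage *ciImage = [CIImage imageWithCVPixelBuffer:pixelBuffer];

// Render the frame to a CGImage, then compress it to JPEG
// (cache the CIContext in real code -- creating one per frame is expensive)
CIContext *ciContext = [CIContext contextWithOptions:nil];
CGImageRef cgImage = [ciContext createCGImage:ciImage fromRect:[ciImage extent]];
NSData *jpegData = UIImageJPEGRepresentation([UIImage imageWithCGImage:cgImage], 0.5);
CGImageRelease(cgImage);

// POST the frame to the (hypothetical) upload endpoint
NSMutableURLRequest *request =
    [NSMutableURLRequest requestWithURL:[NSURL URLWithString:@"http://example.com/upload"]];
request.HTTPMethod = @"POST";
request.HTTPBody = jpegData;
[NSURLConnection sendAsynchronousRequest:request
                                   queue:[NSOperationQueue mainQueue]
                       completionHandler:^(NSURLResponse *resp, NSData *data, NSError *err) {}];

In practice you'd cache the CIContext and keep a longer-lived connection; the chunked approach in the update below is far more bandwidth-efficient.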
EDIT/UPDATE
Several people have asked how to do this without sending the frames to the server one by one. The answer is complex...
Basically, in the didOutputSampleBuffer function above, you add the samples to an AVAssetWriter. I actually had three asset writers active at a time (past, present, and future), managed on different threads. The past writer is in the process of closing its movie file and uploading it. The current writer is receiving the sample buffers from the camera. The future writer is in the process of opening a new movie file and preparing it for data. Every 5 seconds, I set past = current; current = future and restart the sequence.
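Here's a rough sketch of that rotation (ARC-style; makeWriterForURL:, nextChunkURL, and uploadFileAtURL: are hypothetical helpers, and the writer properties are assumed ivars, not code from my app):

- (AVAssetWriter *)makeWriterForURL:(NSURL *)fileURL {
    NSError *error = nil;
    AVAssetWriter *writer = [[AVAssetWriter alloc] initWithURL:fileURL
                                                      fileType:AVFileTypeQuickTimeMovie
                                                         error:&error];
    NSDictionary *settings = @{ AVVideoCodecKey  : AVVideoCodecH264,
                                AVVideoWidthKey  : @640,
                                AVVideoHeightKey : @480 };
    AVAssetWriterInput *input =
        [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                           outputSettings:settings];
    input.expectsMediaDataInRealTime = YES;   // don't stall the capture pipeline
    [writer addInput:input];
    return writer;
}

// In captureOutput:didOutputSampleBuffer:fromConnection:, feed the *current* writer:
AVAssetWriterInput *input = self.currentWriter.inputs.firstObject;
if (input.isReadyForMoreMediaData) {
    [input appendSampleBuffer:sampleBuffer];
}

// Called every 5 seconds (e.g., from a dispatch timer):
- (void)rotateWriters {
    AVAssetWriter *finished = self.pastWriter;
    [finished finishWritingWithCompletionHandler:^{
        [self uploadFileAtURL:finished.outputURL];      // hand the chunk to the uploader
    }];
    self.pastWriter    = self.currentWriter;            // past    = current
    self.currentWriter = self.futureWriter;             // current = future
    self.futureWriter  = [self makeWriterForURL:[self nextChunkURL]]; // prepare next file
    [self.futureWriter startWriting];
    // remember to call startSessionAtSourceTime: with the next buffer's
    // timestamp before appending to the new writer
}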
This then uploads video in 5-second chunks to the server. You can stitch the videos together with ffmpeg if you want, or transcode them into MPEG-2 transport streams for HTTP Live Streaming. The video data itself is H.264-encoded by the asset writer, so transcoding merely changes the file's header format.
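For example, something like this (file names are placeholders, and exact flags vary between ffmpeg versions):

ffmpeg -i chunk0.mov -c copy -bsf:v h264_mp4toannexb -f mpegts chunk0.ts
ffmpeg -i "concat:chunk0.ts|chunk1.ts|chunk2.ts" -c copy stitched.ts

The -c copy keeps the H.264 bitstream as-is, so no re-encoding takes place.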
I'm now having a problem displaying a view/label/button on top of the movie player video. In Apple's MoviePlayer sample project, you can add a view on top of the video with this code:
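Roughly, the sample's snippet (reconstructed here, since it didn't paste in above) adds the overlay to the movie player's own window:

// The movie player creates its own UIWindow, so the overlay goes
// into the second window in the application's window list
NSArray *windows = [[UIApplication sharedApplication] windows];
if ([windows count] > 1) {
    UIWindow *moviePlayerWindow = [windows objectAtIndex:1];
    [moviePlayerWindow addSubview:self.overlayView];
    [moviePlayerWindow bringSubviewToFront:self.overlayView];
}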
The overlayView is a view that contains the requested label and button. However, when I apply the same code to a movie URL pointing at an .m3u8 file, e.g. http://devimages.apple.com/iphone/samples/bipbop/gear1/prog_index.m3u8, it doesn't display the overlay view.
The number of windows returned is just 1, as opposed to 2 in the previous example, even though the movie plays. When I checked the URL, movieURL returned null, but it still plays the streamed content. I tried removing the check for the window count being greater than 1, but it still didn't display the overlay view.
Does anybody know a way around this? Thanks