Transmitting a video stream through a microcontroller

bluetooth, microcontroller, video, video-transmitter

I'm working on a project that currently works well, but this year we want to expand it by collecting data from it, independent of its current functions. We have set up an iPad app to control an Arduino over Bluetooth. The next step is transmitting video over Bluetooth to the iPad.

With this question I would like to focus on how a microcontroller can send a video stream through Bluetooth. I can't seem to find any way to even start this project.

How do you interface the camera with the microcontroller?

How do you then send that stream over Bluetooth?

Best Answer

Unfortunately, what you'll quickly find in my experience is that video is obnoxiously difficult to manipulate without a lot of CPU power. Let's start from the beginning: how much video do you want?

There are a lot of choices here, but start small. Let's say that, for simplicity's sake, you wanted a 640x480 frame of 8-bit black-and-white video at 24 frames per second.

That's 640 * 480 = 307,200 pixels per frame. At 8 bits (1 byte) per pixel, that's 307,200 bytes per frame, and at 24 frames per second that's 7,372,800 bytes per second, or roughly 7.37 megabytes per second.

So that is a baseline data throughput for a camera outputting raw frames, and that's not including sound or color. Now you have a few paths you can take: you can start encoding the video stream, or you can get a lot of bandwidth. I don't know the throughput of Bluetooth, so I can't help you there.
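To make the numbers concrete, here is a minimal back-of-the-envelope check. The link rate is purely an assumption (ASSUMED_LINK_BPS is a placeholder, not a measured Bluetooth figure); plug in whatever your module and profile actually sustain.

```cpp
// Rough bandwidth check for 640x480, 8-bit grayscale video at 24 fps.
// ASSUMED_LINK_BPS is a placeholder for whatever your Bluetooth link
// actually delivers -- measure it, don't trust this number.
#include <cstdio>

int main() {
    const double width = 640.0, height = 480.0;
    const double bytes_per_pixel = 1.0;   // 8-bit black and white
    const double fps = 24.0;

    const double frame_bytes = width * height * bytes_per_pixel;  // 307,200
    const double raw_bytes_per_sec = frame_bytes * fps;           // 7,372,800
    const double raw_bits_per_sec = raw_bytes_per_sec * 8.0;

    const double ASSUMED_LINK_BPS = 2.0e6;  // assumption: ~2 Mbit/s usable link

    std::printf("Raw stream: %.2f MB/s (%.1f Mbit/s)\n",
                raw_bytes_per_sec / 1e6, raw_bits_per_sec / 1e6);
    std::printf("Compression needed to fit the assumed link: about %.0f:1\n",
                raw_bits_per_sec / ASSUMED_LINK_BPS);
    return 0;
}
```

Even with a link that fast, the raw stream would need something on the order of 30:1 compression before it fits, which is exactly where the encoding problem comes in.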

Encoding unfortunately takes one of three things: a lot of processing power, specialized hardware, or (possibly) FPGA knowledge that I also don't have. Encoding will reduce your bandwidth concerns, but at a pretty hefty cost. You would need to research compression to figure out whether you could even get the stream over your interface and still have the Arduino do useful work.

If you want to connect a camera to a microcontroller, you'll find that even simply taking stills can be a pain, depending on how the frames are delivered. There was a SparkFun camera that output JPEG frames, but it didn't take a fixed amount of time to encode them; it would suddenly start spitting out the encoded frame as fast as it could. Since the frames were too big to fit in the microcontroller's RAM, the microcontroller had to spend all its time pulling in the frame and pushing it back out over whatever interface was used to transmit it.
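To illustrate that "frame bigger than RAM" situation, here is a minimal sketch of the relay loop, assuming a board with two hardware UARTs (e.g. an Arduino Mega) with the camera on Serial1 and a Bluetooth SPP module on Serial2. The baud rates, the camera command protocol, and the end-of-frame test are all placeholders; a real module such as the SparkFun/LinkSprite JPEG camera has its own command set and reports the frame length itself.

```cpp
// Hypothetical relay loop: pull a JPEG frame from a serial camera in small
// chunks and forward each chunk over a Bluetooth serial module, because the
// whole frame will not fit in the microcontroller's RAM.
// Assumes an Arduino Mega-style board: Serial1 = camera, Serial2 = Bluetooth.

const size_t CHUNK_SIZE = 64;        // small buffer that easily fits in RAM
uint8_t chunk[CHUNK_SIZE];

// Placeholder: with a real camera you would watch for the JPEG end-of-image
// marker (0xFF 0xD9) or use the frame length the camera's protocol reports.
bool frameHasMoreData() {
    return false;
}

void setup() {
    Serial1.begin(38400);            // camera UART (assumed baud rate)
    Serial2.begin(115200);           // Bluetooth SPP module (assumed baud rate)
}

void loop() {
    // ...send the camera's "take picture" / "read data" commands here...

    size_t n = 0;
    while (frameHasMoreData()) {
        if (Serial1.available()) {
            chunk[n++] = Serial1.read();   // grab bytes as the camera emits them
            if (n == CHUNK_SIZE) {
                Serial2.write(chunk, n);   // forward the chunk over Bluetooth
                n = 0;
            }
        }
    }
    if (n > 0) {
        Serial2.write(chunk, n);           // flush the tail of the frame
    }
}
```

Notice that the CPU does nothing else while a frame is in flight, which is the point being made above: at these data rates the microcontroller ends up being a full-time byte shuttle.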

tl;dr: establish what you actually need, and then figure out whether using the microcontroller as the go-between is the best choice.

Good luck! I hope that helps.
