How to do IMU and camera "sensor fusion" tracking

camera · imu · sensor

I have some 50 ms latency cameras on hand and an 800 Hz gyro+accelerometer+magnetometer IMU. I would like to know how exactly I should fuse such an IMU and camera to correct the positional drift of the IMU's position estimate. I'm not able to find many resources online.

The reason I don't want to go with just the camera is its 50 ms latency.

The optical markers for the camera can be LEDs, ORB-SLAM features, or the ArUco markers I currently use, which add another few milliseconds of latency to the camera tracking.

Maybe there is even an existing library or documented implementation I can use?

Thank you.

Best Answer

Begin by finding a way to convert the SLAM or ArUco data into an absolute position in some coordinate system. This is the most complex part, and I have no idea how to do it, as I have never worked with either.
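
For the ArUco case, a minimal sketch of the idea using OpenCV's ArUco module is below. It assumes a calibrated camera (camera_matrix and dist_coeffs from cv2.calibrateCamera), a single fixed marker of known size defining the world origin, and the pre-4.7 ArUco API (function names changed in OpenCV 4.7+). MARKER_LENGTH_M is an assumed example value.

```python
import cv2
import numpy as np

MARKER_LENGTH_M = 0.10  # marker side length in metres (example value)

dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
params = cv2.aruco.DetectorParameters_create()  # cv2.aruco.DetectorParameters() on OpenCV >= 4.7

def camera_position_in_marker_frame(frame, camera_matrix, dist_coeffs):
    """Return the camera's 3D position in the marker's frame, or None if no marker seen."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    corners, ids, _ = cv2.aruco.detectMarkers(gray, dictionary, parameters=params)
    if ids is None:
        return None
    # Returns the marker pose expressed in the camera frame
    # (on OpenCV 3.x this call returns only rvecs and tvecs).
    rvecs, tvecs, _ = cv2.aruco.estimatePoseSingleMarkers(
        corners, MARKER_LENGTH_M, camera_matrix, dist_coeffs)
    # Invert the transform: if X_cam = R @ X_marker + t, then the camera
    # origin in the marker (world) frame is -R^T @ t.
    R, _ = cv2.Rodrigues(rvecs[0])
    return (-R.T @ tvecs[0].reshape(3, 1)).ravel()
```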

Then apply a Kalman filter to the accelerometer and position data, in exactly the same way as it is done for a gyro and accelerometer (there are plenty of examples of that on the web, including hundreds of videos on YouTube).
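
Here is a minimal sketch of such a filter for one axis, assuming the accelerometer readings have already been gravity-compensated and rotated into the world frame. The state is [position, velocity], the accelerometer drives the prediction, and the camera position is the measurement. The noise parameters q_accel and r_cam are placeholders you would tune for your sensors.

```python
import numpy as np

class AxisKalman:
    """Per-axis linear Kalman filter: accel as control input, camera position as measurement."""

    def __init__(self, q_accel=0.5, r_cam=0.02):
        self.x = np.zeros((2, 1))          # state: [position; velocity]
        self.P = np.eye(2)                 # state covariance
        self.q_accel = q_accel             # accel noise std dev (m/s^2), assumed
        self.R = np.array([[r_cam**2]])    # camera position noise variance, assumed

    def predict(self, accel, dt):
        """Run at the IMU rate (e.g. 800 Hz) with a world-frame acceleration sample."""
        F = np.array([[1.0, dt], [0.0, 1.0]])
        B = np.array([[0.5 * dt**2], [dt]])
        self.x = F @ self.x + B * accel
        Q = (self.q_accel**2) * (B @ B.T)  # process noise driven by accel noise
        self.P = F @ self.P @ F.T + Q

    def update(self, cam_pos):
        """Run whenever a camera position fix arrives."""
        H = np.array([[1.0, 0.0]])
        y = cam_pos - (H @ self.x)                  # innovation
        S = H @ self.P @ H.T + self.R
        K = self.P @ H.T @ np.linalg.inv(S)         # Kalman gain
        self.x = self.x + K @ y
        self.P = (np.eye(2) - K @ H) @ self.P
```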

The idea is to integrate the accelerometer output over time twice: once to get velocity, and again to get position. The result is a fast position estimate with severe drift. Then you apply the noisy and slow absolute position from the camera to correct that drift.
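
Tying the two sketches above together (again an assumption-laden sketch, not a drop-in implementation): run the predict step at the IMU rate and the update step whenever a camera fix arrives. One caveat: because the camera frame is about 50 ms old, the measurement really corresponds to a past state; a rigorous filter buffers recent states and replays the predictions after each delayed update, while a simpler one just accepts the small error.

```python
kf_x = AxisKalman()          # one filter per axis (class sketched above)
DT_IMU = 1.0 / 800.0

def on_imu_sample(world_accel_x):
    # 800 Hz path: world_accel_x must be gravity-compensated and rotated
    # into the world frame using the IMU's orientation estimate.
    kf_x.predict(world_accel_x, DT_IMU)

def on_camera_fix(cam_x):
    # Slow, ~50 ms late path: absolute x position from the ArUco sketch.
    kf_x.update(cam_x)

# kf_x.x[0, 0] is the low-latency, drift-corrected position at any moment.
```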

This is the same idea as integrating gyro data to get a fast angle estimate with drift, and then correcting that drift with the accelerometer/magnetometer inputs.
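
For intuition, here is that orientation analogue in its simplest complementary-filter form (not the Kalman version); the 0.98 blend factor is an arbitrary example value.

```python
import math

def complementary_pitch(pitch, gyro_rate, ax, az, dt, alpha=0.98):
    # Fast path: integrate the gyro rate (low latency, drifts over time).
    gyro_pitch = pitch + gyro_rate * dt
    # Slow path: absolute tilt from the gravity vector (noisy, no drift).
    accel_pitch = math.atan2(-ax, az)
    # Blend: trust the gyro short-term, the accelerometer long-term.
    return alpha * gyro_pitch + (1 - alpha) * accel_pitch
```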