IMU Readings and Sensor Fusion

I’ve been having a look at ways to read the raw sensor data from the IMU and translate it into an absolute orientation estimate.

Here’s the problem.

Accelerometers measure acceleration, so when the board accelerates, it knows. Gravity shows up as a constant acceleration (9.8 m/s², or 1 g, towards the earth), so by combining the X, Y and Z components of a stationary accelerometer's reading we can work out the direction of gravity, and hence the orientation of the board.
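For example, assuming the three axis readings have already been scaled to g (the helper names here are mine, and the signs depend on how the sensor is mounted), the tilt angles fall out of a couple of atan2 calls. Note that yaw can't be recovered this way: rotating about the gravity vector doesn't change the reading at all, which is where the magnetometer comes in later.

#include <math.h>

// Pitch: angle between the board's X axis and the horizontal plane.
float pitchFromAccel(float ax, float ay, float az) {
  return atan2f(-ax, sqrtf(ay * ay + az * az));
}

// Roll: rotation about the board's X axis.
float rollFromAccel(float ay, float az) {
  return atan2f(ay, az);
}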

But as soon as you start moving the accelerometer – as will, of course, happen a lot on a flying quadcopter – it picks up the accelerations of that movement on top of gravity (and vibration affects it too, being just very fast, small movements). The result is that the accelerometer's output is very noisy, and only its long-term average gives an accurate picture of orientation.

Gyroscopes have the opposite problem. They measure rotational velocity, so no amount of lateral movement affects them. However, to get from rotational velocity to rotational position, i.e. orientation, we have to integrate the readings over time, and since every small bias in the rate reading gets accumulated too, the estimated orientation drifts quite badly.
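The integration step itself is tiny; the problem lies entirely in what it accumulates. A sketch of it, with gyroRateX() a hypothetical helper returning the angular velocity about X in degrees per second:

float gyroRateX() { return 0.0f; }  // hypothetical: would read the gyro's X rate

float angleX = 0.0f;           // integrated orientation about X, in degrees
unsigned long lastMicros = 0;

void updateGyroAngle() {
  unsigned long now = micros();
  float dt = (now - lastMicros) / 1.0e6f;  // elapsed time in seconds
  lastMicros = now;

  // Rotational position is the integral of rotational velocity; any small
  // bias in the rate reading accumulates here too, which is the drift.
  angleX += gyroRateX() * dt;
}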

I found a very nice graphic on pieter-jan.com's blog to illustrate this:

Accelerometer data vs. gyro data

Now there exists a well-known algorithm called the Kalman filter which can be adapted to this situation: it combines several sensor readings and, knowing their noise characteristics, outputs a single estimated value. However, it is extremely complex, and I am not going to have time to get my head around the maths, let alone implement it.

A more feasible option is the complementary filter, a simple equation that takes the short-term changes from the gyro and the long-term average from the accelerometer and combines them into one more reliable reading. It usually goes something like this (again from Jan's site):
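In its usual form it's just a weighted sum, angle = α × (angle + gyro_rate × dt) + (1 − α) × accel_angle, with α close to 1 so the gyro dominates in the short term and the accelerometer slowly corrects the drift. A minimal sketch (variable names mine; α = 0.98 is a typical starting point, not a tuned value):

const float alpha = 0.98f;  // how much to trust the gyro; needs tuning

// angle: current estimate; gyroRate: gyro reading in deg/s;
// accelAngle: angle derived from the accelerometer; dt: seconds elapsed.
float complementaryFilter(float angle, float gyroRate,
                          float accelAngle, float dt) {
  // Short term: integrate the gyro. Long term: pull the estimate towards
  // the accelerometer's angle so the gyro drift can't accumulate.
  return alpha * (angle + gyroRate * dt) + (1.0f - alpha) * accelAngle;
}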

Now I like the look of this very much, and it wouldn't be hard at all to implement. However, it would still take quite a lot of fine-tuning to make it reliable, and I have a magnetometer (compass) that the filter wouldn't use at all and that would need integrating separately. Perhaps it's something I'll do later when I have more time, but for now I'm in a bit of a hurry, so I had a look for libraries or existing code that I could use instead. Building on existing code will be more efficient in the end, and the project has to be built on something.

Pololu, the manufacturer of my IMU unit, provides Arduino libraries to read the raw sensor data from it, but they don't have any sensor fusion capability. However, Pololu also reference a sketch they put together that calculates an estimated orientation in X, Y and Z from all three sensors: the accelerometer, gyroscope and magnetometer. It's quite complex, so I'm not going to get my head into how it works right now, but the fact is that it does work, and I'm going to use it for now.
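For reference, a minimal sketch of reading the raw values, assuming a MinIMU-9-style unit and Pololu's LSM303 and L3G Arduino libraries (the output format is my own choice):

#include <Wire.h>
#include <LSM303.h>
#include <L3G.h>

LSM303 compass;  // accelerometer + magnetometer
L3G gyro;        // gyroscope

void setup() {
  Serial.begin(9600);
  Wire.begin();
  compass.init();
  compass.enableDefault();
  gyro.init();
  gyro.enableDefault();
}

void loop() {
  compass.read();  // also fills in compass.m, the magnetometer reading
  gyro.read();
  Serial.print("A: ");
  Serial.print(compass.a.x); Serial.print(" ");
  Serial.print(compass.a.y); Serial.print(" ");
  Serial.print(compass.a.z);
  Serial.print("  G: ");
  Serial.print(gyro.g.x); Serial.print(" ");
  Serial.print(gyro.g.y); Serial.print(" ");
  Serial.println(gyro.g.z);
  delay(100);
}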

As the sketch comes with a basic Python script to visualise the output orientation, I plugged the IMU into the Arduino, uploaded the sketch and had a look.

IMU plugged into the Arduino

Worked perfectly.
