Using the MPU6050 for Quadcopter Orientation Control

Hello everybody, I've been working on an Arduino-based quadcopter for the last few months. A key component in making the quadcopter balance is an orientation sensor that periodically reports yaw/pitch/roll, which can be used as input to a PID controller that adjusts the RPM of the quadcopter motors. The InvenSense MPU6050 is a popular and cheap sensor that has an accelerometer, a gyroscope, and a temperature sensor on board. The accelerometer reports the components of the acceleration vector along the sensor's local coordinate axes. If the sensor itself is not accelerating, the only force acting on it is gravity. Since the gravity vector points straight down, its components along the sensor's local axes can be used to estimate the orientation of the sensor (in the absence of other accelerating forces).

The gyroscope reports the angular velocity about each axis, which can be integrated to obtain the current orientation. As has been reported in numerous articles on the web, gyro measurements are subject to drift, while accelerometer measurements are noisy. To obtain good orientation estimates, one must combine the results of the gyroscope and the accelerometer. This is typically done using a complementary filter or a Kalman filter. The complementary filter is much simpler to implement and produces results that are very close to those of the Kalman filter.

The MPU6050 also has an on-board Digital Motion Processor (DMP) that performs sensor fusion on-chip (using a proprietary, undocumented algorithm) and reports the orientation in yaw/pitch/roll or quaternion format. (Below, I refer to this on-board fusion output simply as the MPU output.)

When I started working on the quadcopter, I read a lot of articles about using the MPU6050 to determine orientation. Some of the articles used the raw accelerometer and gyroscope readings and performed their own sensor fusion. Others used the data from the MPU directly. I was not sure whether one approach was better than the other. In the last couple of weeks, I've been experimenting with both of these methods. I have developed an oscilloscope application using Qt to plot the data from the MPU and the results of the sensor fusion. In this article, I'll report my results and also provide a link to the code for my oscilloscope application and the Arduino code.

Performing Sensor Fusion:

Sensor fusion involves combining the data obtained from the accelerometer and the gyroscope. The accelerometer reports the components of the acceleration vector along the axes of the coordinate system attached to the sensor. The angles of rotation about the x and y axes can be obtained from these components. The accelerometer can't be used to obtain the angle about the z axis (yaw). There is some confusion online about the trigonometry that should be used to obtain the pitch and roll. Some people recommend simply dividing the x and y components by the z component. I found that this doesn't produce results that agree with the pitch/roll reported by the MPU. The formulas that worked for me are:

angle_x = atan(ay / sqrt(ax^2 + az^2)) * RAD_TO_DEGREES

angle_y = atan(ax / sqrt(ay^2 + az^2)) * RAD_TO_DEGREES

The accelerometer produces instantaneous estimates of the pitch/roll, but these estimates are subject to a lot of noise. Therefore, the pitch/roll angles from the accelerometer can’t be used directly without some filtering.
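To make the trigonometry concrete, here is a minimal, self-contained sketch (plain C++ rather than Arduino code; the function names are my own) of the accelerometer angle computation:

```cpp
#include <cmath>

const double RAD_TO_DEGREES = 57.29577951308232; // 180 / pi

// Angle about the x axis (roll) from one accelerometer sample.
// Only the ratios of ax, ay, az matter, so any common scale
// factor on the raw readings cancels out.
double accelAngleX(double ax, double ay, double az) {
    return std::atan2(ay, std::sqrt(ax * ax + az * az)) * RAD_TO_DEGREES;
}

// Angle about the y axis (pitch).
double accelAngleY(double ax, double ay, double az) {
    return std::atan2(ax, std::sqrt(ay * ay + az * az)) * RAD_TO_DEGREES;
}
```

With the sensor flat (ax = ay = 0, az = 1 g) both angles are zero; tilting 45 degrees about the x axis gives ay = az, so accelAngleX returns 45.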

[Figures: accelerometer-based pitch/roll estimates, showing the noise in the instantaneous measurements]

The gyroscope measures angular velocity in degrees per second. To obtain the angles of rotation about the x/y/z axes, the angular velocity measurements need to be integrated over time. This is easily done by multiplying the current angular velocity by the time elapsed since the last measurement and accumulating the product. While gyro measurements are very stable and nearly noise free, they have a tendency to drift over time. This can be seen in the figure below: over time, the accumulated gyro measurements drift farther away from the measurements reported by the MPU.
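The accumulation step amounts to one line of code per axis; a minimal sketch (the function name is my own):

```cpp
// Integrate a gyro rate (deg/s) into an angle (deg).
// dt is the time elapsed since the previous sample, in seconds.
// Note that any constant bias in 'rate' gets integrated too,
// which is exactly why the accumulated angle drifts over time.
double integrateGyro(double angle, double rate, double dt) {
    return angle + rate * dt;
}
```

For example, 100 samples at a constant 10 deg/s with dt = 0.01 s accumulate to 10 degrees; by the same arithmetic, a mere 0.1 deg/s bias drifts 6 degrees per minute.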

[Figure: accumulated gyro angle drifting away from the orientation reported by the MPU]

Fortunately, the accelerometer and gyroscope measurements can be combined using a Kalman filter or a complementary filter. There are numerous excellent tutorials about the Kalman filter available on the web, so refer to any of those. The complementary filter is much simpler and involves a linear combination of the accelerometer angles and the accumulated gyroscope data, with most of the weight placed on the gyroscope measurements. This tends to produce measurements that closely follow the ones produced by the on-board MPU. The yaw measurements don't agree, as the accelerometer cannot be used to calculate the rotation about the z axis.
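The complementary filter itself is a single weighted sum per axis. A sketch, assuming a typical gyro weight of 0.98 (the exact value is a tuning choice, not something the sensor documentation specifies):

```cpp
#include <cmath>

// Complementary filter: trust the integrated gyro in the short
// term and the noisy but drift-free accelerometer angle in the
// long term. alpha close to 1 puts most weight on the gyro.
double complementaryFilter(double prevAngle, double gyroRate, double dt,
                           double accelAngle, double alpha = 0.98) {
    return alpha * (prevAngle + gyroRate * dt) + (1.0 - alpha) * accelAngle;
}
```

With gyroRate held at zero, repeated calls decay prevAngle toward accelAngle, which is how the accelerometer term slowly cancels out accumulated gyro drift.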

[Figure: complementary filter output closely tracking the MPU's on-board fusion output]

Note that the gyroscope and accelerometer measurements must be scaled before being used. This is not so important for the accelerometer measurements, since they appear only in ratios where the scale factor cancels out; for the gyroscope measurements, however, using the correct scale is important. The scale factor can be obtained as follows:

uint8_t READ_FS_SEL = mpu.getFullScaleGyroRange();

GYRO_FACTOR = 131.0 / (1 << READ_FS_SEL); // 131, 65.5, 32.75 or 16.375 LSB/(deg/s)

The sensitivity for each full-scale setting can be seen in the gyroscope specification table on page 12 of the MPU-6000/MPU-6050 product specification available from InvenSense.
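To make the scaling concrete, the sensitivity for each FS_SEL setting works out to 131 / 2^FS_SEL LSB per degree/second (the datasheet rounds the last two values to 32.8 and 16.4):

```cpp
// Gyro sensitivity in LSB/(deg/s) for each full-scale setting:
// FS_SEL 0 -> +/-250  deg/s -> 131
// FS_SEL 1 -> +/-500  deg/s -> 65.5
// FS_SEL 2 -> +/-1000 deg/s -> 32.75 (datasheet: 32.8)
// FS_SEL 3 -> +/-2000 deg/s -> 16.375 (datasheet: 16.4)
double gyroSensitivity(int fs_sel) {
    return 131.0 / (1 << fs_sel);
}
```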

Calculating the Sensor Offsets

The I2Cdev library provides an API to get and set the offsets for the accelerometer and the gyroscope. The MPU6050 manual from InvenSense doesn’t mention offsets anywhere, so it’s a bit unclear how to calculate these offsets or even how important it is to set them. I’m calculating the offsets by averaging the gyro and accelerometer readings for 1000ms when the sensor is placed at rest on a flat surface. Then, I subtract the offset values thus calculated from the readings obtained when the application is running. In my experience, using or not using the offsets doesn’t affect the results.
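The averaging itself is trivial; a minimal sketch (plain C++, the function name is my own) of the offset estimate described above:

```cpp
// Estimate a per-axis offset by averaging n samples taken while
// the sensor sits at rest on a flat surface. The same average is
// then subtracted from every live reading of that axis.
double estimateOffset(const double* samples, int n) {
    double sum = 0.0;
    for (int i = 0; i < n; ++i) sum += samples[i];
    return sum / n;
}
```

On the Arduino itself you would accumulate readings in a loop for the calibration window instead of storing them in an array, but the arithmetic is the same.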

Conclusion:

It’s easiest to just use the yaw/pitch/roll data provided by the MPU6050’s on-board MPU instead of doing your own sensor fusion. The interrupts arrive frequently enough (about 100 Hz in my setup) for effective quadcopter orientation control. It was definitely instructive to try the sensor fusion to understand how the accelerometer/gyroscope readings behave and how they can be combined; however, this doesn’t provide any benefit over using the MPU output for quadcopter control. Using the MPU output will also save you the time of doing the sensor fusion in software.

My Code:

There are two parts to the code: the code that is uploaded to the Arduino chip and the application code used to generate the oscilloscope graphs. Let’s first talk about the Arduino code. When I first started experimenting with the Arduino system, I used the Arduino IDE to edit and build the sketches. However, the Arduino IDE is only suitable for small projects. As far as I know, there is no way to create projects containing multiple source files. Moreover, every time you make any change to your sketch, the Arduino IDE recompiles and links your sketch and all needed Arduino core files to generate the final .hex that is uploaded to the microcontroller. This is redundant, as the core files could be compiled into a library which is linked against your sketch, thereby eliminating the need to recompile the whole core SDK. Lastly, the IDE is limited in its editing capabilities, and many editor features such as find and replace and automatic code formatting that I’m accustomed to as a Visual Studio user are missing. However, the Arduino IDE does provide a very convenient way to experiment with the Arduino sample sketches without having to create projects for each sketch.

I spent some time trying to figure out if there was any way to configure Visual Studio to compile Arduino code. After trying long and hard, I finally concluded that there is no way to change the default MSVC toolchain that Visual Studio uses to the GCC-based toolchain used by Arduino. If anyone has done this successfully, please let me know.

The next option was to use Eclipse. Fortunately, there is a way to set up Eclipse to use the WinAVR toolchain and compile your sketches (which are basically C++ code). This setup is described here:

http://playground.arduino.cc/Code/Eclipse

In my own projects, I compile the core Arduino code into a library which is linked into my application. Since I rarely modify the core Arduino files, this saves me from having to recompile the core when I make a change to my application code. The WinAVR Eclipse plug-in essentially generates a makefile that can be run from the command prompt to build your .hex and push it to the hardware. You’ll have to specify the baud rate and COM port in the AVRDUDE settings for the upload to work correctly.

For communicating with the MPU6050, I’m using the excellent I2Cdev library developed by Jeff Rowberg.  

For my oscilloscope code, I’m using the Qwt framework, which provides 2D plotting functionality. This can be downloaded from:

http://sourceforge.net/projects/qwt/

I’ve modified the Oscilloscope sample to add the yaw/pitch/roll plots and an extra curve to each plot. To use my sample, you must install Qt and Qwt and set the QTDIR environment variable. In my application, Qwt is installed in qt/qwt-6.1.1. If you install Qwt in a different location, you’ll have to change the paths to the Qwt headers/libs in the Visual Studio project settings.

My code is located at: https://github.com/ankur6ue/Embedded

The Eclipse project is in: Projects/Eclipse/MPU6050Read

The Oscilloscope project is in: Projects/Win32/QtProjects/Oscilloscope

Let me know if you have trouble using the code and I’ll post more info. 


Replies

  •   Interesting read.  The best way to learn is to play with the sensors and experiment with code.  That's how I started four years ago.  I have written my own quadcopter control software from scratch including sensor libraries and complementary filters for combining gyro, accel, mag, gps data.  Keep playing with it and learning.

      Using the MPU data at 100Hz should work for keeping a quad stable (but it might not perform quite as well as if it was running faster).  I prefer to do the fusion on the arduino at 400Hz, but it requires avoiding trig functions and floating point divides religiously.

    Good luck and have fun.  Phillip

  • Nice, but maybe you should try those two fusion methods while adding some linear acceleration to see how they compare.

    I suspect there may be a threshold in the MPU6050's on-board fusion for filtering out measurements contaminated by large linear accelerations; you can also do this in your own software, where it is obviously more flexible. So the final fusion results may differ significantly in some uncommon conditions, such as high-frequency oscillation or sustained linear acceleration.

    Another interesting point about doing the fusion yourself is that you can fuse as many measurements as you like, which cannot be done with the on-board hardware. Think about adding an optical flow sensor or GPS.

  • Hi Thomas, thanks. I wasn't aware of this Visual Micro plug-in for Visual Studio. I did do quite a bit of research on trying to use Visual Studio to build for Arduino using the GCC toolchain and wasn't able to get Visual Studio to not use the MSVC tool chain. I'll see if this plugin addresses the issue. Atmel Studio 6.2 was the first IDE I used, and while it works, it is painfully slow. Opening up the property dialog sometimes takes 10 seconds. Eclipse is much better in terms of responsiveness.

    I'm using the Arduino Uno, which uses the ATmega328P running at 16 MHz. The sensor interrupts are actually arriving at 100 Hz instead of 1 kHz, which does leave enough time to run the PID controller and even do some Serial.println calls, as long as the text string is not super long. For a reasonably heavy body like the quadcopter (~10 pounds) with a good amount of inertia, I tend to think that sending control inputs at 100 Hz should be sufficient? Can you point me to any information about the appropriate frequency for a PID controller loop?

    For the real-time oscilloscope plot, I'm using a producer/consumer model, with the producer thread sampling the serial port at an adjustable frequency (about 50 Hz works fine). The size of a packet read depends on the serial port sampling frequency. At a sampling frequency of 50 Hz, it's about 90 bytes, which is big enough to contain the MPU data as well as the sensor fusion data.

    As far as the title of the post, I think the current title is ok. I just wanted to share the results of my experiments with the community. Some will find it more useful than others.
