# Can acceleration in global coordinates be calculated and exported via DCM?

For a rover application...

I am modifying the 1.9 firmware to send out acceleration in global coordinates, but I am not getting satisfactory results.

My approach is to multiply the acceleration vector (in the body frame) by the DCM, which should transform the vector into the global frame.

```cpp
Accel_x = AvgX*DCM_Matrix[0][0] + AvgY*DCM_Matrix[0][1] + AvgZ*DCM_Matrix[0][2];
Accel_y = AvgX*DCM_Matrix[1][0] + AvgY*DCM_Matrix[1][1] + AvgZ*DCM_Matrix[1][2];
Accel_z = AvgX*DCM_Matrix[2][0] + AvgY*DCM_Matrix[2][1] + AvgZ*DCM_Matrix[2][2];
```

The output does not behave correctly.  If the vector were transformed into the global frame, lifting one side of the rover should have only a momentary effect, but the result is a persistent value.  This makes me think that gravity is still affecting the vector, and that my attempt to transform it into the global frame has failed.

Any ideas?


### Replies to This Discussion

I think I can help, but I don't understand exactly what you are trying to do. In version 2.x there are libraries to do all kinds of vector/matrix operations.

Here's how it's done now.

```cpp
// multiplication by a vector
template <typename T>
Vector3<T> Matrix3<T>::operator *(const Vector3<T> &v) const
{
    return Vector3<T>(a.x * v.x + a.y * v.y + a.z * v.z,
                      b.x * v.x + b.y * v.y + b.z * v.z,
                      c.x * v.x + c.y * v.y + c.z * v.z);
}
```

Looks like the same thing you have. It's beyond me how all that template<T> stuff works, but the math looks the same; it's just rows a, b, c and components x, y, z instead of [row][column] indices. It must be your math that's wrong.
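To see that the template version and the index version compute the same thing, here is a minimal self-contained sketch of a row-wise matrix-vector multiply. The `Vec3`/`Mat3` names are illustrative only, not the actual ArduPilot `Vector3`/`Matrix3` types:

```cpp
#include <cassert>
#include <cmath>

// Illustrative stand-ins for the library types: a 3-vector and a 3x3
// matrix stored as three row vectors a, b, c.
struct Vec3 { double x, y, z; };

struct Mat3 {
    Vec3 a, b, c;   // rows
    // Each output component is the dot product of one row with v,
    // exactly like the DCM_Matrix[i][0..2] expansion in the question.
    Vec3 operator*(const Vec3 &v) const {
        return Vec3{ a.x * v.x + a.y * v.y + a.z * v.z,
                     b.x * v.x + b.y * v.y + b.z * v.z,
                     c.x * v.x + c.y * v.y + c.z * v.z };
    }
};
```

Multiplying by the identity matrix returns the vector unchanged, which is a quick way to convince yourself the row/column ordering is right.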

You are likely correct about the libraries.  I have only had the IMU for about two weeks, and due to the lack of documentation I am discovering capabilities daily.

My ultimate goal is to add odometry capability to the IMU.  This should be possible if I can output an acceleration vector in global coordinates, then integrate twice to get displacement.

They have tried that on multicopters for position hold and I believe it does not work. There is too much noise in the accel signals, and with the double integration the distance solution is very unstable. I think you just need a real odometer counting wheel turns, and use the magnetometer and/or AHRS for direction of travel.

I was hoping that for a rover I could, from time to time, stop and zero out the error (by offset).  But I have not found a good solution yet; the error grows too fast.

Regarding wheel encoders: my rover runs both indoors and outdoors, and wheel slippage makes that method less than satisfactory.  But to date, it's still better than what I am getting from the IMU.

Dr B,

I just had an idea. I have read before, I think here, about someone hacking an optical mouse encoder chip. The lens had to be changed to get the right focal length, and the chip outputs serial data for x & y travel. This may be just what the Dr ordered ;)

Interesting idea.  I'll have to look into this.

Hiya,

This is exactly what I'm trying to do too.

I only got my board last week, so I am still trawling the web for all the required information. So far I've got it up and running and am outputting a custom string of gyro motion and heading.

Next step is to get displacement from the accelerometers. I had naively assumed that this might be a standard output and someone might have already done the hard work, but it appears not :)

Surely we can't be the first to try it...

If you find anything, I'd be very interested.

I have come to the conclusion that the data from the IMU is too noisy for this.  I believe this is why acceleration is not a preconfigured output.

:( that's a shame.

Did you take a look at the optical flow sensor notes?

http://diydrones.com/profiles/blog/show?id=705844%3ABlogPost%3A206819

This looks like it may be a more promising solution.

Hi Dr. B,

If I understand you correctly, you are trying to integrate the acceleration to determine a position estimate. Since this is a double integration process, noise and drift will get you twice. One simple test you can do is let the robot sit on the bench and see how long it takes before the drift makes its position estimate some pre-defined distance wrong (for example, in 5 minutes the drift might place the stationary robot an estimated 10 feet north).
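The bench test above has a simple back-of-the-envelope form: a constant accelerometer bias b double-integrates into a position error of 0.5·b·t². The bias value used below is illustrative, not a measured figure for any particular IMU:

```cpp
#include <cmath>

// Position error after t seconds from a constant accel bias b (m/s^2):
// integrating b twice gives 0.5 * b * t^2.
double drift_after(double bias_mss, double seconds)
{
    return 0.5 * bias_mss * seconds * seconds;
}

// Inverse: how long until a stationary robot's estimate is off by a
// given distance, for the same constant bias.
double time_to_drift(double bias_mss, double metres)
{
    return std::sqrt(2.0 * metres / bias_mss);
}
```

With an assumed 0.05 m/s² bias, the estimate is 90 m off after only one minute, which matches the thread's conclusion that raw double integration is unusable without frequent zeroing.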

Another thing to note is that the accelerometers will measure gravity, 1 g, at all times. If the robot is tilted, this gravity reading will be spread over more than one accelerometer axis (the magnitude over all the accels will be 1 g plus any body-motion accelerations). This means you now need an accurate angular measurement of the robot (perhaps an inclinometer?). Then you know the orientation of the robot relative to gravity, and you can subtract the gravity component from each of the 3-axis accel readings to measure only the acceleration due to the robot's movement.

You can quickly see this becomes a messy problem because now you have introduced angular estimation into your linear accelerometer problem as well.

Hi Dr. B

I have some experience in navigation and I think you may have two problems here.

First, I agree with Cyrus above: you have to consider the gravity vector. An accelerometer really measures specific force. That is, it measures the net force on its proof mass, and the output is divided by the proof-mass value to get an acceleration reading. So in addition to forces created by motion (accelerations), the accelerometers will also sense 1 G of gravity, which will show up in one or more of the accelerometer outputs depending on the orientation of the accelerometer triad. As mentioned above, the components of the 1 G gravity vector must be subtracted from the accelerometer readings, and this has to be done before the acceleration components are integrated to get velocity components, and the velocity components are integrated again to get position components (double integration). If you don't remove the gravity vector, your position solution will be that of a rover in free fall, not supported by the ground!

I think your second problem could be in the direction cosine matrix (DCM).  If the DCM is accurate, it will transform a vector from the body frame to a global coordinate frame, and the transpose of this matrix will transform from the global frame to the body frame. A good choice is the North, East, Down (NED) right-handed global coordinate frame. An accurate DCM that goes from body to global NED can be used to transform the acceleration + gravity readings (measured by the body accelerometers in the body frame) into acceleration + gravity in the NED frame. Then just subtract off the 1 G gravity from the D (down) axis and you have the real acceleration in the NED (non-rotating, inertial) frame. Now you are ready to do the double integration.
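The procedure described above can be sketched as follows. This is a minimal illustration, not firmware code; it assumes a row-major 3×3 DCM mapping body to NED, and the convention that a level, stationary unit reads approximately −1 g on the Down axis (specific force opposes gravity), so cancelling gravity means adding +1 g to Down after the rotation:

```cpp
#include <cmath>

const double G = 9.80665;   // standard gravity, m/s^2

// Rotate a body-frame specific-force reading into NED, then cancel the
// gravity component on the Down axis. dcm is assumed body -> NED.
void body_to_ned_accel(const double dcm[3][3], const double body[3], double ned[3])
{
    for (int i = 0; i < 3; ++i) {
        ned[i] = dcm[i][0] * body[0]
               + dcm[i][1] * body[1]
               + dcm[i][2] * body[2];
    }
    ned[2] += G;   // stationary & level: body reads (0,0,-G), result is (0,0,0)
}
```

The sign of the correction depends on your axis and sensor conventions; the key point from the post above is that gravity is removed after the rotation into NED, where it is a known constant on one axis.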

If you want the body accelerations without gravity, just use the transpose of the DCM to transform the 1 G vector into body-frame components, subtract that from the accel readings, and you are done.

Sorry for the long post but two more points.

1) The DCM has the property that its transpose is equal to its inverse, so if you multiply the DCM by DCM' (the ' indicates a transpose) you get the identity matrix: 1.0 along the diagonal and zero everywhere else.
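That orthogonality property makes a cheap sanity check on any DCM you compute: form R·R' and compare it to the identity. A minimal sketch (illustrative helper, not a library function):

```cpp
#include <cmath>

// Check that R * R' is the identity within tol, i.e. that R is a valid
// rotation (direction cosine) matrix up to numerical error.
bool is_orthonormal(const double R[3][3], double tol)
{
    for (int i = 0; i < 3; ++i) {
        for (int j = 0; j < 3; ++j) {
            double s = 0.0;
            for (int k = 0; k < 3; ++k) {
                s += R[i][k] * R[j][k];   // element (i,j) of R * R'
            }
            double expect = (i == j) ? 1.0 : 0.0;
            if (std::fabs(s - expect) > tol) return false;
        }
    }
    return true;
}
```

A pure yaw rotation passes this check; a sheared (non-rotation) matrix fails it, which is exactly the kind of numerical degradation renormalization in the DCM algorithm is meant to correct.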

2) An accurate DCM needs to be established by a process known as alignment before it can be used to navigate (i.e., before integration of the accelerometer readings starts), and it must be maintained throughout the navigation run. As Cyrus states, the angular measurements from the gyros get involved here. All of this takes very good (expensive) sensors to work well.

I hope this helps.

I have come to the conclusion that the data from the IMU is too noisy for this.  I'm implementing the procedure documented here

