Using Quaternions

I've tried several sketches with the ArduIMU V3: the stock DCM version, FreeIMU, and Martin's DMP.  The stock DCM works for me, but I want to DIY.  Right now I have FreeIMU loaded, and I changed the quaternion sketch so it will work with ArduIMU Test.  The FreeIMU quaternion sketch is supposed to work with the Processing file FreeIMU cube, and it does.  The key to working with the cube program is how it uses the quaternions.  I borrowed the conversion from quaternions to Euler angles and stuck it into my sketch.

As long as I stay within one quadrant, it works great.  As soon as I rotate too far, pitch might become yaw, roll might become pitch, and so on.  So I looked at how the Processing sketch converts the quaternions, and I am confused.  At the bottom of the Processing sketch there are four functions, and I'm only using the void one in my sketch. I'm trying to figure out where "a" and "b" come from and how "conj" will translate to my Arduino sketch. If you can help me understand this, I will share my sketch. Thanks.

 

 


// See Sebastian O.H. Madgwick's report
// "An efficient orientation filter for inertial and inertial/magnetic sensor arrays", Chapter 2, Quaternion representation
void quaternionToEuler(float [] q, float [] euler) {
  euler[0] = atan2(2 * q[1] * q[2] - 2 * q[0] * q[3], 2 * q[0] * q[0] + 2 * q[1] * q[1] - 1); // psi
  euler[1] = -asin(2 * q[1] * q[3] + 2 * q[0] * q[2]); // theta
  euler[2] = atan2(2 * q[2] * q[3] - 2 * q[0] * q[1], 2 * q[0] * q[0] + 2 * q[3] * q[3] - 1); // phi
}

// returns the quaternion (Hamilton) product of a and b
float [] quatProd(float [] a, float [] b) {
  float [] q = new float[4];

  q[0] = a[0] * b[0] - a[1] * b[1] - a[2] * b[2] - a[3] * b[3];
  q[1] = a[0] * b[1] + a[1] * b[0] + a[2] * b[3] - a[3] * b[2];
  q[2] = a[0] * b[2] - a[1] * b[3] + a[2] * b[0] + a[3] * b[1];
  q[3] = a[0] * b[3] + a[1] * b[2] - a[2] * b[1] + a[3] * b[0];

  return q;
}

// returns a quaternion from an axis-angle representation
float [] quatAxisAngle(float [] axis, float angle) {
  float [] q = new float[4];

  float halfAngle = angle / 2.0;
  float sinHalfAngle = sin(halfAngle);
  q[0] = cos(halfAngle);
  q[1] = -axis[0] * sinHalfAngle;
  q[2] = -axis[1] * sinHalfAngle;
  q[3] = -axis[2] * sinHalfAngle;

  return q;
}

// returns the quaternion conjugate of quat
float [] quatConjugate(float [] quat) {
  float [] conj = new float[4];

  conj[0] = quat[0];
  conj[1] = -quat[1];
  conj[2] = -quat[2];
  conj[3] = -quat[3];

  return conj;
}
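
For reference, the quaternionToEuler() above carries over to Arduino C almost unchanged. A minimal sketch of what that translation might look like (this is not the FreeIMU or ArduIMU code, just an illustration using the standard math functions, with angles in radians):

#include <math.h>

// q[0..3] = w, x, y, z from the filter; euler[0..2] = psi, theta, phi in radians
void quaternionToEuler(const float q[4], float euler[3]) {
  euler[0] = atan2(2 * q[1] * q[2] - 2 * q[0] * q[3],
                   2 * q[0] * q[0] + 2 * q[1] * q[1] - 1);  // psi (yaw)
  euler[1] = -asin(2 * q[1] * q[3] + 2 * q[0] * q[2]);      // theta (pitch)
  euler[2] = atan2(2 * q[2] * q[3] - 2 * q[0] * q[1],
                   2 * q[0] * q[0] + 2 * q[3] * q[3] - 1);  // phi (roll)
}

One thing worth noting: asin() can only ever return angles between -90 and +90 degrees, and the two atan2() results wrap at ±180, so angles appearing to swap roles once you rotate far enough is a property of the Euler-angle conversion itself, which may be why it only behaves within one quadrant.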

Replies

  • I figured this much out: the functions I listed above are for orienting the cube with the IMU.  I commented out all of them except the void function and then ran Processing to see what broke.  I couldn't hit 'h' to adjust it anymore, so those functions aren't critical.  I can see needing them if I want to "level" the IMU in a plane, so I'd still like to know how to translate that code into Arduino if anyone can help there.

    I have no idea where the values for float [] a and float [] b come from.  I figure they would translate to float a[] and float b[] on the Arduino, but I don't know what to do with the new float[4].  My best guess at a translation is below.
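
    This is only a sketch, not tested code: the result array is passed in by the caller instead of being created with new (the Arduino doesn't really want dynamic allocation), and offsetQ, setLevelOffset() and applyLevelOffset() are names I've made up to illustrate the leveling idea, not anything from FreeIMU.

// result = a (x) b, the same Hamilton product as quatProd() above;
// "a" and "b" are simply whatever two quaternions the caller multiplies.
void quatProd(const float a[4], const float b[4], float result[4]) {
  result[0] = a[0] * b[0] - a[1] * b[1] - a[2] * b[2] - a[3] * b[3];
  result[1] = a[0] * b[1] + a[1] * b[0] + a[2] * b[3] - a[3] * b[2];
  result[2] = a[0] * b[2] - a[1] * b[3] + a[2] * b[0] + a[3] * b[1];
  result[3] = a[0] * b[3] + a[1] * b[2] - a[2] * b[1] + a[3] * b[0];
}

// conj is the same rotation reversed: same w, vector part negated.
void quatConjugate(const float q[4], float conj[4]) {
  conj[0] =  q[0];
  conj[1] = -q[1];
  conj[2] = -q[2];
  conj[3] = -q[3];
}

// Untested guess at the "leveling" use: while the board sits level, store the
// conjugate of the current quaternion as an offset, then multiply every new
// reading by that offset before converting it to Euler angles.
float offsetQ[4] = {1, 0, 0, 0};           // identity until the IMU is zeroed

void setLevelOffset(const float q[4]) {    // call once while the board is level
  quatConjugate(q, offsetQ);
}

void applyLevelOffset(const float q[4], float leveled[4]) {
  quatProd(offsetQ, q, leveled);           // leveled = offset (x) current
}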

     

