All (but perhaps Bill P and Paul B most of all :-),

Based on earlier threads, I had the following idea for the orientation estimation of a quadrotor using only 3 gyroscopes and 3 accelerometers. I'm not sure if it would work.

I propose to track the "gravity vector". I'll update the vector as per the usual DCM algorithm: iterate the vector each time step via the cross product of omega and the vector, where omega is the gyro output compensated via PI feedback of the cross product of the gravity vector and the accelerometer outputs. This effectively gives two of the three orientation components, and has already been described elsewhere as "DCM lite". However, I would only update the PI loop when the magnitude of the accelerometer vector is sufficiently close to 1 g.
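A minimal sketch of that update step in Python/NumPy (the 4% accelerometer gate, the gain names, and the function name are my assumptions, not an established implementation):

```python
import numpy as np

def update_gravity(g_body, gyro, accel, bias, Kp, Ki, dt):
    """One DCM-lite step: rotate the tracked gravity vector (body frame)
    by the PI-compensated body rates."""
    a_mag = np.linalg.norm(accel)
    if abs(a_mag - 1.0) < 0.04:                  # gate: accel magnitude near 1 g
        err = np.cross(g_body, accel / a_mag)    # tilt error from accelerometer
        bias = bias + Ki * err * dt              # integral term: gyro bias estimate
        omega = gyro + bias + Kp * err           # PI-compensated rates
    else:
        omega = gyro + bias                      # coast on the bias estimate alone
    # A vector fixed in the inertial frame, expressed in the rotating body
    # frame, evolves as g_dot = g x omega.
    g_body = g_body + np.cross(g_body, omega) * dt
    return g_body / np.linalg.norm(g_body), bias
```

Renormalizing each step keeps the vector at unit length despite the first-order integration.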

The third component comes from integrating the dot product of the compensated gyro signals with the gravity vector - i.e. the component of angular velocity that is around the gravity vector. I'm pretty confident that this gives me a uniquely defined orientation, without singularities (assuming I restrict the angle to +/- 180 deg or equivalent). This scheme seems analogous to the "axis-angle" formulation or even the quaternion formulation in that 4 parameters are used to specify the 3 degrees of freedom (3 components of a vector and an angle).
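A sketch of that integration (hypothetical names; assumes the gyro vector has already been bias-compensated, and wraps the angle to roughly +/- 180 deg):

```python
import numpy as np

def integrate_yaw(psi, omega_comp, g_body, dt):
    """Accumulate the component of angular rate about the gravity vector,
    wrapping the result to [-pi, pi)."""
    g_hat = g_body / np.linalg.norm(g_body)
    psi += np.dot(omega_comp, g_hat) * dt
    return (psi + np.pi) % (2.0 * np.pi) - np.pi
```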

My experience is that the MEMS gyro drift is pretty small and slowly varying, so that as long as the quad is maneuvering (tilting), all 3 gyros will have some compensation from the PI feedback of the accelerometer error, even without magnetometer feedback (it won't be perfect, but it might be good enough for the 15-30 minutes that the battery will last).

Does this sound like it might work? If so, how would I transform the 4 parameters into a representation of orientation in the inertial frame (DCM, Euler angles, etc) for use in groundstation displays? My idea was to take the cross product of the gravity vector with the body z axis and reverse it to indicate how the body moved, then simply rotate the inertial body around the gravity vector by the integrated angle. However, this did not appear to work in my simulation. Perhaps my code had an error, or perhaps there is something wrong with these ideas.

As for control, my idea is that the pitch & roll r/c joystick inputs would serve as virtual angular velocity commands to move a "desired" gravity vector (for a rate command system). Then the cross product between the desired and estimated gravity vectors can be used to form the error signal that drives the PID loops. I could have the yaw joystick command the rate about the gravity vector, but I think it would be more intuitive for it to command the body axis yaw rate (and easier to implement).
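The stick handling and error signal could be sketched like this (function names are mine; a sketch of the scheme described above, not a tested controller):

```python
import numpy as np

def move_desired(g_des, stick_rates, dt):
    """Rotate the desired gravity vector by the virtual angular-rate
    command from the pitch/roll sticks (rate-command scheme)."""
    g_des = g_des + np.cross(g_des, stick_rates) * dt
    return g_des / np.linalg.norm(g_des)

def tilt_error(g_est, g_des):
    """Cross product of estimated and desired gravity vectors: its
    direction is the axis to rotate about, its magnitude is
    sin(tilt error). This feeds the roll/pitch PID loops."""
    return np.cross(g_est, g_des)
```

When the two vectors coincide the error is zero, and for small tilt errors the magnitude is approximately the error angle in radians.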

Thanks
Roy

Replies

  • T3
    Roy,

    It sounds to me that your idea might work, but only if you limit yourself to maneuvers that do not involve significant centrifugal effects.

    With regard to the gyro yaw drift in the earth frame, the more the quadrocopter tilts back and forth, the better will be the reduction in the drift. Someone (was it you?) asked me about this recently, so I ran my roll-pitch-yaw demo, where I intentionally created a significant amount of initial yaw rate error by yawing the board during power up. At first, the representation was "dizzy", but by rolling and pitching the board, I was able to all but cancel it out. Of course, if you use the controls to keep the quadrocopter level, then the effect will not be there to help. In any case, you will not be able to achieve "lock", of course, but the residual drift rate should be only a few degrees a minute, so what you will have will be a "heading hold".

    With regard to integrating the dot product of the gyro signals with the gravity vector, of course you do that by taking the dot product of the gyro vector with the bottom row of the direction cosine matrix (DCM Lite). I think that is what you meant when you referred to the "gravity vector". The integration will give you the yaw angle.

    With regard to combining everything back into a single representation, there are two approaches that you could use. If you are careful, either one should work.

    A. Work with Euler angles and DCM lite. In this approach, you already have the yaw angle by integrating the dot product. You get the roll and pitch angles from appropriate operations of the elements of the bottom row of the direction cosine matrix. So, then you have all three Euler angles. If you want, you can then transform them into direction cosines, using the standard equations.
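    For approach A, the "appropriate operations" on the bottom row can be sketched as follows (my assumptions: the tracked gravity vector is the bottom row of a body-to-earth DCM in the standard aerospace ZYX Euler convention, and the function name is hypothetical):

    ```python
    import numpy as np

    def euler_from_gravity(g_body, psi):
        """Roll and pitch from the bottom row of a body-to-earth DCM
        (gravity expressed in the body frame), plus the integrated yaw."""
        gx, gy, gz = g_body / np.linalg.norm(g_body)
        pitch = -np.arcsin(np.clip(gx, -1.0, 1.0))  # r31 = -sin(pitch)
        roll = np.arctan2(gy, gz)                   # r32/r33 = tan(roll)
        return roll, pitch, psi
    ```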

    B. Work with direction cosines directly. This is a bit of a step away from DCM lite. In this approach, you run the full DCM algorithm but simply do not perform yaw drift compensation. Essentially, you will be doing what I tried recently with my roll-pitch-yaw demo without a GPS for yaw compensation.

    But, back to centrifugal effects... without centrifugal compensation, certain maneuvers will cause problems. Hovering will be ok, and straight line flying will be ok, and gentle turns will be ok. But if you tried to roll the quadrocopter a bit, and fly it around a banked turn, eventually the centrifugal effect would cause trouble.

    Best regards,
    Bill