All (but perhaps Bill P and Paul B most of all :-),

Based on earlier threads, I had the following idea for the orientation estimation of a quadrotor using only 3 gyroscopes and 3 accelerometers. I'm not sure if it would work.

I propose to track the "gravity vector".  I'll update the vector as per the usual DCM algorithm: iterate the vector each time step via the cross product of omega and the vector, where omega is the gyro output compensated via PI feedback of the cross product of the vector itself and the accelerometer outputs. This effectively gives two of the three orientation components, and has already been described elsewhere as "DCM lite". However, I would only update the PI loop when the magnitude of the measured acceleration is sufficiently close to 1 g.
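
As a concrete illustration, one step of this update might look like the following NumPy sketch (the gains, the 0.1 g gate, and the exact sign convention of the error term are my assumptions for the sketch, not values from any flight code):

```python
import numpy as np

def update_gravity(g_hat, gyro, accel, integ, dt, kp=0.5, ki=0.05, gate=0.1):
    """One step of the "DCM lite" gravity-vector filter.

    g_hat: unit estimate of the gravity direction in body axes
    gyro:  body rates in rad/s; accel: specific force in g units
    integ: running integral term (absorbs the slowly varying gyro bias)
    """
    err = np.zeros(3)
    if abs(np.linalg.norm(accel) - 1.0) < gate:   # trust the accel only near 1 g
        a_hat = accel / np.linalg.norm(accel)
        err = np.cross(a_hat, g_hat)              # measured x estimated
        integ = integ + ki * err * dt             # I term of the PI feedback
    omega = gyro + kp * err + integ               # PI-compensated body rates
    g_hat = g_hat + np.cross(g_hat, omega) * dt   # transport: dg/dt = g x omega
    return g_hat / np.linalg.norm(g_hat), integ
```

With the gate open and the sensor stationary, repeated calls pull `g_hat` toward the measured direction while `integ` soaks up any constant gyro offset.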

The third component comes from integrating the dot product of the compensated gyro signals with the gravity vector - i.e. the component of angular velocity that is around the gravity vector. I'm pretty confident that this gives me a uniquely defined orientation, without singularities (assuming I restrict the angle to +/- 180 deg or equivalent). This scheme seems analogous to the "axis-angle" formulation or even the quaternion formulation in that 4 parameters are used to specify the 3 degrees of freedom (3 components of a vector and an angle).
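
The accumulation could be sketched like this (assuming `omega_corr` is the PI-compensated gyro vector and `g_hat` the unit gravity estimate; the function name is mine):

```python
import numpy as np

def integrate_yaw(psi, omega_corr, g_hat, dt):
    """Integrate the component of the compensated body rate that lies
    along the gravity direction, wrapping the result to [-pi, pi)."""
    psi += np.dot(omega_corr, g_hat) * dt
    return (psi + np.pi) % (2.0 * np.pi) - np.pi
```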

My experience is that the MEMS gyro drift is pretty small and slowly varying, so that as long as the quad is maneuvering (tilting), all 3 gyros will have some compensation from the PI feedback of the accelerometer error, even without magnetometer feedback (it won't be perfect, but it might be good enough for the 15-30 minutes that the battery will last).

Does this sound like it might work? If so, how would I transform the 4 parameters into a representation of orientation in the inertial frame (DCM, Euler angles, etc) for use in groundstation displays? My idea was to take the cross product of the gravity vector with the body z axis and reverse it to indicate how the body moved, then simply rotate the inertial body around the gravity vector by the integrated angle. However, this did not appear to work in my simulation. Perhaps my code had an error, or perhaps there is something wrong with these ideas.

As for control, my idea is that the pitch & roll r/c joystick inputs would serve as virtual angular velocity commands to move a "desired" gravity vector (for a rate command system). Then the cross product between the desired and estimated gravity vectors can be used to form the error signal that drives the PID loops. I could have the yaw joystick command the rate about the gravity vector, but I think it would be more intuitive for it to command the body-axis yaw rate (and easier to implement).
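
This scheme might be sketched as follows (hypothetical helper names; scaling the raw cross product by atan2 is my own addition to keep the error magnitude meaningful at large tilt angles):

```python
import numpy as np

def steer_desired_gravity(g_des, stick_rates, dt):
    """Rate command: treat roll/pitch stick deflections as angular-velocity
    commands that tilt the desired gravity vector."""
    g_des = g_des + np.cross(g_des, stick_rates) * dt
    return g_des / np.linalg.norm(g_des)

def tilt_error(g_des, g_est):
    """Tilt error (rad, body axes) between desired and estimated gravity;
    the x and y components would drive the roll and pitch PID loops."""
    axis = np.cross(g_est, g_des)           # rotation axis scaled by sin(angle)
    s = np.linalg.norm(axis)
    c = np.dot(g_est, g_des)
    if s < 1e-9:                            # vectors already aligned
        return np.zeros(3)
    return axis / s * np.arctan2(s, c)      # atan2 stays accurate at large tilts
```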

Thanks
Roy

Replies

  • T3
    Roy,

    It sounds to me like your idea might work, but only if you limit yourself to maneuvers that do not involve significant centrifugal effects.

    With regard to the gyro yaw drift in the earth frame, the more the quadrocopter tilts back and forth, the better the reduction in the drift will be. Someone (was it you?) asked me about this recently, so I ran my roll-pitch-yaw demo, where I intentionally created a significant amount of initial yaw rate error by yawing the board during power up. At first, the representation was "dizzy", but by rolling and pitching the board, I was able to all but cancel it out. Of course, if you use the controls to keep the quadrocopter level, then the effect will not be there to help. In any case, you will not be able to achieve "lock", but the residual drift rate should be only a few degrees a minute, so what you will have will be a "heading hold".

    With regard to integrating the dot product of the gyro signals with the gravity vector: you do that by taking the dot product of the gyro vector with the bottom row of the direction cosine matrix (DCM Lite). I think that is what you meant when you referred to the "gravity vector". Integrating that dot product will give you the yaw angle.

    With regard to combining everything back into a single representation, there are two approaches that you could use. If you are careful, either one should work.

    A. Work with Euler angles and DCM lite. In this approach, you already have the yaw angle by integrating the dot product. You get the roll and pitch angles from appropriate operations on the elements of the bottom row of the direction cosine matrix. So, then you have all three Euler angles. If you want, you can then transform them into direction cosines, using the standard equations.
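
    Approach A might look like this in code (a sketch assuming the standard z-down, Z-Y-X aerospace Euler convention, in which the bottom row of the body-to-earth DCM is the gravity direction in body axes; the function names are illustrative):

```python
import numpy as np

def euler_from_gravity_and_yaw(g_body, psi):
    """Roll and pitch from the bottom row of the DCM (the gravity vector
    in body axes), yaw from the separately integrated angle."""
    theta = np.arcsin(np.clip(-g_body[0], -1.0, 1.0))  # pitch
    phi = np.arctan2(g_body[1], g_body[2])             # roll
    return phi, theta, psi

def dcm_from_euler(phi, theta, psi):
    """Body-to-earth direction cosine matrix from Z-Y-X Euler angles."""
    cph, sph = np.cos(phi), np.sin(phi)
    cth, sth = np.cos(theta), np.sin(theta)
    cps, sps = np.cos(psi), np.sin(psi)
    return np.array([
        [cth * cps, sph * sth * cps - cph * sps, cph * sth * cps + sph * sps],
        [cth * sps, sph * sth * sps + cph * cps, cph * sth * sps - sph * cps],
        [-sth,      sph * cth,                   cph * cth]])
```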

    B. Work with direction cosines directly. This is a bit of a step away from DCM lite. In this approach, you run the full DCM algorithm but simply do not perform yaw drift compensation. Essentially, you will be doing what I tried recently with my roll-pitch-yaw demo without a GPS for yaw compensation.

    But, back to centrifugal effects: without centrifugal compensation, certain maneuvers will cause problems. Hovering will be ok, straight-line flying will be ok, and gentle turns will be ok. But if you tried to roll the quadrocopter a bit and fly it around a banked turn, eventually the centrifugal effect would cause trouble.

    Best regards,
    Bill