In this last post of the series I shall give an overview of the main program, including the control algorithms, for the Groundhog.  The code is written in Python, using DroneKit and OpenCV, all running on a Raspberry Pi 3.

As we are flying indoors without GPS and also without optical flow, we are using quaternions to control the vehicle in the GUIDED_NOGPS flight mode of ArduCopter.  To be honest, I've not come across anyone else doing this before, so it must be a good idea...

What's a quaternion?


There are many references to this - check out Wikipedia for starters.  However, a quaternion is just another way of specifying the attitude of a body in a frame of reference, other than the traditional yaw, pitch and roll.  Whilst it's slightly harder to get your head around, quaternions are fundamentally (and by that I mean mathematically) more sound.  For example, working with pitch, roll and yaw can lead to gimbal lock, in which your use of trig functions can cause divide-by-zero errors at the extremes (like multiples of 90 degrees).

With a quaternion, we specify the new attitude as a rotation (of so many degrees) around a vector - an axis of rotation.  Think of the vector describing the current attitude: we can map it onto the vector describing the target attitude by specifying a rotation around that axis.  (More generally in robotics we add a translation as well, but that's another story.)

I am a visual learner, so after doing the maths I really understood it using two cable ties (start and finish attitude vectors) and a barbecue stick (axis of rotation).  Try it.
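
For the more hands-on, here's the same idea as a minimal sketch using the pyquaternion library (the same one used later in this post): rotating a unit vector 90 degrees about the vertical (z) axis.

from pyquaternion import Quaternion
import numpy as np

# Current attitude: a unit vector pointing along x.
start = np.array([1.0, 0.0, 0.0])

# Axis of rotation: z (straight up); angle: 90 degrees.
q = Quaternion(axis=[0, 0, 1], angle=np.pi / 2)

# Map the start vector onto the target attitude.
finish = q.rotate(start)
print(finish)  # approximately [0, 1, 0] - yawed 90 degrees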

So in short, we are going to send a stream of quaternions to the Pixhawk to tell it how to change attitude.  Fortunately some very clever people have written some functions for this...

Code


All code is posted to my GitHub repository.

Code overview


There are just three program files:

  1. follow.py
    The main program called from the control console.
  2. PiVideoStream.py
    A class that sets up threaded reading and image analysis of each frame from the PiCam v2.
  3. ST_VL6180X.py
    A class for reading the short-range laser ToF (time-of-flight) rangefinder.  Full credit to the author, A. Weber.

Calling the code


The main program is invoked either to connect directly to the Pixhawk or to connect to the software-in-the-loop (SITL) simulator (see previous post).  Navigate to the local directory on the RPi3 and open a console:

  • Connecting to the Pixhawk: >python follow.py
  • Connecting to the SITL: >python follow.py --connect "udp:192.168.0.xxx:14550"


In the latter instance, 192.168.0.xxx is the address of the RPi on the local network.

follow.py

Libraries


Several standard libraries are used, some of which need to be installed using pip.  In particular, note imutils from Adrian Rosebrock's excellent blog at pyimagesearch.com.  I preferred to keep the VL6180X class in the local folder rather than install it via pip.

# import the necessary packages

from dronekit import connect, VehicleMode, LocationGlobalRelative
from pymavlink import mavutil # Needed for command message definitions
import time
import numpy as np
from pyquaternion import Quaternion
from PiVideoStream import PiVideoStream
from imutils.video import FPS
from picamera.array import PiRGBArray
from picamera import PiCamera
import argparse
import imutils
import cv2
import sys
from ST_VL6180X import VL6180X
from datetime import datetime, timedelta

Pixhawk/RPi3 Connection String


Optionally, a connection string is used when the program is invoked.

The presence of a connection string is tested at several points in the program to decide, for example, whether to command a take-off if using the SITL (and obviously not if really flying!).

Note also the serial port specified on the RPi3 - "/dev/serial0".  This works because Bluetooth has been disabled, as per my previous post.  The baud rate also has to be set in Mission Planner so that the Pixhawk connects at the same speed.

#--------------------------SET UP CONNECTION TO VEHICLE----------------------------------

# Parse the arguments
parser = argparse.ArgumentParser(description='Commands vehicle using vehicle.simple_goto.')
parser.add_argument('--connect',
                    help="Vehicle connection target string. If not specified, SITL automatically started and used.")
args = parser.parse_args()

connection_string = args.connect

# Connect to the physical UAV or to the simulator on the network
if not connection_string:
    print ('Connecting to pixhawk.')
    vehicle = connect('/dev/serial0', baud=57600, wait_ready=True)
else:
    print ('Connecting to vehicle on: %s' % connection_string)
    vehicle = connect(connection_string, wait_ready=True)

Initialise the Video Thread


Start the video thread.  This is straight out of pyimagesearch.com.  It is essential to use a separate thread to capture video on the RPi to get any useful performance; otherwise the main thread is held up by the slow camera I/O.  Here, we are capturing and processing at around 20 fps.  I decided to offload the image processing onto the video thread as well, to compartmentalise all image operations away from the main thread.  Note that the RPi doesn't truly process in parallel, but a separate thread lets the main thread get on with it while the video thread is waiting on I/O (as it is here).

#--------------------------SET UP VIDEO THREAD ----------------------------------

# created a *threaded *video stream, allow the camera sensor to warmup,
# and start the FPS counter
print('[INFO] sampling THREADED frames from `picamera` module...')
vs = PiVideoStream().start()
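
The PiVideoStream class itself is in the repository (it also performs the image analysis and returns the target tuple used later).  As a rough idea of its shape, here is a minimal threaded-capture skeleton in the pyimagesearch style - a sketch only, without the line-detection code:

from threading import Thread
from picamera.array import PiRGBArray
from picamera import PiCamera

class PiVideoStreamSketch:
    def __init__(self, resolution=(320, 240), framerate=32):
        # Set up the camera and a continuous capture stream.
        self.camera = PiCamera()
        self.camera.resolution = resolution
        self.camera.framerate = framerate
        self.rawCapture = PiRGBArray(self.camera, size=resolution)
        self.stream = self.camera.capture_continuous(
            self.rawCapture, format="bgr", use_video_port=True)
        self.frame = None
        self.stopped = False

    def start(self):
        # Run the capture loop in its own thread.
        Thread(target=self.update, args=()).start()
        return self

    def update(self):
        for f in self.stream:
            self.frame = f.array          # latest frame for read()
            self.rawCapture.truncate(0)   # reset buffer for next frame
            if self.stopped:
                self.stream.close()
                self.rawCapture.close()
                self.camera.close()
                return

    def read(self):
        # The real class analyses the frame and returns the target tuple.
        return self.frame

    def stop(self):
        self.stopped = True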

Initialise the ToF Sensor


So we have attached the VL6180X sensor to a rear arm, with a view to keeping the Groundhog around 15 cm from the floor.  The rangefinder is connected directly to the RPi using I2C - NOT via the Pixhawk - so the RPi will sense and control the altitude directly.

#--------------------------SET UP TOF RANGE SENSOR ----------------------------------

tof_address = 0x29
tof_sensor = VL6180X(address=tof_address, debug=False)
tof_sensor.get_identification()
if tof_sensor.idModel != 0xB4:
    print "Not a valid sensor id: %X" % tof_sensor.idModel
else:
    print "Sensor model: %X" % tof_sensor.idModel
    print "Sensor model rev.: %d.%d" % \
        (tof_sensor.idModelRevMajor, tof_sensor.idModelRevMinor)
    print "Sensor module rev.: %d.%d" % \
        (tof_sensor.idModuleRevMajor, tof_sensor.idModuleRevMinor)
    print "Sensor date/time: %X/%X" % (tof_sensor.idDate, tof_sensor.idTime)
tof_sensor.default_settings()
tof_sensor.change_address(0x29, 0x80)
time.sleep(1.0)
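
Once initialised, reading the range is a single call.  A quick sanity check (using the class's get_distance() method, which appears again in the control loop later) might look like:

# Print the measured range at roughly 10 Hz (Ctrl-C to stop).
while True:
    distance_mm = tof_sensor.get_distance()
    print "Measured distance: %d mm" % distance_mm
    time.sleep(0.1)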

Quaternion Control Functions


Control of the Pixhawk is effected using DroneKit (with Python).  As well as having its own API, it provides an interface for encoding messages more directly using the MAVLink protocol.

We are using the set_attitude_target message, which is almost the only method we have of controlling the Pixhawk indoors without GPS or optical flow.  This allows us to encode a quaternion in the local frame to request a change in attitude.

Here's the low-level function into which we feed the quaternion describing the required change in attitude.  Some understanding of how it works is necessary.  As I could find no documentation detailing the function, much of this has been gained by trial and error and may not be complete.

  • w,x,y,z: q, the normalised quaternion (so that w² + x² + y² + z² = 1).
  • thrust: 0.5 to stay put, higher to go up, lower to go down (max 1).
  • body roll and pitch rates are set to 1 to match the default setting in the Pixhawk.
  • body yaw rate was made equal to the requested yaw, otherwise generally no yaw was evident (no idea as to why).

#--------------------------FUNCTION DEFINITION FOR SET_ATTITUDE MESSAGE MODE--------------------
# Define set_attitude message
def set_att_msg_mode(w, x, y, z, thrust):
    msg = vehicle.message_factory.set_attitude_target_encode(
        0,             # time_boot_ms
        0,             # target system
        0,             # target component
        0b0000000,     # type mask
        [w, x, y, z],  # q
        1,             # body roll rate
        1,             # body pitch rate
        z,             # body yaw rate
        thrust)        # thrust
    vehicle.send_mavlink(msg)


The quaternion itself is calculated by a separate function, below.  This allows the more usual changes in roll, pitch and yaw to be converted.

Some useful observations were made using the SITL beforehand (running ArduCopter 3.4).

  1. Requested changes to yaw in the local frame were effected as expected.
  2. Requested changes to pitch and roll were momentarily effected, but quickly reversed as the Pixhawk re-assumed control to maintain flight stability (even in GUIDED_NOGPS flight mode).
  3. A stream of commands of at least 10Hz seemed necessary to maintain programmatic control e.g. to maintain a pitch of -10 degrees for forward motion.
#--------------------------FUNCTION DEFINITION FOR SET_ATTITUDE MESSAGE --------------------
def set_attitude(pitch, roll, yaw, thrust):
    # The parameters are passed in degrees
    # Convert degrees to radians
    degrees = (2 * np.pi) / 360
    yaw = yaw * degrees
    pitch = pitch * degrees
    roll = roll * degrees

    # Now calculate the quaternion in preparation to command the change in attitude
    # q for yaw is rotation about the z axis
    qyaw = Quaternion(axis=[0, 0, 1], angle=yaw)
    qpitch = Quaternion(axis=[0, 1, 0], angle=pitch)
    qroll = Quaternion(axis=[1, 0, 0], angle=roll)

    # We have the components, now combine them into one quaternion
    q = qyaw * qpitch * qroll

    a = q.elements

    set_att_msg_mode(a[0], a[1], a[2], a[3], thrust)
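
Observation 3 above matters in practice: a single message soon decays, so the request must be re-sent continuously.  A minimal sketch of such a loop using the functions above (the figures here are purely illustrative):

# Hold roughly -10 degrees of pitch for 5 seconds by re-sending
# the attitude request at about 10 Hz.
end_time = time.time() + 5.0
while time.time() < end_time:
    set_attitude(pitch=-10, roll=0, yaw=0, thrust=0.5)
    time.sleep(0.1)  # ~10 Hz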

Take off (SITL only)


This function is straight out of the DroneKit examples and is only required if testing in the SITL.

#-------------- FUNCTION DEFINITION TO ARM AND TAKE OFF TO GIVEN ALTITUDE ---------------
def arm_and_takeoff(aTargetAltitude):
    """ Arms vehicle and fly to aTargetAltitude. """

    print ('Basic pre-arm checks')
    # Don't try to arm until autopilot is ready
    while not vehicle.is_armable:
        print ('Waiting for vehicle to initialise...')
        time.sleep(1)

    print ('Arming motors')
    # Copter should arm in GUIDED mode
    vehicle.mode = VehicleMode("GUIDED")
    vehicle.armed = True

    # Confirm vehicle armed before attempting to take off
    while not vehicle.armed:
        print ('Waiting for arming...')
        time.sleep(1)

    print ('Taking off!')
    vehicle.simple_takeoff(aTargetAltitude)  # Take off to target altitude

    # Wait until the vehicle reaches a safe height before continuing
    while True:
        # print "Global Location (relative altitude): %s" % vehicle.location.global_relative_frame
        if vehicle.location.global_relative_frame.alt >= aTargetAltitude * 0.95:
            break
        time.sleep(1)

Control Functions


The main program operates a very simple finite state machine with three states:

  • 'tracking' - the red line is tracked but no attempt is made to send attitude messages to the Pixhawk.
  • 'following' - the red line is tracked and full command is assumed, unless the operator switches out of GUIDED_NOGPS mode or the line is lost.
  • 'lost' - a rotate-and-gain-height algorithm attempts to re-acquire the red line.


Recall that the code is all in my GitHub repository.  Therefore, I will only work through the 'following' function here to avoid repetition.

  • The state variable vstate is set to 'following'.
  • Flags and other variables are initialised.
  • You will see some commented-out references for getting the altitude if testing outdoors at regular altitude.
  • The frames-per-second counter is straight out of pyimagesearch.com, used as needed or commented out.
#-------------- FUNCTION DEFINITION TO FLY IN VEHICLE STATE FOLLOWING---------------------
def following(vstate):
    print vstate

    # The vehicle processes images and uses all data to fly in the following state.
    # It sends attitude messages until manual control is resumed.

    red1Good = red2Good = False  # Set True when returned target offset is reliable.
    yaw = roll = 0
    target = None  # Initialise tuple returned from video stream

    # altitude = vehicle.location.global_relative_frame.alt

    # Initialise the FPS counter.
    # fps = FPS().start()

  • vstate controls how long we stay in the loop (i.e. in the following state).
  • Each image frame read returns a yaw signal (-1 to 1) and a roll signal (-1 to 1).
    Remember these are derived from the gradient of the line on the image and its intercept on the x axis.
  • red1Good et al. are booleans indicating that the values are good to be used.
  • Get the current height and set the thrust to adjust it to between 160 and 200 mm.  No point trying to be too precise.
    while vstate == "following":

        # grab the frame from the threaded video stream and return left line offset
        # We do this to know if we have a 'lock' (goodTarget) as we come off of manual control.
        target = vs.read()
        yaw = target[0]
        red1Good = target[1]
        roll = target[2]
        red2Good = target[3]

        # update the FPS counter
        # fps.update()

        # Get the altitude information.
        tofHeight = tof_sensor.get_distance()
        # print "Measured distance is : %d mm" % tofHeight

        # adjust thrust towards the target height band (160-200 mm)
        if tofHeight > 200:
            thrust = 0.45
        elif tofHeight < 160:
            thrust = 0.55
        else:
            thrust = 0.5

  • Check the transmitter flight mode (we must still be in GUIDED_NOGPS).
  • Set yaw and roll as multiples of the signals passed from image analysis.
  • Fix the pitch.
  • Send the request to the Pixhawk.
  • Drop out of the loop if we have lost the lock.
        # Check if operator has transferred to autopilot using the TX switch.
        if vehicle.mode == "GUIDED_NOGPS":
            # print "In Guided mode..."
            # print "Global Location (relative altitude): %s" % vehicle.location.global_relative_frame

            if (red1Good or red2Good):
                yaw = yaw * 100   # Set maximum yaw in degrees either side
                roll = roll * 20  # Set maximum roll in degrees either side
                pitch = -4
                # print pitch, yaw, roll, thrust
                set_attitude(pitch, roll, yaw, thrust)
            else:
                vstate = "lost"

        else:
            # print "Exited GUIDED mode, setting tracking from following..."
            vstate = "tracking"

    # Return the new state to the main finite state machine loop.
    return vstate


We keep looping around this finite state machine in the main program.

# MAIN PROGRAM

vstate = "tracking"  # Set the vehicle state to tracking in the finite state machine.

# If on the simulator, arm and take off.
if connection_string:

    print ('Basic pre-arm checks')
    # Don't try to arm until autopilot is ready
    while not vehicle.is_armable:
        print ('Waiting for vehicle to initialise...')
        time.sleep(1)

    print ('Arming motors')
    # Copter should arm in GUIDED mode
    vehicle.mode = VehicleMode("GUIDED")
    vehicle.armed = True

    # Confirm vehicle armed before attempting to take off
    while not vehicle.armed:
        print ('Waiting for arming...')
        time.sleep(1)

    # Get airborne and hover
    arm_and_takeoff(10)
    print "Reached target altitude - currently in Guided mode on altitude hold"

    vehicle.mode = VehicleMode("GUIDED_NOGPS")

while True:

    if vstate == "tracking":
        # Enter tracking state
        vstate = tracking(vstate)
        # print "Leaving tracking..."

    elif vstate == "following":
        # Enter following state
        vstate = following(vstate)
        # print "Leaving following"

    else:
        # Enter lost state
        vstate = lost(vstate)
        # print "Leaving lost"

Performance

Low altitude hold


The short-range ToF sensor worked remarkably well and the height was able to be maintained at around the 20 cm mark.  However, the turbulence resulting from the ground effect complicated the issues we were having with tracking.  Therefore, we decided to run at a more normal altitude of 1.5 m using the Pixhawk barometer.  This also worked surprisingly well.

Once the control and tracking are working satisfactorily, I would have no hesitation in using the VL6180X ToF sensor again.

Tracking


The image analysis and tracking worked very well.  It was clear that the PiCam was able to distinguish the red line more easily than many competitors' cameras in the early stages.  No adjustments were made to the image analysis code during the competition, except to use the 'lost' mode to make the field of view as wide as possible to help find the red line.

Attitude Control


So this is where it didn't go so well - but we learned a lot of lessons!

Firstly, we were initially much too conservative with the maximum limits of pitch, roll and yaw.  This meant that there was little control at all, and the turbulence from the ground effect simply sent the Groundhog off the line.  Our conservative approach was understandable, as the Groundhog is quite a formidable machine indoors and safety was paramount.  At least we had replaced the 12" carbon fibre props with 10" plastic ones!  It took the first day of the competition to become confident enough to raise the attitude limits and realise we had to get off the ground - at least to start with.

The really big bugbear was yaw control.  We had assumed that the quaternion function would use the Pixhawk control algorithms to set the yaw accurately.  This turned out not to be the case.  Once we had increased the sensitivity for yaw, we managed a full lap of the track, albeit rather gingerly.  That at least put us in the top four teams!  However, we soon found the Groundhog would overshoot the target yaw and start an oscillation completely characteristic of a system with insufficient damping.  Recall the Groundhog weighs in at 3 kg?  We were now paying for that, big time!

Of course, what we needed was a PID control loop, at least for the yaw control, which was the major problem.  It was very frustrating watching the perfectly tracked line disappear further off the side of the screen with every swing of the pendulum.  Unfortunately, we simply ran out of time to introduce the necessary code in a way that felt safe.
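
For next time, a minimal yaw PID might look something like the sketch below - a sketch only, untested in flight, with placeholder gains.  The raw yaw signal (-1 to 1) from the image analysis would be fed through the controller instead of the fixed multiplier used in following().

class PID:
    """ Minimal PID controller; gains are illustrative placeholders. """
    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = None

    def update(self, error, dt):
        # Accumulate the integral and difference the error for damping.
        self.integral += error * dt
        derivative = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# In the following() loop, at roughly 10 Hz:
# yaw_pid = PID(kp=60.0, ki=0.0, kd=15.0)   # hypothetical gains
# yaw_command = yaw_pid.update(yaw, dt=0.1)  # replaces yaw * 100

The derivative term is the point: it resists the swing of the pendulum and provides the damping the raw multiplier lacked.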

Lessons Learned


On the plus side:

  • The Groundhog proved a worthy airframe for the competition, despite its big size;
  • The image analysis worked well, except for the intermittent segmentation faults caused by one particular function.  In particular, the use of 'ranging' regions of interest to detect two points on the line and calculate its gradient was very successful;
  • The novel use of the short-range ToF sensor for low-altitude flight worked well;
  • Using GUIDED_NOGPS mode and the quaternion controller works, although more flight testing to ascertain the attitude limits would be useful.


On the minus side:

  • Yaw control in particular must be implemented with a PID controller;
  • The small (10") plastic props reduced the yaw rate quite considerably from the usual performance capability.  This may have contributed to the yaw oscillations and may need to be mitigated in the future.

Next Steps


The date for the next MAAXX Europe event has already been set for March 2018, and the Groundhog will be there.  As a minimum, it will be equipped with the necessary PID controllers and will at last be able to fly at its cruising altitude of 20 cm, hopefully for 30 minutes at a time!  It might also have some other tricks up its sleeve by then...

Maybe see you there.
