Mike Isted's Posts (16)


Pixhawk 2 with Jetson TX2 Build


I have just completed my rebuild of The Groundhog hexacopter with a Pixhawk 2 and NVIDIA Jetson TX2 companion computer.  This is in preparation for the coding blogs I am posting and further work integrating depth cameras and machine learning.  For those interested, details of the build are posted on my blog at www.mikeisted.com.

Read more…

Coding UAVs with ROS. Subscribing to FCU data.


I have just published Part 2 of this series, designed to develop skills in Python and ROS so as to achieve flexible control of a UAV.  It's based on a Pixhawk FCU with a Raspberry Pi companion computer.  The Pixhawk is running PX4, but it would also work with ArduCopter with minor modifications.
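To give a flavour of the pattern the series builds on, here is a minimal sketch (not the Part 2 code itself) of subscribing to FCU state data through MAVROS.  It assumes MAVROS is already running and publishing on its standard topics; the node name is illustrative.

import rospy
from mavros_msgs.msg import State

def state_cb(msg):
    # msg.mode is the current flight mode, msg.armed the arming state
    rospy.loginfo("Mode: %s  Armed: %s", msg.mode, msg.armed)

rospy.init_node('fcu_listener')
rospy.Subscriber('/mavros/state', State, state_cb)
rospy.spin()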

I have also re-released Part 1 with some important updates (necessary due to recent changes in ROS).

We are heading towards a system that will fly autonomously and recognise and track objects in its vicinity using some machine learning.

Part 2 is on my blog site here.  I hope some may find it useful!

Read more…


We recently ran a 2-week UAV build and coding mechatronics Summer School at UWE Bristol for students from the Ecole des Mines d’Ales. Great students and a great venue made for a very successful course and much fun was had by all!

Huge thanks to @UWEBristol and of course the students themselves.

Drones were standard 450 frames with a Pixhawk running ArduCopter.  The companion computer was a Raspberry Pi running Dronekit with a standard USB webcam.  All programming was in Python.

More details are on my blog here.

Read more…


If you have heard of Robot Operating System and want to use it to monitor and control UAV flight, this post will get you started…

More specifically, this post details how to set up a Pixhawk flight controller running PX4 firmware, with a Raspberry Pi3 companion computer running Robot Operating System. This combination gives flexible control over the flight control unit and the ability to integrate a very wide range of features such as depth-sensing cameras and machine learning networks.

If this is of interest, the full post is on my blogsite at https://wp.me/p3ZZr-nX.

Read more…


I have just posted the latest in a series of posts outlining a further step in the evolution of the GroundHog hexacopter into a multi-role UAV.  It is based on a Pixhawk flight controller with a Jetson TX2 companion computer.  It has now been fitted with an Intel RealSense D435 depthcam and this weekend I was able to take it out for a flight test.

The video from the flight test is available on YouTube and, as usual, my full post is on my blog.

The GroundHog with the Jetson TX2 development board and D435....


Read more…


Here's a quick technical post for anyone attempting to harness the capabilities of a Realsense D435 camera on a Jetson TX2.  For me, this is about getting usable depth perception on a UAV, but it has proved more problematic than I originally anticipated.

The Intel Realsense D435 depthcam is small enough to mount on a UAV and promises enough range for object detection and avoidance at reasonable velocities.  My intention is to couple it with the machine learning capabilities of the Jetson TX2 to improve autonomous flight decision making.

The problem is that, as I write, the Intel Realsense SDK2 does not appear to support the ARM processor of the Jetson TX2.  This post links to my blog article, which aims to provide some simple installation instructions that now work for me but took a long time to work out!

(Full blog article link is https://mikeisted.wordpress.com/2018/04/09/intel-realsense-d435-on-jetson-tx2/)

Read more…

MAAXX Europe 2018 DroneJam MasterClass


Last year I entered the MAAXX Europe Autonomous Drone Racing competition with my 'Groundhog' hexacopter, after which I published the code and other information.  This seemed to be of help to those flying with similar companion-computer based systems.  This year I ran a 'DroneJam Masterclass' for students wishing to make a start in autonomous drone coding and once again, I am releasing all the code and a blog of the event itself.

The Competition

In common with last year, the competition is to build a drone to autonomously navigate around a circuit marked out by a red line.  However, the event was altogether bigger this year, with three flight arenas, many activities organised by sponsors, FPV racing and even a full two-seat glider flight simulator.  It was great to see this event continue to build on last year's strengths and to become more popular still.  It has also attracted international attention, with one team coming all the way from Moscow (more of them in my blog!).


The Masterclass

The idea was to produce a competitive drone equipped with camera, optical flow, Pixhawk flight control unit and enough code to get drone-coding newbies off the ground, so to speak.  The code was to be implemented in Python on a Raspberry Pi companion computer.  So it wasn’t particularly expected that the students would compete in the main competition, more that they would learn from a real world experience.  Who would have known that our newbie team would come 2nd?

Code, Configuration etc.

I have documented all the code on GitHub and put together a full blog with links at https://mikeisted.wordpress.com/2018/04/05/maaxx-europe-2018-dronejam-coding-masterclass/

I hope it is of interest!

Read more…


The Groundhog is being developed to line-follow at low altitude and higher speeds.  This video is of the field testing taking the speed up to 1.75 m/s.  It also explains why the Groundhog now sports sunglasses.

The Track

An oval of 50mm red webbing, bends of radius approx 3m and straights of approx 15m. Testing took place in early morning with glancing sunlight on dew-soaked webbing - great for walking the dog but not so good for computer vision.

https://youtu.be/T3pq6Uqg9Ko

On-board at 1.75 m/s

https://youtu.be/SLBZ4z9_k1M

Image recognition is by OpenCV on a Raspberry Pi 3 as explained in previous blogs.

Image Processing


Frame rates of over 40 fps are achieved by:

  1. Capturing and processing images in a separate thread;
  2. Processing only two roaming regions of interest as shown by the blue bands above;
  3. Using trigonometry to calculate bearings instead of image transformations (see the sketch below).


The reality is that the PiCam is only delivering 30fps, so many frames get processed twice.
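As an illustration of point 3, here is a hedged sketch of how a bearing and lateral offset can be recovered from the line's position in two horizontal bands, with no image warp.  The function name and the normalisation are illustrative only, not the Groundhog code.

import math

def line_bearing_and_offset(upper_pt, lower_pt, frame_width):
    (ux, uy) = upper_pt           # centroid of the line in the upper band (pixels)
    (lx, ly) = lower_pt           # centroid of the line in the lower band (pixels)
    dx = ux - lx
    dy = ly - uy                  # image y grows downwards, so flip the sign
    bearing = math.atan2(dx, dy)  # radians, 0 = line running straight ahead
    offset = (lx - frame_width / 2.0) / (frame_width / 2.0)  # -1..1 across the frame
    return bearing, offset

print(line_bearing_and_offset((180, 40), (150, 200), 320))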

Flight Control


The position and bearing of the line are calculated in NED space from the orientation of the UAV and the pitch of the camera.  A velocity vector is calculated to partly follow the line and also to 'crab' over it.  The velocity vector, set to a fixed magnitude, is sent to the Pixhawk via Dronekit.  The Pixhawk is running Arducopter in Guided flight mode.  In these videos, the velocity is set at 1 m/s, 1.2 m/s, 1.4 m/s and finally 1.75 m/s.
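For reference, sending a velocity vector in Guided mode follows the standard Dronekit/MAVLink recipe below - a hedged sketch of the general pattern rather than the Groundhog source, where 'vehicle' is an existing Dronekit connection and the helper name is illustrative.

from pymavlink import mavutil

def send_ned_velocity(vehicle, vn, ve, vd):
    msg = vehicle.message_factory.set_position_target_local_ned_encode(
        0,                      # time_boot_ms (not used)
        0, 0,                   # target system, target component
        mavutil.mavlink.MAV_FRAME_LOCAL_NED,
        0b0000111111000111,     # type_mask: only velocities enabled
        0, 0, 0,                # x, y, z positions (not used)
        vn, ve, vd,             # velocity in m/s, North/East/Down
        0, 0, 0,                # accelerations (not supported)
        0, 0)                   # yaw, yaw_rate (not used)
    vehicle.send_mavlink(msg)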

Performance

  1. Line following was reliable at all velocities up to 1.4 m/s at an altitude of 2m.
  2. The polaroid filter was effective at cutting out much surface reflection, permitting better performance when flying into the sun.
  3. At 1.75 m/s and 2m altitude, lock was lost occasionally when turning into the sun. This was due to more of the line passing out of the field of view, combined with the challenging surface reflections.  This is clearly seen in the on-board video above.
  4. Raising the altitude to 3m increased the field of view and allowed for reliable lapping of the track, even at 1.75 m/s.  However, the intention is to fly lower, not higher!
  5. A separate yaw test (full lock in loiter) indicated that despite being 3Kg, the maximum rate of rotation was not a limiting factor to following the track at higher speeds.

What Next?


To get lower and faster, we need to find a way to keep the track in the field of view. Some possibilities are:

  1. Use yaw more aggressively;
  2. Use a wide angle lens.


So back to the drawing board, but in the next field test maybe we'll target a velocity of 2 m/s at 1.5m altitude - so faster and lower.

More information on the Groundhog at www.mikeisted.com

Read more…

The Groundhog UAV - Curved line following


It's been a few weeks since tests were conducted of The Groundhog line-following a straight red line.  The Groundhog was an entry in the MAAXX Europe 2017 competition and although it didn't win, the lessons learned are being implemented so that it can be more competitive next year.

Time and weather have eventually converged to permit testing for the oval track - in this case a 50m oval comprising 50mm wide red webbing.

The test turned out to be quite successful, with following speeds of 1.5m/s achieved under autonomous control provided by an on-board Raspberry Pi 3.  This is significantly faster than the winning UAV in MAAXX Europe this year, which is quite pleasing!

The YouTube video shows both on-board and off-board camera footage, the former demonstrating the roaming regions of interest used by OpenCV to maintain a lock under varying lighting conditions.

https://youtu.be/UGCw27Atork

The next steps are to increase the speed still further and control the altitude more precisely.

More information on the Groundhog can be found at www.mikeisted.com.

Read more…


Several lessons were identified here from the entry of The Groundhog hexacopter in the MAAXX Europe competition earlier this year.

Current developments are around correcting the issues so that we get a UAV successfully lapping the oval track at a minimum average speed of 1m/s.

A number of changes in approach have been made from that previously blogged.  Recall the platform is based on a combination of Pixhawk/Raspberry Pi3/OpenCV/Dronekit.

Image analysis:

  1. The bird's-eye view image transformation in OpenCV was causing segmentation faults on the RPi.  Instead, the position and bearing of the detected line are calculated using straight trigonometry.
  2. Improvements were made to the ranging ROI bands to further speed up the frame rate.  This is now at a reported 50fps (which is faster than the PiCam is supplying them).

Control algorithms:

  1. The use of quaternions has been temporarily suspended in favour of control by velocity vectors.

As in MAAXX Europe, it makes sense to initially test on a straight line.  Initial testing was conducted outdoors using red seatbelt webbing for the line.  It was not possible to fly below about 2m as the propwash blew the line away (will sort that next time!).

Initial Testing (Links to YouTube Video).

https://youtu.be/YBgSgMYjt8s

Read more…


In this last post of the series I shall overview the main program including the control algorithms for the Groundhog.  Code is written in Python, using Dronekit and OpenCV all running on a Raspberry Pi 3.

As we are flying indoors without GPS and also without optical flow, we are using quaternions to control the vehicle in the GUIDED_NOGPS flight mode of ArduCopter.  To be honest, I've not come across anyone else doing this before, so it must be a good idea...

What's a quaternion?


There are many references to this - check out Wikipedia for starters.  However, a quaternion is just another way of specifying the attitude of a body in a frame of reference, other than the traditional yaw, pitch, roll.  Whilst it's slightly harder to get your head around, quaternions are fundamentally (and by that I mean mathematically) more sound.  For example, working with pitch, roll and yaw can lead to gimbal lock, in which the trig functions can cause divide-by-zero errors at the extremes (like multiples of 90 degrees).

With a quaternion, we specify the new attitude as a rotation around a vector - an axis of rotation.  Think about the vector describing the current attitude.  We can map that vector to one describing the target attitude by specifying a rotation (of so many degrees) around an axis of rotation.  (More generally in robotics, we add a translation as well, but that's another story).

I am a visual learner, so after doing the maths I really understood it using two cable ties (start and finish attitude vectors) and a barbecue stick (axis of rotation).  Try it.
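A tiny illustration using pyquaternion (the same library used in the code below): a 90-degree rotation about the z axis, i.e. a pure yaw, swings the x axis onto the y axis.

import numpy as np
from pyquaternion import Quaternion

q = Quaternion(axis=[0, 0, 1], angle=np.pi / 2)   # 90 degree rotation about z
print(q)                                          # the w, x, y, z components
print(q.rotate([1, 0, 0]))                        # -> approximately [0, 1, 0]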

So in short, we are going to send a stream of quaternions to the Pixhawk to tell it how to change attitude.  Fortunately some very clever people have written some functions for this...

Code


All code is posted to my GitHub repository.

Code overview


There are just three program files:

  1. follow.py
    The main program called from the control console.
  2. PiVideoStream.py
    A class to set up a threaded reading and image analysis of each frame read by the PiCam2.
  3. ST_VL6180X.py
    A class for reading the short-range laser ToF (time-of-flight) rangefinder.  Full credit to author A.Weber.

Calling the code


The main program is invoked either to connect directly to the Pixhawk or to connect to the software in the loop simulator (see previous post).  Navigate to the local folder on the RPi3 and open a console:

  • Connecting to the Pixhawk: >python follow.py
  • Connecting to the SITL: >python follow.py --connect "udp:192.168.0.xxx:14550"


In the latter instance, xxx is the address of the RPi on the local network.

follow.py

Libraries


Several standard libraries are used, some of which need to be installed using pip.  In particular, note imutils from Adrian Rosebrock's excellent blog at pyimagesearch.com.  I preferred to keep the VL6180X in the local folder rather than fully import it via pip.

# import the necessary packages

from dronekit import connect, VehicleMode, LocationGlobalRelative
from pymavlink import mavutil # Needed for command message definitions
import time
import numpy as np
from pyquaternion import Quaternion
from PiVideoStream import PiVideoStream
from imutils.video import FPS
from picamera.array import PiRGBArray
from picamera import PiCamera
import argparse
import imutils
import cv2
import sys
from ST_VL6180X import VL6180X
from datetime import datetime, timedelta

Pixhawk/RPi3 Connection String



Optionally, a connection string is used when the program is invoked.

The presence of a connection string is tested at several points in the program to decide, for example, whether to command a take-off if using the SITL (and obviously not if really flying!).

Note also the serial port specified on the RPi3 - "/dev/serial0".  This works because Bluetooth has been disabled as per my previous post.  The baud rate also has to be set in Mission Planner for the Pixhawk to connect at the same speed.

#--------------------------SET UP CONNECTION TO VEHICLE----------------------------------

# Parse the arguments
parser = argparse.ArgumentParser(description='Commands vehicle using vehicle.simple_goto.')
parser.add_argument('--connect',
help="Vehicle connection target string. If not specified, SITL automatically started and used.")
args = parser.parse_args()

connection_string = args.connect

# Connect to the physical UAV or to the simulator on the network
if not connection_string:
    print ('Connecting to pixhawk.')
    vehicle = connect('/dev/serial0', baud=57600, wait_ready=True)
else:
    print ('Connecting to vehicle on: %s' % connection_string)
    vehicle = connect(connection_string, wait_ready=True)

Initialise the Video Thread


Start the video thread.  This is straight out of pyimagesearch.com.  It is essential to use a separate thread to capture video on the RPi to get any useful performance, otherwise the main thread is held up by the slow camera IO.  Here, we are capturing and processing at around 20fps.  I decided to offload the image processing onto the video thread as well, just to compartmentalise all image operations off of the main thread.  Note that the RPi doesn't really run the threads in parallel, but a separate thread allows the main thread to get on with it while the video thread is waiting on IO (in this case).

#--------------------------SET UP VIDEO THREAD ----------------------------------

# created a *threaded *video stream, allow the camera sensor to warmup,
# and start the FPS counter
print('[INFO] sampling THREADED frames from `picamera` module...')
vs = PiVideoStream().start()
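The PiVideoStream class itself isn't reproduced in this post (it also carries the image analysis), but the threaded-capture pattern it is built on, after pyimagesearch, looks roughly like this hedged sketch; the class name is illustrative.

from threading import Thread
from picamera.array import PiRGBArray
from picamera import PiCamera

class ThreadedPiStream:
    def __init__(self, resolution=(320, 240), framerate=32):
        self.camera = PiCamera()
        self.camera.resolution = resolution
        self.camera.framerate = framerate
        self.rawCapture = PiRGBArray(self.camera, size=resolution)
        self.stream = self.camera.capture_continuous(
            self.rawCapture, format="bgr", use_video_port=True)
        self.frame = None
        self.stopped = False

    def start(self):
        # Run update() in its own thread so the main thread is never blocked by camera IO
        Thread(target=self.update, args=()).start()
        return self

    def update(self):
        for f in self.stream:
            self.frame = f.array          # image analysis could happen here too
            self.rawCapture.truncate(0)
            if self.stopped:
                self.stream.close()
                self.rawCapture.close()
                self.camera.close()
                return

    def read(self):
        # Return the most recently captured frame
        return self.frame

    def stop(self):
        self.stopped = True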

Initialise the ToF Sensor



So we have attached the VL6180X sensor to a rear arm with a view to keeping the Groundhog around 15cm from the floor.  The rangefinder is connected to the RPi directly using i2c - NOT the Pixhawk.  So the RPi will sense and control the altitude directly.

#--------------------------SET UP TOF RANGE SENSOR ----------------------------------

tof_address = 0x29
tof_sensor = VL6180X(address=tof_address, debug=False)
tof_sensor.get_identification()
if tof_sensor.idModel != 0xB4:
    print "Not a valid sensor id: %X" % tof_sensor.idModel
else:
    print "Sensor model: %X" % tof_sensor.idModel
    print "Sensor model rev.: %d.%d" % \
        (tof_sensor.idModelRevMajor, tof_sensor.idModelRevMinor)
    print "Sensor module rev.: %d.%d" % \
        (tof_sensor.idModuleRevMajor, tof_sensor.idModuleRevMinor)
    print "Sensor date/time: %X/%X" % (tof_sensor.idDate, tof_sensor.idTime)
tof_sensor.default_settings()
tof_sensor.change_address(0x29,0x80)
time.sleep(1.0)

Quaternion Control Functions


Control of the Pixhawk is effected using Dronekit (with Python).  As well as having its own set of commands (API), it provides an interface which encodes more directly to messages using the MAVLink protocol.

We are using the set_attitude_target message, which is almost the only method we have of controlling the Pixhawk indoors, without GPS or optical flow.  This allows us to encode a quaternion in the local frame to request a change in attitude.

Here's the low level function into which we feed the quaternion describing the change in attitude required.  Some understanding of how it works is necessary.  As I could find no documentation detailing the function, much of this has been gained by trial and error and may not be complete.

  • w, x, y, z: q, the normalised quaternion (so that w^2 + x^2 + y^2 + z^2 = 1).
  • thrust: 0.5 to stay put, higher to go up, lower to go down (max 1).
  • body roll and pitch rates: set to 1 to match the default setting in the Pixhawk.
  • body yaw rate: made equal to the requested yaw, otherwise generally no yaw was evident (no idea as to why).

#--------------------------FUNCTION DEFINITION FOR SET_ATTITUDE MESSAGE MODE--------------------
# Define set_attitude message
def set_att_msg_mode(w, x, y, z, thrust):
    msg = vehicle.message_factory.set_attitude_target_encode(
        0,             # time_boot_ms
        0,             # target system
        0,             # target component
        0b0000000,     # type mask
        [w, x, y, z],  # q
        1,             # body roll rate
        1,             # body pitch rate
        z,             # body yaw rate
        thrust)        # thrust
    vehicle.send_mavlink(msg)


The quaternion itself is calculated by a separate function, below.  This allows the more usual changes in roll, pitch and yaw to be converted.

Some useful observations were made using the SITL beforehand (running ArduCopter 3.4).

  1. Requested changes to yaw in the local frame were effected as expected.
  2. Requested changes to pitch and roll were momentarily effected, but quickly reversed as the Pixhawk re-assumed control to maintain flight stability (even in GUIDED_NOGPS flight mode).
  3. A stream of commands of at least 10Hz seemed necessary to maintain programmatic control e.g. to maintain a pitch of -10 degrees for forward motion (see the example after the functions below).

#--------------------------FUNCTION DEFINITION FOR SET_ATTITUDE MESSAGE --------------------
def set_attitude(pitch, roll, yaw, thrust):
    # The parameters are passed in degrees
    # Convert degrees to radians
    degrees = (2*np.pi)/360
    yaw = yaw * degrees
    pitch = pitch * degrees
    roll = roll * degrees

    # Now calculate the quaternion in preparation to command the change in attitude
    # q for yaw is rotation about the z axis
    qyaw = Quaternion(axis=[0, 0, 1], angle=yaw)
    qpitch = Quaternion(axis=[0, 1, 0], angle=pitch)
    qroll = Quaternion(axis=[1, 0, 0], angle=roll)

    # We have the components, now combine them into one quaternion
    q = qyaw * qpitch * qroll

    a = q.elements

    set_att_msg_mode(a[0], a[1], a[2], a[3], thrust)
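To illustrate the third observation above (a hedged usage sketch, not the competition code): holding a gentle forward pitch means re-sending the command continuously, here at roughly 10Hz using the functions just defined.

import time

# Stream the attitude request at 10Hz for three seconds: pitch -10 degrees,
# no roll or yaw, thrust 0.5 to hold altitude.
for i in range(30):
    set_attitude(-10, 0, 0, 0.5)
    time.sleep(0.1)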

Take off (SITL only)


This function is straight out of the Dronekit examples and required only if testing in the SITL.

#-------------- FUNCTION DEFINITION TO ARM AND TAKE OFF TO GIVEN ALTITUDE ---------------
def arm_and_takeoff(aTargetAltitude):
    """ Arms vehicle and fly to aTargetAltitude. """

    print ('Basic pre-arm checks')
    # Don't try to arm until autopilot is ready
    while not vehicle.is_armable:
        print ('Waiting for vehicle to initialise...')
        time.sleep(1)

    print ('Arming motors')
    # Copter should arm in GUIDED mode
    vehicle.mode = VehicleMode("GUIDED")
    vehicle.armed = True

    # Confirm vehicle armed before attempting to take off
    while not vehicle.armed:
        print ('Waiting for arming...')
        time.sleep(1)

    print ('Taking off!')
    vehicle.simple_takeoff(aTargetAltitude)  # Take off to target altitude

    while True:
        # print "Global Location (relative altitude): %s" % vehicle.location.global_relative_frame
        if vehicle.location.global_relative_frame.alt >= aTargetAltitude*0.95:
            break
        time.sleep(1)

Control Functions


The main program operates a very simple finite state machine with three states:

  • 'tracking' - the red line is tracked but no attempt is made to send attitude messages to the Pixhawk.
  • 'following' - the red line is tracked and full command is assumed unless the operator switches out of GUIDED_NOGPS mode or the line is lost.
  • 'lost' - a rotate and gain height algorithm is designed to re-acquire the red line.


Recall the code is all in my github repository.  Therefore, I will only work through the 'following' function here to avoid repetition.

  • The state variable vstate is set to 'following'.
  • Initialise flags and other variables.
  • You will see some commented-out lines to read the altitude if testing outdoors at a regular altitude.
  • Frame per second counter straight out of pyimagesearch.com, used as needed or commented out.
#-------------- FUNCTION DEFINITION TO FLY IN VEHICLE STATE FOLLOWING---------------------
def following (vstate):
    print vstate

    # The vehicle processes images and uses all data to fly in the following state.
    # It sends attitude messages until manual control is resumed.

    red1Good = red2Good = False  # Set True when returned target offset is reliable.
    yaw = roll = 0
    target = None  # Initialise tuple returned from video stream

    # altitude = vehicle.location.global_relative_frame.alt

    # Initialise the FPS counter.
    # fps = FPS().start()

  • So vstate controls how long we stay in the loop (i.e. following state)
  • Read image frame which returns yaw signal (-1 to 1) and roll signal (-1 to 1).
    Remember these are derived from the gradient of the line on the image and the intercept on the x axis.
  • red1Good et al are booleans to indicate that the values are good to be used.
  • Get current height and set thrust to adjust to between 160 and 200mm.  No point trying to be too precise.
    while vstate == "following":

        # grab the frame from the threaded video stream and return left line offset
        # We do this to know if we have a 'lock' (goodTarget) as we come off of manual control.
        target = vs.read()
        yaw = target[0]
        red1Good = target[1]
        roll = target[2]
        red2Good = target[3]

        # update the FPS counter
        # fps.update()

        # Get the altitude information.
        tofHeight = tof_sensor.get_distance()
        # print "Measured distance is : %d mm" % tofHeight

        # adjust thrust towards target
        if tofHeight > 200:
            thrust = 0.45
        elif tofHeight < 160:
            thrust = 0.55
        else:
            thrust = 0.5

  • First check the transmitter flight mode (we must still be in GUIDED_NOGPS).
  • Set yaw and roll as a multiple of the signals passed from image analysis.
  • Fix the pitch
  • Send the request to the Pixhawk.
  • Drop out of the loop if we have lost the lock.
        # Check if operator has transferred to autopilot using TX switch.
        if vehicle.mode == "GUIDED_NOGPS":
            # print "In Guided mode..."
            # print "Global Location (relative altitude): %s" % vehicle.location.global_relative_frame

            if (red1Good or red2Good):
                yaw = yaw * 100   # Set maximum yaw in degrees either side
                roll = roll * 20  # Set maximum roll in degrees either side
                pitch = -4
                # print pitch, yaw, roll, thrust
                set_attitude (pitch, roll, yaw, thrust)
            else:
                vstate = "lost"

        else:
            # print "Exited GUIDED mode, setting tracking from following..."
            vstate = "tracking"

    # Return the new state to the main loop.
    return vstate


We keep looping around this finite state machine in the main program.

# MAIN PROGRAM

vstate = "tracking" # Set the vehicle state to tracking in the finite state machine.

# If on simulator, arm and take off.
if connection_string:

    print ('Basic pre-arm checks')
    # Don't try to arm until autopilot is ready
    while not vehicle.is_armable:
        print ('Waiting for vehicle to initialise...')
        time.sleep(1)

    print ('Arming motors')
    # Copter should arm in GUIDED mode
    vehicle.mode = VehicleMode("GUIDED")
    vehicle.armed = True

    # Confirm vehicle armed before attempting to take off
    while not vehicle.armed:
        print ('Waiting for arming...')
        time.sleep(1)

    # Get airborne and hover
    arm_and_takeoff(10)
    print "Reached target altitude - currently in Guided mode on altitude hold"

    vehicle.mode = VehicleMode("GUIDED_NOGPS")

while True:

    if vstate == "tracking":
        # Enter tracking state
        vstate = tracking(vstate)
        # print "Leaving tracking..."

    elif vstate == "following":
        # Enter following state
        vstate = following(vstate)
        # print "Leaving following"

    else:
        # Enter lost state
        vstate = lost(vstate)
        # print "Leaving lost"

Performance

Low altitude hold


The short-range ToF sensor worked remarkably well and the height could be held at around the 20cm mark.  However, the turbulence resulting from the ground effect complicated the issues we were having with tracking.  Therefore, we decided to fly at a more normal altitude of 1.5m using the Pixhawk barometer.  This also worked surprisingly well.

Once the control and tracking are working satisfactorily, I would have no hesitation in using the VL6180X ToF sensor again.

Tracking


The image analysis and tracking worked very well.  It was clear that the PiCam was able to distinguish the red line more easily than many competitors' systems in the early stages.  No adjustments were made to the image analysis code during the competition, except to use the 'lost' mode to make the field of view as wide as possible to help find the red line.

Attitude Control


So this is where it didn't go so well - but we learned a lot of lessons!

Firstly, we were initially much too conservative with the maximum limits of pitch, roll and yaw.  This meant that there was little control at all and the turbulence from the ground effect simply sent the Groundhog off the line.  Our conservative approach was understandable, as the Groundhog is quite a formidable machine indoors and safety was paramount.  At least we had replaced the 12" carbon fibre props with 10" plastic ones!  It took the first day of the competition to become confident enough to raise the attitude limits and to realise we had to get off the ground - at least to start with.

The really big bugbear was yaw control.  We had assumed that the quaternion function would use the Pixhawk control algorithms to set the yaw accurately.  This turned out not to be the case.  Once we had increased the sensitivity for yaw, we managed a full lap of the track, albeit rather gingerly.  That at least put us in the top four teams!  However, we soon found the Groundhog would overshoot the target yaw and start an oscillation completely characteristic of a system with insufficient damping.  Recall the Groundhog weighs in at 3Kg?  We were now paying for that big time!

Of course, what we needed was a PID control loop, at least for the yaw control, which was the major problem. It was very frustrating watching the perfectly tracked line disappear increasingly off the side of the screen with every swing of the pendulum. Unfortunately, we simply ran out of time to introduce the necessary code in a way that felt safe.
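For reference, the missing piece is a simple PID loop along these lines - a hedged sketch, assuming the yaw error in degrees comes from the image analysis each frame, with placeholder gains that would need tuning in flight.

class PID:
    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = None

    def update(self, error, dt):
        # Accumulate the integral and estimate the derivative from the previous error
        self.integral += error * dt
        derivative = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

yaw_pid = PID(kp=0.6, ki=0.05, kd=0.25)   # illustrative gains only
# each frame: yaw_demand = yaw_pid.update(yaw_error_degrees, 1.0 / frame_rate)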

Lessons Learned


On the plus side:

  • The Groundhog proved a worthy airframe for the competition, despite its size;
  • The image analysis worked well, except for the intermittent segmentation faults caused by one particular function.  In particular, the 'ranging' regions of interest used to detect two points on the line and calculate its gradient were very successful.
  • The novel use of the short range ToF sensor for low altitude flight worked well;
  • Using GUIDED_NOGPS mode and the quaternion controller works, although more flight testing to ascertain the attitude limits would be useful.


On the minus side:

  • Yaw control in particular must be implemented with a PID controller;
  • The small (10") plastic props reduced the yaw rate quite considerably from the usual performance capability.  This may have contributed to the yaw oscillations and may need to be mitigated in the future.

Next Steps


The date for the next MAAXX Europe event has already been set for March 2018 and the Groundhog will be there.  As a minimum, it will be equipped with the necessary PID controllers and will at last be able to fly at its cruising altitude of 20cm, hopefully for 30 minutes at a time!  It might also have some other tricks up its sleeve by then...

Maybe see you there.

Read more…


In this short blog series I'm outlining the hardware and software of The Groundhog, my entry into the recent MAAXX-Europe autonomous drone competition held at the University of the West of England, Bristol.

Connecting the Raspberry Pi 3 to the Pixhawk took quite some working out, so I am hoping that by publishing my own step by step checklist, it may help others save a little time.

Code


All code is posted to my GitHub repository.

Connecting the Pi to the Pixhawk


I have previously blogged on how to connect a Pixhawk (running arducopter) to a Raspberry Pi 2 using the UART interface.  The hardware connection is identical for the Raspberry Pi 3.  However, there are some critical differences in setting up the Pi and in the code required to make the connection.

Configuring the Pi


Here is my installation checklist for installing Dronekit on the Pi and making the connection between the Pi and Pixhawk.

Start with Pixel installed on the Pi3 and do basic Pi configuration and updates:

Install Jessie Pixel
sudo raspi-config
Expand filesystem
Disable serial for OS
Reboot
sudo apt-get update


Now start to set up the development environment:

sudo easy_install pip
sudo apt-get install python-dev
sudo apt-get install screen python-wxgtk2.8 python-matplotlib python-opencv python-numpy libxml2-dev libxslt-dev


Install pymavlink and mavproxy to provide comms protocols between the Pi and Pixhawk.

sudo pip install pymavlink 
sudo pip install mavproxy


Install Dronekit and libraries to simplify control of the Pixhawk.

sudo pip install dronekit
git clone http://github.com/dronekit/dronekit-python.git


Now sort out the clash on the RPi3 between our need to use the UART and its use by the Bluetooth interface.  There are several versions of this config change about, but this was the one that worked for me.

On the RPi, add two lines at the end of /boot/config.txt (this disables Bluetooth so that you can use /dev/ttyAMA0 to make the connection):

enable_uart=1
dtoverlay=pi3-disable-bt


So now we switch to the Pixhawk, connected to a PC using Mission Planner.  The firmware version needs to be at least 3.4.0, which allows for GUIDED_NOGPS mode.  We also need to make sure Telem2 is properly configured, which we do from the Full Parameters list.

Check pixhawk firmware is 3.4.0
Check Serial 2 parameters:
SERIAL2_PROTOCOL '1'
SERIAL2_BAUD '57'
BRD_SER2_RTSCTS '0'


Now we can make the physical connection between the Pi and Pixhawk and test it out.

Note on power:  At this point, I had the RPi and Pixhawk independently powered, so the 5V connection between the two units was disconnected.  As long as they have a common ground, all is good.

Make physical connection

Test connection.  On the Pi:
sudo -s
mavproxy.py --master=/dev/ttyAMA0 --baudrate 57600 --aircraft MyCopter
param show ARMING_CHECK
watch HEARTBEAT (You should see mavlink heartbeat messages going both ways, every second)
Ctrl-C
Reboot RPi


If you can see heartbeat messages during the test, like those below, all is good!

> HEARTBEAT {type : 6, autopilot : 8, base_mode : 0, custom_mode : 0, system_status : 0, mavlink_version : 3}
< HEARTBEAT {type : 6, autopilot : 8, base_mode : 0, custom_mode : 0, system_status : 0, mavlink_version : 3}
< HEARTBEAT {type : 13, autopilot : 3, base_mode : 81, custom_mode : 0, system_status : 3, mavlink_version : 3}

Setting up the Software in the Loop Simulator (SITL)


Beyond making the basic connection work, I found the SITL crucial to allow the developing software to be frequently tested.  Basically, the RPi needs something to connect to on the desk, and that cannot be the Pixhawk itself.

As a rudimentary test environment to make sure the control commands are in the right ballpark, I found the SITL to be great.  Expecting any more than that might be a little optimistic.  I confess there is much possible configuration of the SITL itself which I have not played with (that's my excuse!).  But for example, don't expect the SITL to replicate anything approaching the inertia of a 3Kg hexacopter travelling at 40mph.

I installed the SITL on a PC, connected to the Pi3 through a standard home network.  The ardupilot site has instructions for the build and installation here, which worked after several attempts.  However, all the issues were down to me not following the instructions precisely.
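Once the SITL is up, a quick sanity check from the RPi console is to connect with Dronekit and print a little telemetry - a hedged sketch, with the xxx in the address standing for your SITL machine's IP on the local network, as in the connection string used elsewhere in this series.

from dronekit import connect

vehicle = connect("udp:192.168.0.xxx:14550", wait_ready=True)
print("Firmware: %s" % vehicle.version)
print("Mode:     %s" % vehicle.mode.name)
print("Attitude: %s" % vehicle.attitude)
vehicle.close()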

See more posts at mikeisted.com

Read more…


In this short blog series I’m outlining the hardware and software of The Groundhog, my entry into the recent MAAXX-Europe autonomous drone competition held at the University of the West of England, Bristol.

In this post I shall overview the approach taken to the image recognition system used to track the line being followed around the track.  Remember the line is red, about 50mm across and forms an oval track 20m by 6m.  We are attempting to race around as fast as we can, avoiding other UAVs if necessary.

Please note this blog is not a line-by-line treatment of the code; indeed, the code provided is neither tidied nor prepared as if for ‘instruction’.  In fact it is the product of much trial and change over the competition weekend!  Nevertheless, I hope it provides some useful pointers for the key ideas developed.

Approach

Image recognition is undertaken by the Raspberry Pi 3 companion computer using Python 2.7 and OpenCV 3.0.   I am indebted to Adrian Rosebrock of pyimagesearch for his many excellent blog posts on setting up and using OpenCV.

So we are trying to follow a red line which forms an oval track about 20m x 6m.  The line is about 50mm across.  However, we have limited processing power with the on-board Raspberry Pi, so we need to minimise the load.  Here’s the approach:

When following the line:

  1. Present a steady and level image by use of a roll/pitch servo gimbal.  This is driven directly by the Pixhawk.
  2. Reduce the image resolution to 320 x 240 pixels to reduce computational load.
  3. Convert the image to a top-down view using inverse perspective mapping.  This will help us accurately measure the angle of the line ahead.
  4. Blur the image and filter for the desired colour (red) using a mask.  This leaves us with the line in white against a black background.
  5. Apply a ‘band’ region of interest across the top of the image.  Use contours to identify where the line being followed intersects this band, giving us one coordinate (see the sketch after these lists).
  6. Do the same along the bottom of the image, giving us another coordinate.
  7. Use the coordinates to calculate and return the bearing of the line and intercept on the x-axis.
  8. The bearing is the signal for yaw, the intercept is the signal for roll.

If the line is lost:

  1. Return the bearing of any part of the line that can be located.

Additional features:

  1. Maintain a 5 frame moving average for the bearing and offset to reduce noise.
  2. Maintain a ‘confidence’ level to assess the quality of the lock, and thus provide a means to establish whether it has been lost.
  3. Rather than being fixed, the two region-of-interest bands range from the bottom and top of the image until they find the line.  This gives the best chance of locating the line in sub-optimum conditions, whilst still giving a valid bearing and offset.
  4. The image capture and processing code is implemented as a class running in a separate thread on the RPi.  This permits much more efficient use of the RPi and again credit is given directly to Adrian Rosebrock at pyimagesearch.
  5. When following, the width of the image is variable, widening as the bearing increases to try to keep the line in view.
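A hedged sketch of the core of steps 4 to 6 above - blur, mask for red in HSV and find where the line crosses a horizontal band.  The thresholds and function name are illustrative, not the competition values.

import cv2
import numpy as np

def line_x_in_band(frame_bgr, y0, y1):
    # Take one horizontal band of the frame and blur it
    band = cv2.GaussianBlur(frame_bgr[y0:y1, :], (5, 5), 0)
    hsv = cv2.cvtColor(band, cv2.COLOR_BGR2HSV)
    # Red wraps around the hue axis, so combine two ranges
    lower1, upper1 = np.array([0, 120, 70]), np.array([10, 255, 255])
    lower2, upper2 = np.array([170, 120, 70]), np.array([180, 255, 255])
    mask = cv2.inRange(hsv, lower1, upper1) | cv2.inRange(hsv, lower2, upper2)
    # Largest contour in the band is assumed to be the line
    contours = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                cv2.CHAIN_APPROX_SIMPLE)[-2]
    if not contours:
        return None
    c = max(contours, key=cv2.contourArea)
    m = cv2.moments(c)
    if m["m00"] == 0:
        return None
    return int(m["m10"] / m["m00"])   # x coordinate of the line in this band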

Competition Evaluation

The approach worked well during the competition.  In fact, it was clear that the ability of the Groundhog to detect and measure the line exceeded that of most other competitors.  Other notable points were:

  • The returned bearing and offsets seemed sensible;
  • The lock was maintained reliably whenever the line was in view, even if only partially.
  • Performance ranged continuously from around 13fps (widest image) to 19fps (narrowest view).

However:

  • The 5 point moving average for the bearing and offset seemed to produce noticeable lag.  As the lock seemed very reliable without it, the moving average was removed later in the competition, which seemed to improve the response times.
  • The optimum camera angle was difficult to achieve.  Furthermore, moving the camera angle changes its perspective and hence the homography used for the inverse perspective mapping exercise.  Calculation of the homography is fixed in the code and so does not take account of this change, thus creating error in the calculated angles.  Ideally the homography would be calculated from the camera angle dynamically (see the sketch after this list).
  • Repeated application of the homography to warp the image caused frequent segmentation faults.  This remained a problem throughout the competition and I suspect was due to an imperfect compilation of OpenCV.
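For anyone unfamiliar with the idea, inverse perspective mapping boils down to a single homography warp, roughly as in this hedged sketch.  The four source points are wherever a known ground rectangle appears in the camera image (the values below are illustrative), and they are the part that would ideally be recomputed whenever the camera angle changes.

import cv2
import numpy as np

# Trapezoid in the camera image that corresponds to a rectangle on the ground
src = np.float32([[130, 120], [190, 120], [300, 240], [20, 240]])
# Where that rectangle should sit in the top-down view
dst = np.float32([[110, 0], [210, 0], [210, 240], [110, 240]])
H = cv2.getPerspectiveTransform(src, dst)

def top_down(frame):
    # Warp a 320x240 frame into the top-down view
    return cv2.warpPerspective(frame, H, (320, 240))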

Image Sequence


The unedited image of a test line on my desk. Obviously the parallel lines run towards a vanishing point.

This view is after the image has been warped to get the ‘top-down’ view.  It also shows the upper and lower regions of interest, with a successful lock of the line in each.

Image Processing Code

Please see my website post here.

The code is also to be placed shortly on GitHub and I'll edit this post as soon as that happens.

Read more…

Post 2. MAAXX-Europe. The Groundhog Hardware


The Groundhog was at least twice as big and probably three times as heavy as many other competitors.  Why?  Because it is built for endurance (flight time 35mins+) and also because it's what I have as my development platform.  It normally flies outdoors of course...

Ah, so that means no GPS, and flying less than 30cm from the ground also rules out an optical flow camera (they can't focus that close).  So how to control this thing?

The answer (as you will see) is to use the GUIDED_NOGPS mode now available in Arducopter but, I suspect, little used.  This is because almost the only control available is to set the pose (pitch, roll, yaw) using quaternions.  Not velocity, not even altitude.  It's going to be a little like controlling a 3Kg ball bearing sat on top of a football.  Fun.  But we are getting ahead of ourselves.  Back to hardware...

Specs:

  • Mass - 3.0 Kg
  • Diameter - 0.98 m (with guards and reduced props to fit competition regs).
  • Airframe - HobbyKing Tarot 680 Pro
  • Motors - Multistar Elite 3508-268KV High Voltage Endurance
  • Props - 10 x 4.5 plastic (normally runs 12 x 5 carbon fibre)
  • ESC - Afro HV 20A MultiRotor ESC High Voltage 3~8s
  • Batteries - 2 x Multistar High Capacity 4S 8000mAh Multi-Rotor Lipo Pack (in series)
  • Flight Controller - Pixhawk running ArduCopter v3.4
  • Companion computer - Raspberry Pi 3
  • Camera - PiCam 2 with 9g servo gimbal for roll and pitch.
  • Receiver - FrSky 8BR
  • Ancillary power - 2 x Turnigy HV SBEC 5A Switch Regulator (8-42V input)
  • Maxbotix sonar (I2C)
  • Sparkfun VL6180 TOF Range Finder (I2C)
  • Prop guards and superstructure - 3D printed and 4mm pultruded carbon fibre tubing.

Airframe

The hexacopter Tarot 680 Pro is my general purpose airframe for just about everything.  Here the props are turned to face downward to get them closer to the ground and increase ground effect.  Smaller 10 inch props were fitted to bring the overall size down to less than 1m, as required by the competition.  This meant the motors were running faster than usual, reducing efficiency.  Hey ho.

Power Train

The two 4s batteries are configured to run in series, so 8s is provided to the high efficiency motors which are designed for endurance.  The power module for the Pixhawk was taken from just one battery - making sure that they had a common earth!

Power for the RPi and servo gimbal is provided by 2 x SBECs, taking power from across both batteries.  The feed for the SBECs is taken from across the supply of one of the speed controllers.

Key lessons:

  • Don't involve the Pixhawk power rail in any way to power the RPi or servos.  It may brown out.
  • Use one SBEC for the servos.
  • Use a separate SBEC for the RPi.

Pixhawk

Flight control is provided by the Hobbyking version of a standard Pixhawk.  It is attached upside-down to the underside of the top plate (underneath the RPi).  Don't forget to reconfigure Arducopter in Mission Planner to tell it the Pixhawk is upside-down (mode 12).

Raspberry Pi 3

In earlier blogs I had overclocked the RPi 2 to beef up performance.  This is not required for the RPi 3, which is a big improvement and is used out of the box with the Pixel operating system.  However, configuring the serial port (for connection to the Pixhawk) correctly took ages, as it is also used by the Bluetooth interface.  Pay close attention in the forthcoming post to the change in the config file required and then the correct syntax to open the serial port and it will be easy.  I'll put this in the post on Control, but it took weeks and weeks to eventually sort out!

The RPi is connected to the Pixhawk using the serial cable in precisely the same way as detailed in one of my previous posts here.

Sensors

Vision

A standard PiCam 2 is stabilised for pitch and roll using a gimbal driven from the Pixhawk.  This gives a steady and level view of the track to reduce the image processing burden.  It worked beautifully, and incidentally is exactly the same strategy used by the 2nd place team (so I know it's a good idea!).  Also, we want a forward looking camera so we can see further and go faster - not just looking down.


Altitude

For low level flight, a VL6180 time-of-flight rangefinder more usually used for gesture recognition is deployed, strapped to one of the front arms.  This can give altitude to mm resolution between 10 and 20 cm.  It is connected to the I2C interface on the RPi, as there is no provision for it in ArduCopter on the Pixhawk.


That's it for now (I'll discuss the Maxbotix later).

Coming next: Image recognition using Python and OpenCV.

Read more…


The Challenge

I recently entered the first MAAXX-Europe competition to be held at the University of the West of England (UWE).  On the surface it's a simple challenge in which UAVs must follow an oval circuit marked by a red line, around 20m by 6m.  However, this proved to be anything but simple and, with few rules about how many could be on the circuit at the same time...  you get the idea!

So before you read further, I didn't win (I came 4th of 6, which I am pleased with as a hobbyist up against university teams).  However, my hexacopter did complete a lap and I now know exactly what worked really well and what didn't!  And it's that knowledge I wish to share as part of the payback for all the open-source community help I have had in the last year.

So this might be of interest to you if you are interested in UAV image recognition, control systems and the systems integration of a companion computer (Raspberry Pi alongside a Pixhawk).

Big thanks also to Steve Wright of UWE and colleagues for laying on the event, which, together with the UK UAV Challenge held annually at the UK Drone Show, promises to greatly accelerate our capabilities in this expanding market.

The Idea

To build an endurance UAV designed for high efficiency, flying 20cm from the floor.  This would:

  • help win the endurance prize for the most completed laps - even going very slowly, a 30min+ flight time between battery changes would build up lots of laps;
  • improve efficiency still further (by 25%+) by utilising ground effect;
  • avoid the carnage above, given that several UAVs might be flying at once, and remove the need for collision avoidance strategies.

Because it was built to run at low altitude and is of fulsome size, when asked at the competition it was immediately named 'The Groundhog'.  Great concept, great potential - what could go wrong?

Blog Areas

What I'm covering:

  1. The hardware (airframe, avionics, etc.)
  2. Image recognition.
  3. Control.
  4. Lessons learned and what's next.

I'll deal with each area, post the code (warts and all) and evaluate the effectiveness of each.


The track and The Groundhog in action.  Line recognition was good - both accurate and reliable.  Except for the segmentation faults!

Read more…