Line follower project using a machine learning approach on Raspberry Pi and Pixhawk

The aim of this project is to have a quadcopter follow a line based on the classification of images taken by a Pi Camera and run through a simple machine learning algorithm on a Raspberry Pi. I'm using an F450 quadcopter frame, a Pixhawk running the latest ArduCopter firmware, a Raspberry Pi 3 Model B and an 8 MP Pi Camera. The motivation for this project is a paper by A. Giusti et al. in IEEE Robotics and Automation Letters, "A Machine Learning Approach to Visual Perception of Forest Trails for Mobile Robots". A video of their project can be found here: https://www.youtube.com/watch?v=umRdt3zGgpU

How it works:

The quadcopter carries a single Pi Camera, connected to the Raspberry Pi and pointed down at the track. OpenCV converts the video feed into individual frames, and the preprocessed images are run through a supervised machine learning algorithm on the RPi. The algorithm is a simple artificial neural network that classifies each image into one of 3 classes: track in the center of the image, track to the left, or track to the right. The network has already been trained on a PC, and the final weights obtained after training are applied to the pixel values of the processed images on the RPi to predict each image's class (left, center or right). A rough sketch of the pipeline is below.
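Here is a minimal sketch of that pipeline; the weights file (weights.npz), the 32x24 input size and the tanh hidden layer are illustrative assumptions, not the actual network I trained:

import cv2
import numpy as np

# Pretrained weights and biases exported from the training run on the PC
params = np.load("weights.npz")
W1, b1 = params["W1"], params["b1"]   # hidden layer
W2, b2 = params["W2"], params["b2"]   # output layer (3 classes)

CLASSES = ["left", "center", "right"]

def classify(frame):
    """Downscale, grayscale, flatten, and run one forward pass."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    small = cv2.resize(gray, (32, 24))            # keep the input tiny for the RPi
    x = small.astype(np.float32).ravel() / 255.0  # normalize pixels to [0, 1]
    h = np.tanh(W1 @ x + b1)                      # hidden activations
    scores = W2 @ h + b2                          # raw class scores
    return CLASSES[int(np.argmax(scores))]

cap = cv2.VideoCapture(0)   # Pi Camera through the V4L2 driver
while True:
    ok, frame = cap.read()
    if not ok:
        break
    print(classify(frame))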

3 GPIO pins on the RPi are used to communicate where the track lies in the image. If the track is on the left of the image, the GPIO pin reserved for the 'left' class outputs binary 1 and the other two pins output 0; if the track is at the center, the 'center' pin reads 1 and the other two read 0, and likewise for 'right'. A short sketch follows.
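Something like this, with BCM pin numbers 17, 27 and 22 as placeholders for whichever free pins I end up wiring:

import RPi.GPIO as GPIO

PINS = {"left": 17, "center": 27, "right": 22}

GPIO.setmode(GPIO.BCM)
for pin in PINS.values():
    GPIO.setup(pin, GPIO.OUT, initial=GPIO.LOW)

def signal_class(label):
    """Drive the pin for the predicted class high and the other two low."""
    for name, pin in PINS.items():
        GPIO.output(pin, GPIO.HIGH if name == label else GPIO.LOW)

signal_class("center")   # e.g. the track is centered in the frame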

Based on the binary values at the 3 GPIO pins, throttle, pitch and yaw commands are to be sent using DroneKit and MAVLink. For example, if the quad has veered off to the left, the track will appear on the right of the image; the classifier will label it 'right', and the GPIO pin for the 'right' class will read 1. When that pin reads 1, a small fixed yaw command should steer the copter back to the right, over the track. A sketch of this mapping is below.
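As a rough sketch (assuming GUIDED mode; the connection string, the 10-degree step and the yaw speed are placeholder values I'd still have to tune), the yaw correction could use the standard MAV_CMD_CONDITION_YAW message through DroneKit:

from dronekit import connect
from pymavlink import mavutil

vehicle = connect("/dev/ttyAMA0", baud=57600, wait_ready=True)

def condition_yaw(angle_deg, direction):
    """Yaw by angle_deg relative to the current heading (1 = CW, -1 = CCW)."""
    msg = vehicle.message_factory.command_long_encode(
        0, 0,                                   # target system, component
        mavutil.mavlink.MAV_CMD_CONDITION_YAW,  # MAVLink yaw command
        0,                                      # confirmation
        angle_deg,                              # param 1: yaw angle (deg)
        10,                                     # param 2: yaw speed (deg/s)
        direction,                              # param 3: 1 = CW, -1 = CCW
        1, 0, 0, 0)                             # param 4: 1 = relative offset
    vehicle.send_mavlink(msg)

def correct(track_class):
    """Map the classifier output to a small fixed yaw correction."""
    if track_class == "right":    # quad drifted left, track appears right
        condition_yaw(10, 1)
    elif track_class == "left":   # quad drifted right, track appears left
        condition_yaw(10, -1)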

I am doing this as my final year project. I'm new to MAVProxy and DroneKit, but I've set them up on my RPi and can send simple commands like arm/disarm from the MAVProxy command line. However, I need help understanding DroneKit well enough to write this entire autonomous mission. I've had a look at other projects like the Red Balloon Finder, but the code is all too overwhelming for a newbie like myself. I know Python, but this is my first time using a companion computer with a Pixhawk. I'm currently reading through the DroneKit and MAVProxy documentation, but it's a lot to grasp in a short time. I'd be really glad if any DroneKit/ArduPilot devs would be willing to help me with the coding.

I'm looking to get some help from anyone who's worked with DroneKit or MAVProxy. Any other feedback or alternative ideas on how I can do this mission are also welcome.


Replies to This Discussion

This is super cool. We do much the same thing with the simple OpenMV Cam in rovers. It's a very small, light single-board camera computer that has line following built in and speaks native MAVLink.

Wow, that's neat. The OpenMV Cam looks well suited to this project. Wish I'd heard of it earlier. I'll consider using it for one of my future projects.

I've designed simple Arduino- and Atmel-based line follower bots before using infrared sensors. I thought I could apply the same idea to drones, but I didn't realise how extensive MAVLink and DroneKit were, or how complicated writing code for an autonomous mission with them would be. I'd really appreciate some help with writing DroneKit missions.

Basically, you just want your line follower code on the Pi to spit out angular offsets based on how off-center the line is, PID those into left/right corrections for the drone, and send them to the Pixhawk as RC override commands with DroneKit. Something like the sketch below.
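Here's a bare-bones version of that idea: a PID loop turning the line's angular offset into a roll override on channel 1. The gains, the 1500 µs neutral point and the ±100 µs authority limit are placeholders to tune, not working values:

import time
from dronekit import connect

vehicle = connect("/dev/ttyAMA0", baud=57600, wait_ready=True)

Kp, Ki, Kd = 2.0, 0.1, 0.5      # tune these on the bench first
integral, prev_error = 0.0, 0.0
prev_time = time.time()

def pid_override(offset_deg):
    """offset_deg: how far off-center the line is (+ = line to the right)."""
    global integral, prev_error, prev_time
    now = time.time()
    dt = max(now - prev_time, 1e-3)
    integral += offset_deg * dt
    derivative = (offset_deg - prev_error) / dt
    correction = Kp * offset_deg + Ki * integral + Kd * derivative
    prev_error, prev_time = offset_deg, now

    pwm = int(1500 + max(-100, min(100, correction)))   # clamp the authority
    vehicle.channels.overrides = {'1': pwm}             # roll channel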

Yes, thank you. I just read the channel overrides section in the DroneKit documentation (it's really well written). I was looking at writing a guided mission by defining the velocity vectors and yaw angle, but will look into this approach as well.
Two questions:
1) If I use RC_Override, will I still have the ability to regain control of the quad using the transmitter in case the mission goes awry?
2) Considering I have a project deadline end of May, would it be quicker/easier to write a guided mission or to use the RC_Override approach?

1) yes

2) the second

This is exactly what the MAAXX Europe competition is about. My blog on this starts here and links to other resources on hooking up an RPi 3 to a Pixhawk using DroneKit and Python.

Hope it helps :-)

Thanks Chris, I'll have a look at both approaches.

Wow, thanks Mike! This is exactly what I'm trying to do, except using a machine learning approach. I'll have a good read through your blog. The DroneKit code will save me a lot of time.

The MAAXX Europe Competition sounds really interesting. Maybe I'll enter after my project and see you there :)

For information, the winning UAV at MAAXX Europe 2017 used channel overrides.

Interesting. I thought using guided commands would be the right way to do it, since channel overrides are discouraged in the DroneKit documentation. Maybe someone could elaborate on the pros and cons of RC overrides versus guided commands.

I think channel overrides are seen as an easy and effective way to do it - they also work indoors, where you don't have GPS. Yes, the DroneKit docs do discourage using them...

So I think your choices are:

1. Channel overrides. These are intuitive as they replicate manual flying. They work with no GPS and are easy to implement. But I guess you need to calibrate the attitude response against the channel PWM range, etc.

2. Velocity vectors. You need GPS or optical flow for this. I think these only work in NED space, so the maths is a little more complicated (you have to convert from the body frame to NED; see the sketch at the end of this reply). The upside is that it gives you 'absolute' control of velocities.

3. Quaternions. These will work without GPS or optical flow, but they're not well documented and are difficult to implement. Fun if you want to try something new, but probably not recommended on a deadline.

Given a short deadline and the interface to machine control, I'd go for channel overrides. Given the simplicity of the interface to the neural network, don't forget you will also need PID controllers, or you'll end up suffering as I did at the MAAXX Europe competition! On that front, some may question whether an RPi is reliable as a PID controller, given that Linux is not a real-time OS...
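For option 2, this is roughly what I mean by converting from the body frame to NED before sending velocities (the connection string and speeds are placeholders; the type mask enables only the velocity fields):

import math
from dronekit import connect
from pymavlink import mavutil

vehicle = connect("/dev/ttyAMA0", baud=57600, wait_ready=True)

def send_body_velocity(vx_fwd, vy_right, vz_down=0.0):
    """Rotate body-frame m/s into NED and send one velocity target (resend regularly)."""
    yaw = vehicle.attitude.yaw                       # radians, from the EKF
    vn = vx_fwd * math.cos(yaw) - vy_right * math.sin(yaw)
    ve = vx_fwd * math.sin(yaw) + vy_right * math.cos(yaw)
    msg = vehicle.message_factory.set_position_target_local_ned_encode(
        0, 0, 0,                                     # time_boot_ms, target ids
        mavutil.mavlink.MAV_FRAME_LOCAL_NED,         # NED frame
        0b0000111111000111,                          # type_mask: velocities only
        0, 0, 0,                                     # x, y, z positions (unused)
        vn, ve, vz_down,                             # NED velocities (m/s)
        0, 0, 0,                                     # accelerations (unused)
        0, 0)                                        # yaw, yaw_rate (unused)
    vehicle.send_mavlink(msg)

send_body_velocity(1.0, 0.2)   # creep forward, drifting slightly right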

Thank you very much for that intuitive reply, Mike. That helps clear things up a lot. I'll have a look at PID as well. Your blog's provided some useful insights for my project.

I'll be sure to make a blog post of my own about this project once I'm done.
