The aim of this project is to have a quadcopter follow a line, based on the classification of images taken by a Pi Camera and run through a simple machine learning algorithm on a Raspberry Pi. I'm using an F450 quadcopter frame, a Pixhawk running the latest ArduCopter firmware, a Raspberry Pi 3 Model B and an 8MP Pi Camera. The motivation for this project is a paper by A. Giusti et al. in IEEE Robotics and Automation Letters, "A Machine Learning Approach to Visual Perception of Forest Trails for Mobile Robots". A video of their project can be found here: https://www.youtube.com/watch?v=umRdt3zGgpU

How it works:

The quadcopter carries a single Pi Camera, connected to the Raspberry Pi and pointed down at the track. The video feed from the Pi Camera is converted to frames using OpenCV, and the preprocessed images are run through a supervised machine learning algorithm on the RPi. The algorithm is a simple artificial neural network that classifies each image into one of three classes: track in the center of the image, track to the left, or track to the right. The network has already been trained on a PC; the final weights obtained after training are applied to the pixel values of the processed images on the RPi to predict each image's class (left, center or right).
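
For illustration, here is a minimal sketch of what the capture-and-classify loop might look like. The weight file names, the single-hidden-layer shape and the 32x24 input size are assumptions for the sketch, not details from this project; the sigmoid forward pass just mirrors the "final weights applied to pixel values" idea described above:

```python
# Minimal capture-and-classify loop (illustrative; file names, network
# shape and input size are assumptions, not from the original post).
import cv2
import numpy as np

W1 = np.load('weights_hidden.npy')   # input -> hidden weights, trained on a PC
W2 = np.load('weights_output.npy')   # hidden -> output weights (3 classes)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def classify(frame):
    """Downscale, flatten and normalise a frame, then run the forward pass.
    Returns 'left', 'center' or 'right'."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    small = cv2.resize(gray, (32, 24))               # keep the input layer small
    x = small.flatten().astype(np.float32) / 255.0   # pixel values in [0, 1]
    output = sigmoid(W2.dot(sigmoid(W1.dot(x))))
    return ('left', 'center', 'right')[int(np.argmax(output))]

cap = cv2.VideoCapture(0)   # Pi Camera via the V4L2 driver (bcm2835-v4l2)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    label = classify(frame)
```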

Three GPIO pins on the RPi are used to communicate where the track lies in the image. If the track is on the left of the image, the GPIO pin reserved for the left class outputs binary 1 and the other two pins read 0; if the track is at the center, the pin reserved for the center class reads 1 and the other two read 0, and likewise for the right class.
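
As a sketch, signalling the winning class on three output pins could look like this with the RPi.GPIO library (the BCM pin numbers 17/27/22 are arbitrary examples):

```python
# One output pin per class; exactly one pin is high at any time.
# BCM pin numbers are arbitrary examples.
import RPi.GPIO as GPIO

PINS = {'left': 17, 'center': 27, 'right': 22}

GPIO.setmode(GPIO.BCM)
for pin in PINS.values():
    GPIO.setup(pin, GPIO.OUT, initial=GPIO.LOW)

def signal_class(label):
    """Report the classifier's decision on the three GPIO pins."""
    for name, pin in PINS.items():
        GPIO.output(pin, GPIO.HIGH if name == label else GPIO.LOW)
```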

Based on the binary values on the three GPIO pins, throttle, pitch and yaw commands are sent using DroneKit and MAVLink. For example, if the quad has veered off to the left, the track will appear on the right of the image. The machine learning algorithm will classify it as 'right' and the GPIO pin for the right class will read 1. While that pin reads 1, a small fixed yaw value should be sent to steer the copter back to the right.
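
A sketch of that last step: the condition_yaw helper below follows the MAV_CMD_CONDITION_YAW pattern from the DroneKit guided-mode examples, while the connection string, baud rate and the 10-degree step are assumptions, and the class label is used directly here rather than read back off the GPIO pins:

```python
# Yaw nudge via MAV_CMD_CONDITION_YAW (pattern from the DroneKit examples).
# Connection string, baud rate and the 10-degree step are assumptions.
from dronekit import connect
from pymavlink import mavutil

vehicle = connect('/dev/ttyAMA0', baud=57600, wait_ready=True)

def condition_yaw(heading, relative=True):
    """Yaw by `heading` degrees; positive is clockwise (to the right)."""
    msg = vehicle.message_factory.command_long_encode(
        0, 0,                                    # target system, component
        mavutil.mavlink.MAV_CMD_CONDITION_YAW,   # command
        0,                                       # confirmation
        abs(heading),                            # param 1: yaw angle (degrees)
        0,                                       # param 2: yaw speed (deg/s)
        1 if heading >= 0 else -1,               # param 3: 1 = cw, -1 = ccw
        1 if relative else 0,                    # param 4: relative to current
        0, 0, 0)                                 # params 5-7 (unused)
    vehicle.send_mavlink(msg)

def steer(label):
    """Small fixed yaw step back towards the track."""
    if label == 'right':     # track is right of centre, so yaw right
        condition_yaw(10)
    elif label == 'left':
        condition_yaw(-10)
```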

I am doing this as my final year project. I'm new to MAVProxy and DroneKit, but I have been able to set them up on my RPi and can send simple commands like arm/disarm using MAVProxy on the command line. However, I need help understanding and using DroneKit to write this entire autonomous mission. I've had a look at other projects like the Red Balloon Finder, and the code seems all too overwhelming for a newbie like myself. I know Python, but this is my first time using a companion computer with a Pixhawk. I'm currently reading through the DroneKit and MAVProxy documentation, but it's a lot to grasp in a short time. I'd be really glad if any DroneKit/ArduPilot devs would be willing to help me with the coding.

I'm looking to get some help from anyone who's worked with DroneKit or MAVProxy. Any other feedback or alternative ideas on how I can do this mission are also welcome.

Replies

  • Yes, check out Mike Isted's blog for a more detailed description of how to do this. The code is also explained there.

    https://mikeisted.wordpress.com/

    (Posts 1 to 5 cover the MAAXX Europe challenge.)

    Mike Isted
    Director, Aisymmetrix. Intelligent machines.
  • Was this project ever completed? 

    I'm trying to do the same thing as a side project and was hoping you could help me with the code for detecting and then following the line.

  • Thank you very much for that intuitive reply, Mike. That helps clear things up a lot. I'll have a look at PID as well. Your blog has provided some useful insight into my project.

    I'll be sure to make a blog post of my own about this project once I'm done.

  • I think channel overrides are seen as an easy and effective way to do it - they also work indoors, so you can use them even without GPS.  And yes, the DroneKit documentation does say they don't want you to use them...

    So I think your choices are:

    1. Channel overrides.  These are intuitive, as they replicate manual flying.  They work with no GPS and are easy to implement.  But I guess you need to calibrate the attitude response against the channel PWM range etc. (see the sketch after this reply).

    2. Velocity vectors.  You need GPS or optical flow for this.  I think these only work in NED space, so the maths is a little more complicated (you have to convert from the body frame to NED).  The upside is that they give you 'absolute' control of velocities (see the guided-mode sketch further down).

    3. Quaternions.  These work without GPS or optical flow, but they are not well documented and are difficult to implement.  Fun if you want to try something new, but probably not recommended on a deadline.

    Given the short deadline and the machine-control interface, I'd go for channel overrides.  And given the simplicity of the interface to the neural network, don't forget you will also need PID controllers, or you'll end up suffering as I did in the MAAXX Europe competition!  On that front, some may question whether an RPi is reliable as a PID controller, given that Linux is not a real-time OS...
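
    To make option 1 concrete, here is a minimal channel-override sketch with a simple proportional term. Channel 4 is yaw on a standard AETR setup; the neutral PWM, gain and loop rate are uncalibrated guesses, and a real PID would want a continuous error signal (e.g. the line's pixel offset) rather than three discrete classes:

```python
# Channel-override steering with a proportional term (sketch).
# Neutral PWM, gain and loop rate are uncalibrated guesses.
import time
from dronekit import connect

vehicle = connect('/dev/ttyAMA0', baud=57600, wait_ready=True)

YAW_CH = '4'                              # yaw on a standard AETR setup
PWM_MID = 1500                            # stick-centre PWM
KP = 60                                   # PWM microseconds per unit of error
ERROR = {'left': -1, 'center': 0, 'right': 1}

def get_class():
    """Placeholder for the neural-network classifier described above."""
    return 'center'

try:
    while True:
        vehicle.channels.overrides[YAW_CH] = PWM_MID + KP * ERROR[get_class()]
        time.sleep(0.1)                   # ~10 Hz control loop
finally:
    vehicle.channels.overrides = {}       # always hand the sticks back
```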

  • Interesting. I thought using guided commands would be the right way to do it, since channel overrides are discouraged in the DroneKit documentation. Maybe someone could elaborate on the pros and cons of RC overrides versus guided commands.
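
    For comparison with the channel-override sketch above, the guided-mode alternative looks like this. send_ned_velocity follows the SET_POSITION_TARGET_LOCAL_NED pattern from the DroneKit guided-mode examples; the connection string is an assumption:

```python
# GUIDED-mode alternative: stream velocity setpoints in the NED frame.
# Needs GPS (or optical flow) and GUIDED mode.
from dronekit import connect, VehicleMode
from pymavlink import mavutil

vehicle = connect('/dev/ttyAMA0', baud=57600, wait_ready=True)
vehicle.mode = VehicleMode('GUIDED')

def send_ned_velocity(vx, vy, vz):
    """Command velocities in m/s (North, East, Down). Re-send regularly
    (e.g. at 1 Hz or faster) or the copter will stop after a few seconds."""
    msg = vehicle.message_factory.set_position_target_local_ned_encode(
        0, 0, 0,                               # time_boot_ms, target sys/comp
        mavutil.mavlink.MAV_FRAME_LOCAL_NED,   # frame
        0b0000111111000111,                    # type_mask: velocities only
        0, 0, 0,                               # x, y, z position (ignored)
        vx, vy, vz,                            # velocity components (m/s)
        0, 0, 0,                               # accelerations (not supported)
        0, 0)                                  # yaw, yaw_rate (not supported)
    vehicle.send_mavlink(msg)
```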

  • For information, the winning UAV at MAAXX Europe 2017 used channel overrides.

  • Thanks Chris, I'll have a look at both approaches.

    Wow, thanks Mike! This is exactly what I'm trying to do, except using a machine learning approach. I'll have a good read through your blog. The DroneKit code will save me a lot of time.

    The MAAXX Europe Competition sounds really interesting. Maybe I'll enter after my project and see you there :)

  • This is exactly what the MAAXX Europe competition is about.  My blog on this starts here and links to other resources for hooking up an RPi 3 to a Pixhawk using DroneKit and Python.

    Hope it helps :-)

  • 3D Robotics

    1) yes

    2) the second

  • Yes, thank you. I just read the channel overrides section in the DroneKit documentation (it's really well written). I was looking at writing a guided mission by defining the velocity vectors and yaw angle, but will look into this approach as well.
    Two questions:
    1) If I use RC_Override, will I still have the ability to regain control of the quad using the transmitter in case the mission goes awry?
    2) Considering I have a project deadline end of May, would it be quicker/easier to write a guided mission or to use the RC_Override approach?