Image processing for precision landing

Hi all,

I’m looking for a straightforward solution to have a quad land fully autonomously on a small outdoor target area of 1 ft x 1 ft. Acceptable accuracy would be +/- 2 inches from the target center.

We know the GPS location of the landing site, which the quad will use as a waypoint to get to the area and start hovering at altitude. Then, for the actual precision landing and descent, a visual marker could be used. Ideally I wouldn’t need anything electronic on the ground, to keep things simple (unless it’s really cheap, say under $50, and leads to a much simpler solution). Also, all computing must be onboard the quad.

I was thinking of combining a Pixhawk, a Raspberry Pi 3, and its V2 camera module (8 MP) to do computer vision with OpenCV. I would like to keep things simple and, if possible, limit the image recognition to basically a color mask + find contours. First the Pixhawk takes the quad to the GPS location. Then, in “locate & descend” mode, the RPi 3 starts scanning and feeds the Pixhawk with (x, y, z) velocity vectors to get closer to the target and land.
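
To make this concrete, here is the kind of detection pass I have in mind. Just a sketch, not flight-tested; the HSV range is a placeholder for whatever marker color I end up using:

    # Sketch: HSV color mask + largest-contour centroid, returning the
    # target offset from the image center, normalized to [-1, 1].
    import cv2
    import numpy as np

    LOWER_HSV = np.array([0, 120, 120])    # placeholder range (red/orange marker)
    UPPER_HSV = np.array([15, 255, 255])

    def find_target_offset(frame_bgr):
        hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
        mask = cv2.inRange(hsv, LOWER_HSV, UPPER_HSV)
        # Remove speckle noise before looking for contours
        mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
        # [-2] picks the contour list across OpenCV 2/3/4 return signatures
        contours = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                    cv2.CHAIN_APPROX_SIMPLE)[-2]
        if not contours:
            return None                    # no target in view
        m = cv2.moments(max(contours, key=cv2.contourArea))
        if m["m00"] == 0:
            return None
        h, w = frame_bgr.shape[:2]
        return (2.0 * (m["m10"] / m["m00"]) / w - 1.0,
                2.0 * (m["m01"] / m["m00"]) / h - 1.0)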

Will this be good enough? Any potential roadblocks that I should anticipate?

While searching on the forum I found a UC Berkeley project [1] that seems related although it’s 2 years old. I also came across the PX4Flow work but I’m hoping I can do without.

Thanks!

[1] http://diydrones.com/profiles/blog/show?id=705844%3ABlogPost%3A1789944


Replies

  • Thanks guys, I'm starting to see more clearly now.

    I think there are really two components: (1) data gathering + measurement accuracy, and (2) vehicle control. Together, (1) and (2) form a closed feedback loop.

    For (1) I can see how IR will have an edge, but let's assume for a minute that (1) is solved and gives you reliable, high-enough-frequency measurements with velocity vectors, and let's focus on (2).

    All the tests and videos I've seen look pretty similar: the approach is monotonic. The vehicle flies to the target and descends, and then it's hit or miss.

    Why not keep trying until time runs out? With this strategy the vehicle descends, and if the target goes out of the field of view it climbs back up or moves around in the horizontal plane to find the target again. Same thing when it gets very close (a couple of centimeters) to touching down: if it's 3 cm above the target and not about to hit it with, say, 1 cm accuracy, it should just stabilize or go higher and try again, but NOT descend. A rough sketch of this logic is below.

    I'm still busy building and I'll get to testing this stuff soon; in the meantime I'd be curious to get some feedback.
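
    Here is the retry loop I have in mind, as a rough, untested sketch. The sensor/control interfaces (get_target_offset, get_altitude, and the ctrl.* calls) are placeholders for whatever the RPi/Pixhawk link ends up providing:

        import time

        SEARCH_ALT = 2.0      # m: climb back here when the target is lost
        TOUCHDOWN_ALT = 0.03  # m: final decision height (~3 cm)
        CENTER_TOL = 0.01     # m: allowed horizontal error before committing (~1 cm)
        DEADLINE = 60.0       # s: give up after this long

        def precision_land(sensor, ctrl):
            t0 = time.time()
            while time.time() - t0 < DEADLINE:
                offset = sensor.get_target_offset()   # (x, y) in meters, or None
                alt = sensor.get_altitude()
                if offset is None:
                    ctrl.climb_to(SEARCH_ALT)         # lost it: back up and re-acquire
                    continue
                err = (offset[0] ** 2 + offset[1] ** 2) ** 0.5
                if alt <= TOUCHDOWN_ALT:
                    if err <= CENTER_TOL:
                        ctrl.touch_down()             # commit only when centered
                        return True
                    ctrl.hold_or_climb()              # not centered: do NOT descend
                else:
                    ctrl.move_toward(offset, descend=True)
            return False                              # ran out of time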

  • @Anthony E

    Great questions. I can only answer them from the IR-LOCK perspective. They may or may not be applicable to other sensor systems. ... (I will probably add more content later. I have been super-busy lately)

    John A.B. is correct when he says that reliable and accurate tracking does not always translate into accurate copter control.

    Multiple control strategies have been tested and published. Here are two of them:

    (1) Precision Landing with Accuracy Management (blog post link)

    (2) Precision Landing Slowdown Feature (video link)

     

  • Daniel Nugent demonstrated his precision landing in SITL here.

  • This is helpful, thanks.  What's the best way to locally test my CV code?  SITL?  I'd like to test the RPi 3 looking for the target and driving the Pixhawk; the sketch below is roughly what I have in mind.
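
    A rough DroneKit sketch of the command path I want to exercise (untested; the UDP connection string is the usual SITL default, not something from this thread, and on the copter it would be the serial link to the Pixhawk instead):

        from dronekit import connect, VehicleMode
        from pymavlink import mavutil

        vehicle = connect('udp:127.0.0.1:14550', wait_ready=True)

        def send_velocity(vx, vy, vz):
            # One velocity setpoint in m/s, NED frame (positive vz is down)
            msg = vehicle.message_factory.set_position_target_local_ned_encode(
                0, 0, 0,                          # time_boot_ms, target sys/comp
                mavutil.mavlink.MAV_FRAME_LOCAL_NED,
                0b0000111111000111,               # type_mask: use velocities only
                0, 0, 0,                          # position (ignored)
                vx, vy, vz,                       # velocity
                0, 0, 0,                          # acceleration (ignored)
                0, 0)                             # yaw, yaw rate (ignored)
            vehicle.send_mavlink(msg)

        vehicle.mode = VehicleMode("GUIDED")
        send_velocity(0.5, 0.0, 0.2)              # nudge forward and descend slightly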

  • Anthony,

    The RPi 3 experimentation is here.

    As for speed, if you read through the literature, most successful controllers run at over 20 Hz, with 10 Hz being the bare minimum to keep a quad flying.

    Here is a good representation of the different automation layers and their corresponding speeds:

    [image: diagram of the automation layers and their corresponding control rates]

    Reference:  https://www.ais.uni-bonn.de/papers/IAS_2014_Nieuwenhuisen_Layered_P...

  • Developer

    In one word: latency.

    Everything in a traditional autopilot is more or less done after the fact. A sensor picks up data, and then the control reacts and calculates the needed corrections, so it's always at least one step behind. But since the sensor and control rates are high and latency is low, this works for most situations.

    But in visual computing there will be much more latency: typically ~100 ms just for the camera capture, and even more in the vision processing itself. For example, at a drift rate of 1 m/s, 150 ms of total latency means the copter is already ~15 cm away from where the controller thinks it is. You can guess the kind of trouble this leads to. And trying to predict into the future based on previous movements only works in stable systems without unpredictable external influences.

  • @John Arne Birkeland - I can definitely imagine wind gusts making this an entertaining dance to watch.  Assuming no wind, what else should I be worried about?

  • @Patrick Poirier - Was it an RPi 3 or a 2?  Your post says "Raspberry Pi 2".  Regarding HoughCircles, I noticed this algorithm is pretty resource-hungry, so I'm going to stick to color-based detection for now.  As for the frame rate, even 8 Hz seems decent, at least to start?

  • Randy has kindly responded to my request on YouTube, and here is the code for the new setup:

    https://github.com/yankailab/OpenKAI  There is a lot of interesting stuff in this code :-)

    I doubt that the RPi can do this job. When I experimented with the balloon popper on my RPi 3 here, I could not get much better than 8 fps with basic blob detection (HoughCircles was under 6 fps), which is not really sufficient for a UAV controller. So I ordered an ODROID XU4, and the results are much better. I will post updates this fall, but for the moment I enjoy FPV under the sun.. ;-)
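
    In case it helps anyone reproduce the numbers, a timing harness along these lines is all it takes (a sketch, not the exact code I ran; it assumes a V4L2 camera on /dev/video0, so with the Pi camera module you would go through picamera instead):

        import time
        import cv2
        import numpy as np

        cap = cv2.VideoCapture(0)
        frames, t0 = 0, time.time()
        while frames < 100:
            ok, frame = cap.read()
            if not ok:
                break
            # Minimal color-threshold blob pass, similar cost to the tests above
            hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
            mask = cv2.inRange(hsv, np.array([0, 120, 120]),
                               np.array([15, 255, 255]))
            cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
            frames += 1
        cap.release()
        print("%.1f fps" % (frames / (time.time() - t0)))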

  • Developer

    Any idea why the vehicle doesn't hit the center of the target?  Is the software configured to just be happy with putting all 4 legs down on the support structure?  I was hoping for centimeter accuracy with such hardware.

    Having accurate tracking does not automatically translate into accurate landings when you are flying in the real world, with lots of complex internal and external physical and aerodynamic interactions going on.
