### Autonomous Landing of a Coaxial Helicopter on a Moving Platform via Optical Flow

Hey everyone, I've been trying to tackle the problem of autonomous landing for a little while now, restricting myself specifically to optical flow techniques. I've hit a roadblock and wanted to see if anyone here could offer some advice.

Essentially, I have a downward-looking vision chip that can successfully put a coaxial helicopter into a hover by minimizing optical flow (OF). I now want that chip to track a moving surface underneath it, particularly a landing platform. The problem is that the helicopter has no way of knowing the linear velocity of the thing it is tracking. Because the OF only yields a speed-to-height ratio, there is no way to know exactly how far the tracked surface has displaced in the X and Y directions. I am trying to avoid adding other sensors that would give me height over ground, because my objective is to operate using only vision chips. So, short of constraining the speed of the landing platform and holding my UAV at a fixed height so that I can match the platform's linear velocity, does anyone have suggestions on how to remedy this issue?

For reference, I am programming on an Arduino board and using a Centeye 16x16 vision chip.

• Developer

Your options will be limited when tracking a moving object with a 16x16 sensor. Here is one suggestion for how you might match speed with a moving target without knowing absolute height and speed.

Try feeding the X and Y values of the OF vector into PID loops controlling the speed of the copter's X and Y movement (later you can improve it by including yaw control as well). With the PIDs tuned to the copter's behavior, it should continuously adjust and match the copter's speed in the X and Y directions, trying to keep the OF vector as close to (0, 0) as possible.

• @Kenneth, this is the method I am currently employing to hold my position over the ground. It works because I have limited my conditions to a flat, still surface over which the helicopter hovers; that way, the magnitude of OF displacement in any direction is directly related to the movement of the UAV, which I can correct with my compensator. The displacement I am seeing in this problem, however, is related not just to UAV movement but to the relative motion between the platform and the UAV. So if I see a displacement of, say, 200 units in the positive X direction (computed by integrating the OF), I cannot tell whether the platform truly displaced 200 units at one height or 100 units at half that height; both produce the same integrated OF. Compensation is therefore impossible, because I do not know the gain with which to feed back to my motors. Therein lies the problem. I hope this sparks someone's interest because, as an undergrad with relatively limited knowledge of this subject, I am at a bit of a standstill.

Thanks for the other comments as well; they've sparked some ideas.

• Also consider: If you have disabled GPS usage, this is an identical challenge to landing on an elevated surface on a windy day.

• Rather than matching features from camera A frame N to camera A frame N-1, match features from camera A to camera B to camera C (or at least the highest-quality pair), as well as to the previous frames and against your target profile image. Given a rigid mounting with a known baseline between the cameras, trigonometry gives you a depth map for your feature matches. Welcome to stereo computer vision.

I can suggest that you may encounter two simpler physical problems in your project, however: turbulent (non-laminar) airflow around a moving object, and the mechanics of landing gear meeting a surface tilted at an unpredictable angle. At present, it's difficult to get a smooth (no bouncing), repeatable landing even on a flat surface under manual control.

• On the wiki there's a link to a paper about using 2 vision chips to address your issue (height calculation).

Good luck and tell us about your progress!

• Try using an Xbox Kinect camera.

• Michael,

Fly the helicopter over the tracked object until the optical flow is zero in all directions. The helicopter's horizontal speed then equals the object's speed.
