Copter Localization w/ IR Landmark

The video shows localization data collected during a short flight over an IR marker, which is used to estimate the copter's location relative to the visual landmark. This was a manually flown flight, but the ultimate goal is automation. Detecting visual landmarks/features is a fundamental task in many forms of robot localization and navigation. For example, the Snapdragon Flight includes 4 camera sensors for visual-inertial odometry.

The plot in the video shows the copter's vision-based position estimate versus the traditional position estimate. The red data is logged by APM:Copter running on a Pixhawk with a 3DR GPS module. The blue data is derived from IR-LOCK sensor data as it detects a MarkOne Beacon at approximately 50Hz. A LidarLite rangefinder provides the AGL altitude measurements. The presented data looks nice because it was a fairly tame test; we still need to calibrate the lens before we can correctly handle larger pitch/roll angles.
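
For readers curious about the geometry behind the blue trace, here is a minimal sketch of the level-flight case: the beacon's angular offsets in the camera frame are scaled by the AGL altitude to get a horizontal offset. The function name and axis conventions are my own illustration (not the IR-LOCK API), and it ignores pitch/roll and lens distortion, which is exactly why the lens calibration mentioned above matters.

```python
import math

def beacon_offset_level(angle_x, angle_y, agl_alt):
    """Rough horizontal offset (meters) of the beacon relative to the copter,
    assuming a down-facing camera on a level vehicle.

    angle_x, angle_y -- angular offsets of the beacon in the camera frame (radians)
    agl_alt          -- above-ground-level altitude from the rangefinder (meters)
    """
    # With a level camera, a similar-triangles argument gives
    # offset = altitude * tan(angle) along each camera axis.
    x_offset = agl_alt * math.tan(angle_x)
    y_offset = agl_alt * math.tan(angle_y)
    return x_offset, y_offset

# Example: beacon seen 0.08 rad off one axis and 0.05 rad off the other, at 6 m AGL
print(beacon_offset_level(0.08, 0.05, 6.0))  # -> (~0.48, ~0.30) meters
```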

You can think of this as a ‘flipped’ version of the StarGazer indoor robot localization system, where a unique visual landmark is placed on the ceiling. However, the copter localization problem is a bit trickier due to the extra degrees of freedom: the copter can pitch, roll, ascend, etc. So the vision-based localization estimate also depends on the flight controller’s state estimation, and ideally all of the data would be fused together.
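
To make that dependence concrete, here is one way the camera-frame measurement could be de-rotated using the flight controller's roll/pitch estimate before scaling by altitude. This is only an illustrative sketch under my own conventions (body frame x forward, y right, z down; yaw ignored); it is not the actual APM:Copter precision-landing code. With zero roll and pitch it reduces to the level-flight formula above.

```python
import math
import numpy as np

def beacon_offset_tilted(angle_x, angle_y, roll, pitch, agl_alt):
    """Offset of the beacon in a level (yaw-ignored) frame, given the
    camera-frame angles plus the flight controller's roll/pitch estimate.
    All angles in radians, altitude in meters."""
    # Unit-scaled ray from the down-facing camera toward the beacon
    # (body frame: x forward, y right, z down). The axis mapping of
    # angle_x/angle_y is an assumption for illustration.
    ray_body = np.array([math.tan(angle_y),  # forward component
                         math.tan(angle_x),  # right component
                         1.0])               # down component

    # Rotate the ray from the body frame into a level frame
    # using the roll (about x) and pitch (about y) estimates.
    cr, sr = math.cos(roll), math.sin(roll)
    cp, sp = math.cos(pitch), math.sin(pitch)
    R_roll  = np.array([[1, 0,  0], [0, cr, -sr], [0, sr, cr]])
    R_pitch = np.array([[cp, 0, sp], [0, 1,  0], [-sp, 0, cp]])
    ray_level = R_pitch @ R_roll @ ray_body

    # Intersect the de-rotated ray with the ground plane agl_alt below.
    scale = agl_alt / ray_level[2]
    return scale * ray_level[0], scale * ray_level[1]
```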

One of the key advantages of having a uniquely identifiable visual landmark is that it can be used to remove drift in velocity and/or position estimates, which is typically the function of GPS. This can also be accomplished by building a local map (i.e., SLAM) …. With the MarkOne Beacon, we can also operate at night, but the video would be even more boring. :) Robust vision performance in variable lighting conditions typically requires some form of IR projection (see the Intel RealSense specs).
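
As a toy illustration of the drift-removal idea (this is not the ArduPilot EKF), the snippet below nudges a dead-reckoned position estimate toward the beacon-derived fix whenever the landmark is detected. A real fusion filter would weight the correction by measurement and state covariances rather than a fixed gain, but the point is the same: an absolute, uniquely identifiable reference bounds the drift.

```python
def fuse_with_landmark(pred_pos, beacon_fix, beacon_visible, gain=0.1):
    """Toy drift correction (complementary-filter style, not a real EKF).

    pred_pos       -- (north, east) position propagated from IMU/velocity (drifts)
    beacon_fix     -- (north, east) absolute position derived from the landmark
    beacon_visible -- True when the beacon was detected this cycle (~50 Hz here)
    gain           -- fraction of the error corrected per detection
    """
    if not beacon_visible:
        return pred_pos  # no absolute reference; drift keeps accumulating
    # Pull the estimate a fixed fraction of the way toward the absolute fix.
    return tuple(p + gain * (b - p) for p, b in zip(pred_pos, beacon_fix))
```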

Comments

  • Thomas, this is excellent work and a great example of development to come!

  • @lot

    It would be nice to get an alternative altitude reading, just in case the rangefinder happens to be pointed at un-level terrain. 

  • Why not use more than one beacon, with different colors to distinguish them, or make a pattern of beacons to estimate altitude or orientation?

  • Thomas #1

  • Great work! You have figured out an interesting solution for precision landing and indoor navigation.
  • @Jiro

    The standard GPS data would most likely not be accurate enough for the calculation to work very well.

    @Laser Developer

    That is a fair question. The method you describe is essentially the motion capture method. Also, check out PreNav's system.

    One of the reasons I do things the opposite way is the sun. I can filter out the sun's reflections, but if the sensor is directed toward the sun, that causes detection problems. And I am aiming for VERY reliable detection. The multi-camera system that you suggest could solve that issue, but that level of complication in the setup is not desirable for some applications (although it may be perfectly fine for others).

    The extra degrees of freedom shouldn't be a problem after the data is properly fused with the other sensor data. Admittedly, the data fusion and filtering are mathematically complicated, but it is what it is.

    With growing processing power, I imagine we will continue to add sensors that provide redundant information, and the EKF will actively weight the data sources to estimate the copter's state. For example, you could have optical flow, GPS, and stereo vision simultaneously influencing the velocity estimate. Then, if somebody turns out the lights, the GPS measurements will be weighted higher as the other sensor data deviates from the model.... It's going to get complicated no matter what. :)

  • @Thomas, I've been meaning to ask why you've chosen this more complicated method (IR beacon on the ground and camera in the air) rather than the other way around, with the IR beacon in the air and the camera on the ground. It seems to my simple way of thinking that this removes the degrees-of-freedom problem and allows two ground-based cameras to give both position and altitude. Of course you would need to feed the position information back to the bird, but since you are in close line of sight you might be able to use a simple Bluetooth link.

  • Very interesting testing.

    Can you estimate altitude, instead of estimating position, from the GPS position data and the IR-LOCK angle data?
