Following Leon's great blog post showing the use of a camera from a Wii remote as a 3D position sensor for his helicopter, I've been tinkering and now have a Wii remote camera working as a sensor on my Arducopter. Thanks to Chris for encouraging me to blog this work, which was previously described in the forums.
Currently the only sensor we have that tells us where we are relative to something else in our environment is the sonar. Some of the other sensors tell us where we are relative to a theoretical datum (e.g. GPS) or a variable datum (e.g. air pressure); either of these could see you trying to land your aircraft below ground, or off in a tree somewhere, if you're unlucky. The accelerometers tell us how we are oriented relative to gravity but nothing about where we are. The gyros tell us how we are rotating. The recent work on optical flow, which senses the surroundings, also tells us how the aircraft is moving but not where it is.
My hypothesis is that a combination of an optical sensor such as the Wii camera and ground-based beacons could be used for accurate relative position determination, allowing precision navigation. This could be useful for landings, take-offs, indoor navigation and loitering without GPS.
The Wii camera can track up to four infrared blobs. By sensing the location of a set of predefined IR beacons we can, through simple trigonometry, determine our height above the beacons, our displacement from the beacons and our rotation (yaw) relative to the beacons. This information could then be fed into the navigation algorithms to control the Arducopter. An example would be making a precise landing on a 'helipad'.
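To give a flavour of that trigonometry, here is a minimal sketch of the range calculation. It assumes two beacon LEDs a known distance apart and an approximate focal length derived from the Wii camera's roughly 33-degree horizontal field of view over 1024 pixels; the names and constants are illustrative rather than taken from my actual code.

```cpp
// Sketch of range-from-beacon maths (names and constants are illustrative).
// The Wii camera reports blob positions on a 1024 x 768 grid; with a
// horizontal field of view of roughly 33 degrees the focal length works
// out at about 1024 / (2 * tan(16.5 deg)) ~= 1700 pixels.
const float FOCAL_LENGTH_PX = 1700.0;
const float LED_SPACING_MM  = 150.0;   // distance between the two target LEDs

// Estimated range to the target in millimetres, or 0 if both blobs
// aren't visible.
float rangeToTarget(int x1, int y1, int x2, int y2, bool bothVisible)
{
  if (!bothVisible) return 0;
  float dx = x1 - x2;
  float dy = y1 - y2;
  float separation_px = sqrt(dx * dx + dy * dy);
  if (separation_px < 1) return 0;
  // similar triangles: range / LED_SPACING = FOCAL_LENGTH / pixel separation
  return LED_SPACING_MM * FOCAL_LENGTH_PX / separation_px;
}
```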
Something similar to this has been done before by the University of Tübingen - have a look at their video.
The Wii camera has its limitations.
The calculations required to determine position from camera data are relatively simple - less complex than the optical flow calculations by the look of them.
I've mounted a Wii camera on my Arducopter, created an Arduino library to communicate with it, modified the Arducopter code to work with it and done some test flying. The test flying so far is "hand flying" (with and without motors running), as the backyard is too small for free flight and it was getting late.
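My library isn't posted here, but for anyone wanting to experiment, the sketch below shows the gist of talking to the bare camera from an Arduino over I2C. It is based on the widely published Wii camera hacks rather than my exact code, so treat the register values and layout as a starting point. (The bare module also needs a 3.3V supply and an external clock of around 25 MHz.)

```cpp
#include <Wire.h>

// 7-bit I2C address of the Wii camera sensor
const int CAM_ADDR = 0x58;

void camWrite2(byte reg, byte val)
{
  Wire.beginTransmission(CAM_ADDR);
  Wire.write(reg);
  Wire.write(val);
  Wire.endTransmission();
}

void camInit()
{
  Wire.begin();
  // initialisation sequence used in the common Wii camera examples
  camWrite2(0x30, 0x01); delay(10);
  camWrite2(0x30, 0x08); delay(10);
  camWrite2(0x06, 0x90); delay(10);
  camWrite2(0x08, 0xC0); delay(10);
  camWrite2(0x1A, 0x40); delay(10);
  camWrite2(0x33, 0x33); delay(10);
}

// Read the four blob positions (10-bit X and Y on a 1024 x 768 grid).
// In the common examples a blob that isn't seen reports 1023, 1023.
void camRead(int x[4], int y[4])
{
  byte buf[16] = {0};
  Wire.beginTransmission(CAM_ADDR);
  Wire.write(0x36);
  Wire.endTransmission();
  Wire.requestFrom(CAM_ADDR, 16);
  for (int i = 0; i < 16 && Wire.available(); i++) buf[i] = Wire.read();

  for (int i = 0; i < 4; i++) {
    byte s = buf[3 * i + 3];                    // byte holding the high bits
    x[i] = buf[3 * i + 1] + ((s & 0x30) << 4);  // 10-bit X
    y[i] = buf[3 * i + 2] + ((s & 0xC0) << 2);  // 10-bit Y
  }
}
```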
The "flights" are above a simple IR target that is just two IR leds approx 15cm apart. (Cost per target is less than a dollar). I've logged Sonar, Baro and Wii camera altitudes. Here is the graph of the three altitude/range sensors - click for a bigger view
The blue line is the Wii sensor output - note that when the algorithm can't see both target LEDs it outputs a zero value. The only "calibration" required is knowing the distance between the target LEDs - in theory even this could be eliminated if you used the sonar range to 'calibrate' the target when you first see it.
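A sketch of that self-calibration idea, reusing the hypothetical names from the range function above: with one trusted sonar reading in hand, the same similar-triangles relationship can be run backwards to infer the LED spacing.

```cpp
// Hypothetical self-calibration: infer the LED spacing from a single good
// sonar reading, so the target never needs to be measured by hand.
// FOCAL_LENGTH_PX is the same illustrative constant as in the earlier sketch.
float calibrateLedSpacing(float sonar_range_mm, float separation_px)
{
  // rearranging  range = spacing * focal / separation
  return sonar_range_mm * separation_px / FOCAL_LENGTH_PX;
}
```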
The correlation between sonar and IR ranging is pretty darn good! In fact one could argue that - in this very limited test - the sonar shows noise (probably due to the BBQ cover flapping in the aircraft's downwash) while the Wii altitude is either rock steady or nil.
The code also calculates x/y displacement from a position vertically above the target in millimetres, and rotation (yaw) relative to the target. I'm not using any of this just yet.
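For the curious, here is a sketch of how those extra outputs fall out of the same geometry, again with illustrative names and still assuming a level aircraft: the midpoint of the two blobs gives the lateral offset, and the angle of the line joining them gives yaw relative to the target.

```cpp
// Hypothetical displacement/yaw calculation for a level aircraft.
// (cx, cy) is the image centre on the 1024 x 768 grid; range_mm comes from
// rangeToTarget() in the earlier sketch. Axis signs depend on how the camera
// is mounted, so treat them as placeholders.
void targetOffsets(int x1, int y1, int x2, int y2, float range_mm,
                   float &offset_x_mm, float &offset_y_mm, float &yaw_rad)
{
  const float cx = 512.0, cy = 384.0;

  // midpoint of the two blobs, in pixels from the image centre
  float mx = 0.5 * (x1 + x2) - cx;
  float my = 0.5 * (y1 + y2) - cy;

  // scale pixels to millimetres at the target's range
  offset_x_mm = mx * range_mm / FOCAL_LENGTH_PX;
  offset_y_mm = my * range_mm / FOCAL_LENGTH_PX;

  // orientation of the line joining the two LEDs gives yaw relative to the
  // target (with two identical LEDs there is a 180-degree ambiguity)
  yaw_rad = atan2((float)(y2 - y1), (float)(x2 - x1));
}
```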
I need to do some further work to cope with aircraft roll and pitch effects (currently the code assumes the aircraft is always level). I assume I can use the optical flow code as an example of how to sort this out.
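One simple-minded way to tackle this, for small tilt angles, is sketched below: a roll or pitch of theta shifts the whole image by roughly focal-length x tan(theta) pixels, so the blob coordinates can be corrected before the range and offset maths. This is just a sketch of the idea rather than flight-tested code, and it ignores the foreshortening of the LED spacing at larger angles.

```cpp
// Hypothetical first-order tilt compensation: remove the image shift caused
// by the aircraft not being level before running the target maths.
// roll and pitch are in radians from the IMU/DCM; signs depend on how the
// camera is mounted, so treat them as placeholders.
void tiltCompensate(float roll, float pitch, int &blob_x, int &blob_y)
{
  blob_x -= (int)(FOCAL_LENGTH_PX * tan(roll));
  blob_y -= (int)(FOCAL_LENGTH_PX * tan(pitch));
}
```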
Next steps for me:
* free flying and Alt-hold tests
* clean up the code
* have a look at modifying loiter to use the targets
* find and use a lens to get a wider field of view
* come up with a precision guidance-to-landing concept and convince someone to help me code it
What do you think?
I'm keen to hear what people think of this type of sensor being used with Arducopters.
Is it useful? How could you use it?
A couple of outside-the-box ideas to kick you off: