Following Leon's great blog post showing the use of a camera from a Wii remote as a 3D position sensor for his helicopter, I've been tinkering and now have a Wii remote camera working as a sensor on my Arducopter. Thanks to Chris for encouraging me to blog this work, which was previously described in the forums.
Currently the only sensor we have that tells us where we are relative to something else in our environment is the sonar. The other sensors tell us where we are relative to a theoretical datum (e.g. GPS) or a variable datum (e.g. air pressure); either could see you trying to land your aircraft below ground, or off in a tree somewhere, if you're unlucky. The accelerometers tell us how we are oriented relative to gravity but nothing about where we are. The gyros tell us how we are moving. The recent work on optical flow, which senses the surroundings, also tells us how the aircraft is moving but not where it is.
My hypothesis is that a combination of an optical sensor such as the Wii camera and ground-based beacons could be used for accurate relative position determination, allowing precision navigation. This could be useful for landings, take-offs, indoor navigation and loitering without GPS.
The Wii camera can track up to four infrared blobs. By sensing the location of a set of predefined IR beacons we can, through simple trigonometry, determine our height above the beacons, our displacement from the beacons and our rotation (yaw) relative to the beacons. This information could then be fed to the navigation algorithms to control the Arducopter. An example would be making a precise landing on a 'helipad'.
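To give a feel for the trigonometry, here is a minimal sketch of the height calculation. It assumes the camera's 1024x768 pixel grid and the roughly +/- 20 degree field of view mentioned below; the function and constant names are illustrative, not taken from the actual library.

```cpp
#include <cmath>
#include <cstdio>

const double PI = 3.14159265358979323846;
const double FOV_RAD = 40.0 * PI / 180.0;      // assumed total horizontal FOV (+/- 20 deg)
const double PIXELS_X = 1024.0;                // Wii camera horizontal resolution
const double RAD_PER_PIXEL = FOV_RAD / PIXELS_X;

// Range to a target whose two IR LEDs are 'ledSpacingMm' apart in the
// real world and appear 'pixelSeparation' pixels apart in the image.
double rangeMm(double pixelSeparation, double ledSpacingMm) {
    double angle = pixelSeparation * RAD_PER_PIXEL;   // angle subtended by the target
    return ledSpacingMm / (2.0 * std::tan(angle / 2.0));
}
```

With 150 mm between the LEDs, a target seen 100 pixels wide works out to roughly 2.2 m away; the closer the target, the wider it appears, so range falls as pixel separation grows.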
Something similar to this has been done before by the University of Tübingen - have a look at their video.
The Wii camera has its limitations.
- You have to sacrifice a Wii remote to obtain one - there is no known alternative source for the camera. Mine cost $1.99 on eBay.
- The field of view is quite narrow (+/- 20 deg). Leon has hot-glued a small lens onto his camera to get a wider field of view.
- The camera has an I2C interface - great - at 3 volts - not so great. It also needs a 24 MHz clock signal - all doable with off-the-shelf components.
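Once the electrical interface is sorted, the data side is simple. Community documentation of the PixArt sensor describes an "extended mode" report of three bytes per blob: X low byte, Y low byte, then a byte packing the top two bits of each coordinate plus a 4-bit size. A sketch of the decode, with the exact mode and register setup treated as an assumption to verify against your own interface circuit:

```cpp
#include <cstdint>

struct Blob {
    uint16_t x;     // 0..1023
    uint16_t y;     // 0..767
    uint8_t  size;  // 0..15; 0x0F is commonly reported when no blob is seen
};

// Decode one blob from three raw bytes of an extended-mode report.
Blob decodeBlob(const uint8_t bytes[3]) {
    Blob b;
    b.x    = bytes[0] | ((bytes[2] & 0x30) << 4);  // X bits 9:8 live in bits 5:4
    b.y    = bytes[1] | ((bytes[2] & 0xC0) << 2);  // Y bits 9:8 live in bits 7:6
    b.size = bytes[2] & 0x0F;                      // blob size in the low nibble
    return b;
}
```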
The calculations required to determine position from camera data are relatively simple - less complex than the optical flow calculations by the look of them.
I've mounted a Wii camera on my Arducopter, created an Arduino library to communicate with it, modified the Arducopter code to work with it and done some test flying. The test flying so far is "hand flying" (with and without motors running), as the back yard is too small for free flight and it was getting late.
The "flights" are above a simple IR target that is just two IR LEDs approximately 15 cm apart (cost per target is less than a dollar). I've logged sonar, baro and Wii camera altitudes. Here is the graph of the three altitude/range sensors - click for a bigger view.
The blue line is the Wii sensor output - note that when the algorithm can't see both target LEDs it outputs a zero value. The only "calibration" required is knowing the distance between the target LEDs - in theory this could be eliminated if you used the sonar range to 'calibrate' the target when you first see it.
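That sonar-calibration idea is just the range formula run backwards: with a trusted sonar range on first sight of the target, the LED spacing falls out directly. A sketch, assuming the same camera constants as before (illustrative names, not library code):

```cpp
#include <cmath>

const double PI = 3.14159265358979323846;

// Infer the physical LED spacing from a trusted sonar range and the
// target's apparent pixel separation at that range.
double inferredLedSpacingMm(double pixelSeparation, double sonarRangeMm) {
    const double radPerPixel = (40.0 * PI / 180.0) / 1024.0;  // assumed FOV / resolution
    double angle = pixelSeparation * radPerPixel;             // angle subtended by target
    return sonarRangeMm * 2.0 * std::tan(angle / 2.0);
}
```

Feeding back the numbers from the range example (100 pixels at about 2.2 m) recovers the 150 mm spacing, as it should.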
The correlation between sonar and IR ranging is pretty darn good! In fact one could argue that - in this very limited test - the sonar shows noise (probably due to the BBQ cover flapping in the AC downwash) while the Wii altitude is either rock steady or nil.
The code also calculates x/y displacement from a position vertically above the target in millimetres and rotation (yaw) relative to the target. I'm not using any of this just yet.
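The displacement and yaw come from the same two blobs: the midpoint's offset from the image centre gives lateral position, and the angle of the line joining the blobs gives relative yaw. A sketch under the same assumed camera constants (not the library's actual code):

```cpp
#include <cmath>

const double PI = 3.14159265358979323846;
const double RAD_PER_PIX = (40.0 * PI / 180.0) / 1024.0;  // assumed FOV / resolution

// Lateral displacement (mm) of the target midpoint from the image
// centre, given two blob coordinates on one axis and the range in mm.
double displacementMm(double blob0, double blob1, double centrePix, double range) {
    double midpoint = (blob0 + blob1) / 2.0;
    return range * std::tan((midpoint - centrePix) * RAD_PER_PIX);
}

// Yaw of the camera relative to the line joining the two LEDs, in radians.
double relativeYawRad(double x0, double y0, double x1, double y1) {
    return std::atan2(y1 - y0, x1 - x0);
}
```

A target centred in the image gives zero displacement; two blobs level in the image give zero relative yaw.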
I need to do some further work to cope with aircraft roll and pitch effects (currently the code assumes the aircraft is always level). I assume I can use the optical flow code as an example of how to sort this out.
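One common first-order approach (an assumption here, not taken from the Arducopter source): treat the camera's bearing to the target as a body-frame angle, and subtract the current roll or pitch angle to convert it to an earth-frame angle before doing the range trigonometry.

```cpp
#include <cmath>

// Earth-frame lateral offset (mm) of the target on one axis, given
// the blob's pixel offset from image centre, the camera's angular
// resolution, the current tilt (roll or pitch, radians) on that axis,
// and the altitude above the target. Names are illustrative.
double earthFrameOffsetMm(double pixelOffset, double radPerPixel,
                          double tiltRad, double altitudeMm) {
    double bodyAngle = pixelOffset * radPerPixel;       // bearing in body frame
    return altitudeMm * std::tan(bodyAngle - tiltRad);  // tilt-compensated offset
}
```

With zero tilt this reduces to the level-flight case; when the body-frame bearing exactly equals the tilt, the target is actually straight below and the offset comes out as zero.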
Next steps for me
* free flying and Alt-hold tests
* clean up the code
* have a look at modifying loiter to use the targets
* find and use a lens to get a wider field of view
* come up with a precision guidance-to-landing concept and convince someone to help me code it
What do you think?
I'm keen to hear what people think of this type of sensor being used with arducopters.
Is it useful? How could you use it?
A couple of outside-the-box ideas to kick you off:
1) fit the aircraft with an IR illuminator (from a security camera) and use IR reflective markers on the ground.
2) get the aircraft to carry IR beacons and drop them when it wants to loiter, for precision loiter ability. Or drop a trail of cheap beacons, Hansel-and-Gretel style, to find its way 'home' without GPS.
Just wondering if anyone has had any further luck integrating the Wii IR sensor into their copter. I'm trying to use the optical flow roll/pitch compensation with loiter to feed in corrected x,y values as RC overrides.
The source code is great, but a little guidance would be welcome. Anyone have any ideas as to how to do this? @agmatthews, have you flown with your code? How did you actually implement it? Userhooks?
Thanks for bumping this! This is actually super interesting work!
The code in Arducopter 3.0 is very, very good. Automatic landings are possible, but sometimes they are not as precise as we'd like. Altitude can be off a little bit, and it can drift due to GPS errors.
Using a system like this would allow a user to lay out a landing pad with IR beacons that can be used on final approach to really nail the landing.
Bump - would love to know the latest news around this effort
I too am very interested in this. It would be neat to set up a helipad at the RTL site with the IR LEDs: when RTL is triggered the quad comes home and loiters in the area until it locks onto the helipad, then orients itself and lands.
What sort of minimum altitude could you track with this?
agmatthews, any more news on this? I am really interested in it; I think it could mean a huge step for precision landing (landing on a charging station) and tracking objects. I was just wondering if you got it worked out or put it on the shelf for some reason. Inquiring minds want to know.
What I'd like to do next is use more than two IR blobs in the target. If I can use three or four blobs I can remove any ambiguity in the orientation of the target - presently I can't tell if the target is rotated 180 degrees.
I'd like to test with 4 LEDs in this pattern L___L_____L_________L where 'L' is an IR LED in the target.
To do this I need to sort the blobs/LEDs in order from left to right and then calculate the distances and rotation from there. With the Wii camera the order of the IR blobs seems somewhat random.
I'm thinking I need to calculate the distances - in pixels - between each blob and the others, then sort these from least to greatest to work out which blob is which.
Can anyone point me in the direction of any techniques or algorithms for pattern matching / sorting this type of data?
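One way the pairwise-distance idea could be sketched (a suggestion, not a tested solution): the pair of blobs farthest apart must be the two end LEDs, so sort everything by distance from one end, then use the asymmetric spacing of the pattern above (a short gap at one end, a long gap at the other) to decide which end is "first".

```cpp
#include <algorithm>
#include <cmath>
#include <vector>

struct Pt { double x, y; };

// Order four roughly collinear blobs along the target, normalising
// orientation so the short end-gap always comes first.
std::vector<Pt> orderBlobs(std::vector<Pt> blobs) {
    auto dist = [](const Pt& a, const Pt& b) {
        return std::hypot(a.x - b.x, a.y - b.y);
    };
    // The most widely separated pair are the two end LEDs.
    size_t endA = 0;
    double best = 0.0;
    for (size_t i = 0; i < blobs.size(); ++i)
        for (size_t j = i + 1; j < blobs.size(); ++j)
            if (dist(blobs[i], blobs[j]) > best) {
                best = dist(blobs[i], blobs[j]);
                endA = i;
            }
    // Sort all blobs by distance from one end.
    Pt anchor = blobs[endA];
    std::sort(blobs.begin(), blobs.end(), [&](const Pt& a, const Pt& b) {
        return dist(anchor, a) < dist(anchor, b);
    });
    // Disambiguate orientation: the short gap belongs at the start.
    if (dist(blobs[0], blobs[1]) > dist(blobs[2], blobs[3]))
        std::reverse(blobs.begin(), blobs.end());
    return blobs;
}
```

With the L___L_____L_________L pattern the end gaps differ (3 vs 9 units), so this resolves the 180-degree ambiguity without any extra sensors.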
Yet another test log - this one showing the rotation (yaw) derived from the Wii camera.
Note the Wii camera is not aligned with the compass - neither is the target. So the rotation is the relative yaw between the aircraft and the target. What is important is that the red line (the Wii rotation) tracks the green line (compass values) perfectly, with a uniform offset.
Here's another test log showing the ranging results from the Wii camera (with the code above).
The blue line is the Wii Camera derived range.
There is little doubt it is the best source of height above ground when it is available.
I have packaged up the Wii camera code into a library, available here.
This library includes the basic code for calculating range, displacement and rotation for a two-blob IR target.
You should be able to unzip it to your Arduino libraries directory and run the included example.
You will of course need a Wii camera, an interface circuit and a couple of IR target LEDs.