Using a Wii remote camera for Arducopter navigation is possible.
Following Leon's great blog post showing the use of a camera from a Wii remote as a 3D position sensor, I've been playing around and having a think about how we could use the Wii remote camera as a sensor on the Arducopter.
First some of the thinking.
Currently the only sensor we have that tells us where we are relative to something else in our environment is the sonar. The other sensors tell us where we are relative to a theoretical datum (e.g. GPS) or a variable datum (e.g. air pressure); either could happily see you trying to land your aircraft below ground, or off in a tree somewhere. The accelerometers tell us how we are oriented relative to gravity but nothing about where we are, and the gyros tell us how we are moving. The new work on optical flow, which senses the surroundings, also tells us how the aircraft is moving but not where it is.
My suggestion is that a combination of an optical sensor such as the wii camera and ground based beacons could be used for relative position determination to allow for precision landings, take offs and even indoor navigation and loitering.
The Wii camera can track up to four infrared blobs. By sensing the locations of a set of predefined IR beacons we can, through simple trigonometry, determine our height above the beacons, our displacement from them and our rotation (yaw) relative to them. This information could then be fed to the navigation algorithms to steer the aircraft to, for example, a precise landing on a 'helipad'.
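To make the trigonometry concrete, here is a rough sketch of how the two-beacon geometry could work, assuming a simple pinhole camera model. The sensor resolution (1024x768) and the focal length of about 1730 pixels (derived from an assumed ~33 degree horizontal field of view) are my assumptions and would need calibrating against a real unit; the function and struct names are mine, not anything from the ArduCopter code.

```cpp
#include <cmath>

// Assumed Wii camera intrinsics -- calibrate against your own unit.
const double kFocalPx = 1730.0;   // assumed focal length, in pixels
const double kCentreX = 512.0;    // image centre of the 1024x768 grid
const double kCentreY = 384.0;

struct Pose {
  double height;   // metres above the beacon pair
  double dx, dy;   // lateral displacement of the camera, metres
  double yaw;      // rotation relative to the beacon line, radians
};

// (x1,y1) and (x2,y2) are the two blob centres in pixels;
// beaconSep is the known real-world distance between the beacons.
Pose poseFromTwoBlobs(double x1, double y1, double x2, double y2,
                      double beaconSep) {
  Pose p;
  double pixSep = std::hypot(x2 - x1, y2 - y1);
  p.height = beaconSep * kFocalPx / pixSep;      // similar triangles
  p.yaw = std::atan2(y2 - y1, x2 - x1);          // angle of the beacon line
  double midX = 0.5 * (x1 + x2) - kCentreX;      // midpoint offset from
  double midY = 0.5 * (y1 + y2) - kCentreY;      //   the image centre
  p.dx = midX * p.height / kFocalPx;             // back-project to metres
  p.dy = midY * p.height / kFocalPx;
  return p;
}
```

With these assumed numbers, two blobs 200 pixels apart over beacons 130 mm apart would put the camera at roughly 1.1 m height.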
Work so far
I have procured a broken Wii remote from eBay (only $1.99, yay!) and built up a simple interface circuit as per the examples here. An Arduino, a little bit of software and a couple of IR LEDs later, I now have a bench test setup that is working well. The video below shows the bench test at work: the infrared beacons are the red circles, the camera is the blue dot, and the front elevation is derived solely from the Wii camera data. In this video I am flying the camera above two IR beacons 130 mm apart. The only thing the software knows is the distance between the beacons. The accuracy is astounding - resolution of less than a centimetre is easily achievable at 20 Hz.
Currently with only two beacons we can't tell if we are rotated 180 degrees. With three beacons in an asymmetric line we could remove any ambiguity about where we are.
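Here is one way the asymmetric three-beacon idea could be sketched out. The beacons sit roughly in a line with unequal gaps (say 1 unit and 2 units); the two most widely separated blobs must be the ends of the line, and whichever end sits closer to the middle blob marks the 'front' of the pad, so the heading comes out without any 180-degree ambiguity. All the names here are hypothetical, and blob coordinates are camera pixels.

```cpp
#include <cmath>

struct Blob { double x, y; };

static double dist(const Blob &a, const Blob &b) {
  return std::hypot(a.x - b.x, a.y - b.y);
}

// Returns the heading of the pad in the camera frame: the angle of the
// vector from the long-gap end towards the short-gap ("front") end.
double padHeading(const Blob b[3]) {
  // The pair with the largest separation are the two ends of the line.
  int endA = 0, endB = 1, mid = 2;
  double best = dist(b[0], b[1]);
  if (dist(b[0], b[2]) > best) { best = dist(b[0], b[2]); endA = 0; endB = 2; mid = 1; }
  if (dist(b[1], b[2]) > best) { endA = 1; endB = 2; mid = 0; }
  // The end nearer the middle blob is the front of the pad.
  Blob front = b[endA], back = b[endB];
  if (dist(b[endB], b[mid]) < dist(b[endA], b[mid])) {
    front = b[endB];
    back = b[endA];
  }
  return std::atan2(front.y - back.y, front.x - back.x);
}
```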
Even if the aircraft had no knowledge of the layout of the beacons but had sonar for height, we could then use the sonar and camera data together to calculate their layout in the real world and use them for navigation or loitering.
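Surveying unknown beacons this way is just the height calculation run in reverse: with sonar supplying the height, the real-world spacing between two blobs falls out of the same similar-triangles relation. The focal length constant is the same assumed, uncalibrated value as before.

```cpp
#include <cmath>

const double kFocalPx = 1730.0;  // assumed focal length, in pixels

// sonarHeight in metres; blob coordinates in pixels.
// Returns the estimated real-world distance between the two beacons.
double beaconSeparation(double sonarHeight,
                        double x1, double y1, double x2, double y2) {
  double pixSep = std::hypot(x2 - x1, y2 - y1);
  return pixSep * sonarHeight / kFocalPx;  // metres, by similar triangles
}
```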
Thinking outside the box, we could get all Hansel and Gretel and use the Arducopter to drop beacons along its flight path to find its way back home (a simple beacon can be made for less than a dollar).
So could this work?
- Works on the bench
- Doesn't cost much
- Simple algorithms would not be too costly to run on the APM CPU
- The field of view of the camera is too narrow (+/- 20 deg) to be practical right now. Need to find and test a few wider-angle lens types; there are examples of others doing this
- You can't buy the cameras anywhere - you have to dismantle a Wii remote
- Need to modify / interface with the Arducopter software
I need some help to integrate the camera into the Arducopter code. I can just about follow the optical flow code, but when it gets into matrix rotations etc. my feeble brain hurts. I imagine this camera navigation could be used in Loiter mode to supplement or replace the GPS data, in a new Auto mode that included specific landing and take-off sequences, and, down the track, for indoor navigation.
What do you think?