Hello All,

Using a Wii remote camera for Arducopter navigation is possible.

Following Leon's great blog post showing the use of a camera from a Wii remote as a 3D position sensor, I've been playing around and thinking about how we could use the Wii remote camera as a sensor on the ArduCopter.

First some of the thinking.

Currently the only sensor we have that tells us where we are relative to something else in our environment is the sonar. Some of the other sensors tell us where we are relative to a theoretical datum (e.g. GPS) or a variable datum (e.g. air pressure); either of these could see you trying to land your aircraft below ground or off in a tree somewhere. The accelerometers tell us how we are oriented relative to gravity but nothing about where we are. The gyros tell us how we are moving. The new work on optical flow, which senses the surroundings, also tells us how the aircraft is moving but not where it is.

My suggestion is that a combination of an optical sensor such as the wii camera and ground based beacons could be used for relative position determination to allow for precision landings, take offs and even indoor navigation and loitering.

The Wii camera can track up to four infrared blobs. By sensing the location of a set of predefined IR beacons we can, through simple trigonometry, determine our height above the beacons, our displacement from the beacons and our rotation (yaw) relative to the beacons. This information could then be used to control the navigation algorithms to steer the aircraft to, for example, a precise landing on a 'helipad'.
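As a rough sketch of the trigonometry involved, assuming a simple pinhole camera model and an illustrative focal length derived from the camera's roughly 40-degree field of view across a 1024-pixel-wide grid (these numbers are assumptions for the sketch, not measured values):

```cpp
#include <cmath>

// Assumed pixel geometry for the Wii camera: 1024x768 grid, ~40 deg
// horizontal field of view. FOCAL_PX is the pinhole focal length in pixels.
const float CAM_CENTER_X = 512.0f;
const float CAM_CENTER_Y = 384.0f;
const float FOCAL_PX = 512.0f / tanf(20.0f * (float)M_PI / 180.0f); // ~1407 px

// Height above two beacons a known distance apart, by similar triangles.
// Assumes the camera is level and pointing straight down.
float heightAboveBeacons(float beaconSepMm,
                         float x1, float y1, float x2, float y2)
{
    float pixelSep = sqrtf((x2 - x1) * (x2 - x1) + (y2 - y1) * (y2 - y1));
    return beaconSepMm * FOCAL_PX / pixelSep;
}

// Lateral displacement of the camera from the midpoint of the two beacons,
// using the height recovered above to convert pixels back to millimetres.
void displacementFromBeacons(float heightMm,
                             float x1, float y1, float x2, float y2,
                             float &dxMm, float &dyMm)
{
    float midX = 0.5f * (x1 + x2);
    float midY = 0.5f * (y1 + y2);
    dxMm = (midX - CAM_CENTER_X) * heightMm / FOCAL_PX;
    dyMm = (midY - CAM_CENTER_Y) * heightMm / FOCAL_PX;
}
```

Yaw relative to the beacons falls out of the same blob data as the atan2 of the blob-to-blob vector.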

Work so far

I have procured a broken Wii remote from ebay (only $1.99, yay!) and built up a simple interface circuit as per the examples here. An Arduino, a little bit of software and a couple of IR LEDs later, and I now have a bench test setup that is working well. The video below shows the bench test at work. The infrared beacons are shown as red circles and the camera as the blue dot. The front elevation is derived solely from the Wii camera data. In this video I am flying the camera above two IR beacons 130 mm apart. The only thing the software knows is the distance between the beacons. The accuracy is astounding - resolution of less than a centimetre is easily achievable at 20 Hz.

Currently with only two beacons we can't tell if we are rotated 180 degrees. With three beacons in an asymmetric line we could remove any ambiguity about where we are.
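One hypothetical way to break that 180-degree ambiguity with three beacons laid out in a line with unequal gaps (the spacings in the test are illustrative): the middle beacon sits closer to one end, so the pattern looks different when rotated, and the "front" end can be identified from blob positions alone.

```cpp
#include <cmath>

struct Blob { float x, y; };

static float dist(const Blob &a, const Blob &b)
{
    return sqrtf((a.x - b.x) * (a.x - b.x) + (a.y - b.y) * (a.y - b.y));
}

// Three beacons in an asymmetric line (e.g. gaps of 50 and 100 units).
// Returns the index (0..2) of the "front" beacon, defined here as the end
// beacon nearer the middle one - a convention, not a standard.
int frontBeacon(const Blob b[3])
{
    // The middle blob is the one whose summed distance to the others is smallest.
    int mid = 0;
    float best = 1e9f;
    for (int i = 0; i < 3; i++) {
        float s = dist(b[i], b[(i + 1) % 3]) + dist(b[i], b[(i + 2) % 3]);
        if (s < best) { best = s; mid = i; }
    }
    int a = (mid + 1) % 3, c = (mid + 2) % 3;
    return (dist(b[mid], b[a]) < dist(b[mid], b[c])) ? a : c;
}
```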

Even if we had no prior knowledge of the layout of the beacons, but had sonar on the aircraft for height, we could then use the sonar and camera data to calculate their layout in the real world and use them for navigation or loitering.

Thinking outside the box, we could get all Hansel and Gretel and use the ArduCopter to drop beacons along its flight path to find its way back home (a simple beacon can be made for less than a dollar).

So could this work?


Pros:

  • Works on the bench
  • Doesn't cost much
  • Simple algorithms would not be too costly to run on the APM CPU


Cons:

  • The field of view of the camera is too narrow (+/- 20 deg) to be practical right now. Need to find and test a few wider-angle lens types; there are examples of others doing this
  • You can't buy the cameras anywhere - you have to dismantle a Wii remote
  • Need to modify / interface with the ArduCopter software

What next?

I need some help to integrate the camera into the ArduCopter code. I can just about follow the optical flow code, but when it gets into matrix rotations etc. my feeble brain hurts. I imagine that this camera navigation could be used in Loiter mode to supplement or replace the GPS data, and in a new Auto mode that includes specific landing and take-off sequences and, down the track, indoor navigation.

What do you think?






Replies to This Discussion

It's funny what YouTube suggests you watch next when you upload a video.

Looks like the University of Tübingen has done similar work.

See their video on YouTube.

Very interesting and nice write up.  Didn't take you long to get it working!


I think it's really neat/useful functionality, but I think the issue with taking it "mainstream" is Con #2 - the inability to buy the Wii remote chip in quantity...


I wonder if there aren't other image-processing chips on the market that could do something similar.  Lots of cameras and mobile phones need to do this sort of thing... surely they can't all be writing their own algorithms.  I'll bet some company has made a chip to do this kind of thing and it's probably somewhere on digikey.com.

Excellent to see some work done in this area! I have been thinking of exactly this for quite some time, but never got around to (or mastered) the technical bits - ever since I saw Johnny Lee's TED Talk on WiiMote hacks http://johnnylee.net/projects/wii/.

Keep up the good work!


I want to test this camera for position hold (loiter).

I can easily generate distance-to-target data in x, y and z (if we have a target height).

Can you point me to where I would need to insert this into the AC code?

I can see in your optical flow work (http://code.google.com/p/ardupilot-mega/source/browse/libraries/AP_...) that you update vlon and vlat - how are these values used in the main navigation code?

Do you have to select a specific mode when testing optical flow for loiter, or does your code replace the GPS loiter code?




ps take care in the typhoon - sounds like non-flying weather for you for a while


    I believe we're using two nested PI controllers to control position.  The main functions that would need to be modified would be in navigation.pde.  In particular calc_location_error and calc_nav_rate.

   So what it looks like is:

        -- update_navigation in ArduCopter.pde gets called at 10 Hz.  In LOITER mode this immediately calls update_nav_wp.

        -- update_nav_wp in turn calls the calc_location_error function to get the lon/lat distance to the target.  These distances are then passed into the more complex calc_nav_rate function.

        -- calc_nav_rate uses a PI controller to calculate the desired lat/lon speed (slightly confusingly renamed to x and y) and another PI controller to convert that speed to an angle to lean at.

       -- calc_nav_pitch_roll converts this 'angle to lean at' into roll and pitch angles.


     So if you could modify those two functions, calc_location_error and calc_nav_rate, it should work.  One gotcha is that the navigation code probably only gets called if we have a GPS lock.  There's added complexity because we work in lat/lon distances/speeds/angles right up until the calc_nav_pitch_roll stuff... but staying consistent with that might be good, especially if you want your code to work outside too.
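To make the nested-PI idea concrete, here is a minimal one-axis sketch; the gains, limits and units are illustrative only, not the actual ArduCopter values:

```cpp
#include <algorithm>

// Minimal PI controller (illustrative gains/limits, not ArduCopter's).
struct PI {
    float kP, kI, iMax, integrator;

    float update(float error, float dt)
    {
        integrator += error * kI * dt;
        integrator = std::max(-iMax, std::min(iMax, integrator));
        return error * kP + integrator;
    }
};

// One axis of the loiter pipeline described above:
// position error (cm) -> desired speed (cm/s) -> lean angle (centidegrees),
// mirroring the calc_nav_rate -> calc_nav_pitch_roll split.
float navAxis(PI &posPI, PI &ratePI,
              float posErrorCm, float actualSpeedCms, float dt)
{
    float desiredSpeed = posPI.update(posErrorCm, dt);
    float leanAngle = ratePI.update(desiredSpeed - actualSpeedCms, dt);
    // Constrain to a sane maximum lean, e.g. +/- 25 degrees (2500 centideg).
    return std::max(-2500.0f, std::min(2500.0f, leanAngle));
}
```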


     Not easy stuff but good luck!




ps. the typhoon was quite serious.  Some trees down here and there, and my building was actually moving... possibly because of the anti-earthquake springs that the building sits on.

Thanks Randy,
I might start with making the Wii camera work as a sonar replacement/augmenter this weekend - just as an experiment - while I digest the navigation code further.
One thing that stands out to me is that at the scales we work at, we should not be using lat and lon values directly in our nav PI loops, as they are not 'square' unless you are at the equator. A conversion to metres x and y would be better. I need to read more of the code...


    Neat.  I was wondering how you were thinking to replace/augment the sonar with this. Are you thinking about mounting the camera on the side of the quad and putting an LED on the wall?  That would make sense.  Or were you going to put multiple LEDs on the ground and try to judge altitude by the distance between them? (I'm not sure this would even work - I haven't really thought it through.)


    Re lat/lon values not being square, we apparently already take care of this (in some situations anyway).  You'll find this code in commands.pde:

 // this is used to offset the shrinking longitude as we go towards the poles
 float rads    = (abs(next_WP.lat)/t7) * 0.0174532925;
 scaleLongDown   = cos(rads);
 scaleLongUp   = 1.0f/cos(rads);

    ..and then you'll see scaleLongDown and scaleLongUp used in navigation.pde.

     Here is a link on how to do the conversion, although it's not exactly the same method as above (which I believe is a shortcut); it's good enough for the short distances our UAVs are likely to travel.
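A self-contained sketch of that shortcut - one degree of latitude is roughly constant, while a degree of longitude shrinks by cos(latitude). The 111,320 m/degree constant is the usual approximation, not taken from the APM source:

```cpp
#include <cmath>

// Approximate metres per degree of latitude (roughly constant everywhere).
const float METERS_PER_DEG = 111320.0f;

// Difference between two lat/lon points as north/east distances in metres.
// The cos(latitude) factor is the same idea as scaleLongDown above.
void latLonDiffToMeters(float lat1Deg, float lon1Deg,
                        float lat2Deg, float lon2Deg,
                        float &northM, float &eastM)
{
    float scaleLongDown = cosf(lat1Deg * (float)M_PI / 180.0f);
    northM = (lat2Deg - lat1Deg) * METERS_PER_DEG;
    eastM  = (lon2Deg - lon1Deg) * METERS_PER_DEG * scaleLongDown;
}
```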


I am going to try a downward-looking camera and a two- or three-blob target. Two blobs give height above the target and x,y offset in the camera coordinate system. Three blobs give x,y,z and rotation in the aircraft coordinate system.
Basically I think if I can hold the aircraft over the targets manually (still working with a narrow +/- 20 deg field of view), I can replace the sonar data with camera data for altitude hold.
Will give it a crack this weekend and report back.


I've mounted the Wii camera to my ArduCopter, created an Arduino library for it, modified the ArduCopter code and done some test flying.

The test flying so far is hand flying (with and without motors running), as the back yard is too small for free flight and it was getting late.

The flights are above a simple IR target that is just two IR LEDs approx 15 cm apart.

Here is the graph of the three altitude/range sensors

The blue line is the Wii sensor output - note that when the sensor can't see the target LEDs it outputs zero.

The correlation between sonar and IR ranging is pretty darn good!

Here are a couple of photos of the camera mounted on the airframe.

Looking downwards

Looking up from below

The code also calculates x/y displacement from the target in millimeters and rotation (yaw) relative to the target.

I need to do some work to cope with aircraft roll and pitch effects (the current code assumes the aircraft is always level). I assume I can use the optical flow code as an example of how to do this.
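A first-order sketch of what that correction might look like - small-angle only, and the sign/axis mapping depends entirely on how the camera is mounted (the focal length is an assumed value, as before):

```cpp
#include <cmath>

// Assumed focal length in pixels (~40 deg FOV across a 1024-pixel grid).
const float FOCAL_PX = 512.0f / tanf(20.0f * (float)M_PI / 180.0f);

// Small-angle tilt compensation: a roll or pitch of 'a' radians shifts every
// blob by roughly a * FOCAL_PX pixels, so subtract that shift before doing
// the range/displacement trigonometry. Which body angle maps to which image
// axis (and the signs) depends on the camera mounting - treat these as
// placeholders to be calibrated on the airframe.
void compensateTilt(float rollRad, float pitchRad, float &blobX, float &blobY)
{
    blobX -= pitchRad * FOCAL_PX;
    blobY -= rollRad  * FOCAL_PX;
}
```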


Next Steps

* free flying and Alt-hold tests

* clean up the code

* have a look at modifying loiter to use the targets

* find and use a lens to get a wider field of view




> I think it's really neat/useful functionality but I think the

> issue with taking it "mainstream" is Con #2 - the inability

> to buy the Wii remote chip in quantity...


@Randy :

Why is it such a big issue? We are talking about hobbyist drones, not "mass production" drones. I think that every hobbyist interested in having a Wii camera can buy a Wii remote and disassemble it. Unlike other electronics products, the Wii remote camera is quite easy to desolder.



Andrew: very cool! You should post this as a blog post so more people can see it. 

Ok, will do tonight (Aus time).



© 2019   Created by Chris Anderson.   Powered by
