Shadow Cam

This is a little proof-of-concept video I just put together. The goal is to always keep my aircraft's shadow in the field of view.

Equipment: Senior Telemaster. Fly-Cam-One-3 with built-in pan/tilt. Sparkfun 6DOFv4 IMU (it was lying around, so I used it). Gumstix flight computer. Ardupilot used to drive the camera's pan/tilt servos.

The flight is 100% manually piloted. The camera is 100% automatically pointed.

On board I am running a 15-state Kalman filter for attitude estimation. The filter converges to the "true" yaw angle independent of ground track, wind, and magnetometer. This is actually critical for camera pointing.
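(For the curious: a common 15-state layout pairs position, velocity, and attitude errors with accelerometer and gyro biases. The sketch below is illustrative only; it is not my actual filter code.)

```python
import numpy as np

# Illustrative 15-state INS error-state layout (a common convention;
# not necessarily the exact partition in my filter):
#   0-2   position error (NED)    3-5   velocity error (NED)
#   6-8   attitude error          9-11  accelerometer bias
#   12-14 gyro bias
POS, VEL, ATT, ACC_BIAS, GYRO_BIAS = 0, 3, 6, 9, 12

x = np.zeros(15)                      # error-state vector
P = np.diag([10.0] * 3 + [1.0] * 3 +  # position (m^2), velocity (m^2/s^2)
            [0.1] * 3 + [0.01] * 6)   # attitude (rad^2), sensor biases

yaw_error = x[ATT + 2]  # the yaw state that must converge for camera pointing
```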

On the ground I have a small app I whipped together Thursday evening that computes the sun's location for the current time in ECEF coordinates. It then converts the sun "vector" to NED coordinates based on a GPS connected to the ground station (assuming I'm not ranging very far from my start point). The code computes a new sun angle every minute. Finally, the sun vector is inverted to get a shadow vector, and that is sent up to the aircraft as its target pointing vector (once a minute).
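The NED conversion step looks roughly like the sketch below. Getting the sun's ECEF unit vector in the first place takes a standard solar ephemeris routine, which I simply assume here; function names, coordinates, and numbers are illustrative, not my actual app code.

```python
import numpy as np

def ecef_to_ned(v_ecef, lat_deg, lon_deg):
    """Rotate a direction vector from ECEF into the local NED frame at the
    given geodetic latitude/longitude (the ground station's GPS fix)."""
    lat, lon = np.radians(lat_deg), np.radians(lon_deg)
    sp, cp = np.sin(lat), np.cos(lat)
    sl, cl = np.sin(lon), np.cos(lon)
    # Standard ECEF -> NED direction-cosine matrix for this location.
    R = np.array([[-sp * cl, -sp * sl,  cp],
                  [     -sl,       cl, 0.0],
                  [-cp * cl, -cp * sl, -sp]])
    return R @ v_ecef

sun_ecef = np.array([0.5, -0.6, 0.62])        # placeholder: comes from an ephemeris
sun_ecef /= np.linalg.norm(sun_ecef)          # normalize to a unit vector
sun_ned = ecef_to_ned(sun_ecef, 45.0, -93.0)  # example ground-station fix
shadow_ned = -sun_ned                         # invert to get the shadow vector
```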

Notice (element-wise): sun vector * [ -1 -1 -1 ] = shadow vector.
Also: sun vector * [ 1 1 -1 ] = reflection vector (where we would be looking at the sun's reflection off surface water).
Also: sun vector * [ 1 1 1 ] = rainbow vector if we happened to fly above the clouds (this would keep us looking at the center of a rainbow circle/arc). :-)
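In code, those sign flips are element-wise one-liners (again just a sketch; sun_ned is an example NED unit vector):

```python
import numpy as np

sun_ned = np.array([0.3, 0.4, -0.866])             # example: sun up and to the northeast

shadow_vec     = sun_ned * np.array([-1, -1, -1])  # look away from the sun
reflection_vec = sun_ned * np.array([ 1,  1, -1])  # mirror about the horizontal plane
rainbow_vec    = sun_ned * np.array([ 1,  1,  1])  # unchanged, per the note above
```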

In order to run myself over with the shadow (from the aircraft's perspective), I need to fly the airplane through the sun (from the ground pilot's perspective).

Disclaimers: this was my first time out trying something like this, so the video is rough. The pan/tilt on the Fly-Cam is very low-grade, but works well enough for a demo. I'm also flying with an IMU that is about two orders of magnitude coarser than I'm used to flying with, which degrades my attitude estimation more than I would have liked (but the filter still converges). I put very little effort into aligning the IMU mount with the camera mount, so there are certainly a few degrees of bias just from mounting issues. Finally, I only eyeballed the mapping between servo commands and pan/tilt angles, so I'm in the ballpark, but there are certainly errors there too. It's a hack, but it made for a fun afternoon. :-)

Comments

  • I am just brainstorming here, but imagine a sensor whose output was the direction of the sun.

    Since you know the direction of the sun in ECEF coordinates, you could partially determine your own attitude in ECEF coordinates. There would be an ambiguity in the angle about the axis of the sun. Thus, for example, if the sun were setting directly to the west and you were flying directly toward it, there would be an ambiguity in your roll angle, but not your pitch and yaw. Alternatively, if you were heading north, there would be an ambiguity in your pitch, but not your roll and yaw. By the way, 3D magnetometers and 3D accelerometers suffer from this same kind of ambiguity. As you know, the trick is to combine multiple sensor readings to remove the ambiguities. If we had such a sensor, it would add information to our state estimation whenever its ambiguous axis wasn't aligned with the other sensors' ambiguous axes.

    So how could we build a sensor like this? One option would be to use a webcam and find the sun, as was done in this paper:

    http://www.google.com/url?sa=t&source=web&cd=1&ved=0CCM...

    I agree that the processing load would be pretty high. There has been some work on video processing with FPGAs, which might make this approach more feasible. Alternatively, I wonder whether it would be possible to spherically arrange an array of light-sensitive elements (e.g. photoresistors) and extract the sun's angle from their signals.
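    To make the ambiguity concrete, here is a rough sketch (illustrative code, not from any real system): the minimal rotation aligning the measured body-frame sun direction with the known NED sun direction is unique, but composing it with any rotation about the sun axis fits the measurement equally well.

    ```python
    import numpy as np

    def minimal_alignment(v_body, v_ned):
        """Rodrigues' formula: the smallest rotation taking the measured
        body-frame sun direction onto the known NED sun direction.
        Composing the result with any rotation about the sun axis fits
        the measurement equally well; that is the one-axis ambiguity.
        (Degenerate if the two vectors are antiparallel.)"""
        a = v_body / np.linalg.norm(v_body)
        b = v_ned / np.linalg.norm(v_ned)
        v = np.cross(a, b)                  # rotation axis, scaled by sin(angle)
        c = float(np.dot(a, b))             # cos(angle)
        K = np.array([[0.0, -v[2], v[1]],
                      [v[2], 0.0, -v[0]],
                      [-v[1], v[0], 0.0]])  # skew-symmetric cross-product matrix
        return np.eye(3) + K + (K @ K) / (1.0 + c)

    # A second vector (gravity or the magnetic field) fused in the filter is
    # what removes the leftover rotation about the sun axis.
    sun_ned  = np.array([0.0, 0.707, -0.707])  # known sun direction (ephemeris)
    sun_body = np.array([0.2, 0.6, -0.7])      # measured by the hypothetical sensor
    R = minimal_alignment(sun_body, sun_ned)   # one member of the consistent family
    ```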
  • One issue in IMU work is that you never really know the absolute "true" roll, pitch, and yaw of your aircraft. You hope your filter "converges," but does it converge with a bias or error? Does the error wander around? Can certain maneuvers blow up your error so that you are way off and it takes some time for your filter to get back close again? Just because your aircraft gets around the sky pretty well doesn't mean your IMU isn't 5, 10, 15, even 20 degrees (or more) off at times. If you think you are locked on to the true roll, pitch, and yaw, how well are you really? Are you within 2 degrees all the time? 5 degrees? How do you quantify that? Most people don't have the funds or room to fly a couple-hundred-thousand-dollar reference IMU alongside their own IMU project to validate their results against.
    But using a camera and pointing it at a known object, or in a known direction, could be one way to get feedback on the quality of your sensor. I don't think you could separate out the amount of error in all 3 axes (roll, pitch, and yaw), but you could at least get an upper bound on the absolute magnitude of your attitude error.
    This would require very carefully calibrating the alignment of your camera mount with the alignment of your IMU. It would also require very careful calibration of your camera motion. And then you'd probably want some sort of image processing algorithm that could reliably locate your shadow (or whatever object you are pointing at) in the image. It would certainly require some work, but it would let you put a bound on your total attitude error.
    After thinking about it some more, I'm not sure this would be enough information to correct IMU errors on the fly. (???) How would you separate out whether your errors are due to roll, pitch, or yaw, and in what proportion? But if it's an extra vector of information you can work into your filter, maybe you could do something helpful with it?
    Maybe the biggest difficulty would be the real-time image processing aspect ... somewhere on board you'd need to be cranking through a lot of pixels pretty fast. I guess you could do a wireless video link to the ground, but my experience with those is that they can fuzz in and out, and if your attitude determination now depends on the link to your ground station (both video and data), then maybe that's not the most robust and general way to build an attitude determination system.
    Still, it would be really interesting to see what someone could come up with using real-time video information to augment their other sensors.
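    As a concrete example of the upper-bound idea (all camera numbers below are made up): if the shadow never strays more than some pixel radius from the image center, the combined attitude, mounting, and pointing error can't be much more than that many pixels' worth of angle.

    ```python
    # Made-up camera numbers, purely for illustration.
    h_fov_deg = 60.0      # assumed horizontal field of view
    img_width_px = 640    # assumed image width in pixels
    max_offset_px = 45    # worst observed shadow offset from image center

    # Small-angle approximation: degrees subtended per pixel.
    deg_per_px = h_fov_deg / img_width_px
    error_bound_deg = max_offset_px * deg_per_px
    print(f"total pointing error <= ~{error_bound_deg:.1f} degrees")  # ~4.2
    ```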
  • Nice work! The video provides definite proof that your attitude estimation is working very well.

    It occurs to me that it should be possible to run the system the other way. That is, by tracking your shadow (or the sun), you can estimate your attitude (as long as the sun is out and/or you have a shadow).
  • 3D Robotics
    This is genius. Best proof-of-concept video I've seen for ages; really smart execution with clear applications. Bravo!
  • Very interesting!