Skywriting with a quadcopter?

Hello all,

I've been using the APM platform for a few months now, and I'd really like to try a software project. Something I have in mind is following a predefined path for extended-exposure photography. I've seen many instances of this, such as the image below, but I'd like to do something autonomous.

[image: long-exposure light-painting photo taken with a multicopter]

Obviously, this could have a lot of possible solutions! AUTO mode with GPS waypoints can already achieve something to this effect, but I don't believe it can achieve the fidelity I'm after.

The solution I keep coming back to is a single external camera watching two LEDs mounted a known distance apart on the quad, with the quad held at a fixed compass orientation. By keeping the quad pointed towards the camera, I believe two points in space would be enough to determine its x, y, z coordinates, as sketched below.
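
To make the geometry concrete, here is a rough sketch of the math I have in mind (Python; the focal length, image centre, and LED baseline are made-up numbers, and it assumes a calibrated pinhole camera with the LED pair held roughly perpendicular to the optical axis):

```python
# Sketch: estimate quad position from two LEDs a known distance apart,
# seen by a single fixed camera. Assumes a pinhole camera model and that
# the LED pixel centroids have already been extracted from the frame.

FOCAL_PX = 1200.0        # focal length in pixels (hypothetical calibration)
CX, CY = 960.0, 540.0    # principal point (image centre) in pixels
LED_BASELINE_M = 0.40    # physical distance between the two LEDs, metres

def quad_position(led_a, led_b):
    """Return (x, y, z) of the LED midpoint in camera coordinates (metres).

    led_a, led_b: (u, v) pixel centroids of the two LEDs.
    x is right, y is down, z is out along the optical axis.
    """
    ua, va = led_a
    ub, vb = led_b
    # Pixel separation of the LED pair in the image.
    sep_px = ((ua - ub) ** 2 + (va - vb) ** 2) ** 0.5
    # Similar triangles: the pair appears smaller the further away it is.
    z = FOCAL_PX * LED_BASELINE_M / sep_px
    # Back-project the midpoint of the pair at that depth.
    um, vm = (ua + ub) / 2.0, (va + vb) / 2.0
    x = (um - CX) * z / FOCAL_PX
    y = (vm - CY) * z / FOCAL_PX
    return x, y, z

# Example: LED centroids 48 px apart near the image centre -> roughly 10 m out.
print(quad_position((1000.0, 500.0), (1048.0, 502.0)))
```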

Is this technically feasible? Would LEDs and a webcam provide enough information to calculate position?

I believe you can transmit this kind of position data through MAVLink, which would then be interpreted by the APM control software. I know I'm probably in over my head, but where would this customization take place? Would it be a custom flight mode? Could I somehow get the APM to accept these as intermediate waypoints? Are there any guidelines for the structure of the APM code? I am only mildly experienced with C.
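
If the position estimate lives on the ground station, I imagine the uplink could look something like this pymavlink sketch (assuming a firmware whose GUIDED mode accepts SET_POSITION_TARGET_LOCAL_NED; the connection string is a placeholder):

```python
# Sketch: push an externally computed position to the vehicle over MAVLink.
from pymavlink import mavutil

master = mavutil.mavlink_connection('udp:127.0.0.1:14550')  # placeholder link
master.wait_heartbeat()  # block until the autopilot is heard from

def send_local_target(north_m, east_m, down_m):
    """Request a position setpoint in the local NED frame (metres)."""
    master.mav.set_position_target_local_ned_send(
        0,                         # time_boot_ms (0 = not used)
        master.target_system,
        master.target_component,
        mavutil.mavlink.MAV_FRAME_LOCAL_NED,
        0b110111111000,            # type_mask: use position, ignore the rest
        north_m, east_m, down_m,   # position setpoint
        0, 0, 0,                   # velocity (ignored)
        0, 0, 0,                   # acceleration (ignored)
        0, 0)                      # yaw, yaw rate (ignored)

# e.g. 2 m north, 1 m east, 3 m up (NED: down is negative up)
send_local_target(2.0, 1.0, -3.0)
```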

I know this is heavy on 'wants' and light on technical details. What I'm looking for is input on feasibility, and to check whether anything like this already exists. I don't want to reinvent the wheel, but I thought this might be an interesting project.


Replies

  • Developer

    That's a pretty picture.

    Yes, it should be possible. Andrew Matthews did something similar maybe two years ago with a Wii camera and 4 LEDs, although that made use of the smarts in the Wii camera, it was looking down at the LEDs instead of sideways, and all the intelligence was on the vehicle instead of the ground station.

    We've got some introductions to the code here, although they were written for AC3.1 and the structure has changed a fair bit since then.

    One important decision is whether the smarts should go on the vehicle or on the ground station. I'd tend towards putting them on the vehicle because the latency will be lower, but either way is probably possible. Another small issue is that AC doesn't currently have a way to accept very small changes in horizontal position from the ground station: positions are lat/lon points transmitted as 32-bit floats, so their accuracy is only about 1m.
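
    A quick back-of-the-envelope check of that 1m figure (Python with numpy; 45 degrees is just an example latitude):

    ```python
    # Quantisation step of a 32-bit float near a given latitude, in metres.
    import numpy as np

    lat = np.float32(45.0)           # example latitude in degrees
    ulp_deg = np.spacing(lat)        # gap to the next representable float32
    METRES_PER_DEG = 111_320.0       # approximate metres per degree of latitude
    print(ulp_deg * METRES_PER_DEG)  # ~0.42 m here, ~0.85 m at high latitudes
    ```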

    The upcoming SparkFun AVC, though, is pushing us towards developing some vision-recognition code for finding and popping red balloons. I'll be using an ODroid U3 on top of a Pixhawk for that... maybe some of that code can be re-used for more visual development.

    • Thanks Randy! That ODroid guide is great. Based on the RPi references, I'm assuming the same guidelines will work? I was planning on using an RPi camera module because of its speed and IR capabilities. If you don't think it's fast enough, I also have a BBB lying around...

      • Developer

        Joe,

        My guess is that the RPi will also be able to do it and, like you say, the camera is apparently very quick. The RPi, ODroid, and BBB are all quite similar, so I think whatever we get going will be shareable. My guess is that some of this vision work is going to turn out to be the major advance for the project this year. There are a significant number of people in the community looking into it.

  • Moderator

    Jack Crossfire is our local expert.

