As mentioned before, I have been working with a new quad rotor called the "X-3D-BL Scientific" from Ascending Technologies GmbH in Stockdorf, Germany, with the goal of integrating the SRV-1 Blackfin camera and radio board with the UAV flight controls. The interface is relatively simple - the X-3D-BL has a very capable onboard inertial measurement unit integrated with the brushless motor controls, so the link between the Blackfin and the UAV is just a 38kbps UART.
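To give a feel for what travels over that serial link, here is a minimal sketch of a control frame the Blackfin might send. The framing, field sizes, and checksum here are all assumptions for illustration - this is not the actual AscTec serial protocol.

```c
#include <stdint.h>
#include <stddef.h>

/* Hypothetical control frame - NOT the real X-3D-BL protocol, just an
   illustration of the kind of compact packet a 38kbps UART can carry. */
typedef struct {
    uint8_t  start;      /* frame marker, assumed '>' */
    int8_t   pitch;      /* -127..127 */
    int8_t   roll;
    int8_t   yaw;
    uint8_t  throttle;   /* 0..255 */
    uint8_t  checksum;   /* 8-bit additive checksum over the payload */
} ctrl_frame_t;

/* Sum the four control bytes, wrapping at 8 bits. */
static uint8_t ctrl_checksum(const ctrl_frame_t *f)
{
    return (uint8_t)((uint8_t)f->pitch + (uint8_t)f->roll +
                     (uint8_t)f->yaw + f->throttle);
}

/* Serialize a frame into buf (6 bytes); returns bytes written.
   On the real board, these bytes would be handed to the UART driver. */
size_t ctrl_frame_pack(ctrl_frame_t f, uint8_t *buf)
{
    f.start = '>';
    f.checksum = ctrl_checksum(&f);
    buf[0] = f.start;
    buf[1] = (uint8_t)f.pitch;
    buf[2] = (uint8_t)f.roll;
    buf[3] = (uint8_t)f.yaw;
    buf[4] = f.throttle;
    buf[5] = f.checksum;
    return 6;
}
```

At 38kbps, a 6-byte frame takes well under 2 ms to transmit, so even a naive packet like this leaves plenty of headroom for a fast control update rate.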
My original robot firmware required only minor changes, and I added flight controls to our Java console, which was originally designed for a ground-based robot. The 3 columns of buttons on the left are assigned to pitch, roll, and yaw control; the buttons further to the right adjust or kill the throttle and initialize the controllers, and the last column changes video capture resolution. The Java software has an archive capability, which I exercised here -
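On the firmware side, those console buttons reduce to simple commands that nudge the current flight setpoints. The single-character command set and step size below are hypothetical - a sketch of the dispatch logic, not the actual SRV-1 command protocol.

```c
/* Current flight setpoints held by the firmware. */
typedef struct {
    int pitch;
    int roll;
    int yaw;
    int throttle;
} flight_state_t;

/* Apply one console command to the setpoints. The command characters
   and the increment are assumptions for illustration only. */
void apply_command(flight_state_t *s, char cmd)
{
    const int step = 5;  /* assumed increment per button press */
    switch (cmd) {
    case 'f': s->pitch    += step; break;  /* pitch forward */
    case 'b': s->pitch    -= step; break;  /* pitch back */
    case 'l': s->roll     -= step; break;  /* roll left */
    case 'r': s->roll     += step; break;  /* roll right */
    case 'q': s->yaw      -= step; break;  /* yaw left */
    case 'e': s->yaw      += step; break;  /* yaw right */
    case '+': s->throttle += step; break;  /* throttle up */
    case '-': s->throttle -= step; break;  /* throttle down */
    case 'x': s->throttle  = 0;    break;  /* kill throttle */
    default: break;                        /* ignore unknown bytes */
    }
}
```

Keeping the per-press increments small is what makes incremental "button flying" over WiFi workable at all; larger steps would make the aircraft twitchy between updates.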
http://www.surveyor.com/images/x3d-srv1-012808.mov
This particular video clip isn't very exciting, as I never take the quad more than 1-2 inches off the ground, but it does show the live view from the quad via the WiFi link, and the quad is 100% under control via WiFi from a remote host. There were some pretty good crashes earlier, but unfortunately I wasn't running the archiver at the time. I need to fine-tune the flight controls, and then will hopefully capture some more interesting video.
While this project is the furthest along, I now have firmware for the Blackfin board that can control an airframe either via serial interface (e.g. the X3D) or via 4 servo channels. The next flyer will be my "baby Hiller" coaxial fixed rotor, which steers by shifting battery weight, and then I will start working with the fixed-wing "Carbon Prime". It's nice to be making progress again on these projects, and now that I'm back in it, everything else feels like a distraction.
Comments
The purpose of the exercise is to figure out what control signals are required to achieve basic steering and stability. One of the goals is to identify a minimal set of sensors required to achieve useful autonomous behaviors, completely taking the human operator out of the loop. From this starting point, I should be able to figure out how to program some basic behaviors, which ultimately can be coupled with sensor feedback (GPS, proximity, optical flow, altitude, visual landmarks, etc.) to develop full autonomy. This is a complex computational problem - we essentially need to distill and replicate the effect of a huge number of "calculations" performed by a human operator with an R/C transmitter when steering an aircraft through a complete mission, including takeoff and landing.
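The simplest version of coupling sensor feedback to a control output is a proportional loop, e.g. holding altitude from a single range measurement. The gain, units, and hover value below are assumptions for illustration; a usable controller would need integral and derivative terms plus real tuning.

```c
/* Sketch of closing the loop on one sensor: proportional altitude hold.
   Integer math only, as would suit a fixed-point Blackfin firmware loop.
   Gain Kp = 0.5 and all scaling are illustrative assumptions. */
int altitude_hold_throttle(int target_cm, int measured_cm, int hover_throttle)
{
    const int kp_num = 1, kp_den = 2;        /* assumed gain Kp = 1/2 */
    int error = target_cm - measured_cm;     /* positive -> too low */
    int out = hover_throttle + (error * kp_num) / kp_den;

    /* Clamp to the assumed 8-bit throttle command range. */
    if (out < 0)   out = 0;
    if (out > 255) out = 255;
    return out;
}
```

Each additional sensor (optical flow for drift, GPS for position) would add another loop of the same shape feeding pitch, roll, or yaw instead of throttle, which is one way to think about what a "minimal sensor set" has to cover.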
Could you explain a bit more how this works? Do you actually fly the quadrotor with that Java app, like an R/C transmitter? (If so, that sounds hard!) Or is the quadrotor autonomous, and you just "fly the camera" to points of interest, as with military UAVs, and let the aircraft fly itself?