UAV projects with Blackfin processor and WiFi radio

X3D SRV1Console

After some distractions and delays, my UAV projects, first described here - are back on track. Yesterday, I received the needed firmware update for the AscTec (X3D) quad - it's now possible to control the quad via the serial interface with no FM transmitter in the loop. As before, the controller is an SRV-1 Blackfin Camera Board, which includes a 500MHz Analog Devices Blackfin BF537 processor with a 1.3-megapixel Omnivision OV9655 camera module and a Lantronix Matchport 802.11b/g radio. The firmware changes were critical for defining the quad's behavior upon loss of signal from the Blackfin.

As mentioned before, I have been working with a new quad rotor called the "X-3D-BL Scientific" from Ascending Technologies GmbH in Stockdorf, Germany, with the concept of integrating the SRV-1 Blackfin camera and radio board with the UAV flight controls. The interface is relatively simple - the X-3D-BL has a very capable onboard inertial measurement unit integrated with the brushless motor controls, so the link between the Blackfin and the UAV is just a 38kbps UART.
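To make the idea concrete, here is a minimal sketch of how navigation commands might be framed for a serial link like this. The field layout, sync byte, and checksum are my own illustrative assumptions - the actual AscTec serial protocol is not described in this post.

```c
#include <stdint.h>
#include <stddef.h>

/* Hypothetical command frame for a 38kbps UART link between a host
 * processor and a flight controller.  The real X-3D-BL protocol
 * differs; this layout and checksum are illustrative assumptions. */
typedef struct {
    int16_t  pitch;   /* signed stick deflection, e.g. -1000..1000 */
    int16_t  roll;
    int16_t  yaw;
    uint16_t thrust;  /* e.g. 0..4095 */
} quad_cmd_t;

/* Serialize a command into a byte frame: one sync byte, the raw
 * payload, then an 8-bit additive checksum.  Returns frame length. */
size_t quad_cmd_encode(const quad_cmd_t *cmd, uint8_t *buf)
{
    size_t n = 0;
    uint8_t sum = 0;
    buf[n++] = 0xA5;                       /* sync marker (assumed) */
    const uint8_t *p = (const uint8_t *)cmd;
    for (size_t i = 0; i < sizeof(*cmd); i++) {
        buf[n] = p[i];
        sum += buf[n++];
    }
    buf[n++] = sum;                        /* checksum over payload */
    return n;
}
```

On the Blackfin side, a frame like this would simply be written out the UART at 38kbps; the flight controller's IMU handles stabilization, so the host only has to send coarse attitude/thrust setpoints.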


My original robot firmware required only minor changes, and I added flight controls to our Java console that was originally designed for a ground-based robot. The 3 columns of buttons on the left are assigned to pitch, roll and yaw control, and the buttons further to the right change or kill the throttle or initialize the controllers. The last column changes video capture resolution. The Java software has an archive capability which I exercised here -

http://www.surveyor.com/images/x3d-srv1-012808.mov

This particular video clip isn't very exciting, as I never take the quad more than 1-2 inches off the ground, but it does show the live view from the quad via the WiFi link and is 100% under control via WiFi from a remote host. There were some pretty good crashes earlier, but unfortunately I wasn't running the archiver at the time. I need to fine-tune the flight controls, and then will hopefully capture some more interesting video.

While this project is furthest along, I now have firmware for the Blackfin board that can either control the airframe via serial interface (e.g. the X3D) or 4 servo channels. The next flyer will be my "baby Hiller" coaxial fixed rotor which steers by shifting battery weight, and then I will start working with the fixed wing "Carbon Prime". It's nice to be making progress again on these projects, and now that I'm back in it, everything else feels like a distraction.


Comment by rad man on January 28, 2008 at 5:51pm
that reminds me of this curious little tool (i wish i could insert a hyperlink but i guess you're going to have to copy/paste) http://www.thinkgeek.com/geektoys/rc/8698/
Comment by Howard Gordon on January 28, 2008 at 6:09pm
thinkgeek link - thinkgeek never updated their website when we changed from the ARM7 w/ Zigbee to the Blackfin w/WiFi, but that's basically the controller we're using
Comment by rad man on January 28, 2008 at 6:32pm
ahhh i see

Comment by Chris Anderson on January 30, 2008 at 7:59am
Howard,

Could you explain a bit more how this works? Do you actually fly the quadrotor with that Java app, like an RC transmitter? (If so, that sounds hard!) Or is the quadrotor autonomous, and you just "fly the camera" to points of interest, as with military UAVs, and let the aircraft fly itself?
Comment by Howard Gordon on January 30, 2008 at 9:13am
Yes - that's exactly what I'm doing - instead of remotely generating the servo control signals and relaying them via FM transmitter, the onboard Blackfin processor is generating the servo signals and receiving navigation commands from the Java app on the remote computer. It is somewhat challenging, though not impossible, to fly this way using the mouse and clicking on various controls (my son suggested adding a joystick or Wiimote to the computer for steering).
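The servo-signal side of this can be sketched in a few lines. Standard hobby R/C servos expect a pulse of roughly 1000-2000 microseconds (1500 centered); the mapping below uses that generic convention and is not taken from the SRV-1 firmware itself - on the Blackfin the resulting width would drive a hardware timer.

```c
/* Map a normalized control input (-1.0 .. +1.0) to a hobby-servo
 * pulse width in microseconds, using the common 1000-2000 us range
 * with 1500 us center.  Generic R/C convention, shown here as an
 * illustration rather than the actual SRV-1 firmware code. */
static int servo_pulse_us(double input)
{
    if (input > 1.0)  input = 1.0;   /* clamp out-of-range inputs */
    if (input < -1.0) input = -1.0;
    return 1500 + (int)(input * 500.0);
}
```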

The purpose of the exercise is to figure out what control signals are required to achieve basic steering and stability. One of the goals is to identify a minimal set of sensors required to achieve useful autonomous behaviors, completely taking the human operator out of the loop. From this starting point, I should be able to figure out how to program some basic behaviors, which ultimately can be coupled with sensor feedback (GPS, proximity, optical flow, altitude, visual landmarks, etc.) to develop autonomous behaviors. This is a complex computational problem - we essentially need to distill and replicate the effect of a huge number of "calculations" performed by a human operator with an R/C transmitter when steering an aircraft on a complete mission, including takeoff and landing.
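One of those pilot "calculations" can be made concrete as a feedback loop. The proportional controller below is a deliberately simple sketch - the gain, units, and the altitude-hold scenario are all assumptions for illustration, not anything from the actual project.

```c
/* Illustrative proportional controller: given an altitude setpoint
 * and a sensor measurement, compute a throttle correction, the way a
 * pilot continuously nudges the throttle stick.  The gain value is
 * an assumption chosen for illustration only. */
static double altitude_correction(double setpoint_m, double measured_m)
{
    const double KP = 0.08;   /* assumed proportional gain */
    return KP * (setpoint_m - measured_m);
}
```

In practice each sensor channel (GPS, optical flow, proximity, etc.) would feed its own loop like this, and tuning the gains is where most of the work lies.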
