My team and I have come a long way since I first posted on RC Groups about my interest in building a UAV.

I started pretty much from square zero, other than having some experience with Lego Mindstorms. I didn't know why my Stamp was shorting out when I didn't use a resistor with an LED. I managed to short-circuit my Parallax Servo Controller two or three times, ultimately resulting in the battery springing a leak. I didn't know what a green dot meant on a schematic, and I almost fried my FPU chip. But in time, I learned from my mistakes.

I have much to thank this website, the Parallax forums, and RC Groups for. I personally have learned ten times more from this project than from three years' worth of science and technology classes, and to have the money and freedom to design and research whatever you're interested in is a great feeling for an aspiring engineer. It is really a great experience.

After months of research, we drafted a project proposal and emailed it to BAE Systems, a defense contractor company with ties to the Honolulu school system, and our principal, with requests for sponsorship.

BAE Systems responded in about two weeks, expressing interest in our proposal and offering us $363 to build our prototype (they offered $455, but I declined the additional money, as I decided to hold off on buying the airframe until I was confident this project could be done). Shortly afterwards, our principal responded, asking us to give a PowerPoint presentation to her, a few teachers, and two engineers.

So we created a PowerPoint and crammed as much detail into it as possible (complete with Google Earth footage zooming in on a simulated flight path =P), which impressed the principal, who decided on the spot to fund the remaining $200 we needed to buy the Easy Star and a radio transceiver system. As it turned out, one of the engineers (who will be our mentor) actually has the exact same hobby (he uses the Easy Star as well)!

While all this was happening, we were also building a ground-based prototype to test our autopilot system. We used relatively low-tech materials (as you can see in the photograph), but it works fairly well and has good turning response. My only concern is that it is too slow to get any sort of accurate bearing from the GPS's differential positioning (in that case, though, we'll just use a digital compass to simulate a GPS bearing reading).
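For reference, a bearing between two GPS fixes comes from the standard great-circle formula. Here's a small Python sketch (generic math, not the Stamp program) that could be used to sanity-check the navigation code on a PC:

```python
import math

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing from fix 1 to fix 2,
    in degrees clockwise from north, range [0, 360)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    y = math.sin(dlon) * math.cos(phi2)
    x = (math.cos(phi1) * math.sin(phi2)
         - math.sin(phi1) * math.cos(phi2) * math.cos(dlon))
    return math.degrees(math.atan2(y, x)) % 360.0
```

Over the short baseline a slow rover covers between fixes, GPS noise can easily exceed the real displacement, which is why the compass fallback makes sense.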

Anyway, after printing out Chris's Basic Stamp autopilot source code, his GPS demo source code, pages and pages of floating-point coprocessor instructions and datasheets, and the PSC documentation, then analyzing, highlighting, and annotating all of it, I finally grew confident enough to code my own navigation program from scratch, though I pretty much used Chris's source code as an autopilot bible while I coded.

The current program works in a similar way to Chris's navigation program, except that it uses the floating-point coprocessor, which allows much more liberal use of variable storage, not to mention relatively high precision without having to resort to clever tricks with integer-only math. The other difference is that since the turning system is a differential drive with continuous-rotation servos, the turning angle feeds a function that sets the duration of the turn, rather than the position of a rudder.
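The angle-to-duration idea can be sketched like this (hypothetical turn rate and function name; the real program's gains and PBASIC code differ):

```python
def turn_duration_ms(heading_error_deg, turn_rate_deg_per_s=90.0):
    """For a differential drive that pivots at a roughly constant rate,
    convert a desired heading change into how long (in ms) to spin the
    wheels in opposite directions."""
    return abs(heading_error_deg) / turn_rate_deg_per_s * 1000.0
```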

There are some logic and math errors, especially with unit conversions and when I work with bits, bytes, words, and so on; I don't have any formal computer science training, so those concepts are foreign to me. But I think for the most part it looks okay. (It compiles!)

I am thinking of upgrading to the BS2P though, since currently I'm not sure how to implement waypoint lookups, other than just using constant arrays (do those even exist?!). The increased speed and scratchpad RAM will be nice.

So here is the program. For some reason you can't view it directly on our website (it gives a 404 error, though it works offline), but you can still download it from the homepage via right-click --> Save As; you'll need the Basic Stamp Editor to view it. (http://www.project-uav.net/index.html) I don't suggest reading the rest of the site beyond the front page, though, since it's mostly redundant information kept mainly for academic purposes.

Otherwise, I've also uploaded the code to pastebin, so you can view it there too. Unfortunately without syntax highlighting it is a huge eyesore, but hopefully you can follow it.

http://www.pastebin.org/18059

P.S. I type as soon as a thought gets in my head, so in retrospect it kinda sounds like I'm rambling, so please excuse that.

3D Robotics
In an earlier post (and here), I showed how to switch the Basic Stamp autopilot from the Parallax GPS module to the better EM406. The reason to do this isn't just the superior reception of the EM406; it's also that the original version of the Basic Stamp autopilot is based on the Parallax GPS Module's "smart mode" of receiving GPS data, which is incompatible both with all other GPS modules and with all known GPS simulators.

So this was as good a time as any for me to upgrade the Basic Stamp autopilot for the EM406, so we can both use a better module and use GPS simulators (as described in this tutorial). If you want to use a GPS simulator with your Basic Stamp autopilot, you MUST make this change.

Here's the beta code of my port to the EM406. It was kind of a hassle, due to the different ways the two modules deliver GPS data, but the new version has a built-in GPS parser, is otherwise a lot more robust, and can handle any GPS module you want to pair it with. It's also cleaned up a bit, so it's simpler and easier to follow. Tomorrow I'll test it with hardware in the loop and see how it does...

[UPDATE: I caught a lot of bugs in this code, including my stupidity in forgetting that if you place a value in two bytes of scratchpad memory with a PUT [location] Word [value] command, you've got to do the same when you GET: i.e., GET [location] Word [variable]. In the previous code I was PUTing with a Word modifier, but not including that modifier when I'd GET. Anyway, it's all fixed in the code linked in the paragraph above. I still need to do some tweaking to get the conversions right within the limited variable space, but we're pretty close.]
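The PUT/GET asymmetry is easy to model outside PBASIC. A quick Python illustration of the scratchpad behavior (a toy model, not Stamp code; the low-byte-first ordering is an assumption): a 16-bit value stored across two bytes must be read back as a word, or you silently get only one byte of it.

```python
def put_word(mem, loc, value):
    """Store a 16-bit value into two consecutive bytes (low byte first)."""
    mem[loc] = value & 0xFF
    mem[loc + 1] = (value >> 8) & 0xFF

def get_word(mem, loc):
    """Read both bytes back: the equivalent of GET with the Word modifier."""
    return mem[loc] | (mem[loc + 1] << 8)

def get_byte(mem, loc):
    """The buggy read: GET without the Word modifier sees one byte only."""
    return mem[loc]
```

For any value above 255, `get_byte` returns only the low byte, which is exactly the kind of silent corruption described in the update above.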

3D Robotics

If you've been following the exploits of IceBear and a few others, you know that recently they've been beta testing a hot new autopilot that promises to be better and cheaper than anything else in the under $1,000 category. It's called the AttoPilot and I'm delighted that our third interview is Dean Goedde, its creator, who is here to tell us more.

Before we start, a little overview: To date, if you wanted to build an under-$1,000 UAV you only had two choices for the autopilot--make it yourself or pay $500-$800 for a UNAV PicoPilot (and the version of that with the features you want--the NAT--will push the total cost of the UAV to about $1,500).

I'm flying with PicoPilots and generally like them, but the relatively high cost and low feature set of that hardware (plus its total lack of customizability) is why Dean decided to make something cheaper and better. The picture above is him after one of his first successful test flights, which he says shows what he looks like "after 30+hours being awake hammering out new code, followed by generation of 6MB of log data from 20 minute flight!"

AttoPilot could revolutionize our hobby. Dean hasn't announced exact pricing yet, but he's targeting roughly half the price of the PicoPilot and the feature set matches those of autopilots costing thousands of dollars more. The early reports from the beta testers have been very positive.

This is still pre-release, so Dean doesn't have the website up yet (it will eventually be at attopilot.com), and there is still much to be done on manufacturing, documentation, and fulfillment. But I'm thrilled by what we've seen so far, and I'm hoping that AttoPilot will be powerful, customizable, and affordable--which is really what DIY Drones is all about. And, as you'll see at the end of this post, two years ago he started just like us, learning basic embedded programming with Parallax kits.

The interview will be in three parts:

Here's Part 1:

Q: Can you tell us a little about yourself, day job, background, family, education, hobbies, etc?

I'm 36 years old, born in 1971 in the middle of the Apollo moon rocket days. At ages 4 and 5, I would take construction paper, color it red with a crayon, chop it up (to increase surface area), and pack it in a tube with fins and a nosecone. I was surprised when it didn't "go," but I can't remember if I tried to apply fire or not (I hope not!).

My mom's older brother is my godfather, and he is like a second dad. He grew up in the 1940s building free-flight planes, then did some of the earliest RC in the '50s and '60s (single-channel "escapement," if you've ever heard of it: a rubber motor provides the torque for a bang-bang left/right/neutral choice, driven by a single pushbutton on the Tx). He was a maintenance guy in the US Air Force in the early 1960s (heady times indeed) and is an electronics guru who worked on missile silo maintenance.

In the 1970s he got into multi-channel Heathkit RC gear, and he flies that same equipment to this day. I have memories back to age one or so, so I clearly remember a warm summer afternoon around 1975: I was playing alone in the front yard (a rural area) when a 2m sailplane landed on the lawn. About 30 seconds later, here comes my uncle, sitting on the open window jamb of a car driven by his sister-in-law, coming down the dusty rock road at 50 mph, hanging out of the car with the Tx in hand. He had just flown the plane from his house about two miles away.

He was also the type of uncle who is very generous: bringing a plane to family get-togethers, letting all of us kids fly it, being helpful, and even showing us how to do stunts. He is just such a super guy and a BIG reason I love this stuff that I HAD to dedicate a paragraph to him. He is currently waiting for serial #00001 of the production version of AttoPilot. I talk to him about three times a week.

I started building free-flight balsa planes of my own design at age 8. At age 9 I was fascinated with the Wright Brothers, reading a book about them in the 4th-grade classroom literally 100 times that year. This book was not dumbed down for kids; it described all of the background, the trial and error, the wind tunnel testing, the extensive testing over the Carolina sand dunes, their homemade gasoline engine, and more. I was enthralled. For my 11th birthday my uncle and aunt bought me a 2m balsa sailplane kit from Hobby Lobby called the Wanderer. It cost $14 as a kit back in 1982. My parents told me that if I finished it, they would buy the RC equipment.

I built it 100% myself; the covering job looked crappy, but it flew well. I added a 0.049 engine pod, then the next year an Astro 05 electric motor, direct drive on a 7x4 prop. Electric fascinated me from the beginning. From ages 12 to 15 I built many more RC planes, all of them self-designs. The first five or six just did not fly, but that only made me try harder. I then built a Goldberg Eaglet-50 (0.25 engine, full-house controls) and was even more hooked.

I studied music performance (trumpet) for the first three years of undergrad, but then switched to chemistry. After my bachelor's degree in chemistry, I got a PhD in chemistry at the University of Illinois at Urbana-Champaign. In 2001 I moved to Phoenix, Arizona to work for Intel Corp (the chipmaker) at a high-volume fab as an engineer, and I have worked for them ever since.

At my day job, I do a lot of data mining, looking for correlations between effects and their possible causes. I have seven years of experience sifting through literally terabytes of real-world data from $3 billion chip-making fabs. Looking at data is in my blood, as are iterative methods of every sort. I am an Excel and SAS JMP guru, though until recently I had no programming experience.

Hobbies include flying RC, AutoCAD design of airplanes and laminate rifle stocks, gunsmithing and target shooting, aircraft construction, and now (the last 2 years) electronics and programming. I like to expose/develop/etch my own PCBs at home for rapid prototypes, and am able to do solder pasting of the smallest SMD components (0402, QFN, etc...).

I have almost no background in electronics or programming except self-taught starting in January 2006. I must recommend the Parallax company for EXCELLENT tutorials/kits/discussion forums to help utter beginners. That is where I was 24 months ago.

3D Robotics

Q: You decided to go with an IR sensor (aka "thermopile", shown) rather than a gyro and accelerometer in an IMU for the first version. Can you explain how you came to that decision?

I have found that even without fancy code, the thermopile is a surprisingly robust solution; a LOT of bang for the buck. This keeps my AttoPilot in the spirit of the "simplest low-cost, 3-axis autopilot."


My first flights were with very stable aircraft (the Miss2 old-timer). Even with this plane, over-steering was a problem at times. I realized some method of attitude sensing was a strict requirement for any serious autopilot.


I started out flying the FMA CoPilot [which is based on the IR sensor shown above] only in its typical RC usage, on a small unstable homemade plane with no dihedral and just aileron/elevator flight control. After experimentation, I realized that AttoPilot could skew the inputs to the CoPilot control box, and all of a sudden I had AttoPilot flying that same small unstable plane (at 70+ km/h) even better than the raw AttoPilot used to fly the polyhedral-winged Miss2.


Later, I reverse-engineered the CoPilot sensor head and figured out how to interface it directly to AttoPilot via a 2-channel ADC. (By the way, there is misinformation on the Paparazzi website regarding the FMA sensor head: it says the CoPilot head is designed for 5V, but it is actually a 3.3V device, so you don't need to replace resistors to make the gains correct.) So now AttoPilot has direct use of the thermopile data, and therefore direct knowledge of pitch and roll angle, for a robust flight control solution that will never over- or under-control.


I am integrating GPS and barometer data with the thermopile, so the solution is very robust, and self tuning. Additionally, although I have a working Kalman IMU, accelerometers have the limitation of being affected by motor vibration, whereas the thermopiles are immune to vibration.


So, in the end, to keep in the spirit of a low-cost, full-function autopilot, I am starting out by offering the thermopile version. As I wrote above, my use of thermopiles is more sophisticated than FMA's, so people should not be fearful about how well it will control their plane. I do have a working Kalman IMU, but it's just not ready to a level that fits my vision of a plug-and-play autopilot. Building all of the fancy control routines for my usage of the CoPilot thermopiles is a nice stepping stone to a Kalman IMU later.


Q: What was the biggest challenge in designing AttoPilot?

Besides learning embedded programming over the last 12 months, and besides coming up with trigonometry routines that work anywhere on the globe, the biggest challenge came from what I learned with the beta testers: making the Rx-interface code object TRULY universal across all the R/C receivers out there. I had to crack down on myself and learn assembly code, which I had avoided up until two months ago. Now that I know assembly, I am not only HOOKED on it, but I'm rewriting other code objects in it.

Q: What advice do you have for others that are interested in building autopilots or related gear?

These things are difficult, like Masters or PhD thesis level work, so don't kid yourself that it is easy. Have passion for it. If you are so determined to make something happen that the thought of it not working out makes you very unhappy, then you WILL find a way to get around ANY barrier, including lack of knowledge. Not knowing how to do something is the WORST reason for not doing it! Read "Think and Grow Rich" by Napoleon Hill; passion for something is the #1 ingredient of success.

[Part 1 of this interview series is here. Part 2 is here.]

3D Robotics

Q: In a nutshell, what's the AttoPilot project about?

The AttoPilot project is about bringing a full-function, powerful autopilot to a larger crowd, many of whom aren't interested in "DIY," just in something they can buy and use. I myself wanted to buy a hobby autopilot in late 2005, knowing nothing except "I'll look on the Web; surely I'll find something."

I was SHOCKED that the cheapest unit that met my basic requirements cost $900 (the UNAV NAT). Everything I found was either really heavy/expensive or homemade-looking. I knew there was much room to improve, even for somebody like me with no idea (at the time) how to make an autopilot that I would trust to fly my airplane.

So, I started out only with ideas of how it "should" work, with no idea of what roadblocks the hardware and coding might have in store for me. I think this naivete was an asset. Features of the initial release hardware rev (other versions are coming, some smaller):
  1. Low weight: 36 grams (1.25 ounces) is the TOTAL added to your airplane, including GPS, antenna, and sensors. 50% of that is just the CoPilot head and cable (18 grams). The IMU version is lighter, at 22 grams total system mass. [Note: the pictures here are of prototype hardware]
  2. Small size: Atto is a single package of about 25x20x10mm with right-angle header connections to the servos, like a small 6 channel Rx. The only external pieces are the GPS module and CoPilot sensor head.
  3. Rx interface: Atto reads the channels from your Rx through the servo jacks, no soldering to the Rx board. Atto is designed to read 3V/5V logic signals automatically, and ANY pulse order from the Rx, be it overlapping or serial, Atto doesn't care. It reads with precision to < 1us.
  4. Integrated data logging to micro SD card, and USB jack for interface to your PC. Currently in testing, I have about 20 parameters logged at user-configured sampling rate, up to 5 Hz. You can view the flight history in Google Earth as a 3D path, plus analyze many details of the flight, including how the autopilot performed, and what it was "thinking" at the time. You can use this data to know precise location of aerial photos.
  5. 100,000 waypoints in full 3D. Make your flight plan graphically in Google Earth, then use a simple formatting application in Windows to hand-edit the path and add an XOR checksum to each line. The finished file goes onto the SD card via the USB jack.
  6. No limit on the distance between waypoints, or on how far the unit can fly from its starting point. Proven to work in all hemispheres (N/S, and E/W of the prime meridian).
  7. Integrated static and dynamic barometers with special data-filtering routines, for barometric altitude control (cross-referenced against GPS altitude) and pitot-tube airspeed.
  8. Currently 8 triggerable events at each waypoint. What the 8 events are is user-configurable, be it positions of auxiliary servos, etc. Useful for aerial photography.
  9. Drives up to 6 servos, with user-configured mixing of servo outputs (for V-tails, elevons, or even just 2 separate aileron servos).
  10. Telemetry jack for the XTend modem by MaxStream. This is still being developed but is on track. The downlink to moving-map software works now, but the goal is 2-way comms to take R/C control at long range, upload new waypoints, and issue commands such as RTL.
  11. Stability/attitude control: FMA's CoPilot thermopile head is used in a special way. By fusing the thermopile data with barometer and GPS data, a very robust and self-optimizing system is created: auto-trimming during flight, auto gain adjustment.
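The per-line XOR checksum in item 5 presumably works like an NMEA sentence checksum: XOR the byte values of every character in the line and append the result as two hex digits. A minimal Python sketch (the actual waypoint file format is Dean's and isn't public, so the layout here is hypothetical):

```python
def add_checksum(line: str) -> str:
    """Append an NMEA-style XOR checksum to a waypoint line."""
    cs = 0
    for ch in line:
        cs ^= ord(ch)  # XOR each character's byte value
    return f"{line}*{cs:02X}"
```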

Pricing - I can't comment on this yet, as I don't know what it will cost to have these made by a professional assembly house (I will not sell a hand-made AttoPilot, people deserve at least the same quality as is in their TV remote). I can tell you that my desire is to sell something 20x more functional than anything else currently available, and for about HALF the price of other hobby autopilots. Many people are not interested in DIY, but rather plug-and-play. Of course, there is much more to a drone than the autopilot, so I think these people still fall in the spirit of "DIY Drones".

Schedule of availability: Summer 2008. Beta testing is still going strong with rapid roll-outs of firmware upgrades, and hardware add-on PCBs that will be integrated into final production version.

Q: What's the origin of the name "AttoPilot"?

Well, you have micro and pico pilots, and those refer to the SI prefixes for a millionth and a trillionth (10^-6 and 10^-12), respectively. I have talked to Curt Olson off and on, so I know that guys like him don't need the tiniest autopilot, but for people who do, I'll offer something smaller than the "pico" pilot. "Atto" is the SI prefix for 10^-18, or a millionth of a trillionth. Also, Atto sounds like Auto, so instead of an auto-pilot, you get the name Atto-Pilot!


BTW, the beta AttoPilot uses a pretty big DIP processor and breakout boards from sparkfun.com, but even then, with GPS, the unit comes in at 1.6 ounces, which is already HALF the weight of the PicoPilot's assembly of PCBs for their complete system. By going to a newer GPS than the ETek unit and making Atto 100% SMD [surface-mount devices], the mass drops to the figures I quoted, in the sub-1-ounce range.


In addition, I have a 6-gram "shrink" version with an integrated Kalman IMU on the same PCB, along with integrated GPS, antenna, and other sensors. This version is suitable to fit INSIDE the wing of a tiny MAV (like a 6"-span plane). It's a work in progress, but my point is that my autopilots will TRULY be atto-sized over the coming year.


[You can read more about this project and its inception in these two RCGroups threads: the latest beta details, and the thread where the project's origin emerged.]


[Part 1 of the interview is here. Part 3 is here.]

Developer

Final version of ArduIMU (includes video)

After a few more tests, I now have the final version of the ArduIMU. The main problem was the calibration of the gyros: converting the raw data to degrees/s. I had to change the source code, compile it, upload it, and repeat, over and over again. A total waste of time. So I configured the Arduino to adjust the gyros and the accelerometer jitter filter externally, using the nunchuck potentiometers (joystick axes). Now it's working pretty well. =)

In the source code, the lines marked "//(kalman)" relate to the Kalman filter code, those marked "//(nunchuck)" relate to the nunchuck accelerometer and axis control, and those marked "//(gyro)" relate to control of the gyroscope. Another tip: all the data is processed in radians, then converted to degrees (radians * 180/PI = radians * 57.295779 = degrees). If somebody needs any help, please feel free to ask me.

Video: yellow = accelerometer reading only; red = accelerometer + gyro + Kalman filter.

Arduino source code: http://sites.google.com/site/jordiuavs/Home/kalman_filter_by_Jordi.txt
Labview (ground station) source code: http://sites.google.com/site/jordiuavs/Home/kalman_filter_by_Jordi.txt
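The accel/gyro fusion described above can be illustrated with a much simpler complementary filter (a stand-in sketch in Python, not Jordi's Kalman code), along with the radians-to-degrees factor he quotes:

```python
import math

RAD_TO_DEG = 180.0 / math.pi  # = 57.295779..., the factor mentioned above

def complementary_update(angle_deg, gyro_deg_s, accel_angle_deg, dt, alpha=0.98):
    """One filter step: integrate the gyro for short-term accuracy,
    then nudge toward the accelerometer's noisy absolute angle."""
    return alpha * (angle_deg + gyro_deg_s * dt) + (1.0 - alpha) * accel_angle_deg
```

The `alpha` weight is an assumed tuning value; a Kalman filter computes an optimal equivalent of it from the noise statistics instead.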

To do:
  • Integrate another gyro (to make it dual-axis)
  • Program the SPI altimeter
  • Put all the hardware together and make it work (GPS, altimeter, IMU, modem)

Done:
  • Servo control
  • GPS waypoint navigation (I'll post about it later)

In the next article I'm going to post all the links you need to get started with the Arduino environment: tutorials, examples, references, tips, and tricks. If you already know C++, it's going to be 99% easier. If not, you're going to discover why C/C++ is the most comprehensive and famous language ever ;-) After that, jumping to any AVR, ARM, or PIC microcontroller C/C++ environment will be a lot easier, because they're all based on the same principles.

3D Robotics
Every member of DIY Drones has the ability to create their own blog posts and discussion topics (Please do! This is a community site and its vitality depends on your participation).

Doing so is easy: click here to create a blog post or click here to create a discussion topic (those authoring links are also at the bottom of the respective columns on the front page). But which should you choose?

The following are some guidelines and tips on effective posting on DIY Drones, whether you choose discussion topics or blog posts.

First, the practical differences:

Blog posts:
  • Have more layout control, including inline videos, along with some basic HTML layout and typography tools.
  • Are included in the RSS feed. So when you post a blog entry, subscribers to the feed can read it, even if they don't visit the site. That means more readers.
  • Are posted in the center column in chronological order, and stay in that order.
Discussion topics:
  • Have fewer typographical and layout options
  • Can include files
  • Are not included in the RSS feed
  • Rise and fall on the list at left based on activity (responses)
When should you create a blog post versus a discussion topic?

Blog posts are best when:
  • You have something to say, be it a project report or an interesting bit of news found elsewhere
  • There are pictures or videos to illustrate the point
  • You're writing something longer than a hundred words or so
Discussion topics are best when:
  • You have a question or need help
  • You want to raise an issue for discussion
  • You want to share some code and solicit comment
Finally, here are some tips for good blogging:

  • Start with an image (put the cursor before the first word on the page, and insert an image there). Click the image icon, upload the image and select the following options: wrap, right, 300 pixels (or 200). This will ensure that the post excerpt on the front page shows an image, which is important if you want to be read.
  • Turn all your URLs into proper links by selecting the text you want to be the link, clicking on the chain icon, and pasting the URL in the box.
  • If you edit the blog post later, pick "Choose a date and time" and then enter a date and time close to your original posting time. This ensures that your blog post will stay in chronological order, and not jump to the top of the list regardless of the original posting date.
  • Images are good. More images are better. Video is better yet.
That's pretty much it. Happy posting!

3D Robotics
Okay, now that we've connected our PC to our autopilot's GPS input, let's let Curt walk us through all the things we can do with FlightGear. Starting with this:

Q : Is there a way to use FlightGear to plot a course (waypoints) and have it output GPS to an autopilot as the simulated plane in FlightGear flies that course? Or even output GPS while you fly the plane in FlightGear manually?


For any FlightGear flight (manually flown, flown on autopilot, etc.) you can continually output GPS sentences over a serial port. There is a command-line option that lets you choose exactly what you want to do:

It looks something like:
<protocol>=<channel>,<direction>,<hertz>,<channel_specific_options>

So if you wanted to output standard NMEA strings on COM1 at 1hz using 4800 baud, you would do this:
nmea=serial,out,1,COM1,4800
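On the listening side, whatever reads that stream just parses standard NMEA sentences. A minimal Python sketch for extracting decimal-degree latitude and longitude from a $GPGGA sentence (no checksum validation; illustration only, not part of FlightGear):

```python
def parse_gga(sentence):
    """Return (lat, lon) in signed decimal degrees from a $GPGGA sentence,
    or None if it isn't one. NMEA encodes lat as ddmm.mmmm, lon as dddmm.mmmm."""
    f = sentence.split(',')
    if not f[0].endswith('GGA'):
        return None
    lat = float(f[2][:2]) + float(f[2][2:]) / 60.0  # degrees + minutes/60
    if f[3] == 'S':
        lat = -lat
    lon = float(f[4][:3]) + float(f[4][3:]) / 60.0
    if f[5] == 'W':
        lon = -lon
    return lat, lon
```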

With this option activated, you can connect a variety of moving-map applications, or a PDA, or just about anything else that is capable of connecting to a GPS and doing something with it. For instance, there is a Palm Pilot moving-map application designed for flying, and it can be slaved to a running copy of FlightGear. Here's a video of that:

This particular flightgear command line option is very powerful and allows you (with different configuration choices) to send data through a network socket to another application. FlightGear can also receive data through this same mechanism and draw the view based on the external data.

So you could plug a real gps into flightgear and have flightgear draw a 3d view based on the gps data.

Or you could slave one (or more) copies of FlightGear to a single master copy (sending and receiving the data at 60Hz over a network connection) and create a multi-monitor distributed simulator system with wrap-around visuals. That's what we've done here:

File input/output is also an option for this command, so, for another example, you could fly some route, log the NMEA/GPS strings to a file as you fly (instead of sending them out a serial port), and then feed that file back into your autopilot system later for testing and development.

This all might sound a bit crazy, but one of the design goals of FlightGear is (as much as possible) to be open, flexible, and adaptable to new situations. Exposing a variety of ways to send and receive data allows FlightGear to be adapted to a huge variety of uses.

So just to summarize ...

OUTPUT:


FlightGear can be used to drive external moving maps, external GPS units, and even other copies of FlightGear to create a multi-monitor wrap-around visual configuration.

INPUT:

FlightGear can take input from a variety of sources, so it can be slaved to an external GPS or IMU outputting data in real time, or it can be used to replay a saved flight. I talked to an engineer who had to ride in the back of a windowless 767 and wanted to interface FlightGear to the live 767 systems so he could see what was going on while he flew.

MULTIPLAYER:


Don't forget the FlightGear multiplayer system. If you are slaving FlightGear to some external data source, but have that same copy of FlightGear connected to the FG MP system, suddenly you have the possibility of injecting a real, live flight into the virtual FlightGear world, and anyone else logged onto the multiplayer system can watch your real UAV fly in the virtual world. They can come up and fly with you, etc. If you have multiple UAVs registered in the system, they can all see each other in their respective synthetic views, they can be highlighted/circled on the HUD overlay, etc. It's almost a little mind-bending to think about, but it's all there and has been successfully demoed. Oh, and we have a Google Maps-based multiplayer map system, so if you do have your live UAV flight registered in the MP system via a locally running slaved copy of FlightGear, anyone with a web browser, anywhere in the world, can track your UAV live.

I don't know if all of these things are immediately useful to everyone (or may never actually be useful for anyone) but sometimes you do things just because they are cool and fun and because you can, and somewhere, someplace you hope you've scored a couple geek points. :-)

[In our next installment of this series, we're going to drill down on connecting FlightGear to an IMU...]


X3D SRV1Console

After some distractions and delays, my UAV projects, first described here, are back on track. Yesterday, I received the needed firmware update for the AscTec (X3D) quad: it's now possible to control the quad via the serial interface with no FM transmitter in the loop. As before, the controller is an SRV-1 Blackfin camera board, which includes a 500MHz Analog Devices Blackfin BF537 processor with a 1.3-megapixel Omnivision OV9655 camera module and a Lantronix Matchport 802.11b/g radio. The firmware changes were critical for defining the quad's behavior upon loss of signal from the Blackfin.

As mentioned before, I have been working with a new quadrotor called the "X-3D-BL Scientific" from Ascending Technologies GmbH in Stockdorf, Germany, with the aim of integrating the SRV-1 Blackfin camera and radio board with the UAV flight controls. The interface is relatively simple: the X-3D-BL has a very capable onboard inertial measurement unit integrated with the brushless motor controls, so the link between the Blackfin and the UAV is a simple 38kbps UART.


My original robot firmware required only minor changes, and I added flight controls to our Java console that was originally designed for a ground-based robot. The 3 columns of buttons on the left are assigned to pitch, roll and yaw control, and the buttons further to the right change or kill the throttle or initialize the controllers. The last column changes video capture resolution. The Java software has an archive capability which I exercised here -

http://www.surveyor.com/images/x3d-srv1-012808.mov

This particular video clip isn't very exciting, as I never take the quad more than 1-2 inches off the ground, but it does show the live view from the quad via the WiFi link and is 100% under control via WiFi from a remote host. There were some pretty good crashes earlier, but unfortunately I wasn't running the archiver at the time. I need to fine-tune the flight controls, and then will hopefully capture some more interesting video.

While this project is furthest along, I now have firmware for the Blackfin board that can either control the airframe via serial interface (e.g. the X3D) or 4 servo channels. The next flyer will be my "baby Hiller" coaxial fixed rotor which steers by shifting battery weight, and then I will start working with the fixed wing "Carbon Prime". It's nice to be making progress again on these projects, and now that I'm back in it, everything else feels like a distraction.
Read more…
3D Robotics
Matt Chave's CUAV project is a great example of custom autopilot building, but one of the most interesting parts about it is that it uses magnetometers--digital compasses--in addition to the regular accelerometers and gyros. In the second part of our interview (the first part is here) I asked him to explain why:

Q: Can tell us a bit more about why you chose magnetometers and what you've learned working with them?

A few reasons:

We chose magnetometers because the earth's magnetic field is well modeled and provides a very good inertial reference to the aircraft.

We believed that the world magnetic model would provide sufficient coverage and accuracy for us to use for the inertial references. So far we've found it acceptable.

The magnetometers available are capable of detecting rotations to hundredths of a degree. We're not that accurate, but we only need a few degrees of accuracy for the current airframe.

We didn't believe it would be possible to calculate the attitude using a Kalman filter with gyros and accelerometers because of the clock cycles this would consume, and without floating-point maths the solution could become too inaccurate.

We were concerned with the possible noise issues, however we found that most of the noise was high frequency and easily filtered out.

Since we're concerned with frequencies below 50Hz, the main problem is calibration for soft iron effects in the aircraft. These are calibrated by rotating the aircraft (easy while it's this small), recording the results, and then applying the appropriate biases and offsets.
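A common first-pass version of that calibration (not necessarily the exact procedure CUAV uses) is to rotate the sensor through as many orientations as possible, take the midpoint of each axis's range as the hard-iron bias, and rescale the axes so their spans match:

```python
def calibrate(samples):
    """First-order magnetometer calibration from readings collected
    while rotating the sensor through many orientations.
    samples: list of (x, y, z) raw readings.
    Returns (bias, scale): a hard-iron offset per axis, and a gain that
    equalizes the per-axis spans (a crude soft-iron correction; a full
    correction would need an ellipsoid fit)."""
    axes = list(zip(*samples))
    mins = [min(a) for a in axes]
    maxs = [max(a) for a in axes]
    bias = [(lo + hi) / 2.0 for lo, hi in zip(mins, maxs)]
    spans = [(hi - lo) / 2.0 for lo, hi in zip(mins, maxs)]
    avg = sum(spans) / 3.0
    scale = [avg / s for s in spans]
    return bias, scale

def correct(reading, bias, scale):
    """Apply the calibration to one raw reading."""
    return [(r - b) * s for r, b, s in zip(reading, bias, scale)]
```

With the biases and scales stored, every subsequent reading is corrected before it enters the attitude solution.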

The magnetometers provide highly repeatable results with a long lifetime, they're very small, consume little power and perhaps could allow us to fabricate a 'system on a chip' solution.

They're also successfully proven on LEO (low earth orbiting) satellites.

Q: Why couldn't you use the magnetometer for roll, too? Indeed, with enough processing power, could you imagine a full-featured autopilot with magnetometers alone?

Unfortunately, a single measurement of a vector can only provide two axes of information. There are papers out there which describe using Kalman filters to include dynamics and capture multiple measurements to resolve the third axis without requiring another sensor. However, as far as I'm aware, these algorithms are only used on LEO satellites and require the collection of data from a complete orbit to provide a solution. We also don't have the gravity-gradient-stabilized airframe which they rely on, so we chose a roll-stabilized airframe and assumed that the roll would be zero (of course that's not true, but it helps to make it possible).

A good place to read more about the many different attitude determination algorithms is some of the papers available through Google scholar.

To explain it for those of us who are more visual, the image shown here [at the top of the page] shows a single vector measurement viewed in the inertial frame. You can imagine the x, y and z field measurements which the plane would make in the body frame.

Imagine the airframe glued to a cone centered around the vector, with the cone's apex at the tail of the airframe. If you were to rotate that cone about its axis (the vector), the measurements of the vector in the aircraft's reference frame would be identical. So there is mathematically no unique solution for the attitude without further information (e.g. zero roll). If we add in another vector, we are able to remove this ambiguity.
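The cone argument is easy to verify numerically: any rotation of the airframe about the measured vector leaves the body-frame measurement unchanged. A small sketch using Rodrigues' rotation formula (the example field vector is arbitrary):

```python
import math

def rotate(v, axis, angle):
    """Rodrigues' formula: rotate v about the unit vector `axis` by
    `angle` radians."""
    c, s = math.cos(angle), math.sin(angle)
    dot = sum(a * b for a, b in zip(axis, v))
    cx = (axis[1]*v[2] - axis[2]*v[1],
          axis[2]*v[0] - axis[0]*v[2],
          axis[0]*v[1] - axis[1]*v[0])
    return tuple(v[i]*c + cx[i]*s + axis[i]*dot*(1 - c) for i in range(3))

# A (unit) magnetic field vector as measured in the body frame:
field = (0.6, 0.0, 0.8)

# Rotating about the field vector itself returns the same vector, so
# every attitude on that cone produces an identical measurement.
for angle in (0.5, 1.5, 3.0):
    m = rotate(field, field, angle)
    assert all(abs(a - b) < 1e-12 for a, b in zip(m, field))
```

The same `rotate` function applied about any other axis does change the measurement, which is why adding a second, non-parallel vector resolves the ambiguity.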

Read more…
3D Robotics

Interview: Matt Chave and the CUAV project

For the second interview in our series, we'll go to Australia, where aerospace engineer Matt Chave has been developing a very interesting open source UAV that uses magnetometers in an interesting way (and he's using FlightGear for simulation!). Called CUAV (formerly Apeliotes), it's pretty much state of the art for amateur UAVs and is full of good ideas that we can all use.


Q: Can you tell us a little about yourself, day job, background, family, education, hobbies, etc?

I've always had a fascination with aircraft, physics, maths, programming and electronics, among other things, and couldn't think of a more perfect way to combine them all than an aircraft flight control system. While I was an undergrad I managed to convince my supervisor at uni that we could make an R/C autopilot for my honors year (actually, it didn't take much convincing; he's a bit of a modeler himself). So I did plenty of reading over the summer holidays while I worked for a navigation company in New Zealand called Navman, and was ready to hit the ground running the following year implementing it. The original design was called the Apeliotes project and included the hardware and software to fly a small polystyrene plane. The controller used an ARM7 micro on a custom board:

The only sensor was a three-axis magnetometer, used to determine the error in pitch and heading from the desired attitude, assuming the roll angle was zero. Apeliotes flew fairly well and managed several successful test flights, switching between remotely piloted and autopilot modes while in flight and recovering from any attitude our test pilot could put it in back to straight and level flight towards geographic north. You can see the video here.

I haven't given up on the magnetometer-only design yet, but it was difficult to make it fly anything other than north and level. Not that there's anything special about geographic north for the magnetometer, especially in Dunedin where we were testing (inclination 70+ degrees, declination about 20 degrees). But I was taking too much time, and we needed to detect roll anyway to fly more sophisticated models. So when I continued the project as a masters with the uni, we added a three-axis accelerometer and toyed with a few computationally inexpensive attitude determination algorithms, including downhill simplex minimisation of Wahba's loss function, TRIAD, QUEST, etc.
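Of the algorithms mentioned, TRIAD is the simplest to illustrate: given two vectors measured in the body frame (say, the magnetic field and gravity) and the same two vectors known in the reference frame, it builds an orthonormal triad in each frame and combines them into an attitude matrix. A plain-Python sketch, not CUAV's actual implementation:

```python
import math

def norm(v):
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

def cross(a, b):
    return (a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0])

def triad(b1, b2, r1, r2):
    """TRIAD attitude determination.
    b1, b2: two non-parallel vectors measured in the body frame;
    r1, r2: the same vectors expressed in the reference frame.
    Returns the 3x3 matrix mapping reference coordinates to body
    coordinates."""
    tb = [norm(b1)]
    tb.append(norm(cross(b1, b2)))
    tb.append(cross(tb[0], tb[1]))
    tr = [norm(r1)]
    tr.append(norm(cross(r1, r2)))
    tr.append(cross(tr[0], tr[1]))
    # A = tb1*tr1' + tb2*tr2' + tb3*tr3'  (sum of outer products)
    return [[sum(tb[k][i] * tr[k][j] for k in range(3))
             for j in range(3)] for i in range(3)]
```

Note the asymmetry: the first vector is trusted exactly and the second only contributes its component perpendicular to the first, which is why the more accurate sensor is passed as `b1`/`r1`.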

A new controller board was created (below) with the lessons learned from the first along with many other developments and the project changed its name to the CUAV (custom unmanned air vehicle). Yeah, I was sick of the Apeliotes name :)

I've now moved to Australia with the project landing me a great full time aerospace job and am continuing the open source development of the CUAV in my spare time.

Q: What's unique about CUAV?

I'll try to summarise some of the major differences:

  • Cost: Using only magnetometers and accelerometers for attitude detection, and efficient algorithms on an ARM7 micro, reduces the cost significantly over other gyro-based systems, and the parts are also readily available worldwide.
  • Failsafe: A CPLD provides a thoroughly tested failsafe UAV/RPV switching mechanism which is separate from the flight control computer, so that control can be recovered at any time. The switch was required to be very immune to noise, since we had some very low quality R/C gear with only a very small range, so we've removed any chance of the state changing erroneously. The plane could even operate without an R/C receiver attached. Although that would almost certainly be disallowed in most countries, it's handy when developing on the ground that you don't require the R/C transmitter to be switched on or the receiver connected.
  • Attitude detection solution: an efficient attitude solution, which means there's more time to calculate the guidance etc., and it should be capable of flying aircraft with high dynamics.
  • COTS-friendly board design: There are no modifications required to plug in your R/C gear. I didn't want to require anyone to pull apart their receiver and solder wires, which would void their warranties etc. Also, CUAV's power is sourced from the same lines that the R/C receiver, servos and speed controller get their power from, so the power consumption of the entire system is easily monitored.
  • Low power consumption: 375mW including GPS and sensors, with an acceptable input voltage between 3.6 and 5.5V
  • Weight: 50g including GPS and sensors; Size: 10cmx4cmx2cm
  • Open source: Though the source isn't released yet, it will be soon. I'm very conscious of the RERO (release early, release often) mantra and am looking forward to getting it out there.
  • Flexibility: The custom board combined with the arm7 provides plenty of room for further additions of sensors etc.
  • Custom FDM: a custom physics model for the aircraft. This provides the ability to do hardware-in-the-loop testing on the ground, with output to FlightGear for visualization and networking. We found writing our own physics model for an aircraft helped with understanding the dynamics involved.
  • Data analysis: We use a MATLAB GUI to help tune and test the system. This should eventually be rewritten in C/C++ so that you won't require MATLAB.
  • Compiler: gcc(++) for ARM7 development, using Linux so that no proprietary software is required.
  • Custom fixed-point math library: including vectors, rotation matrices, quaternions and trigonometry, using 16.16, 32.32 and a 64-bit integer accumulator class. When we started the Apeliotes project there weren't any math libraries available for the ARM7 (or we couldn't find any), and since there is no built-in divide on the ARM7 we needed to build our own.
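For readers unfamiliar with the 16.16 format, the idea is to store a real number as an integer equal to the value times 2^16; multiplication then needs a widened intermediate and a shift back down, and division needs a pre-shift. A minimal sketch (Python's unbounded integers stand in for the fixed-width types used on the ARM7):

```python
# 16.16 fixed point: a real number x is stored as round(x * 2**16).
FRAC = 16

def to_fix(x):
    return int(round(x * (1 << FRAC)))

def to_float(f):
    return f / float(1 << FRAC)

def fmul(a, b):
    # the raw product has 32 fractional bits; shift 16 of them away
    # (on a 32-bit micro the intermediate must be held in 64 bits)
    return (a * b) >> FRAC

def fdiv(a, b):
    # pre-shift the dividend so the quotient keeps its fractional bits
    return (a << FRAC) // b
```

The trade-off is range versus precision: 16.16 covers roughly +/-32768 with about 1/65536 resolution, which is why the library also offers a 32.32 format and a 64-bit accumulator for long dot products.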
[In the next installment, Matt talks about the advantages and disadvantages of magnetometers]

Read more…
3D Robotics

GPS simulation hardware setup, part 2

In an earlier post, I discussed some of the options for connecting your autopilot to your PC for GPS simulation. I've now tried several of the options on our Basic Stamp autopilot, and I have some recommendations on what to do (and not to do).

Rather than try to re-use your dev board's serial connection to the PC for GPS simulation (an option I discussed in the earlier post), I recommend that you get a USB-to-serial board like this one ($20), and run two serial connections simultaneously. This is because one of the serial connections is going to be used by the Basic Stamp as a terminal monitoring the autopilot and incoming data. The second one is going to be used by the GPS simulation software. You really need both, especially if you're doing any debugging.

With the above USB-to-serial card, it's a little unclear what pin does what (the boards don't come with instructions). The picture below shows which wires you need. Basically, with the USB connector at the bottom, you want the first three pins from the top on the right side. They are, in order, ground, V+ and "Rx". (Confusingly, the terms Rx and Tx depend on which side of the serial connection you consider the "sending" side. In this case, this is the same pin that we call "Tx" on our GPS module.)

We're going to plug this board in the exact same rows on the breadboard as we used for the real GPS module, so it's simply a matter of unplugging one and plugging in the other to do a simulation. This is what it looks like on my board (click to get a larger picture if you can't read the text)


When you plug in that USB-to-serial adapter, Windows should recognize it immediately and load the right driver. Windows will assign the board a serial port, but odds are it will be a pretty high one (I usually get 10 and above). To find out which one it got, go to the Windows Control Panel, System, Hardware, Device Manager, Ports. It should show that you've got a new "USB Serial Port (COM X)" where X is whatever port it's been assigned.

Now you need the simulation software. I found a good free one from FlyWithCE here. Fire it up, and see if you can select the port that your USB-to-serial card has been assigned to. If that port isn't listed, you need to remap the card to a lower port. The easiest way to do that is back in Windows Device Manager. Select the port, and click Properties, then Port Settings, then Advanced. Select the lowest port you can. Click OK, unplug the cable and plug it back in again. Now when you check Windows Device Manager, it should be listed at the new port number.

Now go to the Basic Stamp IDE and run the GPS test code from this post. The Debug Terminal should show that the code can't find a GPS signal. Now go to the GPS Simulator. Enter a starting lat/lon and press Start. If you set everything up right, you should start to see NMEA sentences. Click on the Move and/or Circle buttons and enter some speeds, and those GPS readings should start to change.
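If you'd rather generate the NMEA stream yourself than use a canned simulator, the sentences are easy to construct; the only fiddly parts are the degrees/decimal-minutes coordinate format and the XOR checksum. A sketch (the fixed time and date fields are placeholders, not values from any real fix):

```python
def nmea_checksum(body):
    """XOR of every character between the '$' and the '*'."""
    c = 0
    for ch in body:
        c ^= ord(ch)
    return c

def gprmc(lat, lon, knots, course):
    """Build a minimal $GPRMC sentence from decimal-degree coordinates."""
    def dm(deg, pos, neg):
        # decimal degrees -> (whole degrees, decimal minutes, hemisphere)
        hemi = pos if deg >= 0 else neg
        deg = abs(deg)
        whole = int(deg)
        return whole, (deg - whole) * 60.0, hemi
    la_d, la_m, la_h = dm(lat, 'N', 'S')
    lo_d, lo_m, lo_h = dm(lon, 'E', 'W')
    body = ("GPRMC,120000.00,A,%02d%07.4f,%s,%03d%07.4f,%s,%.1f,%.1f,010108,,"
            % (la_d, la_m, la_h, lo_d, lo_m, lo_h, knots, course))
    return "$%s*%02X" % (body, nmea_checksum(body))

# Example: a fix at 37.5 N, 122.25 W, 12 knots, heading east
# print(gprmc(37.5, -122.25, 12.0, 90.0))
```

Write sentences like this to the serial port at a steady 1Hz and the autopilot can't tell them from a real GPS module.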

Now you've got GPS simulation working! If you fire up your autopilot, it should start to navigate according to these [fake] readings. But it's more fun to see what you're doing, so we'll replace the FlyWithCE simulator with Flight Gear in the next post.

Read more…
3D Robotics
[In the previous installments of my interview with Curt Olson, I focused on using FlightGear for GPS simulation--you fly the plane around in the flight simulator and the software outputs GPS data to fool your autopilot into thinking that it's the plane in the flight simulator, so you can see how the autopilot responds to being off course. Now we're going to turn to more sophisticated uses, with sensor data from your UAV (either live or recorded) driving the simulated plane on-screen. This is pretty hard-core stuff and not for everyone, but for those of you who want to really push the envelope, here's Curt, straight from the source. -ca]

FlightGear has the ability to turn off/disable its built in flight dynamics engine (i.e. the built in physics model that computes the simulated motion of the simulated aircraft.) Once the built in physics are turned off, you can feed in a sequence of locations, orientations, control surface positions, and other data from some external source, and FlightGear will dutifully render that data. If you connect up FlightGear to a data stream from a real time flight, or a data stream from a replay of a real flight, and feed in a smooth sequence of changing positions and orientations (hmmm, I work at a university so I should probably call this a sequence of locations and attitudes so as not to offend) :-) then FlightGear will render a smoothly changing view based on the data you are sending.

This turns out to be a very powerful and popular use of FlightGear. You can watch the FlightGear rendered flight from a variety of perspectives, you can examine the motion of the aircraft relative to the control surface deflections, you can leave a trail of virtual bread crumbs in the sky to see your flight path, you could include a sphere at the location and altitude of each of your waypoints to see how the autopilot tracks towards them, etc.

One thing I'm working on now (in partnership with a company in the LA area called LFS Technologies) is a glass cockpit display that shows even more detailed and interesting information. It's not completely finished, but it would connect up to FlightGear and serve as another view into the guts of what the autopilot is thinking and trying to do. I don't know if I ever posted this link publicly, but here is a short/crude movie that shows the "glass" cockpit view in action ... low res and chunky, sorry about that.

Hopefully you can see there is a map view of the route and waypoints, a compass rose, a VSI (vertical speed indicator), an altimeter tape, and a speed tape. On each of these you can place a "bug": a "heading bug" sits on top of the target heading, the "altitude bug" sits on top of the target altitude, etc. And in addition to all this, the actual control surface deflections are shown in the upper right.

What this gives you is a real-time view of where you are, where you should be going, and what the autopilot is trying to do to get there. For example, let's say the target altitude is 1500' MSL and the UAV is currently flying at 1200' MSL. The altitude bug should be planted at the 1500' mark on the altimeter tape so we know where we are trying to go, and the altimeter will show the current altitude. If there is an altitude error, the autopilot should want to climb to fix it. So the VSI will have its bug positioned to show the autopilot's target rate of climb ... maybe 500 fpm for this example (and the rate of climb should diminish the closer you get to the target altitude so you intercept it smoothly.) Now you can look at the VSI and compare the target rate of climb to the actual rate of climb. If there is a difference between the two, the autopilot will try to adjust the aircraft's pitch angle to increase/decrease the actual rate of climb. The primary flight display (PFD) shows actual pitch and roll, but also contains flight-director-style "vbars" that can be used to show the desired pitch and roll angles. Finally, in the upper right, you can see an indication of the elevator position that the autopilot is commanding to try to achieve the target pitch angle. Easy huh? :-)

So in other words, elevator deflection is used to achieve a target pitch angle. The pitch angle is varied to achieve a target rate of climb. The rate of climb is varied to achieve a target altitude. And the glass cockpit display in combination with FlightGear shows *exactly* what the autopilot is thinking and attempting to do for each of these stages of the altitude controller.
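The nested loops Curt describes can be sketched as a cascade of proportional controllers. All gains and limits below are invented for illustration, not taken from any real autopilot:

```python
def clamp(x, lo, hi):
    return max(lo, min(hi, x))

def altitude_cascade(alt_error_ft, climb_fpm, pitch_deg):
    """One pass through the altitude cascade: altitude error -> target
    rate of climb -> target pitch -> elevator command. All gains and
    limits are made-up illustrative values."""
    # altitude error -> target rate of climb, limited to +/- 500 fpm,
    # so the commanded climb tapers off as the target altitude nears
    target_climb = clamp(alt_error_ft * 2.0, -500.0, 500.0)
    # climb-rate error -> target pitch angle, limited to +/- 15 degrees
    target_pitch = clamp((target_climb - climb_fpm) * 0.01, -15.0, 15.0)
    # pitch error -> elevator command, limited to full throw (+/- 1.0)
    elevator = clamp((target_pitch - pitch_deg) * 0.1, -1.0, 1.0)
    return target_climb, target_pitch, elevator
```

The same pattern, with aileron, bank angle, and heading substituted for elevator, pitch, and climb rate, gives the lateral loop, and each intermediate value in the cascade is exactly what the glass cockpit "bugs" display.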

And the story is similar for route following. The autopilot uses aileron deflection to achieve a target bank angle. The bank angle is varied to achieve a desired heading. The heading is varied to fly towards the next waypoint. And again, all of these can be seen changing in real time as the aircraft flies (or in a replay of the flight later on, if you are trying to analyze and tune or debug the autopilot.)

So how does this help anything? If you are actually working on designing an autopilot and doing some test flights, then (as an example) maybe you see that the target pitch angle is never achieved. Then you notice from the visualizer that you run out of elevator authority before you get to the target pitch. Or maybe you could see things like excessive elevator motion as the system continually over compensates trying to seek the desired pitch. Then based on the specific observed behavior, you can tune the gains or limits of the autopilot system to fix the problem. Maybe you need to reduce the sensitivity (gain) of the elevator controller to eliminate the over compensation, maybe you need to increase the limits of the allowed elevator motion to be able to allow for greater pitch up or down angles.

As another example, maybe very early on in your autopilot development, you observe the aircraft always enters a loop when you activate the autopilot. From the FlightGear visualizer system, you might see that the elevator is moving the wrong direction so that more and more up elevator is commanded as the pitch angle gets higher and higher. You might realize that the direction of elevator deflection should be reversed inside the autopilot so less and less up elevator is given as the pitch angle increases.

The alternative to having tools to replay and visualize what is happening is that you fly, something strange happens, you land, scratch your head and try to figure it out, make some stab in the dark change, take off, see if it helps, it probably doesn't, you land, reverse the change you made earlier or make the same change but in the opposite direction, take off, more weirdness, more head scratching, etc. The more tools you have to see and understand the process of exactly what the autopilot is trying to do, the better off you are when it comes to making adjustments to make the autopilot even begin to work, or later to refine the behavior and make it work better or converge faster.

You could even take a step backwards in the pipeline. All of the above presupposes that you know your location and your attitude (pitch, roll, yaw.) But all those values are based on noisy and imperfect sensors and probably pass through a variety of filtering and interpretation algorithms. It may be that you are trying to debug and tune the algorithms that tell you what your roll, pitch, heading, location, rate of climb, etc. are. In this case, a visualization tool is still very useful because you can compare what the computer is showing you to what you are seeing in real life (or what you remember seeing.) You can look at a real-time 3D visualization of the sensor data and say "that ain't right" or "that looks about right". Either a plane looks like it is flying right or it doesn't. If it is flying along on a straight and level path but the nose is pitched up 45 degrees in the visualizer, you immediately see that something is wrong. If it is flying along on a straight and level path, but the tail is pointed forward, you can see something is wrong. A visualization tool like this can help you see and understand what your sensor integration algorithms are doing, and possibly when and where they are breaking down. And all of that is useful information to point you towards the exact location of the problem and hopefully help you find a solution.


Read more…
3D Robotics

Blimp UAV: handling navigation indoors

The blimp UAV project is moving along nicely. We've picked a general size and selected most of the components. We've now decided to go with a PIC processor, rather than the Stamp, Propeller, or ATMega168 that we've been using here to date because you can get PICs in faster speeds and with more memory, something we'll need for expandability down the road. My partners in this project also have a lot of custom PIC development tools, so since they're going to do most of the work I'm happy to take their suggestion.

The big deal this week is that our IR navigation system, Evolution Robotics' NorthStar, arrived. This is the DevKit, which isn't cheap ($1,800), but we only need one, and the idea is that we can build a board that has the receiver built in, plus a cheap transmitter kit. We're hoping to keep the cost to users of both under $100.

The kit that arrived is for NorthStar 1.0, and now that Evolution Robotics has announced Northstar 2.0 at CES, I can disclose that that's actually what we've been planning to use all along. But the dev kit for that one isn't ready yet, so we're building the first prototype on 1.0 and then upgrading the necessary parts and code when 2.0 arrives.

Here's the basic overview of how NorthStar 1.0 works:


And here's what's in the dev kit:

First, it comes in two nifty cases, one for the IR transmitters (shown above) and one for the IR receiver:


Here's what's inside the transmitter case (click for bigger picture and readable text):


Here's what's insider the receiver case:


Here's what the receiver looks like, attached to the PC interface that's used for testing:

The receiver module, which is the slim black box on the right, weighs about 12 grams. It's got a bunch of directional IR receivers inside. The fixed IR transmitters (pic at the top) project beams at unique frequencies on the ceiling, and the receiver tracks them and outputs x and y position and a directional vector. You can think of it as a very high resolution (2-3 cm) GPS replacement for indoors.

Here's what the receiver looks like placed on the BlubberBot circuit board, which is about the size of the one we're going to use (this one is based on the Arduino platform, but ours isn't very different). This is the front of the board, hanging from the bottom of the blimp:


And here's the NorthStar receiver that I've placed on the back to get a sense of scale:


You'll note a challenge that we'll have to overcome. NorthStar 1.0 (and one of options on NorthStar 2.0) is based around the idea that IR transmitters would beam spots on the ceiling and the receiver would be placed to look up and navigate from those. But our blimps are going to be used in gymnasiums and other large rooms where the ceiling is too far away to see. So we'll want to navigate based on direct line of sight from the transmitters.

So where should we mount the receiver? If we mount it facing down, it will lose sight of the beacons when it's close to the ground. Facing to any side means that it won't be able to see any beacons not on that side. We could use two receivers, one on each side and hope that one's always in sight of a beacon, but this introduces complicated hand-off problems as the blimp rotates.

Roomba, the robot vacuum cleaner from iRobot, uses a similar system to get back to its charging base, but rather than spots on the ceiling or trying to keep facing an IR beacon, it uses a cone-shaped mirror that bounces IR from any angle down to a horizontal ring of IR sensors:


What if we mounted one of these on top of the NorthStar receiver and then placed the package horizontally below the blimp? We'd use two direct IR beacons in the room, rather than projecting spots on the ceiling (that just means taking the diffusing lenses off the IR transmitters).

We'll have to play with the system a bit to see if that works, but for now that's the plan. BTW, with a third IR transmitter, it's possible to get altitude, too, but the math on that is kinda gnarly, so we're using an ultrasonic sensor firing down for now.
Read more…
3D Robotics
(part one of the interview is here. A followup to the post below, with some more specific advice, is here.)

Before we drill down with Curt on the specifics on one way to use FlightGear for simulation (synthetic GPS generation), I should start with a little primer on the hardware side of simulation.

The first thing we'll need is a way to connect a PC's serial port (or, if you're like me and only have a laptop with no serial hardware, a virtual serial port via USB) to your autopilot. In the example in the next post, we're just going to be doing GPS simulation, basically tricking your autopilot into thinking that it's flying around when in fact it's right on your desk.

There are two ways to make this connection with your PC, where the simulator will be running: in software or in hardware. The first is the easiest, but requires you to change your autopilot code every time you do a simulation. The second is a simple plug replacement for your GPS module, but requires some one-time wiring.

The hardware approach:

If you're using a desktop PC with a real serial port, just cut the end off of a 9-pin serial cable and do one of the following, depending on your setup:
  • If you're plugging in your GPS with a standard three-pin RC servo plug, just connect the serial cable's pin 3 to the white wire of the servo plug. (See DB9 in this schematic to know which is pin 3)
  • If you're using a standard GPS connector, such as the little white one in the EM406, the serial cable's pin 3 should go to the connector's pin 4 (Tx), as discussed in this post.
  • If you want to go straight to your breadboard, just cut the end off of a 9-pin serial cable and solder pins 2, 3 and 4 to a header strip (break off three pins from a breakaway strip). We're actually going to use only one of those pins--the Tx signal on pin 3--but you may want the others for other simulations later. Stick that header strip on some unused part of your breadboard and connect it with a wire to whatever Basic Stamp pin you're using for GPS in.
If you're connecting to your PC with a USB cable, as I do, you'll need to buy a USB-to-serial converter such as this one ($20). Plug it into your breadboard and connect V+ and ground as shown in this tutorial. Connect pin 2 to whatever CPU pin your GPS Tx pin is normally connected to. If you're working on a development board, this is probably the easiest approach. It's what I'm doing. [UPDATE: here's a How-To for my particular setup]

The software approach:

At this point you're probably saying, "But my autopilot is already connected to my computer! Why do it again?" Well, yes, but it isn't connected to your PC on the same wires as your GPS. If you want to have a simple plug trade between your GPS and your simulator, you'll have to use the hardware approach above. But if you're willing to tweak your autopilot code when you're running simulations, you can skip all of that and edit a single line, changing the pin on which your autopilot looks for GPS data.

In our Basic Stamp examples, you'd go to the constant declaration part at the top of the code and edit "Sio PIN 15" (or whatever you've set as your GPS-in pin) to "Sio PIN 16". That's because "16" is the Basic Stamp code for the dedicated serial-input pin (SIN, physical pin 2), which is normally used by the Stamp Editor during the download process. That will be different on different processors and programming languages, so revise accordingly if you're using something else. Just make sure you're telling the autopilot to look for GPS on the same pin it normally uses to communicate with the PC.

And now to simulation...


Now you can use the same cable and hardware setup you normally use, but the autopilot will look to the PC for GPS NMEA sentences. And those will now be generated not by a real GPS module, but by a simulator.

So now we can turn to the next part of the Curt Olson interview, which will talk about how to get that simulator running.

Read more…
3D Robotics
I'm delighted to start our interview series with Curt Olson, the creator of the cool open source FlightGear flight simulator and a UAV engineer in real life. When I started researching how to use simulators (GPS and IMU) to ground-test UAVs (aka "hardware-in-the-loop" simulation), I found that Curt had done some of the most interesting work in this area. And because here at DIY Drones ignorance isn't a bug, it's a feature (learning in public helps others learn along with us), I thought I'd go straight to the source to find out how it's done.

We had a long email interview, so I'm going to break it into several parts. First an introduction and overview of the sort of UAV simulation you can do with FlightGear:

Q: Can you tell us a bit about yourself? We know you're the creator of FlightGear, but would love to hear a bit more about your day job and UAV projects. Also family, hobbies, odd things that nobody knows about you ;-)

Currently I'm employed in the Mechanical Engineering Department of the University of Minnesota, however that is only 50% time right now and I've been told that as of June, the department will be unable to renew my contract. <insert plug for Curt looking for possible employment or uav consulting projects> :-)

My primary focus at the university has been engineering support for a very advanced driving simulator:

I've been involved in a couple UAV projects with the U of MN aero department. Tons of pictures and information here and here.

I am also involved in a UAV project with ATI, sponsored by NOAA. This has been a lot of fun because it involves designing a new airframe from the ground up, quite a few unique operational challenges (launch from a small boat, recovery in the ocean) and [twist my arm] a couple trips to HI.

Family: I'm married, two daughters, age 4 + 7 (by the time the readers read this anyway.) And 2 dogs. Before kids came along I enjoyed skijoring here in MN with our 2 huskies:

Hobbies: RC model airplanes since high school, small plastic and balsa models before then. I also play a little soccer in an old guys league in the winter and still try to chase around the young guys on my summer team.

Odd things? I was a big Dukes of Hazzard (TV show) fan in high school. My latest FlightGear invention is a tribute to simpler times... I think my sense of humor is optimized to entertain myself and probably not many others. :-)

Q: What is the range of options in UAV simulation/testing? We know about basic GPS NMEA sentence generation, but what else is possible, from course drawing to IMU simulation?

I think if you are careful and clever, you can test just about every element of your system on the ground. That's actually not a bad idea, because it's often hard to troubleshoot something that's a couple hundred yards away, that you don't have physical access to, and that you have no way to look inside and poke around in.

Q: What can FlightGear offer as a simulation engine? Advantages over other options, such as dedicated GPS sims or MSFT Flight Simulator plug-ins? Special suitability to amateur UAVs, etc.

FlightGear can be useful in a number of ways throughout the lifespan of a UAS project. Such as:
  • You can build a physics (aerodynamics) model of your aircraft in FlightGear and use that to do handling-qualities tests and/or tune the flight control gains.
  • You can create a 3D model of your UAV with animated control surfaces for the purposes of visualizing your flight and analyzing the aircraft's performance. There are certain things where you need to stare at tables of numbers and graphs, but for some things, seeing what the aircraft does in 3D and in real time can be helpful. Certain things that just aren't right can jump out at you visually, whereas you might miss them if you just looked at plots or raw numbers.
  • It's possible to leverage FlightGear as a component for hardware-in-the-loop testing. FlightGear can model the aircraft dynamics, send "fake" sensor data to your flight computer, and accept control inputs from your flight computer.
  • It's possible to test your flight control system or higher-level control strategies purely within FlightGear before implementing them on an embedded flight computer.
  • As you begin the flight testing phase, you can feed your flight test data back into the aerodynamics model and make the simulation that much more accurate and useful.
  • Operationally, you can use FlightGear to replay/visualize past flights. You can feed the telemetry data to FlightGear in real time for a live 3D "synthetic" view of the world. You can overlay an advanced HUD, steam gauges, or glass-cockpit-type displays. [example shown below of real video next to synthetic FlightGear view of the same flight]

  • If you have net access, you can connect the slaved copy of FlightGear to the FlightGear multiplayer system, and then anyone in the world with a web browser can track your UAV live using our multiplayer map system (Google Maps based).
  • You could also have multiple UAVs participating in the FG multiplayer (MP) system, and you would be able to see each other in your respective synthetic displays.
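On the hardware-in-the-loop point above: FlightGear can stream its state out over a socket, and a ground-station or flight-computer bridge just has to parse the lines it sends. Here is a hedged sketch of the receiving side in Python, assuming a generic-protocol output configured as comma-separated fields; the field order and the `uavproto` protocol name are hypothetical examples, not a standard layout (your own protocol XML defines them).

```python
import socket

# Field order is whatever your FlightGear generic-protocol XML defines; this
# hypothetical layout assumes FlightGear was started with something like:
#   fgfs --generic=socket,out,10,localhost,5500,udp,uavproto
FIELDS = ("lat", "lon", "alt_ft", "roll_deg", "pitch_deg", "heading_deg")

def parse_packet(line: str) -> dict:
    """Turn one comma-separated telemetry line into a dict of floats."""
    values = [float(v) for v in line.strip().split(",")]
    return dict(zip(FIELDS, values))

def listen(port: int = 5500) -> None:
    """Receive and print live state from a running FlightGear instance."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("", port))
    while True:
        data, _ = sock.recvfrom(1024)
        print(parse_packet(data.decode("ascii")))

if __name__ == "__main__":
    # One-shot demo on a canned line; run listen() against a live FlightGear.
    print(parse_packet("21.3069,-157.8583,1200.0,2.5,-1.0,90.0"))
```

The same idea runs in reverse for control inputs: your flight computer's servo commands go back to FlightGear over a second socket, closing the loop.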
[Next: a drill-down on getting FlightGear to drive an IMU]

Read more…
3D Robotics

Best way to simulate a UAV flight?

While it's cold and windy outside, the best way to test our UAVs may be simulations. As best I can tell, there are at least two ways to do this, but I can't say I really understand either of them well enough to implement:

  1. Generate synthetic GPS data with your PC's serial port, and see what the autopilot does as a result. Here's a GPS NMEA sentence generator. The fake course data can be plotted in Google Earth. This is just for navigation, of course.
  2. Generate synthetic GPS and IMU sensor readings, and display the autopilot's responses in a flight simulator. Curtis Olson, the creator of FlightGear (the open source flight simulator), does that with his UAVs. No idea how. Impressive, though...
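For the "fake course plotted in Google Earth" part of option 1, a simple starting point is to generate waypoints along a synthetic course (here a loiter circle) and wrap them in a minimal KML file. This is only a sketch: the flat-earth metres-to-degrees conversion is a small-distance approximation, and the Honolulu centre point is just an example.

```python
import math

def circle_course(lat0, lon0, radius_m, n=36):
    """Waypoints on a circle of radius_m metres around (lat0, lon0)."""
    pts = []
    for i in range(n + 1):  # n + 1 points so the loop closes on itself
        theta = 2 * math.pi * i / n
        # Flat-earth approximation: ~111,320 m per degree of latitude,
        # scaled by cos(latitude) for longitude.
        dlat = (radius_m * math.cos(theta)) / 111320.0
        dlon = (radius_m * math.sin(theta)) / (111320.0 * math.cos(math.radians(lat0)))
        pts.append((lat0 + dlat, lon0 + dlon))
    return pts

def to_kml(points):
    """Wrap the course in a minimal KML LineString (KML wants lon,lat order)."""
    coords = " ".join(f"{lon:.6f},{lat:.6f},0" for lat, lon in points)
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<kml xmlns="http://www.opengis.net/kml/2.2"><Document><Placemark>\n'
        f"<LineString><coordinates>{coords}</coordinates></LineString>\n"
        "</Placemark></Document></kml>"
    )

if __name__ == "__main__":
    # Hypothetical 300 m loiter circle over Honolulu; open course.kml in Google Earth
    course = circle_course(21.3069, -157.8583, 300.0)
    with open("course.kml", "w") as f:
        f.write(to_kml(course))
```

The same waypoint list can also drive an NMEA sentence generator, so the plotted course and the serial-port feed come from one source.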

Has anyone tried either of these? Something else? Can you tell us how to do it?


Read more…
3D Robotics

Making a UAV fail-safe

A big part of the DIY Drones credo is keeping it safe, and by that I don't just mean adhering to FAA regs and staying away from built-up areas, but also keeping it safe for your expensive UAV! The truth, as we all know, is that computers crash and that aircraft flown by computers crash harder ;-)

The aim of a good UAV design is to have a fall-back system by which a human pilot can take over if (when!) the autopilot goes funky. There are at least three ways to do this, all of which we feature in one or more of the autopilots here:

  1. Completely separate the navigation and stabilization systems and ensure that the stabilization system can be overridden by manual RC control. That's what we do in GeoCrawlers 1, 2 and 3, all of which use the FMA Co-Pilot for stabilization (it controls the ailerons and elevator, leaving just the rudder for the autopilot and navigation). If the autopilot crashes, you can still fly the plane with ailerons and elevator alone, something we end up doing all too often! (The FMA system always allows for manual input)
  2. Mechanically separate the autopilot and RC control systems. In the case of the Lego UAV ("GeoCrawler1"), the Lego Mindstorm system moves the whole rudder servo back and forth, but the RC system can always turn the rudder servo arm, allowing it to override the autopilot if need be.
  3. Install a stand-alone "MUX" or servo multiplexer, which allows the operator to switch control from the RC system to the autopilot and back again with the gear switch on the transmitter, even if the autopilot catastrophically fails. As far as I know, there's only one of these commercially available, and that one, by Reactive Technologies (shown), is not cheap ($99). Still, if you install one and give it an independent power supply, there should be no reason why you can't regain control of your plane no matter how wonky your onboard computer has gone.
What you should probably not do is exactly what we do (temporarily) with the Basic Stamp autopilot (GeoCrawler3), which passes RC signals through the Stamp chip and synthetically recreates them on the other side for the servos. If that program has a bug or the chip otherwise freezes, you've basically lost your rudder and elevator, which could make keeping the plane in the air difficult indeed. You'll still have control of the ailerons and throttle, but good luck getting the plane down in one piece if your program decides to crash with the rudder and elevator at full deflection.
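The decision logic a MUX (or a failsafe routine on a second, independent microcontroller) implements is simple enough to sketch. This is an illustrative model only: the class name, the heartbeat scheme, and the 0.5-second timeout are assumptions for the sketch, not taken from any real product.

```python
# Illustrative control-selection logic for a failsafe MUX: the pilot's gear
# switch always wins, and a silent autopilot is assumed to have crashed.

HEARTBEAT_TIMEOUT_S = 0.5  # autopilot must pulse at least this often

class FailsafeMux:
    def __init__(self):
        self.last_heartbeat = None
        self.manual_override = False  # gear-switch position from the RC receiver

    def autopilot_heartbeat(self, now: float) -> None:
        """Called whenever the autopilot emits a valid control frame."""
        self.last_heartbeat = now

    def select_source(self, now: float) -> str:
        """Decide which control source should drive the servos right now."""
        if self.manual_override:
            return "rc"  # pilot flipped the gear switch: always wins
        if self.last_heartbeat is None or now - self.last_heartbeat > HEARTBEAT_TIMEOUT_S:
            return "rc"  # autopilot silent too long: assume it crashed
        return "autopilot"
```

The key property, which the pass-through-the-Stamp design lacks, is that the "rc" path here is selected by hardware or independent logic, so a frozen autopilot can never hold the servos hostage.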

So the Basic Stamp UAV project might be a good place for a MUX. Anybody know of a cheaper one? (This guy is looking for one, too)

Read more…