Flickr photos are here.
Hi Guys,
We'll be conducting a UAS Defense & Tactics Conference on July 29-30, 2010 in Washington, D.C.
Contact me at 202-536-4898 or send me an e-mail at Meagan.White@new-fields.com if you're interested. :-)

I just thought you guys might like to know that we are still making REAL progress over in our workspace. Sarel has been working hard on his prototype and, as you can see, mine is coming along as well. We should be combining all of our trial and error soon and have final schematic and board designs ready. So who else might be willing to help out and lend us a hand programming all that graphical goodness?
http://www.aplanding.com/forums/showthread.php?t=4179
Hey guys,
I have been wanting to fully document my SAGAR (Semi-Autonomous GPS-Assisted Rover) for some time now, and the 'ArduPilot goes into the water' series has been so entertaining that it finally gave me the motivation to start. The first post was simply a demonstration of the LabView ground station, which has been redesigned one last time before my girlfriend turns the project in. First I'll show the new interface and talk about how it communicates with SAGAR, then give some background on why I built SAGAR.
Here is a video of the new interface, with an inset of SAGAR as it runs the mission.
During the run we recorded, there was a glitch halfway through. It appears LabView started to slow down, and the gap between live events and what was being displayed grew until the LabView buffer overflowed and sentences were lost. I have yet to look into that problem, as this is the first time we have observed it.
When my girlfriend came to me for ideas for her LabView class, I suggested she write an interface for my robot. I knew I would have to develop a communication protocol that I could hand to her from the start. I took a look at the structure of the ArduPilot communications, and it seemed odd to me. Is the structure a known protocol? I'm sure one of the developers will tell me.
I decided to stick with something I knew: the NMEA protocol. For those who aren't aware, GPS receivers communicate via the NMEA protocol, as do many other robotics systems. An NMEA sentence starts with a header that identifies the sentence, followed by comma-delimited fields that contain the data to be passed. The sentence is usually followed by a checksum to validate the integrity of the data. I came up with my own header and added the fields of sensor data I wanted displayed on the interface. Here is an example of my structure:
$SAGAR,heading,pitch,roll,wheelspeed(commanded L+R, actual L+R),
distance_trav,GPS_Fix,GPS_Lat,GPS_Lon,GPS_speed,GPS_COG,Battery_V,Battery_I,processor_load*CS
This is one of two sentences SAGAR sends to the interface. The other sentence contains mission statistics like current waypoint number, distance to waypoint, etc. There are also four sentences that the interface sends to SAGAR, each representing a different mode for SAGAR to enter and commands to follow. There is a fail-safe in place: if SAGAR doesn't receive a command sentence within 500 ms, it halts and enters standby.
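Since the sentence framing follows standard NMEA conventions, the checksum would be the XOR of every character between the '$' and the '*', written as two hex digits. Here is a minimal Python sketch of framing, validation, and the 500 ms fail-safe described above; the function names and structure are my own illustration, not SAGAR's actual code:

```python
import time

def nmea_checksum(body: str) -> str:
    """Standard NMEA checksum: XOR of every character between
    '$' and '*', rendered as two uppercase hex digits."""
    cs = 0
    for ch in body:
        cs ^= ord(ch)
    return f"{cs:02X}"

def frame(body: str) -> str:
    """Wrap a comma-delimited body into a full '$...*CS' sentence."""
    return f"${body}*{nmea_checksum(body)}"

def validate(sentence: str) -> bool:
    """Check framing and checksum of a received sentence."""
    if not sentence.startswith("$") or "*" not in sentence:
        return False
    body, _, cs = sentence[1:].partition("*")
    return nmea_checksum(body) == cs.strip().upper()

# Fail-safe sketch: halt if no valid command sentence in 500 ms.
FAILSAFE_S = 0.5
last_command = time.monotonic()

def on_sentence(sentence: str) -> None:
    """Call for every sentence received from the ground station."""
    global last_command
    if validate(sentence):
        last_command = time.monotonic()

def failsafe_tripped(now: float = None) -> bool:
    """True when the rover should halt and enter standby."""
    now = time.monotonic() if now is None else now
    return (now - last_command) > FAILSAFE_S
```

For example, `frame("SAGAR,90.0,0.1,-0.2,...")` produces a complete `$SAGAR,...*CS` sentence that `validate` accepts, while any corrupted field changes the XOR and fails the check.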
To finish off this oddly ordered intro, I'll give my motivation for building SAGAR to begin with, starting with a quick life story.
To most of the locals around here, I am a young gun. This time last year, I was nearing the end of my college career. All my life I knew I wanted to be an electrical engineer, but I never really knew what branch I wanted to specialize in. The family business was generators, so I took as many power classes as I could sign up for. It was OK, but I wasn't a fan of all the extra math involved compared to other fields of EE. In my last year of school, I had to build a senior design robot as defined by the IEEE 2009 SoutheastCon Hardware Competition. I had a blast, and I instantly realized the field I wanted to be in was robotics. The robot my group built did so well that, had it gone to the competition (long story of why it didn't), it would have crushed the field: in every test run it easily doubled the score of the robot that did win. Here it is:
I am very proud of how well it works. After graduation, a division of the U.S. Navy that specializes in unmanned robotics got hold of this video and asked for my resume. Now I work with million-dollar underwater, surface and ground drones; i.e., my dream job. The only problem is I don't have much experience building robots that aren't made of Legos or whose only purpose is to pick up recyclables. So the month I started working, I started building SAGAR to gain the experience I wanted with the internals of unmanned drones. SAGAR started as a bag of parts nearly a year ago and grew from there. (Almost) everything is from scratch, down to DIY battery packs.
Not too shabby? Forgive any spelling/grammar mistakes; I am definitely bad at both.
Coming up next: the importance of a good chassis, and building my own closed-loop motor controller.
Visual Localization System

Specifications: the Visual Localization System can be used to solve an outdoor localization problem. The camera is mounted under the belly of an unmanned aerial vehicle (UAV) and is used to search for visual landmarks on the ground (terrain-matching-based navigation).
Please check out this story on AOL News about the culture change going on at the Air Force Academy.
http://www.aolnews.com/nation/article/unmanned-flight-brings-air-force-cadets-down-to-earth/19428947
The creator explains: "No jailbreaking. No WiFi. Stock receivers. I fly model airplanes and helicopters with my iPhone. I use an off-the-shelf 2.4GHz module and a custom iPhone app. The app is now in beta testing.
I use the phone's headphone jack to communicate with the Spektrum module. I make no modifications to the module or the receivers. This application does not use WiFi, Internet, external servers or microcontrollers."
(via MakeZine)
Congratulations to Antonio Lyska, the UAV winner of the Sparkfun Autonomous Vehicle Competition! In his final run, he did a 26 second lap and then nailed an autonomous landing in the box for a 30 second deduction, for a final score of negative 4 seconds. Even more impressive, this was totally DIY--he's been working on his Rabbit-powered IMU-based board for five years, and coded everything from scratch, including a beautiful custom ground station. He's so confident in his setup that he doesn't even watch his plane in flight--he just watches the action on his ground station. That's him above, with his flying wing crossing the start line for the victorious run. So calm!
In second place, with a time of 3 seconds (18 second run, minus 15 seconds for attempting an autonomous landing) was Doug Weibel, with ArduPilot IMU (version 2.6 code) on his SkyFun. His plane did a gorgeous run, with sharp turns and rock-solid stabilization and the fastest time of any UAV, but it got a bit lost coming in for the autonomous landing and ended up crash-landing in a nearby parking lot. But autonomous landing was attempted, and the plane gave up some foam to get those 15 points!
In third place was the University of Arizona Paparazzi team, which impressed everyone with their autonomous landings. The last one just missed the box, but otherwise it was very solid all day. Their best score was 5 seconds, after deductions.
Jordi and DIY Drones came in fourth place with 37 seconds (no deductions attempted) with our ArduPilot (2.5.4 thermopile code) and EasyStar.
The UAVDevBoard teams impressed everyone with their awesome vertical starts, acrobatic maneuvers and stunning stability. Unfortunately all that sky candy came at the cost of clipping corners (in the case of Ben Levitt's Acromaster), so he didn't get a course completion score.
Adam Barrow managed to get to a local hobby shop and pick up replacement parts to rebuild his T28, which was flying perfectly with the UAVDevBoard. But in the final practice flight he lost control in manual mode and piled it in again, so he wasn't able to compete in the final rounds. Based on the glimpse of autonomous performance we did see, though, that plane and autopilot are going to be a serious contender next year.
For next year, we're thinking acrobatic planes and quadcopters are going to win the day, given the scoring system. Can't wait!
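For reference, the scoring in the results above is just lap time minus the landing deduction: 30 seconds for an autonomous landing in the box, 15 seconds for attempting one. A quick sketch, based on my reading of the reported results rather than the official rulebook (the label names are mine):

```python
def avc_time(lap_seconds: float, landing: str = "none") -> float:
    """Final time = lap time minus landing deduction.
    Deduction values as reported in the results: 30 s for an
    autonomous landing in the box, 15 s for an attempt.
    'landing' is one of 'in_box', 'attempted', or 'none'."""
    deductions = {"in_box": 30.0, "attempted": 15.0, "none": 0.0}
    return lap_seconds - deductions[landing]

# The winning run: a 26 s lap plus a landing in the box
print(avc_time(26, "in_box"))  # -4.0
```

This is how a 26 second lap beats a 21 second one: the deduction for sticking the landing outweighs five seconds of flying.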
"Robota": Rabbit-powered custom autopilot. He did everything himself!
"UofA Robotics": Paparazzi-powered Twinstars
"Pine Tree": Doug Weibel and his ArduPilot IMU-powered Skyfun
"DIY Drones": The tried and true ArduPilot-powered EasyStar
"Donuts, Coffee and Muffins": Ben Levitt and his UAV DevBoard-powered AcroMaster (backup one!)
Adam Barrow and his UAVDevBoard-powered T28
Ryan Beall and his custom autopilot, Ateryx, in an Alpha Sport 450 (not competing)
The rain has picked up, but we all got a good round two in. Here are the current standings, in order:
- The custom Rabbit-powered wing ("Robota") had a great run, with an auto takeoff (hand launched, so it didn't count, but still impressive) and an autolanding that he just nailed, stopping right on the box line. He got a 30 second deduction for that, so his 34 second run turned into a time of 4 seconds, for first place.
- The U Arizona Paparazzi team ("UoA Robotics") had a 35 second run, but nailed a perfect autonomous landing, for a time of 5 seconds and second place.
- Doug Weibel tried an autonomous takeoff with the ArduPilot w/IMU Skyfun ("Death by Pine Tree"), but it didn't quite pull up in time. He was allowed a second start (manual this time) and blasted through a very tight 20 second run for the fastest overall time, but no bonus points, and thus third place.
- The rain let up enough to let Jordi ("DIYDrones") do a slightly faster and very smooth second run of 37 seconds with the stock ArduPilot and EasyStar, giving us 4th place
- Both of the UAVDevBoard planes were recovered (one from a fence, another from a roof) but both are unflyable. Fortunately, Ben Levitt ("Donuts, Coffee and Muffins", a play on the DCM algorithm that we all use) had a backup AcroMaster. He did an autonomous VERTICAL takeoff, then a couple autonomous barrel rolls, then some calibration loops upside down, then did the course upside down. Awesome tight control, and pretty fast at 30 seconds, too, but unfortunately he clipped a corner and didn't score that round.
The rain is getting pretty heavy, so we thermopile guys may not do a third round, which would leave it to the IMU teams to win. Nail-biting suspense!
- Jordi and me (ArduPilot with thermopiles on the old EasyStar we used last year)
- Doug Weibel (ArduPilot with the IMU on a SkyFun with elevons)
- A Paparazzi team from the University of Arizona flying Twinstars
- A flying wing with a custom Rabbit-CPU based autopilot.
- Ben Levitt with an Acromaster pattern plane
- Adam Barrow with a Parkzone T28
- We did a conservative first pattern and completed it in a time of 41 seconds, putting us in second place
- The Paparazzi team completed in a time of 21 seconds and attempted an autonomous landing, putting them in first place
- Doug Weibel put in the fastest time at 14 seconds, but clipped a corner so it didn't count
- Ben's plane did a vertical autonomous take-off, which was awesome, but went down while doing an autonomous wind-calculation loop. They haven't found it yet, but think it might be on a nearby roof.
- Adam's T28 did a rolling autonomous takeoff, which was super exciting (it just missed a tree), but then it too went down in the same place while doing a wind estimation pass. Fortunately they found it, but the plane was destroyed.
- The custom flying wing also did a conservative first run, and finished with a time of 59 seconds.
The new EPO foam version of the Skywalker UAV platform is now for sale.
Price 144 USD + 40 USD in shipping.
New price!
- 8 AM - Team Admission
- 9 AM - General Public Admission
- 9:30 AM - 1st Heat Start Time, Raffle at End of 1st Heat
- 12 PM - 2nd Heat Start Time, Raffle at End of 2nd Heat
- 2:30 PM - 3rd Heat Start Time, Raffle at End of 3rd Heat, and Free-For-All Race (mass start)
- 4:30 PM-ish - Awards
- 6 PM-ish - After party @ Dark Horse