The hardware setup, from Varun's post: "The XBee's TX pin is hooked up to analog4 (one of the few pins we found was not in use) via NewSoftSerial and runs at 57600 baud. If it has a valid packet, it replaces the ArduPilot's next waypoint value with the one from the packet. After it hits this new waypoint, it loads the old next waypoint back in from EEPROM and continues on its original path."
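The waypoint-override logic Varun describes can be sketched roughly like this. This is a minimal Python model for illustration only; the real code runs on the ArduPilot in C, and every name below is my own invention (a plain list stands in for the mission stored in EEPROM):

```python
# Sketch of the waypoint-override logic described above. Hypothetical
# names; the real ArduPilot code works on EEPROM-stored waypoints.

class WaypointOverride:
    def __init__(self, mission):
        self.mission = mission          # list of (lat, lon, alt) waypoints
        self.next_index = 0             # index of the next waypoint to fly
        self.saved_waypoint = None      # original waypoint ("EEPROM" stand-in)

    def inject(self, new_waypoint):
        """A valid packet arrived: replace the next waypoint with it."""
        if self.saved_waypoint is None:             # save the original once
            self.saved_waypoint = self.mission[self.next_index]
        self.mission[self.next_index] = new_waypoint

    def waypoint_reached(self):
        """Target hit: restore the original waypoint, or advance the plan."""
        if self.saved_waypoint is not None:
            # The injected point was reached: load the old next waypoint
            # back in and continue on the original path.
            self.mission[self.next_index] = self.saved_waypoint
            self.saved_waypoint = None
        else:
            self.next_index += 1
```
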
Operation of the GCS is simple. With Google Earth already set up on a good view of the area, click 'Capture Point' on the GCS, then click somewhere on the Google Earth globe. The point's coordinates will be captured in the GCS; add an altitude and click 'Send Waypoint'.
Since I don't own an ArduPilot (yet) I have not been able to test it myself. One person has successfully tested the complete setup, but please report any bugs you find. Read more…
We all know it's easy to display information on Google Earth, like the position of a UAV. However, getting information back, like the GPS coordinates of a mouse click, is another story. I really wanted to give the LabView GroundStation for my UGV SAGAR the ability to 'click and go' anywhere on the Google Earth map. This is similar to the various wishes I have heard around here for the ability to upload new waypoints to ArduPilot while she's flying.
Well, I figured out how to do it, and I'm here to share. Here's a video of my sample LabView VI that reads in the GPS coordinates of the mouse over the Google Earth map, captures the position of a left mouse click, and feeds it back into Google Earth through the well-known network KML file.
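For anyone who hasn't used the network KML trick: Google Earth loads a NetworkLink that re-reads a local KML file on an interval, so anything the ground station writes into that file shows up on the globe moments later. Here is a minimal sketch in Python (my LabView VI does the equivalent); the file names are assumptions, not the ones from the video:

```python
# One-time file you open in Google Earth: it polls vehicle.kml every second.
NETWORK_LINK = """<?xml version="1.0" encoding="UTF-8"?>
<kml xmlns="http://www.opengis.net/kml/2.2">
  <NetworkLink>
    <name>GroundStation feed</name>
    <Link>
      <href>vehicle.kml</href>
      <refreshMode>onInterval</refreshMode>
      <refreshInterval>1</refreshInterval>
    </Link>
  </NetworkLink>
</kml>
"""

def write_placemark(path, name, lat, lon, alt=0.0):
    """Rewrite the linked KML with one placemark. Note KML wants
    longitude FIRST in <coordinates> -- an easy mistake to make."""
    kml = (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<kml xmlns="http://www.opengis.net/kml/2.2">\n'
        '  <Placemark>\n'
        f'    <name>{name}</name>\n'
        f'    <Point><coordinates>{lon},{lat},{alt}</coordinates></Point>\n'
        '  </Placemark>\n'
        '</kml>\n'
    )
    with open(path, "w") as f:
        f.write(kml)
```

Each time the ground station calls `write_placemark`, Google Earth picks up the new position on its next refresh.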
This opens the door to new GroundStation features, like "Click and Go" during a flight. It could also mean retiring the ConfigTool, as all mission planning and status could be done through a single LabView VI with Google Earth. Read more…
I'm still working on the next installment on my SAGAR drone, however my progress has slowed, as I suddenly find myself in Boston for 7 weeks. Life can be very exciting when you work for the Department of Defense.
In the meantime, a few people over at SocietyofRobots.com have asked me how SAGAR's microcontroller (an Axon) communicates with LabView. Instead of a bland answer, I spent a little time writing about the different problems hobbyists can run into when they try to get two computer systems communicating with each other.
I have been wanting to fully document my SAGAR (Semi Autonomous GPS Assisted Rover) for some time now, and the 'ardupilot goes into the water' series has been so entertaining that it gave me the motivation to finally start. The first post is simply a demonstration of the LabView ground station, which has been redesigned one last time before my girlfriend turns in the project. First I'll show the new interface, then talk about how it communicates with SAGAR, and finally give some background on why I built SAGAR.
Here is a video of the new interface, with an inset of SAGAR as it runs the mission.
During the run we recorded, there was a glitch halfway through. It appears LabView started to slow down and the gap between live events and what was being displayed grew, until the LabView buffer overflowed and sentences were lost. I have yet to look into that problem, as it is the first time we have observed it.
When my girlfriend came to me for ideas for her LabView class, I suggested she write an interface for my robot. I knew I would have to develop a communication protocol that I could hand to her from the start. I took a look at the structure of the ArduPilot communications, and it seemed odd to me. Is the structure a known protocol? I'm sure one of the developers will tell me.
I decided to stick to something I knew: the NMEA protocol. For those who are not aware, GPS systems communicate via the NMEA protocol, as do many other robotics systems. An NMEA sentence starts with a header that identifies the sentence, followed by comma-delimited fields that contain the data to be passed. The sentence is usually followed by a checksum, to validate the integrity of the data. I came up with my own header and added the fields of sensor data I wanted to have displayed on the interface. Here is an example of my structure.
$SAGAR,heading,pitch,roll,wheelspeed(commanded l+r, actual l+r),distance_trav,GPS_Fix,GPS_Lat,GPS_Lon,GPS_speed,GPS_COG,Battery_V,Battery_I,processor_load*CS
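Building and validating a sentence like this is straightforward in any language. Below is a hedged sketch in Python (the function names are mine, not from SAGAR's actual code); the checksum is the standard NMEA one, the XOR of every character between '$' and '*', sent as two hex digits:

```python
# Sketch of NMEA-style sentence building/parsing with the standard
# XOR checksum. Names are illustrative, not SAGAR's real code.
from functools import reduce

def nmea_checksum(body):
    """XOR of all characters between '$' and '*'."""
    return reduce(lambda cs, ch: cs ^ ord(ch), body, 0)

def build_sentence(header, fields):
    """Join header and fields with commas, append '*' and hex checksum."""
    body = ",".join([header] + [str(f) for f in fields])
    return "${}*{:02X}".format(body, nmea_checksum(body))

def parse_sentence(sentence):
    """Return the field list if the checksum is valid, else None."""
    if not sentence.startswith("$") or "*" not in sentence:
        return None
    body, _, cs = sentence[1:].partition("*")
    if nmea_checksum(body) != int(cs, 16):
        return None                     # corrupted in transit -- drop it
    return body.split(",")
```

For example, `build_sentence("SAGAR", [90.0, 1.2, -0.5])` produces a `$SAGAR,...*CS` string, and tampering with any field makes `parse_sentence` reject it.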
This is one of two sentences SAGAR will send to the interface. The other sentence contains mission statistics like current waypoint number, distance to waypoint, etc. There are also four sentences that the interface sends to SAGAR, each representing a different mode for SAGAR to enter, and commands to follow. There is a fail-safe in place: if SAGAR doesn't receive a command sentence within 500 ms, it halts and enters stand-by.
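The fail-safe is just a watchdog on the last command time. Here is a minimal sketch of that logic in Python with a 500 ms timeout as described above; class and method names are my own invention, and SAGAR's real implementation on the Axon will differ:

```python
# Sketch of a 500 ms command fail-safe. The timeout value comes from
# the text; everything else is a hypothetical illustration.
import time

class FailSafe:
    TIMEOUT = 0.5   # seconds without a command sentence before halting

    def __init__(self, clock=time.monotonic):
        self.clock = clock              # injectable clock, eases testing
        self.last_command = clock()
        self.standby = False

    def command_received(self):
        """Call whenever a valid command sentence arrives."""
        self.last_command = self.clock()
        self.standby = False

    def check(self):
        """Call every control-loop tick; True means halt and stand by."""
        if self.clock() - self.last_command > self.TIMEOUT:
            self.standby = True         # halt motors, enter stand-by
        return self.standby
```

Taking the clock as a parameter is just a convenience so the timeout behavior can be tested without real delays.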
To finish off this oddly ordered intro, I'll give my motivation for building SAGAR in the first place, starting with a quick life story.
To most of the locals around here, I am a young gun. This time last year, I was nearing the end of my college career. All my life I knew I wanted to be an electrical engineer, but I never really knew what branch I wanted to specialize in. The family business was generators, so I took as many power classes as I could sign up for. It was OK, but I wasn't a fan of all the extra math involved as opposed to other fields of EE. My last year of school, I had to build a senior design robot as defined by the IEEE 2009 SoutheastCon Hardware Competition. I had a blast. I instantly realized the field I wanted to be in was robotics. The robot my group built did so well that, had it gone to the competition (long story why it didn't), it would have crushed the field: in every test run it easily doubled the score of the robot that did win. Here it is:
I am very proud of how well it works. After graduation, a division of the U.S. Navy that specializes in unmanned robotics got a hold of this video and asked for my resume. Now I work with million-dollar underwater, surface, and ground drones, i.e., my dream job. The only problem is I don't have much experience building robots that aren't made of Legos or whose only purpose is to pick up recyclables. So the month I started working, I began building SAGAR to gain the experience I wanted with the internals of unmanned drones. SAGAR started as a bag of parts nearly a year ago and grew from there. (Almost) everything is from scratch, down to the DIY battery packs.
Not too shabby? Forgive any spelling/grammar errors; I am definitely bad at both.
Coming up next: the importance of a good chassis, and building my own closed-loop motor controller.
Hey guys. I wanted to post this as a 'Thank you' to the community for the ideas and open source code on this site.
My girlfriend came to me needing an idea for a project in her LabView class. At the time, I was working on my SAGAR autonomous robot. I showed her the ArduPilot Ground Station, and off she went writing her own for my robot.
I developed a communications protocol using NMEA-style sentences and gave her the specifications. We worked together on a few things, like the 3D rendering of orientation and how to plot two points on Google Earth. We would not have gotten far without looking through the Ground Station source.