I'm adding simple object tracking to the ArduCopter code. I have a system running on a BeagleBoard with a downward-pointing camera: it captures an image, finds the object in the image, calculates its centroid, and sends that data as a MAVLink message to the ArduPilot Mega. This all works fine.
I then take that centroid information and use an algorithm similar to the existing optical-flow one to find the actual offset in centimeters in the x and y directions. From that I calculate the distance and bearing to the object, and then use functions very similar to those used for waypoint navigation:
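The centroid-to-offset step can be sketched like this (a pinhole-camera approximation assuming level flight; the function name, parameter names, and image/FOV values are illustrative, not from my actual code):

```python
import math

def centroid_to_offset_cm(cx_px, cy_px, img_w, img_h, alt_cm, hfov_deg, vfov_deg):
    """Convert an object centroid in pixels to an x/y ground offset in cm.

    Assumes a downward-pointing camera and level flight (no tilt
    compensation). alt_cm is the altitude above ground in centimeters.
    """
    # Angular offset of the centroid from the image centre
    ang_x = (cx_px - img_w / 2.0) / img_w * math.radians(hfov_deg)
    ang_y = (cy_px - img_h / 2.0) / img_h * math.radians(vfov_deg)
    # Project the view ray onto the ground plane at the current altitude
    return alt_cm * math.tan(ang_x), alt_cm * math.tan(ang_y)
```

For example, with a 640x480 image and a 60-degree horizontal FOV at 5 m altitude, a centroid at the right edge of the frame maps to roughly 2.9 m of lateral offset.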
- calc_XY_velocity() I've copied and modified to calculate the relative velocity to the object (this is the first place I'm not sure about)
- calc_location_error() is replaced by the object x and y distances, which give me values just like long_error and lat_error in centimeters
- calc_desired_speed() is used the same way
- calc_nav_rate() I've copied and modified to use the relative speed to the object. It uses the same x/y_rate_error, pi_loiter_lon/lat, pid_nav_lon/lat, and nav_lon/lat calculations as calc_nav_rate(). The PID values are the same as for waypoint (same objects are used).
- calc_loiter_pitch_roll() is unmodified
- I have extended the ALT_HOLD mode with object tracking (if an object is detected) and use these parameters:
yaw_mode = YAW_HOLD;
roll_pitch_mode = ROLL_PITCH_AUTO;
throttle_mode = THROTTLE_AUTO;
So roll/pitch get updated as with waypoint navigation.
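The relative-velocity and distance/bearing steps above can be sketched as follows (my own illustrative reconstruction of the approach, not code lifted from ArduCopter; names are made up):

```python
import math

def relative_velocity_cms(x_cm, y_cm, prev_x_cm, prev_y_cm, dt_s):
    """Finite-difference estimate of the velocity relative to the object,
    in cm/s, from two successive offset measurements. In practice this
    needs low-pass filtering, since camera noise is amplified by 1/dt."""
    return (x_cm - prev_x_cm) / dt_s, (y_cm - prev_y_cm) / dt_s

def distance_and_bearing(x_err_cm, y_err_cm):
    """Distance (cm) and bearing (centidegrees, 0 = straight ahead)
    from the body-frame x/y offsets to the object."""
    dist_cm = math.hypot(x_err_cm, y_err_cm)
    bearing_cd = math.degrees(math.atan2(y_err_cm, x_err_cm)) * 100.0
    if bearing_cd < 0.0:
        bearing_cd += 36000.0
    return dist_cm, bearing_cd
```

An object 1 m ahead and 1 m to the side comes out as about 141 cm at a bearing of 4500 centidegrees (45 degrees).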
I have been testing it but I get strange results. When the copter detects an object (verified through logs), I see that it flies away from it. For example, if the object is in front of the copter, the copter pitches left and flies away. Other times, it pitches forward and keeps going fast, sometimes out of control.
I've triple-checked my code and I don't think I've swapped an x/y or lat/lon, and I believe all the units are correct. The implementation is similar to what is already there. I've logged values extensively and analyzed them, and I can see that large values are indeed being produced, but I'm not sure why yet. I'm kind of stuck and wanted to know your thoughts on what to try.
1) What are the units of nav_lon/nav_lat? I have tried to figure it out, but it is not clear.
2) Should I use the GPS speed instead of the relative speed? I would like to not have to depend on a GPS fix for object tracking...
3) Does my approach seem sound?
My log is attached with the column headings inserted at the top. The scenario is described in the log. Object tracking starts at row 812/817.
Thank you in advance!
Nav_lon and nav_lat are degrees * 100 before the translation into copter space. So it's pitch and roll as if the copter were pointed north.
I'll re-read your post in a few days when I get some more time. Very interesting, thanks.
Thanks! That makes sense based on how it was being constrained (+/-3000) and what happens to it in calc_loiter_pitch_roll() (since auto_roll/auto_pitch are deg*100 and sin/cos are of course unitless).
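For anyone else following along, the earth-frame to body-frame rotation that calc_loiter_pitch_roll() performs can be sketched like this (my own reconstruction of the math with illustrative names, not the firmware source; at yaw = 0 it reduces to roll = nav_lon, pitch = -nav_lat, i.e. the "pointed north" case):

```python
import math

def earth_to_body_cd(nav_lon_cd, nav_lat_cd, yaw_deg):
    """Rotate earth-frame lean demands (centidegrees; lon = east,
    lat = north) into body-frame roll/pitch given the current yaw.
    Pitch is negated because pitching forward is negative."""
    s = math.sin(math.radians(yaw_deg))
    c = math.cos(math.radians(yaw_deg))
    auto_roll = nav_lon_cd * c - nav_lat_cd * s
    auto_pitch = -(nav_lon_cd * s + nav_lat_cd * c)
    return auto_roll, auto_pitch
```

Sanity check: an eastward demand with the copter facing east (yaw 90) becomes pure forward pitch, with no roll.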
I am also trying to figure out whether the P values for pid_loiter_lon/lat and pid_nav_lon/lat are appropriate for my application, but it seems they should be, since the units should be the same. I have not played with them...
I've attached my code which is on top of the released 2.4.1.
Are the values long_error and lat_error calculated by calc_location_error() in units of centimeters or something else? This comment in the function is confusing:
Because we are using lat and lon to do our distance errors here's a quick chart:
100 = 1m
1000 = 11m = 36 feet
1800 = 19.80m = 60 feet
3000 = 33m
10000 = 111m
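The chart makes sense once you know the scale: APM stores lat/lon as degrees * 1e7, so one unit is 1e-7 degrees, which at the Earth's surface is about 1.11 cm of latitude. A quick check (my own arithmetic, assuming a spherical Earth):

```python
import math

EARTH_RADIUS_M = 6378137.0  # WGS-84 equatorial radius

def latlon_units_to_cm(units):
    """Ground arc length, in cm, for a given number of 1e-7-degree
    lat/lon units (great-circle approximation)."""
    return math.radians(units * 1e-7) * EARTH_RADIUS_M * 100.0
```

latlon_units_to_cm(100) is about 111 cm and latlon_units_to_cm(1000) about 11.1 m, which matches the chart's "100 = 1m" and "1000 = 11m" lines.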
Thanks. This is supposedly compensated for by the optical flow algorithm that I used which is described here:
At this point I have things kind of working, but when the copter sees the object it only occasionally follows it. Most of the time it just wobbles quite a bit. Does this indicate PI values that are too high? I am using P=3.0 and I=0.1, taken from waypoint navigation. Any suggestions as to which to adjust first? Next time I'm out, I will try adjusting them.
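For context, the loop in question is a PI controller of roughly this shape (a minimal sketch using the gains mentioned above; the function name, dt, and limits are illustrative). In a wobbling loop the usual first move is to reduce P, e.g. halve it, before touching I:

```python
def pi_step(error, integrator, kp=3.0, ki=0.1, dt=0.02, i_limit=500.0):
    """One PI update: output = kp*error + integrator, where the
    integrator accumulates ki*error*dt and is clamped to +/- i_limit.
    A high kp feeds measurement noise straight into the output, which
    shows up as wobble; the integrator only drifts slowly."""
    integrator += ki * error * dt
    integrator = max(-i_limit, min(i_limit, integrator))
    return kp * error + integrator, integrator
```

With P=3.0, a 1 m (100 cm) error already demands an output of ~300, so noisy centroid measurements get amplified threefold every cycle.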
Have you considered following the helicopter itself with the vision system?
You could put a light on the helicopter to make it easier to detect.
Would that make a good system for tracking in two dimensions?
I have a BeagleBoard and would like to try it...
We are a team of students from the University of Southampton. We are building an autonomous surface vessel capable of crossing the Atlantic. We are at a good stage with the hullform and superstructure design, and we are now moving on to the electronics.
We now think that the best solution for satellite communications and AIS-based object avoidance is a BeagleBone Black connected to an ArduPilot. The APM would do the lower-level work of manoeuvring the vessel, while the BBB would be connected to the other systems and would do the higher-level calculations (e.g. object avoidance). We believe this setup is ideal because we can use the ArduPilot library as-is and program the remaining parts in ROS and Python on the BBB, which would communicate with the APM over MAVLink and the serial ports to send new waypoints when something needs to be avoided.
Can you tell me more about the BBB and APM interface? Also, how hard do you think it is to change the ArduRover code to receive info from the BBB? (Considering that I only know how to program in Python and a little bit of ROS.)