Overlaying GPS Coordinates for Camera Crosshairs


Hey Guys!

I am working on a project to overlay GPS coordinates for the location of a PTZ camera's crosshairs, otherwise known as geo-pointing.  This will essentially give us the LAT/LON of whatever we are looking at.

I have found several methods to do this, but I'm looking for the simplest way that is still as accurate as possible.  I am fairly new to modifying the APM and to MAVLink, so instructions and suggestions are very welcome!  Here is what I have in mind:

In order to get this:

LAT/LON: Self explanatory

BRG: Bearing

SRG: Slant Range (Range from Aircraft to location)

GRG: Ground Range (Range from coordinate to coordinate)

I have a formula to calculate, using trigonometry, the position the crosshairs are pointing at relative to the aircraft.  Here is an example:

[Example diagrams of the geo-pointing trigonometry were included here.]

The system will incorporate 1x APM 2.5 w/ GPS, 3DR telemetry radios, 1x IMU w/ compass for the camera gimbal, 2x MinimOSD boards, 2 cameras, and 1x video TX with a two-way camera switcher.

Method:

The application extracts the altitude from the APM and the camera angle from the camera IMU, then uses the tangent and cosine of that angle to find the missing sides of the triangle.  This gives us the ground distance and the slant range, respectively.

TAN(CAng)*Alt = GndDist     Alt/COS(CAng) = Slant

At this point the slant and ground range can be transmitted to the video feed.
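To sanity-check the math, here is a minimal C++ sketch of this first step. It assumes the camera angle is measured from straight down (nadir) and that the ground under the crosshair is flat and level; the altitude and angle values are just examples.

// Ground distance and slant range from altitude and camera angle.
// Assumes CAng is measured in degrees from nadir (straight down) and flat terrain.
#include <cmath>
#include <cstdio>

static const double DEG2RAD = 3.14159265358979323846 / 180.0;

int main() {
    double alt  = 120.0;   // aircraft altitude above ground, metres (example value)
    double cang = 35.0;    // camera angle from nadir, degrees (example value)

    double gnd_dist = std::tan(cang * DEG2RAD) * alt;   // GndDist = TAN(CAng)*Alt
    double slant    = alt / std::cos(cang * DEG2RAD);   // Slant   = Alt/COS(CAng)

    std::printf("ground distance: %.1f m, slant range: %.1f m\n", gnd_dist, slant);
    return 0;
}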

Next, we use the aircraft's GPS coordinates from the APM and the ground distance to find what coordinate we are looking at on the ground:

aL-aircraft Latitude

aO-aircraft Longitude

cL-crosshair Latitude

cO-crosshair Longitude

A-aircraft Altitude

D-ground Distance

B-Bearing from Gimbal Compass

 

cL = ASIN(SIN(aL*PI()/180)*COS(D/6378.137) + COS(aL*PI()/180)*SIN(D/6378.137)*COS(B*PI()/180)) * 180/PI()

cO = (aO*PI()/180 + ATAN2(COS(D/6378.137) - SIN(aL*PI()/180)*SIN(cL*PI()/180), SIN(B*PI()/180)*SIN(D/6378.137)*COS(aL*PI()/180))) * 180/PI()

These formulas are in Excel format since that is what I used to test them (D is in kilometres to match the 6378.137 km Earth radius).

This should give us the coordinate under the crosshair.
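For anyone who prefers code to spreadsheet formulas, here is the same destination-point calculation as a small C++ sketch. It assumes a spherical Earth with the same 6378.137 km radius used above; the input values in main() are examples only, not real flight data.

// Crosshair coordinate from aircraft position, gimbal bearing and ground distance.
// Spherical-Earth destination-point formula; D is in kilometres.
#include <cmath>
#include <cstdio>

static const double PI = 3.14159265358979323846;
static const double EARTH_RADIUS_KM = 6378.137;

// aL/aO: aircraft lat/lon (deg), B: bearing (deg), D: ground distance (km)
void crosshair_coord(double aL, double aO, double B, double D, double &cL, double &cO) {
    double lat1 = aL * PI / 180.0;
    double lon1 = aO * PI / 180.0;
    double brg  = B  * PI / 180.0;
    double dR   = D / EARTH_RADIUS_KM;   // angular distance

    double lat2 = std::asin(std::sin(lat1) * std::cos(dR) +
                            std::cos(lat1) * std::sin(dR) * std::cos(brg));
    double lon2 = lon1 + std::atan2(std::sin(brg) * std::sin(dR) * std::cos(lat1),
                                    std::cos(dR) - std::sin(lat1) * std::sin(lat2));

    cL = lat2 * 180.0 / PI;
    cO = lon2 * 180.0 / PI;
}

int main() {
    double cL, cO;
    crosshair_coord(35.0, -106.0, 45.0, 0.25, cL, cO);   // example inputs only
    std::printf("crosshair: %.6f, %.6f\n", cL, cO);
    return 0;
}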

Next, I would like for this information to be sent to the second OSD for the gimbal camera.

Now my question is, how do I implement this?

I would assume that I can attach the IMU and compass via the analog sensor inputs; however, I have no idea how to write the code for it.

I also do not know how to write the code for the calculations on the APM so that it can run all of the formulas.  Can the APM handle it, or do I need a separate processor?

How do I create a MAVLink message for the new coordinates and inject into the data stream?
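One possible approach (just a sketch, assuming the generated MAVLink C headers from common.xml) would be to reuse the existing NAMED_VALUE_FLOAT message instead of defining a custom one.  send_bytes() below is a hypothetical stand-in for whatever serial write the telemetry link uses.

// Sketch: pack the crosshair coordinate into two NAMED_VALUE_FLOAT messages
// and push them into the telemetry stream. Assumes the generated MAVLink C
// headers (common.xml); send_bytes() is a hypothetical transport function.
#include <common/mavlink.h>
#include <cstdint>

void send_bytes(const uint8_t *buf, uint16_t len);   // hypothetical serial write

void send_crosshair(float lat_deg, float lon_deg, uint32_t time_boot_ms) {
    mavlink_message_t msg;
    uint8_t buf[MAVLINK_MAX_PACKET_LEN];

    // System ID 1 and component ID 154 are arbitrary choices for this sketch.
    mavlink_msg_named_value_float_pack(1, 154, &msg, time_boot_ms, "XHAIR_LAT", lat_deg);
    send_bytes(buf, mavlink_msg_to_send_buffer(buf, &msg));

    mavlink_msg_named_value_float_pack(1, 154, &msg, time_boot_ms, "XHAIR_LON", lon_deg);
    send_bytes(buf, mavlink_msg_to_send_buffer(buf, &msg));
}

Note that NAMED_VALUE_FLOAT is single precision, so it only carries roughly seven significant digits; for full GPS resolution a custom message (or scaled integers, as GLOBAL_POSITION_INT uses) would be needed.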

How do I write the code for the MinimOSD to retrieve the new coordinates and display them?

These might sound like noob questions, but I come from a radio frequency engineering background and am just now getting into programming.  I would love to learn how to do this, but there is very little out there on exactly what I'm trying to do.  Sure, I can make an LED blink, and I know those are the basic skills, but I need to know exactly how to interface all of this with the APM and the sensors.

Once again, all of your help is greatly appreciated and I very much welcome discussion and suggestions on a better way to do this!

-Hunter


Comments

  • Hi Hunter,

    This is an interesting blog post. I, too, am looking at something similar. I am working on C++ video processing and would like to be involved in your project if possible. I have some interesting ideas that would greatly extend your system's capabilities.

    Many Thanks.
  • Hi, Mr. Parris, do you have any references for the equations used to obtain the target's coordinates on the ground? I would like to check the equations and try to derive them myself.

    Thanks
  • Hi Hunter,

    Is there any update regarding your project? We have had many inquiries about your solution. When do you think it will be ready?

  • Hi Hunter,

    Where will you read the coordinates, in Mission Planner or on the OSD?

    Regarding the target elevation, you need to rely on Google Earth, as you are saying.

    As for the plane, why do you want to calculate the plane's direction and do that extra math?

  • Elios,

    We are trying to make this as low cost as possible, just for this purpose...S&R.  I have looked into crowdsourcing, but finding the programmer was the hard part.  So now I have a small team and we are moving forward with our small business plan.  This software will be part of it.  We will make it available for free to anyone who contributes source code, hardware or software, or any investment.  To everyone else, we will try to make it as affordable as possible.  Our organization is called Aerobotics Group, and we are serious about civil RPAs doing great things for our communities.

  • Kevin, a rangefinder is very costly and heavy for smaller aircraft.  I'm trying to make this a software-based solution that can interface with most cameras and gimbals.  All you will need, in theory, is a computer, a camera, a video stream, and a gimbal.

  • Great job! We are looking forward to having the solution when it is ready. It is a great way to save missing people in snow, mountains, etc. by knowing their exact location.

  • There has been some progress. I have a programmer working on a third-party application to implement the solution. To solve for elevation differences, Mission Planner has a terrain map feature that can plot elevation changes using Google Earth server data. This application will expand on that feature, map the terrain along the entire length of the ground distance, and find where the line of sight (the hypotenuse, based on the camera's angle) intersects the terrain. That intersection gives the corrected ground distance and hence a more accurate geo-position. To account for the aircraft's attitude, the aircraft's attitude will be subtracted from the camera angle to give the true angle from vertical. Yaw shouldn't matter, since the bearing will be determined by the compass minus the camera position. I have the math done and am having it programmed now.
  • Hi Hunter,

    Is there any update regarding your project? It is a very interesting solution if it can be done. 

  • Hi Hunter, this is a very interesting topic. I'm facing exactly the same problem, but don't you need to take into account the UAV's attitude (yaw, pitch, and roll)? I mean, don't these values affect the camera's angles?