Overlaying GPS Coordinates for Camera Crosshairs

[Image: mock-up of the proposed OSD overlay showing LAT/LON, BRG, SRG and GRG readouts]

Hey Guys!

I am working on a project to implement GPS coordinates for the location of the crosshairs of a PTZ camera, otherwise known as geo-pointing.  This will essentially give us the LAT/LONG of whatever we are looking at.

I have found several methods to do this, but I'm looking for the simplest way that is still as accurate as possible.  I am fairly new to modifying APM and to MAVLink, so instructions and suggestions are very welcome!  Here is what I have in mind:

In order to get this:

LAT/LON: Self-explanatory

BRG: Bearing

SRG: Slant Range (Range from Aircraft to location)

GRG: Ground Range (horizontal range from the point directly below the aircraft to the target coordinate)

I have a formula to triangulate the position where the crosshairs are pointing, relative to the aircraft, using trigonometry.  Here is an example:

[Diagrams: example geometry of the aircraft/crosshair triangle and the resulting trigonometry]

The system will incorporate 1x APM 2.5 w/ GPS, 3DR Telemetry radios, 1x IMU w/ compass for camera gimbal, 2x MinimOSD boards, 2 cameras, 1x video TX with two-way camera switcher.

Method:

The application extracts the altitude from the APM and the camera angle from the camera IMU, then uses the tangent and cosine of that angle to find the missing sides of the triangle: the ground distance (the side opposite the angle) and the slant range (the hypotenuse), respectively.

GndDist = TAN(CAng)*Alt     Slant = Alt/COS(CAng)

(CAng is the camera angle measured down from the vertical.)
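
As a sanity check, here is roughly how that step could look in C++ on the Arduino/APM side.  The function and variable names are just placeholders, and it assumes the camera angle is measured down from the vertical (nadir):

    #include <math.h>

    // Camera angle is assumed to be measured down from vertical (nadir), in degrees.
    // Altitude is the height above the ground below the target, in metres.
    void rangesFromTilt(float camAngleDeg, float altitudeM,
                        float &groundDistM, float &slantRangeM)
    {
        float a = camAngleDeg * M_PI / 180.0f;
        groundDistM = tanf(a) * altitudeM;   // opposite side = adjacent * tan
        slantRangeM = altitudeM / cosf(a);   // hypotenuse    = adjacent / cos
    }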

At this point the slant and ground range can be transmitted to the video feed.

Next, we use the aircraft's GPS coordinates from the APM and the ground distance to find what coordinate we are looking at on the ground:

aL-aircraft Latitude

aO-aircraft Longitude

cL-crosshair Latitude

cO-crosshair Longitude

A-aircraft Altitude

D-ground Distance

B-Bearing from Gimbal Compass

 

cL = ASIN(SIN(aL*PI()/180)*COS(D/6378.137) + COS(aL*PI()/180)*SIN(D/6378.137)*COS(B*PI()/180))*180/PI()

cO = ((aO*PI()/180) + ATAN2(COS(D/6378.137) - SIN(aL*PI()/180)*SIN(cL*PI()/180), SIN(B*PI()/180)*SIN(D/6378.137)*COS(aL*PI()/180)))*180/PI()

These formulas are written in Excel syntax, since that is what I used to test them (D is in kilometres there, because 6378.137 is the Earth's radius in km).

This should give us the coordinate under the crosshair.
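
For reference, here is the same destination-point math as a C++ sketch (distance in metres instead of kilometres; names are placeholders).  Note that C's atan2(y, x) takes its arguments in the opposite order from Excel's ATAN2(x, y):

    #include <math.h>

    const double EARTH_RADIUS_M = 6378137.0;   // same 6378.137 km radius as the spreadsheet

    // Destination point: start lat/lon (deg), bearing from the gimbal compass (deg),
    // ground distance (m) -> crosshair lat/lon (deg).
    void crosshairCoordinate(double aLatDeg, double aLonDeg,
                             double bearingDeg, double groundDistM,
                             double &cLatDeg, double &cLonDeg)
    {
        double lat1 = aLatDeg * M_PI / 180.0;
        double lon1 = aLonDeg * M_PI / 180.0;
        double brg  = bearingDeg * M_PI / 180.0;
        double d    = groundDistM / EARTH_RADIUS_M;          // angular distance

        double lat2 = asin(sin(lat1) * cos(d) + cos(lat1) * sin(d) * cos(brg));
        double lon2 = lon1 + atan2(sin(brg) * sin(d) * cos(lat1),
                                   cos(d) - sin(lat1) * sin(lat2));

        cLatDeg = lat2 * 180.0 / M_PI;
        cLonDeg = lon2 * 180.0 / M_PI;
    }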

Next, I would like for this information to be sent to the second OSD for the gimbal camera.

Now my question is, how do I implement this?

I would assume that I can attach the IMU and compass via the analog sensor inputs; however, I have no idea how to write code for it.

I also do not know how to write the code for the calculations on the APM so that the APM can run all of the formulas.  Can the APM handle it, or do I need a separate processor?

How do I create a MAVLink message for the new coordinates and inject into the data stream?
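
I don't know the "right" answer to this yet, but one low-effort option might be to reuse the standard NAMED_VALUE_FLOAT message instead of defining a custom one, so the MAVLink dialect doesn't have to change.  A rough sketch, with the system/component IDs and the serial port only as assumptions:

    #include "mavlink.h"   // generated MAVLink C headers

    // Send the computed crosshair coordinates as two NAMED_VALUE_FLOAT messages.
    void sendCrosshairCoords(float cLat, float cLon, uint32_t timeBootMs)
    {
        mavlink_message_t msg;
        uint8_t buf[MAVLINK_MAX_PACKET_LEN];

        mavlink_msg_named_value_float_pack(1, 200, &msg, timeBootMs, "XHAIR_LAT", cLat);
        uint16_t len = mavlink_msg_to_send_buffer(buf, &msg);
        Serial1.write(buf, len);          // assumed telemetry radio port

        mavlink_msg_named_value_float_pack(1, 200, &msg, timeBootMs, "XHAIR_LON", cLon);
        len = mavlink_msg_to_send_buffer(buf, &msg);
        Serial1.write(buf, len);
    }

One caveat: a single 32-bit float only carries about seven significant digits, so a full latitude/longitude loses some precision packed this way; a custom message carrying int32 values scaled by 1e7, the way GPS_RAW_INT does, would be the cleaner long-term answer if that matters.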

How do I write the code to the MinimOSD to retrieve the new coordinates and display it?
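
On the MinimOSD side, the matching receive step might look something like this sketch; the actual panel-drawing call is whatever the OSD firmware already uses and is not shown here:

    #include <string.h>
    #include "mavlink.h"

    static float crosshairLat = 0.0f, crosshairLon = 0.0f;

    // Feed every byte from the telemetry stream into the MAVLink parser and
    // pick out the NAMED_VALUE_FLOAT messages packed on the aircraft side.
    void handleTelemetryByte(uint8_t c)
    {
        static mavlink_message_t msg;
        static mavlink_status_t  status;

        if (mavlink_parse_char(MAVLINK_COMM_0, c, &msg, &status)) {
            if (msg.msgid == MAVLINK_MSG_ID_NAMED_VALUE_FLOAT) {
                char name[11] = {0};                       // 10-char field + terminator
                mavlink_msg_named_value_float_get_name(&msg, name);
                float value = mavlink_msg_named_value_float_get_value(&msg);

                if (strcmp(name, "XHAIR_LAT") == 0) crosshairLat = value;
                if (strcmp(name, "XHAIR_LON") == 0) crosshairLon = value;
                // ...then format crosshairLat / crosshairLon into the OSD panel code.
            }
        }
    }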

These might sound like noob questions, but I come from a radio frequency engineering background and am just now getting into programming.  I would love to learn how to do this stuff, but there is very little out there on exactly what I'm trying to do.  Sure, I can make an LED blink, and I know those are the basic skills, but I need to know exactly how to interface this with the APM and sensors.

Once again, all of your help is greatly appreciated and I very much welcome discussion and suggestions on a better way to do this!

-Hunter


Comments

  • You can also keep large elevation libraries on the GCS system for elevation calculations.

  • Even at FL200, the calculations shouldn't be that complicated, given that the geometry and timing are correct.

  • Bryan,

    I'm very familiar with DTED.  I too work with MQ-1s and MQ-9s on a daily basis, although DTED doesn't affect me much in my job.  I think this would be much easier to implement on the ground due to the processing power and memory limitations of the APM and OSD.  The added benefit of having it on the ground is that you can essentially fly it IFR if your video cuts out, given you are proficient at flying IFR, lol.  I would love to get this thing running on the ground in a third-party application that can import the video and overlay the metadata stripped from the MAVLink.  Anyone know C# that wants to help me out?!

    If you notice Bryan, the overlay I have pictured is similar to the Preds and Reapers...that is where I kinda got this idea from.  

  • But then again, our aircraft are FL200+, so maybe this isn't an issue, and the simple calculations aren't that crazy to process on-board...

  • In the MQ-1 and MQ-9, we use DTED (Digital Terrain Elevation Data) that resides on the computers that run the GCS for the calculation of TGT coordinates, slant range, ground range, TGT elevation and "other stuff".  The elevation of the TGT makes a huge difference in the computed TGT coordinates.

    Another advantage of processing it on the ground is that I can use non-destructive, GCS-generated graphics to see where my crosshairs are pointed without destroying any of the MTS video.  Another benefit of offloading the task of computing the crosshair location is being able to have huge areas mapped out with DTED data.  We have the ability to use the destructive overlay graphics that most people are used to seeing in the RPA videos on YouTube, but I prefer not to, while still being able to embed the info in the metadata.

    Hope I'm not rambling!
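
    To make the elevation point concrete: the only change to the triangle in the original post is to measure height above the target instead of above the launch point.  lookupTerrainElevation() below is a hypothetical DTED/SRTM lookup that would live on the GCS:

        #include <math.h>

        // Hypothetical terrain lookup (DTED/SRTM tiles stored on the GCS).
        double lookupTerrainElevation(double latDeg, double lonDeg);

        // Same triangle as before, but measured against the terrain under the target.
        double groundDistanceWithTerrain(double aircraftAltMslM,
                                         double targetLatDeg, double targetLonDeg,
                                         double camAngleFromNadirDeg)
        {
            double heightAboveTarget = aircraftAltMslM
                                       - lookupTerrainElevation(targetLatDeg, targetLonDeg);
            return heightAboveTarget * tan(camAngleFromNadirDeg * M_PI / 180.0);
        }

    In practice this has to be iterated, because the target elevation isn't known until the crosshair coordinate is known (and vice versa).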

  • It all comes down to solving the camera-pointing algorithm in reverse order, as Lucas suggested. Within this algorithm there are three variables:

    1. Own (GPS) position, altitude

    2. Target position, altitude

    3. Pan/tilt X and Y values: X for pan, Y for tilt.

    The current algorithm gets the values from 1 and 2 above and calculates the pan/tilt angles that subsequently drive the servos.

    In your case, solve the algorithm by giving it the values from 1 and 3 and getting the target coordinates back, as sketched below.
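
    Put together, the reverse solution is just a thin wrapper around the two functions sketched in the main post (own position and pan/tilt in, target coordinates out), assuming pan is referenced to true north (otherwise add the aircraft heading) and tilt to the nadir:

        // Declared in the earlier sketches.
        void rangesFromTilt(float camAngleDeg, float altitudeM,
                            float &groundDistM, float &slantRangeM);
        void crosshairCoordinate(double aLatDeg, double aLonDeg,
                                 double bearingDeg, double groundDistM,
                                 double &cLatDeg, double &cLonDeg);

        // Inputs 1 (own position/altitude) and 3 (pan/tilt) -> output 2 (target coordinates).
        void solveTarget(double aLat, double aLon, double heightAboveTargetM,
                         double panDeg, double tiltFromNadirDeg,
                         double &tLat, double &tLon)
        {
            float groundDist, slant;
            rangesFromTilt(tiltFromNadirDeg, heightAboveTargetM, groundDist, slant);
            crosshairCoordinate(aLat, aLon, panDeg, groundDist, tLat, tLon);
        }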

  • Anyone have a suggestion on how to write this code to Arduino and APM?

  • I checked out magnetsystems.net.  Their stuff is pretty impressive.  But it still raises the question of cost.  It seems this is tailored to military and law enforcement, so the cost would appear to be out of reach of the hobbyist.  I want to make a low-cost system that will be readily available to everyone, even the hobbyist.  The formulas and implementation don't seem that hard; it's just getting it done.  I appreciate everyone's input on this.  You all have been fantastic over the years.

  • These guys are doing it already for quite some years now as far as I know: http://magnetsystems.net/

    They provide the coordinates not only of the crosshair location but of every pixel of the video window.  It's an impressive technology called video draping that georeferences the video from the UAV to its actual location on the ground in real time.

    Click on the video draping link and watch the videos. I have seen this technology in action and it allows you to hover the mouse over the draped video window to get real time geographic coordinates out of it……

  • I'm no expert at code, but I would love to get a piece of what you're working on. In terms of application, it would be nice to be able to aim your camera gimbal at something, then tell your quad to head to those coordinates and get a closer look. Or same with a fixed-wing, pick a point in the distance and instruct plane to circle around that ground coordinate.

    As for finding unknown altitude at a point, could that be pulled from Mission Planner? Can it be pulled out of the maps from Google used in the software?

    Great idea you've got
