Has anyone tried this, and where would be the best place to calculate it (APM, OSD, or GCS)?
Situation: You are flying around looking at things with your camera and see something of interest, say a broken fence on your farm. You put your camera's crosshair on it and read out the location the camera is pointing at, so you can go out and repair it.
Given: The location of the aircraft (Heading, Altitude, and Position) and orientation of the camera (Azimuth and Look Angle).
Problem: Determine the location of the item being viewed by calculating it from the known aircraft position/direction and the camera's azimuth/look angle, then output the location in a format that can be displayed in the Mission Planner software or video stream.
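For reference, the basic calculation can be sketched in a few lines, assuming a flat earth, level ground, and a look angle measured down from the horizon; the function and names here are purely illustrative, not from any existing APM/OSD/GCS code:

```python
import math

def target_location(lat, lon, heading_deg, alt_m, azimuth_deg, look_angle_deg):
    """Estimate the lat/lon the camera is pointing at.

    Assumptions: level ground, look angle measured down from horizontal,
    camera azimuth relative to aircraft heading, flat-earth approximation
    (fine for short camera-to-target distances).
    """
    # Absolute bearing of the camera = aircraft heading + camera azimuth
    bearing = math.radians((heading_deg + azimuth_deg) % 360.0)
    # Horizontal distance from the point directly below the aircraft
    ground_dist = alt_m / math.tan(math.radians(look_angle_deg))
    # Convert the metre offset to degrees of latitude/longitude
    dlat = ground_dist * math.cos(bearing) / 111_320.0
    dlon = ground_dist * math.sin(bearing) / (111_320.0 * math.cos(math.radians(lat)))
    return lat + dlat, lon + dlon
```

Any of APM, OSD, or GCS could run this; the GCS is the easiest place to experiment since it already has the telemetry stream and a map display.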
Replies
Hi Jay,
Did you get any success with the Target Localisation issue?
Regards
Ashutosh Kumar
Then I used GPS Visualizer (http://www.gpsvisualizer.com/calculators), found via a Google search: I entered the lat/long, distance, and bearing, and it came up with a lat/long and even produced a Google map.
Hope this helps. I expect all these calculations could be included within the OSD in code, but that is beyond me. At least it can be done on paper by looking at the OSD, as long as the screen shows the downward angle of the camera. These calculations assume the targets are on level ground.
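The distance-and-bearing step that GPS Visualizer performs is the standard great-circle "destination point" calculation, which would be straightforward to code into an OSD or GCS. A minimal sketch (the function name is mine):

```python
import math

EARTH_RADIUS_M = 6371000.0  # mean earth radius

def destination_point(lat_deg, lon_deg, bearing_deg, distance_m):
    """Lat/lon reached by travelling distance_m along bearing_deg."""
    lat1 = math.radians(lat_deg)
    lon1 = math.radians(lon_deg)
    brg = math.radians(bearing_deg)
    d = distance_m / EARTH_RADIUS_M  # angular distance on the sphere
    lat2 = math.asin(math.sin(lat1) * math.cos(d)
                     + math.cos(lat1) * math.sin(d) * math.cos(brg))
    lon2 = lon1 + math.atan2(math.sin(brg) * math.sin(d) * math.cos(lat1),
                             math.cos(d) - math.sin(lat1) * math.sin(lat2))
    return math.degrees(lat2), math.degrees(lon2)
```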
Well, part of the math is pretty simple, along the lines of http://www.csgnetwork.com/righttricalc.html for example.
The formula to get the distance on the ground is simple, since you have the angle and altitude. To give an example: at 20 m altitude with a 45-degree look angle, the distance on the ground is also 20 m.
At this point you know that the ground-based point in the example is 20 m away from your GPS location, along the heading of your vehicle. Depending on your requirement you might just record that data, return to the GPS location, and walk 20 m in the direction you were pointing, but ideally we would get the lat/long of the target rather than a set of instructions.
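That right-triangle step in code, assuming level ground and a look angle measured down from the horizon (the name is illustrative):

```python
import math

def ground_distance(altitude_m, look_angle_deg):
    """Horizontal distance to the viewed point, assuming level ground.

    look_angle_deg is measured down from horizontal:
    45 degrees at 20 m altitude gives 20 m; 90 degrees is straight down.
    """
    return altitude_m / math.tan(math.radians(look_angle_deg))
```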
http://williams.best.vwh.net/avform.htm holds the answer. I also found some examples for Android that do this:
http://www.basic4ppc.com/forum/bugs-wishlist/12909-coordinates-base...
So mathwise it is there.
I will come back to this later when I have a slightly more practical use for it. I do intend to integrate this functionality in a project I am working on.
In SAR it is indeed often problematic to figure out the relative position between an airborne observer and a ground-based rescue team, which of course is of paramount importance once a possible location is identified.
There is an Android app called "XYZworks Triangulate" that kind of does this (kind of). I have sent the developer an email to see if they would help with the math. As I understand it, the app uses the sensor data from the phone to get a LOB (line of bearing) from each of two readings and calculates the intersection location.
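The two-reading intersection can be sketched on a flat local plane; this is my own illustration of the general LOB-intersection technique, not the app's actual math. Each reading gives a ray from an observer position along a bearing, and the target is where the two rays cross:

```python
import math

def intersect_bearings(x1, y1, bearing1_deg, x2, y2, bearing2_deg):
    """Intersection of two lines of bearing on a flat x/y plane (metres).

    Bearings are compass-style: 0 = north (+y), 90 = east (+x).
    Returns None if the bearings are parallel (no intersection).
    """
    # Unit direction vectors for each line of bearing
    d1x, d1y = math.sin(math.radians(bearing1_deg)), math.cos(math.radians(bearing1_deg))
    d2x, d2y = math.sin(math.radians(bearing2_deg)), math.cos(math.radians(bearing2_deg))
    denom = d1x * d2y - d1y * d2x  # 2-D cross product of the directions
    if abs(denom) < 1e-12:
        return None  # parallel bearings never cross
    # Solve p1 + t*d1 == p2 + s*d2 for t, then step t along d1
    t = ((x2 - x1) * d2y - (y2 - y1) * d2x) / denom
    return x1 + t * d1x, y1 + t * d1y
```

For example, one observer at the origin sighting the target at 45 degrees and a second observer 100 m east sighting it at 315 degrees puts the target at (50, 50).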
What comes to mind immediately is targeting as used by the army; I believe they are also moving away from 'illuminating' targets toward GPS, but the illuminating approach would also require some form of ground-object tracking.
In your picture, you appear to be zapping your bird with lightning. Don't do that. You'll break it. :-)
If I'm surveying property with a fence line and looking for breaks, not having to fly over each of those breaks would allow me to maximize my flight time by just pointing the camera at that area and scanning the fence line with the camera.
In a SAR situation it would allow me to get the location of multiple targets and pass them to the ground for a closer look while continuing my search pattern.
Do you have a gimbaled camera in your scenario, so you can make it look straight down?
If yes, then fly the multirotor until you are exactly on top of your target (point the camera down to see the target). The telemetry will then give you your actual position, which will be visible in Mission Planner.
Figured I would add a super technical diagram to help with the explanation...