Determine the location of an object being viewed by the camera.

Has anyone tried this, and where would be the best place to calculate it (APM, OSD, or GCS)?

 

Situation: You are flying around looking at things with your camera and see something of interest. Let's say a broken fence on your farm. You point your camera's crosshair at it and read a location for where the camera is pointing so you can go out and repair it.

 

Given: The location of the aircraft (Heading, Altitude, and Position) and orientation of the camera (Azimuth and Look Angle).

 

Problem: Determine the location of the item being viewed by calculating it from the known aircraft position/direction and the camera's azimuth/look angle. Then output the location in a format that can be displayed in the Mission Planner software or video stream.

 


Replies to This Discussion

Figured I would add a super technical diagram to help with the explanation...

Do you have a gimbal camera in your scenario so it can look straight down?

If yes, then fly the multirotor until you are exactly on top of your target (point the camera down to see the target). Telemetry will then give you your actual position, which will be visible in Mission Planner.

I am planning on a gimbal. The issue with flying over it is that in some areas I might not be able to fly over to read it. This could be due to terrain or trees.

If I'm surveying property with a fence line and looking for breaks, not having to fly over each of those breaks would allow me to maximize my flight time by just pointing the camera at that area and scanning the fence line with the camera.

In a SAR situation it would allow me to get the location of multiple targets and pass them to the ground for a closer look while continuing my search pattern.

In your picture, you appear to be zapping your bird with lightning. Don't do that. You'll break it. :-)

The picture is correct, and the symbol used for radio communication is also correct. Please post useful information.

I think there is some related work done on a gimbal being able to hold an object in the center of view while the copter is moving.
What comes to mind immediately is targeting used by the army, as I believe they are also moving away from 'illuminating' targets to GPS, but the illuminating would require some form of ground-object tracking.

I'm not looking to track a ground object as that would feed the debate for civilian UAVs. I just want to be able to survey fences/livestock/SAR objects and get a location to revisit on foot/ATV/horseback.

I mention the SAR (Search and Rescue) aspect as I have been a volunteer for nine years. We usually rely on the State Police helicopter to aid in airborne searches. But with the budget cuts that resource could be drying up. I work mounted searches mostly and it would be great to have a quad copter in my saddle bag. Pull it out and launch it to check the other side of a canyon rather than take my horse to the other side which could take an hour or more just to get there.

There is an Android app called "XYZworks Triangulate" that kind of does this (kind of). I have sent the developer an email to see if they would help with the math. As I understand it, the app uses the sensor data from the phone to get a line of bearing from two readings and calculates the intersection location.

Well, whether you want to or not, these two are basically the same thing.

But back to the actual issue: you do mean to extrapolate the GPS position of an object centered in the camera view, right?

So let's examine the knowns: we know our GPS location, we know the altitude, and we know the angle at which we observe the ground-based object. So we need to apply geometry to extrapolate some form of GPS offset to add to/subtract from the known GPS position.

This should be fairly simple but my math etc is rusty... I'll get back to this later this evening when I have more time.

Well, part of the math is pretty simple, along the lines of http://www.csgnetwork.com/righttricalc.html for example.

The formula to get the distance on the ground is simple since you have the angle and altitude. To give an example: at 20 m altitude with a looking angle of 45 degrees, the distance on the ground is also 20 m.

At this point you know that the ground-based point in the example is 20 m away from your GPS location, along the heading of your vehicle. Depending on your requirement you might just record that data, return to the GPS location, and walk 20 m in the direction you were pointing, but ideally we would get the lat/long of the target rather than a set of instructions.
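That right-triangle step is easy to sketch in code. This is a minimal example of my own (the function name and the convention that the look angle is measured from straight down, i.e. nadir, are my assumptions, not anything from the thread):

```python
import math

def ground_distance(altitude_m, look_angle_deg):
    """Horizontal distance from the point directly below the aircraft
    to the target. look_angle_deg is measured from nadir (0 = straight
    down), so distance = altitude * tan(look angle)."""
    return altitude_m * math.tan(math.radians(look_angle_deg))

# The worked example above: 20 m altitude, 45-degree look angle
print(ground_distance(20.0, 45.0))  # ≈ 20 m
```

Note this assumes flat ground at the same elevation as the launch point; terrain relief will throw the estimate off.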

http://williams.best.vwh.net/avform.htm holds the answer. I also found some examples for Android to do this:

http://www.basic4ppc.com/forum/bugs-wishlist/12909-coordinates-base...

So math-wise it is there.
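For the last step, the Aviation Formulary linked above gives the standard great-circle "destination point" formula: start lat/long, a bearing (here, the camera azimuth), and a distance give the target's lat/long. A sketch of that formula in Python (the function name, Earth-radius constant, and sample coordinates are mine for illustration):

```python
import math

EARTH_RADIUS_M = 6371000.0  # mean Earth radius

def destination_point(lat_deg, lon_deg, bearing_deg, distance_m):
    """Lat/long of the point distance_m away from (lat_deg, lon_deg)
    along bearing_deg (0 = north), per the great-circle direct formula."""
    lat1 = math.radians(lat_deg)
    lon1 = math.radians(lon_deg)
    brg = math.radians(bearing_deg)
    d = distance_m / EARTH_RADIUS_M  # angular distance in radians
    lat2 = math.asin(math.sin(lat1) * math.cos(d) +
                     math.cos(lat1) * math.sin(d) * math.cos(brg))
    lon2 = lon1 + math.atan2(math.sin(brg) * math.sin(d) * math.cos(lat1),
                             math.cos(d) - math.sin(lat1) * math.sin(lat2))
    return math.degrees(lat2), math.degrees(lon2)

# Example: target 20 m due north of the aircraft's position
lat, lon = destination_point(40.0, -105.0, 0.0, 20.0)
```

Chaining this with the ground-distance calculation (and the aircraft heading plus camera azimuth for the bearing) gives the target location to display in Mission Planner.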

I will come back to this later when I have a slightly more practical use for it. I do intend to integrate this functionality in a project I am working on. 

In SAR it is indeed often problematic to figure out the relative position between an airborne observer and a ground-based rescue team, which of course is of paramount importance once a possible location is identified.

I believe that is based on visual triangulation/angles, which is slightly different in terms of inputs/outputs but uses the same kind of math.

Sense of humour bypass there, Criro..?
