Has anyone tried this, and where would be the best place to calculate it (APM, OSD, or GCS)?
Situation: You are flying around looking at things with your camera and see something of interest, say a broken fence on your farm. You put the camera's crosshair on it and read off the location the camera is pointing at, so you can go out and repair it.
Given: The location of the aircraft (Heading, Altitude, and Position) and orientation of the camera (Azimuth and Look Angle).
Problem: Determine the location of the item being viewed by calculating it from the known aircraft position/heading and the camera's azimuth/look angle, then output the location in a format that can be displayed in the Mission Planner software or the video stream.
Figured I would add a super technical diagram to help with the explanation...
Do you have a gimbal camera in your scenario, so you can point it straight down?
If yes, then fly the multirotor until you are exactly on top of your target (point the camera down to see it). Telemetry will then give you your actual position, which will be visible in Mission Planner.
In your picture, you appear to be zapping your bird with lightning. Don't do that. You'll break it. :-)
The picture is correct, and the symbol used for radio communication is also correct. Please post useful information.
I'm not looking to track a ground object as that would feed the debate for civilian UAVs. I just want to be able to survey fences/livestock/SAR objects and get a location to revisit on foot/ATV/horseback.
I mention the SAR (Search and Rescue) aspect as I have been a volunteer for nine years. We usually rely on the State Police helicopter to aid in airborne searches. But with the budget cuts that resource could be drying up. I work mounted searches mostly and it would be great to have a quad copter in my saddle bag. Pull it out and launch it to check the other side of a canyon rather than take my horse to the other side which could take an hour or more just to get there.
Well, part of the math is pretty simple, along the lines of http://www.csgnetwork.com/righttricalc.html for example.
The formula for the distance on the ground is simple, since you have the look angle and the altitude. For example: at 20 m altitude with a 45-degree look angle, the distance on the ground is also 20 m.
At this point you know that the ground point in the example is 20 m away from your GPS location, along the heading of your vehicle. Depending on your requirements you might just record that data, return to the GPS location, and walk 20 m in the direction you were pointing, but ideally we would get the lat/long of the target rather than a set of instructions.
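The worked example above (20 m altitude, 45-degree look angle) is just right-triangle trigonometry. A minimal sketch in Python, assuming flat ground and a look angle measured down from the horizon (the function name is mine):

```python
import math

def ground_distance(altitude_m: float, look_angle_deg: float) -> float:
    """Horizontal distance from the point directly under the aircraft
    to the spot the camera is looking at, assuming flat ground.

    look_angle_deg is measured down from the horizon, so a camera
    pointing straight down is 90 degrees.
    """
    return altitude_m / math.tan(math.radians(look_angle_deg))

# 20 m altitude, 45-degree look angle -> 20 m along the ground
print(ground_distance(20.0, 45.0))
```

Note this breaks down as the look angle approaches the horizon (tan goes to zero), which matches intuition: a nearly level camera is looking at something very far away.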
http://williams.best.vwh.net/avform.htm holds the answer; I also found some Android examples that do this.
So math-wise, it is there.
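The piece the Aviation Formulary supplies is the "destination point given start, bearing, and distance" formula. A sketch under the usual spherical-Earth approximation (fine at these distances; the function name and radius constant are my choices):

```python
import math

EARTH_RADIUS_M = 6371000.0  # mean Earth radius, spherical approximation

def destination_point(lat_deg: float, lon_deg: float,
                      bearing_deg: float, distance_m: float):
    """Lat/lon reached by travelling distance_m along a great circle
    from (lat_deg, lon_deg) on the given true bearing."""
    lat1 = math.radians(lat_deg)
    lon1 = math.radians(lon_deg)
    brg = math.radians(bearing_deg)
    d = distance_m / EARTH_RADIUS_M  # angular distance in radians

    lat2 = math.asin(math.sin(lat1) * math.cos(d)
                     + math.cos(lat1) * math.sin(d) * math.cos(brg))
    lon2 = lon1 + math.atan2(math.sin(brg) * math.sin(d) * math.cos(lat1),
                             math.cos(d) - math.sin(lat1) * math.sin(lat2))
    return math.degrees(lat2), math.degrees(lon2)
```

Feed it the aircraft's GPS position, the camera azimuth as the bearing, and the ground distance from the trig step, and you get the target's lat/long directly, in a form Mission Planner could display as a waypoint.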
I will come back to this later when I have a slightly more practical use for it. I do intend to integrate this functionality in a project I am working on.
In SAR it is indeed often a problem to figure out the relative position between an airborne observer and a ground-based rescue team, which is of course of paramount importance once a possible location is identified.
I believe that is based on visual triangulation/angles, which is slightly different in terms of inputs/outputs but uses the same kind of math.
Sense of humour bypass there, Criro..?