I'm about to build a proof of concept for a copter application. Generally speaking, the mission will be to use the copter to position a video camera in a fixed position, stream video, then return to the base station. Each mission will have nearly the same narrowly defined parameters. I'm wondering which platform - probably open source - would get me to a proof of concept most quickly.

  1. Launching from a base station and flying straight ahead
  2. Recognizing an object about the size of a shipping container on the ground and flying over it (typically traveling only in a two-dimensional arc)
  3. Recognizing a landing spot toward the far end of the object it just flew over—and landing.
    1. This landing spot will always be approximately the same shape, orientation, size, and in the same position relative to the object, but never exactly identical.
    2. Ideally visual recognition of the landing spot. But in certain cases, the landing spot can be identified and/or confirmed by its heat signature.
  4. Live streaming video to the base-station for up to 30 minutes, usually about 15 minutes.
    1. During this time, using the landing gear (as opposed to the flying gear) as the primary means of maintaining the camera's fixed position
    2. Knowing which general direction to point the camera toward. This would probably NOT be based on object recognition but based on its flight path. However, this step, pointing the camera, will often be operator-tweaked.
  5. Returning to the base-station

As I see it, the non-trivial requirements for the software are:

  1. Autonomous flight
  2. Simple visual object recognition and avoidance
  3. Visual identification of a landing spot
  4. Self-landing
  5. Autonomously returning to the base station

For the hardware, the non-simple requirements would be:

  1. Ability to add a thermal sensor to use for flight guidance
  2. Ability to fly well under windy conditions

What platform would you recommend for this application?

Thanks in advance!


P.S. I have experience with software but never with object recognition.



  • Looking around, I realize it's a question rather than a conclusion: is a camera the best means of object detection/recognition? Or would a different type of sensor, such as sonar, be better?

    Also, a GPS signal can't be assumed.

    • Hi Shmuel,

      Cameras can provide good object recognition, but there are usually some assumptions that need to be made about the object to be identified. Philosophically, this falls into the category of "a priori knowledge," and the better this knowledge is, the more accurate the identification will be.

      To recognize an object on the basis of its geometric dimensions ("an object about the size of a shipping container on the ground") from a limited point of view ("traveling in a two dimensional arc"), it might be easier to use a geometric measurement rather than image analysis - effectively you have defined a two-dimensional problem.

      The simplest way of doing this is to use a laser altimeter. Below is a typical trace that you can expect from one of our SF10/A laser altimeters where the large, rectangular object would be typical of the "container" and the flat area at the end might be the landing area.


      If you know the length of the "container" you can work out the ground speed using the number of laser samples that hit the top (at 32 readings per second) then use this to project the position of the landing area.
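      As a rough sketch of that calculation in Python (the 32 Hz sample rate comes from the SF10/A; the container length, ground level, and height threshold below are hypothetical values, not anything from the thread):

```python
# Sketch: estimate ground speed from a laser-altimeter trace by counting
# the samples that hit the top of the "container".
# Assumed values: 12.2 m container (a standard 40 ft box), 32 Hz sample
# rate, and a simple height-step threshold to classify samples.

SAMPLE_RATE_HZ = 32.0
CONTAINER_LENGTH_M = 12.2   # hypothetical: known length of the object
HEIGHT_STEP_M = 2.0         # hypothetical: minimum step that counts as the container

def ground_speed(readings, ground_range_m):
    """Estimate ground speed (m/s) from range-to-ground readings (m).

    Samples whose measured range is shorter than the range to flat ground
    by more than HEIGHT_STEP_M are assumed to have hit the container top.
    """
    on_top = [r for r in readings if ground_range_m - r > HEIGHT_STEP_M]
    duration_s = len(on_top) / SAMPLE_RATE_HZ
    return CONTAINER_LENGTH_M / duration_s

# Example: 64 samples over the top at 32 Hz -> 2 s over a 12.2 m object
readings = [30.0] * 20 + [27.4] * 64 + [30.0] * 20
print(round(ground_speed(readings, ground_range_m=30.0), 2))  # -> 6.1 m/s
```

      With the speed known, the position of the landing area can be projected by dead reckoning from the trailing edge of the object.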

      The SF10/A has a number of different interface protocols including analog voltage so it's very easy to connect to any type of flight controller. Additionally, there is a USB port so that the unit can be run directly from a laptop using our free LightWare Terminal program that also time-stamps the data for you. This configuration is ideal for running simulations and verifying the real-world performance before spending a lot of flight time trying to optimize your software.

      You can add more detail by using a multi-beam laser such as the SF33 or even scanning an SF30 high speed laser but this might be unnecessarily complicated for your project.

      Good luck! LD

      • Thanks LD! Being new to this, I didn't consider laser altimeters because I didn't know they existed for copters. It sounds like an interesting option because, as you point out, the drone will be flying primarily in two dimensions and the object it seeks to fly over will have a large enough height differential from its background (the ground). I like the suggestion for its simplicity and (probably) accuracy.

        Is the software available prior to purchase of an altimeter? I assume the trace you show is a simulation based on an object the size of a container (assuming the dimensions are meters...).

        Finally, I'm assuming, but would like you to confirm, that the programmed intelligence resides on and runs from the drone (as opposed to, say, the ground controller).

        Newb questions. Thanks for your help!

        • The software is available free from the website - it's a terminal emulation program with auto detection of any of our laser products when they are connected via USB to a laptop.
          Yes, the image is a simulation. I was hinting that you could design most of your system using simulations - it's much more efficient than relying on trial and error.
          There is no mapping capability in the laser and the terminal program is just a simple way of testing the unit and collecting data on a laptop.
          The idea behind the USB interface on all our lasers is that you can go and collect some real-world data to help make your simulations more accurate.
          You would need to create the 2D mapping software yourself; it isn't in the laser. But FYI, the SF10/A runs a 32-bit Cortex-M3 CPU alongside FPGA fabric with an analog co-processor and hardware interface drivers. It certainly has the brains for this project!
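          The 2D mapping itself can be very simple if you assume level flight at a known speed. A minimal sketch (the function name and the sample values are mine, purely for illustration):

```python
# Sketch: turn time-stamped laser ranges into a 2D ground profile,
# assuming level flight at a constant, known speed.

def height_profile(samples, altitude_m, speed_mps):
    """samples: list of (timestamp_s, range_m) pairs from the altimeter.

    Returns (distance_along_track_m, terrain_height_m) points, where
    terrain height is altitude minus the measured range.
    """
    t0 = samples[0][0]
    return [((t - t0) * speed_mps, altitude_m - r) for t, r in samples]

# Example: samples at 32 Hz, 6 m/s ground speed, flying 30 m up;
# the shorter ranges (27.4 m) are returns from the container top.
samples = [(0.0, 30.0), (1 / 32, 27.4), (2 / 32, 27.4)]
print(height_profile(samples, altitude_m=30.0, speed_mps=6.0))
```

          Collecting real traces over the USB/terminal-program route and replaying them through something like this is one way to tune the thresholds before committing to flight tests.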
          As a disclaimer, my experience and personal preference is to consider a laser altimeter as a possible solution. This doesn't mean that it's the only solution, so I would investigate all possibilities before making a final decision.

