Hello!  I am a developer interested in obstacle avoidance.  Could someone direct me to a group or individual working in this field?  Is anyone working with LIDAR, optical sensors, and the like for obstacle avoidance and path planning?  I'd like to help.

Replies

      • Interesting. Just out of curiosity, were you thinking of doing work on an open source basis or as a commercial endeavor? Is there any reason why you haven't joined up with the current development efforts around the Pixhawk and its derivatives? What sort of processing platform or system architecture do you think is appropriate for UAVs to become smarter and more reliable?

        • Right now I am interested in an open source project.  I guess I must be thick, but I can't seem to find a project under the Dronecode umbrella that is working on Obstacle Avoidance and Path Planning.  I see these two terms surrounded by dashed lines in one of the code structure diagrams, but I haven't seen a group working on them.  Does anyone know which aspect of the Pixhawk effort is addressing Obstacle Avoidance and Path Planning?  As for which platform is appropriate for UAVs, I'm not experienced enough to answer.  A few papers I have read used scanning laser range finders to sense the vehicle's surroundings.  In the paper "Flying Fast and Low Among Obstacles" (published 2008), the range data was processed by a 1.6 GHz Pentium M with 2 GB of RAM and 4 GB of flash memory.  That system accepts coarse mission waypoints and chooses the path closest to the programmed path that avoids obstacles (a toy sketch of that idea follows the link below).  The little ONAGOfly that I have seen advertised claims to do the same thing.

          https://www.ri.cmu.edu/pub_files/pub4/scherer_sebastian_2008_1/scherer_sebastian_2008_1.pdf
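
          A toy version of that "closest path that avoids obstacles" idea, just to show the gist (Python; the candidate headings, ranges, and safety distance are made up, and this is not the paper's actual planner):

            def pick_heading(desired_heading_deg, scan, safe_range_m=5.0):
                """Choose the candidate heading closest to the desired one
                whose measured range exceeds safe_range_m.

                scan: dict of candidate heading (deg, vehicle frame) -> range (m).
                Returns the chosen heading, or None if every direction is blocked.
                """
                clear = [h for h, r in scan.items() if r > safe_range_m]
                if not clear:
                    return None  # no safe direction: stop, hover, and replan
                # smallest angular deviation from the programmed course wins
                return min(clear, key=lambda h: abs((h - desired_heading_deg + 180) % 360 - 180))

            # Example: obstacle ahead and slightly left, clear space to the right
            scan = {-40: 12.0, -20: 3.0, 0: 2.5, 20: 11.0, 40: 15.0}
            print(pick_heading(0, scan))   # -> 20
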
          • That's a great research article. You might be interested to know that we have a "conical" pattern scanning laser due for release next month that produces a simplified version of the pattern described in the paper.

            Looking at the specs, I think the ONAGOfly uses simple ultrasonic sensors and doesn't really do any path planning.

            Have you considered using a standard flight controller platform, like the Pixhawk, BBBMini, RasPi or others, to do the flying whilst moving the collision avoidance and path planning to a separate, high-performance processor? This would allow for non-real-time strategic decision making whilst keeping time-critical actions such as stabilization as a more "reflex" function (see the sketch below).

            We are thinking of using UAVCAN bus as the primary network link between the two processing platforms with the LiDAR sensors attached directly to the "big brain" rather than jamming up the network or overloading the flight controller. Other sensors like the laser altimeter would be on the network so that both the flight controller and the big brain could access the data.
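
            To make that division of labour concrete, here is a sketch of the companion-computer side streaming velocity setpoints to the flight controller with pymavlink; note that this illustration uses MAVLink rather than the UAVCAN link described above, and the connection string and numbers are placeholders:

              from pymavlink import mavutil

              # Companion-computer ("big brain") side: the flight controller keeps
              # doing the real-time stabilization; we only stream slow, strategic
              # velocity setpoints. The connection string is a placeholder.
              link = mavutil.mavlink_connection('udpout:192.168.1.10:14550')
              link.wait_heartbeat()

              def send_velocity(vx, vy, vz):
                  # The type mask tells the autopilot to ignore the position,
                  # acceleration and yaw fields and use only the velocities.
                  link.mav.set_position_target_local_ned_send(
                      0,                                   # time_boot_ms (unused)
                      link.target_system, link.target_component,
                      mavutil.mavlink.MAV_FRAME_LOCAL_NED,
                      0b0000111111000111,                  # velocity-only mask
                      0, 0, 0,                             # position (ignored)
                      vx, vy, vz,                          # velocity, m/s, NED
                      0, 0, 0,                             # acceleration (ignored)
                      0, 0)                                # yaw, yaw rate (ignored)

              # e.g. creep forward at 1 m/s while the planner works out a path
              send_velocity(1.0, 0.0, 0.0)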

              • Having a direct link between the LiDAR and a dedicated high-performance processor makes sense to me.  The LiDAR information isn't any good to the flight controller on its own; it needs to be interpreted.  That is where my interest lies: interpreting the data.  If I were to start something on my own, I wouldn't even think about flying anything until I had worked out how to build a world map from the LiDAR data.  Are you shipping your product yet?  Do you have a website?
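
                As a starting point, the "world map from LiDAR data" part can be as simple as a 2-D occupancy grid; a minimal sketch, assuming the sensor hands you (bearing, range) pairs in the vehicle frame and you already have a pose estimate (the cell size and map extent are arbitrary, and it only marks beam endpoints, with no free-space ray tracing):

                  import math
                  import numpy as np

                  CELL = 0.25        # metres per grid cell
                  SIZE = 200         # 200 x 200 cells -> a 50 m x 50 m map
                  grid = np.zeros((SIZE, SIZE), dtype=np.uint8)   # 1 = occupied

                  def mark_scan(x, y, yaw, scan):
                      """Project one scan into the world-frame occupancy grid.

                      x, y, yaw : vehicle pose in the map frame (m, m, rad)
                      scan      : iterable of (bearing_rad, range_m), vehicle frame
                      """
                      for bearing, rng in scan:
                          # endpoint of the beam in world coordinates
                          wx = x + rng * math.cos(yaw + bearing)
                          wy = y + rng * math.sin(yaw + bearing)
                          col = int(wx / CELL) + SIZE // 2
                          row = int(wy / CELL) + SIZE // 2
                          if 0 <= row < SIZE and 0 <= col < SIZE:
                              grid[row, col] = 1   # cell the beam hit is occupied

                  # Example: a wall 4 m ahead spanning +/- 10 degrees of bearing
                  scan = [(math.radians(b), 4.0) for b in range(-10, 11)]
                  mark_scan(0.0, 0.0, 0.0, scan)
                  print(grid.sum(), "cells marked occupied")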

  • I have started experimenting with the Terabee sensors.
    I use the TeraRanger One & TeraRanger OSD (two TeraRanger Ones, one for the front and one for the rear).
    I have just mounted them on my copter.

    If you would like to exchange progress on this, feel free to write to me (=

    http://www.teraranger.com/products/teraranger-one/

    http://www.teraranger.com/products/teraranger-osd/

    • Wow!  That TeraRanger device looks really interesting.  How are you using it on your copter?  I mean, you receive range data from the TeraRanger, and then how do you translate that information into flight instructions?
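
      The naive version I can picture is just scaling the forward speed down as the front range shrinks; a toy sketch, with all the thresholds made up and nothing TeraRanger-specific about it:

        def forward_speed(front_range_m, cruise=2.0, stop_at=1.5, slow_from=6.0):
            """Map a single forward range reading to a speed command:
            cruise when clear, ramp down linearly as the obstacle nears,
            stop inside the keep-out distance. Thresholds are placeholders."""
            if front_range_m <= stop_at:
                return 0.0
            if front_range_m >= slow_from:
                return cruise
            return cruise * (front_range_m - stop_at) / (slow_from - stop_at)

        print(forward_speed(10.0))  # 2.0   -> clear ahead, fly at cruise
        print(forward_speed(3.0))   # ~0.67 -> obstacle getting close, slow down
        print(forward_speed(1.0))   # 0.0   -> too close, stop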

      • Scott, actually I've been using it only in OSD mode.
        I can read the distance to an object from the OSD data.

        For now I cannot interface it with my Pixhawk.
