
Traditional algorithms focused on this problem would use the images captured by each camera, and search through the depth-field at multiple distances - 1 meter, 2 meters, 3 meters, and so on - to determine if an object is in the drone’s path.

Such approaches, however, are computationally intensive, meaning that the drone cannot fly any faster than 5 or 6 miles per hour without specialized processing hardware.
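
To make that cost concrete, here is a minimal sketch (not the MIT code) of the exhaustive approach using OpenCV's block matcher: every pixel is tested against every candidate disparity before the disparity map is converted to metric depth. The focal length, baseline, and obstacle threshold are placeholder assumptions.

```python
import cv2
import numpy as np

# Illustrative sketch of the exhaustive approach, not the MIT code.
# Assumes a rectified grayscale stereo pair and placeholder calibration values.
FOCAL_PX = 420.0    # focal length in pixels (assumed value)
BASELINE_M = 0.34   # distance between the cameras in meters (assumed value)

left = cv2.imread("left.png", cv2.IMREAD_GRAYSCALE)
right = cv2.imread("right.png", cv2.IMREAD_GRAYSCALE)

# Block matching tests every candidate disparity for every pixel,
# which is the expensive full depth sweep described above.
matcher = cv2.StereoBM_create(numDisparities=64, blockSize=15)
disparity = matcher.compute(left, right).astype(np.float32) / 16.0  # fixed point -> pixels

valid = disparity > 0
depth_m = np.zeros_like(disparity)
depth_m[valid] = FOCAL_PX * BASELINE_M / disparity[valid]  # Z = f * B / d

# Anything that resolves closer than, say, 5 m counts as an obstacle here.
obstacle = valid & (depth_m < 5.0)
```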

Barry’s realization was that, at the fast speeds that his drone could travel, the world simply does not change much between frames. Because of that, he could get away with computing just a small subset of measurements - specifically, distances of 10 meters away.

“You don’t have to know about anything that’s closer or further than that,” Barry says. “As you fly, you push that 10-meter horizon forward, and, as long as your first 10 meters are clear, you can build a full map of the world around you.”
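
A minimal sketch of that single-depth trick, assuming a rectified stereo pair and known calibration, could look like the following: each image block is tested at only the one disparity that corresponds to 10 meters, and blocks whose left and right patches agree are flagged. This is not the authors' implementation; the function name, block size, and threshold are illustrative.

```python
import numpy as np

def single_depth_mask(left, right, focal_px, baseline_m,
                      depth_m=10.0, block=8, sad_thresh=15.0):
    """Flag image blocks that match at the one disparity corresponding to depth_m.

    left, right: rectified grayscale images (2-D NumPy arrays).
    focal_px, baseline_m: stereo calibration, assumed known.
    Returns a per-block boolean grid: True where the left and right patches
    agree at that single disparity, i.e. something likely sits roughly
    depth_m ahead. (Textureless regions match at any disparity; a real
    system needs extra checks for that, which are omitted here.)
    """
    d = int(round(focal_px * baseline_m / depth_m))  # disparity d = f * B / Z
    h, w = left.shape
    left = left.astype(np.int32)
    right = right.astype(np.int32)
    hits = np.zeros((h // block, w // block), dtype=bool)
    for by in range(h // block):
        for bx in range((d + block - 1) // block, w // block):
            y0, x0 = by * block, bx * block
            patch_l = left[y0:y0 + block, x0:x0 + block]
            patch_r = right[y0:y0 + block, x0 - d:x0 - d + block]
            # Mean absolute difference at exactly one disparity,
            # instead of sweeping every candidate depth.
            hits[by, bx] = np.abs(patch_l - patch_r).mean() < sad_thresh
    return hits
```

Because only one disparity is evaluated per block instead of a full sweep, the per-frame cost drops by roughly the number of depth levels a traditional search would test, which is what makes the check cheap enough to run onboard at frame rate.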

http://www.roboticstrends.com/article/watch_mit_drone_autonomously_avoids_obstacles_at_30_mph/Drones

The drone is a modified Team Blacksheep plane, flown by an APM 2.5. More details are in their paper.


Comments

  • Yes, it is about 20x faster and runs on 1/4 of the power.

    I am pretty sure that we'll see more and more FPGA-based controllers in the near future; here's why:

    - Xilinx (and Altera) have released very powerful SoCs that integrate ARM cores + FPGA, produced in high volume for the automotive industry.

    - There are numerous development kits available now, and you can get this one for $60: https://www.crowdsupply.com/krtkl/snickerdoodle

    - With this, you can build a system comparable to the one from the Computer Vision and Geometry group in Zurich.

    ... Yeah, you need a couple of PhDs to build the algorithm, and to spend endless hours tweaking weird camera registers and interfacing them over LVDS signals, but it can be done...

    - There is more and more open-source code (Verilog or VHDL) available on the web.

    And if you really want to get started in high gear, you can buy the Vivado HLS development suite (about $4k) and migrate your OpenCV code to programmable logic.

    I run a LogiBone cape (Xilinx Spartan-6) on my BBB, and it can process a camera stream at 30 fps and extract basic image features in real time (Gaussian, histogram, Sobel, contours, blobs). The GPIO is not optimised for DMA, so it cannot handle more complicated processing, but it is a great training tool (a rough CPU-side equivalent of that feature pipeline is sketched below for comparison).

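    For anyone who wants a CPU baseline to compare against that FPGA pipeline, a rough OpenCV equivalent of the per-frame features listed above (Gaussian, histogram, Sobel, contours, blobs) might look like the sketch below; the function and parameter choices are illustrative, not taken from the LogiBone design.

```python
import cv2

def process_frame(gray):
    """Rough CPU-side version of the per-frame features listed above.

    gray: one grayscale frame as a NumPy uint8 array.
    Returns the intermediate results so they can be inspected or timed
    against an FPGA pipeline.
    """
    blurred = cv2.GaussianBlur(gray, (5, 5), 0)                     # Gaussian smoothing
    hist = cv2.calcHist([blurred], [0], None, [256], [0, 256])      # intensity histogram
    sobel_x = cv2.Sobel(blurred, cv2.CV_16S, 1, 0, ksize=3)         # horizontal gradient
    _, binary = cv2.threshold(blurred, 0, 255,
                              cv2.THRESH_BINARY + cv2.THRESH_OTSU)  # Otsu binarization
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)         # contours (OpenCV 4.x)
    blobs = cv2.SimpleBlobDetector_create().detect(blurred)         # blob keypoints
    return blurred, hist, sobel_x, contours, blobs
```
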
  • So, if I understand correctly, the FPGA totally outperforms pushbroom at significantly lower power consumption but is more of a pain to muck about with. Seems to me a nice little magic FPGA box & cams combo (open source for the brave ones who want to tinker) at a reasonable price would be a dream.

  • Well, they are using 2x Odroid-XU3 boards, which makes 16 ARM cores running at up to 2 GHz; that's a pretty good powerhouse they've got in such a small plane.
  • So it's just a small ARM-based CPU doing the computing for the collision avoidance?

  • Thanks for the update, John Arne; it is on the front page of DIY Drones as well now.

    Interesting post from Andrew Barry, comparing FPGA vs. Pushbroom Stereo Vision for MAVs:

    http://groups.csail.mit.edu/robotics-center/public_papers/Barry15a.pdf

    Looks like the future of autonomous flight to me.

  • The latest video, made after the paper was posted, demonstrates fully autonomous avoidance.

    https://www.youtube.com/watch?v=_qah8oIzCwk

  • This is an interesting method; I am quite impressed by the speed of this pushbroom algorithm.
    Let us know when you get a real autonomous flight:
    "While these flights were manually piloted, we are confident that the system could autonomously avoid the obstacles with these data. The integration of the planning and control system will be reported in future work."
  • It means that they have a thin cable (that is stated to be non-conductive and therefore not supplying any power or data) attaching the aircraft to the ground, acting as a sort of manual / physical geofence.

  • What does "tethers onboard all research aircraft" mean? The paper notes this as well. I'd be interested to learn more about it.

  • Good work by Andrew J. Barry and Russ Tedrake. The abstract of the paper clearly states the following:

    Here, we describe both the algorithm and our implementation on a high-speed, small UAV, flying at over 20 MPH (9 m/s) close to obstacles.

