Here's a quick technical post for anyone attempting to harness the capabilities of a RealSense D435 camera on a Jetson TX2.  For me, this is about getting usable depth perception on a UAV, but it has proved more problematic than I originally anticipated.

The Intel RealSense D435 depth camera is small enough to mount on a UAV and promises enough range for object detection and avoidance at reasonable velocities.  My intention is to couple it with the machine learning capabilities of the Jetson TX2 to improve autonomous flight decision making.

The problem is that, as I write, the Intel RealSense SDK 2.0 does not officially support the ARM processor of the Jetson TX2.  This post links to my blog article, which aims to provide some simple installation instructions that now work for me, but took a long time to figure out!

(Full blog article link is https://mikeisted.wordpress.com/2018/04/09/intel-realsense-d435-on-jetson-tx2/)
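Once the SDK is built, a quick way to check that it is actually usable is to try the Python bindings. The sketch below is illustrative only: it assumes you built librealsense with the Python bindings enabled (e.g. `-DBUILD_PYTHON_BINDINGS=true` at the CMake step), and the function name and status strings are my own. It degrades gracefully if the module or the camera is missing.

```python
# Smoke test for the librealsense Python bindings (pyrealsense2).
# Hypothetical sketch: assumes the SDK was built from source with
# Python bindings enabled; degrades gracefully if the module or a
# camera is absent.

try:
    import pyrealsense2 as rs
except ImportError:
    rs = None  # bindings not installed/built on this machine


def realsense_status():
    """Return a short string describing SDK and camera availability."""
    if rs is None:
        return "pyrealsense2 not installed"
    ctx = rs.context()
    devices = ctx.query_devices()
    if len(devices) == 0:
        return "SDK installed, no camera connected"
    # Report the name of the first connected device (e.g. the D435).
    return "found: " + devices[0].get_info(rs.camera_info.name)


if __name__ == "__main__":
    print(realsense_status())
```

If this prints the device name, the SDK, the USB connection, and the bindings are all working, and you can move on to streaming depth frames.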

  • @Mike Isted

    I read your blog.  The specification for the

    Intel RealSense Depth Camera D435

    Depth Technology: Active IR stereo

    Maximum Range: Approximately 10 meters

    So the IR depth images in your blog are full of artifacts due to the limited maximum range of the RealSense IR technology.

    All you need for a flying drone is a single-camera blur-effect (depth-from-defocus) technology, which gives better results and has no 10-meter range limit.

    Give up on RealSense: for a fast-flying drone, a 10-meter range limit is a serious problem when detecting rigid or soft obstacles.

    First Flight: Intel RealSense D435 Depth Camera on Jetson TX2
      This is part of a series of posts outlining the evolution of my GroundHog hexacopter into a multi-role UAV.  It is based on a Pixhawk flight contro…
  • Excellent example of a pattern-recognition feature.

  • Mike, if you are aiming for autonomous navigation including obstacle avoidance, you will need something like an i7 processor in order to run the entire stack in real time.

    That is, everything from perception (visual SLAM) and state estimation through occupancy mapping and reactive path planning to MPC control.

    The setup in a recent paper by ETH uses an Intel NUC to run the full stack onboard.


    An Open-Source System for Vision-Based Micro-Aerial Vehicle Mapping, Planning, and Flight in Clutte…
    We present an open-source system for Micro-Aerial Vehicle autonomous navigation from vision-based sensing. Our system focuses on dense mapping, safe…