
As part of Google Summer of Code (GSoC), I have the privilege of mentoring a talented PhD student from Nanyang Technological University, Singapore, named Thien Nguyen.

Since the beginning of this project, Thien has delivered a series of Labs that serve not only as milestones for the project but also as a step-by-step guide for anyone who wishes to harness the power of computer vision for autonomous robots. The labs include:

  1. Lab 1: Indoor non-GPS flight using AprilTags (ROS-based)
  2. Lab 2: Getting started with the Intel RealSense T265 on Raspberry Pi using librealsense and ROS
  3. Lab 3: Indoor non-GPS flight using the Intel T265 (ROS-based)
  4. Lab 4: Autonomous indoor non-GPS flight using the Intel T265 (ROS-based)
  5. Lab 5: MAVLink bridge between the Intel T265 and ArduPilot (non-ROS)
  6. Lab 6: Calibration and camera orientation for vision positioning with the Intel T265

I invite you to read this series of well-detailed experiments and instructions on how you can implement the RealSense T265 tracking camera system:

https://discuss.ardupilot.org/t/gsoc-2019-integration-of-ardupilot-and-vio-tracking-camera-for-gps-less-localization-and-navigation/42394

Here is a video showing autonomous indoor flight using the system in a ROS/MAVROS environment (this is part of Lab 4):

Lab 5 shows how to fly using a Python script sending the MAVLink VISION_POSITION_ESTIMATE message directly to the flight controller.


You can read about the underlying principles of how to incorporate a VIO tracking camera with ArduPilot using Python and without ROS. After installing the necessary packages and configuring the FCU parameters, the vehicle can integrate the tracking data and perform precise navigation in GPS-less environments. The pose confidence level is also available for viewing directly on the GCS, to quickly assess the performance of the tracking camera.
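To give a feel for the non-ROS approach, here is a minimal sketch (not Thien's actual Lab 5 script; the function name and layout are illustrative) of the central step: mapping a T265 position sample into the NED frame that ArduPilot expects. The frame mapping assumes the camera faces forward in its default orientation (T265 frame: x right, y up, z backward); other mountings are exactly what Lab 6 covers.

```python
def t265_to_ned(x_cam: float, y_cam: float, z_cam: float) -> tuple:
    """Map a T265 position sample (metres) to NED (north, east, down).

    Assumes the default forward-facing orientation, where the T265
    frame has x pointing right, y up, and z backward.
    """
    north = -z_cam  # camera -z points forward
    east = x_cam    # camera +x points right
    down = -y_cam   # camera +y points up
    return north, east, down

# With pymavlink, the converted pose would then be streamed to the FCU,
# roughly like this (illustrative only):
#   master.mav.vision_position_estimate_send(
#       usec, north, east, down, roll, pitch, yaw)
```

The orientation (roll/pitch/yaw) needs the same kind of frame rotation; getting that mapping wrong is the most common failure mode when the EKF rejects vision data.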

Thanks to Thien for this amazing project, experiments can now be carried out on the T265 with different flight controllers and stacks compatible with the VISION_POSITION_ESTIMATE MAVLink message.


Comments

  • Thanks for the suggestion :-)

As for the D435, take a look at my code; you can make arrays of different resolutions and extract the minimum value (the closest object) using NumPy.

     https://github.com/patrickpoirier51/REALSENSE-EXPERIMENTS/blob/mast...

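A rough NumPy sketch of the idea described in that comment (the function name and grid size are illustrative; the actual code is in the linked repo): downsample a D435 depth frame into a coarse grid of cells and take minima, so the nearest obstacle can be found cheaply.

```python
import numpy as np

def closest_obstacle_mm(depth: np.ndarray, grid=(100, 100)) -> int:
    """Return the nearest valid depth (mm) in a frame; 0 pixels = no reading."""
    gh, gw = grid
    h, w = depth.shape
    # Crop so the frame divides evenly into the grid.
    depth = depth[: h - h % gh, : w - w % gw]
    cells = depth.reshape(gh, depth.shape[0] // gh, gw, depth.shape[1] // gw)
    # Treat zero (invalid) readings as "infinitely far" before taking minima.
    valid = np.where(cells > 0, cells, np.iinfo(np.uint16).max)
    cell_min = valid.min(axis=(1, 3))   # per-cell nearest depth, shape (gh, gw)
    return int(cell_min.min())          # nearest obstacle in the whole frame
```

With pyrealsense2, the input frame would come from something like `np.asanyarray(depth_frame.get_data())`, which yields the uint16 millimetre array this sketch assumes.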
  • "As the title says, it is GPS denied environment.

but what follows, repeated 3 times, is

    "Indoor non-GPS flight

    indoor flight is always non-GPS so

    "Indoor non-GPS flight   - stays for tautology

My suggestion for improved wording:

    Triangulation (2D SLAM/ 3D SLAM) based indoor navigation flight

    as good and as reliable as in GPS environment

    "GPS denied environment.

    Indoor environment stays exactly for "GPS denied environment.

    BTW

    " you might consider is using the D435 API (or Python Wrapper) to generate the 100 points x 100 points depth live array. I did something similar and it works quite good.

Should I process megapixel frames with a blur filter, at the risk of losing small sharp objects (like wires or nails) in the obstacle avoidance algorithm?

  • Hello,

    As the title says, it is GPS denied environment.

To make it clear and simple, we generally refer to "non-GPS" so everyone can understand that the autonomous modes (like Loiter, Guided, and Auto) are being driven by a different localisation system.

    If you want something more academic: "Implementation of Visual Inertial Stereoscopic Odometry as a 6 Degree Of Freedom Pose Estimator"... ;-)

Please note that the Labs are performed indoors, as Thien cannot fly drones outdoors on campus. I am presently experimenting outdoors, and proceeding with caution as the system can degrade rapidly when exposed to bright sunlight.

If you read through the blogs, there are still a couple of roadblocks to making the T265 a reliable system that can be used with accuracy in various configurations, like looking down at higher altitudes. There are many references to open PRs on the RealSense GitHub, and we are progressing as fast as the issues are addressed by Intel.

A final note on the T265-D435 bundle: as you can read, I do have them on hand, but there is NO official release of the SLAM+Avoidance software system. We are waiting for an official, documented release, hopefully with real 3D mapping.

    As you can see here, it is still in development branch and on 2D format : https://github.com/IntelRealSense/realsense-ros/tree/occupancy-mapp...

You can send me a personal message; please note that most of my free time is dedicated to mentoring Thien on this interesting project until September.

    Regards

  • a short question

    since the term "Indoor non-GPS flight" gets repeated 3 times,

    are you suggesting

    Indoor GPS flight to be a valid alternative ?

To my understanding, indoor GPS is not something that should be considered a reliable technology for controlling the flight of a drone indoors,

so promoting the Intel T265 as a viable alternative to a highly unreliable technology

may hurt Intel's marketing strategy.

Indoor GPS doesn't sound OK from an R&D point of view.

  • thank you

    "We are looking as well or the official release of the Realsense Combo  (T265 + D435) as a SLAM+Avoidance system.

so you are in luck; read below about the official release,

    from

    https://www.intelrealsense.com/tracking-camera-t265/

    "

    For a limited time, get the power of an Intel® RealSense™ Depth Camera D435 and Tracking Camera T265 bundled together for one great price, and get started with your next project today.

    " you might consider is using the D435 API (or Python Wrapper) to generate the 100 points x 100 points depth live array. I did something similar and it works quite good.

    how to contact you about the details and success story ?

Well, it is a little off-topic, but what you might consider is using the D435 API (or Python wrapper) to generate the 100 points x 100 points live depth array. I did something similar and it works quite well.

We are looking as well for the official release of the RealSense combo (T265 + D435) as a SLAM+Avoidance system.

  • Thank you Patrick

    from

    https://www.intelrealsense.com/tracking-camera-t265/

    "

    For a limited time, get the power of an Intel® RealSense™ Depth Camera D435 and Tracking Camera T265 bundled together for one great price, and get started with your next project today.

    https://discuss.ardupilot.org/t/gsoc-2019-integration-of-ardupilot-...

    https://github.com/IntelRealSense/librealsense/blob/master/doc/samp...

I am building a lidar-based, forward-looking 2D radar array for my car, to offer obstacle avoidance and alerting.

    Surround lidar is low cost device today.

What matters is turning it into a forward-only-looking device (a 100 points x 100 points live depth array).

Laser lidar is not heavy on computing power, since the number of points analyzed is reduced 100+ times

(a 1 Mpx x 1 Mpx video frame vs. a 100 px x 100 px laser lidar array radar).

Since my car is not intended for autonomous mode, I don't need 2D or 3D SLAM support.

My friends develop laser-based 3D scanners,

and I am just testing every laser scanner and laser projector for outdoor mode,

since most laser distance meters fail to work in direct daily sun,

so they are not fit for outdoor obstacle avoidance activities.

Let me know what should work better for me:

a depth camera vs. laser lidar (2D radar) as an outdoor obstacle avoidance solution?

  • Thanks Chris

    Mateusz no problem, you can spread the news :-)

Most of the details are in the series of blogs published on ArduPilot.org; you can get the essence directly from Thien's GitHub:

    https://github.com/hoangthien94/vision_to_mavros

Also @Patrick: when you copied the lab descriptions, you also copied the click counts by mistake, making it look like there were "ROS 2"-based labs (I got quite excited for a second!)

  • WOW! Great job! I strongly believe that combining ROS and ArduPilot can result in really powerful applications.

Are there any plans to host those tutorials outside of the ArduPilot forums? I'll be sharing information about the labs in the next issue of Weekly Robotics; hope you don't mind the extra traffic!
