Companion Computer RPI Balloon_Finder

Hello Companion Experimenters

Based on Randy's Red_Balloon_Finder project, this is an implementation of that fascinating project on a Raspberry Pi 2.

This is the perfect companion computer project, and it covers the whole spectrum of autonomous flight:

1- High-level processing language: Python + OpenCV

2- Full integration of the Companion Computer (CC) with the Flight Controller (FC) and the Ground Control Station (GCS) using MAVProxy over a mix of serial and UDP links

3- Integration of the Balloon_Finder with Software In The Loop (SITL)

4- Usage of the Drone API, allowing all sorts of automated flight control

5- A perfect introduction to OpenCV, using a drone-specific application that is relatively easy to program and configure with Python

6- Creation of a complete standardized system image dedicated to this application, so it can be used by all interested experimenters as a training tool to get into vision-controlled autonomous flight

Here is the proposed system:

[Image: proposed system architecture diagram]

It is based on the Companion Computer Architecture described here.

It is composed of 3 major building blocks:

1- FC: Flight Controller; this is basically the autopilot

2- CC: Companion Computer, which handles the high-level processing

3- GCS: Ground Control Station; this is Mission Planner, QGroundControl, MAVProxy (--console), or any other GCS

The FC is interfacing with these signals:

A) Manual or backup control link from the radio control. Transmission type: PWM, PPM, SBus, etc. Freq.: 2.4 GHz

B) Telemetry & control to and from the CC using UART. Signal type: MAVLink. Direct connect.

C) Interface with various sensors and actuators: PWM, UART, I2C, SPI, USB, GPIO

The CC is interfacing with these signals:

A) Main DOWNLINK to the GCS. Transmission type: WiFi, LTE, other. Freq.: 5 GHz (WiFi). Bandwidth: 6-600 Mbps

B) Telemetry & control to and from the FC using UART (see the minimal connection sketch at the end of this post).

C) Interface with the camera, using USB 2 for a Logitech webcam or CSI-2 for the Raspberry Pi camera

The GCS is interfacing with these signals:

A) Main DOWNLINK to/from the CC.

B) Optional connection to the Internet using LTE or a hotspot

C) On the computer side, numerous interfaces can be connected:

    i. Immersive goggles

    ii. Joystick

    iii. Gesture Interface (embedded tablet IMU)

    iv. You name it


Replies

  • Hi,

    I am curious to know whether, instead of using the balloon popper code, we could use a similar setup to the one described in this post for following designated targets in the live video frame. Something like follow-me, but instead of using a GPS to track the target position, we would use the RPi to calculate the relative position of the target from the drone/camera and accordingly send velocity requests to the drone in guided mode to follow the target.

    • Hello,

      Yes, it is possible, but in the current state of development you will have to carry a red balloon with you!

      More advanced types of object recognition will be implemented in future releases, but I must admit that for human tracking from above we will need more processing power. To accomplish this you need to have the computer ''learn'' the matching pattern, like the OpenCV HOG descriptor; see this excellent blog here that explains the principle. I doubt, however, that there are many models to match a human from above. It will become possible using an NVIDIA TX1 running Deep Learning stuff, but that project will be in a different Companion Computer forum and with a much higher price tag :-)

  • Developer

    I'd suggest you fork this: https://github.com/diydrones/companion and then commit your script(s) as per the hierarchy (i.e. RPI2/Raspbian for Randy and whoever wants to help the Raspberry Pi effort, and Odroid_XU4/Ubuntu for Bill's efforts), then push to your personal repo(s) and make a pull request. :-)

  • These are the next steps:

    a) Build an inventory of all the required programs, drivers, tools to create the project's image;

    b) Test, Optimise and integrate all pieces of the puzzle into a working system;

    c) Document the various shell programs and service daemons involved in getting the system running;

    d) Describe the workflow and inner operations of the Balloon_Finder program and how it interacts with the other programs;

    e) Document methods and procedures to fly the balloon_finder;

    f) Expand this project by adding more features and complexity. Here are some examples:

      - Optimize code to get better accuracy, speed and hit rate

      - Add gimbal control so the camera can scan while flying

      - Add moving object tracking

      - Add multiple object recognition

    Well, that is a lot of work, but Randy has already done most of it, so I will try my best to follow in his footsteps... Hope you will too!

    • CHAPTER 6

      BUILDING A NEW KERNEL FOR V4L2LOOPBACK TO GET MULTIPLE CAMERA SOURCES

       DO NOT RUN rpi-update ONCE COMPLETED (apt-get update is OK)


      starting with a new image (kernel 4.1.17)
      sudo rpi-update (only if starting from a new RPI image)
      sudo apt-get update
      sudo apt-get install dkms build-essential gcc-4.7
      sudo rm /usr/bin/gcc
      sudo ln -s /usr/bin/gcc-4.7 /usr/bin/gcc

      #== check that gcc is version 4.7
      gcc --version
      uname -a == check kernel version, then wget the matching kernel headers (4.1.17)

      wget https://www.niksula.hut.fi/~mhiienka/Rpi/linux-headers-rpi/linux-he...
      sudo dpkg -i linux-headers-4.1.17-v7+_4.1.17-v7+-2_armhf.deb

      get release 9.1 from the address below
      https://github.com/umlaeute/v4l2loopback/releases
      extract it in /home/pi and cd into the extracted directory
      sudo make
      sudo make install

      ---comments from developers---
      On the RPI the implementation is not fully functional because of the missing kernel headers.
      This is a workaround using insmod instead of modprobe == you need to reboot to modify setups.

      v4l2loopback requires the videodev module to be loaded manually:
      cd ./v4l2loopback-release
      sudo modprobe videodev
      sudo make modprobe
      then
      sudo insmod ./v4l2loopback.ko

      make a script, or integrate it in rc.local:
      sudo nano load.sh
      sudo modprobe videodev
      sudo insmod ./v4l2loopback.ko devices=1 exclusive_caps=0
      sudo chmod +x load.sh


      check:
      lsmod == show driver installed
      ls /dev/vid* == show new video devices

      ==============================================================
      SETUP RPI2: Note: sudo chown -R pi /home/pi (optional)
      http://dev.ardupilot.com/wiki/raspberry-pi-via-mavlink/


      sudo apt-get install == note: if we implement just the basic v4l2 capture and redirect, gstreamer0.10-plugins-base and gstreamer0.10-plugins-good are enough
      (sudo apt-get install gstreamer-tools gstreamer0.10-plugins-base gstreamer0.10-plugins-good gstreamer0.10-plugins-bad gstreamer0.10-plugins-ugly)
      (sudo apt-get install gstreamer1.0 )

      TEST:
      gst-inspect == xxx plugins , xxx features
      =============================================================

      =================V4L2LOOPBACK======================
      BASIC TEST NO CAMERA:
      ls /dev/video* = (Note down the new devices: let X be the number of the first new device.)
      v4l2-ctl -D -d /dev/videoX = show config
      gst-launch videotestsrc ! v4l2sink device=/dev/videoX = feed target on virtual
      mplayer -tv device=/dev/videoX tv:// = see target on mplayer


      Some utility examples:
      v4l2loopback-ctl set-fps 30 /dev/video0
      v4l2loopback-ctl set-caps "video/x-raw-yuv, width=640, height=480" /dev/video0

      =======Camera feed to virtual video port============

      the resolution has to be in conformance with the v4l2 setup (see v4l-utils)
      gst-launch -v v4l2src device=/dev/video0 ! ffmpegcolorspace ! video/x-raw-rgb,width=320,height=240,framerate=15/1 ! v4l2sink device=/dev/video1
      gst-launch -v v4l2src device=/dev/video0 ! ffmpegcolorspace ! video/x-raw-yuv,width=640,height=480,framerate=15/1 ! v4l2sink device=/dev/video1


      Local with RGB
      gst-launch -v v4l2src device=/dev/video1 ! ffmpegcolorspace ! video/x-raw-rgb,width=320,height=240,framerate=15/1 ! autovideosink sync=false
      mplayer -tv device=/dev/video1 tv://

      =========Streaming over WIFI:=========

      3 Options are possible:

      A) WIFI Unicast & Broadcast with encoders like Gstreamer
      B) Reflector: a web page on the CC to which any HTTP-based app, like a browser or the Mission Planner HUD, can directly connect, e.g. mjpg-streamer, ffmpeg, motion
      C) WifiBroadcast, which tweaks the wifi session to pass packets without encapsulation, for faster speed and lower latency.


      All these methods are opening a large field of experimentation.
      Patrick Duffy is doing mostly Gstreamer over wifi using Ubiquiti equipment, and gets pretty good results.
      See his forum: http://diydrones.com/xn/detail/705844:Comment:1799906
      befinitiv is developing the WifiBroadcast stuff; you can read his forum:
      http://diydrones.com/forum/topics/3-km-hd-fpv-system-using-commodit...
      and web: https://befinitiv.wordpress.com/wifibroadcast-analog-like-transmiss...


      I do not intend to get deeper into this for the balloon_finder. The cheapest, quickest way to get the image on the GCS will be retained.
      So far it is either Gstreamer UDP reflected into VLC on the GCS, or low-res HTTP with ffmpeg/mjpeg-streamer/motion.

      Final decision following test and benchmark.

    • Developer

      Patrick,

      I've been off consumed with various things for a couple of weeks but this is still very much on my mind.

      Do you think this v4l2loopback would work with the RPI2 camera or would it only work with a web cam?  I think the rpi camera doesn't appear as a device for some reason.

    • Randy,

      Me too; as you know, I took some days off to get my hands into the MICRO-Zee project. It helps sometimes to switch your brain to a different project :-)

      I'll give it a try. Basically you set the RaspiCam up as a v4l2-compatible device by loading the driver into the kernel with sudo modprobe bcm2835-v4l2 (you can make it permanent by adding it to /etc/modules).

      The major drawback with v4l2loopback is that it works only with gst-0.10, so you can expect all sorts of limitations and compatibility problems. It is a solution with a very narrow set of options, but it can do the job.

    • UPDATE:

      http://stackoverflow.com/questions/25941171/how-to-get-gstreamer1-0...
      It seems there is a bug in negotiating supported resolutions between the raspicam v4l2 driver and gstreamer. You can find more info on the official RasPi forum. Thanks to the great developers at the Raspberry Pi Foundation there is also a workaround/fix for that.
      When loading the driver, just add the "gst_v4l2src_is_broken=1" flag, like this:

      sudo modprobe bcm2835-v4l2 gst_v4l2src_is_broken=1

      Here is an example of the raspberry pi camera v4l2 driver mapped on video0, streaming through gstreamer to a v4l2loopback virtual video1 device:

      gst-launch -v v4l2src device=/dev/video0 ! ffmpegcolorspace ! video/x-raw-yuv,width=640,height=480,framerate=15/1  !  v4l2sink device=/dev/video1 

      We then opened 2 concurrent OpenCV sessions, both feeding from the same source: the virtual mapped video1.

      Total processor load: 47%; each OpenCV session utilizes 88% of a core.

      [Screenshot: CPU load with the two concurrent OpenCV sessions]

    • CHAPTER 5 Hookup - Setup - Lab Test


      Software In The Loop -SITL -Testing
      An Ubuntu-based computer is running SITL:
      sim_vehicle.sh -v ArduCopter -f X -j 2 --out=192.168.2.108:14550 --aircraft=balloon --quadcopter --console --map


      Testing mavproxy on the RPI (add .18)
      Using a tcp connection on port 5763 == telemetry port (set the address of the PI and the port of the master)
      mavproxy.py --master tcp:192.168.2.18:5763 --aircraft MyCopter

      Configure Balloon_finder:
      load python colour_finder.py
      adjust the settings to filter the balloon
      save the settings by sliding the bottom cursor
      a file balloon_finder.cnf will be saved in the home directory
      the parameters can be copied/merged into the balloon_finder.cnf in the script directory

      [Screenshot: colour_finder.py tuning sliders and filtered view]


      Testing the balloon_finder
      module load droneapi.module.api
      api start /home/pi/Flight_Director/ardupilot-balloon-finder/scripts/balloon_strategy.py

      on console type:
      mode loiter
      arm throttle
      rc 3 1800 === climb to 15-20 m altitude
      rc 3 1500 === stabilize height
      mode guided

      ==camera starts
      ==sequence begins
      == ''the hunt is on''
      mode rtl
      control-c to quit

      The video sequence (camera and OpenCV overlay) is saved in AVI files in the home directory (5 fps).
      Playback: mplayer balloon-yyyy-mm-dd-hh-min.avi


      more to follow......

    • Developer

      Oh good, you've got at least parts of it running!
