Hello Companion Experimenters

Based on Randy's Red_Balloon_Finder project, this is an implementation of that fascinating project on a Raspberry Pi 2.

This is the perfect companion computer project, and it covers the whole spectrum of autonomous flight:

1- High-level processing language: Python + OpenCV

2- Full integration of the Companion Computer (CC) with the Flight Control (FC) and the Ground Control Station (GCS) using MAVProxy over a mix of serial and UDP links

3- Integration of the Balloon_Finder with Software In The Loop (SITL)

4- Usage of the Drone API, allowing all sorts of automated flight control

5- A perfect introduction to OpenCV, using a drone-specific application that is relatively easy to program and configure in Python (see the sketch after this list)

6- Creation of a complete, standardized system image dedicated to this application, so that all interested experimenters can use it as a training tool to get into vision-controlled autonomous flight
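
To give an idea of item 5, here is a minimal sketch of the kind of colour-threshold detection involved, in Python with OpenCV. The HSV bounds and the camera index are illustrative only, not the project's tuned values:

import cv2
import numpy as np

def find_red_balloon(frame):
    """Return (x, y, radius) of the largest red blob in a BGR frame, or None."""
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    # Red wraps around the hue axis, so combine two ranges.
    lower = cv2.inRange(hsv, np.array([0, 120, 70]), np.array([10, 255, 255]))
    upper = cv2.inRange(hsv, np.array([170, 120, 70]), np.array([180, 255, 255]))
    mask = cv2.bitwise_or(lower, upper)
    # [-2] keeps this working across OpenCV 2.x/3.x/4.x return signatures.
    contours = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)[-2]
    if not contours:
        return None
    (x, y), radius = cv2.minEnclosingCircle(max(contours, key=cv2.contourArea))
    return int(x), int(y), int(radius)

if __name__ == "__main__":
    cap = cv2.VideoCapture(0)   # USB webcam, or the RaspiCam via its v4l2 driver
    ok, frame = cap.read()
    if ok:
        print(find_red_balloon(frame))
    cap.release()

From the blob's offset relative to the frame centre, the vehicle's pointing error can then be derived.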

Here is the proposed system:

It is based on the Companion Computer Architecture described here.

This is composed of 3 major building blocks:

1- FC: Flight Control, this is basically the autopilot

2- CC: Companion Computer, which handles the high-level processing

3- GCS: Ground Control Station, this is Mission Planner, QGroundControl, MAVProxy with --console, or any other GCS

The FC interfaces with these signals:

A) Manual or backup control links from the radio control. Transmission type: PWM, PPM, SBUS, etc. Frequency: 2.4 GHz

B) Telemetry & Control to and from the CC using UART. Signal type: MAVLink. Direct connection.

C) Interface with various sensors and actuators: PWM - UART - I2C - SPI - USB - GPIO

The CC interfaces with these signals:

A) Main DOWNLINK to the GCS. Transmission type: Wi-Fi, LTE, other. Frequency: 5 GHz (Wi-Fi). Bandwidth: 6 - 600 Mbps

B) Telemetry & Control to and from the FC using UART (see the sketch after this list)

C) Interface with the camera, using USB 2.0 for a Logitech webcam or CSI-2 for the Raspberry Pi camera
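
As an aside, here is a minimal sketch of what the CC end of link B can look like using the Drone API (DroneKit-Python) directly; the device name and baud rate are assumptions to adjust to your wiring, and in the Balloon_Finder setup MAVProxy normally sits in between and forwards the stream over UDP:

# Minimal sketch of the CC end of link B (Telemetry & Control over UART).
# Assumes DroneKit-Python is installed and the FC is wired to /dev/ttyAMA0
# at 57600 baud -- adjust the device and baud rate to your own setup.
from dronekit import connect

vehicle = connect('/dev/ttyAMA0', baud=57600, wait_ready=True)
print("Mode: %s  Armed: %s" % (vehicle.mode.name, vehicle.armed))
vehicle.close()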

The GCS interfaces with these signals:

A) Main DOWNLINK to/from the CC.

B) Optional connection to the Internet using LTE or a hotspot

C) On the computer side, numerous interfaces can be connected:

    i. Immersive goggles

    ii. Joystick

    iii. Gesture interface (embedded tablet IMU)

    iv. You name it

Replies to This Discussion

Hello Glenn

The way you describe it, it looks like you are trying to build some kind of bridge?
You might take a look at this:
http://www.glennklockwood.com/sysadmin-howtos/rpi-wifi-island.html

Hi Patrick,

Yes, I had found that blog. But it appears to destroy my wlan0, and it no longer connects to my AP even though the /etc/network/interfaces settings haven't changed.

I do want the RPi to bridge the eth0 (192.168.0.*) network and the wlan0 (192.168.178.*) network, but I don't know if standard bridging will allow my workstation on the 192.168.178.* network to talk to a device on the 192.168.0.* network.

Hi,

I am curious to know whether, instead of using the balloon popper code, we could use a similar setup, as described in this post, for following designated targets in the live video frame. Something like follow-me, but instead of using a GPS to track the target position, we use the RPi to calculate the relative position of the target from the drone/camera and accordingly send velocity requests to the drone in guided mode to follow the target.
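
Roughly, I picture the velocity-request part looking like the sketch below, built on the Drone API's SET_POSITION_TARGET_LOCAL_NED message in GUIDED mode; the connection string and the 1 m/s forward velocity are just placeholders:

# Hypothetical sketch of a single velocity request in GUIDED mode, sent with
# DroneKit-Python's message factory. Values here are placeholders.
from dronekit import connect
from pymavlink import mavutil

vehicle = connect('udp:127.0.0.1:14550', wait_ready=True)

msg = vehicle.message_factory.set_position_target_local_ned_encode(
    0, 0, 0,                                    # time_boot_ms, target system, target component
    mavutil.mavlink.MAV_FRAME_BODY_OFFSET_NED,  # velocities relative to vehicle heading
    0b0000111111000111,                         # type_mask: only the velocity fields are used
    0, 0, 0,                                    # x, y, z positions (ignored)
    1.0, 0.0, 0.0,                              # vx, vy, vz in m/s (1 m/s forward)
    0, 0, 0,                                    # accelerations (ignored)
    0, 0)                                       # yaw, yaw_rate (ignored)
vehicle.send_mavlink(msg)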

Hello,

Yes, it is possible, but in the current state of development you will have to carry a red balloon with you !!

More advanced types of object recognition will be implemented in future releases, but I must admit that for human tracking from above we will need more processing power. To accomplish this you need to have the computer ''learn'' the matching pattern, like the OpenCV HOG descriptor; see this excellent blog here that explains the principle. But I doubt there are many models to match a human from above. It will become possible using an NVIDIA TX1 running Deep Learning stuff, but that project will be in a different Companion Computer forum and with a much higher price tag :-)
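
For reference, the stock OpenCV HOG people detector boils down to a few lines like the sketch below; its default model is trained on upright, ground-level views of people, which is exactly why it struggles when looking down from a drone (the image paths are placeholders):

import cv2

# Stock OpenCV HOG pedestrian detector with its default (upright-people) model.
hog = cv2.HOGDescriptor()
hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())

frame = cv2.imread('frame.jpg')                     # placeholder input image
rects, weights = hog.detectMultiScale(frame, winStride=(8, 8), scale=1.05)
for (x, y, w, h) in rects:
    cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
cv2.imwrite('detections.jpg', frame)                # placeholder output image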

Patrick,

I've been off consumed with various things for a couple of weeks but this is still very much on my mind.

Do you think this v4l2loopback would work with the RPi2 camera, or would it only work with a webcam? I think the RPi camera doesn't appear as a device for some reason.

Randy,

Me too. As you know, I took some days off to get my hands into the MICRO-Zee project; it sometimes helps to switch your brain to a different project :-)

I'll give it a try. Basically, you set the RaspiCam up as a v4l2-compatible device by loading this driver into the kernel: sudo modprobe bcm2835-v4l2 (you can make it permanent by adding it to /etc/modules).

The major drawback with v4l2loopback is that it only works with gst-0.10, so you can expect all sorts of limitations and compatibility problems. It is a solution with a very narrow set of options, but it can do the job.

UPDATE:

http://stackoverflow.com/questions/25941171/how-to-get-gstreamer1-0...
It seems there is a bug in negotiating supported resolutions between the raspicam v4l2 driver and GStreamer. You can find more info on the official RasPi forum. Thanks to the great developers at the Raspberry Pi Foundation, there is also a workaround/fix for that.
When loading the driver, just add the "gst_v4l2src_is_broken=1" flag, like this:

sudo modprobe bcm2835-v4l2 gst_v4l2src_is_broken=1

Here is an example of the Raspberry Pi camera v4l2 driver, mapped on video0, streaming via GStreamer onto a v4l2loopback virtual video1 device:

gst-launch -v v4l2src device=/dev/video0 ! ffmpegcolorspace ! video/x-raw-yuv,width=640,height=480,framerate=15/1  !  v4l2sink device=/dev/video1 

And then we opened 2 concurrent OpenCV sessions, both feeding from the same source: the virtual mapped video1.

Total processor load: 47%; each OpenCV session utilizes 88% of a core.
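
For reference, each session simply opens the loopback device like any other camera; a minimal sketch (index 1 corresponds to the /dev/video1 mapping above):

import cv2

# Each OpenCV session opens the v4l2loopback device like any other camera.
cap = cv2.VideoCapture(1)     # index 1 -> /dev/video1
while True:
    ok, frame = cap.read()
    if not ok:
        break
    # ... colour filtering / balloon detection on `frame` goes here ...
cap.release()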
