pymavlink usage on Raspberry Pi

Dear all,

I was hoping someone with some experience with pymavlink could help me out.

I currently have a Raspberry Pi connected to my Pixhawk via the TELEM2 port and have developed an image-processing program to track a target. So far I am able to use MAVProxy to connect to the Pixhawk from the command line and send commands like arm, set parameters, etc., but I can't seem to set up a connection within my Python script using pymavlink (the goal is to override RC channel 4 to provide lateral tracking).

I think it may have something to do with how I have installed pymavlink. I simply cloned the repository to my home directory and ran "sudo setup.py install". Any light that could be shed on this would be great.
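In case it helps to see what I'm aiming for, here is a minimal sketch of the connection and channel override I'm trying to set up (untested; the serial device, baud rate and PWM value are assumptions, so adjust them to match your wiring and SERIAL2 settings):

    # Minimal sketch: open a MAVLink connection to the Pixhawk over TELEM2
    # and override RC channel 4. Untested; the device name, baud rate and
    # the 1600 us value are assumptions.
    from pymavlink import mavutil

    # On the Pi, TELEM2 typically appears on the UART, e.g. /dev/ttyAMA0
    master = mavutil.mavlink_connection('/dev/ttyAMA0', baud=57600)

    # Wait for a heartbeat so target_system / target_component get filled in
    master.wait_heartbeat()

    # Override channel 4 only; 0 is used here to leave the other channels
    # alone (check the RC_CHANNELS_OVERRIDE semantics for your firmware)
    master.mav.rc_channels_override_send(
        master.target_system,
        master.target_component,
        0, 0, 0, 1600,   # channels 1-4; 1600 us nudges channel 4 off centre
        0, 0, 0, 0)      # channels 5-8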

Thanks.

Replies

  • I looked at your website.  Nice.  Correct the typo though. ("I my free time")

    I will read your design doc in more depth, so please expect more feedback later.

    For right now, please let me share my prototype:

    http://droneatc.ca

    It can coordinate multiple simulated drones.  I don't dare hook it up to anything real yet.

    I am looking at how to improve it with better ORCA-style collision avoidance.
    I found an academic paper here.
    https://mirror.umd.edu/roswiki/attachments/multi_robot_collision_av...

    If I take a while getting back to you on your design doc, it's because I started that first, and it's pretty heady.  ;)

    If you want a proper demo of the site (because I don't often leave the simulators running 24/7), just drop me a message.

  • Is it possible to send data from the Raspberry Pi, such as images captured with the camera module, through the TELEM2 port?

    • MAVLink does support image transmission, as you can see in the protocol definition here:
      https://pixhawk.ethz.ch/mavlink/

      It's explained in more detail here:
      http://qgroundcontrol.org/mavlink/image_transmission_protocol
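      For reference, a rough sketch of what that protocol might look like with pymavlink (untested; the serial port, baud rate, frame size and JPEG quality are assumptions, and the DATA_TRANSMISSION_HANDSHAKE / ENCAPSULATED_DATA message names come from common.xml):

          # Rough sketch: stream one JPEG frame using the MAVLink image
          # transmission protocol (DATA_TRANSMISSION_HANDSHAKE followed by
          # ENCAPSULATED_DATA chunks). Untested; port, baud and frame size
          # are assumptions.
          from pymavlink import mavutil

          CHUNK = 253  # payload size of one ENCAPSULATED_DATA message

          master = mavutil.mavlink_connection('/dev/ttyAMA0', baud=57600)
          master.wait_heartbeat()

          with open('frame.jpg', 'rb') as f:
              jpeg = f.read()

          packets = (len(jpeg) + CHUNK - 1) // CHUNK

          # Announce the image that is about to be streamed
          master.mav.data_transmission_handshake_send(
              mavutil.mavlink.MAVLINK_DATA_STREAM_IMG_JPEG,  # stream type
              len(jpeg),   # total size in bytes
              640, 480,    # frame width and height (assumed)
              packets,     # number of ENCAPSULATED_DATA packets to follow
              CHUNK,       # payload bytes per packet
              80)          # JPEG quality (assumed)

          # Send the image a chunk at a time, zero-padding the last packet.
          # Passing the payload as a bytearray is an assumption; the exact
          # type pymavlink expects for byte arrays can vary by version.
          for seq in range(packets):
              chunk = bytearray(jpeg[seq * CHUNK:(seq + 1) * CHUNK])
              chunk += bytearray(CHUNK - len(chunk))
              master.mav.encapsulated_data_send(seq, chunk)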

    • I'd be curious to know this as well... Can the Pixhawk accept local data (e.g. camera images from the Pi) to send out via its TELEM2 port? I think most people in this scenario are using GSM or Wi-Fi on the Pi to send out camera data, but I wonder if others have used the 3DR radio to send that camera data from the Pi as well.

      • The telemetry link doesn't have the bandwidth to send an image in a single message. You might be able to compress the image and send it a few bytes at a time across many messages.
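        As a rough feel for the numbers (the 57600-baud link rate and 30 kB frame size below are assumptions):

            # Back-of-envelope: time to push one compressed frame over a
            # 57600-baud telemetry link, ignoring protocol overhead.
            baud = 57600                   # assumed air data rate
            bytes_per_sec = baud / 10.0    # ~10 bits per byte with framing
            image_bytes = 30 * 1024        # assumed 30 kB JPEG frame
            chunks = image_bytes / 253.0   # ENCAPSULATED_DATA carries 253 bytes
            print("chunks: %.0f  transfer time: %.1f s"
                  % (chunks, image_bytes / bytes_per_sec))
            # -> roughly 120 chunks and ~5 seconds per frame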

  • We have been looking at using target tracking to stabilise a zoom camera at maximum zoom where the movement is too low frequency for gimbal stabilisation. We are looking at optical flow or tracking. Can you share the details on your image processing?

    • Will you be using optical flow and tracking for stabilization?

      I would think the IMU would give better results. 

      • Yeah... no. Low frequency is the issue. The IMU is great for the higher-frequency transients.

        • Then I guess you would need a higher-resolution gimbal, or maybe increase the P gains.

          Designing an algorithm fast enough to compute the optical flow just for stabilization would mean adding a microprocessor to your UAV, and probably 50+ hours of work on the algorithm itself.

          I would try to get a better IMU or a gimbal. 
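
          For what it's worth, a rough sketch of the measurement side with OpenCV's sparse Lucas-Kanade optical flow (untested; the camera index, feature counts and use of the median shift are all assumptions, and it only estimates frame-to-frame drift, not the gimbal correction):

              # Rough sketch: estimate frame-to-frame image shift with sparse
              # Lucas-Kanade optical flow. Untested; parameters are placeholders.
              import cv2
              import numpy as np

              cap = cv2.VideoCapture(0)  # assumed camera index
              ok, prev = cap.read()
              prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)

              while True:
                  ok, frame = cap.read()
                  if not ok:
                      break
                  gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

                  # Pick trackable corners in the previous frame
                  pts = cv2.goodFeaturesToTrack(prev_gray, maxCorners=100,
                                                qualityLevel=0.01, minDistance=10)
                  if pts is not None:
                      nxt, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, gray,
                                                                pts, None)
                      good = status.ravel() == 1
                      if good.any():
                          # Median shift in pixels ~ the low-frequency drift
                          dx, dy = np.median(nxt[good] - pts[good], axis=0).ravel()
                          print("dx=%.1f dy=%.1f" % (dx, dy))
                  prev_gray = gray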
