Indoor navigation with ArduCopter

Hello everyone,

I want to mount a Raspberry Pi and a camera module on my ArduCopter-powered drone and have computer vision guide the drone's movement. Flying will be performed indoors, so GPS isn't an option.

I'd like my computer vision code to look for a target and send the positional errors to ArduCopter. Generating the positional errors isn't an issue; passing them to ArduCopter is.

I tried working with DroneAPI: I can get data from the drone and guide it using RC Override commands, but this isn't a very good method.
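
For reference, the RC Override approach I'm using looks roughly like this with DroneKit-Python (the successor to DroneAPI); the serial device and the mapping gain are placeholders for my setup, not tuned values:

    from dronekit import connect

    # Connect to the flight controller over the Pi's UART.
    vehicle = connect('/dev/ttyAMA0', baud=57600, wait_ready=True)

    def steer(roll_error, pitch_error, gain=100):
        # Map vision errors onto RC PWM values around mid-stick (1500 us).
        # The gain of 100 PWM units per error unit is a made-up example.
        vehicle.channels.overrides = {
            '1': int(1500 + gain * roll_error),   # channel 1: roll
            '2': int(1500 + gain * pitch_error),  # channel 2: pitch
        }

    # Assigning an empty dict clears all overrides.
    vehicle.channels.overrides = {}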

What other options do I have?

I'd prefer to use UART, and to avoid modifying the ArduCopter code if possible, as I'm not familiar with it (or C/C++, for that matter).

Replies

  • Have you worked further on this?

    What is the status of the GLOBAL_VISION_POSITION_ESTIMATE message, and can it be used to input external position estimates into the EKF?

    Regards, Thomas

  • @Bim  I have a similar situation.  I'm trying to fly a quad indoors with an optical flow sensor (PX4Flow) and a rangefinder (LIDAR-Lite) using the 3.3-RC10 firmware.  I'm able to take off in Loiter mode (which, from what I understand, is the only flight mode currently supported when using optical flow), but I'm having to use RC Override commands in DroneKit (DroneAPI) for any kind of movement because I can't get into Guided mode.  Have you found any way to control the quad indoors without having to override pitch/yaw/roll using RC overrides?  It might simply be impossible at the moment, because the MAVLink message SET_POSITION_TARGET_LOCAL_NED requires the quad to be in Guided mode, which can't be used with optical flow (yet).
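
    For reference, below is the standard DroneKit pattern I'd want to use once Guided mode works with optical flow; it sends a velocity-only SET_POSITION_TARGET_LOCAL_NED (the connection string is just an example):

        from dronekit import connect
        from pymavlink import mavutil

        vehicle = connect('udp:127.0.0.1:14550', wait_ready=True)

        def send_ned_velocity(vx, vy, vz):
            # The type_mask tells the autopilot to use only the three
            # velocity fields (m/s, NED frame) and ignore the rest.
            msg = vehicle.message_factory.set_position_target_local_ned_encode(
                0, 0, 0,                              # time_boot_ms, target system, target component
                mavutil.mavlink.MAV_FRAME_LOCAL_NED,  # frame
                0b0000111111000111,                   # type_mask: velocity only
                0, 0, 0,                              # x, y, z position (ignored)
                vx, vy, vz,                           # velocity in m/s
                0, 0, 0,                              # accelerations (ignored)
                0, 0)                                 # yaw, yaw_rate (ignored)
            vehicle.send_mavlink(msg)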

  • I am also very interested in your progress. Can you share anything about your computer vision code? Like, what exactly does it do? Do you have to use a stereo camera, or how else do you estimate distance in your pictures?

  • Developer

    Bim,

    There are three possible solutions, I think:

    1. implement an angular controller in Guided mode to allow the companion computer to control the vehicle's lean angles more directly.  The companion computer would be responsible for estimating the vehicle's velocity and position and converting the (velocity and position) errors into angle requests.

    2. feed an estimated position into the EKF as a substitute for a GPS position.  The EKF could then work as it does now, calculating position, velocity and acceleration by combining this data with the other sensors.  There's a GLOBAL_VISION_POSITION_ESTIMATE message, but we haven't implemented the Copter side of it (which could be complex); a rough sketch of sending it appears after this list.

    3. attach an optical flow sensor to the vehicle.  The EKF already knows how to use this sensor to estimate position and velocity.  The companion computer could then control the vehicle with either the velocity or position controllers.
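
    For what it's worth, sending that vision message from the companion computer with pymavlink would look something like the sketch below.  It's illustrative only, since as mentioned Copter doesn't consume it yet; the serial device is an assumption:

        import time
        from pymavlink import mavutil

        master = mavutil.mavlink_connection('/dev/ttyAMA0', baud=921600)
        master.wait_heartbeat()

        def send_vision_pose(x, y, z, roll, pitch, yaw):
            # Position in metres, attitude in radians, timestamp in microseconds.
            master.mav.global_vision_position_estimate_send(
                int(time.time() * 1e6), x, y, z, roll, pitch, yaw)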

    • Hi Randy,

      1. I've read about VISION_POSITION_ESTIMATE; has it been implemented in ArduCopter at all?  It sounds like you're saying it hasn't. I wanted to check that I understood correctly, or whether anything had changed since 2015.

      http://dev.px4.io/external-position.html

      2. You mentioned in suggestion #2 above feeding an estimated position into the EKF as a GPS substitute; how would I go about that?

    • Developer

      Alex,

      For #1: no, still not implemented.

      For #2, someone is working on this, so I think it could happen soon-ish.  I gather you're a developer, so if you're happy writing code, here's some advice from Tridge and me.

      For now, sending in the MAVLink HIL_GPS message from the external position-estimation system is a good way to start (a quick pymavlink sketch follows the list below).  It has some limitations, so if these become apparent we might need to create a new message like the one below.
      EXTERNAL_ABSOLUTE_POSITION - the same as the HIL_GPS message but:
      - remove the redundant cog field
      - add a velocity accuracy estimate
      - add a position accuracy estimate (otherwise the EKF will be guessing)
      - make velocities floats in m/s
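
      As a rough sketch, feeding in HIL_GPS from the companion computer with pymavlink would look something like this (units follow the message definition; the serial device, accuracy values and lock claims are assumptions):

          import time
          from pymavlink import mavutil

          master = mavutil.mavlink_connection('/dev/ttyAMA0', baud=921600)
          master.wait_heartbeat()

          def send_fake_gps(lat_deg, lon_deg, alt_m, vn, ve, vd):
              # HIL_GPS uses integer units: degE7, mm, cm/s, cdeg.
              groundspeed = (vn ** 2 + ve ** 2) ** 0.5
              master.mav.hil_gps_send(
                  int(time.time() * 1e6),   # time_usec
                  3,                        # fix_type: 3 = 3D fix
                  int(lat_deg * 1e7),       # lat, degE7
                  int(lon_deg * 1e7),       # lon, degE7
                  int(alt_m * 1000),        # alt, mm above sea level
                  100, 100,                 # eph, epv in cm (guessed accuracy)
                  int(groundspeed * 100),   # vel: ground speed, cm/s
                  int(vn * 100),            # vn, cm/s
                  int(ve * 100),            # ve, cm/s
                  int(vd * 100),            # vd, cm/s
                  65535,                    # cog: UINT16_MAX = unknown
                  12)                       # satellites_visible: claim a solid lock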

      Write a new GPS_MAVLink backend (within the AP_GPS library).
      - should not be auto-detected but instead require the GPS_TYPE to be set to a new number
      - should be able to use in parallel with other GPS types
      - to trick the EKF into using it, it should claim a higher level of lock (i.e. a 4D fix or a high number of sats)
      - the GCS_MAVLink library should:
          - inject the appropriate mavlink messages into the AP_GPS library via a new function AP_GPS::handle_mavlink_msg()
          - AP_GPS lib should then send it to all backends but all will ignore it except the new GPS_MAVLink driver.
          - ideally we should make the whole AP_GPS library a singleton so we can avoid passing a reference to AP_GPS into the GCS_MAVLink library.
      - ideally we should add support in SITL (it's easy, says Tridge):
         - add a SITL GPS backend for it
         - add a MAVProxy module to test it

      Overall potential issues:
      - need to be careful about lag, because GCS messages are only processed at 50Hz in Copter (this could be increased)
      - does the vision system need high-speed IMU and attitude data from the vehicle?  This could add additional delays

      I know this is quite raw developer info but I'm afraid this is the best I can offer for the moment.

    • Hi Randy, thanks for the detailed response!  Yes, I am a developer, but I don't yet have any experience working on ArduCopter itself; so far just DroneKit.  I'll take a look at HIL_GPS, that sounds promising.

      I'm currently using SEND_NED_VELOCITY to control the drone, as we generally do have GPS where we fly, but the location data from our visual system is (hopefully) much more accurate.  We're going to try RC overrides as well, in the hope that they might give us more precise control than SEND_NED_VELOCITY.

      But, getting our location data into the EKF (using something like HIL_GPS?) sounds like a more robust approach!

    • Thank you for the reply Randy.

      Looks like there is no easy way for a C/C++ illiterate then, huh :)

      1. I'd prefer the autopilot to be responsible for velocity estimation; I'm guessing that if the companion computer did it, that would add some latency.

      2. Does it estimate velocity by differentiating GPS coordinate readings, or by taking the velocity measurements from the GPS sensor?
      As far as I know, GPS sensors measure both position and velocity (the velocity measurement being more accurate than differentiating the position).

    • Developer

      Bim,

      I think option #3 is the most viable then.  That leaves the flight controller responsible for the velocity estimation.

      DroneKit is in Python (and other languages), so if you're comfortable with that you could use it for the companion-computer side.  We'd need to work together to enable Guided mode to work with only the optical flow and no GPS (currently optical flow only works in Loiter mode).

      To answer your question #2: the EKF uses the accelerometer plus either the GPS or the optical flow to estimate velocity.

  • I am very interested in your progress on this. Please keep me updated.
    I am pursuing a similar goal; however, I will be using sonar/LIDAR to navigate. Perhaps a combination of the two? :D
