Offboard Control - MAVROS

I am working on external control of an autopilot from an onboard Linux computer running MAVROS. I found the MAVROS topic called "setpoint_velocity," but when I publish to it with the autopilot in guided mode (ESCs unplugged), nothing happens. It should set the velocity setpoints, but looking at the logs, dvelx and dvely do not change to match.
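
For reference, here is a rough sketch of the kind of publisher I mean. The topic name and message type are what I understand the mavros setpoint_velocity plugin to expose, and they have moved around between mavros versions, so treat this as an illustration rather than the exact code I was running:

```cpp
// Sketch: stream velocity setpoints to the FCU via mavros.
#include <ros/ros.h>
#include <geometry_msgs/TwistStamped.h>

int main(int argc, char **argv)
{
  ros::init(argc, argv, "velocity_setpoint_node");
  ros::NodeHandle nh;

  // Topic name assumed from the setpoint_velocity plugin; check `rostopic list`.
  ros::Publisher vel_pub =
      nh.advertise<geometry_msgs::TwistStamped>("/mavros/setpoint_velocity/cmd_vel", 10);

  ros::Rate rate(10.0);  // keep the setpoint stream going continuously

  while (ros::ok())
  {
    geometry_msgs::TwistStamped cmd;
    cmd.header.stamp = ros::Time::now();
    cmd.twist.linear.x = 1.0;  // e.g. 1 m/s forward
    cmd.twist.linear.y = 0.0;
    cmd.twist.linear.z = 0.0;
    vel_pub.publish(cmd);

    ros::spinOnce();
    rate.sleep();
  }
  return 0;
}
```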

I know this is possible using the PX4 stack, but I would prefer to avoid the complexity of switching and stick with the ArduPilot firmware on the Pixhawk.

My questions:

  • Is this offboard control capability implemented in ArduCopter (including setpoint override of position, attitude, and velocity)?
  • If it isn't right now, is it an upcoming feature?
  • If not, is there an equivalent way to do this that is already implemented?
  • Failing all of that, do you have any other suggestions, or should I continue the switch to the PX4 stack?

Hardware: Pixhawk, RPi

Software: ArduCopter 3.2.1, Ubuntu/MAVROS


Replies

  • Hi, what values did you publish to the "setpoint_velocity" topic to control pitch and roll? Could you please share the code snippet if possible? Thanks.

  • Hi Jefferson,

         I am trying to control the drone from a companion computer using MAVROS. I want to know how to set up the launch file so I can use the APM stack. Is it all the same as with the PX4 stack and QGroundControl?

  • Hi Vinh,

    In case you are still trying to use optical flow control:

    Are you trying to use it with a camera on the drone or with an external camera?

    On-drone control can be done using a PX4FLOW, a 3DR camera.

    Unfortunately, this is all I have at this time.

    Offboard control can be done using a setup similar to what Mr. Jaeyoung Lim does in video 1; the link is below:

    Offboard control (Jaeyoung Lim's posts are on 404warehouse)



    Vinh K said:

    Hi David, were you able to send velocity commands to make your drone move autonomously with GPS?

    I am using only optical flow, but I am not sure how to implement it.

  • Yes!  Sorry I never posted when I found more information. I guess when no one else seemed interested I just kinda forgot about it. 

    If it is in guided mode, setpoint velocity commands do work. I honestly don't remember all of the details off the top of my head, but I am pretty sure it requires Copter 3.3, which was probably what caused my problem when I posted this. It's been a while, so I am a little fuzzy, but I think this is how it goes:

    Firmware must be ArduCopter 3.3 or higher
    Set to guided mode
    Send a setpoint_velocity command
    Watch it work!

    I was definitely able to get this to work fairly well, although I never did implement it completely (my job on the project ended at proving that it worked). If you have any questions, let me know; I can't promise anything, but I'll dig up what I can.
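
    If it helps, the switch to GUIDED can also be commanded from the companion computer through the mavros set_mode service. Below is a rough sketch, assuming the mavros_msgs/SetMode interface; the response field names have changed between mavros releases, so double-check it against your version rather than taking it as-is.

```cpp
// Sketch: request GUIDED mode from ArduCopter through the mavros set_mode service.
#include <ros/ros.h>
#include <mavros_msgs/SetMode.h>

int main(int argc, char **argv)
{
  ros::init(argc, argv, "guided_mode_switch");
  ros::NodeHandle nh;

  ros::ServiceClient set_mode_client =
      nh.serviceClient<mavros_msgs::SetMode>("/mavros/set_mode");

  mavros_msgs::SetMode srv;
  srv.request.base_mode = 0;           // 0 = ignore base_mode, use the string below
  srv.request.custom_mode = "GUIDED";  // ArduCopter flight mode name

  // call() only confirms the service round-trip; the response field that reports
  // whether the FCU accepted the request differs by mavros version.
  if (set_mode_client.call(srv))
    ROS_INFO("GUIDED mode request sent");
  else
    ROS_ERROR("Failed to call /mavros/set_mode");

  return 0;
}
```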

    • Hey, thanks for the quick response Jefferson!! I'm working on some vision guidance that will use a companion computer (probably a Jetson TK1). Like you, I prefer the ArduCopter firmware over PX4. The next step is to understand how to get the MAVLink code working in C or C++ rather than running it in Python.

      • Were you able to get it working in C++? I've done the example from dev.px4.io, but I need more information on how exactly to request velocity, acceleration, 3D position, and orientation from the FCU.

        Introduction · PX4 Developer Guide
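
        For the state side, mavros already publishes the FCU's pose, velocity, and IMU data on standard topics, so in C++ it usually comes down to plain roscpp subscribers rather than a separate request. Below is a rough sketch; the topic names are assumptions that vary a bit between mavros releases (e.g. local_position/velocity vs. velocity_local), so confirm them with rostopic list:

```cpp
// Sketch: subscribe to the FCU state that mavros republishes as ROS topics.
#include <ros/ros.h>
#include <geometry_msgs/PoseStamped.h>
#include <geometry_msgs/TwistStamped.h>
#include <sensor_msgs/Imu.h>

// 3D position + orientation (quaternion) in the local frame.
void poseCb(const geometry_msgs::PoseStamped::ConstPtr &msg)
{
  ROS_INFO("pos: [%.2f %.2f %.2f]",
           msg->pose.position.x, msg->pose.position.y, msg->pose.position.z);
}

// Linear (and angular) velocity.
void velCb(const geometry_msgs::TwistStamped::ConstPtr &msg)
{
  ROS_INFO("vel: [%.2f %.2f %.2f]",
           msg->twist.linear.x, msg->twist.linear.y, msg->twist.linear.z);
}

// Raw IMU: linear acceleration, angular rates, orientation.
void imuCb(const sensor_msgs::Imu::ConstPtr &msg)
{
  ROS_INFO("accel: [%.2f %.2f %.2f]",
           msg->linear_acceleration.x, msg->linear_acceleration.y,
           msg->linear_acceleration.z);
}

int main(int argc, char **argv)
{
  ros::init(argc, argv, "fcu_state_listener");
  ros::NodeHandle nh;

  // Topic names assumed; they differ slightly between mavros releases.
  ros::Subscriber pose_sub = nh.subscribe("/mavros/local_position/pose", 10, poseCb);
  ros::Subscriber vel_sub  = nh.subscribe("/mavros/local_position/velocity", 10, velCb);
  ros::Subscriber imu_sub  = nh.subscribe("/mavros/imu/data", 10, imuCb);

  ros::spin();
  return 0;
}
```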
  • Anyone have an answer to this?
