Hi,

 

Open source autopilots are widely used in RC toys. I tried 3DR's Pixhawk, and it seems quite okay. There are other open source autopilots as well.

Commercial autopilots are typically used in commercial unmanned aircraft, and they are often much more expensive than open source ones. I'm wondering what the typical advantages of commercial autopilots are, compared with the best open source autopilots, for conventional fixed-wing use. Could anyone give a hint on that?

 

Best,

Anna


Replies

    • Not all commercial autopilot companies look at and take Ardupilot code. So how do they develop something of higher quality? Lots of time and hard work from a building full of brilliant people, with backing from large VC firms. It's not easy to build a simple-to-use, flexible, safe, reliable, vehicle-agnostic autopilot, but it is happening. The top-to-bottom process for HW and SW development is strictly controlled and tightly coupled. HW/SW is more thoroughly tested, and every piece of HW goes through an array of calibrations. Probably most importantly, all of this is quantified and tracked to provide proof/evidence to support certification.

      • I'll bet there would be very few in the AP industry who haven't had a look or a fiddle with an APM or Pixhawk...

        • Well, it would be foolish not to use/test other systems out there, but by no means does that mean they are porting code or anything.

  • Anna: are you working on any commercial autopilot development?

  • David, I point-cloud map large tracts of forest using an inexpensive delta wing (FX61) with a $200 CMOS camera (Lumix 10x) pointed normal to the ground, using the APM to trigger "photobursts" at various waypoints. Using the Structure from Motion algorithm developed by University of Washington researchers, I (and you too) can create a mega-pixel point cloud. My point clouds are typically 4x the density of most plane-mounted LIDAR array generated point clouds, and they did not need a team of scientists to produce. There are many programs that use Structure from Motion, including the free Photosynth, but it requires an internet connection to use. Another good program that does not need an internet connection is "PhotoModeler". This new technology basically renders airplane-mounted LiDAR arrays obsolete. When I told US Forest Service officials that I build point-cloud mapping UAVs and use them to measure trees in remote forests, they did not believe me. They said, "You are a nut job. Point cloud mapping requires a team of people and $millions to accomplish on any scale." I then showed them a few of my point clouds on my laptop. They claim I did not make those. They still believe this. They wallow in ignorance.

    • Forest, how do you trigger the Lumix? With a servo?

      • Cala, I trigger the Lumix with the Do_Set_Servo command, and I also use Condition_Delay.

        Here is the process. The Lumix is always in sleep mode, so when the shutter button is pressed for 1 s, the camera turns on. This means a single mechanical switch can control camera on/off, focus, taking a picture, and keeping the shutter pressed down for photo-bursting until a certain waypoint is reached, at which point you release the trigger.

        For instance, as the UAV approaches the target area for point cloud acquisition, I set up a few waypoints beforehand to issue certain MAV commands that prepare the camera for photo-bursting. The trigger is an 11 g servo with a rocker arm that presses against the shutter button. The servo is fixed to the camera with JB Weld and given a little flex point with rubber, to keep the switch from breaking in case the servo is accidentally over-driven by programming a wrong value.

        Here is the basic sequence. The first waypoint is a Do_Set_Servo 1700, followed by Condition_Delay 1, and then Do_Set_Servo 1300. What I am doing here is making the servo arm press against the Lumix shutter button at the correct position to take a picture; but the camera starts in sleep mode, so this action turns the camera on. In your case the values may be totally different, and you have to find the correct ones by trial and error. The servo arm needs to be pushed against the trigger for at least one second, though, so after Do_Set_Servo 1700 I add Condition_Delay 1, which delays the next command by one second. Then I add Do_Set_Servo 1300, which puts the servo arm back in the neutral position. The result is that the camera is turned on, with the lens deployed and ready for action. If you do this too close to the target area, the camera will not have time to start up and you will miss part of the survey.

        The next step is to focus the camera. I add another waypoint about 200 m further along, to give the camera time to turn on, and there I issue Do_Set_Servo 1500, then Condition_Delay 3, then Do_Set_Servo 1300. This simulates a half-press of the button, which is where the Lumix auto-focuses on the target and sets the correct shutter and white balance. I give the Lumix a full 3 s to do this. Now the camera is focused on the target and deployed open. This waypoint is about 100 m from the first triggering waypoint.

        Now I set the trigger waypoint 10 m lower than the focus waypoint, so the UAV glides while photo-bursting. This minimizes shutter vibration. The trigger waypoint is again Do_Set_Servo 1700, and I keep the shutter pressed continuously while the UAV is over the target area. This produces a photo-burst.

        I then go back and forth over the target area and fill in the gaps by flying in from different directions: E-W, N-S, NW-SE, etc.

        Don't forget to release the shutter trigger afterward, though (Do_Set_Servo 1300 in my setup).

        I attached an example from Redwood Experimental Forest, where the rangers gave me permission to point-cloud map a section of that forest (where the tallest trees are, of course).

        I set the sleep mode on the Lumix to make sure the camera retracts before landing. If the lens is deployed during landing, dust and debris will be forced into the lens mechanism and ruin it.

        So this is my basic process. See the attached KMZ file of the UAV flight path used to create a 20-megapixel point cloud.
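        The wake/focus/burst sequence above can be sketched in code. This is a minimal, hypothetical sketch, not Forest's actual mission file: the PWM values (1700/1500/1300) and delays are the ones he quotes and will differ per camera rig, the servo channel is assumed to be 1, and the command IDs come from the MAVLink common message set.

```python
# Sketch of the wake -> focus -> burst servo sequence described above,
# expressed as simple (command id, param, param) mission items.
# PWM values and delays are from the post; the servo channel is assumed.

MAV_CMD_DO_SET_SERVO = 183     # MAVLink common: set a servo output to a PWM value
MAV_CMD_CONDITION_DELAY = 112  # MAVLink common: delay the next DO command

SERVO_CH = 1         # assumed output channel driving the shutter servo
PWM_PRESS = 1700     # arm pushes the shutter button fully
PWM_HALF = 1500      # half-press: autofocus + exposure lock
PWM_NEUTRAL = 1300   # arm clear of the button

def servo(pwm):
    """One DO_SET_SERVO item: (command id, channel, PWM)."""
    return (MAV_CMD_DO_SET_SERVO, SERVO_CH, pwm)

def delay(seconds):
    """One CONDITION_DELAY item: hold the next DO command for N seconds."""
    return (MAV_CMD_CONDITION_DELAY, seconds, 0)

def camera_mission():
    """Wake the sleeping Lumix, focus it, then hold the shutter for a burst."""
    return [
        # Waypoint 1: full 1 s press wakes the camera from sleep mode.
        servo(PWM_PRESS), delay(1), servo(PWM_NEUTRAL),
        # Waypoint 2 (~200 m on): 3 s half-press to focus and meter.
        servo(PWM_HALF), delay(3), servo(PWM_NEUTRAL),
        # Trigger waypoint (10 m lower): hold the shutter over the target...
        servo(PWM_PRESS),
        # ...and release it once the survey lines are done.
        servo(PWM_NEUTRAL),
    ]
```

        In a real mission these items would be attached to waypoints in Mission Planner rather than built in code; the sketch just makes the ordering and timing of the three phases explicit.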

        • Thanks, Forest, I'm going to try it.

    • Good morning,

      Thank you very much for the reply. I'm off looking into Photosynth and reading up on ALS and tree health. This is very interesting stuff!

      -David

  • Anna, the APM math library calculates quaternions and the same 4x4 world-frame matrix transformations as any mil-spec autopilot. APM is just smaller. The failures come from lack of training and lack of quality control: there are no quality-control standards for APM boards coming from overseas. From an engineering standpoint, that could be a deal-breaker.
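    To make the point concrete, here is a minimal sketch of the quaternion body-to-earth rotation at the heart of any attitude library, ArduPilot's AP_Math included. This is generic textbook math in Python, not ArduPilot code; the (w, x, y, z) component order and ZYX Euler convention are assumptions for the example.

```python
import math

def quat_from_euler(roll, pitch, yaw):
    """Build a unit quaternion (w, x, y, z) from ZYX Euler angles in radians."""
    cr, sr = math.cos(roll / 2), math.sin(roll / 2)
    cp, sp = math.cos(pitch / 2), math.sin(pitch / 2)
    cy, sy = math.cos(yaw / 2), math.sin(yaw / 2)
    return (cr * cp * cy + sr * sp * sy,
            sr * cp * cy - cr * sp * sy,
            cr * sp * cy + sr * cp * sy,
            cr * cp * sy - sr * sp * cy)

def rotate(q, v):
    """Rotate body-frame vector v into the earth frame: v' = q v q*."""
    w, x, y, z = q
    vx, vy, vz = v
    # Expanded quaternion sandwich product: t = 2 (q_vec x v), v' = v + w t + q_vec x t
    tx, ty, tz = 2 * (y * vz - z * vy), 2 * (z * vx - x * vz), 2 * (x * vy - y * vx)
    return (vx + w * tx + y * tz - z * ty,
            vy + w * ty + z * tx - x * tz,
            vz + w * tz + x * ty - y * tx)
```

    The same math runs on a $30 board or a mil-spec unit; as the reply says, the differentiator is quality control around it, not the equations.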

