Copter-3.3 beta testing

Warning #1: an issue has been found with Tower's Pause button which can cause the vehicle to fly to an old position if the vehicle has not sent a position update to Tower in some time.

Warning #2: Copter-3.3.2 fixes a bug found in Copter-3.3.1's desired climb rate initialisation which could lead to a sudden momentary drop when switching from Stabilize or Acro to AltHold, Loiter or PosHold.

Warning #3: Copter-3.3.2 fixes an issue found in Copter-3.3.1 which could lead to hard landings in RTL or AUTO if the WPNAV_SPEED_DN was set too high (i.e. >400 or 4m/s) and/or the WPNAV_ACCEL_Z was set too low (i.e. <100 or 1m/s/s).

Warning #4: a bug was found in Copter-3.3 which could cause a sudden crash if you abort a Take-off initiated from a ground station.  Video description is here.  The bug is fixed in Copter-3.3.1 so we recommend upgrading.

Note #1: AC3.3-rc8 corrected a long-standing bug in HDOP reporting.  HDOP values will appear about 40% lower than before, but this does not mean the GPS position is actually any better.
Note #2: if upgrading from AC3.2.1 the vehicle's accelerometer calibration needs to be done again.
Note #3: set SERIAL2_PROTOCOL to "3" and reboot the board to enable FrSky telemetry like in previous versions.
Note #4: the wiki will be updated over the next few weeks to explain how to use the new features.

Copter-3.3.1 is available through the mission planner.  The full list of changes vs AC3.2.1 can be seen in the ReleaseNotes; the most recent changes since AC3.3 are below.

Sadly this version (and all future versions) will not run on the APM2.x boards due to CPU speed, flash and RAM restrictions.

Changes from 3.3:

1) Bug fix to prevent potential crash if Follow-Me is used after an aborted takeoff

2) Compiler upgraded to 4.9.3 (code runs slightly faster than with 4.7.2, which was used previously)

Changes from 3.3-rc11:

1) EKF recovers from pre-arm "Compass variance" failure if compasses are consistent

Changes from 3.3-rc10:

1) PreArm "Need 3D Fix" message replaced with detailed reason from EKF

Changes from 3.3-rc9:
1) EKF improvements:
    a) simpler optical flow takeoff check
2) Bug Fixes/Minor enhancements:
    a) fix INS3_USE parameter eeprom location
    b) fix SToRM32 serial protocol driver to work with recent versions
    c) increase motor pwm->thrust conversion (aka MOT_THST_EXPO) to 0.65 (was 0.50)
    d) Firmware version sent to GCS in AUTOPILOT_VERSION message
3) Safety:
    a) pre-arm check of compass variance if arming in Loiter, PosHold, Guided
    b) always check GPS before arming in Loiter (previously could be disabled if ARMING_CHECK=0)
    c) sanity check locations received from GCS for follow-me, do-set-home, do-set-ROI
    d) fix optical flow failsafe (was not always triggering LAND when optical flow failed)
    e) failsafe RTL vs LAND decision based on hardcoded 5m from home check (previously used WPNAV_RADIUS parameter)
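Item 2c above raises MOT_THST_EXPO, the exponent ArduPilot uses to linearise thrust against throttle. As a rough sketch (ignoring voltage scaling and other motor parameters, so not the exact firmware code), the thrust curve and its inverse look like this:

```python
import math

def thrust_for_throttle(throttle, expo=0.65):
    """Normalised thrust (0..1) produced by a normalised throttle (0..1),
    using the documented curve: thrust = (1-expo)*t + expo*t**2."""
    return (1.0 - expo) * throttle + expo * throttle ** 2

def throttle_for_thrust(thrust, expo=0.65):
    """Invert the curve to find the throttle needed for a desired thrust.
    Simplified sketch: the firmware also applies battery-voltage scaling."""
    if expo == 0.0:
        return thrust
    return ((expo - 1.0) +
            math.sqrt((1.0 - expo) ** 2 + 4.0 * expo * thrust)) / (2.0 * expo)

# With expo = 0.65, half thrust needs roughly 65% throttle rather than 50%,
# because motor thrust grows faster than linearly with PWM.
```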

Thanks for your testing!

  • Hi,

    Could you please explain what the actual purpose of connecting a Storm32 to the pixhawk via the serial port is?

    What specific functionality do you use that requires it?

    I only use one channel sent to the storm32 to alter the tilt of the camera; do you use anything more than that?

    I've heard some people somehow use GPS coordinates to track a stationary ground object, but it escapes me how to implement that. Is there a resource anywhere with samples of such integration designs? I am just curious.

    Regards, Paul.

    • Developer

      I think using the serial/MAVLink connection is better than using PWM because ardupilot can then specify the exact angle it wants the gimbal to move to.  With PWM signals, ardupilot can only specify where in the total range of movement it wants the gimbal to be.  The two are equivalent if the SToRM32's parameters match ardupilot's MNT_ parameters, but in any case PWM is less direct.
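To make the PWM limitation concrete, here is a small sketch (the travel ranges and PWM endpoints are made-up illustration values, not actual MNT_ or SToRM32 defaults) of how a PWM value only encodes a fraction of whatever travel range each side assumes, so a range mismatch silently changes the commanded angle:

```python
def angle_to_pwm(angle_deg, min_deg, max_deg, pwm_min=1100, pwm_max=1900):
    """Map a desired angle to a PWM value using the travel range the
    SENDER assumes; the receiver inverts this with its OWN range."""
    frac = (angle_deg - min_deg) / (max_deg - min_deg)
    return pwm_min + frac * (pwm_max - pwm_min)

def pwm_to_angle(pwm, min_deg, max_deg, pwm_min=1100, pwm_max=1900):
    """Decode a PWM value back to an angle using the receiver's range."""
    frac = (pwm - pwm_min) / (pwm_max - pwm_min)
    return min_deg + frac * (max_deg - min_deg)

# Autopilot assumes a -90..0 deg tilt range; gimbal is configured for -45..0.
pwm = angle_to_pwm(-45.0, -90.0, 0.0)    # autopilot asks for -45 deg
actual = pwm_to_angle(pwm, -45.0, 0.0)   # gimbal decodes with its own range
# actual comes out as -22.5 deg, not the -45 the autopilot intended;
# a MAVLink angle command has no such ambiguity.
```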

      • Randy,

        I feel like I am missing something in "ardupilot can then specify the exact angle that it wants the gimbal to move to".

        Where is this angle coming from? The only application I can think of is pointing the gimbal at some spot on the ground and then, say, flying a circle around that spot with the gimbal tracking the target. But as I understand it, that is not something that can be easily done: it would require a tracker on the ground feeding the GPS coordinates of the target into ardupilot and, frankly, I am not even sure anybody has prototyped this feature reliably enough to make it worth spending time on.

        There was talk about a "boat feature" where the drone would follow you as you move but, again, as I understand it, that is all still in the prototyping stage with nothing really working.

        But if we are talking about the generic task of adjusting a single tilt angle, what would ardupilot send to the storm32 in this case? Or am I missing something bigger here? I just split the signal from my X8R receiver into the storm32 and the pixhawk and it works fine, but if the storm32 would benefit from getting data from the pixhawk, what other data besides channel 9 with the tilt angle value would it receive?

        Is there a good article anywhere giving a 101 on such integrations?

        • @Paul,

          Easy job, basics of trigonometry.

          If you fly over an object on the ground with camera tracking via the gimbal turned on, the geolocation of the object (lat, lon, alt) can be easily calculated from the gimbal angles, the vehicle's GPS position and its altitude above the ground (barometric sensor, sonar, 2D/3D maps), supported by timing.

          So your drone can fly around with the camera pointed at the object permanently. This is just how a gimbal really works.
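As a minimal illustration of that trigonometry (flat-ground assumption, hypothetical function name, no handling of tilt near the horizon), the ground target can be estimated like this:

```python
import math

EARTH_RADIUS = 6371000.0  # metres, spherical-earth approximation

def locate_ground_target(lat, lon, alt_agl, heading_deg, tilt_down_deg):
    """Estimate the (lat, lon) of the spot the camera is looking at.
    Sketch only: assumes flat ground, tilt measured down from horizontal,
    and the gimbal pointing below the horizon (tilt_down_deg > 0)."""
    # horizontal distance to where the line of sight meets the ground
    dist = alt_agl / math.tan(math.radians(tilt_down_deg))
    north = dist * math.cos(math.radians(heading_deg))
    east = dist * math.sin(math.radians(heading_deg))
    # convert the metre offsets to latitude/longitude deltas
    dlat = math.degrees(north / EARTH_RADIUS)
    dlon = math.degrees(east / (EARTH_RADIUS * math.cos(math.radians(lat))))
    return lat + dlat, lon + dlon

# 100 m up, camera 45 deg below the horizon, facing due north:
# the target sits 100 m north of the vehicle.
```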

          If you are interested in having this feature implemented, just email me.


          • Isn't that already implemented though as ROI in MissionPlanner?

            I know that if I specify an ROI in an auto mission the quad will face that ROI and adjust the tilt of the gimbal to point the camera at the altitude specified above the ROI in mission planner.

            As for the specific angle, having seen a demo video on YouTube, I was under the impression DO_MOUNT_CONTROL specifies an absolute angle relative to the quad, i.e. -90 is straight down, 0 is straight ahead and 90 is straight up.

            • @MarkM you are exactly right. ROI, or region of interest (an approximated point of interest in 3D space), is selected manually by pointing the gimbal camera at it. The three momentary gimbal angles are saved and used, together with the drone's geolocation, to solve the equation of the straight line of sight; the intersection of that line with the ground plane at the altitude saved on takeoff geolocates the target. Since the region (or point) of interest on the ground is now fixed, for any new 3D position of the drone the equation of a new straight line connecting the two known geolocated points can be solved to calculate a fresh set of gimbal angles, keeping the camera pointed at the ROI.
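A sketch of that geometry in a local NED frame (hypothetical helper names, flat ground at D = 0, gimbal assumed below the horizon): first intersect the line of sight with the ground to fix the ROI, then compute fresh gimbal angles from any new drone position:

```python
import math

def roi_from_angles(drone_ned, yaw_deg, tilt_down_deg):
    """Intersect the camera's line of sight with the ground plane (D = 0,
    positive down in NED) to geolocate the ROI."""
    n, e, d = drone_ned  # metres; d is negative when airborne
    dist = -d / math.tan(math.radians(tilt_down_deg))
    return (n + dist * math.cos(math.radians(yaw_deg)),
            e + dist * math.sin(math.radians(yaw_deg)),
            0.0)

def angles_to_roi(drone_ned, roi_ned):
    """Given the fixed ROI, compute gimbal yaw and down-tilt (degrees)
    from a new drone position."""
    dn = roi_ned[0] - drone_ned[0]
    de = roi_ned[1] - drone_ned[1]
    horiz = math.hypot(dn, de)
    yaw = math.degrees(math.atan2(de, dn))
    tilt_down = math.degrees(math.atan2(-drone_ned[2], horiz))
    return yaw, tilt_down

# Drone 100 m up at the origin, looking north 45 deg below the horizon:
# the ROI lands 100 m north. From a new position the same ROI yields
# the new yaw/tilt to keep the camera on target.
```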

              • @Paul,

                if you mean a personal driver's drone like the one offered by Rinspeed Etos, landing on a moving car, I can implement such technology if the budget is secured.

                I have offered the Rinspeed Company from Switzerland the Pizza Delivery Drone and Mind Controlled Drone technologies, since they have the budget to fund these technologies and bring them to the consumer market.



              • @Paul,

                Video tracking can be easily implemented in your drone, since video object tracking is a standard feature of security and monitoring cameras today.

                I have presented a paper at the Vienna Conference on tracking flying objects with a number of smartphones so that a track can be calculated on the fly (meteor fall).

                Moving-object tracking by the gimbal camera can be implemented as a default feature if there is any interest among drone pilots.

                Just let me know.

              • I see. I know what ROI is, but it was a practically useless feature, as it is extremely rare to need to film around a static target that does not move.
                Still, thanks for pointing this out. I will try it during gimbal setup next time I get to it.

                I know some people worked on the method to track a moving ROI, so to say, but I have no clue where this development is now. If there is a simple solution for that, it would be great to know how to do it - to get drone up and film a car on a track, or a boat on water with automatic landing to this ROI location as well.

