There are already many good basic multicopter controller boards on the market: Naze32, KK2, etc. There's also great firmware, some of which supports multiple boards and PID controllers (e.g. Cleanflight). Most of these boards and firmware fly great, but lack navigation support (alt/GPS hold, RTH, etc.) or implement it poorly.

On the other hand we have Arducopter, a multicopter controller that also offers very feature-rich and stable autopilot capabilities.

Wouldn't it be great if users could pick the basic flight controller of their choice and combine it with the autopilot of their choice? I'd like to do that, and I'm convinced that once such modular designs start appearing, they'll boost the development of both basic controllers and autopilots.

What I'm thinking of is an autopilot board that acts as a PPM-sum filter between the RC receiver and the basic flight controller (e.g. a barebones Naze32 or KK2). It has its own gyro + accelerometer and I/O connectors for PPM-in, PPM-out, and GPS+nav, and that's it. The autopilot need not even know how many motors the flight controller drives; it only has to be told, via some configuration tool, what the functions of the various channels are, and it needs configurable navigation PIDs.
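The pass-through idea can be sketched in a few lines. Everything here is hypothetical (channel layout, threshold, function names) and just illustrates the principle: in manual mode the autopilot forwards the receiver's frame untouched, and in nav mode it substitutes its own roll/pitch commands:

```python
# Hypothetical sketch of the PPM-sum "filter" idea: in manual mode the
# autopilot passes receiver channels straight through to the flight
# controller; in nav mode it overrides the stick channels with its own
# corrections. Channel numbers and pulse widths are assumptions, not
# any real firmware's conventions.

MODE_CHANNEL = 4          # assumed: mode switch on channel 5 (0-indexed)
ROLL, PITCH = 0, 1        # assumed channel layout
NAV_THRESHOLD_US = 1700   # pulse width above which nav mode engages

def filter_frame(rx_channels, nav_roll_us, nav_pitch_us):
    """Take one PPM-sum frame from the receiver and return the frame
    to forward to the flight controller."""
    out = list(rx_channels)
    if rx_channels[MODE_CHANNEL] > NAV_THRESHOLD_US:
        # Nav engaged: replace stick channels with autopilot output,
        # clamped to the normal 1000-2000 us servo pulse range.
        out[ROLL] = max(1000, min(2000, nav_roll_us))
        out[PITCH] = max(1000, min(2000, nav_pitch_us))
    return out
```

The flight controller never needs to know the frame passed through a filter; it just sees valid stick inputs either way.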

If the autopilot/navigation board is designed small enough, it can simply be stacked above the flight controller board and added at a later stage, after the frame has been tuned.


Replies

          • Lots of redundancy is easier said than done; it is not just a case of adding extra components. The devil is in the details.

            Imagine having 2 cars in case of breakdown. If you had a narrow drive and the car in front wouldn't start, you couldn't get the rear one out. If you broke down on a long journey, the fully working car at home would be no use to you at all. And on a cold damp day, the batteries might have died in both cars at the same time.

            So with sensors you must be able to recognise when one has a fault and, if so, which one is faulty. Also, most faults have a cause; they don't just spontaneously occur, so you need to be sure that the same cause can't trigger the same fault in both sensors. In a plane the pilot receives a lot of information to decide which sensors to trust; in the autopilot this underlying data is hidden from us, so we are not empowered to make that same decision where the autopilot can't make it itself.

            So these things are not trivial to deal with.
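This is also why aviation redundancy is usually triple rather than dual: with three sensors, a median vote can mask one bad reading without ever identifying the faulty unit. A toy sketch of the idea (the threshold and interface are illustrative, not from any real flight stack):

```python
def vote(readings, max_spread=5.0):
    """Median-vote three redundant sensor readings.

    With three sensors, the median masks a single wild value without
    having to identify which unit failed. If all three disagree beyond
    max_spread, no majority exists, so we flag a fault rather than
    guess. The threshold is purely illustrative.
    """
    a, b, c = sorted(readings)
    if (c - a) <= max_spread:
        return b, True           # all three agree closely
    # At least one outlier: accept the median only if two readings
    # still agree with each other.
    if (b - a) <= max_spread or (c - b) <= max_spread:
        return b, True
    return None, False           # three-way disagreement: declared fault
```

With only two sensors, a disagreement tells you something is wrong but not which unit to trust, which is exactly the problem described above.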

            • Just in case someone is curious about how Boeing deals with redundancy:

              http://www.citemaster.net/get/3c501aaa-39d5-11e4-9cb6-00163e009cc7/...

            • Sure Ben, you are completely right.

              Even on modern, big jets, serious issues can happen when onboard computers get conflicting data. And the conflicting data can lead even the pilots in the wrong direction.

              For example, in recent years some Airbus models had similar incidents / accidents where there was conflicting airspeed data.

              In my opinion (having some computer programming experience), we are facing new challenges. Engineers have decades of experience with aeronautical materials, learning from many accidents in the past.

              But when it comes to an accident where some software bug played a part, that's something relatively new. It's very hard to anticipate every combination of problems and make the software deal with them in advance.

          • Hi Eduardo, great perspective you are bringing here as a former airline pilot. I agree with you that some basic autopilot functions are missing, like horizontal speed hold and turn rate hold.

            They are actually available in auto missions; you can set them prior to a mission, but it's true you can't adjust them all in flight. I think the main difficulty is the number of channels and knobs available; you'd need many more on a typical controller. (As an example, I am already maxed out at 16 channels and use all the knobs on my radio for things like adjustable yaw rate, pitch and roll expo/rates, gimbal pitch rate, etc.)

            But there is certainly no theoretical limitation if a tablet or computer interface is used, with unlimited knobs, and given that many controls could be multiplexed on a single channel.
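The multiplexing point can be made concrete: one common trick is to split a single channel's pulse-width range into bands, so one knob or switch position selects among several discrete options. A small sketch (the band math is a plain illustration, not any particular radio's scheme):

```python
def demux_band(pulse_us, n_options, lo=1000, hi=2000):
    """Map one PPM channel's pulse width onto n_options discrete
    selections by splitting the 1000-2000 us range into equal bands.
    This is how a single channel can carry a multi-position 'knob'.
    Out-of-range pulses are clamped rather than rejected."""
    pulse_us = max(lo, min(hi, pulse_us))
    band = int((pulse_us - lo) * n_options / (hi - lo + 1))
    return min(band, n_options - 1)
```

Flight mode switches on existing firmware already work this way, so extending the idea to more multiplexed parameters is mostly a ground-station UI problem.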

            With respect to GPS dependency, I think the problem is technological. The MEMS gyros used drift considerably, so dead reckoning is not an option. That's the difference between MEMS gyros on a small drone and the heavy mechanical gyros, in the $100,000s, on airline jets. The complementary filter in DCM fuses accelerometer data with the gyros and helps correct some of their drift (the newer EKF does this too), but still, in a very short time a GPS update is needed to "reset" them. Short of a breakthrough in MEMS gyros or alternatives to GPS like optical flow, I don't know how this problem can be overcome ...
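The gyro/accelerometer fusion described here boils down to a complementary filter: integrate the fast but drifting gyro, and continuously pull the estimate toward the slow but drift-free accelerometer angle. A one-axis sketch (the blend factor is illustrative; real filters tune it to the sensors' noise characteristics):

```python
def complementary_filter(angle, gyro_rate, accel_angle, dt, alpha=0.98):
    """One-axis complementary filter, the idea behind DCM-style drift
    correction: trust the gyro over short timescales and the
    accelerometer over long ones. angle and accel_angle in degrees,
    gyro_rate in deg/s, dt in seconds. alpha=0.98 is illustrative."""
    return alpha * (angle + gyro_rate * dt) + (1.0 - alpha) * accel_angle
```

Run in a loop, the estimate converges to the accelerometer's long-term angle while still responding instantly to gyro motion, which is why the drift is bounded for attitude but not for position: there is no accelerometer-like absolute reference for position without GPS.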

            • Hi John, thank you for your reply.

              I guess the main thing I wanted to bring up is this concept of flying 'manually' through the autopilot (as I know it), which would be the guy who keeps the flight very stable.

              It's a very common thing in airplanes, something between fully manual and full auto flight plan mode.

              I think if one company really wants to hit the mass market, that's very useful, as the lay person / photographer doesn't want to become an ace pilot; he just wants to move the camera without crashing and without shaky footage (and without having to plan missions with waypoints, etc).

              I agree with you about too many dials on the remote control; that reinforces the point that there is still so much to do on the user interface front.

              Current controls are still inherited from R/C hobbyism; maybe the same sticks could behave differently when the autopilot is engaged, for example.

              What I would like to have is something like a Wii game controller, with a button where I could engage the autopilot (or various modes like 'Sport', 'Video', etc).

              I guess user experience experts will find much better solutions in the future.

              I don't much trust my Android phone to fly things, as it messes up even my email and often lags on touch input (not to mention hackers / viruses). I'm probably old school and like to have a dedicated, solid controller in my hands.

              It's a pity the inertial units can't handle dead reckoning. I guess I was expecting too much from them, but some years ago I was really shocked to learn that such small and cheap inertial units even existed.

              And what about Doppler? I think that's what DJI is using on the Inspire, but only for low altitude. I can tell you that we could fly for hours on it and it was very precise, though not very recommended over water.

              Probably you would need more powerful sonars, not suitable for small drones...

              • Hi Eduardo

                On the subject of horizontal speed or turn rate hold, I'm not sure they would be very useful on an RC aircraft. Unless the aircraft is doing long-range FPV, it's unlikely that those sorts of mixed auto/manual modes would be used within the short distance over which one can see an RC aircraft flying at a field. Some aircraft I fly are "invisible" within 20 seconds across the field. In the case of UAV operations these controls mostly already exist, or are of no consequence when flying full auto for still imaging.

                Video for quads etc. is of course different, but there I think the key to user ergonomics is to control them relative to the earth frame rather than aircraft orientation. That way the sticks always make sense... at least for me! Obviously the camera gimbal would respond to orientation (aka heading) instead, which together with tracking should make it fairly simple for a single operator to get good footage. Most commercial operators will only ever do so with a dedicated pilot and camera operator, however, as it's nearly impossible to frame the camera footage and at the same time stay situationally aware of the aircraft and position it correctly. It's a bit like texting on your mobile whilst eating a Big Mac and driving a car... not advisable at all, but some can pull it off!
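Earth-frame ("headless") stick control is essentially a yaw rotation of the stick vector, so "push north" commands northward motion regardless of where the nose points. A sketch, assuming x = north, y = east, and heading measured clockwise from north (axis conventions are assumptions for the example):

```python
import math

def earth_to_body(stick_x, stick_y, heading_deg):
    """Rotate earth-frame stick inputs into the vehicle's body frame
    (x forward, y right). With this mapping, pushing the stick 'north'
    commands northward motion no matter which way the quad's nose
    points, which is the earth-frame control idea described above."""
    h = math.radians(heading_deg)
    body_x = stick_x * math.cos(h) + stick_y * math.sin(h)
    body_y = -stick_x * math.sin(h) + stick_y * math.cos(h)
    return body_x, body_y
```

For example, with the nose pointing east (heading 90°), a northward stick command becomes a pure "move left" command in the body frame, which is the correct direction.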

                We've used Android devices for competitions and flights quite often without any significant faults or failures, beyond those we programmed in ourselves... ahem. ;-) For the reliability and size of the device I'd prefer using an Android device over a Windows PC any day. Android sandboxing is pretty stable, and secure against malicious attacks to boot. You of course need half-decent hardware though; that's where Apple has a bit of an edge, because they're all the same... if not quite as good!

                • Thank you for commenting JB,

                  I'm probably thinking about the autopilot 'hold' modes because to me it seems much easier to fly these things from the inside than from the outside ;-)

                  The hold modes would be there mostly as 'dampers' against oscillations, not necessarily because one wants to keep the heading for long periods of time, for example.

                  Maybe some change in control sensitivity would have the same effect, but it should be possible to disengage it and revert to normal with a click.

                  Regarding video/photo, I would like a drone that works like a giant crane or camera dolly, capable of very smooth movements.

  • I also agree that this is the future. Control on an MCU board with autopilot navigation and additional processing on a separate more powerful system running a full OS.  

    Think a microcontroller-based control board handling stabilization and other low-level control, plus an Intel NUC running the autopilot navigation, localization, path planning and sensor processing.

    You get the reliability of embedded hardware plus the computational ability, expandability and sensor compatibility of a full x86 computer system. Even a Raspberry Pi would be a good start. I also think this will come very soon.
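The split can be made concrete with a trivial serial link: the companion computer sends setpoints at a modest rate, and the MCU, which keeps the hard-real-time loops, ignores anything malformed and falls back to a safe hold if frames stop arriving. The framing below (sync byte, packed floats, XOR checksum) is invented for illustration; a real system would use MAVLink or similar:

```python
import struct

# Toy serial framing for a companion-computer -> MCU setpoint link.
# The layout (sync byte, four little-endian float32s, XOR checksum) is
# made up for illustration, not a real protocol.
SYNC = 0xFE

def pack_setpoint(vx, vy, vz, yaw_rate):
    """Build one 18-byte setpoint frame."""
    payload = struct.pack("<4f", vx, vy, vz, yaw_rate)
    checksum = 0
    for b in payload:
        checksum ^= b
    return bytes([SYNC]) + payload + bytes([checksum])

def unpack_setpoint(frame):
    """Parse a frame on the MCU side; None means 'discard and keep the
    last safe behavior' (bad sync, bad length, or bad checksum)."""
    if len(frame) != 18 or frame[0] != SYNC:
        return None
    payload, checksum = frame[1:17], frame[17]
    c = 0
    for b in payload:
        c ^= b
    if c != checksum:
        return None
    return struct.unpack("<4f", payload)
```

The key design point is that the MCU treats the companion computer as an untrusted peripheral: a crashed or lagging OS degrades the mission, not the flight.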

    This is obviously overkill for the hobby world at the moment. It doesn't take much power to carry out low-level control and navigate towards GPS waypoints, but wait until laser scanners and vision processing get involved in industry and begin to trickle down into high-end hobby applications such as 'professional' drone aerial photography. Look at what Perceptive Labs is doing with using image processing to track targets with a gimbal for aerial photography. I believe I've seen an image of their system, which uses additional computing hardware strapped on top of a 3DR system to have the gimbal visually lock on to targets selected by the user.

    Want to use another camera system? Buy a new USB camera, plug it in, and get the deb package for its drivers / interface. Don't worry about specific wiring or flashing the MCU again.

    For smaller hobby aerial systems, it may be a stretch to power and fly even an RPi in something like a Bixler. Some of the processing-intensive work might be pushed to the ground station.

    Microcontroller + computer with OS seems to be the way things are heading in ground robotics with ROS (the Robot Operating System, a meta-operating system for Linux). This architecture and ROS may yet push their way onto aerial systems (see Ascending Technologies flying Intel i7s with Linux / ROS on Pelican quadrotors) as sensing and computational requirements escalate.

    Disclaimer: I work at the first church of ROS. 

    • Dan Neault has already done exactly what Robbie describes, although I'm not sure if he used ROS.

      http://diydrones.com/group/learning-to-program-the-ardupilot-mega/f...

      Scroll down a bit; there's even a photo. He's since upgraded to a full Windows board and lidar instead of ultrasound.

      Robbie, I think our DroneScan project could do with ROS; we have come to the limits of what we can do with embedded, given our skills. Do you do consulting? We can't manage another steep learning curve and would prefer to hook up with someone with those skills. P.M. me if interested.

