I know this statement will raise the question: who is this guy telling us that current-day autopilots are all wrong? Well, I have been flying RC planes since the age of 10 and have used autopilots from the early ArduPilot of 2009-2010 vintage through the more recent boards from 3DR, DJI, and FeiyuTech, including their cheap clones, over the last 5-6 years. I have been coding since the age of 15 and am now 21 years old.
 
Based on my experience with this wide range of autopilots, I have come to the conclusion that the hardware of the majority of autopilots is adapted from the world of data-based computing: processors made for crunching huge chunks of predefined data and producing an appropriate notification or display. In data-based computing, inputs come from low-response-rate sources like Ethernet/Internet or some sensor network; this data is processed, and the outputs are notifications, a display, or in a few cases some very slow controls. Nothing involves high-speed control of a dynamic object, even on a single axis.
 
Hence the question: are these processors made for controlling a dynamic moving object with freedom across three axes, like a drone?
 
After using all types of available autopilots, I realized that drone control at its core requires the following steps to be done repeatedly, as fast as possible:
1. reading sensor values and conveying them to the controller/processor,
2. filtering these sensor values,
3. pushing the filtered values into a PID loop, and
4. transferring control commands to the actuators for immediate action.

This cycle needs to be repeated over and over, the faster the better. This is what determines the stability of the drone: the shorter the cycle time (the higher the loop rate), the better the stability. So what is needed in the case of drones is a continuous, high-speed input-output, action-reaction control system. I realized that drone control is not so much about data crunching as about the speed of the control cycle.

If the use of drones is to grow, developers have to be given the freedom to code for their applications without compromising this core control cycle. In a drone, developer code that hangs the system results in catastrophic outcomes: crashes or fly-aways, both of which have been regularly reported with current autopilots. Achieving high control-cycle speeds while isolating the flight controls is not possible with the current architecture of sequential processing; hence the future of drones is limited by the architecture of currently available autopilots.

So unless a new thought process emerges, drone use cannot grow exponentially. What is needed is a motherboard that is radically different from anything available today.


I have been working on this for a while now, and my first-hand experience is that the moment I shifted my focus to achieving higher-speed control loops with my self-designed autopilot, the level of stability and performance I was able to get was awesome, even in very high coastal winds on a small 250 mm racer. I achieved this with the most primitive of microcontrollers, the ATmega328 used in the first ArduPilot. Things got even better when I replaced the MPU-6050 IMU with the MPU-9250.

With my custom-made Distributed Parallel Control Computing Bus I have been able to achieve altitude hold with a total drift of less than 1 meter, a very accurate heading hold, and GPS navigation on the 250 mm racer. All I did was add another ATmega328 in parallel with the first one to add features.
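The post does not describe how the two ATmega328s actually talk to each other, so purely as an illustration: a bus like this needs some framing so one MCU can reject corrupted messages from the other. A minimal checksummed frame over UART might look like the following (the start byte, layout, and function names are all my own assumptions):

```c
#include <stdint.h>
#include <stddef.h>

/* Hypothetical framing for a two-MCU bus: start byte, message id,
 * payload length, payload, and an 8-bit checksum over id+len+payload.
 * This is one plausible scheme, not the author's actual protocol. */

#define FRAME_START 0xA5

static uint8_t checksum8(const uint8_t *buf, size_t n) {
    uint8_t sum = 0;
    for (size_t i = 0; i < n; i++) sum += buf[i];
    return (uint8_t)(0xFF - sum);      /* simple complemented sum */
}

/* Encode a frame into out; returns the total frame length (len + 4). */
static size_t frame_encode(uint8_t id, const uint8_t *payload, uint8_t len,
                           uint8_t *out) {
    out[0] = FRAME_START;
    out[1] = id;
    out[2] = len;
    for (uint8_t i = 0; i < len; i++) out[3 + i] = payload[i];
    out[3 + len] = checksum8(&out[1], (size_t)len + 2);
    return (size_t)len + 4;
}

/* Validate a received frame; returns 1 if intact, 0 otherwise. */
static int frame_check(const uint8_t *frame, size_t n) {
    if (n < 4 || frame[0] != FRAME_START) return 0;
    uint8_t len = frame[2];
    if (n != (size_t)len + 4) return 0;
    return checksum8(&frame[1], (size_t)len + 2) == frame[3 + len];
}
```

The key property for the isolation argument is that the flight-control MCU can simply drop any frame that fails `frame_check` — a misbehaving app processor can send garbage, but it cannot corrupt the control loop's state.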

Thanks to this, I am able to completely isolate the core flight-control loop from the app-development code, so the drone is never compromised by faulty app code.

Distributed parallel control computing, I have found from my own experience, is an architecture that really has the potential to create exponential growth in drone applications. I would be interested to know of other ways in which people are trying to address this core, unique control-processing requirement of drones.


Replies

    • Hi LD,

      Another point on which we have significant concurrence.

      Cone Laser scanner is in process with Nvidia TK1.

      Best,

      Gary

    • @Laser 

      Remember your proposal from the companion computer group?

      I guess it would really be appropriate for this type of architecture.


      • Developer

        The problem I see with this architecture is that the bus is a bottleneck (the part labelled "control network" seems to be modelled as a bus). Is there any reason for the actuators to be connected to the sensors or the video controller? It is very inefficient, and performance degrades as more devices are connected and the poor old bus slows down.

        I decided to go with a star network as described here.

        http://diydrones.com/profiles/blogs/simple-distributed-flight-contr...

        My main reason was that I wanted to put more intelligence in the wings themselves: sensors, servos, and RC in the wings rather than the fuselage, without a large number of connections.

        Doing things in parallel does in fact happen on current systems. Your gyro, accelerometer, magnetometer, and baro may all have processors on board. If you can isolate processes that are independent, that is a great thing; in fact, removing unnecessary coupling is a good goal in any system. The downside of doing it in non-reconfigurable hardware is inflexibility.

        The advantage of doing it in software (or reconfigurable hardware) is the freedom to make fundamental changes on the same hardware.

        Personally, I wouldn't bother with an 8-bit processor any more, though. You can get cheaper 32-bit processors with more RAM, more ROM, and better, more numerous peripherals. They will also handle an RTOS :)

  • 100KM

    mauvdrones,

    Some of the original Swiss Pixhawk implementations were on the Gumstix processors that have both an ARM and a fast DSP. The DSP is good for the core control loops.

    Here is a link to the gumstix site.

    https://store.gumstix.com/coms/overo-coms.html

    • @ David James:

      Yes, I agree a DSP is a boon when creating an autopilot, as it serves as a math offload core.

      In fact, in further designs I will be implementing a TMS320F6678 (as an interface core). There are also ways to enhance speed on normal MCUs and MPUs.

      I realized that I was using the tangent quite a bit in my calculations (GPS navigation, fusers, some kinematic calculations), so I decided to divide 360 degrees into 720 parts, giving me a resolution of 0.5 degrees, and stored all the values in program memory as int16_t, except for 90 and 270 degrees, behind a custom function "float myTan(int16_t angl)". The stored values span -32767 to 32767; divide by 1000 and the range is -32.767 to 32.767. It cost a bit of flash, but boy, it was damn fast compared to the native method.

  • Very good!  Keep up the great work!

  • I was ready to jump down your throat for being uninformed, but then I kept reading... a lot of words. But I applaud your efforts! Where's Darius Jack to offer his "Open Technology Park"? :)

    • He's been banned.

    • T3

      Somewhat ironically given the topic at hand Darius Jack was banned for, among other things, failing the Turing test.

  • Well done, my friend. Funny, a few weeks ago I was thinking the same thing while reading a thread about the development of new ArduCopter boards. The current trend of thinking is to increase CPU power and build multicore setups with three or more sets of sensors. The first thing that crossed my mind was: how about cost? It is easy for people in richer countries to buy complex FCs, but the majority of people cannot afford $100+ boards. Secondly, why not make a board that runs the basic loops with a bus and then expand on that with daughter boards that add functionality, like more sensors, or even DSP-style boards for more advanced routines such as the camera recognition found on the DJI Phantom 4? This would allow users to upgrade their FCs with the features they want or can afford, and it would allow third-party development of new features independent of the development of the basic FC. And that is just what you have created. Big thumbs up. If you need some beta testers, let me know.
    @Kabir: How about aliasing noise? Take a look at this: https://www.youtube.com/watch?v=-lmoKal_e4s
    Of course there is a point where more resolution becomes less important, but Joshua makes some good points in this vid.

