Here is Revision 2 of the Companion Computer System Architecture. Thanks to JB for this one!

After some more discussion of the overall structure, we thought it would be a good idea to incorporate the various parts of the complete system typically used in flying a UAV.

Please note that some of the items are optional or redundant and can be left out, as indicated by a *. Ideally the RF link comprises only one link, most likely over WiFi; however, for redundancy and compliance purposes other connections are shown.

The system is composed of four main building blocks:

  1. FC: Flight Controller - this sub-system includes the RTOS-based autopilot, telemetry radio*, RC receiver* and peripherals

  2. CC: Companion Computer - this sub-system includes a higher-level Linux-based CPU and peripherals

  3. GCS: Ground Control Station - this sub-system is the user interface for UAV control. It typically includes PC/Linux/iOS and Android-based platforms that can communicate via telemetry radio, WiFi or LTE/3G/4G

  4. MLG: Multi Link Gateway - this is an optional system for use on the ground to provide connectivity with the CC and FC. It can also be used as a local AP, media store, antenna tracker etc.

The FC is connected as follows:

  • via RC receiver* to the remote control

  • via Telemetry* radio to the GCS

  • via UART or Ethernet to the CC

  • via FC IO to peripherals like ESCs, servos etc.


The CC is connected as follows:

  • via WLAN to the GCS and/or MLG

  • via LTE/3G/4G* modem to the GCS and/or MLG

  • via UART or Ethernet to the FC

  • via CC IO to HD peripherals, like a USB or CSI camera etc.


The GCS is connected as follows:

  • via telemetry* radio from the FC

  • via MLG WLAN or MLG AP or direct from the CC AP

  • via MLG or direct LTE/3G/4G* through the internet or PtP

  • control of the tracking unit on the MLG*

  • various peripherals like joystick, VR goggles etc
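The FC↔CC UART/Ethernet link described above typically carries MAVLink. As a small stdlib-only illustration of what travels over that wire (a sketch, not tied to any specific autopilot), here's how the total size of a MAVLink v1 frame can be derived from its header: 6 header bytes, then the payload, then a 2-byte CRC.

```python
# Minimal MAVLink v1 framing helper (illustrative sketch, stdlib only).
# A MAVLink v1 frame is: STX(0xFE), LEN, SEQ, SYSID, COMPID, MSGID,
# <LEN payload bytes>, CRC_LO, CRC_HI  -> 6 header + LEN + 2 CRC bytes.

MAVLINK1_STX = 0xFE
HEADER_LEN = 6
CRC_LEN = 2

def frame_length(header: bytes) -> int:
    """Return the total frame size implied by a MAVLink v1 header."""
    if len(header) < 2 or header[0] != MAVLINK1_STX:
        raise ValueError("not a MAVLink v1 header")
    payload_len = header[1]
    return HEADER_LEN + payload_len + CRC_LEN

# Example: a HEARTBEAT (msgid 0) carries a 9-byte payload,
# so the whole frame on the wire is 6 + 9 + 2 = 17 bytes.
hb_header = bytes([0xFE, 9, 0, 1, 1, 0])
print(frame_length(hb_header))  # -> 17
```

This is why a plain byte pipe (UART or TCP/UDP) is enough for the FC↔CC link: the framing is self-describing, so any of the links above can carry it unchanged.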



Replies to This Discussion

Not sure if I'm reading the diagram correctly or not but I think in general we should only have one wifi link between the GCS and the vehicle.  That link should be able to be used by both the FCU or the Companion Computer but in general, the companion computer will be more powerful so it's probably best that it holds the wifi link with the GCS and passes through data to the FCU.
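The pass-through idea can be sketched very simply: the companion computer owns the WiFi link to the GCS and just relays bytes in both directions between the FCU serial port and the ground link. A minimal stdlib-only sketch, with the two links modelled as generic file-like streams (in a real setup one would be a serial port and the other a socket):

```python
# Transparent byte relay between two links (illustrative sketch).
# In a real setup 'src' might be the FCU serial port and 'dst' the
# GCS WiFi socket; here they are any file-like objects with read/write.

def pump(src, dst, chunk=1024):
    """Copy everything available on src to dst; return bytes moved."""
    moved = 0
    while True:
        data = src.read(chunk)
        if not data:
            break
        dst.write(data)
        moved += len(data)
    return moved

import io
fcu_out = io.BytesIO(b"\xfe\x09\x00\x01\x01\x00" + b"\x00" * 11)  # fake frame
gcs_in = io.BytesIO()
print(pump(fcu_out, gcs_in))  # bytes relayed from FCU to GCS
```

A real bridge would run one such pump per direction (e.g. in two threads); the point is that the CC can be completely transparent to the MAVLink traffic it forwards.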

Ideally we should allow the 2.4GHz TX/RX to be replaced by a WiFi-based transmitter like the one Buzz has been working on.

@Patrick, I'm very interested in the companion computer concept because of the potential for greater situational awareness. My view of the architecture you have presented is that it is too "complex" and therefore prone to many of the errors that we see in existing systems.

IMHO, there should be one primary communications backbone based on a redundant protocol that handles the transfer of all control data. This is typically where CAN bus or Modbus could be used - I prefer CAN because it allows for multi-master interaction.
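The multi-master property comes from CAN's non-destructive bitwise arbitration: every node starts transmitting its identifier bit by bit, a dominant bit (0) overrides a recessive bit (1), so the node with the lowest numeric ID wins the bus and nothing is lost in the collision. A toy model of that rule:

```python
# Toy model of CAN bus arbitration (illustration only, standard 11-bit IDs).
# Each node transmits its ID MSB-first; a dominant bit (0) overrides a
# recessive bit (1), so nodes sending 1 back off when they see a 0 on the bus.

def arbitrate(ids):
    """Return the ID that wins non-destructive bitwise arbitration."""
    contenders = list(ids)
    for bit in range(10, -1, -1):          # 11-bit ID, MSB first
        mask = 1 << bit
        dominant = [i for i in contenders if not (i & mask)]
        if dominant:                        # someone sent 0: recessive nodes back off
            contenders = dominant
    return contenders[0]

print(hex(arbitrate([0x123, 0x0A0, 0x7FF])))  # lowest ID wins -> 0xa0
```

This is also why CAN IDs double as priorities: give safety-critical traffic the low IDs and it always wins the bus under load.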

Video and high bandwidth data should be handled directly by the companion computer including the downlink as mentioned by Randy.

Noise sensitive protocols such as I2C, SPI and UARTs should not be permitted. Instead, CAN nodes can be used to link to peripherals that don't already include a CAN interface.

Hello Laser Developer,

It's a pleasure having you take part in the discussion; your knowledge of systems in general, and lasers/FPGAs in particular, is certainly an essential asset in this type of development.

Yes, I tried to put everything in on the first pass. Now I will issue a revision one with a stripped-down design and add @Randy's and your input.

It's nice to be involved Patrick. Please take into account that my experience in these matters is heavily swayed by what I've seen in large industrial systems and may not be directly transferable to the drone arena. That having been said, I am a big fan of using existing, open standards whenever possible so that the upgrade path is designed into the architecture of the system. This can make the starting point a little more difficult to achieve (like not directly connecting cheap sensors with I2C interfaces) but it results in a system that can evolve for many years to come.

Patrick...we're doing it again! Another design project! Like the X1! ;-)

I'm currently mulling over your layout and will try to present a modified version of what we used at the OBChallenge, that also included long range comms and redundancies. I will also try to justify each interaction and try to reduce the overall layout complexity as much as I can. Give me a couple of hours though! ;-)

Chat soon. (If you're not going to bed yet!)


Laser Dev

Good to see you here. Would welcome your input on this project!



I like Buzz's project, and I agree. There's some better hardware on its way that will help too.


Following the comments, I made a revision 1.

Please excuse the quality; this is a copy of a Google drawing that I cannot publish directly within the forum window (does anyone know how to do it?).

Actuators and sensors are now split. Actuators can be controlled by the RC transmitter in manual/bypass mode.

There is a gray sensor window for devices that can run on either system (FCU-FMS).

Just one downlink, and it is 5GHz WiFi for the moment.

The AP bridge / antenna tracker can run in either AP or client mode, and the tracker can be WiFi-USB or WiFi-WiFi or any other configuration. I added a WiFi (hotspot) - LTE link for web access, needed for Google Earth and additional web-based apps, like running Caffe on the NVIDIA Jetson (TK1 or TX1).

The backup link from the GCS to the RC transmitter can be PPM, WiFi, XBee - or the operator's hands :-)

Hopefully we will agree on a common architecture within a revision or 2 and we can really start cooking.

Hello JB,

Actually I am working pretty early... it is 06:36 on this windy winter morning.

If you prefer, you can work directly on the drawing (or a rev2).

I just gave you access to the Google drawing.

Thanks Patrick!

Sorry, I have it the wrong way around - you can have some of the heat from here, it was 42°C today! ;-) I'm sitting in my aircon basement!

I'm trying to condense the diagram into different areas etc. at the moment and will put a copy of it in your gdoc as rev2.

I would like to propose a change in the naming conventions to avoid confusion, as I think FMS/FCU are too similar in meaning and in abbreviation.

  • In the ArduPilot wiki the autopilot (FCU) is referred to as the Flight Controller, or FC. It would be good to keep this, I think.
  • The (FMS) companion computer, or CC for short, is called exactly that in the wiki too. I don't particularly fancy "companion computer" (takes too long to type and say) but I can live with CC for now until we come up with something better.

Is that ok?

Looking closer to what I was thinking, for sure.

A couple more things. I'd say the companion computer generally doesn't need direct access to the sensors because it can get similar data from the FCU via MAVLink. So for example it can request all the accelerometer data at up to 50Hz if it needs it. Because the companion computer should normally be doing "higher-end" processing-type work, I think it'll be OK for it to not receive the really high-rate sensor data that the FCU needs.
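The rate split described here - the FCU samples the IMU fast, the CC only needs a modest rate over MAVLink - is plain decimation. A stdlib-only sketch, assuming hypothetical example rates of 400Hz raw and 50Hz forwarded (the actual figures depend on the autopilot and stream settings):

```python
# Decimating a high-rate sensor stream for the companion computer
# (illustrative; 400Hz and 50Hz are example figures, not fixed values).

def decimate(samples, src_hz=400, dst_hz=50):
    """Keep every (src_hz // dst_hz)-th sample of a raw stream."""
    step = src_hz // dst_hz
    return samples[::step]

raw = list(range(400))          # one second of fake 400Hz accel samples
out = decimate(raw)
print(len(out))                 # 50 samples forwarded per second
```

In practice the FCU does this selection itself when the CC requests a stream at a given rate; the CC never has to see, buffer, or drop the full-rate data.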

So I think we basically split the uppermost box, "Sensors", and connect them all to the FCU except for the camera, which should stay with the companion computer.

That is no problem, I just tried to stick with aviation standards.

Companion Computer sounds just a little bit weird to me, because it is actually doing all the high-level stuff. It is bizarre to call an NVIDIA TX1 the companion of an STM32... but who cares, as long as they make a good team :-p

This may be a big leap of faith but why not run a single on-board control bus for everything?

  • Sensors are either smart enough to drive the bus directly or form part of an array that shares a common bus port
  • Actuators include motor speed controllers that can be directly on the bus or also part of an array
  • There is "no limit" to the number of nodes that can be attached - both master and slave types
  • The flight controller only talks to the bus. No other interfaces are required except maybe for programming
  • The flight controller is actually a "real-time" companion computer
  • The bus can be used to transfer commands and data in any direction, to and from any node
  • The bus becomes the "center" of the design, not the flight controller
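For a concrete taste of what "everything on one bus" looks like at the wire level, here's a sketch that packs a classic CAN data frame into the 16-byte layout Linux SocketCAN uses for `struct can_frame` (32-bit ID, 8-bit DLC, 3 pad bytes, 8 data bytes). The ESC-command example is hypothetical - real node IDs and payloads would come from whatever higher-level protocol the bus adopts:

```python
import struct

# Classic CAN frame in the Linux SocketCAN wire layout (sketch):
# struct can_frame { u32 can_id; u8 dlc; 3 pad bytes; u8 data[8]; }
CAN_FRAME = struct.Struct("=IB3x8s")

def pack_frame(can_id: int, data: bytes) -> bytes:
    if len(data) > 8:
        raise ValueError("classic CAN carries at most 8 data bytes")
    return CAN_FRAME.pack(can_id, len(data), data.ljust(8, b"\x00"))

def unpack_frame(frame: bytes):
    can_id, dlc, data = CAN_FRAME.unpack(frame)
    return can_id, data[:dlc]

# e.g. a hypothetical ESC throttle command on node ID 0x0A0
frame = pack_frame(0x0A0, b"\x03\xe8")        # throttle value 1000
print(len(frame))                              # -> 16 bytes per frame
print(unpack_frame(frame))                     # -> (160, b'\x03\xe8')
```

The 8-byte payload limit is exactly why smart nodes matter in this design: each sensor or actuator speaks in small, self-contained frames rather than streaming raw data.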

Laser Dev

Awesome! I like the idea, and it's a discussion I had when I visited Tridge back in June 2013, where I asked him why we weren't using the CAN bus more on the STM32! CAN already does this for the automotive industry, so why not use it? There they use the engine management computer and the gearbox or ABS computer to take over from one another when one fails.

The only issue I see is that some devices don't yet have CAN and will need an adapter. It requires a few new hardware components, but I'm in complete agreement. The only addition I'd make is a direct radio link between the GCS or RC TX and the Flight Controller as well, for redundancy purposes. A CAN radio might be an option otherwise.

I'm not sure if it's in the scope of companion computer we're discussing here, but definitely the way to go IMHO.




© 2020   Created by Chris Anderson.
