Here is Revision 2, the latest revision of the Companion Computer System Architecture. Thanks to JB for this one!
After some more discussion about the overall structure, we thought it would be a good idea to incorporate the various parts of the complete system typically used in flying a UAV.
Please note that some of the items are optional or redundant and can be left out, as indicated by a *. Ideally the RF link consists of only one connection, most likely over WiFi; however, other connections are shown for redundancy and compliance purposes.
The system is composed of four main building blocks:
FC: Flight Control - This sub-system includes the RTOS-based autopilot, telemetry radio*, RC receiver* and peripherals
CC: Companion Computer - This sub-system includes a higher-level Linux-based CPU and peripherals
GCS: Ground Control Station - This sub-system is the user interface for UAV control. It typically includes PC/Linux/iOS and Android based platforms that can communicate via telemetry radio, WiFi or LTE/3G/4G
MLG: Multi Link Gateway - This is an optional system used on the ground to provide connectivity with the CC and FC. It can also be used as a local AP, media store, antenna tracker, etc.
The FC is connected as follows:
via RC receiver* to the remote control
via telemetry radio* to the GCS
via UART or Ethernet to the CC
via FC IO to peripherals like ESCs, servos, etc.
The CC is connected as follows:
via WLAN to the GCS and/or MLG
via LTE/3G/4G* modem to the GCS and/or MLG
via UART or Ethernet to the FC (a minimal link sketch follows after these lists)
via CC IO to HD peripherals like USB or CSI cameras, etc.
The GCS is connected as follows:
via telemetry radio* from the FC
via MLG WLAN or MLG AP, or directly from the CC AP
via MLG or direct LTE/3G/4G* through the internet or PtP
control of the tracking unit on the MLG*
various peripherals like joystick, VR goggles, etc.
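As an illustration of the FC-CC link above, here is a minimal sketch of the CC side of that connection using pymavlink. The serial device path, baud rate and stream-request settings are assumptions for illustration; an Ethernet link would use a UDP or TCP connection string instead.

# Minimal sketch of the CC end of the FC<->CC MAVLink link.
# Assumptions: MAVLink over a UART at /dev/ttyAMA0, 921600 baud.
from pymavlink import mavutil

fc = mavutil.mavlink_connection('/dev/ttyAMA0', baud=921600)
fc.wait_heartbeat()  # block until the autopilot announces itself
print("FC online: system %d, component %d" % (fc.target_system, fc.target_component))

# Ask the FC to stream telemetry at 4 Hz (ArduPilot honors this request).
fc.mav.request_data_stream_send(fc.target_system, fc.target_component,
                                mavutil.mavlink.MAV_DATA_STREAM_ALL, 4, 1)

# Read attitude messages as a basic link/health check.
for _ in range(10):
    msg = fc.recv_match(type='ATTITUDE', blocking=True)
    print("roll=%.2f pitch=%.2f yaw=%.2f" % (msg.roll, msg.pitch, msg.yaw))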
Replies
thanks!
The range covers about a room, something like 2-3 meters, but on the vehicle this distance will always be short, so no worries about that.
I managed to get some time today and spent 30 minutes dealing with the Sony API, and then I took a video to demonstrate the latency and the color detection my script is doing:
If you pause the video when the image shows the two stopwatches and subtract one time from the other, you get the latency. It is around 0.4 seconds, and this also includes the time to analyze the image.
Then I made an OpenCV script that just reads and displays the images from the camera, and the latency was 0.36 seconds.
Then I did a similar one with pygame instead of OpenCV; this was faster, at 0.2 seconds of latency.
The video to show what I did is here:
So, then again, we need to improve my color detection algorithms... but the wireless transmission is promising.
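For reference, a minimal sketch of this kind of color-detection loop is below. It assumes an OpenCV-readable video source at device index 0, and the HSV bounds are illustrative assumptions, not the values from the actual script. Note that it only times per-frame processing; end-to-end glass-to-glass latency is what the two-stopwatch method above measures.

import time
import cv2
import numpy as np

cap = cv2.VideoCapture(0)            # assumed local/forwarded video source
lower = np.array([0, 120, 70])       # assumed HSV lower bound for the target color
upper = np.array([10, 255, 255])     # assumed HSV upper bound

while True:
    t0 = time.time()
    ok, frame = cap.read()
    if not ok:
        break
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, lower, upper)    # nonzero where the pixel matches
    if cv2.countNonZero(mask) > 500:         # crude "target present" threshold
        print("target seen; frame processed in %.3f s" % (time.time() - t0))
    cv2.imshow('frame', frame)
    if cv2.waitKey(1) & 0xFF == ord('q'):
        break

cap.release()
cv2.destroyAllWindows()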
Aldo
You've really got something interesting there. I was looking to do some experimentation with the PixyCam and a Raspberry Pi Zero, even thinking of a complete small-size platform using the soon-to-be-released PXFmini from @Victor.
I created a new forum on the companion computer group, it is just next door: balloon_finder. I invite you to join so we can specifically talk about the balloon_finder development and explore the different avenues, like the PixyCam option.
The new architecture diagram at the top looks good!
I think the overall intention of this group is to develop a generic companion computer for the FC.
I see the companion computer as a piece of hardware that, like a phone or PC, can run numerous applications and connect to various hardware.
To move ahead, two things need structure: the hardware and the software. As Steve put it, we need to standardize the architecture. I feel a lot of the discussion is a result of not having a standard structure, but this is mainly because we're in the "brainstorming" phase of the development, where we want to accommodate all the ideas coming in. This is a good thing.
Out of this we need to develop the actual scope of the project. Currently I see it as follows, in order of latency (copy-paste from here):
All these can easily be accommodated on the resulting software image, but how it is all connected needs to be defined.
I'd like to propose that we start with standardizing the CC in the aircraft itself here and deal with the ground CC/Router setup in a separate thread.
-
Steve,
Thanks for joining the group! I look forward to hearing from you here.
1) Agreed - This needs to be included for still imaging purposes on the CC - we have done the same
2) OpenWrt is already on the list; however, MAVProxy is unlikely to run on UBNT hardware or similar because, from what I know, it doesn't support Python, which MAVProxy relies on. Ideally we'd run OpenWrt on the air/ground CC for routing purposes to leverage the functionality already available there. For the ground side we should move this discussion to a new thread - I'm in the process of creating some.
3) Agreed - The joystick is another item for the ground-based system list. It's also possible to have a WiFi Direct joystick which does not need a PC to run and can go up through the WiFi relay. WiFi RC control needs its own development; the CC is only the technology enabler.
I'll get on to starting the individual threads so we can have separate development paths.
Regards
Based on the current architecture, I will push on with the balloon_finder from now on.
The next steps are:
Build a detailed list of requirements for the image
Experiment with multiple concurrent video feeds for both streaming and OpenCV (a threading sketch follows below)
Experiment with the new Raspbian image just released today, which has GPU-accelerated OpenGL support
Document and build the workflow of balloon_strategy.py, which is the main program for the balloon_finder project
I'll keep you updated.
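On the concurrent-feeds point, here is a hypothetical sketch of one way to do it in Python: a single capture thread owns the camera and fans frames out through queues to an analysis consumer and a streaming consumer. The device index, queue sizes and consumer bodies are all assumptions for illustration.

import queue
import threading
import cv2

def reader(cap, consumers):
    # Single reader owns the camera; fan frames out to all consumers.
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        for q in consumers:
            if not q.full():   # drop frames rather than stall the capture
                q.put(frame)

def analyse(q):
    while True:
        frame = q.get()
        hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
        # ... run the balloon color detection on hsv here ...

def stream(q):
    while True:
        frame = q.get()
        # ... hand the frame to the video streaming pipeline here ...

cap = cv2.VideoCapture(0)  # assumed camera device
feeds = [queue.Queue(maxsize=2), queue.Queue(maxsize=2)]
threading.Thread(target=reader, args=(cap, feeds), daemon=True).start()
threading.Thread(target=analyse, args=(feeds[0],), daemon=True).start()
stream(feeds[1])  # run the streaming consumer in the main thread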
Exciting stuff! Glad to see there's an effort to standardize the architecture so we can work to a common goal.
I've got a couple comments:
1) Cameras - Sony's camera API makes a really attractive target for a companion computer interface. Supporting some subset of the overall API would open up easy connection to anything from a $100 action cam on up to full frame SLRs, all with common calls. Because the cameras already interface over wifi, everything is already onboard that's necessary for integration.
2) The AP-Bridge/Antenna Tracker - What do people think about targeting one of the open source router frameworks like OpenWRT? It really fits nicely with what I see you wanting to do and allows the possibility of using something like: https://www.ubnt.com/airmax/nanobeamm/. Carrier class, environmentally hardened, and when you add up all the necessary pieces to cobble something similar together, not that expensive. I've tried turning a Raspberry Pi into a router before, but always come back to my NanoStation. Would the hardware have high enough specs to support something like running MAVProxy?
3) Remote Control - I fly with a USB joystick today. That appears to be a relatively known quantity. However, I can see the advantage of decoupling the joystick from the GCS and having it communicate directly with the AP-Bridge. Is it worth the effort to expose that interface or do people think forcing the joystick to communicate via the GCS is adequate?
I'll certainly be following with interest and will try to participate where I can.
Thanks!
-Steve
Hello Steve
Thanks for your comments; they are very much appreciated.
Here are my replies:
1) The Sony camera is an excellent product that has a particular niche in the smartphone market. The problem with this device in a configuration like the balloon_finder is that it adds latency to the process and emits a third signal in an already crowded RF environment on the drone.
2) OpenWRT - this is a great product. I used to hack D-Link routers with it a long time ago, and looking at the latest releases it is an active and full-featured product. Once again, though, it adds latency to the transmission, and the goal here is to get the minimum time and processing between lens and HUD. By the way, if you are interested in experimenting with Ubiquiti-class products, I recommend that you take a look at Patrick Duffy's blog. He is the king in this matter, and it is all based on the RPi as well. If I ever need to implement this type of equipment I will certainly ask Patrick for guidance; he is always ready to help and share his knowledge.
3) Joysticks - personally I do not use them; I guess it is a question of personal preference. You could connect one to the AP bridge if you build it on an RPi. Otherwise most of the flight goes through keyboard commands or the flight-mode switch on the RC transmitter.
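For anyone who wants to try the joystick-to-bridge idea, here is a hypothetical sketch using pygame to read a USB joystick and pymavlink to forward it as MAVLink MANUAL_CONTROL messages. The connection string, target system and axis mapping are assumptions for illustration, not a tested configuration.

import pygame
from pymavlink import mavutil

pygame.init()
pygame.joystick.init()
stick = pygame.joystick.Joystick(0)  # assumed: first USB joystick
stick.init()

# Assumed: the bridge/autopilot is reachable over UDP at this address.
mav = mavutil.mavlink_connection('udpout:192.168.1.10:14550')

i = 0
while True:
    pygame.event.pump()  # refresh joystick state
    if i % 20 == 0:      # announce ourselves roughly once per second
        mav.mav.heartbeat_send(mavutil.mavlink.MAV_TYPE_GCS,
                               mavutil.mavlink.MAV_AUTOPILOT_INVALID, 0, 0, 0)
    # MANUAL_CONTROL expects x/y/r in -1000..1000 and z (throttle) in 0..1000.
    x = int(stick.get_axis(1) * -1000)      # pitch (assumed axis)
    y = int(stick.get_axis(0) * 1000)       # roll (assumed axis)
    z = int((stick.get_axis(2) + 1) * 500)  # throttle (assumed axis)
    r = int(stick.get_axis(3) * 1000)       # yaw (assumed axis)
    mav.mav.manual_control_send(1, x, y, z, r, 0)  # target system 1 (assumed)
    pygame.time.wait(50)  # ~20 Hz update rate
    i += 1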
Hope you will take part in the balloon_finder group ;-)
The Sony Camera API allows controlling any Sony camera with WiFi using JSON-RPC over HTTP.
I agree that the latency might not be ideal for the balloon finder. But once we have an onboard smartphone, I would suggest using the phone's camera for the balloon finder anyway.
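To give a flavor of what those calls look like, here is a minimal sketch of one Camera Remote API request. It assumes you have joined the camera's WiFi AP and that the service lives at a commonly used default endpoint; in practice the endpoint is discovered via SSDP, so the URL below is an assumption.

import json
import urllib.request

# Assumed endpoint; real deployments discover this via SSDP device discovery.
ENDPOINT = "http://10.0.0.1:10000/sony/camera"

def call_camera(method, params=None):
    # The Camera Remote API uses JSON-RPC style requests over plain HTTP POST.
    payload = json.dumps({
        "method": method,
        "params": params or [],
        "id": 1,
        "version": "1.0",
    }).encode()
    req = urllib.request.Request(ENDPOINT, data=payload,
                                 headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req, timeout=5) as resp:
        return json.loads(resp.read().decode())

# Example: trigger a still capture; the reply includes a postview image URL.
print(call_camera("actTakePicture"))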