For the past few weeks I have been working on a way to integrate better cameras into our mapping platforms. Our platforms are based around the APM: Copter and APM: Plane software and use 3DR's Pixhawk autopilots to control the vehicles. While the Pixhawk provides a simple way to do survey missions, the next step towards advancing this area is to incorporate better cameras, more intelligence, and vision processing into the systems. This is where the flexibility of these platforms provides a great advantage over other systems.
The first step towards improving the system is to increase its processing capabilities and data collection sensors. By attaching a companion computer like the Edison, you add extra processing power without compromising the flight control system. That way, if the Edison crashes, the Pixhawk can still bring the vehicle back in one piece.
This post will show the first part of what we have been working on: attaching a better sensor for surveys. Sony has some really cool cameras that have ditched everything (or almost everything, the flash is still there) that a drone does not use, like a big bulky display and buttons. These cameras are meant to connect to a smartphone and use that as the control interface. The connection is done over Wifi, and Sony has released an API to allow developers to create their own apps. In fact, all you really need is a Wifi connection to the camera.
Using this API, you can control all the aspects and settings the camera allows: shutter speed, aperture, ISO, shooting mode, focus, zoom, picture quality. You can even preview the images that have been captured at different resolutions, and download the full-size images if needed. This opens up much better control of the capture process and even allows for post-processing of the images with vision algorithms on board!
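For reference, the Camera Remote API is plain JSON over HTTP: you POST a small JSON-RPC body to an endpoint the camera exposes on its Wifi network. Here is a minimal sketch in Python. The IP and port are the common defaults for these Sony cameras (normally discovered via SSDP rather than hard-coded), `actTakePicture`, `setShutterSpeed`, and `actZoom` are method names from Sony's API documentation, and the helper function names are my own, not part of any official client:

```python
import json
import urllib.request

# Common default endpoint when joined to the camera's Wifi hotspot
# (in practice the URL is discovered via SSDP; this is an assumption).
CAMERA_URL = "http://192.168.122.1:8080/sony/camera"

def build_request(method, params=None, req_id=1):
    """Build a JSON-RPC payload in the shape the Camera Remote API expects."""
    return {
        "method": method,
        "params": params or [],
        "id": req_id,
        "version": "1.0",
    }

def call_camera(method, params=None):
    """POST a JSON-RPC call to the camera (needs a live camera on the network)."""
    payload = json.dumps(build_request(method, params)).encode("utf-8")
    req = urllib.request.Request(
        CAMERA_URL, data=payload,
        headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req, timeout=5) as resp:
        return json.loads(resp.read().decode("utf-8"))

# Examples (only run these with a camera connected):
# call_camera("actTakePicture")             # trigger the shutter
# call_camera("setShutterSpeed", ["1/800"]) # set exposure settings
# call_camera("actZoom", ["in", "start"])   # power zoom, if the lens has one
```

The nice part is that the same tiny request shape covers everything from triggering to exposure settings to pulling back preview images.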
Working with the Dev Team, we have been able to write a module for MAVProxy and make some modifications to the Pixhawk code to allow control of these Sony cameras from the Pixhawk via mission items, and from ground stations sending MAVLink commands to control the camera.
This is all still in its early stages, but it is functional enough now to allow for mapping missions using this system.
The connections shown above are pretty straightforward: an Intel Edison is used as the companion computer to translate the MAVLink commands into API calls sent to the Sony camera. The Edison is powered by and connected to the Pixhawk using the Telemetry 1 port, which provides up to 1 amp of power and forwards MAVLink messages to the serial port on the Edison. On the Edison, MAVProxy consumes these messages, and the smart camera module interprets them and sends the appropriate commands to the camera.
The messages used are the COMMAND_LONG MAVLink commands DIGICAM_CONTROL and DIGICAM_CONFIGURE. Once the Edison is set up to connect to the Wifi hotspot from the camera, it will look on that network for a valid camera IP and try to connect to that camera. Once connected, the camera will respond to commands sent from the ground station or from a Pixhawk mission. If you own a camera with a power zoom, you can zoom in or out using the DIGICAM_CONTROL message, and trigger the camera using the same message. Here are some results from a mapping mission using the Sony QX1 camera mounted on a 3DR Aero-M.
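To make the mapping concrete, here is a rough sketch of how a COMMAND_LONG carrying DIGICAM_CONTROL could be interpreted on the companion computer side. The command IDs and parameter meanings are the real values from the MAVLink common message set (param 3 is the relative zoom step, param 5 is the shoot command); the dispatcher itself and its action strings are illustrative, not the actual smart camera module code:

```python
# MAVLink command IDs from the common message set.
MAV_CMD_DO_DIGICAM_CONFIGURE = 202
MAV_CMD_DO_DIGICAM_CONTROL = 203

def handle_command_long(command, p1=0, p2=0, p3=0, p4=0, p5=0, p6=0, p7=0):
    """Translate a COMMAND_LONG into camera actions, roughly as a
    camera module would (action names here are illustrative)."""
    actions = []
    if command == MAV_CMD_DO_DIGICAM_CONTROL:
        if p3 > 0:            # param 3: relative zoom step
            actions.append("zoom_in")
        elif p3 < 0:
            actions.append("zoom_out")
        if p5 == 1:           # param 5: shoot command
            actions.append("take_picture")
    elif command == MAV_CMD_DO_DIGICAM_CONFIGURE:
        # Shooting mode, shutter speed, aperture, ISO, etc. arrive in p1..p7.
        actions.append("configure")
    return actions
```

Because these are ordinary mission commands, the same path works whether the trigger comes from a ground station click or a DO_DIGICAM_CONTROL item in an auto mission.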
I will write a wiki page in the developer documentation explaining the technical setup, but be aware some changes were required in the Pixhawk code so that it correctly handles the forwarding of the packets; these changes are currently only available in the AC3.3 beta code. The MAVProxy smart camera module is now in master in the MAVProxy GitHub repository, which by the way was just recently moved into the Dronecode project GitHub account.
Comments
I have the VP Systems Multi + USB for my A5000, and it does allow for control over camera parameters:
http://vp-systems.eu/order_cr.html#CABLE-MULTI-USB
"Controls zoom (mostly 2 speed option), movie recording, shutter release* and camera off/wakeup features of the camera through MULTI interface. USB connection can be used to charge the camera on the fly or for remote control (Shutter speed, aperture, ISO and Movie recording). Can be used both MULTI+USB remote control features."
They also mention a UART control option but I haven't looked into it, just using 3 PWM channels at the moment.
@technicus acityone, the BMCC is a really nice camera for video, but its resolution is limited to 1080p (1920x1080), which equates to 2 MP, so for mapping work that is pretty much useless. Also, the only way to get stills from the BMCC is to set it to shoot in RAW; you will get one picture per frame, and geotagging those images would be very hard as there is no feedback from the camera when each image is taken.
I think each camera has its strengths and weaknesses, the BMCC is great for video and productions, but not for shooting stills or for mapping.
Test with 2.4 GHz control and Wi-Fi:
Wi-Fi at 2.4 GHz with FrSky is OK. No problem.
Wi-Fi at 2.4 GHz with the Turnigy 9X module is bad. Wi-Fi goes down.
@Vince hogg, the orientation of the images might be annoying, but it's not relevant for the image processing; software like Pix4D takes the images as-is and produces the same results. The live video can be obtained through this system and sent out via streaming, and hopefully when the Solo comes out we might be able to inject the signal into the Solo stream. ;)
@Glenn Gregory, I have not seen any issues with the 2.4 GHz control. However, I am only sending short control commands to the camera (take picture, change exposure, etc.). I think Tridge tried taking a picture and downloading it right after, and that did cause some interference. It's on my list of tests.
Regarding USB control, it can be done; there are several solutions out there. However, you can only control zoom (mostly a 2-speed option), movie recording, auto focus* and shutter release, according to one of the manufacturers:
http://vp-systems.eu/camremote.html
About the camera speed, you can shoot at up to 3.5 fps in continuous burst mode, but only for 15 frames. In reality I have been able to shoot at about 1 frame per second continuously. But keep in mind this solution is not for fast shooting, and there is a significant shutter lag between the command sent to the Pixhawk and the actual picture being taken, so for the geotagging we will use the feedback from the camera that tells us when the picture was taken. I also have to add setting of the camera clock to sync it with the GPS clock on the Pixhawk; that will give more accurate geotags.
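Once the camera clock is synced to GPS time, geotagging from the camera's feedback reduces to a nearest-timestamp lookup in the GPS log. A minimal sketch, with the data layout assumed for illustration (this is not the actual geotagging tool):

```python
def nearest_fix(photo_time, gps_log):
    """Return the GPS fix closest in time to a photo.

    gps_log is a list of (timestamp, lat, lon, alt) tuples; photo_time
    must be on the same clock, which is why syncing the camera clock to
    GPS time matters so much for accuracy.
    """
    return min(gps_log, key=lambda fix: abs(fix[0] - photo_time))
```

With fixes logged several times per second, the residual error is dominated by how well the two clocks agree, not by the lookup itself.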
Drone deploy also uses Wifi triggering for the QX1 and for the GoPro as far as I know.
In general, the strength of this system is not just using it as a trigger: it gives you full control over the camera, as well as the possibility of using the images for navigation or for on-board image analysis. For search and rescue, you could be running an algorithm to recognize people and report possible positions to the ground station, like it was done in the Australian Outback Challenge. This solution can also be used to control multiple cameras, and can be extended to other types of cameras, like multispectral or thermal.
@Jaime: Nice job, I'm very happy that you take open source seriously, Jaime.
The QX1 for drone use is let down by a few issues.
--Wifi control is not ideal for most of us.
--There is no live video out (apart from Wifi).
--there is an annoying 'auto up' image correction which flips about when the camera is vertically down.
Hoping the next 'lens' style camera will fix these issues.
@technicus: are you able to take stills with the BMCC?
A QX1 working with a Pixhawk AP, connected via a Multi/USB cable to a CamRemote V2 that is connected to the Pixhawk Aux port. Works perfectly! The plan is to fly a multispectral camera alongside the QX1, so we'll have RGB high-resolution imagery to mosaic alongside multispectral imagery for crop health; the multispectral imagery is a lot lower resolution.
Note the CamRemote V2 is mounted underneath the INERTIC camera mount; you can see the blue reset button.
There's some cool stuff going on right now with mini Linux computers connected to Ardupilot.
I'm experimenting with MAVProxy and GStreamer at the moment.
Also exploring using gamepad controllers for Android