Using the Intel Edison as a Smart Camera Controller

For the past few weeks I have been working on a way to integrate better cameras into our mapping platforms. Our platforms are based around the APM:Copter and APM:Plane software and use 3DR's Pixhawk autopilots to control the vehicles. While the Pixhawk provides a simple way to do survey missions, the next step towards advancing this area is to incorporate better cameras, more intelligence, and vision processing into the systems. This is where the flexibility of these platforms provides a great advantage over other systems. 
The first step towards improving the system is to increase its processing capabilities and data collection sensors. By attaching a companion computer like the Edison, you provide extra processing without compromising the flight control system. This way if the Edison crashes, the Pixhawk can still bring the vehicle back in one piece. 
This post will show the first part of what we have been working on, which is attaching a better sensor for surveys. Sony has some really cool cameras that have ditched everything (or almost everything; the flash is still there) that a drone does not use, like a big bulky display and buttons. These cameras are meant to connect to a smartphone and use that as the control interface to the camera. This connection is done through Wifi, and Sony has released an API to allow developers to create their own apps. And in fact, all you really need is a Wifi connection to the camera. 
Using this API, you can control all the aspects and settings the camera allows: shutter speed, aperture, ISO, shooting mode, focus, zoom, and picture quality. You can even preview the images that have been captured at different resolutions, and download the full-size images if needed. This opens up much better control of the capture process and even allows for post-processing of the images with vision algorithms on board! 
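To give a feel for how simple the camera side is, here is a minimal sketch of calling Sony's Camera Remote API. It is JSON-RPC over plain HTTP; the `CAMERA_URL` below is an assumption (the real endpoint is advertised by the camera's device-description XML, but QX-series cameras on their own hotspot typically expose something like this), and the method names follow Sony's published API.

```python
import json
import urllib.request

# Assumed endpoint: the real URL is discovered from the camera's
# device-description XML; QX-series cameras on their own Wifi hotspot
# typically expose http://10.0.0.1:10000/sony/camera.
CAMERA_URL = "http://10.0.0.1:10000/sony/camera"

def make_rpc(method, params=None, req_id=1):
    """Build one Sony Camera Remote API JSON-RPC request body."""
    return json.dumps({
        "method": method,
        "params": params or [],
        "id": req_id,
        "version": "1.0",
    })

def call_camera(method, params=None):
    """POST one API call to the camera and return the decoded reply."""
    body = make_rpc(method, params).encode()
    req = urllib.request.Request(
        CAMERA_URL, data=body,
        headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req, timeout=5) as resp:
        return json.loads(resp.read())

# A typical session: start remote shooting, set shutter speed, take a shot.
# call_camera("startRecMode")
# call_camera("setShutterSpeed", ["1/1000"])
# call_camera("actTakePicture")
```

The session calls are commented out since they require a live camera on the network; everything else is standard library.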
Working with the dev team, we have been able to write a module for MAVProxy and make some modifications to the Pixhawk code in order to allow control of these Sony cameras from the Pixhawk via mission items, and from ground stations sending MAVLink commands to control the camera.
This is all still in its early stages, but it is functional enough now to allow for mapping missions using this system. 
The connections as shown above are pretty straightforward: an Intel Edison is used as the companion computer to translate the MAVLink commands into API calls sent to the Sony camera. The Edison is powered by and connected to the Pixhawk using the Telemetry 1 port, which provides up to 1 amp of power and forwards MAVLink messages to the serial port on the Edison. On the Edison, MAVProxy consumes these messages, and the smart camera module interprets them and sends the appropriate commands to the camera.
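The translation step on the Edison can be sketched as a small dispatch function. This is not the actual MAVProxy smart camera module, just a minimal illustration of turning a DIGICAM_CONTROL `COMMAND_LONG` into camera actions; the parameter meanings (param3 = relative zoom step, param5 = shoot flag) follow the MAVLink common message set, and the `actions` callbacks are placeholders for the real API calls.

```python
# MAVLink command id for DO_DIGICAM_CONTROL (per the common message set).
MAV_CMD_DO_DIGICAM_CONTROL = 203

def handle_command_long(command, param3, param5, actions):
    """Translate one COMMAND_LONG into camera actions.

    `actions` is a dict of callables: 'zoom' (takes a signed step) and
    'shoot' (no arguments). Returns True if the command was consumed.
    """
    if command != MAV_CMD_DO_DIGICAM_CONTROL:
        return False
    if param3:          # relative zoom step: positive = in, negative = out
        actions["zoom"](param3)
    if param5 == 1:     # shoot flag: release the shutter
        actions["shoot"]()
    return True
```

In the real module these callbacks would wrap the Sony API calls, and the `COMMAND_LONG` fields would come from pymavlink's parsed messages rather than bare arguments.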
The messages used are the COMMAND_LONG MAVLink commands for DIGICAM_CONTROL and DIGICAM_CONFIGURE. Once the Edison is set up to connect to the camera's Wifi hotspot, it will look on that network for a valid camera IP and try to connect to that camera. Once connected, the camera will respond to commands sent from the ground station or from a Pixhawk mission. If you own a camera with a power zoom, you can zoom in or out using the DIGICAM_CONTROL message, and trigger the camera using the same message. Here are some results from a mapping mission using the Sony QX1 camera mounted on a 3DR Aero-M. 
I will write a wiki page in the developer documentation explaining the technical setup, but be aware that some changes were required in the Pixhawk code so that it correctly handles the forwarding of the packets; these changes are currently only available in the AC3.3 beta code. The MAVProxy smart camera module is now in master in the MAVProxy GitHub repository, which, by the way, was just recently moved into the Dronecode project GitHub account.


Comment by Timothy on May 26, 2015 at 7:56pm

I think the latency of camera control may prevent professional photographers from reaching the sweet spot in serious work. 

Comment by iskess on May 26, 2015 at 9:44pm
Has anyone tried to hard wire the QX1? We modded our NEX7 and it wasn't too difficult, but some brave soul made an online tutorial. We need to trigger at sub-second intervals so I can't rely on IR or wifi.

Comment by Hein du Plessis on May 26, 2015 at 10:03pm

Nice work, Jaime, thanks for sharing. I still rely on intervalometers.

Comment by Glenn Gregory on May 26, 2015 at 10:14pm

Thanks for the write-up Jamie.

I have also been using the Edison to control a QX1 via wifi. The next step was integrating it with the Pixhawk.

The one thing I worry about is the 2.4 GHz wifi with 2.4 GHz RC. I have not seen an issue using FrSky. Have you seen any issues with this? Potentially, when the vehicle is further away and receiving a weak RC signal, the QX1/Edison wifi may overpower and block the weak RC signal?

@iskess I have been investigating using USB to control the QX1. You can currently use available triggers such as Gentles or Mono for basic shutter control. This can be interfaced to the Edison for the sub-second triggering. But what I would like to do is figure out the MULTI protocol to allow full control over USB. Has anyone discovered this yet?

Comment by Dimitri S on May 26, 2015 at 10:57pm

Hey Glenn,

I'm actually working on a gimbal using two QX1s, I don't have time right now (school work, having fun, PCB stuffs to do) but I'm going to take a logic analyzer to it in about two weeks once school gets out.

Comment by Dimitri S on May 26, 2015 at 10:58pm

I forgot to say: tie the green wire to ground (which is the shield) to trigger just a photo. I think it was orange for AF, but I forget.

Comment by iskess on May 26, 2015 at 11:35pm
Thanks Glenn. I'm triggering an a5100 through the multiport, so I suppose the same cable would work on the QX1? The QX1 body is 218g and the a5100 is 283g, so I could save 65g but drop from 24MP to 20MP. I'm not going to mess with wifi for triggering. The Pixhawk does a good job triggering by distance at sub-second intervals, so for my mapping needs I don't see a need for the Edison. Very innovative and exciting work though! It seems most useful for copters where you want to adjust zoom and frame the right shot.
Comment by iskess on May 26, 2015 at 11:46pm
Dimitri just posted the required USB triggering pins. My last question is about speed for the QX1: does anyone know what the fastest continuous single-shot time is?
Comment by Glenn Gregory on May 27, 2015 at 12:13am

@Dimitri Yes post your results if you find anything. I was going to do the same but am away for a few months so cannot do until then. I think it is a mix of LANC and something else.

@iskess Yes, the same cable 'should' work. It depends what you want to do with the QX1 whether an Edison is worth it. Instant geotagged images :)

Yes depends which cable you have for pinout to direct trigger. See these for reference.

Using cable from RM-VPR1 remote (green for trigger):

I just got a FOTGA RM-VS1 remote to use the cable, but the pinout is different.

Comment by Serge Contal on May 27, 2015 at 2:02am

This is so cool !




