For the past few weeks I have been working on a way to integrate better cameras into our mapping platforms. Our platforms are based on the APM:Copter and APM:Plane software and use 3DR's Pixhawk autopilots to control the vehicles. While the Pixhawk provides a simple way to do survey missions, the next step in advancing this area is to incorporate better cameras, more intelligence, and vision processing into the systems. This is where the flexibility of these platforms provides a great advantage over other systems.
The first step towards improving the system is to increase its processing capabilities and data collection sensors. By attaching a companion computer like the Edison, you add extra processing power without compromising the flight control system. This way, if the Edison crashes, the Pixhawk can still bring the vehicle back in one piece.
This post will show the first part of what we have been working on: attaching a better sensor for surveys. Sony has some really cool cameras that have ditched everything (or almost everything; the flash is still there) that a drone does not use, like a big bulky display and buttons. These cameras are meant to connect to a smartphone and use that as the control interface. The connection is made over WiFi, and Sony has released an API that lets developers create their own apps. In fact, all you really need is a WiFi connection to the camera.
Using this API, you can control every aspect and setting the camera allows: shutter speed, aperture, ISO, shooting mode, focus, zoom, picture quality. You can even preview the images that have been captured at different resolutions, and download the full-size images if needed. This opens up much better control of the capture process and even allows for on-board post-processing of the images with vision algorithms!
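To give a feel for how simple the API is, here is a minimal sketch of a Camera Remote API call in Python. It assumes the usual defaults for the QX series (endpoint `http://192.168.122.1:8080/sony/camera`) once you have joined the camera's WiFi hotspot; check your camera's advertised endpoint, as the address may differ.

```python
import json
from urllib import request

# Assumed default endpoint for the QX series once connected to its hotspot.
CAMERA_URL = "http://192.168.122.1:8080/sony/camera"

def build_rpc(method, params=None, call_id=1):
    """Build the JSON-RPC style payload the Camera Remote API expects."""
    return json.dumps({
        "method": method,
        "params": params or [],
        "id": call_id,
        "version": "1.0",
    }).encode("utf-8")

def call_camera(method, params=None, url=CAMERA_URL):
    """POST one API call to the camera and return the decoded reply."""
    req = request.Request(url, data=build_rpc(method, params),
                          headers={"Content-Type": "application/json"})
    with request.urlopen(req, timeout=5) as resp:
        return json.loads(resp.read())

# Examples (require a connected camera):
# call_camera("actTakePicture")          # trigger the shutter
# call_camera("getAvailableApiList")     # ask what the camera supports
```

Every camera function (zoom, shutter speed, ISO, and so on) is just another method name and parameter list in the same envelope, which is what makes the API so easy to drive from a companion computer.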
Working with our dev team, we have written a module for MAVProxy and made some modifications to the Pixhawk code to allow control of these Sony cameras from the Pixhawk via mission items, and from ground stations sending MAVLink commands to control the camera.
This is all still in its early stages, but it is functional enough now to allow for mapping missions using this system.
The connections shown above are pretty straightforward: an Intel Edison is used as the companion computer to translate MAVLink commands into API calls sent to the Sony camera. The Edison is powered by and connected to the Pixhawk via the Telemetry 1 port, which provides up to 1 A of power and forwards MAVLink messages to the Edison's serial port. On the Edison, MAVProxy consumes these messages, and the smart camera module interprets them and sends the appropriate commands to the camera.
The messages used are the COMMAND_LONG MAVLink commands DIGICAM_CONTROL and DIGICAM_CONFIGURE. Once the Edison is set up to connect to the camera's WiFi hotspot, it will look on that network for a valid camera IP and try to connect to that camera. Once connected, the camera will respond to commands sent from the ground station or from a Pixhawk mission. If you own a camera with a power zoom, you can zoom in or out using the DIGICAM_CONTROL message, and trigger the shutter using the same message. Here are some results from a mapping mission using the Sony QX1 camera mounted on a 3DR Aero-M.
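To make the command flow concrete, here is a rough, hypothetical sketch of the translation step the smart camera module performs: mapping the seven parameters of a DIGICAM_CONTROL (MAV_CMD_DO_DIGICAM_CONTROL) command onto Sony Remote API method names. The parameter layout follows the MAVLink common message set; the actual mapping in the MAVProxy module may differ in its details.

```python
def digicam_control_to_api(session, zoom_pos, zoom_step, focus_lock,
                           shoot, command_id, extra):
    """Return the list of (method, params) Sony Remote API calls implied
    by one DIGICAM_CONTROL command (illustrative mapping, not the exact
    one the MAVProxy module uses)."""
    calls = []
    # param3: relative zoom step; sign selects direction
    if zoom_step > 0:
        calls.append(("actZoom", ["in", "1shot"]))
    elif zoom_step < 0:
        calls.append(("actZoom", ["out", "1shot"]))
    # param4: focus lock maps naturally onto a half-press
    if focus_lock:
        calls.append(("actHalfPressShutter", []))
    # param5: shoot command triggers the shutter
    if shoot:
        calls.append(("actTakePicture", []))
    return calls
```

A ground station or mission item only ever has to emit one MAVLink command; the module fans it out into however many API calls the camera needs.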
I will write a wiki page in the developer documentation explaining the technical setup, but be aware that some changes were required to the Pixhawk code so that it correctly handles the forwarding of the packets; these changes are currently only available in the AC3.3 beta code. The MAVProxy smart camera module is now in master in the MAVProxy GitHub repository, which, by the way, was just recently moved into the Dronecode project GitHub account.
Comments
While playing with the Remote API, I recently discovered this implementation for the ESP8266.
https://github.com/kuczy/Sony-QX-10-WiFi-Remote-Control-ESP8266-Ard...
Might be helpful to others attempting to fly the camera, without putting too much extra weight up there.
Hey Jaime,
You mentioned that you'd be writing a wiki page explaining the setup of the smart camera module to control the QX1. I'm currently trying to download images from the camera to an ODROID-C2, and the smart camera module seems like it would be extremely useful.
Thanks!
Hey all, Jaime Machuca published the connection aspect of this on Instructables:
http://www.instructables.com/id/Intel-Edison-Smart-Camera-Trigger-f...
I have gotten the camera to work using Seagull UAV, but I have a problem turning the camera on before flights.
The QX1 shuts down after 2 minutes of inactivity, meaning I can't just turn it on and launch the plane quickly (not reliably, at least).
With Seagull UAV you can turn on the camera, but then you need 3 different PWMs: idle, toggle on/off, and trigger.
This means you cannot configure the servo output as a camera trigger (camera triggers only allow 2 positions).
And with DO_REPEAT_SERVO commands it is very difficult to plan missions. I couldn't get the grid survey to export a mission using DO_SET_SERVO as the trigger.
Is there any solution here that I am missing?
Any help on this would be much appreciated
Have you tried connecting to multiple QX1s simultaneously using the Edison? I know it is possible with a WPS push-button enabled access point, but I am curious about using the Edison.
Will the Edison work with the Sony HX90V?
Yes, I have checked the baud rate on my Intel Edison multiple times now; I set it in my .bashrc with the line stty -F /dev/ttyMFD1 57600.
Yes, the Intel Edison J19 pin 3 is connected to TELEM2 ground. I will send pictures of my setup so you can have a better idea of what I have going on.
I will try this tomorrow; I believe I can get my hands on an FTDI cable, or at least make one. Thanks again for your help.
https://pixhawk.org/peripherals/onboard_computers/intel_edison suggests you might be OK without a level converter.
https://www.sparkfun.com/products/13038 IS an Edison GPIO breakout board WITH a level converter.
The Edison manual itself says the I2C SNB9024 chip has "Seven general purpose 1.8V I/Os, with two of them supporting up to 3.3V".
It also says that pretty much all the rest (except USB) "use 1.8V signalling", but nowhere could I find a description of its maximum tolerance for allowable over-voltages on the I/Os. :-(
see: http://akizukidenshi.com/download/ds/intel/edison-module_HG_331189-...
Sorry, I forgot you performed a loopback test. Yes, this proves the pin is still OK. Yes, cat /dev/ttyMFD1 will do the same as the screen command; you need to confirm you are using the correct baud rate as per your linked forum post, i.e. stty -F /dev/ttyMFD1 -a.
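For anyone scripting this check rather than typing stty by hand, here is a small sketch of the same port setup done from Python with the standard library (the device path /dev/ttyMFD1 and 57600 baud are this thread's values; adjust for your wiring).

```python
import os
import termios
import tty

def open_serial_raw(path, baud=termios.B57600):
    """Open a serial device in raw mode at the given baud rate; roughly
    the Python equivalent of `stty -F <dev> 57600 raw` (sketch)."""
    fd = os.open(path, os.O_RDWR | os.O_NOCTTY)
    tty.setraw(fd)                       # disable echo and line buffering
    attrs = termios.tcgetattr(fd)
    attrs[4] = baud                      # input speed
    attrs[5] = baud                      # output speed
    termios.tcsetattr(fd, termios.TCSANOW, attrs)
    return fd

# Usage on the Edison (assuming the TELEM port is wired to /dev/ttyMFD1):
# fd = open_serial_raw("/dev/ttyMFD1")
# print(os.read(fd, 64))  # raw MAVLink bytes should appear, not nothing
```

Reading a few bytes this way answers the same question as cat: if nothing arrives within a few seconds of Pixhawk power-up, suspect the wiring or the baud rate rather than the software above it.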
The Pixhawk should start sending MAVLink data on TELEM2 about 5 seconds after power-up, although this data will appear to be garbage. I would first confirm that the Pixhawk is outputting MAVLink data on TELEM2 and that your wiring is correct by connecting to a computer with an FTDI cable, if you have one handy.
Are you connecting Ground on TELEM2 to Edison Ground?
@Glenn
Thanks for the response. So I realized the Intel Edison uses 1.8V logic, but I have seen other people get the Intel Edison and Pixhawk to talk to each other successfully, so I know it's possible without a level shifter. Doesn't my loopback test prove that I haven't damaged any pins yet?
In my loopback test I SSH into the Intel Edison and then use cat /dev/ttyMFD1. This should be the same thing as screen /dev/ttyMFD1 57600, correct? For the loopback test I basically open two terminals via SSH, run echo hello > /dev/ttyMFD1 in one, and cat /dev/ttyMFD1 in the other. The receiving terminal then outputs hello. However, when I just run cat /dev/ttyMFD1 while connected to the Pixhawk's TELEM2, I receive nothing.
I am using the mini breakout board.
And yes, the SERIAL2_PROTOCOL parameter is set to 1.