

 
For the past few weeks I have been working on a way to integrate better cameras into our mapping platforms. Our platforms are based around the APM: Copter and APM: Plane software and use 3DR's Pixhawk autopilots to control the vehicles. While the Pixhawk provides a simple way to do survey missions, the next step forward is to incorporate better cameras, more intelligence, and vision processing into the systems. This is where the flexibility of these platforms provides a great advantage over other systems.
     
The first step towards improving the system is to increase its processing capabilities and data collection sensors. By attaching a companion computer like the Edison, you provide extra processing without compromising the flight control system. This way if the Edison crashes, the Pixhawk can still bring the vehicle back in one piece. 
     
This post will show the first part of what we have been working on: attaching a better sensor for surveys. Sony has some really cool cameras that have ditched everything (or almost everything, the flash is still there) that a drone does not use, like a big bulky display and buttons. These cameras are meant to connect to a smartphone and use it as the control interface. The connection is made over Wifi, and Sony has released an API to allow developers to create their own apps. In fact, all you really need is a Wifi connection to the camera.
      
Using this API, you can control every aspect and setting the camera allows: shutter speed, aperture, ISO, shooting mode, focus, zoom, and picture quality. You can even preview the images that have been captured in different resolutions and download the full-size images if needed. This opens up much better control of the capture process and even allows for on-board post-processing of the images with vision algorithms!
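To give a feel for the API, here is a minimal sketch of how a remote-shooting call looks. The Camera Remote API is JSON-RPC over HTTP; the endpoint address and port below are the usual QX-series defaults once you join the camera's Wifi hotspot, but your camera may differ, so treat them as assumptions.

```python
import json
import urllib.request

# Typical service endpoint once connected to the camera's Wifi hotspot
# (192.168.122.1:8080 is a common QX-series default; yours may differ).
CAMERA_URL = "http://192.168.122.1:8080/sony/camera"

def build_rpc(method, params=None, rpc_id=1):
    """Build the JSON-RPC payload the Camera Remote API expects."""
    return json.dumps({
        "method": method,
        "params": params or [],
        "id": rpc_id,
        "version": "1.0",
    }).encode("utf-8")

def call_camera(method, params=None):
    """POST one API call to the camera and return the decoded reply."""
    req = urllib.request.Request(
        CAMERA_URL, data=build_rpc(method, params),
        headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req, timeout=5) as resp:
        return json.loads(resp.read().decode("utf-8"))

# With a live camera connection you would call, for example:
#   call_camera("startRecMode")    # enter remote-shooting mode
#   call_camera("actTakePicture")  # trigger the shutter
```

The reply to `actTakePicture` includes a URL for the captured image, which is what makes on-board post-processing possible.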
     
Working with the Dev Team, we have been able to write a module for MAVProxy and make some modifications to the Pixhawk code to allow control of these Sony cameras from the Pixhawk via mission items, and from ground stations sending MAVLink commands to control the camera.
     
This is all still in its early stages, but it is functional enough now to allow for mapping missions using this system. 
[Connection diagram: Pixhawk Telemetry 1 → Intel Edison → (Wifi) → Sony camera]
The connections shown above are pretty straightforward: an Intel Edison is used as the companion computer to translate the MAVLink commands into API calls sent to the Sony camera. The Edison is powered by and connected to the Pixhawk using the Telemetry 1 port, which provides up to 1 amp of power and forwards MAVLink messages to the serial port on the Edison. On the Edison, MAVProxy consumes these messages, and the smart camera module interprets them and sends the appropriate commands to the camera.
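Getting MAVProxy running on the Edison side comes down to pointing it at the serial link from the Pixhawk and loading the camera module. The device node and module name below are assumptions (they match a typical Edison breakout and the module's repository name, but check your own setup):

```shell
# On the Edison: attach MAVProxy to the serial link from the Pixhawk's
# Telemetry 1 port and load the camera module. /dev/ttyMFD1 and the
# module name "smartcamera" may differ on your board/MAVProxy version;
# check with "module load <Tab>" inside MAVProxy.
mavproxy.py --master=/dev/ttyMFD1 --baudrate 57600 --load-module smartcamera
```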
    
The messages used are the COMMAND_LONG MAVLink commands DIGICAM_CONTROL and DIGICAM_CONFIGURE. Once the Edison is set up to connect to the Wifi hotspot from the camera, it will look on that network for a valid camera IP and try to connect to that camera. Once connected, the camera will respond to commands sent from the ground station or from a Pixhawk mission. If you own a camera with a power zoom, you can zoom in or out using the DIGICAM_CONTROL message, and trigger the camera using the same message. Here are some results from a mapping mission using the Sony QX1 camera mounted on a 3DR Aero-M.
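As a rough sketch of what the companion side does with these messages, the handler below maps MAV_CMD_DO_DIGICAM_CONTROL parameters onto abstract camera actions, using the parameter layout from the MAVLink common message set (param3 is the relative zoom step, param5 the shooting command). The real smart camera module is more involved; this only illustrates the translation step.

```python
# Sketch of a companion-side handler for MAV_CMD_DO_DIGICAM_CONTROL.
# Parameter meanings follow the MAVLink common message set; the
# action names ("zoom_in", "trigger", ...) are illustrative only.

def handle_digicam_control(p1, p2, p3, p4, p5, p6, p7):
    """Translate DIGICAM_CONTROL params into abstract camera actions."""
    actions = []
    if p3 > 0:                     # param3: relative zoom step (+ = in)
        actions.append("zoom_in")
    elif p3 < 0:
        actions.append("zoom_out")
    if p5 == 1:                    # param5: shooting command
        actions.append("trigger")  # e.g. the Sony API's actTakePicture
    return actions

# A mission item asking for one photo:
# handle_digicam_control(0, 0, 0, 0, 1, 0, 0) -> ["trigger"]
```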
    
    
I will write a wiki page in the developer documentation explaining the technical setup, but be aware that some changes were required to the Pixhawk code so that it correctly handles the forwarding of the packets; these changes are currently only available in the AC3.3 beta code. The MAVProxy smart camera module is now in master in the MAVProxy GitHub repository, which, by the way, was recently moved into the Dronecode project GitHub account.
    

Comments

  • @Joseph the Edison UART uses 1.8V logic, while the Pixhawk TELEM2 uses 3.3V logic. You will need to use a logic level shifter in-between. I can't find the Edison specs to determine if the Edison IO is 3.3V tolerant, but you potentially could have damaged the pins.

    I'd add a level shifter and then check if the Edison is receiving any data, before trying in Roscopter, such as 

    screen /dev/ttyMFD1 57600

    What breakout board are you using with the Edison?

    Do you have SERIAL2_PROTOCOL = 1 in the Pixhawk?

  • I am new to the community and so if this is the wrong place to post this please forward me to the correct forum.

    I am trying to get an Intel Edison to be a companion computer for the Pixhawk (APM firmware). I am using J18 pin 13, J19 pins 8 and 3 for the serial communication pins (this should be UART1). I have tested the pin connections and have done a loopback test, and I believe it is possible for the Intel Edison to receive MAVLink packets from the Pixhawk. The Intel Edison is connected to TELEM2 and I have set the correct baud rates for both the Intel Edison and the Pixhawk. The Pixhawk is configured for MAVLink as well.

    My software setup is as follows: I have ubilinux on board the Intel Edison with ROS. I am trying to use the roscopter node (https://github.com/UtahDARCLab/roscopter), however I get stuck trying to receive a heartbeat msg from the Pixhawk. I believe I have set up the Intel Edison pins correctly (I have been following this: https://communities.intel.com/message/265412). I have the latest ArduCopter firmware uploaded through Mission Planner on the Pixhawk. I don't know what else to try. I have been working on this for some time now and would really appreciate any help or guidance.

    Thanks,

    JB

  • Is there a way to use this setup with the Solo and Pixhawk 2?

  • Developer
    The code is released as a module within MAVProxy. It can control the zoom, trigger, set modes, exposures, and apertures. Live view is not there because it may interfere with the RC if it's also in 2.4 GHz.

    https://github.com/Dronecode/MAVProxy/tree/master/MAVProxy/modules/...
  • Is there any development on this? I would like to use a Sony camera on a copter, be able to control the zoom and camera, and have FPV.

  • Hi Guys,

    I am also considering switching from S110 to QX1, and using the MONO, with a 3DR AERO. I am curious what camera settings I could use. I assume even after integrating the camera into the plane I could change the settings via wifi while on the ground and the settings should be sufficient even without the CHDK script previously on the Canon.

    Thanks,

    Balint

  • Although this article is about the QX1, it's worth noting that there are several cameras which support the same Remote API including the action cam family.

    https://developer.sony.com/develop/cameras/

    As far as I know there is no way to hard-wire to these cameras, but there are tools for reverse engineering the firmware, so maybe a way will be found. They run Linux internally...

    There's also another github project which supports the full range of 'commands' to these cameras, so you can do all sorts of interesting things.

    https://github.com/Bloodevil/sony_camera_api

    Simon.

  • Hi @OscarAvellanedaCruz,

    > which Frsky RX did you test?
    Frsky D8R-II Plus 2.4ghz (ACCST Telemetry)

    > Was there a drop in signal quality?
    No

  • @Marden Alcantara, which Frsky RX did you test?

    Was there a drop in signal quality?


  • Jaime, very nice work. I am keen to test it myself.
    I would like to know the actual shutter delay using the micro-USB trigger method.

    Aside from this, has anybody tested the Olympus Air? It has veeery impressive specs and a great form factor for aerial surveys, and an API too.
    Problem is that it is difficult to buy.