Imaging and mapping recommendations for fixed-wing UAV

I am part of an engineering team that uses a 2-metre UAV for aerial surveying. Our current imaging platform is a Canon G9 point-and-shoot camera connected to a Raspberry Pi 1. Under Linux, gphoto2 sends trigger commands to the camera over USB. The Pi is also connected to the autopilot (formerly an ArduPilot Mega board, now a Pixhawk) to retrieve the telemetry used to tag the images, and the images are sent to the ground over a WiFi link separate from the autopilot link.
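Roughly, the capture-and-tag loop looks like the sketch below. This is a simplified illustration rather than our actual code: it uses the gphoto2 command line and pymavlink, and the serial port, baud rate, and file naming are placeholders.

```python
#!/usr/bin/env python3
"""Simplified sketch of the trigger-and-tag loop (illustrative only)."""
import subprocess
import time

from pymavlink import mavutil

# Placeholder serial link to the autopilot.
mav = mavutil.mavlink_connection('/dev/ttyAMA0', baud=57600)
mav.wait_heartbeat()

frame = 0
while True:
    # Grab the most recent position report just before triggering.
    pos = mav.recv_match(type='GLOBAL_POSITION_INT', blocking=True, timeout=1)

    filename = 'img_%05d.jpg' % frame
    # Trigger the G9 over USB and pull the JPEG back to the Pi.
    subprocess.run(['gphoto2', '--capture-image-and-download',
                    '--filename', filename], check=True)

    if pos is not None:
        # Write the telemetry used to tag this image.
        with open(filename + '.txt', 'w') as f:
            f.write('%f,%f,%f\n' % (pos.lat / 1e7, pos.lon / 1e7, pos.alt / 1000.0))

    frame += 1
    time.sleep(1)  # pacing is illustrative; in practice the USB capture dominates
```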

We have several problems with this setup:

1. gphoto2 is too slow at capturing. In continuous shooting mode we get an image at best every 4 seconds (0.25 fps), while the camera on its own can sustain 0.8 fps in burst mode for long stretches when disconnected. Flying at 20 m/s at an altitude of 100 m, we need less than 3 seconds between images to get a suitable 30% overlap with this lens (see the interval calculation after this list). I put CHDK on the camera but had trouble controlling it over USB; if CHDK is the recommended way to trigger, I will try again. Are there other triggering methods you would recommend? We could change cameras, but lighter is better: a DSLR takes us to the very edge of our flight envelope, and industrial cameras are a good option but very expensive.

2. Our control software is outdated and does not synchronize telemetry and photo capture accurately. I am writing a new program to take advantage of our new Raspberry Pi 2 and provide proper synchronization. Do you have any recommendations for this? We have looked at using the G9's hot shoe to detect the exact moment of capture (a sketch of that approach follows this list). We have also tried Mission Planner geotagging with a time offset from the log, but we would much rather get geotagged images down from the UAV in flight than wait until we land.

3. The image compositing software I wrote needs some work. It currently georeferences each image from GPS, altitude, and attitude, warps it to correct distortion, and then the images are simply overlaid in QGIS by GPS position; no stitching or edge detection is done (a rough sketch of this direct-georeferencing step also follows the list). Obviously this could be improved. We are investigating Pix4D. What techniques or software would you recommend? Continuous processing (adding images to the composite automatically as they come down from the UAV) would be a huge bonus; if it has to be a single batch instead, processing time matters to us.

4. Our GPS might not be good enough for this application. We are using the 3DR uBlox GPS module, which updates at 5 Hz and is accurate to about 2.5 m. Should we replace it with something better? Our goal is to generate georeferenced, possibly orthorectified, composites with better than 0.25 m pixel resolution (a quick resolution-versus-accuracy comparison is at the end of the list). In ground testing we see +/- 1 m GPS accuracy more than half the time.
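For problem 1, the 3-second figure follows from the along-track ground footprint and the required forward overlap. A back-of-the-envelope version is below; the 46° field of view is an assumed value chosen to match the numbers above, not a measured one.

```python
import math

speed = 20.0      # ground speed, m/s
altitude = 100.0  # height above ground, m
overlap = 0.30    # required forward overlap
fov_deg = 46.0    # assumed along-track field of view, degrees

footprint = 2 * altitude * math.tan(math.radians(fov_deg / 2))  # ~85 m on the ground
spacing = footprint * (1 - overlap)                             # ~59 m between frames
interval = spacing / speed                                      # ~3.0 s

print('max interval %.1f s, i.e. at least %.2f fps' % (interval, 1 / interval))
```

That minimum of roughly 0.34 fps is why the 0.25 fps we get through gphoto2 falls short, while the camera's standalone 0.8 fps would be comfortable.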
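For problem 2, the hot-shoe idea would look roughly like the sketch below: timestamp the flash-sync pulse with a GPIO interrupt on the Pi and match it to the nearest buffered telemetry sample. The pin number and buffer sizes are placeholders, and the hot-shoe contact would of course need proper level shifting/isolation before reaching a GPIO pin.

```python
import time
from collections import deque

import RPi.GPIO as GPIO  # assumes the hot shoe is safely wired to a Pi GPIO pin

HOTSHOE_PIN = 17                 # placeholder BCM pin number
telemetry = deque(maxlen=500)    # (timestamp, lat, lon, alt, roll, pitch, yaw)
capture_times = deque()

def on_hotshoe(channel):
    # Record the moment the shutter actually fired.
    capture_times.append(time.monotonic())

def nearest_telemetry(t):
    # Return the buffered telemetry sample closest in time to t.
    return min(telemetry, key=lambda sample: abs(sample[0] - t), default=None)

GPIO.setmode(GPIO.BCM)
GPIO.setup(HOTSHOE_PIN, GPIO.IN, pull_up_down=GPIO.PUD_UP)
GPIO.add_event_detect(HOTSHOE_PIN, GPIO.FALLING, callback=on_hotshoe, bouncetime=50)

# Elsewhere, a MAVLink reader thread appends samples like
#   telemetry.append((time.monotonic(), lat, lon, alt, roll, pitch, yaw))
# and after each image is downloaded its tag would be
#   tag = nearest_telemetry(capture_times.popleft())
```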
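For problem 3, the direct-georeferencing step we do now boils down to writing each image out with a geotransform so that QGIS (or a GDAL VRT) can overlay the tiles as they arrive. A rough sketch using the GDAL Python bindings is below, simplified to a nadir, north-up, flat-ground case; the field-of-view figure is again an assumption.

```python
import math

from osgeo import gdal, osr  # assumes the GDAL Python bindings are installed

def write_georeferenced(jpeg_path, tif_path, lat, lon, alt_m, fov_deg=46.0):
    """Very rough direct georeferencing: nadir view, north-up, flat ground."""
    src = gdal.Open(jpeg_path)
    width_px, height_px = src.RasterXSize, src.RasterYSize

    # Ground footprint and ground sample distance from altitude and assumed FOV.
    footprint_m = 2 * alt_m * math.tan(math.radians(fov_deg / 2))
    gsd_m = footprint_m / width_px

    # Convert metres to degrees near this latitude (small-area approximation).
    deg_per_m_lat = 1.0 / 111320.0
    deg_per_m_lon = deg_per_m_lat / math.cos(math.radians(lat))
    px_w = gsd_m * deg_per_m_lon
    px_h = gsd_m * deg_per_m_lat

    out = gdal.GetDriverByName('GTiff').CreateCopy(tif_path, src)
    # Place the top-left corner so the image centre sits on the camera position.
    out.SetGeoTransform([lon - px_w * width_px / 2, px_w, 0,
                         lat + px_h * height_px / 2, 0, -px_h])
    srs = osr.SpatialReference()
    srs.ImportFromEPSG(4326)
    out.SetProjection(srs.ExportToWkt())
    out.FlushCache()
```

Pointing gdalbuildvrt at the output directory then gives a single mosaic layer that QGIS can refresh as new tiles land, which gets part of the way to continuous processing without any real stitching.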
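For problem 4, it is worth separating pixel resolution (set by the optics and altitude) from absolute placement error (dominated by the GPS and attitude errors). A quick comparison with assumed numbers:

```python
import math

altitude = 100.0        # m above ground
fov_deg = 46.0          # assumed along-track field of view, degrees
image_width_px = 4000   # approximate G9 image width in pixels

footprint = 2 * altitude * math.tan(math.radians(fov_deg / 2))
gsd = footprint / image_width_px     # ~0.02 m per pixel from the optics alone

gps_error = 2.5                      # stated module accuracy, m
print('GSD ~%.3f m/px, GPS error ~%.1f m (~%d pixels of placement error)'
      % (gsd, gps_error, round(gps_error / gsd)))
```

So the 0.25 m pixel-size goal is already within reach of the optics; what the GPS limits is how accurately each frame, and hence the composite, is placed on the map, which is the part a better receiver would improve.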

Thanks.
