Photo transmission up to 1.5 km over WiFi

Hi,

My student team is currently using a Ubiquiti Rocket M5 with an AMO-5G13 antenna on the ground, linked to a Ubiquiti Bullet M5 HP with a Fat Shark antenna on our fixed-wing UAV. The link is used to send commands to the camera system on an onboard Raspberry Pi 2 and to transfer images from the plane to the ground.

Until now we have been sending roughly 2 MB, 12-megapixel JPEGs every 5 seconds. We have since upgraded the camera and will be sending smaller images more frequently (5 MP at 1-2 frames per second). Due to a limitation of the new camera's API on ARM, we may not be able to convert raw image data to JPEG onboard and may instead have to send uncompressed raw frames to the ground (5 to 15 MB each, depending on bit depth). This will be over distances of up to 1.5 km line of sight.
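
To put numbers on that, here is a quick back-of-the-envelope calculation of the required link rate using the figures above (the 0.6 link-efficiency factor is an assumed rule of thumb for TCP-over-WiFi overhead, not a measured value):

```python
# Required link throughput for the image stream.
# Frame sizes and rates are from the post; the 0.6 efficiency
# factor (TCP + WiFi overhead) is an assumed rule of thumb.

def required_mbps(frame_mb, fps, efficiency=0.6):
    """Application data rate divided by assumed link efficiency."""
    return frame_mb * 8 * fps / efficiency

# Old workflow: ~2 MB JPEG every 5 s
print("old JPEG stream: %.1f Mbps" % required_mbps(2.0, 1 / 5))  # ~5 Mbps

# New workflow, best case: 5 MB raw frames at 1 fps
print("raw, best case:  %.1f Mbps" % required_mbps(5.0, 1))      # ~67 Mbps

# New workflow, worst case: 15 MB raw frames at 2 fps
print("raw, worst case: %.1f Mbps" % required_mbps(15.0, 2))     # ~400 Mbps
```

Even the best raw case is near what the radios deliver in practice, and the worst case exceeds their PHY rates outright, which is why the throughput question below matters to us.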

In a very simple test we set up the link indoors with about 20 feet (roughly 6 m) between the antennas and averaged 5 MB/s (40 Mbps) using scp to transfer a 100 MB file between a laptop and the Raspberry Pi, in either direction. The Pi has a high-speed microSD card. It seems like we should be able to get considerably more. The Bullet's maximum configurable rate is 65 Mbps (so why is 100 Mbps advertised?), and the Rocket's maximum is MCS 15 (130 Mbps).
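
One thing worth ruling out is scp itself: on a Pi 2, SSH encryption and SD-card I/O can cap throughput well below what the radio can carry. A minimal raw-TCP test like the hypothetical script below (or a standard tool such as iperf) would isolate the link from those overheads:

```python
#!/usr/bin/env python3
# Minimal raw-TCP throughput test (hypothetical helper, not part of our
# toolchain). Run "python3 tcptest.py server" on the Pi and
# "python3 tcptest.py client <pi-ip>" on the laptop. Unlike scp there is
# no encryption and no disk I/O, so only the network path is measured.
import socket
import sys
import time

PORT = 5201                 # arbitrary unprivileged port
CHUNK = 64 * 1024           # 64 KiB per send/recv
TOTAL = 100 * 1024 * 1024   # 100 MB, matching the scp test

def server():
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    srv.bind(("", PORT))
    srv.listen(1)
    conn, _ = srv.accept()
    received = 0
    start = time.time()
    while received < TOTAL:
        data = conn.recv(CHUNK)
        if not data:
            break
        received += len(data)
    secs = time.time() - start
    print("received %.0f MB in %.1f s = %.1f Mbps"
          % (received / 1e6, secs, received * 8 / secs / 1e6))
    conn.close()
    srv.close()

def client(host):
    payload = b"\x00" * CHUNK
    conn = socket.create_connection((host, PORT))
    sent = 0
    while sent < TOTAL:
        conn.sendall(payload)
        sent += len(payload)
    conn.close()

if __name__ == "__main__":
    if sys.argv[1] == "server":
        server()
    else:
        client(sys.argv[2])
```

If this reports much more than 40 Mbps, the bottleneck is the Pi's CPU under scp rather than the radios.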

If we only get 40 Mbps at very short range, won't throughput be significantly lower at long range? Or is the short-distance test itself causing issues (for example, receiver saturation with high-gain antennas only 20 feet apart)? And how can we improve performance, either with the current antennas or by buying better antennas or other hardware?
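
For intuition on the range question, here is a rough free-space link-budget sketch. Every radio parameter in it is an assumption for illustration (datasheet-typical transmit power and receive sensitivity, a guessed gain for the Fat Shark antenna), not a measurement:

```python
import math

# Free-space link budget at 1.5 km. ALL parameters below are
# assumptions for illustration, not measured values.

def fspl_db(distance_m, freq_hz):
    """Friis free-space path loss in dB."""
    return 20 * math.log10(distance_m) + 20 * math.log10(freq_hz) - 147.55

tx_dbm   = 25    # assumed Bullet M5 HP transmit power
gain_gnd = 13    # AMO-5G13 gain (13 dBi, per its model number)
gain_air = 5     # guess for the Fat Shark antenna
sens_dbm = -75   # assumed typical receive sensitivity at MCS 7

loss = fspl_db(1500, 5.8e9)               # ~111 dB at 1.5 km
rx = tx_dbm + gain_gnd + gain_air - loss  # ~-68 dBm at the receiver
print("path loss %.1f dB, rx %.1f dBm, margin %.1f dB over MCS 7"
      % (loss, rx, rx - sens_dbm))
```

Under these assumptions there are only a few dB of margin at 1.5 km before fading, misalignment, or interference are considered, so some rate drop at range seems plausible; the radios' reported signal levels and negotiated MCS at distance would give the real answer.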

Thanks.
