
Hey guys,

I am looking for people interested in joining the team of a drone startup recently approved in a national competition in Brazil.

We need people with knowledge of drone hardware and software development, or of image processing.

We are in the product and service prototyping phase. Knowledge of Portuguese is also required, as the entire mentoring process will be conducted in that language.

If you are interested, contact me by 03/04/21 here or by email.





DJI TELLO becomes smart

This video demonstrates face tracking with a DJI Tello drone. Using the Python djitellopy SDK and the MobileNet-SSD neural network, the drone is turned into a smart drone. TensorFlow Lite and Google's Coral USB accelerator enable real-time inference.
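The control side of such a tracker reduces to mapping the detected face box to velocity commands for djitellopy's `Tello.send_rc_control(left_right, forward_back, up_down, yaw)`. Below is a minimal sketch of that mapping; the gains and area thresholds are illustrative guesses, not values from the video.

```python
def track_step(frame_w, frame_h, box, k_yaw=0.4, k_ud=0.3, target_area=0.08):
    """Map a detected face box (x, y, w, h) in pixels to Tello RC commands.

    Returns (left_right, forward_back, up_down, yaw) percentages for
    djitellopy's Tello.send_rc_control(). Gains and the target area
    fraction are illustrative, not values from the video.
    """
    x, y, w, h = box
    cx, cy = x + w / 2, y + h / 2
    err_x = (cx - frame_w / 2) / (frame_w / 2)  # -1..1, positive: face right of center
    err_y = (cy - frame_h / 2) / (frame_h / 2)  # -1..1, positive: face below center
    area = (w * h) / (frame_w * frame_h)        # fraction of the frame the face fills
    yaw = int(100 * k_yaw * err_x)              # rotate toward the face
    ud = int(-100 * k_ud * err_y)               # climb if the face is high in the frame
    # Move forward if the face is small (far), back up if it is very large (close).
    fb = 20 if area < target_area else (-20 if area > 2 * target_area else 0)
    return 0, fb, ud, yaw
```

In the full pipeline, `box` would come from the MobileNet-SSD detector running on the Coral accelerator, and the returned tuple would be passed to `send_rc_control` every frame.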


Pioneering scientists lead marine mammal research, utilizing drones in the most epic ways.


Whales and dolphins face countless growing threats in the oceans, ranging from pollution and ship strikes to entanglement in fishing gear and climate change. Many species remain critically endangered. Ocean Alliance is tackling the daunting task of acquiring more and better data in order to protect and save these beautiful creatures. But collecting data on marine mammals is no easy task: it requires a permanent, consistent effort, the most resourceful brains, and the best tools. Ocean Alliance therefore developed its Drones for Whale Research (DFWR) program, in which it has been actively utilizing drones in the most creative ways. Apart from using multirotor drones that fly a few meters above the whales, right into the mist of their exhaled blow, to collect biological information (SnotBot), they are also using fixed-wing amphibious drones to capture a broad range of data on the whales and their habitat.

Dr. Iain Kerr, CEO of Ocean Alliance, recalls after using the Aeromapper Talon Amphibious for over eight months:

“Ocean Alliance is a Conservation Science organization, in that we collect data so that we can best advise wildlife managers and policy makers on strategies to help conserve endangered marine species. Alas Oceanography has long been a rich man’s game, prior to our acquisition of an Aeromapper Talon Amphibious it has been a real challenge for us to collect the types of persistent, consistent offshore data on whales that we would like.

Our Aeromapper Talon Amphibious, has met all of the key touchstones that we look for in a new tool, affordable, field friendly, user friendly, scalable and easily modified or updated. I really think that you have made the right choice with regards to hand launch and water landing.  VTOL fixed wing drones (and I have tested a few) just seem to chew up too much battery time taking off and landing.  While our work to date has been primarily flying our Aeromapper Talon Amphibious on Stellwagen banks to find and count humpback whales, we have been approached by two aquaculture groups (principally mussels & oysters) and a field marsh conservation group to demonstrate the capacity of our Aeromapper Talon Amphibious for their use cases.

Ocean Alliance’s Drones For Whale Research program has been receiving international interest, our lead drone initiative SnotBot has been featured on BBC Blue Planet Live and twice on National Geographic, once with Will Smith in the series One Strange Rock.  Also, more than 18 groups worldwide now use protocols we have developed with SnotBot for whale research. We are excited to have added the Aeromapper Talon Amphibious to our drone stable and I am sure will have more exciting news to report on when we get back into the field.

I think that the Aeromapper Talon represents one of the best solutions out there for offshore research, monitoring and mapping. The hand launch water landing option makes the best use of battery and support resources. Those engaged in the new blue economy need the type of solution that Aeromao Inc., is offering, affordable, field friendly, scalable, robust and user friendly!”









AMX UAV Officially Launches Vertic VTOL Drone

After 3 years of development and testing, AMX UAV has officially launched Vertic, a quad-tailsitter VTOL drone designed specifically for aerial mapping missions. Vertic has copter-like (helicopter and multicopter) capability and can take off in confined areas. After reaching a safe altitude, Vertic transitions to fixed-wing mode for more efficient flight, then transitions back to copter mode once the mission is finished and it is directly above the landing position. Vertic can be used for various applications such as agriculture, plantation, urban planning, mining, forestry, and oil & gas.


Vertic can fly fully autonomously, from take-off to landing, with preconfigured mission parameters. Vertic is equipped with failsafe systems for safety, one example being the quadchute: this feature automatically switches Vertic to copter mode if the flight controller detects an anomalous loss of altitude. Users with multicopter skills can easily control the drone manually, since the controls are similar to a multicopter's. This ease of operation minimizes time spent on training.


Vertic uses composite materials to ensure toughness and reliability. With its detachable fuselage-wing design, the 1.4-meter-wingspan drone is very compact and easy to deploy. Vertic is powered by a 14 Wh lithium-polymer battery. The drone can fly up to 35 minutes, or cover about 250 hectares, in a single aerial mapping mission. The Vertic ground control station (GCS) software is supported on both Windows and Android, and connects to the drone at ranges of up to 15 km. Two payload types can be integrated with Vertic: the Sony RX0 and the Parrot Sequoia+. The Sony RX0 is a 15.3-megapixel RGB camera with a Carl Zeiss fixed lens. For agriculture, forestry, and plantation missions, the Parrot Sequoia+ multispectral camera can be used for deeper analysis.


All the advantages that Vertic has to offer come at a very competitive price. The Vertic standard package is offered at $8,500 and includes a ready-to-fly drone, Sony RX0 camera, GCS laptop, 3 batteries, 2 sets of spare propellers, tools, and a hard case. The standard package also includes a training program for 2 operators, held in Yogyakarta, Indonesia. Contact us through email or phone at +62-811292565 (WhatsApp/voice call/message) for more information about Vertic and other AMX UAV products.


The threat of physical intrusion remains one of the top concerns in both commercial and non-commercial contexts. According to a report from MarketsandMarkets, the video surveillance market, which includes both hardware and software, is presently worth USD 45.5 billion and is expected to reach USD 74.6 billion by 2025.

Over the years, there have been many advancements in optics and detection systems but limitations still exist in the conventional ways of using them. To overcome these limitations, security stakeholders are now incorporating drone technology in their operations.

In this blog, we will talk about drones and the FlytNow solution for perimeter security.

What is perimeter security?


Perimeter security is an active barrier or fortification around a defined area to prevent all forms of intrusion. Modern security systems are an amalgamation of sophisticated hardware and software that generally include cameras, motion sensors, electric fencing, high-intensity lights, and a command center to manage them all. 

Challenges with conventional security systems (without drones) for perimeter security

Below are some of the drawbacks and limitations that are inherent in a conventional security system:

  • CCTV cameras and motion detectors are stationary, thus leaving plenty of room for blind spots.
  • Patrolling requires human guards; for larger areas, this is the least efficient way of securing premises.
  • Response to an intrusion is delayed since a human responder has to reach the location.

Benefits of using drones for perimeter security

Drones have the following advantages over a conventional security system:

  • Drones are mobile flying machines that can reach any location quickly, carrying HD cameras, thus eliminating blind spots.
  • Drones can also be equipped with thermal cameras, which are useful for nighttime surveillance.
  • Drones can be automated for patrolling using the FlytNow cloud-connected solution and commercially available DiaB (Drone in a Box) hardware.

Note: A DiaB is box-like hardware that houses one or more drones. The hardware keeps the drone flight-ready (24x7) and also automates the launching and docking processes of a drone.

Drone automation for security

For perimeter security, drones are generally used in conjunction with Drone-in-a-Box hardware and a fleet management system that powers the command center. Other security system hardware, including CCTV cameras, motion sensors, etc. can complement the drones and can be connected to the command center, thus integrating into a complete system. In a real-life scenario, such a system might work in the following way:

Drone Command Center

  • An intrusion is detected by one of the CCTV cameras in an area under surveillance. 
  • The command center receives the alert and initiates a drone launch. 
  • A connected DiaB receives the launch request and releases a drone. 
  • The drone flies to the location where the intrusion was detected and begins streaming a live video feed. 
  • An operator maneuvers the drone to cover all blind spots.
  • On finding the intruder, the operator has the option to warn him/her about the transgression using the drone’s onboard payload such as a beacon, spotlight, speaker, etc.
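The alert-to-launch flow above can be sketched as a small event-driven dispatcher. FlytNow's actual API is not shown in this post, so every class and method name below is a hypothetical placeholder for illustration only.

```python
# Hypothetical sketch of the alert-to-launch flow; not FlytNow's real API.
from dataclasses import dataclass


@dataclass
class Alert:
    camera_id: str
    lat: float
    lon: float


class DroneInABox:
    def __init__(self, box_id):
        self.box_id = box_id
        self.drone_available = True

    def launch(self, lat, lon):
        """Release the housed drone and send it to the alert location."""
        if not self.drone_available:
            return None
        self.drone_available = False
        return {"goto": (lat, lon), "stream": True}


class CommandCenter:
    def __init__(self, boxes):
        self.boxes = boxes

    def on_alert(self, alert):
        # Dispatch the first DiaB that still has a flight-ready drone.
        for box in self.boxes:
            mission = box.launch(alert.lat, alert.lon)
            if mission:
                return mission
        return None
```

Once airborne, the drone streams video back to the dashboard, where an operator can take over manual control to cover blind spots.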

To know about the kind of drones and sensors that can be used for security and surveillance operations please refer to our Drone Surveillance System: The Complete Setup Guide.

How does FlytNow enable perimeter security?

FlytNow is a cloud-based application that helps in managing and controlling a fleet of drones from a unified dashboard through automation, live data streaming and integration. In the context of perimeter security, this translates into a command center that connects drones with the traditional components of a perimeter security system.

6 Reasons to use FlytNow for perimeter security

#1 Easy Setup: FlytNow is cloud-hosted, i.e., a user can access the application from any standard web browser without any complicated server setup. Connecting drones to the system is also easy and is done using FlytOS.

#2 Unified Dashboard: FlytNow features an advanced dashboard that shows the following:

  1. A live map showing the real-time location of all drones. The map can be customized to show points of interest, virtual geofences, and CCTV zones.
  2. On-screen GUI controllers and keyboard & mouse support to control a drone. This allows an operator to easily maneuver a drone to a point of interest from the command center.
  3. Multicam support that allows streaming video feeds from more than one drone.
  4. Different view modes that allow an operator to switch between RGB and thermal modes. In thermal mode, there is the option to switch between different color palettes, allowing a user to identify warm objects against different backdrops.
  5. Pre-flight checklist which is a list of checks the system prompts an operator to perform before initiating a drone launch.

#3 Live Data Sharing: An operator can share the live video feed from a drone directly from the dashboard. The feature can be used to share video with the police or other remote stakeholders.


#4 Advanced Automation: Manually piloting drones is an inefficient way to use them. Instead, automation should be employed for activities like security patrols. FlytNow comes with an advanced mission planner that lets a user define a path for a drone to follow and save it as a mission. The mission can be executed periodically, making a fleet of drones perform automated patrols.
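The "save a path, run it periodically" idea can be sketched as a tiny scheduler. Again, the names here are hypothetical illustrations; FlytNow exposes this capability through its dashboard, not through this code.

```python
# Illustrative sketch of saved patrol missions executed on a fixed period.
from dataclasses import dataclass


@dataclass
class PatrolMission:
    name: str
    waypoints: list            # (lat, lon) path for the drone to follow
    period_s: int              # how often the patrol repeats
    last_run: float = -1e18    # epoch seconds of the previous run


def run_due_missions(missions, now):
    """Return the names of missions to launch at time `now` and mark them as run."""
    launched = []
    for m in missions:
        if now - m.last_run >= m.period_s:
            m.last_run = now
            launched.append(m.name)
    return launched
```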


#5 Add-on Modules: FlytNow provides add-ons to make a drone intelligent; this includes precision landing over a computer-generated tag, obstacle detection, and object identification. These add-ons enable a drone to autonomously fly to a location, identify a threat, and return to the DiaB hardware.

#6 Drone-in-a-Box Hardware Support: The functions of DiaB hardware, in the context of perimeter security, can be broadly classified into four categories:

  1. Securely housing a drone.
  2. Keeping the drone fully charged at all times.
  3. Initiating drone launches.
  4. Docking returning drones.


In this blog, we discussed the concept of perimeter security, the limitations of a conventional security setup, and how those limitations can be overcome using drones. We then covered how drones are actually used for aerial patrols, and 6 reasons why FlytNow is an ideal solution for automating drones for perimeter security.

There are plenty more reasons to use FlytNow for perimeter security, which you can discover by signing up for our 7-day free trial.


I am looking to collaborate with a drone company on a high-altitude flight experiment. We are starting a small company called Loweheiser, developing EFIs (electronic fuel injection systems) for small UAV engines. Right now we are developing a throttle body for the RCGF 15RE, and I also want to design one for the 10cc engine from the same brand. The 10cc would probably be the smallest electronic-fuel-injection engine ever made. At the moment the smallest engines we have worked on are the Saito FG-21 and FG-14; the FG-14 has a 7 mm throttle body!

Many clients ask me how high the EFI can operate. I tell them that the EFI can work at any altitude a fixed-wing UAV can reach. The ECU's absolute pressure sensor measures from 15 to 115 kPa, and 15 kPa corresponds to roughly 13,600 m in the standard atmosphere, so that does not seem to be a problem. We have tested at 2,000 m, because that is the maximum we can reach within a few hours of travel, but I would like to flight-test at higher altitude.
From my initial calculations, a normal airplane could reach 10,000 m without problems. A plane of about 2 m wingspan and about 3 kg weight consumes less than 100 W to maintain cruise.

The 15cc engine produces about 1.54 kW. Engine power falls roughly linearly with atmospheric density. The density of the atmosphere at 25°C at sea level is about 1.1272 kg/m³; at −20°C and 10,000 m it is about 0.4533 kg/m³, which is roughly 40% of sea level. That gives about 600 W at altitude; even allowing another 50% of losses for propeller performance and other inefficiencies, there are still 300 W available, which is more than enough to fly at 10,000 m.
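Plugging the post's own figures into the linear density scaling gives the same ballpark numbers:

```python
# Power budget at 10,000 m using the density values quoted above.
rho_sl = 1.1272       # kg/m^3 at sea level (value from the post)
rho_10k = 0.453337    # kg/m^3 at 10,000 m (value from the post)
p_sl_w = 1540.0       # 15cc engine, ~1.54 kW at sea level

density_ratio = rho_10k / rho_sl          # ~0.40
p_shaft_w = p_sl_w * density_ratio        # ~620 W available at altitude
p_net_w = p_shaft_w * 0.5                 # after ~50% prop and other losses
print(round(density_ratio, 2), round(p_net_w))   # 0.4 310
```

Roughly 300 W remaining against a cruise requirement of under 100 W leaves a healthy margin.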

Where can this high-altitude challenge be done?
I am looking to collaborate with a drone company that has experience flying in airspace where this test can be performed and that can obtain permission from the relevant authorities.

Best regards, jlcortex


I feel like this has been done many times before, including by the Skydio team when they were at MIT, but DARPA was impressed:


From JHU:

Johns Hopkins APL Helps DARPA OFFSET Program Take Flight

​In the video above, the APL team’s fixed-wing unmanned aerial vehicle launches in fierce winds from approximately 100 meters away from its target and successfully completes two passes through the crowded urban area while maintaining three meters of space from the target building.

Credit: Johns Hopkins APL

​In a second test, the team launched from 250 meters out, flying four times faster than the quadcopters, around a bigger building and under an overpass, autonomously and without crashing.

Credit: Johns Hopkins APL

With each second of the video that ticks away, the suspense builds. Joseph Moore launches a fixed-wing unmanned aerial vehicle (UAV) into the air and it’s buffeted by the wind. Undeterred, the UAV goes about its task to navigate around buildings at high speeds in an urban environment.

The wind picks up at points, and the neon-green fixed-wing UAV steadies itself on those occasions. But ultimately, it navigates the course adeptly, coming within about 10 feet of the buildings and steering around them with relative ease. Most importantly: it doesn’t crash.

“That was a gate for us to get through,” said Moore, the project manager of the Research and Exploratory Development Department team that ran the test at Joint Base Lewis-McChord in Washington state this past August. “We’d never tested anything in an actual physical environment, so proving what we did was huge.”

The test was part of the Defense Advanced Research Projects Agency (DARPA) OFFensive Swarm-Enabled Tactics (OFFSET) program, which envisions swarms of up to 250 collaborative autonomous systems providing insights to ground troops as they operate in dense metropolitan environments. The program is about four years old, Moore said, and it’s unique in structure because the two swarm system integrators — Northrop Grumman and Raytheon — are creating the testbeds and simulation environments for crafting tactics for large-scale autonomous swarms in urban environments.

“OFFSET is developing a variety of swarm-enabling technologies,” said Timothy Chung, the DARPA OFFSET program manager, “from a rich repository of swarm tactics, to virtual environments for swarm simulation, to physical testbeds with real robots where these swarm tactics can be demonstrated in real-world settings.”

This specific test was an effort to answer Moore’s team’s central question for this phase of the project, known as sprints: could fixed-wing UAVs have quadcopter UAV agility and mobility but add greater range, endurance and speed, given that they were fixed-wing in form?

“Imagine you have a futuristic sensor on your aircraft that could, theoretically, map the interior of a building and produce a floor plan,” Moore explained. “You want to put that sensor on a fixed-wing UAV, fly really fast and really close to the building, and come away with a rapid interior scan of the floor plan.

“We’re not there yet, but our goal was to control the vehicle at high speeds in an urban, outdoor environment and do multiple passes around the target building without hitting it.”

UAVs are typically thought of as propeller-armed quadcopters, but previous Independent Research and Development (IRAD) work featuring aerobatic maneuvers with fixed-wing UAVs put APL in an advantageous position to push OFFSET’s fourth sprint forward.

The team took that base work from the IRAD, including its Aerobatic Control and Collaboration for Improved Performance in Tactical Evasion and Reconnaissance (ACCPITER) technology, and spent the first six months of the sprint working with virtual aircraft in a virtual world and the final six using a physical aircraft in a virtually constructed environment.

Using a mesh — a virtual map — of previous DARPA field tests in urban environments, the team flew their fixed-wing UAVs in a virtual world on APL’s campus. They tested, developed the proper algorithms and software, and worked to program an essentially “off-the-shelf” aircraft with bespoke APL-developed electronics package and software.

They did all that at the Lab, but until they trekked to Washington in August, they hadn’t tested it in the physical world. The vehicle’s performance in the virtual world was good. The validation in the physical world performance was exceptional.

In fierce winds, the team launched the craft from approximately 100 meters away, successfully completed two passes through the crowded urban area while maintaining three meters of space from the target building, and then pushed the test to a 250-meter launch, flying four times faster than the quadcopters around a bigger building and under an overpass, autonomously and without crashing.

The program’s fifth sprint is underway, and Moore said this period will focus on adding larger numbers of fixed-wing vehicles operating in urban environments together. The groundwork laid in Sprint 4, especially in validating vehicle performance in the physical world, will be crucial as the team moves forward to address more challenging and complex urban swarm scenarios.


Long range autonomous boat with ArduPilot

From Hackaday:

Thanks to the availability of cheap, powerful autopilot modules, building small autonomous vehicles is now well within the reach of the average maker. [rctestflight] has long been an enthusiast working in this space, and has been attempting long range autonomous missions on the lakes of Washington for some time now. His latest attempt proved to be a great success. (Video, embedded below.)

The build follows on from earlier attempts to do a 13 km mission with an airboat, itself chosen to avoid problems in early testing with seaweed becoming wrapped around propellers. For this attempt, [Daniel] chose to build a custom boat hull out of fiberglass, and combine both underwater propellers and a fan as well. The aim was to provide plenty of thrust, while also aiming for redundancy. As a bonus, the fan swivels with the boat’s rudder, helping provide greater turn authority.

After much tuning of the ArduPilot control system, the aptly-named SS Banana Slug was ready for its long-range mission. Despite some early concerns about low battery voltages due to the cold, the boat completed its long 13 km haul across the lake, for a total mission length of over three hours. Later efficiency calculations suggest that the boat's onboard batteries could potentially handle missions of over 100 km before running out.


Sky-Drones SmartLink Update



Sky-Drones Technologies powers drones for enterprise business solutions. As a creator of full-stack UAV avionics technology, the company aims to accelerate the development and adoption of UAVs for enterprise. This year started off strong for Sky-Drones with an upgrade to their product range: the new and improved SmartLink.

Each SmartLink set contains an air module and a ground module. The air module is attached to the UAV, while the ground module is connected to the ground control station. Improvements to the air unit mean the system can handle two real-time HD video streams from CSI and HDMI cameras, the CSI camera itself being a new addition to the SmartLink set. Alongside this, a wide array of payload interfaces is now embedded in the unit, including USB, UART, I2C, and SPI, allowing tight integration with Sky-Drones' hardware and software.


Improvements to the ground control device include manufacturing this compact, lightweight module with a micro-USB connector, allowing interaction with multiple devices including laptops, smartphones, tablets, and desktop computers. An adaptive fan adds a cooling system that allows the module to withstand high ambient temperatures, making the entire system more reliable in harsh environments and improving longevity on lengthier missions.

The drone datalink and integrated onboard computer have a standard range of up to 20 km. However, an unlimited flight range can be achieved thanks to Sky-Drones' advances in LTE connectivity. Connecting your UAV via LTE enables endless range, provided your UAVs remain within the LTE coverage area. Planning to cover an area much larger than your coverage zone? Our specialised antennas can extend your range by several dozen kilometres. For single drones or clusters in the surveillance, search and rescue, and delivery sectors, there has never been a more obvious solution for multi-drone control with unlimited possibilities.


To bring these additions into view, Sky-Drones founder and CEO Kirill Shilov hosted a live webinar on Wednesday 24th February. The webinar included a full product unboxing, a demonstration of compatibility with Sky-Drones software, and a live Q&A with members of the audience. As a result, all the questions asked and answered have become a permanent feature on the SmartLink product page on the Sky-Drones website.

The new and improved SmartLink hardware works in tight unison with Sky-Drones SmartAP GCS software, allowing completely autonomous flight control and communications from anywhere in the world. Pilots and drones no longer need to be in the same vicinity, allowing BVLOS flights to be planned and initiated quicker than ever before. Flight data and logs in Sky-Drones Cloud can also be accessed worldwide by anyone granted access via your personalised company login.

This product renovation has been a major advancement for the Sky-Drones team. Meeting the needs of enterprise business solutions and raising the bar for UAV excellence is what our company is driven to do, and we will keep working to make sure our products excel in every possible way. Please visit the new SmartLink product page for more information and detailed technical specifications.



Simple Waypoint Navigation for Fixed-Wing UAVs

There is a simple way to implement waypoint navigation. The algorithm was introduced by Lawrence, Frew, and Pisano back in 2008. You can view the paper here:

I have flight-tested this algorithm multiple times on a low-resource autopilot aboard a fixed-wing UAV (see figure below). In this blog post, I want to share my approach to implementing this algorithm. I cannot claim any academic novelty. However, I hope to make this simple and effective algorithm more accessible to other researchers and hobbyists.


In the remainder of this post, I will provide a high-level view of the algorithm. I will also show some flight test results. If the reader wants a more mathematical treatment of the implementation, the reader is referred to the attached PDF.

The algorithm is simple because it uses a single guidance vector field. Vector fields, in general, assign a vector to each point in space. Guidance vector fields, in particular, assign a desired velocity vector to each point in a 2D plane. The vector fields are designed such that they guide the vehicle to a particular path. In this case, the vector field brings the vehicle into a loiter circle. The following figure shows the vector field. The vector field also generates a desired acceleration vector. The acceleration vector is needed for accurate path following.
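As a concrete illustration, here is one common form of such a Lyapunov loiter field in code. This normalization is a paraphrase of the family of fields analyzed in the 2008 paper, not a verbatim transcription; a nice property is that the commanded speed is constant everywhere and the field becomes purely tangential on the loiter circle.

```python
import math

def loiter_field(x, y, r_t, v0=15.0):
    """Desired velocity at position (x, y) relative to the loiter center.

    A Lyapunov-style guidance vector field: away from the circle the field
    has an inward radial component, and on the circle of radius r_t it is
    purely tangential with speed v0. Undefined at the center itself. This
    is one common normalization; see the Lawrence/Frew/Pisano 2008 paper
    for the exact fields they analyze.
    """
    r = math.hypot(x, y)
    c = -v0 / (r * (r * r + r_t * r_t))
    vx = c * (x * (r * r - r_t * r_t) + y * (2.0 * r * r_t))
    vy = c * (y * (r * r - r_t * r_t) - x * (2.0 * r * r_t))
    return vx, vy
```

At any point the commanded speed is exactly `v0`, so the field only steers direction, which suits a fixed-wing aircraft flying at a roughly constant airspeed.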

We feed the guidance commands from the guidance vector field into the lateral control system. However, the input to the lateral control system is a bank angle (see figure below).  The block labeled “AC” represents the aircraft dynamics and the block labeled “Ail” represents the dynamics of the aileron actuator.

Thus, it is necessary to convert the desired velocity and the desired acceleration into a desired bank angle. We use a two-step conversion.

First, we convert the guidance commands into a lateral acceleration command. The lateral acceleration command has two terms. The first term drives the angular misalignment between the vehicle velocity and the desired velocity to zero. The second term incorporates the desired acceleration vector; the second term functions as a feed-forward acceleration term needed for accurate path tracking.

Second, we convert the lateral acceleration command to a desired bank angle. The relationship between lateral acceleration and bank angle is illustrated in the figure below. By banking, the lift force attains a lateral component, which produces the desired lateral acceleration.
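This second step is the standard coordinated-turn relation: the lateral acceleration produced by banking is a_lat = g·tan(φ), so the commanded bank angle is φ = atan(a_lat/g). A minimal version (the saturation limit here is an assumption, not a value from my implementation):

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def bank_from_accel(a_lat, max_bank_deg=45.0):
    """Convert a lateral acceleration command (m/s^2) to a bank angle (deg).

    In a coordinated turn, a_lat = g * tan(phi), so phi = atan(a_lat / g).
    The command is saturated to protect the low-level roll loop.
    """
    phi = math.degrees(math.atan2(a_lat, G))
    return max(-max_bank_deg, min(max_bank_deg, phi))
```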


Having described basic loiter circle tracking, we are ready to move on to waypoint navigation. The waypoint navigation routine is actually loitering in disguise: the routine positions loiter circles so that the vehicle smoothly intersects each waypoint in succession. The positioning algorithm is shown in the figure below. The positions of the previous waypoint, current waypoint, and next waypoint are denoted by A, B, and C, respectively. The center point lies along the bisector of angle ABC. The loiter radius sets the distance between the center point and the current waypoint. Having determined the loiter center, the next step is to determine the sign of the circulation constant. If C is to the left of the line AB, then the circulation is counter-clockwise (<0). If C is to the right of the line AB, then the circulation is clockwise (>0).
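The placement rule reads directly off as code. This sketch assumes the three waypoints are not collinear (the bisector direction is undefined otherwise); the sign convention matches the text above, with negative circulation meaning counter-clockwise.

```python
import math

def loiter_for_waypoint(A, B, C, radius):
    """Place the loiter circle that brings the aircraft into waypoint B.

    A, B, C are (x, y) positions of the previous, current, and next
    waypoints. The center lies on the bisector of angle ABC at `radius`
    from B. Circulation is -1 (CCW) if C is left of line AB, +1 (CW)
    otherwise.
    """
    ax, ay = A
    bx, by = B
    cx, cy = C
    # Sum of unit vectors from B toward A and toward C points along the bisector.
    da = math.hypot(ax - bx, ay - by)
    dc = math.hypot(cx - bx, cy - by)
    ux = (ax - bx) / da + (cx - bx) / dc
    uy = (ay - by) / da + (cy - by) / dc
    n = math.hypot(ux, uy)
    center = (bx + radius * ux / n, by + radius * uy / n)
    # Cross product of AB and AC: positive means C lies left of line AB.
    cross = (bx - ax) * (cy - ay) - (by - ay) * (cx - ax)
    circulation = -1 if cross > 0 else 1
    return center, circulation
```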


A nice feature of the positioning routine is that it can work with only two waypoints. The waypoints are stored in a circularly linked list. Hence, the "next" waypoint and the "previous" waypoint can point to the same waypoint. 

Next, we describe the algorithm that governs how the aircraft switches from one waypoint to the next. 

The positions of the current waypoint and next waypoint are denoted by A and B, respectively. Let LA denote the loiter circle that brings the aircraft into A. Suppose the aircraft has just "hit" the current waypoint. The navigation routine sets the current waypoint to B and computes the parameters of LB, the loiter circle that brings the aircraft into B.

Now, the straightforward approach is to immediately switch from using LA to LB. However, this approach will change the guidance vector in an abrupt manner. To achieve a smooth transition between waypoints, the switch from LA to LB occurs when the velocity vectors from both loiter circles are pointing in roughly the same direction.  A smooth transition protects the low-level control system from large changes in the input command.
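That alignment test is just a dot product between the two guidance vectors evaluated at the aircraft's position; the 15-degree threshold below is an illustrative choice, not a value from my implementation.

```python
import math

def should_switch(vA, vB, align_deg=15.0):
    """Switch from loiter circle A to B when the guidance vectors from
    both circles point in roughly the same direction.

    vA and vB are the 2D desired-velocity vectors at the aircraft's
    current position; align_deg is the maximum angle between them.
    """
    dot = vA[0] * vB[0] + vA[1] * vB[1]
    cos_angle = dot / (math.hypot(*vA) * math.hypot(*vB))
    return cos_angle >= math.cos(math.radians(align_deg))
```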

The figure below illustrates the switching algorithm. In the top plot, the aircraft is about to hit A. In the second plot, the aircraft has hit A and has set up the loiter circle of B. The aircraft, however, continues tracking the loiter circle of A. In the third plot, the guidance vectors from both loiter circles are aligned. The aircraft begins tracking the loiter circle of B.  In the fourth plot, the aircraft is en route to B.



I implemented the guidance system described here using fixed-point arithmetic on the AMP autopilot, which is made in-house by our research group. The microprocessor belongs to the dsPIC33 family of microprocessors made by Microchip.

Flight test results are shown in the figures below (map data ©2020 Google). The test took place at the Flying Gators RC Airport in Archer, FL. The first plot shows the flight path of the delta-wing UAV performing waypoint navigation with four waypoints; the waypoints are positioned to create a figure-eight trajectory. The second plot shows the flight path of the delta-wing UAV performing waypoint navigation with two waypoints. You can view a synthetic video of the flight test here: (the video is reconstructed from downlinked telemetry data). You can view actual video of a portion of the flight test here:



In the first plot, we note that there is a clear asymmetry in the flight pattern. This asymmetry was due to the aircraft being out of trim. When I examined the roll data after the flight test, I found that the aircraft was better at tracking negative roll commands as opposed to positive roll commands. Hence, the asymmetry has to do with the control system, not the guidance system.

In conclusion, this blog post provides an overview of a simple and effective waypoint navigation scheme. For a more mathematical treatment, the reader is referred to the attached PDF. I have also attached MATLAB code that simulates loiter circle tracking; the vehicle dynamics are represented using a state-space model.


Read more…

Stereoscopic systems are widely used in drone navigation, but this project takes a new approach: a variable baseline.

In the related paper (PDF), the team showcases three different applications of this system for quadrotor navigation:

  • flying through a forest
  • flying through an unknown shaped/location static/dynamic gap
  • accurate 3D pose detection of an independently moving object

They show that their variable baseline system is accurate and robust in all three scenarios.
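The benefit of a variable baseline follows from pinhole stereo geometry: depth is Z = f·B/d, so the first-order depth uncertainty grows as Z² and shrinks linearly with the baseline B. A hedged sketch (the function names are mine, not from the project):

```python
def depth_from_disparity(f_px, baseline_m, disparity_px):
    """Pinhole stereo depth: Z = f * B / d (f in pixels, B in meters, d in pixels)."""
    return f_px * baseline_m / disparity_px

def depth_error(f_px, baseline_m, z_m, disparity_err_px=1.0):
    """First-order depth uncertainty: dZ ~ Z^2 * dd / (f * B)."""
    return z_m ** 2 * disparity_err_px / (f_px * baseline_m)
```

Doubling the baseline halves the depth error at a given range, which is why a wide baseline helps with distant obstacles while a narrow one avoids near-field matching problems.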

For the video capture, the Raspberry Pi-based StereoPi board was used. Additional AI-acceleration hardware (Intel Movidius) is being considered as a next step, as is the more powerful CM4-based StereoPi v2.

Here is the brief video of the project:


Read more…

NexuS UAV presentation

I want to share the first flight of the third iteration of our fixed-wing aircraft design for autonomous flight.
This design was made entirely from composite materials, with the molds produced on 3D printers.
Tests showed great aerodynamic efficiency, stability, and maneuverability.
The final weight with full payload is 2.6 kg.
For this flight, the aircraft was ballasted up to its final flight weight.
We estimate an endurance of 90 minutes with a 10,000 mAh 4S battery.
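The endurance figure can be sanity-checked with a rough capacity budget; assuming about 80% usable capacity, 90 minutes from a 10,000 mAh pack implies an average draw near 5.3 A. The function and its defaults are my own illustration, not the team's numbers:

```python
def estimated_endurance_min(capacity_mah, avg_current_a, usable_fraction=0.8):
    """Rough flight-time estimate: usable capacity (Ah) / average current (A),
    converted to minutes. usable_fraction guards against deep discharge."""
    return capacity_mah / 1000.0 * usable_fraction / avg_current_a * 60.0
```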

Read more…

Centeye Modular Vision Sensors

It has been a while since I’ve posted anything about recent Centeye hardware. Some of you may remember my past work implementing lightweight integrated vision sensors (both stereo and monocular) for “micro” and “nano” drones. These incorporated vision chips I personally designed (and had fabricated in Texas) and combined optical flow, stereo depth, and proximity sensing in just 1.1 grams. Four of these on a Crazyflie were enough to provide omnidirectional obstacle avoidance and tunnel following in all ambient lighting conditions. Their main weakness was that they were time-consuming and expensive to make and adapt to new platforms. Some of you may also remember our ArduEye project, which used our Stonyman vision chip and was our first foray into open hardware. Although that project had a slow start, it did find use in a variety of applications ranging from robotics to eye tracking. I have discussed, privately with many people, rebooting the ArduEye project in some form.

Like many people we faced disruption last year from COVID. We had a few slow months last Summer and I used the opportunity to create a new sensor configuration from scratch that has elements of both Ardueye and our integrated sensors. My hypothesis is that most drone makers would rather have a sensor that was modular and easy to reconfigure or adapt, or even redesign, and are OK if it weighs “a few grams” rather than just one gram. Some users even told me they prefer a heavier version if it is more physically robust. Unlike the nano drones I personally develop, if your drone weighs several kilograms, an extra couple grams is negligible. I am writing here to introduce this project, get feedback, and gauge interest in making this in higher quantities.

My goals for this “modular” class of sensors were as follows:

  • Use a design that is largely part agnostic, e.g. does not specifically require any one part (other than optics and our vision chip) in order to minimize supply chain disruptions. This may sound quaint now, but this was a big deal in 2020 when the first waves of COVID hit.
  • Use a design that is easy and inexpensive to prototype, as well as inexpensive to modify. We were influenced by the “lean startup” methodology. This includes making it easier for a user to modify the sensor and its source code.
  • Favor use of open source development platforms and environments. I decided on the powerful Teensy 4.0 as a processor, using the Arduino framework, and using Platform IO as the development environment.
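For reference, a minimal PlatformIO project configuration for a Teensy 4.0 target with the Arduino framework would look roughly like this (standard PlatformIO board/platform identifiers; any Centeye-specific build flags are omitted):

```ini
[env:teensy40]
platform = teensy    ; PlatformIO's Teensy platform
board = teensy40     ; Teensy 4.0 (600 MHz Cortex-M7)
framework = arduino  ; Arduino API via Teensyduino
```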

We actually got it working. At the top of this post is a picture of our stereo sensor board, with a 5 cm baseline and a mass of 3.2 grams; below is a monocular board suitable for optical flow sensing that weighs about 1.6 grams. We have also made a larger 10 cm baseline version of the stereo board and have experimented with a variety of optics. All of these connect to a Teensy 4.0 via a 16-wire XSR cable. The Teensy 4.0 operates the vision chips, performs all image processing, and generates the output. We have delivered samples to collaborators (as part of a soft launch) who have indeed integrated them on drones and flown them. Based on their feedback we are designing the next iteration.


As with any new product, you have to decide what it does and what it does not do. Our goal was not extremely high resolution: such sensors already exist, and high resolution has other costs in terms of mass, power, and light sensitivity. Instead, we sought to optimize intensity dynamic range. The vision chips use a bio-inspired architecture in which each pixel adapts to its light level independently of other pixels. The result is a sensor that can work in all light levels (“daylight to darkness”, the latter with IR LED illumination), can adapt nearly instantaneously when moving between bright and dark areas, and functions even when both bright and dark areas are visible.

Below is an example of the stereo sensor viewing a window that is open or closed. (Click on the picture to see it at native resolution.) The current implementation divides the field of view into a 7x7 array of distance measurements (in meters), which are shown. Red numbers are measurements that have passed various confidence tests; cyan numbers are those that have not (and thus should not be used for critical decisions). Note that when the window is open, the sensor detects the longer range to objects inside even though the illumination levels are about 1% of those outside. A drone with this sensor integrated would be able to sense the open window and fly through it, without suffering a temporary black-out once inside.
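On the receiving side, a planner would keep only the confident (red) cells of the 7x7 grid; a minimal sketch of that filtering, with names and data layout of my own invention rather than from the Centeye firmware:

```python
import numpy as np

def safe_ranges(ranges_m, confident):
    """Keep only confident cells of the 7x7 range grid; others become NaN.

    ranges_m: 7x7 distances in meters; confident: 7x7 boolean mask
    (True = passed the sensor's confidence tests, the red numbers).
    """
    out = np.asarray(ranges_m, dtype=float).copy()
    out[~np.asarray(confident, dtype=bool)] = np.nan
    return out

def nearest_obstacle(ranges_m, confident):
    """Closest confidently detected distance, or None if nothing is reliable."""
    masked = safe_ranges(ranges_m, confident)
    if np.all(np.isnan(masked)):
        return None
    return float(np.nanmin(masked))
```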


A more extreme case of light dynamic range is shown in the picture below, taken with a different sensor that uses the same vision chip. On the top left is a picture of the sensor; note that it was in the sunlight, and thus subject to the “glare” that disrupts most vision systems. On the top right is a photograph of the scene (taken with a regular DSLR) showing sample ranges to objects in meters. On the bottom is the world as seen by the sensor; note that the Sun is in the field of view at the top right, yet the objects in the scene were still detected. Other examples can be found on Centeye’s website.


We are currently drafting up plans for the next iteration of sensors. For sure we will be including a 6-DOF IMU, which will be particularly useful for removing the effects of rotation from the optical flow. We are also envisioning an arrangement with the Teensy 4.0 placed nearly flush with the sensor for a more compact form factor. There is still discussion on how to balance weight (less is better) with physical robustness (thicker PCBs are better)! Finally I am envisioning firmware examples for other applications, such as general robotics and environment monitoring. I am happy to discuss the above with anyone interested, private or public.

Read more…
3D Robotics

Demo of Microsoft AirSim with PX4

From the video description:

I wanted to put this video together to share what I've been working on as it relates to PX4 simulation. I've been really impressed with the capabilities of AirSim, and I hope this video makes it a little easier to understand. You can learn more about AirSim here, and my GitBook notes can be found here. To learn more about DroneBlocks, please visit the DroneBlocks site. Please feel free to leave a comment below if you have any questions, and I hope to share more information in the near future. Thanks for watching.

Read more…

Quick install BatMon v4 released


BatMon v4 released
One of the main challenges with BatMon has been installation overhead: installing BatMon v3 took over an hour on a new battery pack. The second challenge was the cost overhead of a BMS on every battery. The BatMon v4 release reduces both issues significantly.
  • v4 is super fast to install on most batteries with a tool, and connects to the balance leads.
  • The modular board makes it possible to reuse BatMon after a battery's end of life. The XT90 leads can be replaced if they are worn, so a unit can practically be reused a few times, reducing the cost overhead of each smart battery pack.



Read more…
3D Robotics

Clever research from the University of Zurich and TU Delft showing how the drone's camera is used to maintain position while the quadcopter spins, keeping control after one motor fails.

From DroneDJ:

Researchers at the University of Zurich and the Delft University of Technology have been able to keep a drone flying after a motor fails. The researchers have managed to use onboard cameras to keep the test drones in the air and flying safely.




A team of researchers has come up with a simple yet ingenious way to solve a problem that would usually result in a drone falling to the ground after a motor failure.

Well, motor failures don’t often happen, but when they do, the drone needs to stay in the air regardless, especially if people are nearby or the drone is being used for a commercial job. Redundancy is important when it comes to drones.

Davide Scaramuzza, head of the Robotics and Perception Group at UZH and of the Rescue Robotics grand challenge at NCCR Robotics, shared:

When one rotor fails, the drone begins to spin on itself like a ballerina. This high-speed rotational motion causes standard controllers to fail unless the drone has access to very accurate position measurements.

Scaramuzza essentially says that the standard controllers in drones cannot cope with the fast, uncontrolled spinning of a drone with a failed rotor. This led the team to use onboard RGB cameras and event cameras, which we’ve covered in the past for obstacle avoidance.

GPS methods were also explored before the cameras, but the researchers ended up dumping the idea as GPS isn’t available in all situations, especially when it comes to specific drone missions.

The changes between the frames

Now for the way to keep the drone flying. The team equipped a drone with an RGB camera and an event camera. The standard RGB camera detects movements across the whole frame, whereas the event camera detects changes at the pixel level, allowing tiny changes to be spotted.

The data from the two cameras are combined using a specially developed algorithm that then tracks the quadcopter’s position relative to its surroundings. This allows the flight controller to take control of the drone as it spins and flies.
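The post does not describe the actual fusion algorithm, but conceptually it amounts to blending the two position estimates with a trust weight; a toy stand-in for illustration only:

```python
def fuse_positions(p_frame, p_event, w_event):
    """Blend frame-camera and event-camera position estimates per axis.

    w_event in [0, 1] is the trust placed in the event-camera estimate,
    e.g. raised in low light where the RGB camera degrades.
    """
    return [w_event * e + (1.0 - w_event) * f for f, e in zip(p_frame, p_event)]
```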

Both cameras work great in well-lit environments, but the RGB camera begins to suffer as light decreases. In testing, the researchers were able to keep the drone stable with the event camera all the way down to 10 lux, which is about equivalent to a dimly lit room.

Read more…

Zion Market Research has published a new report titled “Drone Logistics and Transportation Market By Solution (Shipping, Warehousing, Software, and Infrastructure), By Drone (Passenger, Freight, and Ambulance), and By Sector (Commercial and Military): Global Industry Perspective, Comprehensive Analysis, and Forecast, 2018–2025”. According to the report, the global drone logistics and transportation market was valued at USD 4.56 billion in 2018 and is expected to reach around USD 18.05 billion by 2025, at a CAGR of approximately 21.9% between 2019 and 2025.


Unmanned aerial vehicles, also known as drones, are small aircraft without a human pilot onboard that can be either operated remotely or flown autonomously with the help of GPS coordinates. They are made of light materials to reduce weight, which enables them to fly at high altitudes. Drones are controlled from a ground cockpit and can easily return to their marked starting position in case of low battery or loss of contact with the controller. Initially, drones were used for photography and videography; today, however, they are used in modern supply chain applications and are considered likely to revolutionize the supply chain.

In this constantly evolving world, entrepreneurs and visionary leaders are focusing on integrating drone delivery into the supply chain. This is the primary growth factor of the drone logistics and transportation market, as including drones in the supply chain is believed to revolutionize the way shipments reach customers: drone delivery has the potential to be the fastest way to transport goods, delivering orders within minutes. Drones will have a diverse range of impacts on the supply chain. They can track inventory and act as a significant inventory management system, reducing companies' inventory carrying costs; around 90% of a warehouse's inventory is stationary, and companies end up with excess inventory due to improper management. On the other hand, it is often difficult even for planes to fly in extreme weather conditions such as snow, rain, and strong winds. Drones are smaller versions of these flying machines and face an even greater challenge delivering in extreme weather, which may limit them to delivering only in certain climatic conditions. This is a major restraint on the drone logistics and transportation market.

By solution, the shipping sector is expected to hold the highest market share. Drone shipping has been targeted by various companies like Amazon, Google, UPS, DHL, etc. By drone, the ambulance drone segment will hold a major market share, owing to the increasing casualties and growing traffic congestion in major cities across the globe. The first few moments are the most crucial during an accident to prevent any further escalations. Lifesaving technologies like CPR (Cardiopulmonary Resuscitation), medication, and AED (Automated External Defibrillator) can be made compact enough to be performed by a drone. Moreover, drones can also transfer other crucial medical supplies during disasters and across tough terrains.


North America will hold a substantial share of the drone logistics and transportation market in the future, owing to the various developments and innovations witnessed in the inventory management domain. The U.S. is witnessing continuous growth in tech start-ups every year, which is backed by numerous venture capitalists, thereby increasing the regional market scope. The presence of key market players is also predicted to accelerate the demand for drone logistics and transportation market. In Europe, the presence of developed economies of the UK, Germany, and France is contributing to the drone logistics and transportation market.

Some major key players operating in the drone logistics and transportation market are PINC Solutions, Matternet, Drone Delivery Canada, Hardis Group, CANA Advisors, Infinium Robotics, Workhorse Group, Aerovironment, DroneScan, Skycart, and Zipline.

Read more…

The main focus of this research is to develop a real-time forest fire monitoring system using an Unmanned Aerial Vehicle (UAV). The UAV is equipped with sensors, a mini processor (Raspberry Pi), and an ArduPilot Mega (APM) flight controller. The system uses five sensors. The first is a temperature sensor that measures the temperature in the monitored forest area. The other sensors are embedded in the APM: a barometer, a Global Positioning System (GPS) receiver, an inertial measurement unit (IMU), and a compass. The GPS and compass are used in the navigation system. The barometer measures the air pressure, which is used as a reference to maintain the height of the UAV. The IMU consists of accelerometer and gyroscope sensors that are used to estimate the vehicle's position. The temperature data from the sensor and the data from the GPS are processed by the Raspberry Pi 3, which serves as the mini processor. The results of the data processing are sent to a server to be accessible online and in real time on a website; the data transmission uses the Transmission Control Protocol (TCP). The experiment was carried out in an area of 40 meters × 40 meters with 10 hotspots, each 0.4 meters in diameter and 0.5 meters high. The UAV was flown at a constant speed of 5 m/s at an altitude of 20 meters above the ground. The flight path was set using a mission planner so the UAV could fly autonomously. The experimental results show that the system detected seven hotspots in the first trial and nine hotspots in the second trial; the misses were due to data loss in the transmission process. Other results indicate that the hotspot coordinates detected by the UAV deviate by approximately 1 meter from the actual fire point coordinates. This is within the standard GPS accuracy, as the system uses GPS with a range accuracy of 2.5 meters.
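The paper does not give the wire format, but the Raspberry Pi's TCP upload step could be sketched like this (the JSON field names, newline framing, and helper name are assumptions of mine):

```python
import json
import socket

def send_reading(server_addr, temperature_c, lat, lon, timeout_s=2.0):
    """Send one temperature/position reading to the ground server over TCP.

    server_addr: (host, port) tuple. One newline-terminated JSON object
    per reading; the field names are placeholders, not from the paper.
    """
    packet = json.dumps({"temp_c": temperature_c, "lat": lat, "lon": lon}) + "\n"
    with socket.create_connection(server_addr, timeout=timeout_s) as sock:
        sock.sendall(packet.encode("utf-8"))
```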

You can Download this article on Research Gate or Journal of Engineering Science and Technology

Read more…

The main focus of this research is an early detection system for forest fires using an Unmanned Aerial Vehicle (UAV). Fire data are collected by a GPS-equipped UAV quadcopter carrying a non-contact infrared sensor on a sensor stabilizer. The sensor data are sent over a 433 MHz telemetry link to the Ground Control Station, where they are processed by a program we developed, SPTA Real-time v0.1.0, which displays the temperature data as well as a color layer based on temperature levels in real time on a digital map. The fire coordinates observed by SPTA Real-time v0.1.0 are then compared with the coordinates of the fire source recorded manually. In the experiments, the monitored area was 3,662 m², at a constant height of 30 m and a quadcopter speed of 5 m/s. In the first run, with a wind speed of 3.2 m/s, the detected coordinates differed by 1.18 m from the fire source, within the 2.5 m GPS accuracy tolerance. In the second run, with a wind speed of 6 m/s, the difference was 5.16 m, exceeding the 2.5 m GPS tolerance by 2.66 m.
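The reported deviations (1.18 m and 5.16 m) are presumably great-circle distances between the detected and manually recorded coordinates; a standard haversine computation, assuming the usual 6,371 km mean Earth radius:

```python
import math

def deviation_m(lat1, lon1, lat2, lon2):
    """Haversine distance in meters between two lat/lon points
    (detected vs. manually recorded fire coordinates)."""
    r = 6371000.0  # mean Earth radius in meters
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2
    return 2.0 * r * math.asin(math.sqrt(a))
```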


Read more…