

Endangered snow leopards and their habitat are under threat; KwF will use drones to protect these elusive cats of the Himalayas

For more than a decade, Kashmir World Foundation (KwF) has been operating aircraft around the world protecting endangered cats, rhinos, sea turtles, and many other species. We integrate airframes with power and propulsion systems to carry sensors, high performance computers, and special components needed to perform our missions. Operating on battery or liquid fuel, our aircraft meet most of our mission needs.

But the mission that drove us to create KwF -- protecting snow leopards and other endangered species in the Himalayas -- has eluded us, with demands on flight performance that no existing aircraft can meet. To support this mission, the aircraft must operate at very high altitude while flying close to the terrain below and to either side, navigating extremely rocky terrain while encountering strong and highly variable winds with snow, ice, and freezing rain. Remote from any conceivable operating base, and needing to patrol tens of thousands of square kilometers, the aircraft must be able to stay on station for at least 8 hours, processing data, making decisions, and taking swift action when needed, all while barely making a sound.

Background

Fifty million years ago, the landmass that is now India collided with Eurasia to create the tallest mountain range in the world -- the Himalayas. Lifted by the subduction of the Indian tectonic plate under the Eurasian Plate, the Himalayan range runs west-northwest to east-southeast in an arc 2,400 km (1,500 mi) long. Standing at 8,848 meters (29,031.7 ft) on the border between Nepal and Tibet, Mount Everest is the tallest mountain peak in the world. Glaciers and fast-flowing water have carved the mountains, creating jagged peaks and valleys through which winds blow with rapidly varying direction.


 

A map of mountain ranges in snow leopard habitat, with the sites of published population studies marked. Image taken from Suryawanshi et al., 2019.

 

Masters of disguise, snow leopards (Panthera uncia) split from a common ancestor shared with the tiger about 3.9 million years ago to become the kings of the Himalayas. These elusive cats live in high alpine areas, where they hunt blue sheep, argali wild sheep, ibex, marmots, pikas, deer, and other small mammals.


The global snow leopard population remains unknown, but based on their research, Suryawanshi and his colleagues fear it may be lower than prevailing guesstimates suggest.

Photo by Shan Shui / Panthera / Snow Leopard Trust

Having thrived under some of the harshest conditions on Earth, snow leopards are now in rapid decline, and they will be gone soon unless action is taken to ensure their protection. Although they have been listed as endangered since 1972 and are legally protected, humans hunt and poison them for trophies, fur, and body parts; hunt their prey; and destroy their habitat through overgrazing by domestic animals. Researchers aren’t sure how many snow leopards are left in the world. In 2016, the IUCN estimated that between 2,710 and 3,386 snow leopards remain in the high mountains of Central and South Asia.

Video by Dr. Ali Nawaz

Problem

Protecting snow leopards and other endangered species in the Himalayas requires surveillance over broad areas. To meet these demanding needs, Kashmir-Robotics, the Science and Technology division of KwF, proposed using aerial robotics enabled with artificial intelligence to provide self-awareness, situational awareness, and the ability to act quickly and decisively.

In collaboration with the Technology Assisted Counter Poaching (TACP) network, Dr. Ronald Pandolfi, Chief Technology Officer of KwF, and Dr. Muhammad Ali Nawaz, Executive Director of the Snow Leopard Foundation in Pakistan, derived a set of operational requirements for an unmanned aerial system (UAS) to perform the mission. Dr. Pandolfi concluded, “the aircraft must be able to launch from low altitude, climb to over 5,000 meters while traversing to snow leopard habitat, then stay on station for at least 8 hours while navigating between rocky slopes and deep valleys, all while exposed to high and variable winds, snow, and ice.” Hence, Eagle Ray was born.

Baseline

With no current aircraft able to meet these requirements, Princess Aliyah Pandolfi, Executive Director of KwF, turned to Mark Page, Chief Scientist & Vice President of DZYNE Technologies, a world-leading designer of high-performance unmanned aerial systems and creator of the modern blended wing body (BWB) aircraft. She asked him to design a planform and airfoil stack for an aircraft with a gross takeoff mass of less than 25 kg (to comply with Federal Aviation Administration regulations) that is efficient and agile and able to operate in high and variable winds under some of the most extreme weather conditions.

Rooted in a long history of successful BWB designs, Page provided a baseline design adjusted for the flight characteristics required for the KwF missions. Page says, “Blended Wings like Eagle Ray are well suited to flight at extreme altitudes. First, high altitude flight is much easier with an electric airplane, since the powerplant doesn’t rely on oxygen from the atmosphere to generate power. At high altitudes the oxygen density is about half that available at sea level. A gas-powered plane would lose half its horsepower, while an electric plane loses none. Second, to maintain efficient flight in the thinner air, the true flight speed must increase about 40%. The power required to fly is the product of flight speed times aerodynamic drag. Eagle Ray will need to fly faster like any other plane, so the only question is this: ‘which plane makes the least drag?’ Efficiency is the gift that keeps on giving, and the BWB produces about 30% less drag to bear the airplane’s weight.”
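As a quick back-of-the-envelope check of the argument in the quote (assuming constant lift coefficient and the simple incompressible lift relation; this is illustrative only, not part of the Eagle Ray design work):

```python
# Back-of-the-envelope check of the altitude argument in the quote above.
# At constant weight and lift coefficient, lift = 0.5*rho*V^2*S*CL implies
# V ~ 1/sqrt(rho), and power required = drag * speed.
rho_ratio = 0.5                       # "about half" the sea-level air density
v_ratio = (1.0 / rho_ratio) ** 0.5    # true-airspeed ratio at altitude

print(f"True airspeed must rise by ~{(v_ratio - 1) * 100:.0f}%")   # ~41%
# At the same lift-to-drag ratio the drag for a given weight is unchanged,
# so the power required grows with the speed alone -- hence the premium
# on a low-drag airframe.
print(f"Power required also rises by ~{(v_ratio - 1) * 100:.0f}%")
```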

Nepal Collaboration

“At KwF, we look to the world for good ideas and great engineering,” Princess Aliyah says, “Our mission is to save endangered species from extinction, so we have reached out in the broadest sense for innovation and invention to meet our demanding mission capabilities.”

 


Dr. Ronald Pandolfi, Princess Aliyah, and Kapil Regmi share their signed copies of an MoU between KwF & acem that will inspire students in Nepal to collaborate internationally to help protect the environment and endangered species through the use of innovative drones and artificial intelligence.

KwF’s partnership with the Advanced College of Engineering & Management (acem) in Nepal will give Eagle Ray a platform for testing at 18,000 feet and give students the chance to participate in an international design challenge. Kapil Dev Regmi, the Executive Director of acem, is excited about the collaboration because “this state-of-the-art Eagle Ray Project is going to inspire students at so many different levels. It will motivate students to keep dreaming and prove that the true dreamers are the true achievers. The Eagle Ray project is not just another project being carried out solely by an international organization, but a project that collaborates with a local partner (like acem) to involve students right from the project development phase and give them an end-to-end view of the project. While most drone projects are being done just at the local level (mostly on medical deliveries to rural areas), the Eagle Ray project will open new dimensions of exploration, like using tech to understand biodiversity and help conserve it. This will showcase the idea of *innovation* to students. It is a project that is way ahead of its time for Nepali students and will excite them to innovate for local problems.”

Global Challenge


Just a few years ago, only large aerospace corporations could design advanced-performance aircraft. Now, with global communications resources like DIY Drones and open-source tools like OpenVSP, individuals and small teams from around the world can contribute designs and breakthrough technologies.

KwF now challenges students, academics, and professionals around the world to improve on the Mark Page BWB Eagle Ray design, and in so doing help KwF create an aircraft that is better suited to the mission of protecting endangered species throughout the Himalayas. More specifically, the challenge is to improve on the planform and airfoil stack, increasing endurance while maintaining stability and maneuverability. All challenge participants receive access to the current and final design, and those who succeed in improving Eagle Ray will have their names appear on the official list of contributors and their logos emblazoned on every Eagle Ray aircraft.

Challenge teams can attempt to improve the overall planform and airfoil stack, or they can concentrate on special features including elevons, trim tabs, tip rudders, or other control surfaces. Each proposed change must include a statement of why the change is suggested and how it is expected to improve performance and enable greater mission capability, along with calculations, simulations, and/or charts.

The Eagle Ray Design Challenge Opens April 15, 2021! Join the challenge to help protect endangered snow leopards. Details of the challenge criteria and rules of engagement will be available at www.kashmirworldfoundation.org or contact: info@kashmirworldfoundation.org

 

 

Read more…

Introduction

It can be difficult to get useful data from a magnetometer. It can be especially difficult if the data is used to estimate the yaw attitude of the vehicle. For example, the sensor may indicate the proper direction for certain attitudes, but a wrong direction for other attitudes. Or, the yaw estimate may be accurate one day and off the next.

In order for magnetometer data to yield the yaw attitude of a vehicle, the magnetometer must measure the direction of the geomagnetic field (i.e. the Earth's magnetic field). The geomagnetic field points north (more or less) and has been used for navigation for hundreds of years. The challenge is that the geomagnetic field is relatively weak. It is common for the geomagnetic field to be distorted or obscured by extraneous magnetic sources in the vicinity of the magnetometer. The purpose of magnetometer calibration is to extract an observation of the geomagnetic field vector from the sensor’s output, which is corrupted by various errors. 

Over the last several years, I have tinkered with the AK8963 magnetometer, which is part of the MPU-9250 IMU. Through trial-and-error, I eventually arrived at a dependable calibration procedure. The calibration procedure is repeatable and produces sufficiently accurate estimates of the yaw attitude. It is typical for the yaw attitude to have less than 2 degrees of error, provided the aircraft is in the wings-level configuration. Because of this work, the magnetometer now serves as the primary yaw attitude sensor within our autopilot system. 

In this article, I want to share four lessons I learned while integrating the magnetometer into an autopilot system. These lessons are probably familiar to those who work with magnetometers. However, for those who are new to magnetometers, this article may alert you to some common pitfalls. The four lessons are:

  1. Artificial errors impose special considerations on magnetometer calibration.
  2. Autocalibration methods have a fundamental flaw.
  3. Keep it simple with the compass swinging method.
  4. The throttle command can be related to a magnetometer bias.

In the remainder of this article, I will expand upon each of the previous points. For a more mathematical treatment of the proposed calibration procedure, the reader is referred to the attached PDF.

Artificial errors impose special considerations on magnetometer calibration

Magnetometers are subject to two types of errors: instrument errors and artificial errors. Instrument errors are associated with the sensor itself. Instrument errors are less noticeable in high-grade sensors than low-grade sensors. Artificial errors are extraneous magnetic sources that compete with the geomagnetic field. Artificial errors are associated with the environment. For this reason, you should calibrate the magnetometer in an environment that is similar to the sensor's operative environment. Practically, this means calibrating the sensor in its final configuration on the vehicle. This also means calibrating the sensor outdoors, away from buildings and power lines, which may distort the geomagnetic field. While it may be more convenient to calibrate an isolated magnetometer on a workbench, the calibration would likely become obsolete as soon as the magnetometer is mounted on the vehicle. 

It is worth emphasizing that artificial errors are not associated with the sensor itself.  Consequently, you cannot fix artificial errors by getting a better sensor. You fix artificial errors by (i) shielding/isolating the extraneous sources or (ii) removing the effects of the artificial errors from the data. In this work, we take the latter approach.

That said, the choice of magnetometer is still important. The full-scale range (FSR) of the sensor is particularly important. You don't want a sensor with an FSR that is orders of magnitude greater than the magnitude of the quantity of interest. This is because range and resolution are competing factors: what you gain in range you lose in resolution. For our application (vehicle state estimation), the quantity of interest is the geomagnetic field, the magnitude of which is about 0.5 Gauss. The AK8963 is a poor choice for our application because its FSR is 50 Gauss, which is 100x greater than the quantity of interest!

Autocalibration methods have a fundamental flaw

An autocalibration method determines the calibration parameters solely from magnetometer measurements. This feature makes it easy to collect the calibration data. Undoubtedly, the ease of data collection contributes to the popularity of a particular autocalibration method, the ellipsoid-fitting method. However, it can be shown that autocalibration methods are unable to correct for misalignment between the magnetometer and other inertial sensors [1]. Furthermore, it can be shown that misalignment is detrimental to attitude estimation [2]. In order to correct for magnetometer misalignment, additional sensors must be used. The theoretical flaw with autocalibration methods has real-world ramifications. In my experience, I have yet to see the ellipsoid-fitting method produce satisfactory estimates of the yaw attitude.

Keep it simple with the compass swinging method

Alternatives to the ellipsoid-fitting method include the dot-product invariance (DPI) method [3] and the compass swinging method. These alternative methods correct for misalignment by assimilating data from an additional sensor. The DPI method assimilates accelerometer data. The swinging method assimilates data from an "imaginary" magnetometer. We obtain data from the "imaginary" magnetometer by deducing the components of the geomagnetic field based on the orientation of the vehicle with respect to a compass rose. The mathematical details for all three calibration methods are included in the attached PDF.

The calibration procedure that I recommend applies the DPI method and the compass swinging method in succession. The DPI method is used to obtain a crude 3D calibration. The swinging method is used to enhance the measurement accuracy in the wings-level position. Of course, the calibration accuracy will decrease as the vehicle departs from the wings-level position. However, it is reasonable to assume the vehicle will remain close to the wings-level position during constant-altitude operations. Furthermore, deviations from the wings-level position are bounded due to roll and pitch constraints.

Ideally, the magnetometer would be well-calibrated after applying the DPI method. In practice, however, the estimated yaw angle can have up to 10 degrees of error. The error is linked to the off-diagonal elements of the matrix that appears in the inverse error model. The off-diagonal elements are difficult to observe using the DPI method. That is, the off-diagonal elements vary from run to run. For this reason, the swinging method is needed as an additional calibration step.
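To make the idea concrete, here is a minimal sketch of the kind of wings-level least-squares fit the swinging step implies. It assumes a linear error model of the form A·m_meas + b ≈ m_ref, a known local declination and inclination, and a field magnitude of about 0.5 Gauss; the actual procedure and notation are given in the attached PDF.

```python
import numpy as np

# Sketch of a wings-level "swinging" fit: solve for A and b such that
# A @ m_meas + b approximates the reference (geomagnetic) field deduced
# from the vehicle's known heading on a compass rose.  Illustrative only.

def reference_field(heading_deg, declination_deg, inclination_deg, magnitude=0.5):
    """Geomagnetic field in body axes for a wings-level vehicle at a known heading."""
    psi = np.radians(heading_deg - declination_deg)   # heading from magnetic north
    dip = np.radians(inclination_deg)
    field = magnitude * np.array([np.cos(dip), 0.0, np.sin(dip)])  # magnetic-north frame
    c, s = np.cos(psi), np.sin(psi)
    R = np.array([[c, s, 0.0], [-s, c, 0.0], [0.0, 0.0, 1.0]])     # yaw-only rotation
    return R @ field

def fit_swinging(measured, headings_deg, declination_deg, inclination_deg):
    """Least-squares fit of the wings-level correction (A, b)."""
    M = np.asarray(measured, float)                    # N x 3 raw magnetometer samples
    T = np.array([reference_field(h, declination_deg, inclination_deg)
                  for h in headings_deg])              # N x 3 "imaginary" references
    X = np.hstack([M, np.ones((len(M), 1))])           # append a column for the bias
    W, *_ = np.linalg.lstsq(X, T, rcond=None)          # solve X @ W ~ T
    return W[:3].T, W[3]                               # A (3x3), b (3,)
```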

The throttle command can be related to a magnetometer bias

An electrical current will generate a magnetic field according to the Biot-Savart law (see this HyperPhysics article). A large current close to the magnetometer will bias the sensor and alter the estimated yaw attitude. A common source of such a current is the current drawn by the electric powertrain of the aircraft. The current-induced bias can be canceled by subtracting an estimated bias from the magnetometer readings. The estimated bias is proportional to the current. The proportionality constants can be estimated from system identification tests.

Of course, the previous solution requires a current sensor on the powertrain. The current sensor may be avoided by recognizing that the current is related to the throttle command. Using physical models of the UAV's powertrain, we can show that the current-induced bias is proportional to the square of the throttle command [4]. The figure below plots the magnetometer bias versus the throttle command. The variable h3 denotes the component of the current-induced magnetometer bias along the z axis of the sensor. Overlaying the data is the quadratic model predicted by the powertrain analysis.

[Figure: magnetometer bias h3 versus throttle command, with the quadratic model overlaid]
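A minimal sketch of the compensation step, assuming the quadratic model bias ≈ c0 + c2·throttle² from [4]; the coefficients would come from a system-identification run, and the variable names here are illustrative:

```python
import numpy as np

# Fit and remove a throttle-dependent z-axis bias, assuming the quadratic
# model bias = c0 + c2 * throttle**2 suggested by the powertrain analysis.

def fit_throttle_bias(throttle, h3):
    """Fit h3 (z-axis bias samples) against throttle**2 with least squares."""
    c2, c0 = np.polyfit(np.asarray(throttle) ** 2, np.asarray(h3), deg=1)
    return c0, c2

def compensate(mag_z, throttle, c0, c2):
    """Subtract the predicted current-induced bias from a raw z-axis reading."""
    return mag_z - (c0 + c2 * throttle ** 2)
```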

Conclusion

In conclusion, this article offers four insights on magnetometer calibration. First, we show that artificial errors impose special considerations on magnetometer calibration. Second, we caution the reader about autocalibration methods, such as the ellipsoid-fitting method. Third, we propose a calibration procedure that combines the DPI method and the swinging method. Finally, we propose a quadratic model for throttle-induced magnetometer biases.

magnetometer_calibration_procedure.pdf

References

[1] J. L. Crassidis, K.-L. Lai, and R. R. Harman, “Real-time attitude-independent three-axis magnetometer calibration,” J. Guid., Control Dyn., vol. 28, no. 1, pp. 115–120, 2005.

[2] D. Gebre-Egziabher, “Magnetometer autocalibration leveraging measurement locus constraints,” J. Aircr., vol. 44, no. 4, pp. 1361–1368, Jul. 2007.

[3] X. Li and Z. Li, “A new calibration method for tri-axial field sensors in strap-down navigation systems,” Meas. Sci. Technol., vol. 23, no. 10, pp. 105105-1–105105-6, 2012.

[4]  M. Silic and K. Mohseni, “Correcting current-induced magnetometer errors on UAVs: An online model-based approach,” IEEE Sens. J., vol. 20, no. 2, pp. 1067–1076, 2020.

Read more…


Hey guys,

I am looking for people interested in joining the team of a drone startup recently approved in a national competition in Brazil.

We need people with knowledge of drone hardware and software development, or of image processing.

We are in the prototyping phase for our products and services. Knowledge of Portuguese is also necessary, as the entire mentoring process will be carried out in that language.

If you are interested, contact me by 03/04/21 here or by email at mesquita.geison@gmail.com.

Thanks!!!

 

 

Read more…

DJI TELLO becomes smart

This video demonstrates face tracking using a DJI Tello drone. Using the Python djitellopy SDK and a MobileNet-SSD neural network, the Tello is converted into a smart drone. TensorFlow Lite and Google's Coral USB accelerator enable real-time inference.
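For readers who want to try something similar, below is a minimal sketch of the control loop using the djitellopy API. The video uses a MobileNet-SSD model accelerated by a Coral stick; for brevity this sketch substitutes OpenCV's bundled Haar cascade detector, and the gains and clamps are guesses rather than values from the video.

```python
import cv2
from djitellopy import Tello

# Minimal face-tracking loop.  The Haar cascade stands in for the
# MobileNet-SSD model used in the video; gains/clamps are illustrative.
tello = Tello()
tello.connect()
tello.streamon()
tello.takeoff()

detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

try:
    while True:
        frame = tello.get_frame_read().frame
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        faces = detector.detectMultiScale(gray, scaleFactor=1.2, minNeighbors=5)

        yaw, up_down = 0, 0
        if len(faces) > 0:
            x, y, w, h = max(faces, key=lambda f: f[2] * f[3])   # largest face
            err_x = (x + w / 2) - frame.shape[1] / 2             # horizontal offset, px
            err_y = (y + h / 2) - frame.shape[0] / 2             # vertical offset, px
            yaw = int(max(-100, min(100, 0.25 * err_x)))         # turn toward the face
            up_down = int(max(-100, min(100, -0.25 * err_y)))    # re-center vertically

        tello.send_rc_control(0, 0, up_down, yaw)
        cv2.imshow("tello", frame)
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break
finally:
    tello.land()
    tello.streamoff()
```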

Read more…

Pioneering scientists lead marine mammal research using drones in the most epic ways.


Whales and dolphins face countless and growing threats in the oceans, ranging from pollution and ship strikes to entanglement in fishing gear and climate change, among others. Many species remain critically endangered. Ocean Alliance is actively tackling the daunting task of acquiring more and better data with the objective of protecting and saving these beautiful creatures. But collecting data on marine mammals is no easy task: it requires a permanent, consistent effort, the most resourceful brains, and the best tools. Therefore, Ocean Alliance developed its Drones for Whale Research (DFWR) program, in which it has been actively utilizing drones in the most creative ways. Apart from using multirotor drones that fly a few meters above the whales, getting right into the mist of their exhaled blow to collect biological information (SnotBot), they are also using fixed-wing amphibious drones to capture a broad range of data on the whales and their habitat.

Dr. Iain Kerr, CEO of Ocean Alliance, recalls after using the Aeromapper Talon Amphibious for over eight months:

“Ocean Alliance is a Conservation Science organization, in that we collect data so that we can best advise wildlife managers and policy makers on strategies to help conserve endangered marine species. Alas Oceanography has long been a rich man’s game, prior to our acquisition of an Aeromapper Talon Amphibious it has been a real challenge for us to collect the types of persistent, consistent offshore data on whales that we would like.

Our Aeromapper Talon Amphibious, has met all of the key touchstones that we look for in a new tool, affordable, field friendly, user friendly, scalable and easily modified or updated. I really think that you have made the right choice with regards to hand launch and water landing.  VTOL fixed wing drones (and I have tested a few) just seem to chew up too much battery time taking off and landing.  While our work to date has been primarily flying our Aeromapper Talon Amphibious on Stellwagen banks to find and count humpback whales, we have been approached by two aquaculture groups (principally mussels & oysters) and a field marsh conservation group to demonstrate the capacity of our Aeromapper Talon Amphibious for their use cases.

Ocean Alliance’s Drones For Whale Research program has been receiving international interest, our lead drone initiative SnotBot has been featured on BBC Blue Planet Live and twice on National Geographic, once with Will Smith in the series One Strange Rock.  Also, more than 18 groups worldwide now use protocols we have developed with SnotBot for whale research. We are excited to have added the Aeromapper Talon Amphibious to our drone stable and I am sure will have more exciting news to report on when we get back into the field.

I think that the Aeromapper Talon represents one of the best solutions out there for offshore research, monitoring and mapping. The hand launch water landing option makes the best use of battery and support resources. Those engaged in the new blue economy need the type of solution that Aeromao Inc., is offering, affordable, field friendly, scalable, robust and user friendly!”

 

 

 


Read more…

AMX UAV Officially Launches Vertic VTOL Drone

After three years of development and testing, AMX UAV has officially launched Vertic, a quad-tailsitter VTOL drone designed specifically for aerial mapping missions. Vertic has copter-like (helicopter and multicopter) capability and can take off from a narrow area. After reaching a safe altitude, Vertic transitions to fixed-wing mode for more efficient flight, then transitions back to copter mode once the mission is finished and it is directly above the landing position. Vertic can be used for various applications such as agriculture, plantation, urban planning, mining, forestry, and oil & gas.


Vertic can fly fully autonomously, from take-off to landing, with preconfigured mission parameters. Vertic is equipped with a failsafe system for safety, for example a quadchute: this feature automatically switches Vertic to copter mode if the flight controller detects an anomalous loss of altitude. Users with multicopter skills can easily control the drone manually, since the controls are similar to a multicopter's. This ease of operation minimizes time spent on training.


Vertic uses composite materials to ensure toughness and reliability. With a detachable fuselage-wing design and a 1.4-meter wingspan, the drone is very compact and easy to deploy. Vertic is powered by a 14 Wh lithium-polymer battery. The drone can fly for up to 35 minutes or cover about 250 hectares in a single aerial mapping mission. The Vertic ground control station (GCS) software is supported on both Windows and Android operating systems and connects to the drone at ranges of up to 15 km. Two types of payload can be integrated with Vertic: the Sony RX0 and the Parrot Sequoia+. The Sony RX0 is a 15.3-megapixel RGB camera with a Carl Zeiss fixed lens. For agriculture, forestry, and plantation missions, the Parrot Sequoia+ multispectral camera can be used for deeper analysis.


All the advantages that Vertic has to offer are also supported by a very competitive price. The Vertic standard package is offered at $8,500 and includes a ready-to-fly drone, a Sony RX0 camera, a GCS laptop, three batteries, two sets of spare propellers, tools, and a hard case. The standard package also includes a training program for two operators, held in Yogyakarta, Indonesia. Contact us by email at admin@amx-uav.com or by phone at +62-811292565 (WhatsApp/voice call/message) to get more info about Vertic and other AMX UAV products.

Read more…

The threat from physical intrusion remains one of the top concerns in both commercial and non-commercial contexts. According to a report from Markets and Markets, the video surveillance market, which includes both hardware and software, is presently valued at USD 45.5 billion and is expected to reach USD 74.6 billion by 2025.

Over the years, there have been many advancements in optics and detection systems but limitations still exist in the conventional ways of using them. To overcome these limitations, security stakeholders are now incorporating drone technology in their operations.

In this blog, we will talk about drones and the FlytNow solution for perimeter security.

What is perimeter security?



Perimeter security is an active barrier or fortification around a defined area to prevent all forms of intrusion. Modern security systems are an amalgamation of sophisticated hardware and software that generally include cameras, motion sensors, electric fencing, high-intensity lights, and a command center to manage them all. 

Challenges with conventional security systems (without drones) for perimeter security


Below are some of the drawbacks and limitations that are inherent in a conventional security system:

  • CCTV cameras and motion detectors are stationary, thus leaving plenty of room for blind spots.
  • Patrolling requires human guards - for larger areas, this is the least efficient way of securing premises.
  • Response to an intrusion is delayed since a human responder has to reach the location.

Benefits of using drones for perimeter security


Drones have the following advantages over a conventional security system:

  • Drones are mobile flying machines that can go to any location quickly, with HD camera(s), thus eliminating blind spots.
  • Drones can also be equipped with thermal cameras, which are useful during nighttime surveillance.
  • Drones can be automated for patrolling using the FlytNow cloud-connected solution and commercially available DiaB (Drone in a Box) hardware.


Note: A DiaB is box-like hardware that houses one or more drones. The hardware keeps the drone flight-ready (24x7) and also automates the launching and docking processes of a drone.

Drone automation for security


For perimeter security, drones are generally used in conjunction with Drone-in-a-Box hardware and a fleet management system that powers the command center. Other security system hardware, including CCTV cameras, motion sensors, etc. can complement the drones and can be connected to the command center, thus integrating into a complete system. In a real-life scenario, such a system might work in the following way:


  • An intrusion is detected by one of the CCTV cameras in an area under surveillance. 
  • The command center receives the alert and initiates a drone launch. 
  • A connected DiaB receives the launch request and releases a drone. 
  • The drone flies to the location where the intrusion was detected and begins streaming a live video feed. 
  • An operator maneuvers the drone to cover all blind spots.
  • On finding the intruder, the operator has the option to warn him/her about the transgression using the drone’s onboard payload such as a beacon, spotlight, speaker, etc.


To know about the kind of drones and sensors that can be used for security and surveillance operations please refer to our Drone Surveillance System: The Complete Setup Guide.

How does FlytNow enable perimeter security?


FlytNow is a cloud-based application that helps in managing and controlling a fleet of drones from a unified dashboard through automation, live data streaming and integration. In the context of perimeter security, this translates into a command center that connects drones with the traditional components of a perimeter security system.

6 Reasons to use FlytNow for perimeter security


#1 Easy Setup: FlytNow is cloud-hosted, i.e., a user can access the application from any standard web browser, without any complicated server setup. Connecting the drones with the system is also easy and is done using FlytOS.

#2 Unified Dashboard: FlytNow features an advanced dashboard that shows the following:

  1. A live map showing the real-time location of all the drones. The map can be customized to show points of interest, virtual geofences, and CCTV zones.
  2. On-screen GUI controllers and keyboard & mouse support to control a drone. This allows an operator to easily maneuver a drone to a point of interest from the command center.
  3. Multicam support that allows streaming video feeds from more than one drone.
  4. Different view modes that allow an operator to switch between RGB and thermal mode. In thermal mode, there is the option to switch between different color palettes, allowing a user to identify warm objects against different backdrops.
  5. A pre-flight checklist, which is a list of checks the system prompts an operator to perform before initiating a drone launch.


#3 Live Data Sharing: An operator can share the live video feed from a drone directly from the dashboard. The feature can be used to share video with the police or other remote stakeholders.


#4 Advanced Automation: Operating drones through manual control is an inefficient way to use them. Instead, automation should be employed to perform activities like security patrols. FlytNow comes with an advanced mission planner that allows a user to define a path for a drone to follow and save it as a mission. The mission can be executed periodically, thus making a fleet of drones perform automated patrolling.


#5 Add-on Modules: FlytNow provides add-ons to make a drone intelligent; this includes precision landing over a computer-generated tag, obstacle detection, and object identification. These add-ons enable a drone to autonomously fly to a location, identify a threat, and return to the DiaB hardware.

#6 Drone-in-a-Box Hardware Support: The functions of DiaB hardware, in the context of perimeter security, can be broadly classified into four categories:

  1. Securely housing a drone.
  2. Keeping the drone fully charged at all times.
  3. Initiating a drone launch.
  4. Successfully docking a returning drone.

Summary


In this blog, we discussed the concept of perimeter security, the limitations of conventional security setups, and how these limitations can be overcome using drones. Then we covered how drones are actually used for aerial patrols and 6 reasons why FlytNow is an ideal solution for automating drones for perimeter security.

There are plenty more reasons to use FlytNow for perimeter security, which you can discover by signing up for our 7-day free trial.

Read more…

I am looking for a collaboration with a drone company to do a high-altitude flight test. We are starting a small company called Loweheiser, developing EFIs (electronic fuel injection systems) for small UAV engines. Right now we are developing a throttle body for an RCGF 15RE, and I also want to design one for the 10cc engine from the same brand. The 10cc would probably be the smallest electronic-fuel-injection engine ever made. At the moment the smallest engines we have worked on are the Saito FG-21 and the FG-14; the FG-14 has a 7 mm throttle body!


Many clients ask me how high the EFI works. I tell them that the EFI can operate at any altitude that a fixed-wing UAV can reach. The absolute pressure sensor of the ECU can measure from 15 to 115 kPa, and 15 kPa corresponds to roughly 13,500 m in the standard atmosphere, so the sensor is not the limiting factor. We have tested at 2,000 m because that is the maximum altitude we can reach within a few hours of travel, but I would like to test it in flight at higher altitudes.
In my initial calculations I see that a normal plane could reach 10,000 m without problems. A plane of about 2 m wingspan and about 3 kg of weight consumes less than 100 W to maintain cruise.

The 15cc engine produces about 1.54 kW. The engine loses power roughly linearly with the loss of atmospheric density. The density of the atmosphere at 25ºC at sea level is 1.12720 kg/m³, and the density at -20ºC and 10,000 m is 0.453337 kg/m³, so at altitude the engine delivers about 40% of its sea-level power, roughly 600 W. Even allowing another 50% of losses for propeller performance and other inefficiencies, there are still about 300 W available, which is more than enough to fly at 10,000 m.
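A quick check of the arithmetic above, using the figures quoted in this post:

```python
# Quick check of the power estimate, using the numbers from the post.
rho_sea = 1.12720     # kg/m^3 at 25 degC, sea level (as quoted in the post)
rho_10k = 0.453337    # kg/m^3 at -20 degC, 10,000 m (as quoted in the post)
p_sea_level = 1540.0  # W, rated power of the 15 cc engine

density_ratio = rho_10k / rho_sea                 # ~0.40
p_at_altitude = p_sea_level * density_ratio       # ~620 W shaft power
p_useful = p_at_altitude * 0.5                    # ~310 W after prop/other losses

print(f"density ratio: {density_ratio:.2f}")
print(f"shaft power at 10,000 m: {p_at_altitude:.0f} W")
print(f"usable power after 50% losses: {p_useful:.0f} W  (cruise needs < 100 W)")
```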


Where could this high-altitude test be done?
I am looking for a collaboration with a drone company that has experience flying in airspace where this test can be done and that can obtain permission from the relevant authorities.

Best regards, jlcortex

Read more…

I feel like this has been done many times before, including by the Skydio team when they were at MIT, but DARPA was impressed:

 

From JHU:

Johns Hopkins APL Helps DARPA OFFSET Program Take Flight

​In the video above, the APL team’s fixed-wing unmanned aerial vehicle launches in fierce winds from approximately 100 meters away from its target and successfully completes two passes through the crowded urban area while maintaining three meters of space from the target building.

Credit: Johns Hopkins APL


​In a second test, the team launched from 250 meters out, flying four times faster than the quadcopters, around a bigger building and under an overpass, autonomously and without crashing.

Credit: Johns Hopkins APL

With each second of the video that ticks away, the suspense builds. Joseph Moore launches a fixed-wing unmanned aerial vehicle (UAV) into the air and it’s buffeted by the wind. Undeterred, the UAV goes about its task to navigate around buildings at high speeds in an urban environment.

The wind picks up at points, and the neon-green fixed-wing UAV steadies itself on those occasions. But ultimately, it navigates the course adeptly, coming within about 10 feet of the buildings and steering around them with relative ease. Most importantly: it doesn’t crash.

“That was a gate for us to get through,” said Moore, the project manager of the Research and Exploratory Development Department team that ran the test at Joint Base Lewis-McChord in Washington state this past August. “We’d never tested anything in an actual physical environment, so proving what we did was huge.”

The test was part of the Defense Advanced Research Projects Agency (DARPA) OFFensive Swarm-Enabled Tactics (OFFSET) program, which envisions swarms of up to 250 collaborative autonomous systems providing insights to ground troops as they operate in dense metropolitan environments. The program is about four years old, Moore said, and it’s unique in structure because the two swarm system integrators — Northrop Grumman and Raytheon — are creating the testbeds and simulation environments for crafting tactics for large-scale autonomous swarms in urban environments.

“OFFSET is developing a variety of swarm-enabling technologies,” said Timothy Chung, the DARPA OFFSET program manager, “from a rich repository of swarm tactics, to virtual environments for swarm simulation, to physical testbeds with real robots where these swarm tactics can be demonstrated in real-world settings.”

This specific test was an effort to answer Moore’s team’s central question for this phase of the project, known as sprints: could fixed-wing UAVs have quadcopter UAV agility and mobility but add greater range, endurance and speed, given that they were fixed-wing in form?

“Imagine you have a futuristic sensor on your aircraft that could, theoretically, map the interior of a building and produce a floor plan,” Moore explained. “You want to put that sensor on a fixed-wing UAV, fly really fast and really close to the building, and come away with a rapid interior scan of the floor plan.

“We’re not there yet, but our goal was to control the vehicle at high speeds in an urban, outdoor environment and do multiple passes around the target building without hitting it.”

UAVs are typically thought of as propeller-armed quadcopters, but previous Independent Research and Development (IRAD) work featuring aerobatic maneuvers with fixed-wing UAVs put APL in an advantageous position to push OFFSET’s fourth sprint forward.

The team took that base work from the IRAD, including its Aerobatic Control and Collaboration for Improved Performance in Tactical Evasion and Reconnaissance (ACCPITER) technology, and spent the first six months of the sprint working with virtual aircraft in a virtual world and the final six using a physical aircraft in a virtually constructed environment.

Using a mesh — a virtual map — of previous DARPA field tests in urban environments, the team flew their fixed-wing UAVs in a virtual world on APL’s campus. They tested, developed the proper algorithms and software, and worked to program an essentially “off-the-shelf” aircraft with a bespoke APL-developed electronics package and software.

They did all that at the Lab, but until they trekked to Washington in August, they hadn’t tested it in the physical world. The vehicle’s performance in the virtual world was good; its performance in the physical world was exceptional.

In fierce winds, the team launched the craft from approximately 100 meters away, successfully completed two passes through the crowded urban area while maintaining three meters of space from the target building, and then pushed the test to a 250-meter launch, flying four times faster than the quadcopters around a bigger building and under an overpass, autonomously and without crashing.

The program’s fifth sprint is underway, and Moore said this period will focus on adding larger numbers of fixed-wing vehicles operating in urban environments together. The groundwork laid in Sprint 4, especially in validating vehicle performance in the physical world, will be crucial as the team moves forward to address more challenging and complex urban swarm scenarios.

Read more…

Long range autonomous boat with ArduPilot

From Hackaday:

Thanks to the availability of cheap, powerful autopilot modules, building small autonomous vehicles is now well within the reach of the average maker. [rctestflight] has long been an enthusiast working in this space, and has been attempting long range autonomous missions on the lakes of Washington for some time now. His latest attempt proved to be a great success. (Video, embedded below.)

The build follows on from earlier attempts to do a 13 km mission with an airboat, itself chosen to avoid problems in early testing with seaweed becoming wrapped around propellers. For this attempt, [Daniel] chose to build a custom boat hull out of fiberglass and to combine underwater propellers with a fan. The aim was to provide plenty of thrust while also adding redundancy. As a bonus, the fan swivels with the boat’s rudder, helping provide greater turn authority.

After much tuning of the ArduPilot control system, the aptly-named SS Banana Slug was ready for its long range mission. Despite some early concerns about low battery voltages due to the cold, the boat completed its long 13 km haul across the lake for a total mission length of over three hours. Later efficiency calculations suggest that the boat’s onboard batteries could potentially handle missions over 100 km before running out.

Read more…

Sky-Drones SmartLink Update


 

Sky-Drones Technologies powers drones for enterprise business solutions. As the creator of full-stack UAV avionics technology, the company aims to accelerate the development and adoption of UAVs for enterprise. This year started off strong for Sky-Drones with an upgrade to their product range: the new and improved SmartLink.

Each SmartLink set contains an air module and a ground module. The air module is attached to the UAV, whilst the ground module is connected to the ground control station. Improvements made to the air unit mean the system is capable of handling two real-time HD video streams from CSI and HDMI cameras, the CSI camera itself being an additional element added to the SmartLink set. Alongside this, a wide array of payload interfaces is now embedded in the unit, including USB, UART, I2C, and SPI, allowing tight integration with Sky-Drones’ hardware and software.


Improvements to the ground control device include making this compact and lightweight module with a micro-USB connector, allowing interaction with multiple devices including laptops, smartphones, tablets, and desktop computers. The addition of an adaptive fan provides a cooling system that allows the module to withstand high ambient temperatures, making the entire system that much more reliable in harsh environments and improving longevity on lengthier missions.

The drone datalink and integrated onboard computer have a standard range of up to 20 km. However, an unlimited flight range can be achieved thanks to Sky-Drones’ advancements in LTE connectivity. Connecting your UAV via LTE enables effectively unlimited range, as long as your UAVs remain within the LTE coverage area. Planning to cover a much larger area than your coverage zone? Our specialised antennas can ensure your range reaches several dozen kilometres further afield. For single or cluster drones in the surveillance, search and rescue, and delivery sectors, there has never been a more obvious solution for multi-drone control with unlimited possibilities.


In order to bring these additions into view, Sky-Drones founder and CEO Kirill Shilov hosted a live webinar on Wednesday, 24 February. This webinar included a full product unboxing, a demonstration of compatibility with Sky-Drones software, and a live Q&A with members of the audience. As a result, all the asked and answered questions have become a permanent feature on the SmartLink product page on the Sky-Drones website.

The new and improved SmartLink hardware works in tight unison with Sky-Drones SmartAP GCS software, allowing completely autonomous flight control and communications from anywhere in the world. Now, pilots and drones no longer need to be in the same vicinity, allowing BVLOS flights to be planned and initiated quicker than ever before. Flight data and logs in Sky-Drones Cloud can also be accessed worldwide by everyone who is granted access under your personalised company login.

This product update has been a major advancement for the Sky-Drones team. Meeting the needs of enterprise business solutions and raising the ceiling of UAV excellence is what drives the company, and making sure our products excel in every possible way is what we will continue working towards. Please visit the new SmartLink product page for more information and detailed technical specifications.

 

Read more…

Simple Waypoint Navigation for Fixed-Wing UAVs

There is a simple way to implement waypoint navigation. The algorithm was introduced by Lawrence, Frew, and Pisano back in 2008. You can view the paper here: http://dx.doi.org/10.2514/1.34896.

I have flight tested this algorithm multiple times on a low-resource autopilot embarked on a fixed-wing UAV (see figure below). In this blog post, I want to share my approach to implementing this algorithm. I cannot claim any academic novelty. However, I hope to make this simple and effective algorithm more accessible to other researchers/hobbyists.

[Figure: the fixed-wing UAV used for flight testing]

In the remainder of this post, I will provide a high-level view of the algorithm. I will also show some flight test results. If the reader wants a more mathematical treatment of the implementation, the reader is referred to the attached PDF.

The algorithm is simple because it uses a single guidance vector field. Vector fields, in general, assign a vector to each point in space. Guidance vector fields, in particular, assign a desired velocity vector to each point in a 2D plane. The vector fields are designed such that they guide the vehicle to a particular path. In this case, the vector field brings the vehicle into a loiter circle. The following figure shows the vector field. The vector field also generates a desired acceleration vector. The acceleration vector is needed for accurate path following.
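As a rough illustration, here is a minimal sketch of one such Lyapunov-style loiter field; the exact field and gains used in the paper and in the attached PDF may differ.

```python
import numpy as np

# Illustrative Lyapunov-style guidance field that pulls the vehicle onto a
# loiter circle and circulates around it.  Not the authoritative form.

def loiter_velocity(pos, center, radius, speed, circulation=+1):
    """Desired 2D ground velocity at `pos` for a loiter circle.

    circulation > 0 circulates clockwise, circulation < 0 counter-clockwise
    (matching the sign convention used later in the post).
    """
    rel = np.asarray(pos, float) - np.asarray(center, float)
    d = np.linalg.norm(rel) + 1e-9              # distance from the circle center
    x, y = rel
    s = 1.0 if circulation > 0 else -1.0
    # Radial part drives d toward `radius`; tangential part circulates.
    vx = -(x * (d**2 - radius**2) - s * y * 2.0 * radius * d)
    vy = -(y * (d**2 - radius**2) + s * x * 2.0 * radius * d)
    v = np.array([vx, vy])
    return speed * v / (np.linalg.norm(v) + 1e-9)   # command a constant speed
```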

[Figure: guidance vector field drawing the vehicle onto a loiter circle]

We feed the guidance commands from the guidance vector field into the lateral control system. However, the input to the lateral control system is a bank angle (see figure below). The block labeled “AC” represents the aircraft dynamics and the block labeled “Ail” represents the dynamics of the aileron actuator.

[Figure: block diagram of the lateral control system]

Thus, it is necessary to convert the desired velocity and the desired acceleration into a desired bank angle. We use a two-step conversion.

First, we convert the guidance commands into a lateral acceleration command. The lateral acceleration command has two terms. The first term drives the angular misalignment between the vehicle velocity and the desired velocity to zero. The second term incorporates the desired acceleration vector; the second term functions as a feed-forward acceleration term needed for accurate path tracking.

Second, we convert the lateral acceleration command to a desired bank angle. The relationship between lateral acceleration and bank angle is illustrated in the figure below. By banking, the lift force attains a lateral component, which produces the desired lateral acceleration.
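Here is a minimal sketch of the two-step conversion; the gain and the exact form of the feed-forward term are illustrative, and the attached PDF gives the actual control law.

```python
import numpy as np

G = 9.81  # m/s^2

def lateral_accel_command(v, v_des, a_des, k_align=1.0):
    """Step 1: misalignment term plus a feed-forward acceleration term."""
    v, v_des, a_des = (np.asarray(u, float) for u in (v, v_des, a_des))
    heading = np.arctan2(v[1], v[0])
    heading_des = np.arctan2(v_des[1], v_des[0])
    err = np.arctan2(np.sin(heading_des - heading),
                     np.cos(heading_des - heading))   # wrapped heading error
    speed = np.linalg.norm(v) + 1e-9
    n = np.array([-v[1], v[0]]) / speed               # unit vector normal to velocity
    a_ff = float(np.dot(a_des, n))                    # lateral part of desired accel
    return k_align * speed * err + a_ff

def bank_angle_command(a_lat):
    """Step 2: in a coordinated turn, lateral acceleration = g * tan(bank)."""
    return np.arctan2(a_lat, G)
```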

[Figure: relationship between bank angle and lateral acceleration]

Having described basic loiter circle tracking, we are ready to move on to waypoint navigation. The waypoint navigation routine is actually loitering in disguise: the routine positions loiter circles so that the vehicle smoothly intersects each waypoint in succession. The positioning algorithm is shown in the figure below. The positions of the previous waypoint, current waypoint, and next waypoint are denoted by A, B, and C, respectively. The center point lies along the bisector of angle ABC. The loiter radius sets the distance between the center point and the current waypoint. Having determined the loiter center, the next step is to determine the sign of the circulation constant. If C is to the left of the line AB, then the circulation is counter-clockwise (<0). If C is to the right of the line AB, then the circulation is clockwise (>0).
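A minimal sketch of this placement rule is given below; the vector handling is illustrative, and the attached PDF has the authoritative construction.

```python
import numpy as np

def place_loiter(A, B, C, radius):
    """Loiter circle through current waypoint B, given previous A and next C."""
    A, B, C = (np.asarray(p, float) for p in (A, B, C))
    u_ba = (A - B) / (np.linalg.norm(A - B) + 1e-9)
    u_bc = (C - B) / (np.linalg.norm(C - B) + 1e-9)
    bisector = u_ba + u_bc
    if np.linalg.norm(bisector) < 1e-9:               # A, B, C collinear
        bisector = np.array([-u_ba[1], u_ba[0]])      # any perpendicular works
    bisector /= np.linalg.norm(bisector)
    center = B + radius * bisector                    # circle passes through B

    # Sign of the circulation: is C to the left or right of the line A -> B?
    cross = (B[0] - A[0]) * (C[1] - A[1]) - (B[1] - A[1]) * (C[0] - A[0])
    circulation = -1 if cross > 0 else +1             # left: CCW (<0), right: CW (>0)
    return center, circulation
```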

[Figure: placement of the loiter circle relative to waypoints A, B, and C]

A nice feature of the positioning routine is that it can work with only two waypoints. The waypoints are stored in a circularly linked list. Hence, the "next" waypoint and the "previous" waypoint can point to the same waypoint. 

Next, we describe the algorithm that governs how the aircraft switches from one waypoint to the next. 

The positions of the current waypoint and next waypoint are denoted by A and B, respectively. Let LA denote the loiter circle that brings the aircraft into A. Suppose the aircraft has just "hit" the current waypoint. The navigation routine sets the current waypoint to B and computes the parameters of LB, the loiter circle that brings the aircraft into B.

Now, the straightforward approach is to immediately switch from using LA to LB. However, this approach will change the guidance vector in an abrupt manner. To achieve a smooth transition between waypoints, the switch from LA to LB occurs when the velocity vectors from both loiter circles are pointing in roughly the same direction.  A smooth transition protects the low-level control system from large changes in the input command.

The figure below illustrates the switching algorithm. In the top plot, the aircraft is about to hit A. In the second plot, the aircraft has hit A and has set up the loiter circle of B. The aircraft, however, continues tracking the loiter circle of A. In the third plot, the guidance vectors from both loiter circles are aligned. The aircraft begins tracking the loiter circle of B.  In the fourth plot, the aircraft is en route to B.
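A minimal sketch of the switching test follows; the two desired velocities would come from the guidance field of each loiter circle (for example, the loiter field sketch above), and the alignment threshold is a guess.

```python
import numpy as np

def should_switch(v_des_current, v_des_next, align_cos=0.95):
    """Switch circles once the two desired velocities point roughly the same way."""
    c = np.dot(v_des_current, v_des_next) / (
        np.linalg.norm(v_des_current) * np.linalg.norm(v_des_next) + 1e-9)
    return c > align_cos
```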

[Figure: the four stages of switching from one loiter circle to the next]

I implemented the guidance system described herein using fixed-point arithmetic on the AMP autopilot, which is made in-house by our research group (see https://doi.org/10.2514/1.I010445). The microprocessor belongs to the dsPIC33 family of microprocessors made by Microchip.

Flight test results are shown in the figures below (map data ©2020 Google). The test took place at the Flying Gators RC Airport in Archer, FL. The first plot shows the flight path of the delta wing UAV performing waypoint navigation with four waypoints. The waypoints are positioned to create a figure-eight trajectory. The second plot shows the flight path of the delta wing UAV performing waypoint navigation with two waypoints. You can view a synthetic video of the flight test here: https://youtu.be/otRW2_80G0U. The video is reconstructed from downlinked telemetry data. You can view an actual video of a portion of the flight test here: https://youtu.be/jQyc3_tk7MA.

[Figure: flight path for the four-waypoint (figure-eight) mission]

[Figure: flight path for the two-waypoint mission]

In the first plot, we note that there is a clear asymmetry in the flight pattern. This asymmetry was due to the aircraft being out of trim. When I examined the roll data after the flight test, I found that the aircraft was better at tracking negative roll commands as opposed to positive roll commands. Hence, the asymmetry has to do with the control system, not the guidance system.

In conclusion, this blog post provides an overview of a simple and effective waypoint navigation scheme. For a more mathematical treatment, the reader is referred to the attached PDF. I have also attached a MATLAB code that simulates loiter circle tracking. The vehicle dynamics are represented using a matrix state-space model.

simple_waypont_navigation.pdf

code.zip

Read more…

Stereoscopic systems are widely used in drone navigation, but this project takes a new approach: a variable baseline.

In the related Arxiv.org paper (PDF), the team showcases three different applications of this system for quadrotor navigation:

  • flying through a forest
  • flying through a static or dynamic gap of unknown shape and location
  • accurate 3D pose detection of an independently moving object

They show that their variable baseline system is accurate and robust in all three scenarios.
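As background on why a variable baseline matters, here is the standard rectified-stereo depth relation with illustrative numbers (not taken from the paper):

```python
# For a rectified stereo pair, depth is Z = f * B / d (f in pixels, B the
# baseline, d the disparity), so the depth error caused by a one-pixel
# disparity error grows roughly as Z^2 / (f * B).  A wider baseline therefore
# gives better depth accuracy at range; a narrower one suits close obstacles.
f_px = 700.0                         # assumed focal length in pixels
Z = 10.0                             # object at 10 m
for B in (0.1, 0.3, 0.6):            # candidate baselines in meters
    dZ = Z**2 / (f_px * B)           # error per pixel of disparity error
    print(f"baseline {B:.1f} m -> ~{dZ:.2f} m depth error per pixel at {Z:.0f} m")
```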

For the video capture, the Raspberry Pi-based StereoPi board was used. Additional AI-acceleration hardware (Intel Movidius) is being considered as a next step, as well as using the more powerful CM4-based version of the StereoPi (v2).

Here is a brief video of the project:

 

Read more…

Full VR Drone

Hello, people.

I decided to open a new thread to register some progress about this project. I am working on it in my spare time, and I feel excited about it.

This is a very preliminary video -- please disregard the pink view; I was using an infrared camera -- with a normal camera the video is much clearer and more interesting. I also need to find a better 180º (or better yet, 230º) lens. I am also willing to go full 360 degrees eventually.

So, it is an experimental virtual-reality-controlled drone with some AR features, which I will show when I finish implementing them. The plan is to have one or more people walking around and controlling the drone as if they were truly inside the physical aircraft. Much like traditional FPV, but not limited to the video view only, as the crew will be able to interact and walk around freely while they fly.

I have lots of ideas to implement on this experiment. I plan to talk about them while they are implemented.

At first I wanted to show it only after I had a real flight video, but for a long time I had no opportunity to bring all the equipment to a secure place for a flight. With the pandemic, I can't get to a safe place where I feel secure with my Oculus Quest and all the related hardware. Hopefully the next video will be more interesting, as it will show a real flight with a non-infrared camera, plus all the interactions from arming the drone to taking off, flying around, and landing back, while showing some AR overlays.

Please note that this is NOT a fake video; I can already control this quadcopter from start to end using an Oculus Quest. I was using my conventional home 2.4 GHz Wi-Fi when I recorded the video, but I could also use raw Wi-Fi (much like Befinitiv's WifiBroadcast, although I do have my own implementation from scratch) or 4G. It is always low latency, using a simple UDP protocol I made myself.

The flying hardware has a Raspberry Pi commanding a Pixhawk through serial MAVLink, while exchanging information with the VR headset through a ground-based RPi (this ground Pi has a 1 W sunhaus and a Yagi antenna for extended raw Wi-Fi range). The 3D part shown on the Oculus Quest was done using Unity3D for now -- I am not sure if I will keep that path, though. Unity is a nice engine and quick to develop on, but it has its drawbacks when used on custom projects like this. I do have my own 3D engine and a second implementation of the same environment (except for the nice hands), which might take over in the future. My low-level engine is very lightweight and can also run on WebAssembly if desired. It does not have an editor yet, though, and is harder to use, so I am undecided in that regard. =)
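For the curious, here is a minimal pymavlink sketch of what the Raspberry Pi to Pixhawk side of such a setup can look like. The protocol described above is custom, so the serial device, headset address, and JSON payload below are purely illustrative.

```python
import json
import socket
from pymavlink import mavutil

# Illustrative companion-computer loop: read MAVLink from the Pixhawk over
# serial and forward a few fields to the headset over UDP.  The real system
# uses its own low-latency UDP format; this is only a sketch.

HEADSET_ADDR = ("192.168.1.50", 14600)            # assumed headset IP/port
link = mavutil.mavlink_connection("/dev/serial0", baud=921600)
link.wait_heartbeat()                             # block until the Pixhawk responds

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

while True:
    msg = link.recv_match(type=["ATTITUDE", "GLOBAL_POSITION_INT"], blocking=True)
    if msg.get_type() == "ATTITUDE":
        packet = {"roll": msg.roll, "pitch": msg.pitch, "yaw": msg.yaw}
    else:
        packet = {"lat": msg.lat / 1e7, "lon": msg.lon / 1e7,
                  "alt": msg.relative_alt / 1000.0}
    sock.sendto(json.dumps(packet).encode(), HEADSET_ADDR)
```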

This is planned to be free and open-sourced, when it reaches a certain maturity level.

Read more…

Renewable-energy drones are expected to expand at a rapid pace as the industry focuses on UAV services, line-of-sight applications, ocean-going ship tracking, wind turbine inspection, inspection of offshore platforms and refineries, and monitoring of power lines and solar panels in the energy sector. Passenger drones have been touted as one way to alleviate congestion and enhance air quality in urban areas. Equipped with thermal cameras, drones make it possible to conduct inspections rapidly and at scale. On wind farms around the world, drones are changing inspections. Wind turbines, both onshore and offshore, are left exposed to the elements as they run, and even minor damage can cause inefficiencies and wasted energy. By offering rapid and remote coverage of turbines, drones reduce the time engineers need to spend in precarious positions. The technology is also much cheaper than a manned team, so wind farms can carry out drone inspections of wind turbines with greater regularity to keep operations running at 100%.

According to the published research report, the renewable drones market is projected to surpass USD 152 million by 2030, up from USD 42 million in 2019, at a CAGR of 26.5% over the forecast period (2020-30).

The key factors driving demand for renewable-energy drones are the growing adoption of drones to reduce inspection costs through asset optimization and the increasing construction of solar and wind farms. Rising environmental concerns and the growing use of clean energy alternatives are also likely to drive global demand for these drones during the forecast period. In addition, recent technical developments and international agreements have enabled countries worldwide to shift toward renewable energy and develop their energy infrastructure, which is expected to expand global demand for renewable drones over the coming years.

Get a Free Sample Copy of Research Report Here: https://www.fatposglobal.com/sample-request-398

Multirotor Segment to Grow with the Highest CAGR During 2020-30

The Renewable Drones Market is segmented by type into multirotor and fixed-wing. The multirotor segment accounted for the larger market share in 2018 owing to various advantages over fixed-wing drones: multirotor aircraft can take off and land vertically, need less space to take off, can hover mid-flight, and can manoeuvre around objects, which makes inspection, visualization, and modelling easier. Because multirotor drones use multiple propellers to manoeuvre, they also do not need the larger surface area or wingspan of fixed-wing drones. In addition, multirotor drones are designed to fold down and pack into smaller cases, making them simpler to transport.

Solar Segment to Grow with the Highest CAGR During 2020-30

The Renewable Drones Market is segmented by end-user into solar and wind, with the solar segment further categorized into solar PV and solar CSP. The solar segment held the maximum market share of XX.X% in 2018: to meet the growing demand for solar farm inspection and maintenance, asset owners, inspectors, and drone service providers (DSPs) must develop a deep understanding of thermography and flight operations to take full advantage of drone-based solar inspection, and these factors are driving the solar segment's growth in the renewable drones market. Increased emissions, high reliability, and the depletion of non-renewable energy sources are some of the main propellants of the market's organic growth.

Growing Construction of Solar and Wind Farms

The renewable energy sector is among the fastest-growing sectors worldwide. With advancing technology and rising demand for clean, sustainable energy, renewable energy plants are being built at a rapid rate, and countries are shifting their emphasis from traditional energy sources toward renewable energy production. Wind power has grown at a CAGR of more than 21% since 2000. In addition, onshore wind installations are expected to generate demand for new wind turbines as well as the replacement of old ones. Wind power plant construction is capital-intensive, and asset owners aim to maximize returns and minimize investment; this is where drones enter the picture. Drones can help reduce wind turbine inspection costs by at least 40%.

Rising Adoption of Drones to Reduce Cost of Inspection Operation

The growth of the energy-industry drone market is mainly driven by the difficulty of inspecting and monitoring remote and discrete systems. Renewable drone inspection removes the need for inspection staff to work at height, and it decreases maintenance time by helping determine whether a repair must take place immediately or can be safely postponed. Drones in the energy sector are likely to expand at a significant pace as the industry invests in UAV services, line-of-sight applications, ocean-going ship surveillance, offshore platform and refinery inspection, wind turbine inspection, power line monitoring, and solar panel monitoring.

Strict Regulations for Performing Drone Operations

Drones are extremely important to utilities for conducting inspection activities; legal provisions, however, have limited growth in the drone industry. These regulations prohibit drone operations in certain cases, such as beyond visual line of sight, over long distances, or at night. Because the FAA has not kept pace with the rapid development of drone technology, utility companies have not been able to use drones to the fullest extent possible to increase the effectiveness and quality of inspection operations. The current rules do not allow users to operate beyond visual line of sight, and they do not specify whether the use of drones by public power utilities counts as a governmental operation.

Top Market Players Are:

DJI Enterprise, Terra Drone, Cyberhawk Innovations Limited, PrecisionHawk, ULC Robotics, Sharper Shape Inc., Sky Futures, Asset Drone and YUNEEC.

For More Report Details, Visit Here

Read more…

NexuS UAV presentation

I want to share the first flight of our third fixed-wing aircraft design for autonomous flight.
The airframe was made entirely from composite materials, with molds produced on 3D printers.
Tests showed great aerodynamic efficiency, stability, and maneuverability.
Final weight with full payload is 2.6 kg, and for this flight the aircraft was ballasted to that final weight.
We estimate an endurance of 90 minutes with a 10,000 mAh 4S battery.

8529323882?profile=RESIZE_710x

8529327272?profile=RESIZE_710x
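As a sanity check on the quoted figures, the quick calculation below back-solves the average current and power implied by 90 minutes on a 10,000 mAh 4S pack. The 80% usable-capacity figure and the 14.8 V nominal voltage are illustrative assumptions, not measurements from the aircraft.

```python
# Back-of-the-envelope check of the quoted endurance figure.
# Assumptions (illustrative only): 80% of the pack is usable, 4S nominal 14.8 V.
capacity_ah = 10.0        # 10,000 mAh pack
usable_fraction = 0.8     # conservative usable depth of discharge
nominal_voltage = 14.8    # 4S LiPo nominal voltage

endurance_h = 90 / 60     # claimed 90-minute endurance
avg_current_a = capacity_ah * usable_fraction / endurance_h   # ~5.3 A
avg_power_w = avg_current_a * nominal_voltage                 # ~79 W

print(f"Implied average draw: {avg_current_a:.1f} A (~{avg_power_w:.0f} W)")
```

An average draw in the 5-7 A range is plausible for an efficient 2.6 kg fixed-wing cruiser, so the claim is at least self-consistent.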

Read more…

Centeye Modular Vision Sensors

8529205664?profile=RESIZE_710x

It has been a while since I've posted anything about recent Centeye hardware. Some of you may remember my past work implementing lightweight integrated vision sensors (both stereo and monocular) for "micro" and "nano" drones. These incorporated vision chips I personally designed (and had fabricated in Texas) and allowed combined optical flow, stereo depth, and proximity sensing in just 1.1 grams. Four of these on a Crazyflie were enough to provide omnidirectional obstacle avoidance and tunnel following in all ambient lighting conditions. Their main weakness was that they were time-consuming and expensive to make and adapt to new platforms. Some of you may also remember our ArduEye project, which used our Stonyman vision chip and was our first foray into open hardware. Although that project had a slow start, it did find use in a variety of applications ranging from robotics to eye tracking. I have discussed, privately with many people, rebooting the ArduEye project in some form.

Like many people, we faced disruption last year from COVID. We had a few slow months last summer, and I used the opportunity to create a new sensor configuration from scratch that has elements of both ArduEye and our integrated sensors. My hypothesis is that most drone makers would rather have a sensor that is modular and easy to reconfigure, adapt, or even redesign, and are OK if it weighs "a few grams" rather than just one gram. Some users even told me they would prefer a heavier version if it is more physically robust. Unlike the nano drones I personally develop, if your drone weighs several kilograms, an extra couple of grams is negligible. I am writing here to introduce this project, get feedback, and gauge interest in making it in higher quantities.

My goals for this “modular” class of sensors were as follows:

  • Use a design that is largely part agnostic, e.g. does not specifically require any one part (other than optics and our vision chip) in order to minimize supply chain disruptions. This may sound quaint now, but this was a big deal in 2020 when the first waves of COVID hit.
  • Use a design that is easy and inexpensive to prototype, as well as inexpensive to modify. We were influenced by the "lean startup" methodology. This includes making it easier for a user to modify the sensor and its source code.
  • Favor use of open source development platforms and environments. I decided on the powerful Teensy 4.0 as a processor, using the Arduino framework, and using Platform IO as the development environment.

We actually got it working. At the top of this post is a picture of our stereo sensor board, with a 5 cm baseline and a mass of 3.2 grams; below is a monocular board suitable for optical flow sensing that weighs about 1.6 grams. We have also made a larger 10 cm baseline version of the stereo board and have experimented with a variety of optics. All of these connect to a Teensy 4.0 via a 16-wire XSR cable. The Teensy 4.0 operates the vision chips, performs all image processing, and generates the output. We have delivered samples to collaborators (as part of a soft launch) who have integrated them on drones and flown them. Based on their feedback we are designing the next iteration.

8529205301?profile=RESIZE_710x

As with any new product, you have to decide what it does and what it does not do. Our goal was not extremely high resolution; such sensors already exist, and high resolution carries other costs in mass, power, and light sensitivity. Instead, we sought to optimize intensity dynamic range. The vision chips use a bio-inspired architecture in which each pixel adapts to its own light level independently of other pixels. The result is a sensor that can work in all light levels ("daylight to darkness", the latter with IR LED illumination), can adapt nearly instantaneously when moving between bright and dark areas, and can function even when both bright and dark areas are visible at once.
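As a rough software analogy of that idea (not a description of Centeye's actual analog circuitry), the sketch below normalizes each pixel against its own running intensity estimate, so bright and dark regions can be read out in the same frame.

```python
# Software analogy of per-pixel light adaptation (NOT Centeye's analog circuit):
# each pixel keeps its own running intensity estimate and is normalized by it,
# so bright and dark regions remain readable simultaneously.
import numpy as np

class PerPixelAdapter:
    def __init__(self, shape, tau=0.95):
        self.mean = np.zeros(shape)  # per-pixel adapted intensity level
        self.tau = tau               # closer to 1.0 = slower adaptation

    def process(self, frame):
        frame = frame.astype(np.float64)
        # Each pixel's reference level tracks that pixel only, independent of neighbors.
        self.mean = self.tau * self.mean + (1.0 - self.tau) * frame
        return frame / (self.mean + 1e-6)
```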

Below is an example of the stereo sensor viewing a window that is either open or closed. (Click on the picture to see it at native resolution.) The current implementation divides the field of view into a 7x7 array of distance measurements (in meters), which are shown. Red numbers are measurements that have passed various confidence tests; cyan numbers are those that have not (and thus should not be used for critical decisions). Note that when the window is open, the sensor detects the longer range to objects inside even though the illumination level is about 1% of that outside. A drone with this sensor integrated would be able to sense the open window and fly through it without suffering a temporary black-out once inside.
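For a sense of how a flight controller might consume such an output, here is a small sketch that keeps only the confident cells of a 7x7 distance grid. The encoding (a parallel boolean confidence mask) is an assumption for illustration; the sensor's real output format may differ.

```python
# Sketch of consuming a 7x7 range grid, keeping only cells that passed the
# confidence tests. The boolean-mask encoding is an assumption for illustration.
import numpy as np

def nearest_confident_obstacle(distances_m, confident_mask):
    """Return the closest confidently measured range in meters, or None."""
    valid = np.asarray(distances_m)[np.asarray(confident_mask)]
    return float(valid.min()) if valid.size else None

# Toy example: the cells looking through an open window report longer ranges.
distances = np.full((7, 7), 2.5)
distances[2:5, 2:5] = 8.0             # the opening
confidence = np.ones((7, 7), dtype=bool)
confidence[0, :] = False              # top row failed the confidence tests
print(nearest_confident_obstacle(distances, confidence))   # -> 2.5
```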

8529205888?profile=RESIZE_710x

A more extreme case of light dynamic range is shown in the picture below, taken with a different sensor that uses the same vision chip. On the top left is a picture of the sensor; note that it was in direct sunlight and thus subject to the "glare" that disrupts most vision systems. On the top right is a photograph of the scene (taken with a regular DSLR) showing sample ranges to objects in meters. On the bottom is the world as seen by the sensor: note that the Sun is in the field of view at the top right, yet the objects in the scene were still detected. Other examples can be found on Centeye's website.

8529207066?profile=RESIZE_710x

We are currently drawing up plans for the next iteration of sensors. We will certainly include a 6-DOF IMU, which will be particularly useful for removing the effects of rotation from the optical flow. We are also envisioning an arrangement with the Teensy 4.0 placed nearly flush with the sensor for a more compact form factor. There is still discussion on how to balance weight (less is better) with physical robustness (thicker PCBs are better)! Finally, I am envisioning firmware examples for other applications, such as general robotics and environmental monitoring. I am happy to discuss any of the above with anyone interested, privately or publicly.
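For context on the IMU point, the sketch below shows the usual form of optical-flow derotation: subtract the image motion predicted from the gyro rates so that only translation-induced flow remains. It uses a simplified small-angle pinhole model with illustrative sign conventions and is not Centeye's firmware.

```python
# Simplified optical-flow derotation: subtract the image motion predicted from
# the gyro so only translation-induced flow remains. Small-angle pinhole model;
# axes and sign conventions are illustrative.
def derotate_flow(flow_x, flow_y, gyro_p, gyro_q, focal_px, dt):
    """
    flow_x, flow_y : measured flow over one frame interval (pixels)
    gyro_p, gyro_q : roll and pitch rates from the IMU (rad/s)
    focal_px       : lens focal length expressed in pixels
    dt             : frame interval (s)
    """
    # A pure rotation shifts the whole image by roughly focal_length * angle.
    rot_x = focal_px * gyro_q * dt
    rot_y = -focal_px * gyro_p * dt
    return flow_x - rot_x, flow_y - rot_y
```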

Read more…
3D Robotics

Demo of Microsoft AirSim with PX4

From the video description:

I wanted to put this video together to share what I've been working on as it relates to PX4 simulation. I've been really impressed with the capabilities of AirSim and I hope this video makes it a little easier to understand. You can learn more about AirSim here: https://github.com/microsoft/AirSim and my GitBook notes can be found here: https://droneblocks.gitbook.io/airsim... To learn more about DroneBlocks please visit: https://www.droneblocks.io Please feel free to leave a comment below if you have any questions and I hope to share more information in the near future. Thanks for watching.
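For anyone who wants to poke at the simulator programmatically, here is a minimal example using AirSim's Python client API. Note that with the PX4 backend shown in the video, arming and offboard control ultimately go through the PX4 flight stack, so behavior can differ from AirSim's built-in SimpleFlight controller.

```python
# Minimal AirSim Python client example for a simulated multirotor.
import airsim

client = airsim.MultirotorClient()   # connects to the simulator on localhost
client.confirmConnection()
client.enableApiControl(True)
client.armDisarm(True)

client.takeoffAsync().join()
client.moveToPositionAsync(10, 0, -5, 3).join()   # NED frame: negative z is up
client.landAsync().join()

client.armDisarm(False)
client.enableApiControl(False)
```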

Read more…

Quick install BatMon v4 released

8467210494?profile=RESIZE_710x

BatMon v4 released
One of the main challenges with BatMon has been installation overhead: installing BatMon v3 took over an hour on a new battery pack. The second challenge was the cost overhead of adding a BMS to each battery. We have reduced both issues significantly with the BatMon v4 release.
  • v4 is very fast to install on most batteries with a simple tool, and it connects to the balance leads.
  • The modular board makes it possible to reuse BatMon after a battery reaches end of life. The XT90 leads can be replaced if worn, and in practice the board can be reused a few times, reducing the cost overhead of each smart battery pack.
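As a hedged sketch of how a companion computer might read telemetry from a smart battery like this, the example below polls the standard Smart Battery System (SBS) registers over SMBus with the smbus2 library. The device address, registers, and scaling follow the generic SBS convention that ArduPilot/PX4 SMBus battery drivers expect, not BatMon-specific documentation, so check Rotoye's docs before relying on them.

```python
# Hedged sketch: polling a smart battery's standard SBS registers over SMBus.
# Address, registers, and scaling follow the generic SBS spec (assumptions),
# not BatMon-specific documentation.
from smbus2 import SMBus

SBS_ADDR = 0x0B      # conventional SBS device address (assumption)
REG_VOLTAGE = 0x09   # pack voltage, mV
REG_CURRENT = 0x0A   # current, mA (signed; negative = discharging)
REG_SOC = 0x0D       # relative state of charge, %

def read_signed_word(bus, reg):
    raw = bus.read_word_data(SBS_ADDR, reg)
    return raw - 0x10000 if raw & 0x8000 else raw

with SMBus(1) as bus:  # I2C bus 1 on a Raspberry Pi
    voltage_mv = bus.read_word_data(SBS_ADDR, REG_VOLTAGE)
    current_ma = read_signed_word(bus, REG_CURRENT)
    soc_pct = bus.read_word_data(SBS_ADDR, REG_SOC)
    print(f"{voltage_mv / 1000:.2f} V, {current_ma} mA, {soc_pct}% SOC")
```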


8467209253?profile=RESIZE_710x

Read more…