Simple Waypoint Navigation for Fixed-Wing UAVs

There is a simple way to implement waypoint navigation. The algorithm was introduced by Lawrence, Frew, and Pisano back in 2008. You can view the paper here: http://dx.doi.org/10.2514/1.34896.

I have flight tested this algorithm multiple times on a low-resource autopilot on board a fixed-wing UAV (see figure below). In this blog post, I want to share my approach to implementing this algorithm. I cannot claim any academic novelty; however, I hope to make this simple and effective algorithm more accessible to other researchers and hobbyists.

[Figure: the fixed-wing UAV used for flight testing]

In the remainder of this post, I will provide a high-level view of the algorithm and show some flight test results. Readers who want a more mathematical treatment of the implementation are referred to the attached PDF.

The algorithm is simple because it uses a single guidance vector field. Vector fields, in general, assign a vector to each point in space. Guidance vector fields, in particular, assign a desired velocity vector to each point in a 2D plane. The vector fields are designed such that they guide the vehicle to a particular path. In this case, the vector field brings the vehicle into a loiter circle. The following figure shows the vector field. The vector field also generates a desired acceleration vector. The acceleration vector is needed for accurate path following.
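As a rough illustration of how such a field can be computed, here is a minimal Python sketch of a Lyapunov-style loiter field with the same structure. The exact expressions used in the flight code follow the paper and the attached PDF; the sign convention and the omission of the acceleration term are simplifications of mine.

```python
import numpy as np

def loiter_guidance(p, center, R, V, circ=+1):
    """Desired velocity for loiter-circle tracking (illustrative sketch).

    p, center : 2D position of the vehicle and of the loiter center
    R, V      : loiter radius and desired ground speed
    circ      : +1 for clockwise, -1 for counter-clockwise circulation
                (sign convention assumed; the post uses <0 for counter-clockwise)
    """
    r = p - center
    d = max(np.linalg.norm(r), 1e-6)
    e_r = r / d                                   # radial unit vector
    e_t = circ * np.array([e_r[1], -e_r[0]])      # tangential unit vector
    # Far from the circle the field points mostly inward; on the circle
    # (d == R) it is purely tangential. The blend keeps |v_des| == V.
    k = (d**2 - R**2) / (d**2 + R**2)
    v_des = V * (-k * e_r + (2.0 * R * d / (d**2 + R**2)) * e_t)
    # The desired acceleration used later as a feed-forward term follows from
    # differentiating v_des along the trajectory (omitted here; see the PDF).
    return v_des
```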

[Figure: guidance vector field for loiter-circle tracking]

We feed the guidance commands from the guidance vector field into the lateral control system. However, the input to the lateral control system is a bank angle (see figure below). The block labeled “AC” represents the aircraft dynamics and the block labeled “Ail” represents the dynamics of the aileron actuator.
[Figure: lateral control system block diagram]

Thus, it is necessary to convert the desired velocity and the desired acceleration into a desired bank angle. We use a two-step conversion.

First, we convert the guidance commands into a lateral acceleration command. The lateral acceleration command has two terms. The first term drives the angular misalignment between the vehicle velocity and the desired velocity to zero. The second term incorporates the desired acceleration vector; the second term functions as a feed-forward acceleration term needed for accurate path tracking.

Second, we convert the lateral acceleration command to a desired bank angle. The relationship between lateral acceleration and bank angle is illustrated in the figure below. By banking, the lift force attains a lateral component, which produces the desired lateral acceleration.

[Figure: relationship between lateral acceleration and bank angle]
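A hedged sketch of this two-step conversion is shown below; the gain and the exact blending are illustrative placeholders, not the values or formulation used in the flight code (see the attached PDF for those).

```python
import numpy as np

G = 9.81  # gravitational acceleration, m/s^2

def bank_angle_command(v, v_des, a_des, k_align=1.5):
    """Convert guidance commands to a bank angle (illustrative sketch).

    v, v_des : actual and desired velocity vectors (2D, m/s)
    a_des    : desired acceleration vector from the vector field (m/s^2)
    k_align  : alignment gain (made-up value for illustration)
    """
    # Step 1: lateral acceleration command.
    # Term 1 drives the angular misalignment between v and v_des to zero;
    # term 2 projects the feed-forward acceleration onto the lateral axis.
    psi = np.arctan2(v[1], v[0])
    psi_des = np.arctan2(v_des[1], v_des[0])
    err = np.arctan2(np.sin(psi_des - psi), np.cos(psi_des - psi))  # wrapped angle error
    lateral = np.array([-np.sin(psi), np.cos(psi)])  # unit vector 90 deg left of the velocity
    a_lat = k_align * err + float(np.dot(a_des, lateral))
    # Step 2: bank angle from the coordinated-turn relation a_lat = g * tan(phi).
    return np.arctan2(a_lat, G)
```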

Having described basic loiter circle tracking, we are ready to move on to waypoint navigation. The waypoint navigation routine is actually loitering in disguise: the routine positions loiter circles so that the vehicle smoothly intersects each waypoint in succession. The positioning algorithm is shown in the figure below. The positions of the previous waypoint, current waypoint, and next waypoint are denoted by A, B, and C, respectively. The center point lies along the bisector of angle ABC. The loiter radius sets the distance between the center point and the current waypoint. Having determined the loiter center, the next step is to determine the sign of the circulation constant. If C is to the left of the line AB, the circulation is counter-clockwise (<0). If C is to the right of the line AB, the circulation is clockwise (>0).

[Figure: loiter circle placement relative to waypoints A, B, and C]
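A minimal sketch of this positioning rule, assuming a standard x-y plane in which counter-clockwise angles are positive:

```python
import numpy as np

def place_loiter_circle(A, B, C, R):
    """Place the loiter circle for the current waypoint B (illustrative sketch).

    A, B, C : previous, current, and next waypoint positions (2D numpy arrays)
    R       : loiter radius
    Returns (center, circ), with circ > 0 for clockwise and circ < 0 for
    counter-clockwise, matching the sign convention described above.
    """
    u_BA = (A - B) / np.linalg.norm(A - B)
    u_BC = (C - B) / np.linalg.norm(C - B)
    bisector = u_BA + u_BC                        # points along the bisector of angle ABC
    n = np.linalg.norm(bisector)
    if n < 1e-9:                                  # A, B, C nearly collinear: fall back to
        bisector = np.array([-u_BA[1], u_BA[0]])  # the perpendicular of AB
    else:
        bisector /= n
    center = B + R * bisector
    # Circulation sign from which side of line AB the next waypoint C lies on.
    cross = (B[0] - A[0]) * (C[1] - A[1]) - (B[1] - A[1]) * (C[0] - A[0])
    circ = -1.0 if cross > 0 else +1.0            # left of AB -> counter-clockwise (<0)
    return center, circ
```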

A nice feature of the positioning routine is that it can work with only two waypoints. The waypoints are stored in a circularly linked list. Hence, the "next" waypoint and the "previous" waypoint can point to the same waypoint. 

Next, we describe the algorithm that governs how the aircraft switches from one waypoint to the next. 

The positions of the current waypoint and next waypoint are denoted by A and B, respectively. Let LA denote the loiter circle that brings the aircraft into A. Suppose the aircraft has just "hit" the current waypoint. The navigation routine sets the current waypoint to B and computes the parameters of LB, the loiter circle that brings the aircraft into B.

Now, the straightforward approach is to immediately switch from using LA to LB. However, this approach will change the guidance vector in an abrupt manner. To achieve a smooth transition between waypoints, the switch from LA to LB occurs when the velocity vectors from both loiter circles are pointing in roughly the same direction.  A smooth transition protects the low-level control system from large changes in the input command.
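Using the loiter_guidance sketch from earlier, the switching test can be as simple as comparing the directions the two fields command at the current position; the 15-degree threshold below is an illustrative value, not the one used in flight.

```python
import numpy as np

def should_switch(p, circle_A, circle_B, V, align_threshold_deg=15.0):
    """Switch from loiter circle A to B when their guidance vectors align.

    circle_A, circle_B : (center, R, circ) tuples for the two loiter circles.
    """
    cA, RA, sA = circle_A
    cB, RB, sB = circle_B
    vA = loiter_guidance(p, cA, RA, V, sA)   # desired velocity from circle A's field
    vB = loiter_guidance(p, cB, RB, V, sB)   # desired velocity from circle B's field
    cos_angle = np.dot(vA, vB) / (np.linalg.norm(vA) * np.linalg.norm(vB))
    return cos_angle > np.cos(np.radians(align_threshold_deg))
```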

The figure below illustrates the switching algorithm. In the top plot, the aircraft is about to hit A. In the second plot, the aircraft has hit A and has set up the loiter circle of B. The aircraft, however, continues tracking the loiter circle of A. In the third plot, the guidance vectors from both loiter circles are aligned. The aircraft begins tracking the loiter circle of B.  In the fourth plot, the aircraft is en route to B.

[Figure: the four stages of the waypoint switching algorithm]

I implemented the guidance system described herein using fixed-point arithmetic on the AMP autopilot, which is made in-house by our research group (see https://doi.org/10.2514/1.I010445). The microprocessor belongs to the dsPIC33 family of microprocessors made by Microchip.

Flight test results are shown in the figures below (map data ©2020 Google). The test took place at the Flying Gators RC Airport in Archer, FL. The first plot shows the flight path of the delta-wing UAV performing waypoint navigation with four waypoints. The waypoints are positioned to create a figure-eight trajectory. The second plot shows the flight path of the delta-wing UAV performing waypoint navigation with two waypoints. You can view a synthetic video of the flight test here: https://youtu.be/otRW2_80G0U. The video is reconstructed from downlinked telemetry data. You can view an actual video of a portion of the flight test here: https://youtu.be/jQyc3_tk7MA.

[Figure: flight path with four waypoints (figure-eight pattern)]

[Figure: flight path with two waypoints]

In the first plot, note the clear asymmetry in the flight pattern. This asymmetry was due to the aircraft being out of trim. When I examined the roll data after the flight test, I found that the aircraft tracked negative roll commands better than positive roll commands. Hence, the asymmetry stems from the control system, not the guidance system.

In conclusion, this blog post provides an overview of a simple and effective waypoint navigation scheme. For a more mathematical treatment, the reader is referred to the attached PDF. I have also attached MATLAB code that simulates loiter circle tracking; the vehicle dynamics are represented with a state-space model.

simple_waypont_navigation.pdf

code.zip

Read more…

Stereoscopic systems are widely used in drone navigation, but this project uses a new approach: a variable baseline.

In the related arXiv paper (PDF), the team showcases three different applications of this system for quadrotor navigation:

  • flying through a forest
  • flying through a static or dynamic gap of unknown shape and location
  • accurate 3D pose detection of an independently moving object

They show that their variable baseline system is accurate and robust in all three scenarios.

For the video capture, the Raspberry Pi-based StereoPi board was used. Additional AI-acceleration hardware (Intel Movidius) is being considered as a next step, as well as a more powerful CM4-based version of the StereoPi (v2).

Here is a brief video of the project:

 

Read more…

Full VR Drone

Hello, people.

I decided to open a new thread to register some progress about this project. I am working on it in my spare time, and I feel excited about it.

This is a very preliminary video -- please disregard the pink view; I was using an infrared camera -- with a normal camera the video is much clearer and more interesting. I also need to find a better 180º (or, better yet, 230º) lens. I am also willing to go full 360 degrees eventually.

So, it is an experimental Virtual Reality controlled drone with some AR features which I will show when I finish implementing them. The plan is to have one or more people walking and controlling the drone as if they were truly inside the physical aircraft. Much like the traditional FPV, but not limited to the video view only, as the crew will be able to interact and walk around freely while they fly.

I have lots of ideas to implement on this experiment. I plan to talk about them while they are implemented.

At first I wanted to show it only after I had a real flight video. But for a long time I had no opportunity to bring all the equipment to a safe place for a flight, and with the pandemic I can't get to a place where I feel secure bringing my Oculus Quest and all the related hardware with me. Hopefully the next video will be more interesting, as it will show a real flight with a non-infrared camera, plus all interactions from arming the drone to taking off, flying around, and landing back, while showing some AR overlays.

Please note that this is NOT a fake video; I can already control this quadcopter from start to end using an Oculus Quest. I was using my conventional home 2.4 GHz Wi-Fi when I recorded the video, but I could also use raw Wi-Fi (much like Befinitiv's WifiBroadcast, although I have my own implementation written from scratch) or 4G. It is always low latency, using a simple UDP protocol I wrote myself.

The flying hardware has a Raspberry Pi commanding a Pixhawk through serial MAVLink, while exchanging information with the VR headset through a ground-based Raspberry Pi (this ground Pi has a 1 W sunhaus and a Yagi antenna for extended raw-Wi-Fi range). The 3D part shown on the Oculus Quest was done using Unity3D for now -- not sure if I will keep that path, though. Unity is a nice engine and quick to develop on, but it has its drawbacks when used on custom projects like this. I do have my own 3D engine and a second implementation of the same environment (except for the nice hands), which might take over in the future. My low-level engine is very lightweight and can also run on WebAssembly if desired. It does not have an editor yet, though, and is harder to use, so there is some indecision in that regard. =)
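For anyone curious about what the Raspberry Pi <-> Pixhawk side of such a setup can look like, here is a generic pymavlink sketch. This is not my project's actual code; the serial port, baud rate, and arming sequence are typical assumptions.

```python
from pymavlink import mavutil

# Connect to the Pixhawk over the Pi's serial port (device and baud are assumptions).
link = mavutil.mavlink_connection('/dev/serial0', baud=57600)
link.wait_heartbeat()
print("Heartbeat from system %d, component %d" % (link.target_system, link.target_component))

# Read attitude telemetry, e.g. to drive the VR view.
msg = link.recv_match(type='ATTITUDE', blocking=True)
print("roll %.2f  pitch %.2f  yaw %.2f" % (msg.roll, msg.pitch, msg.yaw))

# Arm the vehicle on command from the headset.
link.mav.command_long_send(
    link.target_system, link.target_component,
    mavutil.mavlink.MAV_CMD_COMPONENT_ARM_DISARM, 0,
    1, 0, 0, 0, 0, 0, 0)   # param1 = 1 -> arm
```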

This is planned to be free and open source once it reaches a certain maturity level.

Read more…

Renewable-energy drones are expected to expand at a rapid pace as the industry focuses on UAV services, line-of-sight applications, ocean-going ship tracking, inspection of wind turbines, offshore platforms, and refineries, and monitoring of power lines and solar panels in the energy sector. Passenger drones have also been touted as one way to alleviate congestion and enhance air quality in urban areas. Equipped with thermal cameras, drones make it possible to conduct inspections rapidly and at scale. On wind farms around the world, drones are changing inspections. Wind turbines, both onshore and offshore, are left exposed to the elements as they run, and minor damage and wasted energy can cause inefficiencies. By offering rapid and remote coverage of turbines, drones reduce the time engineers need to spend in precarious positions. The technology is also much cheaper than a manned team, so wind farms can carry out drone inspections of wind turbines with greater regularity to keep operations running at 100%.

According to the published research report, the renewable drones market is expected to surpass USD 152 million by 2030, up from USD 42 million in 2019, at a CAGR of 26.5% throughout the forecast period (2020-30).

The key factors driving demand for renewable drones are the increasing adoption of drones to reduce the cost of inspection operations through asset optimization and the increasing construction of solar and wind farms. Rising environmental concerns and the increasing use of clean energy alternatives are also likely to be key drivers of global demand for green drones during the forecast period. Recent technical developments and international agreements have also enabled countries worldwide to shift to renewable energy and develop their energy infrastructure, which is expected to expand the global demand for renewable drones over the coming years.

Get a Free Sample Copy of Research Report Here: https://www.fatposglobal.com/sample-request-398

Multirotor Segment to Grow with the Highest CAGR During 2020-30

The Renewable Drones Market is segmented by type into multirotor and fixed-wing. The multirotor segment accounted for the greater market share in 2018, owing to various advantages over fixed-wing drones. Multirotor aircraft can perform vertical takeoffs and landings; they need less space to take off, can hover mid-flight, and can manoeuvre around objects, which helps with inspection, visualization, and modelling. In addition, multirotor drones use multiple propellers to manoeuvre, so compared to fixed-wing drones they do not need a large surface area or wingspan. Multirotor drones are also designed to fold down and pack into smaller cases, making them simpler to transport.

A Solar Segment to Grow with the Highest CAGR During 2020-30

The Renewable Drones Market is segmented by end-user into solar and wind. The solar segment is further categorized into solar PV and solar CSP. By end-user, the solar segment held the maximum market share of XX.X% in 2018: to meet the growing demand for solar farm inspection and maintenance, asset owners, inspectors, and drone service providers (DSPs) must develop a deep understanding of thermography and flight operations to take full advantage of drone-based solar inspection. These factors are driving the solar sector's growth in the market for renewable drones. Increased emissions, high reliability, and the depletion of non-renewable energy sources are some of the main propellants of the organic growth of the market for renewable drones.

Growing Construction of Solar and Wind Farms

The renewable energy sector is among the fastest-growing sectors worldwide. With advanced technology and rising demand for clean and sustainable energy, renewable energy plants are being built at a rapid rate. Countries are moving their emphasis from traditional energy sources toward renewable energy production. Wind power has grown at a CAGR of more than 21% since 2000. In addition, onshore wind power installations are expected to generate demand for new wind turbines as well as the replacement of old turbines. Wind power plant construction is capital-intensive, and asset owners aim to optimize returns and minimize investment. This is where drones enter the picture: drones can reduce wind turbine inspection costs by at least 40%.

Rising Adoption of Drones to Reduce Cost of Inspection Operation

The growth of the energy-industry drone market is mainly driven by the difficulty of inspecting and monitoring remote and discrete systems. Renewable drone inspection removes the need for inspection staff to operate at high altitudes. It also decreases maintenance time when determining whether a repair needs to take place immediately or can be safely postponed. Drones in the energy sector are likely to expand at a significant pace as the industry invests in UAV services, line-of-sight applications, ocean-going ship surveillance, offshore platform and refinery inspection, wind turbine inspection, power line monitoring, and solar panel monitoring.

Strict Regulations for Performing Drone Operations

Drones are extremely important to utilities for conducting inspection activities. Legal provisions, however, have limited growth in the drone industry. These regulations exclude drone operations in certain cases, such as flights beyond visual line of sight, over long distances, or at night. Because the FAA has not kept pace with the rapid development of drone technology, utility companies have not been able to use drones to the fullest extent possible to increase the effectiveness and quality of inspection operations. Moreover, these regulations do not allow users to operate beyond the visual line of sight and do not specify whether the use of drones by public power utilities counts as a governmental operation.

Top Market Players Are:

DJI Enterprise, Terra Drone, Cyberhawk Innovations Limited, PrecisionHawk, ULC Robotics, Sharper Shape Inc., Sky Futures, Asset Drone and YUNEEC.

For More Report Details, Visit Here

Read more…

NexuS UAV presentation

I want to share the first flight of our third fixed-wing aircraft developed for autonomous flight.
This design was made entirely from composite materials, and the molds were made with 3D printers.
Tests showed great aerodynamic efficiency, stability, and maneuverability.
The final weight with full payload is 2.6 kg.
For this flight, the aircraft was ballasted to its final flight weight.
We estimate an endurance of 90 minutes with a 10,000 mAh 4S battery.

Read more…

Centeye Modular Vision Sensors

[Image: Centeye stereo sensor board (5 cm baseline, 3.2 grams)]

It has been a while since I've posted anything about recent Centeye hardware. Some of you may remember my past work implementing lightweight integrated vision sensors (both stereo and monocular) for “micro” and “nano” drones. These incorporated vision chips I personally designed (and had fabricated in Texas) and allowed combined optical flow, stereo depth, and proximity sensing in just 1.1 grams. Four of these on a Crazyflie were enough to provide omnidirectional obstacle avoidance and tunnel following in all ambient lighting conditions. Their main weakness was that they were time-consuming and expensive to make and adapt to new platforms. Some of you may also remember our ArduEye project, which used our Stonyman vision chip and was our first foray into open hardware. Although that project had a slow start, it did find use in a variety of applications ranging from robotics to eye tracking. I have discussed, privately with many people, rebooting the ArduEye project in some form.

Like many people, we faced disruption last year from COVID. We had a few slow months last summer, and I used the opportunity to create a new sensor configuration from scratch that has elements of both ArduEye and our integrated sensors. My hypothesis is that most drone makers would rather have a sensor that is modular and easy to reconfigure, adapt, or even redesign, and are OK if it weighs “a few grams” rather than just one gram. Some users even told me they prefer a heavier version if it is more physically robust. Unlike the nano drones I personally develop, if your drone weighs several kilograms, an extra couple of grams is negligible. I am writing here to introduce this project, get feedback, and gauge interest in making it in higher quantities.

My goals for this “modular” class of sensors were as follows:

  • Use a design that is largely part agnostic, e.g. does not specifically require any one part (other than optics and our vision chip) in order to minimize supply chain disruptions. This may sound quaint now, but this was a big deal in 2020 when the first waves of COVID hit.
  • Use a design that is easy and inexpensive to prototype, as well as inexpensive to modify. We were influenced by the “lean startup” methodology. This includes making it easier for a user to modify the sensor and its source code.
  • Favor use of open source development platforms and environments. I decided on the powerful Teensy 4.0 as the processor, using the Arduino framework and PlatformIO as the development environment.

We actually got it working. At the top of this post is a picture of our stereo sensor board, with a 5 cm baseline and a mass of 3.2 grams, and below is a monocular board suitable for optical flow sensing that weighs about 1.6 grams. We have also made a larger 10 cm baseline version of the stereo board, and have experimented with a variety of optics. All of these connect to a Teensy 4.0 via a 16-wire XSR cable. The Teensy 4.0 operates the vision chips, performs all image processing, and generates the output. We have delivered samples to collaborators (as part of a soft launch) who have indeed integrated them on drones and flown them. Based on their feedback we are designing the next iteration.

[Image: monocular optical flow sensor board (about 1.6 grams)]

As with any new product, you have to decide what it does and what it does not do. Our goal was not to have an extremely high resolution -- those already exist, and the reality is that high resolution has other costs in terms of mass, power, and light sensitivity. Instead, we sought to optimize intensity dynamic range. The vision chips use a bio-inspired architecture in which each pixel individually adapts to its light level independently of other pixels. The result is a sensor that can work in all light levels (“daylight to darkness”, the latter with IR LED illumination), can adapt nearly instantaneously when moving between bright and dark areas, and can function even when both bright and dark areas are visible.

Below is an example of the stereo sensor viewing a window that is open or closed. (Click on the picture to see it at native resolution.) The current implementation divides the field of view into a 7x7 array of distance measurements (in meters), which are shown. Red numbers are measurements that have passed various confidence tests; cyan numbers are those that have not (and thus should not be used for critical decisions). Note that when the window is open, the sensor detects the longer range to objects inside even though the illumination levels are about 1% of those outside. A drone with this sensor integrated would be able to sense the open window and fly through it, without suffering a temporary black-out once inside.

[Image: stereo sensor output viewing a window, open and closed, as a 7x7 grid of distance measurements]
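As a hypothetical illustration of how a drone might consume this kind of output (a 7x7 distance grid plus a confidence mask, as described above; the actual firmware interface may differ):

```python
import numpy as np

def clearest_direction(distances, confident, min_range=1.5):
    """Return the (row, col) of the farthest confident cell in a 7x7 depth grid,
    or None if every confident cell is closer than min_range (i.e. hold position)."""
    d = np.where(confident, distances, -np.inf)   # ignore low-confidence (cyan) cells
    if np.all(d < min_range):
        return None
    return np.unravel_index(np.argmax(d), d.shape)

# Example: a mostly-near scene with a patch of long ranges (the open window).
distances = np.full((7, 7), 2.0)
distances[2:5, 3:6] = 8.0
confident = np.ones((7, 7), dtype=bool)
print(clearest_direction(distances, confident))   # -> (2, 3), i.e. steer toward the window
```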

A more extreme case of light dynamic range is shown in the picture below. This was taken with a different sensor that uses the same vision chip. On the top left is a picture of the sensor- note that it was in the sunlight, thus would be subject to the “glare” that disrupts most vision systems. On the top right is a photograph of the scene (taken with a regular DSLR) showing sample ranges to objects in meters. On the bottom is the world as seen by the sensor- note that the Sun is in the field of view at the top right, yet the objects in the scene were detected. Other examples can be found on Centeye’s website.

[Image: sensor in direct sunlight, the photographed scene with sample ranges, and the scene as seen by the sensor with the Sun in the field of view]

We are currently drafting up plans for the next iteration of sensors. For sure we will be including a 6-DOF IMU, which will be particularly useful for removing the effects of rotation from the optical flow. We are also envisioning an arrangement with the Teensy 4.0 placed nearly flush with the sensor for a more compact form factor. There is still discussion on how to balance weight (less is better) with physical robustness (thicker PCBs are better)! Finally I am envisioning firmware examples for other applications, such as general robotics and environment monitoring. I am happy to discuss the above with anyone interested, private or public.

Read more…
3D Robotics

Demo of Microsoft AirSim with PX4

From the video description:

I wanted to put this video together to share what I've been working on as it relates to PX4 simulation. I've been really impressed with the capabilities of AirSim and I hope this video makes it a little easier to understand. You can learn more about AirSim here: https://github.com/microsoft/AirSim and my GitBook notes can be found here: https://droneblocks.gitbook.io/airsim... To learn more about DroneBlocks please visit: https://www.droneblocks.io Please feel free to leave a comment below if you have any questions and I hope to share more information in the near future. Thanks for watching.

Read more…

Quick install BatMon v4 released


BatMon v4 released
One of the main challenges with BatMon has been installation overhead: installing BatMon v3 took over an hour on a new battery pack. The second challenge was the cost overhead of a BMS on each battery. We have reduced both issues significantly with the BatMon v4 release.
  • v4 is super fast to install on most batteries with a tool, and connects to the balance leads.
  • The modular board makes it possible to reuse BatMon after a battery reaches end of life. The XT90 leads can be replaced if they are worn, but can practically be reused a few times, reducing the cost overhead of each smart battery pack.

 


Read more…
3D Robotics

Clever research from the University of Zurich and TU Delft showing how the drone's onboard cameras are used to maintain position while a quadcopter spins to keep control after one motor fails.

From DroneDJ:

Researchers at the University of Zurich and the Delft University of Technology have been able to keep a drone flying after a motor fails. The researchers have managed to use onboard cameras to keep the test drones in the air and flying safely.

 


 

A team of researchers has come up with a simple yet ingenious way to solve a problem that would usually result in a drone falling to the ground due to a motor failure.

Well, motor failures don’t often happen, but when they do, the drone needs to stay in the air regardless, especially if people are nearby or the drone is being used for a commercial job. Redundancy is important when it comes to drones.

Davide Scaramuzza, head of the Robotics and Perception Group at UZH and of the Rescue Robotics grand challenge at NCCR Robotics, shared:

When one rotor fails, the drone begins to spin on itself like a ballerina. This high-speed rotational motion causes standard controllers to fail unless the drone has access to very accurate position measurements.

Scaramuzza essentially says that the standard controllers in drones cannot cope with the fast and random spinning of a free-falling drone. This led the team to use onboard RGB cameras and event cameras, which we've covered in the past for obstacle avoidance.

GPS methods were also explored before the cameras, but the researchers ended up dumping the idea as GPS isn’t available in all situations, especially when it comes to specific drone missions.

The changes between the frames

Now for the way to keep the drone flying. The team equipped a drone with an RGB camera and an event camera. The standard RGB camera detects movements across the whole frame, while the event camera detects changes at the pixel level, allowing tiny changes to be spotted.

The data from the two cameras are combined using a specially developed algorithm that then tracks the quadcopter’s position relative to its surroundings. This allows the flight controller to take control of the drone as it spins and flies.

Both cameras work great in well-lit environments, but the RGB camera begins to suffer as light decreases. In testing, the researchers were able to keep the drone stable with the event camera all the way down to 10 lux, which is about equivalent to a dimly lit room.

 
Read more…

Zion Market Research has published a new report titled “Drone Logistics and Transportation Market By Solution (Shipping, Warehousing, Software, and Infrastructure), By Drone (Passenger, Freight, and Ambulance), and By Sector (Commercial and Military): Global Industry Perspective, Comprehensive Analysis, and Forecast, 2018–2025”. According to the report, the global drone logistics and transportation market was USD 4.56 billion in 2018 and is expected to reach around USD 18.05 billion by 2025, at a CAGR of approximately 21.9% between 2019 and 2025.


Unmanned aerial vehicles, also known as drones, are small aircraft without a human pilot onboard that can be either operated remotely or flown autonomously with the help of GPS coordinates. They are made of light materials to reduce weight, which enables them to fly at high altitudes. Drones are controlled from a ground cockpit and can easily return to their marked starting position when the battery is low or when the drone loses contact with the controller. Initially they were used for photography and videography; today they are used for applications across the modern supply chain. Drones are expected to revolutionize the supply chain.

In this constantly evolving world, entrepreneurs and visionary leaders are focusing on integrating drone delivery into the supply chain. This is the primary growth factor of the drone logistics and transportation market, as including drones in the supply chain is believed to revolutionize the way shipments reach customers. It will be the fastest way to transport goods and has the potential to deliver orders within minutes. Drones will have a diverse range of impacts on the supply chain. Drones can track inventory and act as a significant system for inventory management, which will reduce companies' inventory carrying costs. Around 90% of the inventory in a warehouse is stationary, and companies end up with extra inventory due to improper management. However, it is often difficult even for planes to fly in extreme weather conditions like snow, rain, and strong winds. Drones are smaller versions of these flying machines, so extreme weather poses a major challenge for delivery, which might imply that drones can deliver only in certain climatic conditions. This is a major restraint for the drone logistics and transportation market.

By solution, the shipping segment is expected to hold the highest market share. Drone shipping has been targeted by various companies like Amazon, Google, UPS, DHL, etc. By drone type, the ambulance drone segment will hold a major market share, owing to increasing casualties and growing traffic congestion in major cities across the globe. The first few moments after an accident are the most crucial to prevent any further escalation. Lifesaving technologies like CPR (cardiopulmonary resuscitation), medication, and AEDs (automated external defibrillators) can be made compact enough to be carried by a drone. Moreover, drones can also transfer other crucial medical supplies during disasters and across tough terrains.

Download Free Research Report Sample PDF for more Insights - http://bit.ly/3oWtyz3

North America will hold a substantial share of the drone logistics and transportation market in the future, owing to the various developments and innovations witnessed in the inventory management domain. The U.S. is witnessing continuous growth in tech start-ups every year, which is backed by numerous venture capitalists, thereby increasing the regional market scope. The presence of key market players is also predicted to accelerate the demand for drone logistics and transportation market. In Europe, the presence of developed economies of the UK, Germany, and France is contributing to the drone logistics and transportation market.

Some major key players operating in the drone logistics and transportation market are PINC Solutions, Matternet, Drone Delivery Canada, Hardis Group, CANA Advisors, Infinium Robotics, Workhorse Group, Aerovironment, DroneScan, Skycart, and Zipline.

Read more…

The main focus of this research is to develop a real-time forest fire monitoring system using an Unmanned Aerial Vehicle (UAV). The UAV is equipped with sensors, a mini processor (Raspberry Pi), and an ArduPilot Mega (APM) flight controller. The system uses five sensors. The first is a temperature sensor that measures the temperature in the monitored forest area. The other sensors are embedded in the APM: a barometer, a Global Positioning System (GPS) receiver, an inertial measurement unit (IMU), and a compass. GPS and compass are used in the navigation system. The barometer measures the air pressure, which is used as a reference to maintain the height of the UAV. The IMU consists of accelerometer and gyroscope sensors that are used to estimate the vehicle position. The temperature data from the sensor and the data from GPS are processed by the Raspberry Pi 3, which serves as the mini processor. The results of the data processing are sent to a server so they are accessible online and in real time on a website. The data transmission uses the Transmission Control Protocol (TCP). The experimental setup was carried out in an area of 40 meters × 40 meters with 10 hotspots. The diameter of the hotspots is 0.4 meters, with a height of 0.5 meters. The UAV is flown at a constant speed of 5 m/s at an altitude of 20 meters above the ground. The flight path is set using a mission planner so the UAV can fly autonomously. The experimental results show that the system could detect seven hotspots in the first trial and nine hotspots in the second trial; the misses were caused by data loss in the transmission process. Other results indicate that the coordinates of hotspots detected by the UAV deviate by approximately 1 meter from the actual fire point coordinates. This is still within standard GPS accuracy, as the system uses GPS with a range accuracy of 2.5 meters.
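As a rough sketch of the companion-computer side of such a pipeline (the server address, port, and message format below are placeholders, not the actual protocol used in the paper), the temperature and GPS readings can be pushed to the server over TCP like this:

```python
import json
import socket
import time

SERVER = ("fire-monitor.example.org", 5000)   # hypothetical server endpoint

def send_sample(temperature_c, lat, lon, alt_m):
    """Package one reading as JSON and send it to the server over TCP."""
    sample = {"t": time.time(), "temp": temperature_c,
              "lat": lat, "lon": lon, "alt": alt_m}
    with socket.create_connection(SERVER, timeout=5) as s:
        s.sendall((json.dumps(sample) + "\n").encode("utf-8"))

# Example reading taken while overflying a hotspot (values are made up).
send_sample(62.4, -6.914, 107.609, 20.0)
```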

You can Download this article on Research Gate or Journal of Engineering Science and Technology

Read more…

The main focus of this research is an early detection system for forest fires using an Unmanned Aerial Vehicle (UAV). Fire data are collected by mobile devices: a GPS-equipped UAV quadcopter, a non-contact infrared sensor, and a sensor stabilizer. The sensor data are sent via a 433 MHz telemetry link to the Ground Control Station. The data are then processed by a program we developed, SPTA Real-time v0.1.0, which shows the temperature data as well as a color layer based on temperature levels, in real time, on a digital map. The fire coordinates observed by SPTA Real-time v0.1.0 are then compared with the coordinates of the fire source recorded manually. For the data collection, the monitored area was 3662 m2, the constant altitude 30 m, and the quadcopter speed 5 m/s. The first data set, with a wind speed of 3.2 m/s, shows a difference of 1.18 m from the coordinates of the fire source, within the GPS tolerance accuracy of 2.5 m. The second data set, with a wind speed of 6 m/s, shows a difference of 5.16 m from the coordinates of the fire source, deviating from the 2.5 m GPS accuracy tolerance by 2.66 m.
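The reported deviations can be reproduced by taking the great-circle distance between the coordinates logged by SPTA Real-time and the manually recorded fire coordinates; a minimal haversine sketch (with made-up example coordinates) is shown below.

```python
from math import asin, cos, radians, sin, sqrt

def deviation_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two WGS-84 coordinates."""
    R = 6371000.0  # mean Earth radius, meters
    p1, p2 = radians(lat1), radians(lat2)
    dp, dl = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dp / 2) ** 2 + cos(p1) * cos(p2) * sin(dl / 2) ** 2
    return 2 * R * asin(sqrt(a))

# Example: compare a detected hotspot with its manually recorded position.
print(deviation_m(-6.914744, 107.609810, -6.914754, 107.609804))
```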

You can download DEVELOPMENT OF UNMANNED AERIAL VEHICLE FOR REAL TIME FIRE FOREST DETECTION.pdf


Read more…

Long Range

I left the title wide open intentionally. This is something that is beyond my skills, and one of the things that we all struggle with. If you want two cameras, you have to have two transmitters or switch between them. Telemetry from your autopilot: you can get some of that from the OSD, but to update the autopilot you need another transceiver, which you can also sometimes use for telemetry. Then you need the receiver for the remote control. That's two transmitters, one transceiver that really doesn't have good range, and a receiver. All of this back and forth is data, some analog, some digital.

What I am hoping to throw into the hornets' nest here is an idea that I'm hoping the community can take and run with. What I'm proposing is using 802.11n -- not g or ac, but n. Reason: most wireless service providers use n to distribute their product, an internet connection. With n you have a range of quite a few miles with excellent bandwidth. An 802.11n transceiver can be had in a wide range of power levels and costs. Both the base station and the drone-mounted unit can be of the same make and power. The output of the transceiver is IP, which can be decoded by something like a Raspberry Pi, and then your cameras and other equipment can be interfaced via USB and the other ports on the board. There are only two USB ports on a Pi, but a USB expansion board can be used. Going this route, the amount of data transferred between your base station and the drone is much, much broader.

The very design of 802.11 allows for several to many transceivers sharing the same space, enabling cooperative handling of data, and this can cut down on interference. When you use an analog video link, ANY transmitter on the same channel can cause havoc and possibly even loss of video; 802.11 is designed to deal with others in the area by design. So, my tech geniuses, let the creativity begin.
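To make the idea a bit more concrete, here is a hedged sketch of the simplest possible drone-side bridge: forwarding the autopilot's serial telemetry over the 802.11n IP link with a Raspberry Pi. The serial device, baud rate, and addresses are placeholders, and a real system would also carry video and RC data.

```python
import socket
import serial  # pyserial

GROUND_STATION = ("192.168.1.10", 14550)   # hypothetical ground-side IP and port

ser = serial.Serial("/dev/ttyAMA0", 57600, timeout=0.1)
udp = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
udp.bind(("0.0.0.0", 14551))
udp.settimeout(0.01)

while True:
    # Autopilot -> ground: forward whatever telemetry bytes arrive on serial.
    data = ser.read(512)
    if data:
        udp.sendto(data, GROUND_STATION)
    # Ground -> autopilot: forward command bytes coming back over Wi-Fi.
    try:
        pkt, _ = udp.recvfrom(512)
        ser.write(pkt)
    except socket.timeout:
        pass
```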

Read more…

Enterprises & Startups can leverage FlytNow APIs to build & scale automated, cloud-connected drone applications, and reduce time-to-market.

The commercial drone industry is heading toward complete automation. This transition calls for seamless integration with different software and hardware. At FlytBase, we are cognizant of this ever-growing need for scalable enterprise applications that involve drones. With this in focus, we are introducing the FlytNow API platform to enable automated, cloud-connected drone applications for enterprises and startups.

We are proud to announce that we are extending the capabilities of FlytNow as a comprehensive backend platform for enterprise drone operations. We introduce FlytNow APIs to securely connect drones with any type of business application that supports a RESTful architecture. This means that businesses can rapidly build and scale custom drone applications to manage their drone fleets.
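To give a feel for what this looks like from a business application, here is a generic REST sketch in Python. The base URL, endpoint paths, and authentication header are hypothetical placeholders, not FlytNow's actual API; consult the official documentation for the real endpoints.

```python
import requests

BASE = "https://api.example-flytnow-instance.com/v1"   # placeholder base URL
HEADERS = {"Authorization": "Token YOUR_API_TOKEN"}     # placeholder credentials

# Fetch telemetry (speed, altitude, global position, ...) for one drone.
telemetry = requests.get(f"{BASE}/drones/DRONE_ID/telemetry",
                         headers=HEADERS, timeout=10).json()
print(telemetry.get("altitude"), telemetry.get("global_position"))

# Command the same drone to fly to a GPS location.
resp = requests.post(f"{BASE}/drones/DRONE_ID/goto",
                     headers=HEADERS,
                     json={"lat": 37.4275, "lon": -122.1697, "alt": 30},
                     timeout=10)
resp.raise_for_status()
```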

Key benefits of using FlytNow APIs

  • Simple: FlytNow APIs are simple with clearly defined endpoints to perform specific functions.
  • Powerful Abstractions: FlytNow APIs provide powerful abstractions so developers do not have to deal with lower-level languages to communicate with drones.
  • Hardware Agnostic: Whether it is a DJI, PX4, or ArduPilot drone or any companion computer (like Raspberry Pi 4B, Nvidia Jetson, ODROID N2, etc.), FlytNow APIs are agnostic and have the necessary adapters to communicate with the hardware.
  • Discoverable: Our API endpoints are logically organized in extensive documentation so that even new developers can get up to speed quickly with the capabilities.
  • Consistent: All our API endpoints are constructed logically so that developers can anticipate different functionalities.
  • Virtual Drones: As the name implies, these are simulated drones in a virtual environment. Work on simulations to test your applications faster without risking expensive hardware.
  • Scalability: Our cloud services are hosted on Amazon AWS and scale adaptively, making it possible to deploy resources as you grow your business.

APIs that are currently available for our enterprise users


  • Navigation APIs: Control drones remotely from a dashboard.
  • Telemetry APIs: Fetch telemetry data like speed, altitude, global position, etc. from a drone.
  • Payload APIs: Control & integrate various payloads with FlytNow.
  • Video Streaming APIs: Access live video streaming from a drone. Share this stream with your team and guests for collaboration.
  • Vehicle Setup APIs: Perform a series of checks on the operational capabilities of a drone.
  • Gimbal Control APIs: Remotely control the gimbal pitch of a drone.
  • Camera Zoom APIs: Change the orientation of the camera and the zoom remotely.
  • Command & Control APIs: Send drone to a GPS location, control heading remotely.
  • Mission Planning APIs: Set a pre-programmed mission/path for a drone.
  • Precision Landing APIs: Land drones precisely on a machine-generated tag.
  • Collision Avoidance APIs: Integrate collision sensor data with FlytNow dashboard and set thresholds to avoid collisions.
  • Drone-in-a-box API: Integrate with Drone-in-a-Box hardware. Command drone launches and landings remotely. Moreover, you can retrieve charging (or battery swapping) & docking station statuses.
  • Geofence APIs: Set a virtual boundary for drones and trigger fail-safes in case of breaches.

What enterprises & startups can build using our Drone APIs

Drone-based autonomous security and surveillance system:


Security systems can be enhanced using drones. A custom enterprise web application can be integrated with CCTV cameras & software (for example Video Management solutions like Milestone), motion sensors, and ground-based hardware using FlytNow APIs. Further, businesses can leverage these APIs for mission planning to automate the patrolling of drones, thereby reducing the need for redundant manpower. Automation need not be limited to just spontaneous patrolling; it can be scheduled for regular security patrols using APIs for DiaB (Drone-in-a-Box hardware). Absolute autonomy lies in eliminating human interference starting from time-defined missions where the drone takes off, performs the mission, and docks back into the box to charge/swap batteries. In real-life, the system will leverage a unified dashboard as a command center and our live video streaming APIs to manage the entire operation. In the event of an intrusion, it will operate in the following way:

  • An intrusion alert goes off in the main dashboard. API integration with FlytNow triggers the drone system.
  • The system creates a waypoint mission for the drone. A drone automatically launches from a DiaB station and goes to the point of interest.
  • The drone begins live-streaming, and the human operator identifies the intruder from the live drone footage. The operator uses the payload APIs of FlytNow to maneuver the camera and look around. AI detection technology can also automatically identify intruders and help track them in the video.
  • On completion of the mission, the drone automatically returns back to the docking station.

Drones-based medical delivery system:


Companies have been actively building and deploying drone systems that can deliver critical medical payloads to remote locations. Zipline, a US-based company, is one example, with extensive operations in the African nations of Ghana and Rwanda. It relies on a centralized system operating from a medical warehouse, where all incoming requests for blood are fulfilled via drone delivery. The highlight of the system is that the drones fly autonomously from the warehouse to the delivery point, drop the payload, and return to base. The following FlytNow APIs can be used to build a similar system:

  • Mission Planning APIs: To set the route of a drone to the delivery location.
  • Navigation APIs: To take control of a delivery drone remotely in case of an emergency.
  • Vehicle Setup APIs: To run a diagnostic of a drone before sending it off to a mission.
  • Video Streaming APIs: To remotely monitor a delivery mission through a video feed.
  • Geofence APIs: To restrict the area of operation of the drones.
  • Command & Control APIs: Send drone to a GPS location, control heading remotely.
  • Collision Avoidance APIs: To get data from the onboard sensors and set thresholds to avoid collisions.
  • Payload APIs: To control the payload dropper or actuators.

Drone-based emergency response system:


Leveraging the FlytNow APIs, a response system can be built that is fully autonomous and integrated with a Computer Aided Dispatch system like 911. In the event of an emergency, an operator using such a system can dispatch a drone to survey the situation. On receiving the command, a drone will launch and fly to the location autonomously and begin acquiring data using its onboard camera. The operator can share the live feed of the drone with the first responders who can plan a better response.

The APIs used in this case would be similar to the delivery system mentioned above, with a focus on BVLOS capabilities and live-stream of data.

Summary


In this blog, we introduced the APIs of the FlytNow platform and the benefits of using them. In a nutshell, FlytNow is built for developers building applications to manage enterprise drone operations with BVLOS capabilities. Our extensive and reliable set of APIs is a result of our experience working with commercial drones for almost a decade. Originally published on FlytNow

Read more…

M-Eagle A2 long endurance vtol drone

The M-Eagle A2 long endurance VTOL drone is equipped with the Herelink 2.4 GHz HD video transmission system, an all-in-one data, video, and RC system with a maximum range of 20 km.
Max payload is 1 kg; a Sony A7R mapping camera, a 10x zoom dual-sensor camera, and a multispectral camera are recommended.
With a Zeus Power 30,000 mAh semi-solid battery, maximum endurance can be 2 hours (no payload).
#VTOL drone #Long range UAV #Mapping drone

Read more…
3D Robotics

From NewAtlas:

No matter how good we humans have made something, chances are nature did it better millions of years ago. Rather than compete, it’s often better to tap into the natural version – and that’s exactly what scientists have done with the Smellicopter, a drone that uses an antenna from a live moth to sniff out its targets.

We humans don’t tend to rely on it too much, but to moths, the sense of smell is crucial. They use their feathery antennae to scan for the smell of flowers, mates, and other important things, so they’re incredibly sensitive – a single scent molecule can trigger a cascade of cellular responses, very quickly.

Realizing that, engineers at the University of Washington hooked up an antenna from a live moth to a suite of electronics, and used it to guide a drone towards specific scents. They call the end result the Smellicopter.

“Nature really blows our human-made odor sensors out of the water,” says Melanie Anderson, lead author of the study. “By using an actual moth antenna with Smellicopter, we’re able to get the best of both worlds: the sensitivity of a biological organism on a robotic platform where we can control its motion.”

The antennae are sourced from the tobacco hawk moth, which is anesthetized before removal. Then, small wires are inserted into each end of the hollow antenna, which can measure the average signal from all of its cells. The antenna only stays biologically and chemically active for up to four hours after being removed from a live moth, but the researchers say this could be extended by storing them in the fridge.

[Image: The Smellicopter, a drone that uses a live moth antenna as a smell sensor. Photo: Mark Stone/University of Washington]

To test the cyborg's smelling prowess, the team placed it at the end of a wind tunnel and had it compete with a standard artificial odor sensor. When either a floral scent or the smell of ethanol was wafted down the tunnel, the antenna reacted faster than the other sensor and was able to cleanse its palate quicker between smells.

For the next experiments, the researchers then mounted the electronics onto a small, common quadcopter platform, which was equipped with two plastic fins to keep it oriented upwind, and four infrared sensors for obstacle detection and avoidance.

Finally, the Smellicopter was driven by an algorithm that mimicked how moths search for smells of interest. The drone starts off by drifting to the left for a set distance, and if it doesn’t detect a strong enough scent, it then moves to the right for a while. When it detects a smell, the drone will then fly towards it. If at any point those infrared sensors pick up an obstacle within 20 cm (8 in), the Smellicopter will change direction.

“So if Smellicopter was casting left and now there’s an obstacle on the left, it’ll switch to casting right,” says Anderson. “And if Smellicopter smells an odor but there’s an obstacle in front of it, it’s going to continue casting left or right until it’s able to surge forward when there’s not an obstacle in its path.”
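In pseudocode terms, the casting behaviour described above might look something like the following sketch; the thresholds and step sizes are illustrative guesses, not the values used in the study.

```python
def smellicopter_step(smell_strength, obstacle_ahead, obstacle_on_cast_side, state):
    """Return one movement command for the cast-and-surge behaviour.

    state["direction"] is 'left' or 'right' (current casting direction);
    state["travelled"] tracks how far the drone has cast in that direction.
    """
    CAST_DISTANCE = 1.0    # metres to drift before reversing (assumed)
    SMELL_THRESHOLD = 0.5  # normalised antenna signal that triggers a surge (assumed)

    if smell_strength > SMELL_THRESHOLD and not obstacle_ahead:
        return "surge upwind"                      # fly toward the odour source
    if obstacle_on_cast_side or state["travelled"] >= CAST_DISTANCE:
        state["direction"] = "right" if state["direction"] == "left" else "left"
        state["travelled"] = 0.0                   # reverse the casting direction
    state["travelled"] += 0.1                      # one small sideways step
    return "cast " + state["direction"]
```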

The team says the device could be useful for seeking out plumes of scent, such as chemical signatures from explosives or the breath of people trapped in rubble. That way, the drones could help in situations where it may be dangerous to send humans to investigate. And it might not be the only insect hybrid doing so -- other studies have experimented with using cyborg cockroaches, dragonflies, and locusts for similar purposes.

The research was published in the journal Bioinspiration & Biomimetics.

Read more…

BatMon enabled Smart Battery available for purchase


You can now buy BatMon enabled smart batteries off the website, saving you the engineering time for assembling your own. 

BatMon-enabled batteries can talk with ArduPilot, Pixhawk, Arduino, and ROS. BatMon enables safe and robust operation of robots using smart batteries. Batteries currently ship via ground within the contiguous United States.

Smart Battery features

  • 2-12S LiPo/Li-Ion Battery support
  • SMBus-based data protocol. Works out of the box with ArduPilot, Pixhawk*, Arduino, Raspberry Pi, etc. (see the sketch after this list)
  • 150A burst/ Optional 240A burst
  • Accurate current monitoring 
  • Accurate individual voltage monitoring
  • OLED screen for monitoring and outdoor viewing
  • Buzzer usable for warning
  • Firmware upgradeable with optional programmer
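
Here is a minimal sketch of reading such a battery from a Raspberry Pi with the smbus2 library, assuming BatMon follows the standard Smart Battery (SBS) register map at the usual smart-battery address 0x0B; check the BatMon documentation for the actual address and supported registers.

```python
from smbus2 import SMBus

BATTERY_ADDR = 0x0B        # typical SBS smart-battery address (assumption)
REG_VOLTAGE = 0x09         # pack voltage, mV
REG_CURRENT = 0x0A         # current, mA (signed 16-bit)
REG_RELATIVE_SOC = 0x0D    # relative state of charge, percent

with SMBus(1) as bus:      # I2C bus 1 on a Raspberry Pi
    mv = bus.read_word_data(BATTERY_ADDR, REG_VOLTAGE)
    ma = bus.read_word_data(BATTERY_ADDR, REG_CURRENT)
    if ma > 0x7FFF:        # interpret the raw word as a signed current
        ma -= 0x10000
    soc = bus.read_word_data(BATTERY_ADDR, REG_RELATIVE_SOC)
    print("%.2f V, %.2f A, %d%% charge" % (mv / 1000.0, ma / 1000.0, soc))
```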

Buy Smart Battery 6S 4500mAh

Get custom BatMon enabled Smart Battery


 

Read more…

Customizable open source smart charger (2-12s) : BatCha


We built BatMon for making smart batteries. BatMon can now talk with ArduPilot, Pixhawk, and ROS and enables safe and robust operation of robots using smart batteries. But there is a missing piece: a smart charger that can safely charge batteries and give customers peace of mind.

So, we are building exactly that, and a lot more, to future-proof your workflow as it scales. Below is a sneak peek at the most foolproof, future-proof smart charger we know of: BatCha.

BatCha features

  • Opensource Firmware
  • 2-12S LiPo Charge
  • Smart battery data interfaces:
    • I2C: Two wire based protocol such as SMBUS
    • CAN BUS: Two wire differential pair robust protocol for UAVCAN, CANaerospace etc
    • LIN BUS : Single wire communication protocol for micro drones
  • Two ports. Charge two batteries simultaneously.
  • 500 watts total power, shareable between the two ports
  • Max charging current per port: 20A
  • Measures battery temperature and voltage for safe charging limits for each battery
  • Optional: expansion puck to charge 4x smart batteries per port
  • Optional: Puck for 2-12S dumb battery charging and balancing
  • USB interface for monitoring and controlling charge workflow from PC
  • Optional: Integrated Raspberry Pi Compute module for Wi-Fi and Ethernet connectivity
  • OLED screen for monitoring and outdoor viewing
  • User interface engineered for speed and safety. Plug in to charge-start in less than 4 seconds
  • Zero effort to train customers on battery charging and safety. Customers have two options: Regular Charging / Fast Charging
  • Mechanical design optimized for robustness and repairability

Sign up here for more details.

 

Read more…

Arrows Hobby Marlin 64mm EDF PNP RC Airplane

The Arrows Marlin 64mm EDF is an all-new design with the beginner to intermediate pilot in mind. This plane has more robust fixed landing gear than most, to handle bumpy landings. More importantly, it is the only plane in its class with flaps, which shorten takeoff and make approaches and landings more like those of a high-wing trainer. The 3150 kV motor, combined with a 40 A ESC and a powerful 64 mm 11-blade EDF, ensures plenty of power for takeoff and maneuvers. The fan sounds like a real turbine.

This plane basically needs eight screws and some servo connections to complete assembly. No glue is necessary, and because we use 8 servos we are not running a lot of control rods around. Even the nose wheel has its own servo. Flap and aileron connections are made with ball linkages for greater strength at higher speeds. A latch-type canopy makes in-flight canopy loss a thing of the past.

If you are looking for a first EDF to try, or are looking to move up to an EDF without spending a bundle, this plane is for you.

Source:

https://www.arrows-hobby.net/arrows-hobby-marlin-64mm-edf-jet-pnp-rc-airplane.html

Features

  • Ample thrust courtesy of dual out-runner 64mm 12-bladed fans, dual 40A ESCs and 6S power.
  • Highly realistic functional and scale features.
  • CNC machined-metal shock-absorbing undercarriage.
  • Electric retracts with over-current protection.
  • High quality rubber tires.
  • Bearing-equipped full-flying horizontal stabilizer.
  • Preinstalled ball-linked linkages for precise surface movements.
  • Sleek aerodynamic airframe
  • Screw together assembly
  • Flaps
  • Ball linkages
  • Latch type canopy
  • Ultra durable EPO foam
  • The perfect beginner jet!
  • Park, School, or AMA field
Read more…

Rpanion-server 0.7 Released

Rpanion-server 0.7 has been released!

Rpanion-server is an open-source software package for managing a companion computer (such as the Raspberry Pi) connected to an ArduPilot or PX4 flight controller. It will run on most Linux-based systems.

Rpanion-server consists of a network manager, MAVLink telemetry routing, flight logging, and a low-latency video streaming server. All of these can be managed via a web-based user interface.

Documentation and pre-built disk images for the Raspberry Pi are available at https://www.docs.rpanion.com/software/rpanion-server. Source code is at https://github.com/stephendade/Rpanion-server
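As an example of consuming the routed telemetry from a ground machine, the sketch below uses pymavlink; the UDP output and port must be configured in the Rpanion-server web interface, and 14550 is just the common MAVLink default rather than a fixed value.

```python
from pymavlink import mavutil

# Listen on the UDP port that Rpanion-server has been configured to send telemetry to.
conn = mavutil.mavlink_connection('udpin:0.0.0.0:14550')
conn.wait_heartbeat()
print("Connected to system", conn.target_system)

while True:
    msg = conn.recv_match(type=['GLOBAL_POSITION_INT', 'SYS_STATUS'], blocking=True)
    print(msg.get_type(), msg.to_dict())
```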

New in 0.7 is:

  • Support MJPEG cameras for video streaming
  • Added button to disable all Wi-Fi adapters
  • GUI overhaul, using the Bootstrap framework
  • Various bug fixes

 

Read more…