All Posts (13940)

Lithium-Ion Battery Pack Designer Utility

Lithium-ion batteries generally have a higher energy capacity than lithium-polymer packs. Higher-capacity lithium-ion packs can be designed if the current draw of the drone is known. We have built a web utility to find the right cell chemistry to use from among the hundreds of different cells available.

Input the maximum weight and discharge current of the drone to see:

  • Battery cell configuration (number of cells in parallel and series) 
  • Approximate flight time as compared to current aircraft (without weight change)
  • Battery weight.

Reach out to us at shout@rotoye.com to get free access to this utility.
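For readers curious what such a tool does under the hood, here is a minimal sketch of the sizing logic, with an invented two-cell database and made-up numbers (an illustration only, not the Rotoye utility itself):

```python
# Rough battery-pack sizing sketch (illustrative only; cell data is invented).
# Given a target pack voltage, the drone's max continuous current draw and a
# weight budget, pick a series/parallel cell count for each candidate cell.

CELLS = [  # hypothetical cell database: name, nominal V, capacity Ah, max A, mass kg
    {"name": "cell_A_18650", "v_nom": 3.6, "cap_ah": 3.0, "max_a": 10.0, "mass_kg": 0.047},
    {"name": "cell_B_21700", "v_nom": 3.6, "cap_ah": 5.0, "max_a": 15.0, "mass_kg": 0.069},
]

def size_pack(pack_voltage, max_current_a, weight_budget_kg):
    options = []
    for c in CELLS:
        series = round(pack_voltage / c["v_nom"])        # cells in series to hit voltage
        parallel = int(-(-max_current_a // c["max_a"]))  # ceil: strings needed for current
        mass = series * parallel * c["mass_kg"]
        if mass > weight_budget_kg:
            continue                                     # too heavy for the airframe
        energy_wh = series * parallel * c["v_nom"] * c["cap_ah"]
        options.append((c["name"], f"{series}S{parallel}P", round(mass, 3), round(energy_wh)))
    # prefer the highest-energy pack that fits the weight budget
    return sorted(options, key=lambda o: -o[3])

for name, config, mass, wh in size_pack(pack_voltage=22.2, max_current_a=40, weight_budget_kg=1.2):
    print(f"{name}: {config}, {mass} kg, ~{wh} Wh")
```

A real tool would also estimate flight time from the pack energy and the aircraft's hover power.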

Read more…

3D printed truck

The high cost of good RC truck kits, the diminishing need for such kits & the noise of gearboxes made me look elsewhere for a robot platform.  3D printing a truck from scratch, with only a few metal parts still off the shelf, was the next step.  The only parts which have to be outsourced are the motors, steering servo & steering knuckles.  Everything else is 3D printed or home-made electronicals.  The size was based on the original Tamiya Lunchbox.

The remote control is a single-paw, 3-channel unit with a 100 mW radio, ballpoint pen springs & Hall effect sensors.

The remote control is charged inductively.

Motors are direct-drive Propdrive 4248s of any KV, rewound with 20 turns of 26 AWG.   They don't produce enough torque to go up hills.  The motor sensor is a dual Hall effect sensor resolver.

The main problem is cooling the motors while getting more torque.  Metal motor mounts of the right shape continue to be cost prohibitive & PLA doesn't dissipate heat.

Electronicals automate just enough to drive it with 1 paw, but not so much that it's never finished.

Traction & steering are 2 separate modules.  Steering uses a brushless servo with the stock Lunchbox servo saver.

Motors are powered by L6234's mounted dead bug style to get the heat out.

Tires are printed out of TPU in varying shapes to adjust hardness & traction.

Rear tires have a flat, wide shape for more forward traction.  Front tires have a round shape for more sideways traction.

The battery is completely enclosed.

Read more…

Great post from NASA explaining how the Mars helicopter autopilot works:

----

Before each of Ingenuity’s test flights, we upload instructions that describe precisely what the flight should look like. But when it comes time to fly, the helicopter is on its own and relies on a set of flight control algorithms that we developed here on Earth before Ingenuity was even launched to Mars.

To develop those algorithms, we performed detailed modeling and computer simulation in order to understand how a helicopter would behave in a Martian environment.  We followed that up with testing in a massive 25-meter-tall, 7.5-meter-diameter vacuum chamber here at JPL where we replicate the Martian atmosphere. But in all of that work, we could only approximate certain aspects of the environment. Now that Ingenuity is actually flying at Mars, we can begin to assess how things stack up against expectations. Here are some key aspects of the flight control system’s performance on Mars.

Takeoff

Unlike many consumer drones, Ingenuity is not controlled by changing the rotor speeds. Instead, we control our Mars Helicopter in the same manner as full-scale terrestrial helicopters: by changing the pitch angle of the blades, which affects the airfoil “angle of attack” and thereby determines how big a “bite” the blades take out of the air. The bigger the bite, the more lift (and drag) is produced. Like a traditional helicopter, we can change the pitch angle in two ways: by using “collective control,” which changes the blade pitch uniformly over the entire rotation of the blade, and by using “cyclic control,” which pitches the blade up on one side of the vehicle and down on the other.

When Ingenuity takes off, the rotor is already spinning at the setpoint speed of 2,537 rpm. We take off with a sudden increase in collective control on both rotors, which causes the vehicle to “boost” off the ground. During this initial takeoff phase, we limit the control system to respond only to angular rates (how quickly the helicopter rotates or tilts). The reason for this is that we don’t want the control system to be fighting against the ground, possibly resulting in undefined behavior.

The initial takeoff phase lasts for only a split second; once the helicopter has climbed a mere 5 centimeters, the system asserts full control over the helicopter’s position, velocity, and attitude. At this point we’re accelerating toward a vertical climb rate of 1 meter per second.

To estimate our movements during flight, we use a set of sensors that include a laser rangefinder (for measuring altitude) and a camera. We don’t use those sensors until we reach 1 meter altitude out of concern that they might be obscured by dust near the ground. Instead, we initially rely only on an inertial measurement unit (IMU) that measures accelerations and angular rates, and we integrate those measurements to estimate our movements. This is a type of “dead reckoning” navigation – comparable to measuring how far you’ve walked by counting your steps. It’s not very accurate in the long run, but because Ingenuity takes only a couple of seconds to reach 1 meter, we can make it work.
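As a toy illustration of that dead-reckoning step (not flight code; the acceleration profile below is invented), integrating IMU accelerations twice over a couple of seconds gives the short-horizon climb estimate the article describes:

```python
# Toy dead-reckoning example: integrate vertical acceleration from an IMU to
# estimate climb rate and altitude over the ~2 s before the altimeter is trusted.
# The sample data is invented; real IMU data is bias- and noise-corrupted, which
# is why this only works over very short intervals.

dt = 0.002                            # 500 Hz IMU samples
accel_z = [1.0] * 500 + [0.0] * 500   # m/s^2 net vertical accel: boost, then steady climb

velocity = 0.0
altitude = 0.0
for a in accel_z:
    velocity += a * dt          # first integration: acceleration -> velocity
    altitude += velocity * dt   # second integration: velocity -> position

print(f"climb rate ~{velocity:.2f} m/s, altitude ~{altitude:.2f} m after {len(accel_z)*dt:.1f} s")
```

The drift that accumulates in this kind of integration is exactly why the camera and rangefinder take over once the vehicle is clear of the dust.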

 

Ingenuity's rotor power during Flight Two. Credits: NASA/JPL-Caltech.

One of the things we were curious about is how “confidently” Ingenuity would boost off the ground and reach that first threshold of 5 cm. Data from the first three flights shows that portion of the climb took about 0.25 seconds, which is very much in line with expectations and indicates that Ingenuity had no issue producing enough thrust on takeoff. During this initial boost, we expected to see a spike in the power required by the rotor system, and that is indeed what we observed. For example, the spike in Flight Two was about 310 watts (W) – well below the maximum capacity of our batteries, which can tolerate spikes as high as 510 W.

 

 

Ingenuity Flight Two: A picture from the navigation camera aboard Ingenuity captured the helicopter on takeoff during Flight Two, showing little sign of dust. Credits: NASA/JPL-Caltech.

After takeoff, Ingenuity took about 2 seconds to reach the 1-meter altitude where it could start using its full suite of sensors. That being said, while we did see some faint dust in the images taken by the Perseverance rover (parked nearby) on takeoff, there was no indication flying dust or sand obscured the altimeter or camera, so our design appears to have erred on the cautious side in this regard (which is a good thing).

 

The moment the helicopter’s legs leave the ground, its motion starts to become affected by wind. These winds can cause the vehicle to momentarily roll (side to side) or pitch (forward or backward) on takeoff, until it has time to catch and correct itself. We were prepared for some significant roll/pitch angles on takeoff if winds were high at the ground level, but in Ingenuity’s three takeoffs so far, they have been limited to a couple of degrees only, making for nice, vertical takeoffs.

Hover

 

Ingenuity's horizontal position relative to start during the Flight One hover. Credits: NASA/JPL-Caltech.

During hover phases of flight, we are attempting to maintain a constant altitude, heading, and position. In evaluating how well we are managing to achieve that, we are forced, for the most part, to rely on Ingenuity’s own estimates of what it was doing, as we have limited data establishing “ground truth.” Those estimates are subject to errors in navigation that will be covered in a separate post. But the steadiness of these estimates tells us a lot about how tightly the controller is able to hold the desired values. 

 

The data shows that we hold our altitude extremely well in hover, to within approximately 1 cm. We also hold the heading (which way we point) to within less than 1.5 degrees. For horizontal position, we’ve seen variations up to approximately 25 cm. Such variations are expected as the result of wind gusts.

So, what has the wind been like during our flights? Fortunately for us, the Perseverance rover carries the MEDA weather station. For Flight One, we have measurements from MEDA indicating winds of 4-6 meters per second from the east and southeast during most of the flight, gusting to 8 meters per second. Keep in mind that those measurements are made 1.5 meters above ground level, and the tendency is for winds to increase as you go from ground level up. We also have atmospheric density measurements at the time of Flight One, showing 0.0165 kilograms per cubic meter, or about 1.3% of Earth’s density at sea level. Using this information, we can assess the system’s performance in another important respect – namely, the control effort required to fly.

 

Ingenuity's collective control during Flight One. Credits: NASA/JPL-Caltech.

For the collective control (remember, that is the one that changes rotor blade pitch angle uniformly to affect helicopter’s thrust), we would like to see hover values roughly consistent with prior expectations. During Flight One, we hovered with around 9.2 degrees collective on the lower rotor and 8.2-degree collective on the upper (that’s the angle of the blade’s “chord line” – an imaginary line drawn from the leading edge to the trailing edge of the rotor blade – at ¾ of the rotor radius). Those values are 0.7-0.8 degrees lower than the trim values we anticipated (9.0 degree on the upper rotor and 9.9 degree on the lower rotor). But those trim values were tuned based on tests without wind at a somewhat different density/rotor speed combination, so this difference is not unexpected. Another indication that we are within our aerodynamic comfort zone is the electrical rotor power of around 210 W in hover, which is also right in the vicinity of what was expected. Taken together, the results indicate that we have good margin against “aerodynamic stall,” which is when the blade airfoil’s angle relative to the surrounding airflow is increased beyond the point where it can produce further increases in lift.

 

 

Ingenuity's lower cyclic control on Flight One. Similar cyclic controls were applied on the upper rotor. Credits: NASA/JPL-Caltech.

We also evaluate the cyclic control, which is used to create roll and pitch moments on the vehicle. We have seen relatively steady values in hover, generally of magnitude less than 3 degrees, which leaves ample margin against the upper limit of 10 degrees. The cyclic control inputs tell us a fair amount about the wind that the vehicle has to fight against. For example, for Flight One the cyclic control is consistent with winds from the east and southeast, which is in alignment with MEDA observations. The cyclic control effort also increases with altitude, which indicates that winds are getting higher further from the ground.

 

Landing

Landing is a particularly challenging part of any flight. Ingenuity lands by flying directly toward the ground and detecting when touchdown happens, but a number of events occur in rapid succession leading to touchdown. First, a steady descent rate of 1 meter per second is established. Then, once the vehicle estimates that the legs are within 1 meter of the ground, the algorithms stop using the navigation camera and altimeter for estimation, relying on the IMU in the same way as on takeoff. As with takeoff, this avoids dust obscuration, but it also serves another purpose -- by relying only on the IMU, we expect to have a very smooth and continuous estimate of our vertical velocity, which is important in order to avoid detecting touchdown prematurely.

About half a second after the switch to IMU-only, when the legs are estimated to be within 0.5 meters of the ground, the touchdown detection is armed. Ingenuity will now consider touchdown to have occurred as soon as the descent velocity drops by 25 centimeters per second or more. Once Ingenuity meets the ground, that drop in descent velocity happens rapidly. At that point, the flight control system stops trying to control the motion of the helicopter and commands the collective control to the lowest possible blade pitch in order to produce close to zero thrust. The system then waits 3 seconds to ensure the helicopter has settled on the ground before spinning down the rotors.
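A minimal sketch of that touchdown trigger, assuming the logic simply watches the IMU-only descent-rate estimate for a sudden 25 cm/s drop from the commanded 1 m/s descent (invented sample data, not JPL code):

```python
# Toy touchdown detector: during the final descent the vehicle commands a steady
# -1.0 m/s vertical velocity; touchdown is declared when the estimated descent
# rate suddenly drops by 0.25 m/s or more from the commanded value.

COMMANDED_DESCENT = -1.0   # m/s (negative = downward)
TRIGGER_DELTA = 0.25       # m/s drop in descent speed that counts as touchdown

def touchdown_detected(vz_estimate, armed):
    """vz_estimate: current estimated vertical velocity (m/s), IMU-only."""
    if not armed:              # armed only once the legs are estimated within ~0.5 m of ground
        return False
    return vz_estimate >= COMMANDED_DESCENT + TRIGGER_DELTA   # e.g. -0.75 m/s or slower

# Invented descent profile: steady -1 m/s, then the legs meet the ground and velocity collapses.
profile = [-1.0, -0.99, -1.01, -1.0, -0.7, -0.2, 0.0]
for i, vz in enumerate(profile):
    if touchdown_detected(vz, armed=True):
        print(f"touchdown declared at sample {i}, vz = {vz} m/s")
        break
```

Running the check on the smooth IMU-only estimate is what keeps a brief estimator glitch from firing the trigger while still airborne.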

People have asked why we contact the ground at the relatively high speed of 1 meter per second. There are multiple reasons for this. First, it reduces the dead-reckoning time that we need to spend without using the camera and altimeter; second, it reduces the time spent in “ground effect,” where the vehicle dynamics are less well-characterized; and third, it makes it easier to detect that we’ve touched down (because the velocity change is clearly sufficient for detection). What makes this strategy possible is the landing gear design which helps prevent the vehicle from bouncing on landing.

 

Ingenuity's estimate of vertical velocity during Flight Two. Credits: NASA/JPL-Caltech.

Any touchdown detection algorithm of this kind has to strike a balance between two potential pitfalls: (1) detecting touchdown too early (thereby dropping to the ground from the air) and (2) not detecting touchdown soon enough (which would cause the helicopter to keep trying to fly after coming in contact with the ground). Data from Ingenuity’s flights on Mars show that we were not in danger of either of these scenarios. During descent, Ingenuity has maintained its vertical velocity to within approximately 4 cm per second, and it has detected the necessary 25 cm per second drop within approximately 30 milliseconds of touchdown.

 

As we continue with our flights on Mars, we will keep digging deeper into the data to understand the various subtleties that may exist and would be useful in the design of future aerial explorers. But what we can already say is: Ingenuity has met or exceeded our flight performance expectations.

Read more…

Dear Friends,
One month ago our CubeSat "FEES" was released into space by a Soyuz-2. The hardware is based on the VRBrain 5 architecture, so we have put an STM32F4 microcontroller in space. This first release does not run ArduPilot; we hope the next one can, with your help and our experience. After a lot of testing and study of communication in space, we found that the LoRa protocol is a great option for satellite-to-ground, and potentially ground-to-satellite, links. FEES is still in space and continues to transmit its position to Earth. To receive it we are using the global open-source network TinyGS. The next step is to communicate with the satellite constantly, all around the world, and for that we need the help of this great community: we need small ground stations around the world so we can connect to our CubeSat at any time of day or night. FEES is only the first satellite we have in space, but in my opinion we can add more CubeSats in the future and interconnect them. I am looking for other drone enthusiasts around the world with a ham licence to set up ground stations and cover the globe. The project is already available in an alpha version. I have developed a global dashboard where all ground stations will be visible, much like TinyGS, but with our implementation we can also transmit messages to the satellite. Just as was done 12 years ago with ArduPilot, we are trying to implement a version of ArduPilot for satellites; the ground station, protocol, and firmware for special features will be developed over the next months. Anyone around the world who would like to help me and the other members of this new ArduPilot group is welcome. The ground station is only the starting point, and FEES is only the first satellite. I hope the next one will carry our community's ArduPilot code in space! 😉 Below are a picture of FEES's positions in space over the last month and a picture of the dashboard during the first global transmission with the sophyai.space ground station. If you need more info or would like to join this new project, contact me: on the ArduPilot Discord there is a new channel called #satellite.
This is the link to the ArduPilot Discord channel: https://discord.gg/SJAxYMHYQp
If you need more info, don't hesitate to contact me.
 
Last month's transmissions from FEES around the world:
 
Read more…

From Applied Aeronautics:

The Albatross UAV made its first appearance in 2013 on DIY Drones. The Albatross UAV Project, as it was titled, set out to design a composite UAV that met a long list of precise and ambitious performance metrics: >6 kg MTOW, plenty of room for sensors and batteries, up to 4 hours of flight time, a wide flight envelope, and, last but not least, efficiency. Over the span of 13 months, the Albatross began to take shape through meticulous design and testing. To date, our company has Albatross systems flying in over 50 countries and on every continent. Still, it's important to us to always remember our roots as a company spurred by the passion to create something truly incredible from the ground up.

Read more…

AIRLink stands for Artificial Intelligence & Remote Link. The unit includes a cutting-edge drone autopilot, an AI mission computer, and an LTE connectivity unit. Start your enterprise drone operations with AIRLink and reduce the time to market from years and months down to a few weeks.

More info: https://sky-drones.com/airlink

Sky-Drones Technologies hit their 10-year anniversary this year, and to celebrate the occasion the company decided to mark it with the launch of the latest addition to their product range. This has been an entire year in the making, excitement has built up, and expectations have been high, but the team have absolutely not disappointed!

As drones become increasingly software-oriented as well as hardware-defined, Sky-Drones Technologies must stay at the forefront of innovation. With that in mind, the company presents, to UAV industry specialists and enthusiasts alike, AIRLink: the artificial intelligence and remote link unit that is installed on your enterprise UAV to provide:

  1. Cutting edge autopilot: manual and fully autonomous flight capabilities for multirotors, fixed wings and VTOLs
  2. AI Mission Computer: onboard data processing and computer vision
  3. LTE Connectivity: BVLOS flights, Cloud connectivity and remote workflows

What these exceptional elements mean for the UAS industry is the start of a whole new era in UAV flight control, data analytics, and safety. Never before has there been a product that can sustain such high computational power at extreme ambient temperatures without overheating, a product with fully integrated hardware and software for flight and payload data analytics, a product that is so many products in one, or one product with so many cutting-edge capabilities... until now.

To give an overview, AIRLink comes with an FPV HDR 1080p camera that provides users with a video stream of exceptional image quality. The unit is constantly connected to the internet, has been designed with critical peripherals as an essential element for user convenience, and has built-in LTE and Wi-Fi.

The goal with AIRLink is to integrate this all-in-one unit into UAV designs during the manufacturing stage, so that manufacturers can focus their efforts and resources on the drone itself rather than on avionics and connectivity. With that in mind, Sky-Drones offers several options for adopting AIRLink:

  1. AIRLink OEM – purchase AIRLink as Sky-Drones designed it. Includes all the features and an aviation-grade aluminium casing with heatsink. Integration is as simple as installing the unit on your UAV.
  2. AIRLink Enterprise – Sky-Drones provides the internal electronics of the AIRLink unit without the casing, reducing its size so it can be integrated directly into your UAV during manufacturing.
  3. Reference Design – build AIRLink yourself in-house. Sky-Drones will provide all the relevant instructions and manufacturing material, and will be on hand for engineering support while you set up manufacturing of your own AIRLink units.

Read more…

Hacking a cheap drone to run PX4

8831264061?profile=RESIZE_710x

From Hackaday:

Sometimes bad software is all that is holding good hardware back. [Michael Melchior] wanted to scavenge some motors and propellers for another project, so he bought an inexpensive quadcopter intending to use it for parts. [Michael] was so surprised at the quality of the hardware contained in his $100 drone that he decided to reverse engineer his quadcopter and give the autopilot firmware a serious upgrade.

Upon stripping the drone down, [Michael] found that it came with a flight management unit based on the STM32F405RG, an Inertial Measurement Unit, magnetic compass, barometric pressure sensor, GPS, WiFi radio, camera with tilt, optical flow sensor, and ultrasonic distance sensor, plus batteries and charger! The flight management unit also had unpopulated headers for SWD, and—although the manufacturer’s firmware was protected from reading—write protection hadn’t been enabled, so [Michael] was free to flash his own firmware.

We highly recommend you take a look at [Michael]’s 10 part tour de force of reverse engineering which includes a man-in-the-middle attack with a Raspberry Pi to work out its WiFi communication, porting the open-source autopilot PX4 to the new airframe, and deciphering unknown serial protocols. There are even amusing shenanigans like putting batteries in the oven and freezer to help figure out which registers are used as temperature sensors. He achieves liftoff at the end, and we can’t wait to see what else he’s able to make it do in the future.

Of course, [Michael] is no stranger to hacking imported quadcopters, and if you’re interested in PX4 but want something quieter than a quadcopter, take a look at this autopilot-equipped glider.

Read more…

Drone swarms for fire-fighting

From Robohub:

For my PhD, I’m studying how global problems such as wildfires and aid delivery in remote areas can benefit from innovative technologies such as UAV (unmanned aerial vehicle) swarms.

Every year, vast areas of forests are destroyed due to wildfires. Wildfires occur more frequently as climate change induces extreme weather conditions. As a result, wildfires are often larger and more intense. Over the past 5 years, countries around the globe witnessed unprecedented effects of wildfires. Greece has seen the deadliest wildfire incident in its modern history, Australia witnessed 18,636,079 hectares being burnt and British Columbia faced wildfire incidents that burnt 1,351,314 hectares.

Rather than jump straight into the design of swarm algorithms for UAVs in these scenarios, I spent a year speaking with firefighters around the world to ask about their current operations and how UAVs could help. This technique is called mutual shaping, where end users cooperate with developers to co-create a system. It's been interesting to see the challenges they face, their openness to drones, and their ideas on the operation of a swarm. Firefighters face numerous challenges in their operations; their work is dangerous and they need to ensure the safety of citizens and themselves. Unfortunately, they often don't have enough information about the environment they are deploying in. The good news is that UAVs are already used by firefighters to retrieve information during their operations, usually during very short human-piloted flights with small off-the-shelf drones. Having larger drones, with longer autonomy and higher payload, could provide high-value information to firefighters or actively identify and extinguish newly developed fire fronts.

I think one of the reasons we don’t have these swarms in these applications is that 1) we didn’t have the hardware capabilities (N robots with high-payload at a reasonable cost) , 2) we don’t have the algorithms that allow for effective swarm deployments at the scale of a country (necessary to monitor for forest fires), and 3) we can’t easily change our swarm strategies on the go based on what is happening in reality. That’s where digital twins come in. A digital twin is the real-time virtual representation of the world we can use to iterate solutions. The twin can be simplistic in 2D or a more realistic 3D representation, with data from the real-world continuously updating the model.

To develop this framework for digital twins, we're starting a new project with Windracers Ltd, Distributed Avionics Ltd, and the University of Bristol, co-funded by Innovate UK. Windracers Ltd has developed a novel UAV, the ULTRA platform, that can transport 100 kg of payload over a 1,000 km range. Distributed Avionics specialises in high-reliability flight control solutions, and the Hauert Lab engineers swarm systems across scales, from huge numbers of tiny nanoparticles for cancer treatment to robots for logistics.

In the future, we aim to use the same concepts to enable aid delivery using UAV swarms. The 2020 Global Report on Food Crises states that more than 135 million people across 53 countries require urgent food, nutrition, and livelihoods assistance. Aerial delivery of such aid could help access remote communities and avoid in-person transmission of highly infectious diseases such as COVID-19.

 

By the end of this project, we are hoping to demonstrate the deployment of 5 real UAVs that will be able to interact with a simple digital twin. We also want to test live deployment of the control swarm system on the UAVs.

Are you a firefighter or someone involved in aid delivery, or do you have other ideas for us? We'd love to hear from you.

 

Read more…

Endangered Snow Leopards and their habitat are under threat, KwF to use drones to protect these elusive cats of the Himalayas

For more than a decade, Kashmir World Foundation (KwF) has been operating aircraft around the world protecting endangered cats, rhinos, sea turtles, and many other species. We integrate airframes with power and propulsion systems to carry sensors, high performance computers, and special components needed to perform our missions. Operating on battery or liquid fuel, our aircraft meet most of our mission needs.

But the mission that drove us to create KwF -- protecting snow leopards and other endangered species in the Himalayas -- eluded us with demands on flight performance that could not be met by any existing aircraft. To support this mission, the aircraft must operate at very high altitude while close to ground below and to the sides as they navigate through extremely rocky terrain while encountering high and highly variable winds with snow, ice, and freezing rain. Remote from any conceivable operating base, and with needs to patrol over tens of thousands of square kilometers, the aircraft must be able to stay on station for at least 8 hours processing data, making decisions, and taking swift action when needed, all while barely making a sound.

Background

Fifty million years ago, Eurasia and current day India collided to create the tallest mountain range in the world -- the Himalayas. Lifted by the subduction of the Indian tectonic plate under the Eurasian Plate, the Himalayan mountain range runs west-northwest to east-southeast in an arc 2,400 km (1,500 mi) long. Standing at 8,848 meters (29,031.7 ft), nestled between Nepal, Tibet, and China, Mount Everest is the tallest mountain peak in the world. Glaciers and fast flowing water have cut the mountains creating jagged peaks and valleys through which winds blow with rapidly varying direction.

A map of mountain ranges in snow leopard habitat, with the sites of published population studies marked. Image taken from Suryawanshi et al, 2019.

 

Masters of disguise, the snow leopards (PANTHERA UNCIA) evolved about 3.9 million years ago from Miacids, a common ancestor of tigers, to become king of the Himalayas. These elusive cats live above the high alpine areas where they hunt the blue sheep, Argali wild sheep, ibex, marmots, pikas, deer and other small mammals.

The global snow leopard population remains unknown, but based on their research, Suryawanshi and his colleagues fear it may be lower than prevailing guesstimates suggest.

Photo by Shan Shui / Panthera / Snow Leopard Trust

Having thrived under some of the harshest conditions on Earth, snow leopard populations are now in rapid decline and will soon be gone unless action is taken to ensure their protection. Despite the species being listed as endangered since 1972 and being legally protected, humans hunt and poison them for trophies, fur, and body parts; hunt their prey; and destroy their habitat through overgrazing by domestic animals. Researchers aren't sure how many snow leopards are left in the world. In 2016, the IUCN estimated that there are between 2,710 and 3,386 snow leopards in the high mountains of Central and South Asia.

Video by Dr. Ali Nawaz

Problem

Protecting snow leopards and other endangered species in the Himalayas requires surveillance over broad areas. To meet these stressing needs, Kashmir-Robotics, the Science and Technology division of KwF, proposed using aerial robotics enabled with artificial intelligence to provide self awareness, situational awareness, and the ability to act quickly and decisively.

In collaboration with the Technology Assisted Counter Poaching (TACP) network, Dr. Ronald Pandolfi, Chief Technology Officer for KwF and Dr. Muhammad Ali Nawaz, Executive Director of Snow Leopard Foundation in Pakistan derived a set of operational requirements for an unmanned aerial system (UAS) to perform the mission. Dr. Ronald Pandolfi concluded, “the aircraft must be able to launch from low altitude, climb to over 5,000 meters while traversing to snow leopard habitat, then stay on station for at least 8 hours while navigating between rocky slopes and deep valleys, all while exposed to high and variable winds, snow, and ice.” Hence, Eagle Ray was born.

Baseline

With no current aircraft able to meet these requirements,  Princess Aliyah Pandolfi, Executive Director of KwF turned to Chief Scientist & Vice President of DZYNE Technologies, Mark Page, world leading designer of high performance unmanned aerial systems and creator of the modern blended wing body (BWB) aircraft. She asked him to design a planform and airfoil stack for an aircraft with gross takeoff mass less than 25 kg (to comply with Federal Aviation Administration regulations) that is efficient and agile while able to operate in high and variable winds under some of the most extreme weather conditions.

Rooted in a long history of successful BWB designs, Page provided a baseline adjusted for the flight characteristics required for the KwF missions. Page says, "Blended Wings like Eagle Ray are well suited to flight at extreme altitudes. First, high-altitude flight is much easier with an electric airplane, since the powerplant doesn't rely on oxygen from the atmosphere to generate power. At high altitudes the oxygen density is about half that available at sea level. A gas-powered plane would lose half its horsepower, while an electric plane loses none. Second, to maintain efficient flight in the thinner air, the true flight speed must increase about 40%. The power required to fly is the product of flight speed times aerodynamic drag. Eagle Ray will need to fly faster like any other plane, so the only question is: which plane makes the least drag? Efficiency is the gift that keeps on giving, and the BWB produces about 30% less drag to bear the airplane's weight."
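To make the quoted reasoning concrete, here is a back-of-the-envelope check using ISA-style numbers (my own illustration, not DZYNE's analysis): holding lift equal to weight at a fixed lift coefficient, true airspeed scales with 1/sqrt(density), so at half the air density the speed, and with it the power needed to overcome drag, rises by roughly 40%.

```python
import math

# Lift L = 0.5 * rho * V^2 * S * CL must equal weight, so at constant CL the
# true airspeed scales as 1/sqrt(rho). Drag at constant CD is then unchanged,
# and power = drag * V grows with V.
rho_sea_level = 1.225            # kg/m^3, ISA sea level
rho_half = rho_sea_level / 2.0   # roughly the density around 6-7 km altitude

speed_ratio = math.sqrt(rho_sea_level / rho_half)
print(f"true airspeed up by ~{(speed_ratio - 1) * 100:.0f}%")                      # ~41%
print(f"power required up by ~{(speed_ratio - 1) * 100:.0f}% (same drag, higher speed)")
```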

Nepal Collaboration

“At KwF, we look to the world for good ideas and great engineering,” Princess Aliyah says, “Our mission is to save endangered species from extinction, so we have reached out in the broadest sense for innovation and invention to meet our demanding mission capabilities.”

 

Dr. Ronald Pandolfi, Princess Aliyah, and Kapil Regmi share their signed copies of an MoU between KwF & acem that will inspire students in Nepal to collaborate internationally to help protect the environment and endangered species through the use of innovative drones and artificial intelligence.

KwF’s partnership with Advanced College of Engineering & Management (acem) in Nepal will give Eagle Ray a platform for testing at 18,000 feet and give students the chance to participate in an international design challenge. Kapil Dev Regmi, the Executive Director of acem, is excited about the collaboration because “this state-of-the-art Eagle Ray Project is going to inspire students at so many different levels. It will motivate students to keep dreaming and prove that the true dreamers are the true achievers. Eagle Ray project is not just another project being carried out solely by an international organization but the project which collaborated with a local partner (like acem) to involve students right from the project development phase to ensure the end to end project overview. While most drone projects are being done just at local level (mostly on medical deliveries to rural areas), Eagle Ray project will open new dimensions of exploration like using tech to understand biodiversity and help to conserve them. This will showcase the idea of *innovation* to students. It is a project which is way ahead of time for Nepali students that will excite them to innovate for local problems.”

Global Challenge

Just a few years ago, only large aerospace corporations could design advanced-performance aircraft. Now, with global communication resources like DIY DRONES and open-source tools like OpenVSP, individuals and small teams from around the world can contribute designs and breakthrough technologies.

KwF now challenges students, academics, and professionals around the world to improve on the Mark Page BWB Eagle Ray design, and in so doing help KwF create an aircraft that is better suited to the mission of protecting endangered species throughout the Himalayas. More specifically, the challenge is to improve on the planform and airfoil stack, increasing endurance while maintaining stability and maneuverability. All challenge participants receive access to the current and final design, and those who succeed in improving Eagle Ray will have their names appear on the official list of contributors and their logos emblazoned on every Eagle Ray aircraft.

Challenge teams can attempt to make improvements to the overall planform and airfoil stack, or they can concentrate on special features including elevons, trim tabs, tip rudders, or other control surfaces. Each proposed change must include a statement of why the change is suggested and how it is expected to improve performance and enable greater mission capability, along with calculations, simulations, and/or charts.

The Eagle Ray Design Challenge Opens April 15, 2021! Join the challenge to help protect endangered snow leopards. Details of the challenge criteria and rules of engagement will be available at www.kashmirworldfoundation.org or contact: info@kashmirworldfoundation.org

 

 

Read more…

Introduction

It can be difficult to get useful data from a magnetometer. It can be especially difficult if the data is used to estimate the yaw attitude of the vehicle. For example, the sensor may indicate the proper direction for certain attitudes, but a wrong direction for other attitudes. Or, the yaw estimate may be accurate one day and off the next.

In order for magnetometer data to yield the yaw attitude of a vehicle, the magnetometer must measure the direction of the geomagnetic field (i.e. the Earth's magnetic field). The geomagnetic field points north (more or less) and has been used for navigation for hundreds of years. The challenge is that the geomagnetic field is relatively weak. It is common for the geomagnetic field to be distorted or obscured by extraneous magnetic sources in the vicinity of the magnetometer. The purpose of magnetometer calibration is to extract an observation of the geomagnetic field vector from the sensor’s output, which is corrupted by various errors. 

Over the last several years, I have tinkered with the AK8963 magnetometer, which is part of the MPU-9250 IMU. Through trial-and-error, I eventually arrived at a dependable calibration procedure. The calibration procedure is repeatable and produces sufficiently accurate estimates of the yaw attitude. It is typical for the yaw attitude to have less than 2 degrees of error, provided the aircraft is in the wings-level configuration. Because of this work, the magnetometer now serves as the primary yaw attitude sensor within our autopilot system. 

In this article, I want to share four lessons I learned while integrating the magnetometer into an autopilot system. These lessons are probably familiar to those who work with magnetometers. However, for those who are new to magnetometers, this article may alert you to some common pitfalls. The four lessons are:

  1. Artificial errors impose special considerations on magnetometer calibration.
  2. Autocalibration methods have a fundamental flaw.
  3. Keep it simple with the compass swinging method.
  4. The throttle command can be related to a magnetometer bias.

In the remainder of this article, I will expand upon each of the previous points. For a more mathematical treatment of the proposed calibration procedure, the reader is referred to the attached PDF.

Artificial errors impose special considerations on magnetometer calibration

Magnetometers are subject to two types of errors: instrument errors and artificial errors. Instrument errors are associated with the sensor itself. Instrument errors are less noticeable in high-grade sensors than low-grade sensors. Artificial errors are extraneous magnetic sources that compete with the geomagnetic field. Artificial errors are associated with the environment. For this reason, you should calibrate the magnetometer in an environment that is similar to the sensor's operative environment. Practically, this means calibrating the sensor in its final configuration on the vehicle. This also means calibrating the sensor outdoors, away from buildings and power lines, which may distort the geomagnetic field. While it may be more convenient to calibrate an isolated magnetometer on a workbench, the calibration would likely become obsolete as soon as the magnetometer is mounted on the vehicle. 

It is worth emphasizing that artificial errors are not associated with the sensor itself.  Consequently, you cannot fix artificial errors by getting a better sensor. You fix artificial errors by (i) shielding/isolating the extraneous sources or (ii) removing the effects of the artificial errors from the data. In this work, we take the latter approach.

That said, the choice of magnetometer is still important. The full-scale range (FSR) of the sensor is particularly important. You don't want a sensor with an FSR that is orders of magnitude greater than the magnitude of the quantity of interest. This is because range and resolution are competing factors: what you gain in range you lose in resolution. For our application (vehicle state estimation), the quantity of interest is the geomagnetic field, the magnitude of which is about 0.5 Gauss. The AK8963 is a poor choice for our application because its FSR is 50 Gauss, which is 100x greater than the quantity of interest!
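A quick arithmetic check illustrates the tradeoff (assuming, for illustration, a signed 16-bit output word; the exact scaling depends on the sensor's configured mode):

```python
# Range vs. resolution: with a 50 Gauss full-scale range and a signed 16-bit
# output, one count (LSB) is worth about 1.5 mGauss.
fsr_gauss = 50.0
lsb = fsr_gauss / 2**15          # positive half of a signed 16-bit range

geomagnetic_field = 0.5          # Gauss, roughly
print(f"LSB ~ {lsb * 1000:.2f} mGauss")
print(f"the geomagnetic field spans only ~{geomagnetic_field / lsb:.0f} of 32768 counts")
```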

Autocalibration methods have a fundamental flaw

An autocalibration method determines the calibration parameters solely from magnetometer measurements. This feature makes it easy to collect the calibration data. Undoubtedly, the ease of data collection contributes to the popularity of a particular autocalibration method, the ellipsoid-fitting method. However, it can be shown that autocalibration methods are unable to correct for misalignment between the magnetometer and other inertial sensors [1]. Furthermore, it can be shown that misalignment is detrimental to attitude estimation [2]. In order to correct for magnetometer misalignment, additional sensors must be used. The theoretical flaw with autocalibration methods has real-world ramifications. In my experience, I have yet to see the ellipsoid-fitting method produce satisfactory estimates of the yaw attitude.
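For reference, here is a minimal sketch of the ellipsoid-fitting idea, reduced to an axis-aligned (diagonal) model so the least-squares step stays short; full implementations also fit the cross terms, but neither variant can observe a fixed rotation of the magnetometer relative to the other inertial sensors, which is the flaw discussed above:

```python
import numpy as np

def fit_axis_aligned_ellipsoid(m):
    """m: (N, 3) raw magnetometer samples collected while tumbling the sensor.
    Fits a*x^2 + b*y^2 + c*z^2 + d*x + e*y + f*z = 1 by linear least squares and
    returns the hard-iron offset and per-axis scale factors (diagonal soft iron only)."""
    x, y, z = m[:, 0], m[:, 1], m[:, 2]
    D = np.column_stack([x**2, y**2, z**2, x, y, z])
    p, *_ = np.linalg.lstsq(D, np.ones(len(m)), rcond=None)
    a, b, c, d, e, f = p
    offset = np.array([-d / (2 * a), -e / (2 * b), -f / (2 * c)])
    gain = 1.0 + d**2 / (4 * a) + e**2 / (4 * b) + f**2 / (4 * c)
    scale = np.sqrt(gain / np.array([a, b, c]))   # semi-axis lengths of the fitted ellipsoid
    return offset, scale

# calibrated measurement: (raw - offset) / scale, which lies (approximately) on a unit sphere
```

After this correction the data lies neatly on a sphere and looks calibrated, yet a misaligned sensor would pass exactly the same test.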

Keep it simple with the compass swinging method

Alternatives to the ellipsoid-fitting method include the dot-product invariance (DPI) method [3] and the compass swinging method. These alternative methods correct for misalignment by assimilating data from an additional sensor. The DPI method assimilates accelerometer data. The swinging method assimilates data from an "imaginary" magnetometer. We obtain data from the "imaginary" magnetometer by deducing the components of the geomagnetic field based on the orientation of the vehicle with respect to a compass rose. The mathematical details for all three calibration methods are included in the attached PDF.

The calibration procedure that I recommend applies the DPI method and the compass swinging method in succession. The DPI method is used to obtain a crude 3D calibration. The swinging method is used to enhance the measurement accuracy in the wings-level position. Of course, the calibration accuracy will decrease as the vehicle departs from the wings-level position. However, it is reasonable to assume the vehicle will remain close to the wings-level position during constant-altitude operations. Furthermore, deviations from the wings-level position are bounded due to roll and pitch constraints.

Ideally, the magnetometer would be well-calibrated after applying the DPI method. In practice, however, the estimated yaw angle can have up to 10 degrees of error. The error is linked to the off-diagonal elements of the matrix that appears in the inverse error model. The off-diagonal elements are difficult to observe using the DPI method. That is, the off-diagonal elements vary from run to run. For this reason, the swinging method is needed as an additional calibration step.

The throttle command can be related to a magnetometer bias

An electrical current will generate a magnetic field according to the Biot-Savart law (see this HyperPhysics article). A large current close to the magnetometer will bias the sensor and alter the estimated yaw attitude. A common source of such a current is the current drawn by the electric powertrain of the aircraft. The current-induced bias can be canceled by subtracting an estimated bias from the magnetometer readings. The estimated bias is proportional to the current. The proportionality constants can be estimated from system identification tests.

Of course, the previous solution requires a current sensor on the powertrain. The current sensor may be avoided by recognizing that the current is related to the throttle command. Using physical models of the UAV's powertrain (see the figure above), we can show that the current-induced bias is proportional to the square of the throttle command [4]. The figure below plots the magnetometer bias versus throttle command. The variable h3 denotes the component of the current-induced magnetometer bias along the z axis of the sensor. Overlaying the data is the quadratic model predicted by the powertrain analysis.
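A minimal sketch of that correction, assuming bias samples h3 have been logged at known throttle settings (the numbers below are invented, not the data from the figure):

```python
import numpy as np

# Fit h3 ~= k * throttle^2 from bench data, then subtract the predicted bias in flight.
throttle = np.array([0.0, 0.2, 0.4, 0.6, 0.8, 1.0])        # normalized throttle command
h3_bias  = np.array([0.0, 0.8, 3.1, 7.2, 12.9, 19.8])      # invented z-axis bias samples

k, *_ = np.linalg.lstsq(throttle[:, None]**2, h3_bias, rcond=None)
k = float(k[0])
print(f"fitted quadratic coefficient k ~ {k:.1f}")

def corrected_mag_z(raw_z, throttle_cmd):
    """Remove the predicted current-induced bias from the raw z-axis reading."""
    return raw_z - k * throttle_cmd**2
```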

Conclusion

In conclusion, this article offers four insights on magnetometer calibration. First, we show that artificial errors impose special considerations on magnetometer calibration. Second, we caution the reader about autocalibration methods, such as the ellipsoid-fitting method. Third, we propose a calibration procedure that combines the DPI method and the swinging method. Finally, we propose a quadratic model for throttle-induced magnetometer biases.

magnetometer_calibration_procedure.pdf

References

[1] J. L. Crassidis, K.-L. Lai, and R. R. Harman, “Real-time attitude-independent three-axis magnetometer calibration,” J. Guid., Control Dyn., vol. 28, no. 1, pp. 115–120, 2005.

[2] D. Gebre-Egziabher, “Magnetometer autocalibration leveraging measurement locus constraints,” J. Aircr., vol. 44, no. 4, pp. 1361–1368, Jul. 2007.

[3] X. Li and Z. Li, “A new calibration method for tri-axial field sensors in strap-down navigation systems,” Meas. Sci. Technol., vol. 23, no. 10, pp. 105105-1–105105-6, 2012.

[4]  M. Silic and K. Mohseni, “Correcting current-induced magnetometer errors on UAVs: An online model-based approach,” IEEE Sens. J., vol. 20, no. 2, pp. 1067–1076, 2020.

Read more…

Hey guys,

 

I am looking for people interested in being part of the team of a drone startup recently approved in a national competition in Brazil.

We are looking for people with knowledge in the development of hardware and software for drones or knowledge in image processing.

We are in the prototyping phase of products and services. Knowledge of the Portuguese language is also necessary, as the entire mentoring process will be carried out in this language.

If you are interested, contact me by 04/04/21 here or by email at mesquita.geison@gmail.com.

Thanks!!!

 

 

Read more…

DJI TELLO becomes smart

This video demonstrates face tracking using a DJI Tello drone. Using the Python DJITELLOPY SDK and a MobileNet-SSD neural network, the drone is converted into a smart drone. TensorFlow Lite and Google's Coral USB accelerator enable real-time inference.
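For anyone wanting to reproduce the general idea, here is a stripped-down face-tracking loop. The video uses a MobileNet-SSD model accelerated by a Coral USB stick; this sketch substitutes OpenCV's built-in Haar cascade so it stays self-contained, and the gains and thresholds are guesses rather than values from the video:

```python
# Minimal Tello face-tracking sketch: detect the largest face in the video
# stream and yaw the drone to keep it centred in the frame.
import cv2
from djitellopy import Tello

tello = Tello()
tello.connect()
tello.streamon()
tello.takeoff()

face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

try:
    while True:
        frame = tello.get_frame_read().frame
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        faces = face_cascade.detectMultiScale(gray, 1.2, 5)
        yaw_speed = 0
        if len(faces) > 0:
            x, y, w, h = max(faces, key=lambda f: f[2] * f[3])   # largest face
            error = (x + w / 2) - frame.shape[1] / 2              # horizontal offset, pixels
            yaw_speed = int(max(-50, min(50, 0.2 * error)))       # simple proportional yaw
        tello.send_rc_control(0, 0, 0, yaw_speed)                 # rotate toward the face
        cv2.imshow("Tello", frame)
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break
finally:
    tello.land()
    tello.streamoff()
```

A fuller tracker would also regulate distance from the size of the detected bounding box.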

Read more…

Pioneering scientists are leading marine mammal research, using drones in the most epic ways.

There are countless growing threats to whales and dolphins in the oceans, ranging from pollution, ship strikes, and entanglement in fishing gear to climate change, among others. Many species remain critically endangered. Ocean Alliance is actively working on the daunting task of acquiring more and better data with the objective of protecting and saving these beautiful creatures. But collecting data on marine mammals is no easy task; it requires a permanent, consistent effort, the most resourceful brains, and the best tools. That is why Ocean Alliance developed their Drones for Whale Research (DFWR) program, in which they have been actively using drones in the most creative ways. Apart from using multirotor drones that fly a few meters above the whales, getting right into their misty exhaled blow to collect biological information (SnotBot), they are also using fixed-wing amphibious drones to capture a broad range of data on the whales and their habitat.

Dr. Iain Kerr, CEO of Ocean Alliance, recalls after using the Aeromapper Talon Amphibious for over 8 months:

“Ocean Alliance is a Conservation Science organization, in that we collect data so that we can best advise wildlife managers and policy makers on strategies to help conserve endangered marine species. Alas Oceanography has long been a rich man’s game, prior to our acquisition of an Aeromapper Talon Amphibious it has been a real challenge for us to collect the types of persistent, consistent offshore data on whales that we would like.

Our Aeromapper Talon Amphibious, has met all of the key touchstones that we look for in a new tool, affordable, field friendly, user friendly, scalable and easily modified or updated. I really think that you have made the right choice with regards to hand launch and water landing.  VTOL fixed wing drones (and I have tested a few) just seem to chew up too much battery time taking off and landing.  While our work to date has been primarily flying our Aeromapper Talon Amphibious on Stellwagen banks to find and count humpback whales, we have been approached by two aquaculture groups (principally mussels & oysters) and a field marsh conservation group to demonstrate the capacity of our Aeromapper Talon Amphibious for their use cases.

Ocean Alliance’s Drones For Whale Research program has been receiving international interest, our lead drone initiative SnotBot has been featured on BBC Blue Planet Live and twice on National Geographic, once with Will Smith in the series One Strange Rock.  Also, more than 18 groups worldwide now use protocols we have developed with SnotBot for whale research. We are excited to have added the Aeromapper Talon Amphibious to our drone stable and I am sure will have more exciting news to report on when we get back into the field.

I think that the Aeromapper Talon represents one of the best solutions out there for offshore research, monitoring and mapping. The hand launch water landing option makes the best use of battery and support resources. Those engaged in the new blue economy need the type of solution that Aeromao Inc., is offering, affordable, field friendly, scalable, robust and user friendly!”


Read more…

AMX UAV Officially Launches Vertic VTOL Drone

After three years of development and testing, AMX UAV has officially launched Vertic, a quad-tailsitter VTOL drone designed specifically for aerial mapping missions. Vertic has copter-like (helicopter and multicopter) capability and can take off from a confined area. After reaching a safe altitude, Vertic transitions to fixed-wing mode for more efficient flight, and transitions back to copter mode once the mission is finished, directly above the landing position. Vertic can be used for various applications such as agriculture, plantation management, urban planning, mining, forestry, and oil & gas.

Vertic can fly fully autonomously, from take-off to landing, with preconfigured mission parameters. Vertic is equipped with failsafe systems for safety, for example a quadchute: this feature automatically switches Vertic to copter mode if the flight controller detects an anomalous loss of altitude. Users with multicopter skills can control this drone manually with ease, since it handles much like a multicopter; this ease of operation minimizes time spent on training.

Vertic uses composite materials to ensure toughness and reliability. With its detachable fuselage-wing design, the 1.4-meter-wingspan drone is very compact and easy to deploy. Vertic is powered by a 14 Wh lithium-polymer battery. The drone can fly for up to 35 minutes or cover about 250 hectares in a single aerial mapping mission. The Vertic ground control station (GCS) software is supported on both Windows and Android and connects to the drone at ranges of up to 15 km. Two payload types can be integrated with Vertic: the Sony RX0 and the Parrot Sequoia+. The Sony RX0 is a 15.3-megapixel RGB camera with a Carl Zeiss fixed lens. For agriculture, forestry, and plantation missions, the Parrot Sequoia+ multispectral camera can be used for deeper analysis.

All these advantages come at a very competitive price. The Vertic standard package is offered at $8,500 and includes a ready-to-fly drone, a Sony RX0 camera, a GCS laptop, 3 batteries, 2 sets of spare propellers, tools, and a hard case. The standard package also includes a training program for 2 operators, held in Yogyakarta, Indonesia. Contact us by email at admin@amx-uav.com or phone +62-811292565 (WhatsApp/voice call/message) for more info about Vertic and other AMX UAV products.

Read more…

The threat from physical intrusion remains one of the top concerns in both commercial and non-commercial contexts. According to a report from Markets and Markets, the video surveillance market, which includes both hardware and software, is presently worth USD 45.5 billion and is expected to reach USD 74.6 billion by 2025.

Over the years, there have been many advancements in optics and detection systems but limitations still exist in the conventional ways of using them. To overcome these limitations, security stakeholders are now incorporating drone technology in their operations.

In this blog, we will talk about drones and the FlytNow solution for perimeter security.

What is perimeter security?


Perimeter security is an active barrier or fortification around a defined area to prevent all forms of intrusion. Modern security systems are an amalgamation of sophisticated hardware and software that generally include cameras, motion sensors, electric fencing, high-intensity lights, and a command center to manage them all. 

Challenges with conventional security systems (without drones) for perimeter security


Below are some of the drawbacks and limitations that are inherent in a conventional security system:

  • CCTV cameras and motion detectors are stationary, thus leaving plenty of room for blind spots.
  • Patrolling requires human guards; for larger areas, this is the least efficient way of securing the premises.
  • Response to an intrusion is delayed since a human responder has to reach the location.

Benefits of using drones for perimeter security


Drones have the following advantages over a conventional security system:

  • Drones are mobile flying machines that can go to any location quickly, with HD camera(s), thus eliminating blind spots.
  • Drones can also be equipped with a thermal camera(s) which are useful during nighttime surveillance.
  • Drones can be automated for patrolling using the FlytNow cloud-connected solution and commercially available DiaB (Drone in a Box) hardware.


Note: A DiaB is box-like hardware that houses one or more drones. The hardware keeps the drone flight-ready (24x7) and also automates the launching and docking processes of a drone.

Drone automation for security


For perimeter security, drones are generally used in conjunction with Drone-in-a-Box hardware and a fleet management system that powers the command center. Other security system hardware, including CCTV cameras, motion sensors, etc. can complement the drones and can be connected to the command center, thus integrating into a complete system. In a real-life scenario, such a system might work in the following way:

  • An intrusion is detected by one of the CCTV cameras in an area under surveillance. 
  • The command center receives the alert and initiates a drone launch. 
  • A connected DiaB receives the launch request and releases a drone. 
  • The drone flies to the location where the intrusion was detected and begins streaming a live video feed. 
  • An operator maneuvers the drone to cover all blind spots.
  • On finding the intruder, the operator has the option to warn him/her about the transgression using the drone’s onboard payload such as a beacon, spotlight, speaker, etc.


To know about the kind of drones and sensors that can be used for security and surveillance operations please refer to our Drone Surveillance System: The Complete Setup Guide.

How does FlytNow enable perimeter security?


FlytNow is a cloud-based application that helps in managing and controlling a fleet of drones from a unified dashboard through automation, live data streaming and integration. In the context of perimeter security, this translates into a command center that connects drones with the traditional components of a perimeter security system.

6 Reasons to use FlytNow for perimeter security


#1 Easy Setup: FlytNow is cloud-hosted, i.e., a user can access the application from any standard web browser, without any complicated server setup. Connecting the drones to the system is also easy and is done using FlytOS.

#2 Unified Dashboard: FlytNow features an advanced dashboard that shows the following:

  1. A live map showing the real-time location of all the drones. The map can be customized to show points of interest, virtual geofences, and CCTV zones.
  2. On-screen GUI controllers and keyboard & mouse support to control a drone. This allows an operator to easily maneuver a drone to a point of interest from the command center.
  3. Multicam support that allows streaming video feeds from more than one drone.
  4. Different view modes that allow an operator to switch between RGB and thermal mode. In thermal mode, there is the option to switch between different color palettes, allowing a user to identify warm objects against different backdrops.
  5. Pre-flight checklist which is a list of checks the system prompts an operator to perform before initiating a drone launch.


#3 Live Data Sharing: An operator can share the live video feed from a drone directly from the dashboard. The feature can be used to share video with the police or other remote stakeholders.

#4 Advanced Automation: Operating drones through manual control is quite an inefficient way to use drones. Instead, automation should be employed to perform activities like security patrols. FlytNow comes with an advanced mission planner that allows a user to define a path for a drone to follow and save it as a mission. The mission can be executed periodically, thus making a fleet of drones perform automated patrolling.

#5 Add-on Modules: FlytNow provides add-ons to make a drone intelligent; this includes precision landing over a computer-generated tag, obstacle detection, and object identification. These add-ons enable a drone to autonomously fly to a location, identify a threat, and return to the DiaB hardware.

#6 Drone-in-a-Box Hardware Support: The functions of DiaB hardware, in the context of perimeter security, can be broadly classified into four categories:

  1. Securely house a drone.
  2. Keeping the drone fully charged all the time.
  3. Initiate a drone launch.
  4. Successfully dock a returning drone.

Summary


In this blog, we discussed the concept of perimeter security, the limitations of a conventional security setup, and how these limitations can be overcome using drones. We then covered how drones are actually used for aerial patrols and 6 reasons why FlytNow is an ideal solution for automating drones for perimeter security.

There are plenty more reasons to use FlytNow for perimeter security that you can discover by signing up for our 7-day free trial.

Read more…

I am looking for a collaboration with a drone company to do a high-altitude flight experiment. We are starting a small company called Loweheiser; we are developing EFIs (electronic fuel injection systems) for small UAV engines. Right now we are developing a throttle body for the RCGF 15RE, and I also want to design one for the 10cc engine from the same brand. The 10cc is probably the smallest electronic-fuel-injection engine that has ever been made. At the moment the smallest engines we have worked on are the Saito FG-21 and the FG-14; the FG-14 has a 7 mm throttle body!


Many clients ask me how high the EFI works. I tell them that the EFI can operate at any altitude a fixed-wing UAV can reach. The absolute pressure sensor of the ECU measures from 15 to 115 kPa, and 15 kPa corresponds to roughly 13,600 m in the standard atmosphere, so altitude does not seem to be a problem. We have tested at 2,000 m because that is the maximum we can reach within a few hours of travel, but I would like to test it in flight at higher altitude.
From my initial calculations, a normal plane could reach 10,000 m without problems. A plane of about 2 m wingspan and about 3 kg weight consumes less than 100 W to maintain cruise.

The 15cc engine produces about 1.54 kW. The engine loses power linearly with the loss of atmospheric density. The density of the atmosphere at 25°C and 0 m (sea level) is 1.12720 kg/m³, and the density at -20°C and 10,000 m is 0.453337 kg/m³, which is about 40% of sea-level density. That leaves about 40% of the engine's sea-level power, roughly 600 W. Even allowing for another 50% of losses due to propeller performance and other inefficiencies, there is still about 300 W, which is more than enough to fly at 10,000 m.
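The same arithmetic in a few lines, reusing the post's own density figures (this simply restates the author's estimate, it is not a new measurement):

```python
# Back-of-the-envelope power budget at altitude, following the post's reasoning.
power_sea_level_w = 1540.0          # ~1.54 kW from the 15 cc engine
rho_sea_level = 1.12720             # kg/m^3 at 25 C, sea level (figure from the post)
rho_10000m    = 0.453337            # kg/m^3 at -20 C, 10,000 m (figure from the post)

power_at_altitude = power_sea_level_w * (rho_10000m / rho_sea_level)   # ~40% of sea level
power_at_prop     = power_at_altitude * 0.5                            # 50% prop/other losses
cruise_need_w     = 100.0                                              # stated cruise power

print(f"shaft power at 10 km: ~{power_at_altitude:.0f} W")
print(f"after losses: ~{power_at_prop:.0f} W vs ~{cruise_need_w:.0f} W needed to cruise")
```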


Where can this high-altitude challenge be done?
I am looking for a collaboration with a drone company that has experience flying in airspace where this test can be done and that can obtain permission from the relevant authorities.

Best regards, jlcortex

Read more…