Brandon Basso's Posts (13)

3D Robotics


We've been hard at work at 3DR on system architecture, looking into new ways of building adaptable UAV systems.

Solo in particular leverages a number of distributed systems--vehicle, controller, and app--that together enable industry-leading flight autonomy features such as Smart Shots. The complexity of UAV systems is only increasing, and handling messaging and data distribution between these systems in a reliable, high-performance way is a complicated challenge.

After evaluating a number of options, we have selected eProsima Fast RTPS to power system-level messaging and data distribution on our future platforms. Fast RTPS is an open-source implementation of the RTPS (Real Time Publish Subscribe) standard, which forms the transport layer of the DDS standard developed and maintained by the Object Management Group.
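
For readers new to the publish-subscribe model that RTPS standardizes, here is a toy Python sketch of the core idea. This is not the Fast RTPS API (which is C++); the `Topic`, `publish`, and `subscribe` names are purely illustrative. The point is the decoupling: data writers (e.g. the vehicle) and data readers (e.g. the controller or app) only share a named topic and never reference each other directly.

```python
# Toy illustration of the publish-subscribe pattern that RTPS standardizes.
# NOT the Fast RTPS API; it only shows the decoupling between data writers
# (e.g. the vehicle) and data readers (e.g. the controller and the app).
from collections import defaultdict
from typing import Callable, Dict, List

class Topic:
    """A named channel; subscribers never know who the publishers are."""
    _subscribers: Dict[str, List[Callable]] = defaultdict(list)

    def __init__(self, name: str):
        self.name = name

    def subscribe(self, callback: Callable) -> None:
        Topic._subscribers[self.name].append(callback)

    def publish(self, sample) -> None:
        for callback in Topic._subscribers[self.name]:
            callback(sample)

# The vehicle publishes attitude samples; the controller and app subscribe.
attitude = Topic("vehicle/attitude")
attitude.subscribe(lambda s: print("controller received", s))
attitude.subscribe(lambda s: print("app received", s))
attitude.publish({"roll": 0.01, "pitch": -0.02, "yaw": 1.57})
```

A real RTPS/DDS implementation adds typed data, discovery, and quality-of-service policies (reliability, history, deadlines) on top of this basic pattern.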

We selected eProsima Fast RTPS over other available implementations for a number of reasons. Fast RTPS is feature complete, providing support for many of the advanced features available in RTPS that we’re excited about. Given our history of contributing to and supporting open source projects, Fast RTPS being open source was another compelling factor for us. Finally, Fast RTPS is also more approachable than other options we evaluated, increasing our confidence that we could make any necessary modifications and contribute fixes back upstream.

In connection with 3DR’s adoption of Fast RTPS, eProsima intends to join the Dronecode Foundation to encourage further adoption of Fast RTPS. We’re also happy to announce that upcoming licensing changes will enable Fast RTPS to be distributed on mobile platforms. Although Fast RTPS is currently licensed under the LGPLv3, eProsima plans to additionally offer it under the MPLv2, a license developed by the Mozilla Foundation that retains many attributes of the LGPLv3 while allowing users to embed Fast RTPS in mobile applications.

3DR is not alone in the decision to use RTPS for data distribution on complex robotics platforms. The Open Source Robotics Foundation, developer of the ROS and ROS 2 robot operating systems, has also decided to use RTPS to power future systems. To further facilitate adoption and use of Fast RTPS, eProsima offers commercial support and development options for companies seeking to use Fast RTPS in their platforms.

Looking for more information? Fast RTPS is available on GitHub, and the latest binaries can be downloaded from the eProsima website. Discussions about Fast RTPS (and RTPS generally) have already started in the Dronecode forums.

More about eProsima...

eProsima, The Middleware Experts, is a company focused on high-performance networking middleware. eProsima helps you develop your distributed systems by recommending the right middleware products and supporting you through every stage of development.

Read more…
3D Robotics

Solo: Open for development


3DR is proud to announce that today we’re opening the source code for the Solo flight controller. Solo is a consumer product, aimed at simplifying the process of getting great aerial footage. As a development story, true to 3DR’s open spirit, Solo was developed not just by us at 3DR, but in concert with the ardupilot core development team and contributors around the world. Solo’s flight characteristics, performance and robustness are a testament to the power and adaptability of APM:Copter, which is why we’re so pleased today to officially open that source code to the world at large: The repository with full history is now available for download on GitHub, the first of many development doors that we’ll be opening on Solo.

In addition to opening the Solo source, we are today publishing the set of guiding principles that went into developing all of Solo’s subsystems--3DR’s Open Source Policy. The policy, developed by the 3DR software engineering team in conjunction with the community, addresses key questions: How do we do development? What do we choose to open? Why do we do it in the first place? For 3DR, the policy represents a set of guiding principles for how we plan on doing development, and we feel it is important to, in turn, open it to the broader community for clarity, accountability and, well, because we’re excited to share it!

Solo: Open for development


We placed open source hardware and software at the core of Solo and view it as essential to our development and platform strategy. But it doesn't stop with the source! We strive to put powerful tools in the hands of developers. In some cases this means source code, and we plan on opening other parts of Solo, including much of SoloLink. In all cases it means making Solo the number one development platform, with great APIs and developer tools. We plan on deepening DroneKit integration into Solo so that developers can easily add new shots in Python and apps on Android, with new language bindings for other platforms coming soon.
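
As a taste of what that DroneKit integration enables, here is a minimal, hedged DroneKit-Python sketch of the kind of script a shot developer might write against a vehicle. The connection string and coordinates are placeholders, and this is illustrative rather than Solo's shipping shot code.

```python
# Illustrative DroneKit-Python sketch (not Solo's shipping shot code).
# The connection string is an assumption; it depends on how you reach the
# vehicle (SoloLink network, SITL, telemetry radio, ...).
from dronekit import connect, VehicleMode, LocationGlobalRelative

vehicle = connect('udpin:0.0.0.0:14550', wait_ready=True)

# Read live telemetry the same way a shot would
print("Attitude:", vehicle.attitude)
print("Position:", vehicle.location.global_relative_frame)
print("Battery:", vehicle.battery)

# Command a guided-mode move to a nearby waypoint at 20 m relative altitude
vehicle.mode = VehicleMode("GUIDED")
target = LocationGlobalRelative(37.873, -122.254, 20)  # example coordinates
vehicle.simple_goto(target, groundspeed=5)

vehicle.close()
```

The same telemetry attributes and commands are what higher-level shot logic builds on, whether it runs on a companion computer or in a mobile app through the Android bindings.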


The word ‘platform’ is getting quite a bit of traction these days. However, we believe it’s not enough to call something a platform ahead of building it. A platform isn’t an over-engineered airframe, coincidentally exposed cables or slides in a presentation; it’s a bolt pattern, a pinout, an API and source code that exist in the real world with engineers to support them. So rather than pay lip service to the term, we chose to place the world’s leading autopilot development platform--APM:Copter--at the heart of Solo, running on the new Pixhawk 2. Connected to the flight controller we have SoloLink, which brings the power and flexibility of Linux computing onboard a consumer vehicle for the first time and appeals to an even broader set of developers. We also added an accessory port, 3DRBUS, on the bottom so that Kickstarter projects, academics and businesses can easily integrate their own hardware. We developed DroneKit for use on Solo so that web and mobile developers can build businesses on top of this hardware and software. Most importantly, we know a great platform gets better with time, and we plan on supporting Solo for years to come.

This is the Solo developer road map. Let us know what you think. Help us build Solo into the premier drone platform, and the best home for developers around the world.

 

Read more…
3D Robotics

DroidPlanner 2.0 Beta Announced


The DroidPlanner team, led by Arthur Benemann, is pleased to announce the next version of its Android-based UI. DroidPlanner 2.0 offers a completely redesigned interface for controlling 3DR multirotors over Mavlink. The UI/UX has been rethought with an emphasis on simplified workflows, glanceable information, and reliability for the most common user actions.

DroidPlanner 2.0 centers around two screens: Telemetry and Planning. The telemetry screen allows users to quickly assess the state of the aircraft and perform quick actions, like loiter and land. The planning screen allows users to create missions on the fly, as well as edit them easily in the field. Other features, particularly those different from the original DroidPlanner, are listed below. The DroidPlanner 2.0 UI was specifically targeted at multirotors, but full plane support and all the features of the original DroidPlanner are coming soon!

Features:

  • Completely redesigned graphical user interface
  • Specifically designed for 3DR multirotors and Iris
  • New telemetry screen showing quick glanceable info: HUD, battery, RSSI, distance
  • Easy to use Home, Land, and Loiter buttons
  • New guided mode with changeable altitude
  • Quick mode switching
  • New planning screen for quick mission generation
  • Easy and powerful mission editing tools
  • Basic radio TX setup
  • Preflight checklist

Coming soon:

  • Survey tools in the mission planning screen
  • FollowMe 2.0 with advanced control options
  • New flight tuning screens
  • Improved radio configuration
Here are the instructions to download:
3) Download the app from the Play Store (this may take 30 mins or so to go live for you after you become a beta tester)

The app is still a beta release; bugs can be reported on GitHub.

For additional support check out the DroidPlanner forum.

If you like the app consider making a donation to support the development.

Read more…
3D Robotics

Cameron Rose of the Biomimetic Millisystems Lab at the University of California, Berkeley, with his flying H2Bird robot. Photo © KIKE CALVO

Cameron Rose quite eloquently put his thoughts and hopes for the future of UAVs into a recent National Geographic article. Cameron is a graduate student in EECS at UC Berkeley in Ron Fearing's group, which is famous for, among other things, foundational work on flying insect robots through the MFI project, rapid prototyping of millirobots, and the recent spin-off company, Dash Robotics.

“My dream is to be able to contribute one day to advancing the field of robotics enough to achieve something even close to the level of maneuverability and control that animals possess in nature,” said Cameron Rose, a graduate student in the Department of Electrical Engineering and Computer Science at the University of California, Berkeley. “There is so much to be learned from behaviors and control surfaces of animals that can be applied to robotics.”

Rose, 25, graduated from the University of Maryland, College Park in 2010 with a Bachelor of Science in Computer Engineering. Today, he is a member of Prof. Ron Fearing’s Biomimetic Millisystems Laboratory, and his research focuses on modeling and control of flapping-winged robots in flight away from equilibrium. “I also dream to use my knowledge and passion for robotics to encourage other African American students to pursue similar paths. There are so many middle and high school students that don’t have the same opportunities that I had growing up. If I can do anything to fix that, be it through simply speaking at schools or showing my robots, I’d like to help. I’d like to see the number of African Americans pursuing undergraduate degrees and PhDs in engineering increase.”

“I was always fascinated with planes and flight growing up,” said Rose. “I liked to play with everything from paper airplanes, to building models, to launching model rockets. I have many memories of my dad out in the field behind my house launching all sorts of flying things into the air.”

Rose is inspired by his grandfather in the way that he approaches his life. “He always puts his family and God first,” said Rose. “The level of passion and joy he has for photography is remarkable in the face of some of the hardship that is in his life. I hope to achieve that level of passion and happiness from my work in spite of the tough times that may arise.”

“When I was in high school, I was in the Civil Air Patrol,” said Rose. “I had the opportunity to attend Solo Flight School, where I could get the classroom training and flight time to work towards my solo flight and eventually my pilot’s license. I’ll never forget the first time I was able to take the plane up in the cockpit by myself. It was one of the most exhilarating and honestly terrifying moments of my life. Fortunately, there were no problems, and I successfully completed the flight by myself. I’ll never forget the feeling of accomplishment, freedom, and relief, when I successfully landed the plane at the end of the flight.”

Regarding FAA regulations, Rose thinks there should be licenses for operating civilian drones in public settings, to ensure people know how to operate them safely around others and avoid injury.

Rose's Master’s thesis, “Flight Simulation of an Ornithopter,” has been published by the University of California. His paper “Cooperative Control and Modeling for Narrow Passage Traversal with an Ornithopter and Ground Station” was presented at the 2013 Autonomous Agents and Multiagent Systems (AAMAS) conference.

“The H2Bird is a 13 gram flapping-winged robot that includes custom-designed electronics for sensing and control,” explained Rose. “Flapping flight provides high maneuverability necessary for navigating in indoor environments on a small mass scale. Building fliers on such a small scale is made possible by advances in the size scale of electronics and sensor packages, and through study of flapping aerodynamics. Additionally, minimizing the power consumption of the motors and electronics has enabled flight times up to 5 minutes. The goals of the current project are to develop optimal control policies for single or multiple vehicles to achieve sensing and navigation among un-modeled obstacles such as doors and walls. One of the possible applications for the project in the future is within the realm of disaster relief. If a group of sensor-equipped crawlers and fliers can be released into something like a collapsed building, they can help locate survivors or areas that could be dangerous for humans.”

5 good things people should know about drones:

  • They’re good for more than just military applications.
  • Occurrences in nature that could not ordinarily be photographed or recorded by humans can be captured with the aid of drones.
  • Drones can be used in emergency situations in conditions that would be dangerous to humans.
  • Drones can be used to transport materials on repetitive journeys that would be a waste of human resources.
  • Drones can be used to create art.

5 things people probably are not aware of when it comes to drones:

  • Not all drones are quadrotors.
  • The large majority of them are not armed.
  • Civilian drone use is growing more rapidly than military drone use.
  • More information on your life can be gained simply by using your cell phone than a drone can provide.
  • There are a wide variety of non-military applications for drones that are beneficial.
Read more…
3D Robotics


Dr. Charlie Rush and Montana State graduate student Ian Johnson have been using the 3DR Y6 for some foundational agricultural disease detection and prevention research. Their project aims to use aerial imagery to characterize the spread of wheat crop diseases. These pest-caused viruses can lead to poor water uptake, and ultimately wasted water and wasted money. Here's the full story from Agrilife:

Writer: Kay Ledbetter, 806-677-5608, skledbetter@ag.tamu.edu
Contact: Dr. Charlie Rush, 806-354-5804, crush@ag.tamu.edu
AMARILLO – Dr. Charlie Rush hopes to use a unique method – helicopter drone – to track disease progression across wheat fields to eventually help producers make better irrigation decisions.
A helicopter drone is being used by Dr. Charlie Rush, Texas A&M AgriLife plant pathologist in Amarillo, to track disease progression across wheat fields. (Texas A&M AgriLife Research photo by Kay Ledbetter)
Rush, a Texas A&M AgriLife Research plant pathologist in Amarillo, has enlisted the help of Ian Johnson, a Montana State University-Bozeman graduate student who is using his work in the university’s Science and Natural History Filmmaking Program to help scientists conduct research.
Approximately 1.1 million acres of wheat in the High Plains are irrigated, Rush said, making wheat the second-largest user of irrigation water from the Ogallala Aquifer. In this same region, mite-vectored virus diseases are the predominant pathogenic constraint to sustainable wheat production each year.
The viruses causing these diseases are transmitted by the wheat curl mite, he said. Infected wheat plants not only have reduced grain and forage yields, but also greatly reduced root weight and water-use efficiency. Therefore, fertilizer and groundwater applied as irrigation to diseased wheat is largely wasted.
Rush’s team is using the helicopter to take remote images of a field study where they are trying to develop an economic threshold for irrigation of wheat infected with wheat streak and other mite-vectored diseases.
A helicopter drone used by Dr. Charlie Rush, Texas A&M AgriLife plant pathologist in Amarillo, flies over a wheat field to track disease progression. (Texas A&M AgriLife Research photo by Kay Ledbetter)
“The problem for farmers is that these diseases develop in gradients over time and they don’t know whether or not they should apply new pesticides or fertilizers or water,” he said. “Most of these practices are done in April, and that is when the disease is just starting to show up. They may know they have disease in the field, but they really don’t know how much damage it might cause.
“So what we are trying to do is be able to go in early in the season and look at the disease development at a particular time and then based on what it looks like, say in early April, be able to give them a prediction of what the crop will be at harvest time.”
To do that, Rush said his team has been going into the field using different types of remote imaging, such as the hand-held hyperspectral radiometer, to measure and quantify the severity of disease development in the field.
A wedge image can be made after the helicopter has made about six passes over the field. The images it captures were stitched together by Ian Johnson, a Montana State University-Bozeman graduate student, for a complete picture.
“Now the application and use of this helicopter drone is one more way of measuring the disease development,” he said. “The beautiful thing about this is instead of having to deal with handheld devices, you can come in and fly the entire field in a matter of five minutes and get a very, very high resolution. So we are excited about the possibilities this may provide for our project.”
Johnson said he will use the drone to make four or five flights during the growing season and then generate results that will help Rush. This Y6 helicopter, named for its Y shape and six propellers, is made by 3D Robotics and includes an autopilot.
“That allows us to preprogram flights and fly a grid. As it is flying the grid, it takes top-down photos,” Johnson said. “Once we collect the photos, we stitch them together and build a giant photo mosaic of each field we flew over. We are hoping to provide incremental photo coverage of the fields as the disease progresses through the fields.”
Johnson said using this technology for agricultural research provides a whole list of improvements to the previous services available. Many researchers have used satellite imagery before, but this provides resolution 100 to 1,000 times greater than the LandSat satellite imagery Rush and his team used in the past.
“That’s a huge improvement when you are looking at ground-coverage and trying to pick out diseased plants,” he said. “Another improvement is temporal resolution. We can fly this every day for a couple of weeks or every week for a whole season, whereas with satellite you have to wait two to three weeks for a pass and that is if you can get your slot. So this is a great improvement.”
Another thing, Johnson said, is the drone is currently using visible spectrum only – still photographs – because the project is focused on the yellow band of light, which easily captures the typical symptoms of wheat streak mosaic.
“But this is a modular system,” he said. “We could put near infrared, thermal, any array of multi-spectral sensors on here to capture whatever data it is the project demands.”
Rush said one of the things he is most excited about with this new technology is that although aerial images have been taken before and unmanned aircraft have been used to measure things, “they have never been used to our knowledge to manage irrigation applications, especially in diseased crops.
“This is something that is totally new,” he said. “Obviously, in the Texas Panhandle where water is such a precious resource, anything that we can do to reduce waste or farmers putting on irrigation water when it is not going to pay off for them is going to be a positive thing.”
Rush said he is confident that with the studies currently underway, this new technology will allow them to very quickly provide growers with the information they need to better manage their irrigation.

Read more…
3D Robotics

Oil spill research from the sea and air

The University of Miami is studying oil spills with drifters, UAVs, and a kite! From The Atlantic:

More than three-and-a-half years after the Deepwater Horizon disaster spewed millions of gallons of petroleum into the Gulf of Mexico, scientists are launching drones and ocean-going sensor arrays off the Florida coast in an effort to map the path of future oil spills before they devastate beaches and coastal ecosystems.

Researchers from the University of Miami and other scientists are placing 200 GPS-equipped “drifters” in the surf zone just off Fort Walton to map where the ocean currents take the devices. Sensors placed on the ocean surface and seabed will track the movement of colored dye that will be released during the three-week experiment that began today. Two drones outfitted with GoPro cameras will also monitor where the currents take the drifters and dye. Since the drones can only stay aloft for an hour at a time, a camera-carrying kite will also be deployed.

All the data collected will be used to construct a computer model of near-shore ocean currents to predict how future oil spills or other pollutants will disperse as they approach the shore.

“Computer models will be able to give us better estimates of where the oil spill will go, and how fast and in which patterns it will spread,” Tamay Özgökmen, a University of Miami professor and the director of the Consortium for Advanced Research on Transport of Hydrocarbons in the Environment, told The Atlantic in an email. “This can help emergency responders to better direct their limited resources. In the longer term, models are also helpful to make sense of any ecological damage that may have occurred in the environment.”

For instance, that model can also predict where currents will carry shrimp larvae – crucial information given the importance of fishing to the Gulf Coast economy.

The Surfzone Coastal Oil Pathways Experiment is part of a larger $500 million effort funded in part by oil giant BP in the wake of the Deepwater Horizon catastrophe. Depending on the strength of the currents, the drifters and drones will be deployed over an area that could stretch from hundreds of square yards to many square miles, according to Özgökmen.

And a link to the official PR from the UM Rosenstiel School of Marine & Atmospheric Science:

MIAMI – (Dec. 2, 2013) – A University of Miami (UM) Rosenstiel School of Marine and Atmospheric Science-led study to understand the path of oil or other pollutants in coastal areas begins offshore of Ft. Walton Beach, Florida today. During the three-week SCOPE Experiment – Surfzone Coastal Oil Pathways Experiment – scientists will deploy GPS-equipped drifters and other advanced instruments to study the ocean currents along the coast to better understand how oil may move onshore in the event of a future spill.

“In the aftermath of the Deepwater Horizon oil spill it became clear that understanding the ocean currents in the surfzone is vital to improve our understanding and prediction of oil spills,” said Dr. Tamay Özgökmen, UM Professor and Director of the Consortium for Advanced Research on Transport of Hydrocarbons in the Environment (CARTHE). “There are catastrophic socio-economic impacts when oil spills reach our beaches.”

UM Professor Ad Reniers and his colleague Professor Jamie MacMahan from the Naval Postgraduate School in Monterey, Calif., will deploy a variety of instruments, including 200 GPS-equipped drifters, unmanned aerial vehicles and pressure and dye sensors at the surface and at varying depths, to measure the movement of ocean currents along the coast to study how oil, fish larvae, or toxins in the water are carried by currents close to shore.

“This study will collect important data necessary to understand the ocean currents in the near-shore marine environment,” said Reniers, associate professor of applied marine physics at the UM Rosenstiel School and lead investigator of the SCOPE Experiment. “The information collected will be used to develop computer models of the coastal zone, to improve our scientific understanding of this region in the event of a future oil spill, as well as to better understand how larvae or water pollutants travel close to shore.”

The unmanned aerial vehicles will be equipped with cameras to monitor the drifters and used in a dye experiment, where EPA-approved colored dye is placed in the near-shore waters to collect visual data on the movement of currents. Several of the drifters being deployed during the experiment were designed by students from three Florida high schools, MAST Academy, South Broward High School and Maclay School, as part of a CARTHE-sponsored educational outreach program. This research is made possible by a grant from the Gulf of Mexico Research Initiative (GoMRI). The GoMRI is a 10-year, $500 million independent research program established by an agreement between BP and the Gulf of Mexico Alliance to study the effects of the Deepwater Horizon incident and the potential associated impact of this and similar incidents on the environment and public health. For more information, visit http://gulfresearchinitiative.org

SCOPE is the second large experiment conducted by CARTHE that brings together a wide range of scientific experts and experimental measurement methods to study oil spills. The first experiment, called GLAD (Grand Lagrangian Deployment), was conducted near the Deepwater Horizon site in the summer of 2012, also under the support of GoMRI. Information collected by scientists from both experiments will be used to model the transport and fate of oil in the Gulf of Mexico in the event of a future spill.

The SCOPE Experiment is a project of the UM-based CARTHE. The CARTHE program includes 26 principal investigators from 12 research institutions in eight states. Together these scientists are engaged in novel research through the development of a suite of integrated models and state-of-the-art computations that bridge the scale gap between existing models and natural processes. For more information about CARTHE, please visit www.carthe.org or on Facebook at www.Facebook.com/carthe.gomri

About the University of Miami’s Rosenstiel School

The University of Miami is the largest private research institution in the southeastern United States. The University’s mission is to provide quality education, attract and retain outstanding students, support the faculty and their research, and build an endowment for University initiatives. Founded in the 1940’s, the Rosenstiel School of Marine & Atmospheric Science has grown into one of the world’s premier marine and atmospheric research institutions. Offering dynamic interdisciplinary academics, the Rosenstiel School is dedicated to helping communities to better understand the planet, participating in the establishment of environmental policies, and aiding in the improvement of society and quality of life. For more information, visit: www.rsmas.miami.edu

Read more…
3D Robotics


Both Mission Planner and DroidPlanner have recently added features for simplifying aerial survey. These tools help take the guesswork out of planning missions for camera-equipped aircraft that take pictures to be stitched into a mosaic. Rather than manually specifying a grid flight pattern, you can instead specify your camera, desired operational altitude, and triggering method. The survey tool takes care of the rest, and lets you see the expected coverage as well as estimates of ground resolution, required number of pictures, and other statistics.
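
To make those estimates concrete, here is a small Python sketch of the geometry behind them: ground sample distance, per-photo footprint, and the distance-based trigger spacing, all derived from camera parameters, altitude, and overlap. The camera values below are illustrative assumptions, and the planners' actual code may differ in details.

```python
# Rough survey-planning geometry: ground sample distance (GSD), image
# footprint, and distance-based trigger spacing from camera parameters.
# Example camera values only; the planners' internal math may differ.

def survey_numbers(alt_m, focal_mm, sensor_w_mm, sensor_h_mm,
                   img_w_px, forward_overlap, side_overlap):
    # Ground footprint of one nadir (straight-down) photo
    footprint_w = alt_m * sensor_w_mm / focal_mm   # across track, metres
    footprint_h = alt_m * sensor_h_mm / focal_mm   # along track, metres
    gsd_cm = 100.0 * footprint_w / img_w_px        # cm per pixel
    # Spacing that achieves the requested overlaps
    trigger_dist = footprint_h * (1.0 - forward_overlap)  # e.g. CAM_TRIGG_DIST
    lane_spacing = footprint_w * (1.0 - side_overlap)
    return gsd_cm, footprint_w, footprint_h, trigger_dist, lane_spacing

# 12 MP compact camera flown at 100 m with 70%/60% overlaps (illustrative)
gsd, fw, fh, trig, lane = survey_numbers(
    alt_m=100, focal_mm=5.2, sensor_w_mm=7.44, sensor_h_mm=5.58,
    img_w_px=4000, forward_overlap=0.7, side_overlap=0.6)
print(f"GSD ~{gsd:.1f} cm/px, footprint {fw:.0f} x {fh:.0f} m")
print(f"trigger every {trig:.0f} m, lanes {lane:.0f} m apart")
```

Run with these example numbers, the sketch predicts a ground resolution of a few centimetres per pixel and tells you how far apart photos and flight lanes need to be, which is exactly the kind of output the survey screens display.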


Survey is completely airframe independent and works for both copter and plane. By framing the mission planning problem in terms of aircraft operation and desired result, rather than the means to that result, planning is vastly simplified, repeatable, and easy to do in the field. 

Fly the camera, not the plane!

The current version of DroidPlanner now supports the following features in Survey:

  • Simple user interface
  • Multiple supported cameras
  • Real-time visualization of the flight plan
  • Projected camera footprint visualization
  • Calculated statistics, such as coverage area and mission length

 

The current version of Mission Planner now supports the following features in Survey:

  • Both simple and advanced user interfaces
  • Multiple supported cameras and custom camera definition
  • Automatic camera parameter identification by photo upload
  • Real-time visualization of the flight plan
  • Projected camera footprint visualization
  • Calculated statistics, such as coverage area and mission length
  • Trigger type selection and auto population of the distance-based trigger, CAM_TRIGG_DIST

 

This is the first of a series of posts explaining the end-to-end process of aerial survey, from planning to stitching.  The wiki will be updated with tutorials on 1) mission planning, 2) aircraft setup and operation, 3) camera control and 4) post-processing. And new features are always being added, so stay tuned!

 


 

Read more…
3D Robotics


Our all-star interns have been brewing up some cool devices this summer at 3DR. Three words: Oculus Rift FPV.  

We have the Oculus communicating through the Mission Planner as a joystick input, driving the pan/tilt servo gimbal on our Skyhunter fixed-wing aircraft. Video is piped to the ground via standard 5.8GHz TX/RX gear, with the 3DR OSD kit in line. The end result is one for which you should stay firmly seated; even without the video stretched all the way to the periphery, the experience is incredibly immersive.
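
Their integration goes through Mission Planner's joystick input, but the underlying idea is easy to sketch: read the headset's yaw and pitch, map them to servo pulse widths, and push them to the gimbal channels. The pymavlink sketch below shows that mapping under stated assumptions: the connection string, the choice of channels 6 and 7 for pan/tilt, and the placeholder head-pose reader are not the interns' implementation.

```python
# Minimal sketch: map head orientation to pan/tilt gimbal channels over MAVLink.
# Connection string, channel numbers, and get_head_pose() are placeholders.
import time
from pymavlink import mavutil

def angle_to_pwm(angle_deg, min_deg=-90.0, max_deg=90.0):
    """Linearly map an angle to a 1000-2000 us servo pulse width."""
    angle = max(min_deg, min(max_deg, angle_deg))
    return int(1000 + 1000 * (angle - min_deg) / (max_deg - min_deg))

def get_head_pose():
    """Placeholder for the headset driver; returns (yaw_deg, pitch_deg)."""
    return 0.0, 0.0

master = mavutil.mavlink_connection('udpin:0.0.0.0:14550')  # assumed link
master.wait_heartbeat()

while True:
    yaw_deg, pitch_deg = get_head_pose()
    pan_pwm = angle_to_pwm(yaw_deg)
    tilt_pwm = angle_to_pwm(pitch_deg)
    # Override only channels 6 and 7 (assumed pan/tilt); a value of 0
    # releases that channel back to the RC radio
    master.mav.rc_channels_override_send(
        master.target_system, master.target_component,
        0, 0, 0, 0, 0, pan_pwm, tilt_pwm, 0)
    time.sleep(0.05)  # ~20 Hz update
```

In the actual setup this mapping happens inside Mission Planner's joystick handling rather than a standalone script, but the end-to-end path is the same: head pose in, gimbal servo commands out.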

Our current setup is based on monocular video, but stay tuned for more from Project Warg, including stereoscopic video, new airframes, and closed-loop aircraft control with the Oculus!

Read more…
3D Robotics

Dr. Charlie Rush and Montana State graduate student Ian Johnson have been using the 3DR Y6 for some foundational agricultural disease detection and prevention research. Their project aims to use aerial imagery to detect wheat crop diseases. These pest-caused wheat viruses can lead to poor water uptake, and ultimately wasted water and wasted money. Here's the full story from AgriLife:

Writer: Kay Ledbetter, 806-677-5608, skledbetter@ag.tamu.edu
Contact: Dr. Charlie Rush, 806-354-5804, crush@ag.tamu.edu

AMARILLO – Dr. Charlie Rush hopes to use a unique method – helicopter drone – to track disease progression across wheat fields to eventually help producers make better irrigation decisions.

A helicopter drone is being used by Dr. Charlie Rush, Texas A&M AgriLife plant pathologist in Amarillo, to track disease progression across wheat fields. (Texas A&M AgriLife Research photo by Kay Ledbetter)


Rush, a Texas A&M AgriLife Research plant pathologist in Amarillo, has enlisted the help of Ian Johnson, a Montana State University-Bozeman graduate student who is using his work in the university’s Science and Natural History Filmmaking Program to help scientists conduct research.

Approximately 1.1 million acres of wheat in the High Plains are irrigated, Rush said, making wheat the second-largest user of irrigation water from the Ogallala Aquifer. In this same region, mite-vectored virus diseases are the predominant pathogenic constraint to sustainable wheat production each year.

The viruses causing these diseases are transmitted by the wheat curl mite, he said. Infected wheat plants not only have reduced grain and forage yields, but also greatly reduced root weight and water-use efficiency. Therefore, fertilizer and groundwater applied as irrigation to diseased wheat is largely wasted.

Rush’s team is using the helicopter to take remote images of a field study where they are trying to develop an economic threshold for irrigation of wheat infected with wheat streak and other mite-vectored diseases.

A helicopter drone used by Dr. Charlie Rush, Texas A&M AgriLife plant pathologist in Amarillo, flies over a wheat field to track disease progression. (Texas A&M AgriLife Research photo by Kay Ledbetter)


“The problem for farmers is that these diseases develop in gradients over time and they don’t know whether or not they should apply new pesticides or fertilizers or water,” he said. “Most of these practices are done in April, and that is when the disease is just starting to show up. They may know they have disease in the field, but they really don’t know how much damage it might cause.

“So what we are trying to do is be able to go in early in the season and look at the disease development at a particular time and then based on what it looks like, say in early April, be able to give them a prediction of what the crop will be at harvest time.”

To do that, Rush said his team has been going into the field using different types of remote imaging, such as the hand-held hyperspectral radiometer, to measure and quantify the severity of disease development in the field.

Read more…
3D Robotics

Agricultural research at Montana State

Dr. Charlie Rush and Montana State graduate student Ian Johnson have been using the 3DR Y6 for some foundational disease detection and prevention research. Their project aims to use aerial imagery to detect wheat crop diseases. These pest-caused diseases can lead to poor water uptake by the diseased wheat, and ultimately wasted water and wasted money. Here's the full story from AgriLife:

Writer: Kay Ledbetter, 806-677-5608, skledbetter@ag.tamu.edu
Contact: Dr. Charlie Rush, 806-354-5804, crush@ag.tamu.edu

AM

Read more…