
DIY Low Profile Gimbal Mechanics

I like flying my DJI F550 hexacopter to shoot aerial video of various scenery, using a Contour HD 1080P camera mounted to the F550 to capture the video. I have been through countless iterations of camera mounts, and my current contraption is a "Food Storage Bowl" with some memory foam inside to reduce the vibration transmitted from the airframe to the camera. I can't even remember how many times I've reshaped the foam inside the bowl, nor how many different foam types I tried, until by chance I got a piece of this memory foam.

I get a lot of teasing from friends who also fly about the food storage bowl..."you got a sandwich in there?"

Now that I've gotten most of the jello out of the video, the next step is to stabilize the camera so that wind buffeting the airframe won't affect the recording.  I need a gimbal.

One of the nice things about having a food storage container below the airframe is that it makes for excellent landing gear.  It's also darned good impact protection for the camera, and yes, I learned that one the hard way, but that's another story.

So my gimbal needs to fit inside the confines of my food storage container.

Most of the gimbals I've seen are very tall and hang low below the airframe.  I wanted to do something different.

I found some really great mechanical parts at ServoCity, specifically their "Servo Blocks".  I also wanted some fast, powerful servos, so I got a pair of Futaba BLS156HV servos.  This is what I've got so far:


This setup is very low profile and also eliminates a bunch of mechanics, because the servos' output shafts directly drive the shafts of the ServoCity servo blocks.

This setup is strictly Roll and Tilt, no Pan, because I just want to eliminate airframe motion.
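For a feel of what the flight controller has to do with these two axes, here is a minimal sketch of servo-based roll/tilt compensation. The gains and PWM values are hypothetical, not tuned numbers from this build:

    # Sketch of roll/tilt compensation with servos; illustrative values only.
    CENTER_US = 1500      # servo neutral pulse width, microseconds
    US_PER_DEG = 10.0     # pulse change per degree (depends on the servo)

    def servo_pulse(axis_angle_deg):
        """Drive the servo opposite the airframe's tilt to hold the camera level."""
        pulse = CENTER_US - axis_angle_deg * US_PER_DEG
        return max(1000, min(2000, pulse))  # clamp to the servo's safe range

    # e.g. the airframe rolls 5 degrees right, so the roll servo counters it:
    print(servo_pulse(5.0))  # 1450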

I also needed the "Roll" servo to be concentric with the camera lens centerline.  A fellow local flyer with a CNC machine was very generous with his time and even found the piece of aluminum channel at a local boat builder.  Thank you, John!

The Contour HD camera has a mount bracket with a standard 1/4-20 camera thread, and the fabricated "L" bracket has a hole positioned so the camera sits as far back in the assembly as possible.

Next up is the mounting of this gimbal to the airframe.  This will be done with some Servo City spacers that fit to the four holes seen in the lower gimbal photo.

Wait until you see the new food container!

Read more…

Thirty Days (#19): Office IR

Moving out of the near IR from the last post, we go into the far infrared. This is footage shot with a 40g, uncooled, 8-14um spectral band camera. Half the size of a GoPro and perfect for mounting on a small flying machine!


Read more…

Dear Friends,

VR Lab is happy to present our latest updates on the project. Over the last week we made some great upgrades:

The entry-level mechanical GoPro carbon fiber gimbal is ready, at only 158 g with motors. In the video you can see the first test; it's very simple to balance the camera to obtain better results.

The VR Gimbal is ready for developers: we have put the open-source repository online, and in the next few days it will be updated with the latest workspace, libraries, and code. The code is compatible with the VR Universal IDE.

https://code.google.com/p/vrgimbal/

Today we finished implementing the first version of the code that manages the motors on pitch, roll, and yaw. In the first tests we are using only the P term, not yet I and D, to improve the correction, but the results are already very good. In the next few days we will put a new video online. :)
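To illustrate the P-only control described above, here is a minimal sketch of one axis. This is not the actual VR Gimbal firmware; the gain and names are made up for illustration:

    # P-only controller for one gimbal axis, per the description above.
    KP = 2.0  # proportional gain; a made-up value, not a tuned one

    def p_output(target_deg, measured_deg):
        """Motor command proportional to the attitude error."""
        error = target_deg - measured_deg
        return KP * error  # adding I and D terms would turn this into full PID

    # Camera should be level (0 deg) but the IMU reads 3 deg:
    print(p_output(0.0, 3.0))  # -6.0, drive the motor against the error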

It is now possible to set the parameters over a serial port interface: sensor calibration, PID gains, motors on and off, motor power, steps per degree, etc. We have started to develop a console for end users. We would like to use MAVLink to connect the gimbal to the console application; in our tests we are already using the 3DR Radio as well, and it works fine!

For more info and live updates, check here:

http://www.virtualrobotix.com/group/vr-gimbal-user-group?xg_source=activity

Best

Roberto

 

Read more…

Photosynthesis Assessments


NDVI Result

I recently modified an A2300 by taking out the internal IR filter and finding a gel filter that blocks red light and lets IR pass.  I took two pictures, one with the Rosco 2007 gel filter and one without.  Neither image has the built-in glass IR filter.

I used this tutorial to do the image manipulation.  I hear there are processing tools out there; I'm going to give them a try, and perhaps create a Photoshop macro to make it quicker.

Visible base


NIR base - with gel filter


I then processed the images in Photoshop using two methods: first the Normalized Difference Vegetation Index (NDVI), shown in the image at the top, and then NRG (below), where Near-Infrared, Red, and Green are used to compose the picture instead of the usual Red, Green, and Blue.  (Thanks for the great site, Public Lab!  http://publiclab.org/wiki/ndvi)
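For anyone who wants to skip the Photoshop steps, the NDVI arithmetic itself is small. Here is a rough sketch in Python with numpy and Pillow, assuming the two shots were taken from the same spot, with the red channel of the unfiltered image as Red and the red channel of the gel-filtered image as NIR (file names are placeholders):

    import numpy as np
    from PIL import Image

    # Two registered shots: one normal, one through the IR-pass gel filter.
    vis = np.asarray(Image.open("visible.jpg"), dtype=float)
    nir = np.asarray(Image.open("nir.jpg"), dtype=float)

    red = vis[..., 0]       # red channel of the unfiltered image
    infrared = nir[..., 0]  # NIR ends up in the red channel of the filtered image

    ndvi = (infrared - red) / (infrared + red + 1e-6)  # avoid divide-by-zero
    # Map -1..1 to 0..255 grayscale for display
    Image.fromarray(((ndvi + 1) / 2 * 255).astype(np.uint8)).save("ndvi.png")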

NRG result


I'm obviously itching to do this with aerial images, but I wanted to throw this out there and see where I'm falling short.  I think the NDVI image begins to show me valuable information (where less IR is reflected), but any tips on improving the results are greatly appreciated.  Comments on the importance of custom white balance are also appreciated.

If you have any questions, please don't hesitate to ask!  Thanks!

Read more…


Excerpts from the Telegraph

May 27, 2013:


...

The small, lightweight, battery-powered Falcon drones can be launched by hand in minutes and fly over a range of five miles for up to 90 minutes. Fitted with high-resolution infrared cameras, they can pick out elephants, rhinos and lions as well as anyone that might be tracking them.

...

When the Falcon drone's creator Chris Miser arrived at Olifants West conservancy at the foot of the Klein Drakensberg mountains of Limpopo last weekend, he planned to run some simple test flights to get used to the bushy terrain.

These were abandoned when a call came through on Saturday afternoon from a neighbouring reserve that two of their rangers had been shot at by suspected poachers – one taking a bullet in his hand-held radio.

...


Full story here: http://www.telegraph.co.uk/news/worldnews/africaandindianocean/southafrica/10082727/Drones-join-war-on-rhino-poachers-in-South-Africa.html

Read more…

Here is a translation of an article I found last week (Dutch: http://nutech.nl/gadgets/3482751/drones-mogen-zonder-vergunning-lucht-in.html)

The Ministry of Defence's prohibition on taking photos and videos from the air will be withdrawn by June 1. Google's and Microsoft's mapping services have made the ban redundant, and drone owners no longer need a permit.

Owners of drones, kites, or private helicopters or planes who want to take photographs can now do so without permission. Previously, a permit from the Netherlands Ministry of Defence was required, and those who received one were handed a map showing where they were not allowed to fly. Violating the law could result in two months in prison.

The law was intended to counter espionage, but it has been overtaken by satellite imagery becoming public. The Ministry points to Microsoft and Google as the cause of the change in legislation, because they fall outside the regulations.

With the disappearance of the ban, it is now also allowed to take aerial pictures of military sites. It remains forbidden to photograph military sites from the ground. The Ministry still reserves the right to close airspace under certain conditions.

A spokesman for the Ministry of Defence stressed that lifting the ban does not mean other regulations no longer apply. For example, under the Data Protection Act it is still not allowed to take pictures when flying over other people's property.

I hope my translation makes sense; some words were hard to translate correctly.

Well, this seems like good news for my fellow Dutch pilots who take pictures with their model aircraft.

Read more…

I'm obsessed with low-cost IR imagery for agricultural drones these days, so this appeared on my radar. I've backed the Kickstarter project, which has just passed its funding goal:

A simple, cheap infrared camera which can measure plant health -- for geek gardeners, farmers, and open source DIY scientists.

What could farmers, gardeners, students or environmental activists do with an infrared camera that costs as little as $35?

What is Infragram?

Infragram is a simple, affordable near-infrared camera produced by the Public Laboratory community in a series of collaborative experiments over the last few years. We originally developed this technology to monitor wetland damage in the wake of the BP oil spill, but its simplicity of use and easy-to-modify open-source hardware and software make it a useful tool for home gardeners, hikers, makers, farmers, amateur scientists, teachers, artists, and anyone curious about the secret lives of plants.

What can you do with Infragram?

  • Monitor your household plants
  • Teach students about plant growth and photosynthesis
  • Create exciting science fair projects
  • Generate verifiable, open environmental data
  • Check progress of environmental restoration projects
  • Pretend you have super-veg-powers

Near-infrared photography has been a key tool for planning at the industrial and governmental level: it is used on airplanes and satellites by vineyards, large farms, and even NASA for sophisticated agricultural and ecological assessment. In contrast, Infragram allows average people to monitor their environment through verifiable, quantifiable, citizen-generated data. Just as photography was instrumental to the rise of credible print journalism, inexpensive, open-source data-collection technologies democratize and improve reporting about environmental impacts.

Start exploring your world today with Infragram!

How does it work?

Photosynthesizing plants absorb most visible light (less green than red and blue, which is why they're green to our eyes!) but reflect near-infrared. When you take a picture with the Infragram, you get two separate images -- infrared and regular light -- and a false-color composite that shows you where there are big differences. Bright spots in the composite mean lots of photosynthesis! (Learn more here.)

We're able to get both channels in one shot by filtering out the red light and reading infrared in its place, using a piece of carefully chosen "superblue" filter material (read more here). The images are later processed online -- combining the blue and infrared channels into an image map of photosynthesis (as shown above).
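As a rough sketch of that combination step (assuming, per the description above, that the red channel carries NIR and the blue channel carries visible light; this is an illustration, not Public Lab's actual processing code):

    import numpy as np
    from PIL import Image

    # One "superblue" shot: red channel = NIR, blue channel = visible.
    img = np.asarray(Image.open("infragram.jpg"), dtype=float)
    nir, blue = img[..., 0], img[..., 2]
    ndvi = (nir - blue) / (nir + blue + 1e-6)  # per-pixel photosynthesis map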

What you get

DIY Filter Pack: This is just a piece of "superblue" filter which you can use to turn your webcam or cheap point-and-shoot into an infrared camera. The filter allows you to take an infrared photo in the "red" channel of your camera, and a visible image in the "blue" channel. You'll also receive a white balance card and instructions on how to install your filter -- it's pretty easy!

Infragram Webcam: This inexpensive but flexible reward is perfect for plugging directly into your laptop or integrating into other projects. It's also ideal for your Raspberry Pi, if you want to take it outdoors, do timelapse photography, or write scripts to control your camera. It ships as a bare circuit board with a USB cable - like an Arduino. 

Infragram Point & Shoot: Just want a camera? This is a straightforward, if basic, point-and-shoot: you can simply take photos as you normally would, then upload them to our free and open-source web app to quickly and easily get a variety of composite images and analyses. To accomplish this, we're simply modifying existing cameras which we'll buy in bulk, using the "superblue" filter. This isn't an SLR or even a particularly fully featured camera -- it likely won't have an LCD screen and may be "rebranded" with a Public Lab sticker -- but it's the new filter we've put inside which counts. 

The final configuration will depend on the number of backers, but it will likely use AAA batteries and a micro SD card. We're promising a minimum of 2 megapixel resolution, but should be able to do much better, especially if we get a lot of backers. Basically, the more money we raise, the better these cameras will get!

How you’ll develop your images 

Whether you’re using our DIY filter with your own camera, the Infragram Webcam, or the Infragram Point & Shoot, you’ll be following the same, easy process to generate composite, infrared + visible images that will reveal new details of plant health and photosynthesis.

1. Calibrate. In order to get the most meaningful data possible from your plant images, it’s a good idea to ‘calibrate’ your camera, taking into account the current lighting conditions (sunny vs. cloudy, indoors vs. outdoors) at the time that you’re taking your photos: this makes it much easier to compare ‘plant health’ images taken at different times, in different places, and by different cameras. To make this easy, we’ll likely be providing an additional ‘white balance card’ -- simply, a card that has a standard color -- in our kits. By recording an initial image that includes this card, you’ll be able to use our online software to “standardize” the colors in all of your images. If you don’t have a card, don’t worry -- there will also be opportunities to calibrate your imagery automagically later, using our analysis software, and the results might be just as good. 

 2. Take your snapshot. “Rhododendrons -- say cheese!” Using your own camera (modded with our DIY filter), the Infragram Webcam, or the Infragram Point & Shoot, you’ll record the scene of your choosing -- ideally, with some vegetation-y life forms in it. Take pictures of household plants, garden vegetables, trees -- we’ve grabbed a lot of useful agricultural imagery from cameras dangling from kites and balloons! The Public Lab website and mailing list are already full of examples and suggestions related to infrared photography, and it’s easy to start a discussion with community members about your ideas, or ask for advice. 

3. Upload. After you’ve finished an image capture session, you’ll want to upload your images using the (free, open source) online software our community is developing. This will likely simply involve navigating to a particular URL and dragging-and-dropping your images onto a specified area of a webpage. Easy peasy. 

 4. Analyze. If you thought the prior steps were fun, this step is fun +1 . We’re planning on providing a suite of image analysis tools online, so that everyone from researchers to geek gardeners can analyze, tweak, modify, and re-analyze their imagery to their heart’s content, extracting useful information about plant health and biomass assessment along the way. 

 5. Share. And perhaps the most exciting aspect of all: your imagery, your work, and your insights can easily be shared with the rest of the Public Lab community via this online service, the Public Lab mailing lists, and wikis and research notes at http://publiclab.org. Develop a kite-based aerial imagery project with your friends; get advice from NDVI researchers in the community as to the best techniques for yielding useful information from your garden photos; create and collaborate on new methods and protocols around DIY infrared photography. Public Lab’s ‘share and share alike’, ‘open source’ learning community model is not only fun -- it’s a great way to make rapid progress on any project!

Prototypes

The Infragram has been in active development for over a year now; our first prototype was made over 2 years ago and was simply some custom filter material taped to a camera! 

Others have used 2 aligned webcams, and many have been tested alongside custom Raspberry Pi controller code which auto-composites the imagery. Read more about the collaborative, open source development of this tool here: http://publiclab.org/tag/near-infrared-camera

The modification to the camera happens inside, out of sight, unlike in some of the above prototypes. The final version will be based on a mass-produced camera like the one below (one of our prototype mods), though we are waiting to see how many backers we get before settling on a final model. If we have enough backers, we'd love to do a fully custom enclosure as well!

Gallery

Here are some photos showing Infragram infrared composite images from various prototypes. You can find more images here.

Who are we?

Public Lab is a community of tinkerers and concerned citizens (supported by a nonprofit) which develops and applies open-source tools for environmental exploration and investigation. Our small nonprofit has run three successful Kickstarters: Grassroots Mapping the BP Oil Spill, Balloon Mapping Kits, and DIY Spectrometry.

Our community of contributors helps each other problem-solve and troubleshoot through online mailing lists and research notes, organized by region and topic. This allows community members to share research, and lets the combined brain power of dozens of people answer questions and innovate rather than relying on a traditional "customer service" model. Join the Public Lab infrared discussion list today to get started!

On this project, we've collaborated with a range of different groups to meet the diverse needs of gardeners, farmers, environmental activists, conservationists, and more. These include: Gulf Restoration Network, FarmHack, GreenStart, the Design Trust’s Five Borough Farm project, Belize Open Source, and others!

Public Lab staff in Cocodrie, LA -- January 2013

Risks and challenges

This is Public Lab’s fourth Kickstarter campaign; our experience with the previous three successful ones has taught us a lot about everything from production to fulfillment and international shipping. We've already shipped functional prototypes of the camera (which was actually a reward in a previous campaign), and are confident that we’ll be able to turn out a great new design for larger-scale production with your support.

Our main unknowns at this point are the final specs on the different cameras we'll produce -- such as maximum resolution, battery life, and housing, which we have plenty of options for, but for which we need to know final order quantities before finalizing design work and choosing a supplier. We're lucky to have a vibrant open source community behind the project at PublicLab.org -- one which you're encouraged to join to help move the project along by contributing your skills!

The fact that we are already a highly transparent and inclusive open source community means that sharing every step of the process is baked into our DNA. When we've had trouble in the past, we've asked for help and our contributors and backers have pitched in!

Read more…


A couple of weeks ago Tridge mentioned an idea: It would be great if droneshare scanned uploaded tlogs for parameters that are outside of the recommended range. 

I've just gotten around to adding this feature, so now any time you upload a flight via the webpage or Andropilot, the parameters are checked against the recommended values in the APM source code.  If a parameter is outside its recommended range, a warning is shown.

If you see a warning for one of your vehicles, it is probably best to ask in the appropriate DIY Drones forum.  It may be that the value you are using is fine and the comment in the source code is just wrong -- I don't think these ranges have been used much in the past, so some errors are possible.
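Conceptually the check is simple. Here is a minimal sketch of the idea; the parameter names are real APM parameters, but the ranges are made up for illustration, and this is not droneshare's actual code:

    # Flag parameters that fall outside a recommended range.
    # Example ranges only; the real values come from the APM source code.
    RECOMMENDED = {
        "RATE_RLL_P": (0.05, 0.25),
        "THR_MID": (300, 700),
    }

    def check_params(params):
        """Yield a warning for every parameter outside its recommended range."""
        for name, value in params.items():
            bounds = RECOMMENDED.get(name)
            if bounds and not (bounds[0] <= value <= bounds[1]):
                yield "%s=%s outside recommended range %s" % (name, value, bounds)

    for warning in check_params({"RATE_RLL_P": 0.5, "THR_MID": 500}):
        print(warning)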


Here's a typical flight that has a warning.  

Also, I can now happily report that a few hundred new flights are being uploaded each week.

Read more…


 "Fly 'em hard and put 'em away wet.  They're only drones."

Game of Drones is a new video series focused on designing and creating hardcore airframes that can take punishment.  Waterproof, bulletproof, and crash-proof are the goals.  Pictured here is the "Flying Squirrel", the Mark III design.  With the GoPro camera integrated right in, this is an ideal airframe for learning to fly FPV.

Motors mount directly to the material with the "T" bracket...

The rough mold on the vacuum-forming table.  The GoPro was a last-minute addition.

A few of the design concepts that will soon be arriving from GoD.

This spherical ring design is strong enough to support 200 lbs. without warping or bending.  Now to find motors that can actually lift that weight.

Read more…

PX4 as Student and Research Platform


The image shows a successful flight of an extremely simple fixed-wing controller from the tutorial (it flies GPS waypoints or in manual mode, supports MAVLink parameters, and runs automatically synchronized to the attitude filter). "Simple" can mean many things; in this case we mean simple for hobbyists or students interested in estimation and control. Developing on PX4 is simple in the same fashion that writing a "Hello World!" application from scratch is simple on a normal Unix or Windows machine (the equivalent is the PX4 "Hello Sky" tutorial). To write flight-control code there is no need to dig through the whole codebase, or to mess with the main loop (and risk introducing unintended side effects).

Of course it is still flexible enough to run monolithic designs, and APM Plane and Copter are fully operational (thanks to Andrew Tridgell and Randy Mackay); they are the recommended flight apps at this point for average users.

The open source autopilot field has evolved quite a bit recently, and while everybody these days can hack together a flying quadrotor within a few days, taking a "simple" hardware platform like Arduino or Maple just means reinventing the wheel, since people have done it already. Not many of these new projects ever come close to what, e.g., APM provides. To really improve over the state of the art, it's important to focus on the flight code and make it fly better, instead of creating yet another half-done piece of hardware.

The whole rationale behind PX4 is different: similar to VxWorks for the automotive and aerospace industries (guess what the Curiosity rover runs, and what most likely controls the vehicle you're driving), we're trying to provide a real-time, flexible base platform (based on NuttX, which is POSIX-inspired like VxWorks) and add common library blocks like mixers, estimators, sensor drivers, and controllers to it. But if a new developer wants to drive a different platform (a rover, boat, or blimp) with it, or run his own flight controller, there is no need to rip everything apart -- it can just be added and run as an application, without affecting other users of the platform.

For the same reason, the linked tutorials also look very similar to normal Unix programming examples. And there is a quite successful case study for this platform-based approach: if you look at the PX4 interprocess communication, you will note that working with PX4 is in many ways similar to working with ROS (Robot Operating System) in classic robotics.

If you want to get your hands dirty, here are the most relevant links from the text again:

Read more…

http://i.i.com.com/cnwk.1d/i/tim2/2013/05/24/bridge_collapse_AP529994068518_fullwidth_620x350.jpg

The Skagit River bridge collapse on Interstate 5 in Washington State, USA, got me thinking about using the Arducopter for large, inaccessible infrastructure inspections. I drove my family across this bridge many times, so I took its unanticipated collapse harder than most who heard the news in faraway places. Granted, metal fatigue or corrosion didn't directly lead to the collapse: current data (witnesses and surveillance camera footage) point to repeated collisions between a large truck-ferried load and the overhead bridge structure as the truck passed through that section of bridge. But the bridge dates from the 1950s and didn't include redundant load-bearing paths; it's classified as a "fracture critical" bridge. There are many more "fracture critical" bridges in the USA, and we don't necessarily know their structural health in detail.

The quadcopter platform appears well suited for bridge inspections from beneath or above the bridge deck. I figure it needs to stay in line of sight of the operator's RC controller, needs to provide an on-screen video display so the operator sees what the drone sees, and needs some level of "standoff protection" to keep the quad from striking the bridge structure. The bridge is more dangerous to the quad than the quad is to the bridge. Maybe ultrasonic ranging for structure following or standoff would do the trick. The quadcopter must keep its distance from the structure while under control of the operator, though. I lean toward a "hard envelope limiter" in the control law or control mode software, so the quad won't hit the inspected item no matter what the operator commands (or a variable limiter that requires a lot of operator override to get the quad closer; that's more in keeping with my Boeing bias).
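A minimal sketch of such an envelope limiter, with hypothetical thresholds, assuming a sonar range reading toward the structure and an operator velocity command where positive means "toward the structure":

    # Envelope limiter sketch; thresholds are illustrative, not tested values.
    MIN_STANDOFF_M = 2.0  # hard floor: never approach closer than this
    SOFT_ZONE_M = 5.0     # start scaling approach commands inside this range

    def limit_approach(cmd_mps, range_m):
        """Clamp the operator's approach command based on sonar range."""
        if range_m <= MIN_STANDOFF_M:
            return min(cmd_mps, 0.0)  # hard limit: only allow motion away
        if range_m < SOFT_ZONE_M and cmd_mps > 0:
            # variable limit: fade the approach command out near the structure
            scale = (range_m - MIN_STANDOFF_M) / (SOFT_ZONE_M - MIN_STANDOFF_M)
            return cmd_mps * scale
        return cmd_mps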

The quad blades may require a perimeter shield around them to keep them from contacting the bridge structure, too, just in case a wind gust shoves the "airbot" inspector too close.

Protecting the quadcopter from inadvertent structure contact is an interesting problem in itself, and it seems tractable on the surface. The greater challenge is remote sensing of the structure. What kind of sensors can a quad carry near a structure to probe its integrity? Is it limited to passive sensing (optical), or can it do active sensing (induction from a distance, or via a probe, to measure structure response), sampling (wow, take a flake), or ultrasound? We can learn a lot by dumping some energy into the structure and measuring its response, in addition to watching it in parts of the visual spectrum.

Remote sensing satellites solved a lot of these problems, and they may offer direct or indirect advice on the sensor approach. Satellite sensors can measure the reflected light spectrum from soils, plants, and rocks, for example. Making sense of the spectra requires "ground truth" measurements: the spectra need to be calibrated against soil or plant conditions and mineral types.

But paint can mask rust, and fatigue cracks may not show at the structure surface. If this is the case, then energy needs to be directed into the structure from the quad, and its effects need to be measured by the quad. The Wikipedia entry on non-destructive metal testing lists some interesting methods that could (in theory) be adapted to a quad, such as magnetic particle testing and liquid penetrant testing, plus methods that require some form of physical contact during the measurement, like eddy current testing and ultrasound testing.

Physical contact testing would be a challenge in that the quad needs to station-keep, not hit the structure, and simultaneously extend one or more devices to make contact with the structure during the test. Is that a great controls problem or what?  And more: the quad's measurement device needs energy (a battery), and that's weight. The quad itself needs battery energy to fly for a while -- 10 minutes or so. This may be the biggest challenge for quadcopters that actively probe structures: the energy needed to fly and to actively probe the structure must all come from batteries. It may be a difficult optimization to allow both a reasonable inspection and a reasonable mission time.

Passive structure observation seems like a shoo-in with a quad and one or more cameras, with respect to both passively probing the structure and mission time. A quad can take close-up video and pictures of structures illuminated from beyond the quad by the operator. That can address the battery issue: if the operator can use a separate device to illuminate the structure with ultraviolet, infrared, or laser light, and the quad can measure the reflected result, then the "active probe" is not on the quad, doesn't need to be lifted, and doesn't need to be powered by the quad.

But the kinds of light we can shine on the subject may not illuminate all possible defects we want to detect.

Overall an interesting set of engineering problems, when we think about using a quad copter to inspect structures...

Read more…

SYDNEY | Sun May 26, 2013 5:15pm EDT

(Reuters) - Moving carefully along a row of apple trees, two of Australia's newest agricultural workers check if the fruit is ripe or the soil needs water or fertilizer.

Meet "Mantis" and "Shrimp", agricultural robots being tested to do these tasks and more in a bid to cut costs and improve productivity in Australia's economically vital farm sector, which exported $39.6 billion ($38.8 billion)of produce in 2012.

Australia is one of the leaders in the field and, with a minimum wage of $15.96 per hour and a limited workforce, has a big incentive to use robots and other technology such as unmanned aircraft to improve efficiency.

It hopes to tap fast-growing Asian neighbors, where the swelling ranks of the middle class increasingly want more varied and better quality food from blueberries to beef.

"The adoption of new technology is going to be crucial for Australia to maintain its competitiveness in terms of the global agricultural sector," said Luke Matthews, commodities strategist at the Commonwealth Bank of Australia.

"If we don't adopt new technology, we can give up on these high-flying ambitions of being the food bowl of Asia."

Agriculture now accounts for 2 percent of Australia's gross domestic product, but the government forecasts it could reach 5 percent by 2050. Its growth is particularly important now the once-booming mining sector is slowing.

Australia is the world's second-biggest wheat exporter and arable farmers are already using specialized technology aimed at improving efficiency, including satellite positioning software to allow farmers to map out land and soil to determine optimal inputs.

Using such technology to optimize the use of fertilizer can boost profitability at grain farms by 14 percent, according to a study by Australia's Commonwealth Scientific and Industrial Research Organisation.

COLOUR RECEPTORS

A robot effortlessly plucking fruit is some way off, though a range of simpler tasks are within reach to add to existing technology such as automatic steering of harvesters.

Salah Sukkarieh, Professor of Robotics and Intelligent Systems at the University of Sydney and developer of Mantis and Shrimp, says the next phase aims for robots to do increasingly complex jobs such as watering and ultimately harvesting.

"We have fitted them with a lot of sensors, vision, laser, radar and conductivity sensors - including GPS and thermal sensors," said Sukkarieh, speaking at his laboratory housing a collection of both ground robots and unmanned air vehicles.

The technology could have the biggest application in horticulture, Australia's third-largest agricultural sector with exports of $1.71 billion in the last marketing year, since a fixed farm layout lends itself better to using robots.

Robots and an unmanned air vehicle that are being developed at the University of Sydney had passed field tests at an almond farm in Mildura, Victoria state, said Sukkarieh.

Propelled by sets of wheels and about the height of a man, the robots were named after the native Mantis shrimp because of the marine crustacean's 16 different color receptors, capable of detecting up to 12 colors. Humans only have four, three of which pick up colors.

This capacity to recognize color already allows the robots to sense whether fruit is ripe.

The data can then be processed by computer algorithms to determine what action the robot should take. This could be to water or apply fertilizer or pesticides, or to sweep and prune vegetation, and eventually the aim is to harvest the crop.

"If tomorrow we got an apple, orange or tomato farmer that wants a robot to go up and down these tree crops reliably and accurately, we can do that within six months to a year."

"The question is can we make them more intelligent," added Sukkarieh, who also sees the technology being attached to standard farm vehicles and foresees a fully automated horticulture farm within 10 years.

BRUISED APPLES

Australian farmers, who depend on seasonal labor for jobs such as picking fruit and vegetables, said they would welcome high-tech help.

"Berry picking by a robot would be difficult but if they could produce a robot, I could make a significant saving," said Allan Dixon, co-owner of the Clyde River Berry Farm in New South Wales, who typically takes on five people every year.

To get enough agricultural workers, Australia allows in some labor from neighboring Pacific island countries and East Timor, as well as using backpackers on temporary work visas.

Some fruit farmers remain skeptical.

"Apples will always need to be harvested by hand, due to their fragile nature. They bruise very easily," said Lucinda Giblett, director at Newton Orchards in Western Australia.

"We see no current opportunities offered by agricultural robots. Even as a pruning device, application is very limited," added Giblett

PRODUCTIVITY

Further productivity gains will be needed if Australia is to reach its target of being the main food supplier to Asia.

A 2011 study by the Australian Bureau of Agriculture and Resource Economics and Sciences said around two-thirds of the increase in the monetary value of agricultural production in the country over the last 50 years was down to gains in productivity.

Another survey by the Grains Research and Development Corporation showed 67 percent of respondents in 2011 used auto-steer technology to guide machinery such as harvesters and sprayers, up from 47 percent in 2008.

Obstacles to using more technology remain, however, including the cost of buying or renting equipment and slower growth in research and development spending. Some studies show growth in the use of satellite imagery and soil mapping has stagnated in Australia and the United States in recent years.

Regardless of whether it can meet its targets to supply more food to Asia, Australia is expected to play a big role in global food security by being one of the test beds for new ways to produce food more efficiently in often harsh conditions.

(Writing by Ed Davies; Editing by Michael Urquhart)

Read more…

Drone mapping in Peru

University grounds: Above

Below you can find some results of a four-week mapping campaign I did in Peru along with people from KU Leuven, Belgium, and University La Molina, Peru. Images were processed with PIX4D and DroneMapper. Special thanks to JP from DroneMapper for the good support and tips & tricks. I hope you like the videos!

Archeological site (Caral):

Amazon area:

Read more…


I was playing with my custom-built quadcopter based on the MultiWii platform, but had trouble finding a comprehensive guide to the stick configuration functions online. I stumbled upon Mr. Hamburger's awesome cheatsheet, but found that it was wrong for my Mode 1 stick configuration: I have throttle and yaw on the right stick and pitch and roll on the left. So I made a version that works for my configuration. I haven't tested all the functions, but they should work.

You can download it as a PDF. And once again, all the kudos to Mr. Hamburger for his efforts.

Read more…

QGroundControl Update


After investing most of our time into PX4 for quite a while now, the PX4 native stack has reached a shape where we can fly fixed-wing in autonomous mode with it, and there is also progress on multicopters. This put QGroundControl back into the development focus, since field tests showed some potential improvements, and PX4 user feedback indicated that features like RC calibration were highly wanted. The screenshots below show the current 1.0.9 version.

In parallel, work is under way from the 3DR dev team under Michael Oborne's lead to add some APM-specific features to QGroundControl and to bring some of the very well engineered features of Mission Planner to a cross-platform codebase. One of the first and most visually prominent features will be the Primary Flight Display, but there is much more to come.

A first Mac OS binary is available on the QGroundControl downloads page, a windows binary will follow shortly.


Read more…

Here is my connection (you can check the GPIO pinout here):


I used the FTDI GPS adapter cable; the solder pads can be used as well, just keep in mind that the TX pad = RX on the Ublox and the RX pad = TX on the Ublox.

To get information from GPS in the Raspberry Pi (RPi), we will use gpsd.

Open a terminal

Install gpsd:

sudo apt-get install gpsd gpsd-clients python-gps

Setup RPi UART:

sudo nano /boot/cmdline.txt
Remove the "ttyAMA0" parameters so the line reads:
dwc_otg.lpm_enable=0 console=tty1 root=/dev/mmcblk0p2 rootfstype=ext4 elevator=deadline rootwait

sudo nano /etc/inittab
Comment out (add a hash, #) the line below "#Spawn a getty on Raspberry Pi serial line", like this:
#T0:23:respawn:/sbin/getty -L ttyAMA0 115200 vt100

Reboot the RPi to apply the configuration:
sudo reboot

At this point gpsd is installed. If a gpsd session is already running, stop it with:

sudo killall gpsd

Since the 3DR GPS is configured at 38400 baud, the RPi port needs to be set to 38400 baud as well:

stty -F /dev/ttyAMA0 38400

(After a reboot, the baud rate returns to the default.)

Start gpsd:

sudo gpsd /dev/ttyAMA0 -F /var/run/gpsd.sock

Display GPS data:

cgps -s

When the 3DR GPS has a fix, the blue fix LED will blink and you will see data like this:


If the GPS won't get a fix, try it outdoors where you have a clear view of the sky.
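Once gpsd is running, you can also read the position from your own code. Here is a minimal sketch using the python-gps bindings installed above; treat it as a starting point rather than a polished client:

    from gps import gps, WATCH_ENABLE

    # Connect to the local gpsd daemon and stream reports.
    session = gps(mode=WATCH_ENABLE)
    while True:
        report = session.next()  # blocks until gpsd delivers the next report
        if report['class'] == 'TPV':  # time-position-velocity report
            lat = getattr(report, 'lat', None)
            lon = getattr(report, 'lon', None)
            print("lat:", lat, "lon:", lon)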

Read more…