3D Robotics

I'm obsessed with low-cost IR imagery for agricultural drones these days, so this appeared on my radar. I've backed the Kickstarter project, which has just passed its funding goal:

A simple, cheap infrared camera which can measure plant health -- for geek gardeners, farmers, and open source DIY scientists.

What could farmers, gardeners, students or environmental activists do with an infrared camera that costs as little as $35?

What is Infragram?

Infragram is a simple, affordable near-infrared camera produced by the Public Laboratory community in a series of collaborative experiments over the last few years. We originally developed this technology to monitor wetland damage in the wake of the BP oil spill, but its simplicity of use and its easy-to-modify open-source hardware & software make it a useful tool for home gardeners, hikers, makers, farmers, amateur scientists, teachers, artists, and anyone curious about the secret lives of plants.

What can you do with Infragram?

  • Monitor your household plants
  • Teach students about plant growth and photosynthesis
  • Create exciting science fair projects
  • Generate verifiable, open environmental data
  • Check progress of environmental restoration projects
  • Pretend you have super-veg-powers

Near-infrared photography has been a key tool for planning at the industrial and governmental level: it is used on airplanes and satellites by vineyards, large farms, and even NASA for sophisticated agricultural and ecological assessment. In contrast, Infragram allows average people to monitor their environment through verifiable, quantifiable, citizen-generated data. Just as photography was instrumental to the rise of credible print journalism, inexpensive, open-source data-collection technologies democratize and improve reporting about environmental impacts.

Start exploring your world today with Infragram!

How does it work?

Photosynthesizing plants absorb most visible light (less green than red and blue, which is why they're green to our eyes!) but reflect near-infrared. When you take a picture with the Infragram, you get two separate images -- infrared and regular light -- and a false-color composite that shows you where there are big differences. Bright spots in the composite mean lots of photosynthesis! (Learn more here.)

We're able to get both images with one camera by filtering out the red light and reading near-infrared in its place, using a piece of carefully chosen "superblue" filter material (read more here). The images are later processed online -- combining the blue and infrared channels into an image map of photosynthesis (as shown above).
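As a rough sketch of the channel arithmetic described above -- assuming the red channel records near-infrared, the blue channel records visible light, and using the standard NDVI formula; the actual Infragram web tools may differ:

```python
import numpy as np

def ndvi_from_superblue(rgb):
    """Compute an NDVI-style vegetation map from a photo taken through a
    "superblue" filter: the red channel records near-infrared (NIR) and
    the blue channel records visible light (VIS).

    rgb: H x W x 3 array of 0-255 values. Returns values in [-1, 1];
    healthy vegetation reflects NIR strongly, so it scores high.
    """
    nir = rgb[..., 0].astype(float)  # red channel carries NIR
    vis = rgb[..., 2].astype(float)  # blue channel carries visible light
    # Standard NDVI: (NIR - VIS) / (NIR + VIS), guarding against
    # division by zero in dark pixels.
    return (nir - vis) / np.maximum(nir + vis, 1e-6)
```

A pixel that reflects 200 in the NIR channel and 50 in the visible channel would score (200 - 50) / (200 + 50) = 0.6, i.e. likely vegetation.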

What you get

DIY Filter Pack: This is just a piece of "superblue" filter which you can use to turn your webcam or cheap point-and-shoot into an infrared camera. The filter allows you to take an infrared photo in the "red" channel of your camera, and a visible image in the "blue" channel. You'll also receive a white balance card and instructions on how to install your filter -- it's pretty easy!

Infragram Webcam: This inexpensive but flexible reward is perfect for plugging directly into your laptop or integrating into other projects. It's also ideal for your Raspberry Pi, if you want to take it outdoors, do timelapse photography, or write scripts to control your camera. It ships as a bare circuit board with a USB cable - like an Arduino. 
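To illustrate the kind of timelapse script mentioned above, here is a minimal, camera-agnostic loop (the helper names are hypothetical; on a Raspberry Pi you might wire `grab_frame` to OpenCV's `VideoCapture.read` and `save_frame` to `imwrite`):

```python
import time

def timelapse(grab_frame, save_frame, interval_s=60, count=10):
    """Run a simple timelapse: call grab_frame() for each shot, hand the
    result to save_frame(index, frame), then wait interval_s seconds.
    Returns the number of frames actually captured."""
    captured = 0
    for i in range(count):
        frame = grab_frame()
        if frame is not None:  # skip failed captures
            save_frame(i, frame)
            captured += 1
        time.sleep(interval_s)
    return captured
```

Keeping the capture and storage callbacks separate makes the same loop usable for a USB webcam, a Pi camera module, or a test stub.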

Infragram Point & Shoot: Just want a camera? This is a straightforward, if basic, point-and-shoot: you can simply take photos as you normally would, then upload them to our free and open-source web app to quickly and easily get a variety of composite images and analyses. To accomplish this, we're simply modifying existing cameras which we'll buy in bulk, using the "superblue" filter. This isn't an SLR or even a particularly fully featured camera -- it likely won't have an LCD screen and may be "rebranded" with a Public Lab sticker -- but it's the new filter we've put inside which counts. 

The final configuration will depend on the number of backers, but it will likely use AAA batteries and a micro SD card. We're promising a minimum of 2-megapixel resolution, but we should be able to do much better, especially if we get a lot of backers. Basically, the more money we raise, the better these cameras will get!

How you’ll develop your images 

Whether you’re using our DIY filter with your own camera, the Infragram Webcam, or the Infragram Point & Shoot, you’ll follow the same easy process to generate composite infrared + visible images that will reveal new details of plant health and photosynthesis.

1. Calibrate. In order to get the most meaningful data possible from your plant images, it’s a good idea to ‘calibrate’ your camera, taking into account the current lighting conditions (sunny vs. cloudy, indoors vs. outdoors) at the time that you’re taking your photos: this makes it much easier to compare ‘plant health’ images taken at different times, in different places, and by different cameras. To make this easy, we’ll likely be providing an additional ‘white balance card’ -- simply, a card that has a standard color -- in our kits. By recording an initial image that includes this card, you’ll be able to use our online software to “standardize” the colors in all of your images. If you don’t have a card, don’t worry -- there will also be opportunities to calibrate your imagery automagically later, using our analysis software, and the results might be just as good. 

2. Take your snapshot. “Rhododendrons -- say cheese!” Using your own camera (modded with our DIY filter), the Infragram Webcam, or the Infragram Point & Shoot, you’ll record the scene of your choosing -- ideally, with some vegetation-y life forms in it. Take pictures of household plants, garden vegetables, trees -- we’ve grabbed a lot of useful agricultural imagery from cameras dangling from kites and balloons! The Public Lab website and mailing list are already full of examples and suggestions related to infrared photography, and it’s easy to start a discussion with community members about your ideas, or ask for advice.

3. Upload. After you’ve finished an image capture session, you’ll want to upload your images using the (free, open source) online software our community is developing. This will likely simply involve navigating to a particular URL and dragging-and-dropping your images onto a specified area of a webpage. Easy peasy. 

4. Analyze. If you thought the prior steps were fun, this step is fun + 1. We’re planning to provide a suite of image analysis tools online, so that everyone from researchers to geek gardeners can analyze, tweak, modify, and re-analyze their imagery to their heart’s content, extracting useful information about plant health and biomass along the way.

5. Share. And perhaps the most exciting aspect of all: your imagery, your work, and your insights can easily be shared with the rest of the Public Lab community via this online service, the Public Lab mailing lists, and wikis and research notes at http://publiclab.org. Develop a kite-based aerial imagery project with your friends; get advice from NDVI researchers in the community as to the best techniques for yielding useful information from your garden photos; create and collaborate on new methods and protocols around DIY infrared photography. Public Lab’s ‘share and share alike’, ‘open source’ learning community model is not only fun -- it’s a great way to make rapid progress on any project!
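To make step 1 concrete, here is one way white-balance calibration against a reference card could work. This is a minimal sketch; the card-region coordinates and target value are illustrative, and Public Lab's actual software may use a different method:

```python
import numpy as np

def gains_from_card(img, card_region, target=200.0):
    """Compute per-channel gains so that the white-balance card found in
    card_region (a (y0, y1, x0, x1) box in the reference photo) averages
    `target` in every channel."""
    y0, y1, x0, x1 = card_region
    card = img[y0:y1, x0:x1].astype(float)
    # Mean of each channel over the card area; gain rescales it to target.
    return target / card.reshape(-1, img.shape[-1]).mean(axis=0)

def apply_gains(img, gains):
    """Apply the gains to later photos taken under the same lighting."""
    return np.clip(img.astype(float) * gains, 0, 255).astype(np.uint8)
```

Because the gains are computed once per lighting condition and reused, images taken at different times or with different cameras become directly comparable.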

Prototypes

The Infragram has been in active development for over a year now; our first prototype was made over 2 years ago and was simply some custom filter material taped to a camera! 

Others have used two aligned webcams, and many prototypes have been tested alongside custom Raspberry Pi controller code which auto-composites the imagery. Read more about the collaborative, open source development of this tool here: http://publiclab.org/tag/near-infrared-camera

The modification to the camera happens inside, out of sight, unlike in some of the above prototypes. The final version will be based on a mass-produced camera like the one below (one of our prototype mods), though we are waiting to see how many backers we get before settling on a final model. If we have enough backers, we'd love to do a fully custom enclosure as well!

Gallery

Here are some photos showing Infragram infrared composite images from various prototypes. You can find more images here.

Who are we?

Public Lab is a community of tinkerers and concerned citizens (supported by a nonprofit) that develops and applies open-source tools for environmental exploration and investigation. Our small nonprofit has run three successful Kickstarters: Grassroots Mapping the BP Oil Spill, Balloon Mapping Kits, and DIY Spectrometry.

Our community of contributors helps each other problem-solve and troubleshoot through online mailing lists and research notes, organized by region and topic. This allows community members to share research, and lets the combined brain power of dozens of people answer questions and innovate rather than relying on a traditional "customer service" model. Join the Public Lab infrared discussion list today to get started!

On this project, we've collaborated with a range of different groups to meet the diverse needs of gardeners, farmers, environmental activists, conservationists, and more. These include: Gulf Restoration Network, FarmHack, GreenStart, the Design Trust’s Five Borough Farm project, Belize Open Source, and others!

Public Lab Staff in Cocodrie, LA--January 2013

Risks and challenges

This is Public Lab’s fourth Kickstarter campaign; our experience with the previous three successful ones has taught us a great deal about everything from production to fulfillment and international shipping. We've already shipped functional prototypes for the camera (which was actually a reward in a previous campaign), and are confident that we’ll be able to turn out a great new design for larger scale production with your support.

Our main unknowns at this point are the final specs of the cameras we'll produce -- maximum resolution, battery life, and housing. We have plenty of options for each, but we need to know final order quantities before finalizing design work and choosing a supplier. We're lucky to have a vibrant open source community behind the project at PublicLab.org -- one which you're encouraged to join to help move the project along by contributing your skills!

The fact that we are already a highly transparent and inclusive open source community means that sharing every step of the process is baked into our DNA. When we've had trouble in the past, we've asked for help and our contributors and backers have pitched in!


Comments

  • Actually, my lack of understanding is leading me down some interesting research rabbit holes. Any way you look at it, I am backing this project to at least get the filter, and experiment some myself.
  • Also, I just read about BARC scans, or Burned Area Reflectivity Coefficient.
  • Also, researching and responding on a phone is difficult, as the many errors in my response can attest.
  • Ok, I skimmed this while on a break at work, and knew I was missing a piece of the puzzle. Shortly after my second question, my brain kicked in, and I realized the difference between NEAR infrared and heat energy (infrared like what FLIR detects). However, I did think that while this could not detect a forest fire in its infancy, it could allow forest management officials to gauge the need for response to an active fire based on the health of the forest.
  • These infrared images are not in the thermal range, so this is very different from a FLIR camera. The infrared light these cameras record is in the near-infrared range.

  • And, Ned, thank you for your reply.
  • Is this true infrared photography? In other words, if a small fire were burning in a forest, would this be able to "read" the heat difference between ground and flame, so that someone looking at it could tell? I may not be fully understanding how this works in comparison to a FLIR camera, for example.
  • 3D Robotics

    They've posted an update on the use of their filters for aerial photography. Looks good!


  • There is offline processing available for 2-camera (1 near-infrared and 1 visible) and single camera (superblue) setups using plugins that work in ImageJ and Fiji. Here is an overview (http://publiclab.org/notes/nedhorning/11-1-2012/update-photo-monito...) although when this was written the single camera options were not incorporated. 

    Real time video processing is certainly doable and I'd like to work on that but haven't found the time.
