
Centeye Modular Vision Sensors

[Photo: Centeye stereo vision sensor board]

It has been a while since I’ve posted anything about recent Centeye hardware. Some of you may remember my past work implementing lightweight integrated vision sensors (both stereo and monocular) for “micro” and “nano” drones. These incorporated vision chips I personally designed (and had fabricated in Texas) and combined optical flow, stereo depth, and proximity sensing in just 1.1 grams. Four of these on a Crazyflie were enough to provide omnidirectional obstacle avoidance and tunnel following in all ambient lighting conditions. Their main weakness was that they were time-consuming and expensive to make and to adapt to new platforms. Some of you may also remember our ArduEye project, which used our Stonyman vision chip and was our first foray into open hardware. Although that project had a slow start, it did find use in a variety of applications ranging from robotics to eye tracking. I have privately discussed rebooting the ArduEye project in some form with many people.

Like many, we faced disruption from COVID last year. We had a few slow months last summer, and I used the opportunity to create a new sensor configuration from scratch that has elements of both ArduEye and our integrated sensors. My hypothesis is that most drone makers would rather have a sensor that is modular and easy to reconfigure, adapt, or even redesign, and are OK if it weighs “a few grams” rather than just one gram. Some users even told me they would prefer a heavier version if it is more physically robust. Unlike on the nano drones I personally develop, if your drone weighs several kilograms, an extra couple of grams is negligible. I am writing here to introduce this project, get feedback, and gauge interest in making this in higher quantities.

My goals for this “modular” class of sensors were as follows:

  • Use a design that is largely part-agnostic, i.e. one that does not specifically require any single part (other than the optics and our vision chip), in order to minimize supply-chain disruptions. This may sound quaint now, but it was a big deal in 2020 when the first waves of COVID hit.
  • Use a design that is easy and inexpensive to prototype, as well as inexpensive to modify. We were influenced by the “lean startup” methodology. This includes making it easy for a user to modify both the sensor and its source code.
  • Favor open source development platforms and environments. I settled on the powerful Teensy 4.0 as the processor, the Arduino framework, and PlatformIO as the development environment. (A minimal project configuration is sketched just below.)
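For anyone who wants to try the same toolchain, a minimal PlatformIO configuration for a Teensy 4.0 target looks something like this. The serial-monitor speed is just an example setting on my part, not something the sensor requires:

```ini
; platformio.ini -- minimal configuration for a Teensy 4.0 project
; built with the Arduino framework under PlatformIO
[env:teensy40]
platform = teensy
board = teensy40
framework = arduino
; example serial-monitor baud rate (not required by the hardware)
monitor_speed = 115200
```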

We actually got it working. At the top of this post is a picture of our stereo sensor board, with a 5 cm baseline and a mass of 3.2 grams, and below is a monocular board suitable for optical flow sensing that weighs about 1.6 grams. We have also made a larger 10 cm baseline version of the stereo board, and have experimented with a variety of optics. All of these connect to a Teensy 4.0 via a 16-wire XSR cable. The Teensy 4.0 operates the vision chips, performs all image processing, and generates the output; a skeletal view of that loop is sketched below the photo. We have delivered samples to collaborators (as part of a soft launch), who have indeed integrated them on drones and flown them. Based on their feedback we are designing the next iteration.

[Photo: monocular optical flow sensor board]
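To give a sense of the architecture, here is a skeletal Arduino-style sketch of what the Teensy-side firmware does each frame. Every name in it (acquireFrames, computeDepthGrid, the -1 marker for suspect cells) is my own shorthand, not the actual Centeye firmware API:

```cpp
#include <Arduino.h>

// Skeleton of the per-frame loop (hypothetical names; the real
// Centeye firmware differs). The Teensy 4.0 clocks images out of the
// vision chips, does all processing, and emits the result.
const int ROWS = 7, COLS = 7;      // example 7x7 output grid
float depth[ROWS][COLS];           // distance estimates, meters
bool  valid[ROWS][COLS];           // passed the confidence tests?

void acquireFrames()    { /* read raw pixels over the 16-wire cable */ }
void computeDepthGrid() { /* stereo matching -> depth[][], valid[][] */ }

void setup() {
  Serial.begin(115200);            // telemetry/debug link
}

void loop() {
  acquireFrames();
  computeDepthGrid();
  for (int r = 0; r < ROWS; r++) {       // stream the grid out,
    for (int c = 0; c < COLS; c++) {     // marking suspect cells
      Serial.print(valid[r][c] ? depth[r][c] : -1.0f);
      Serial.print(c < COLS - 1 ? ' ' : '\n');
    }
  }
}
```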

As with any new product, you have to decide what it does and what it does not do. Our goal was not extremely high resolution; high-resolution sensors already exist, and the reality is that high resolution carries costs in mass, power, and light sensitivity. Instead, we sought to optimize intensity dynamic range. The vision chips use a bio-inspired architecture in which each pixel individually adapts to its light level, independently of the other pixels. The result is a sensor that can work in all light levels (“daylight to darkness”, the latter with IR LED illumination), can adapt nearly instantaneously when moving between bright and dark areas, and can function even when both bright and dark areas are in view.
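The adaptation itself happens in analog circuitry on the pixel, but a toy digital model conveys the behavior: each pixel tracks a slow running estimate of its own log intensity and reports the deviation from that estimate, so its operating point follows the local light level. This is purely my illustrative model, not the actual circuit:

```cpp
#include <math.h>

// Toy model of a self-adapting pixel. Output is the deviation of the
// instantaneous log intensity from a slowly tracked per-pixel
// baseline, so each pixel re-centers on its own light level.
struct AdaptivePixel {
  float baseline = 0.0f;                  // slow log-intensity estimate

  float update(float intensity, float alpha = 0.05f) {
    float logI = logf(intensity + 1e-6f); // log compression of light
    float out  = logI - baseline;         // adapted output signal
    baseline  += alpha * out;             // baseline drifts toward logI
    return out;                           // large only during changes
  }
};
```

With a small alpha, a step from shade into sunlight produces a brief large output that decays as the pixel re-adapts; and because every pixel keeps its own baseline, bright and dark regions in the same frame both stay within range.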

Below is an example of the stereo sensor viewing a window, closed and then open. The current implementation divides the field of view into a 7x7 array of distance measurements (in meters), which are shown overlaid on the scene. Red numbers are measurements that have passed various confidence tests; cyan numbers are those that have not (and thus should not be used for critical decisions). Note that when the window is open, the sensor detects the longer range to the objects inside, even though the illumination inside is about 1% of that outside. A drone with this sensor integrated would be able to sense the open window and fly through it, without suffering a temporary black-out once inside.

[Image: stereo sensor output showing the 7x7 depth grid, window closed vs. open]
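On the drone side, the natural pattern is to act only on cells that passed the confidence tests. Here is a hedged sketch of what consuming the grid might look like; the struct layout and the -1 convention for “no trustworthy reading” are my own illustration:

```cpp
// Consuming the 7x7 output grid: use only cells that passed the
// confidence tests (the red numbers; cyan cells are suspect).
const int N = 7;

struct DepthGrid {
  float meters[N][N];   // distance estimates, meters
  bool  valid[N][N];    // true if the cell passed confidence tests
};

// Nearest confident obstacle in the central column of the view,
// or -1 if nothing there can be trusted this frame.
float nearestAhead(const DepthGrid& g) {
  float nearest = -1.0f;
  const int c = N / 2;                   // center column
  for (int r = 0; r < N; r++) {
    if (g.valid[r][c] && (nearest < 0.0f || g.meters[r][c] < nearest)) {
      nearest = g.meters[r][c];
    }
  }
  return nearest;
}
```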

A more extreme case of light dynamic range is shown in the picture below, taken with a different sensor that uses the same vision chip. On the top left is a picture of the sensor; note that it was in direct sunlight and thus subject to the “glare” that disrupts most vision systems. On the top right is a photograph of the scene (taken with a regular DSLR) showing sample ranges to objects in meters. On the bottom is the world as seen by the sensor; note that the Sun is in the field of view at the top right, yet the objects in the scene were still detected. Other examples can be found on Centeye’s website.

[Image: sensor in direct sunlight (top left); DSLR photo of the scene with ranges in meters (top right); the sensor’s view with the Sun in frame (bottom)]

We are currently drafting plans for the next iteration of sensors. We will certainly include a 6-DOF IMU, which will be particularly useful for removing the effects of rotation from the optical flow (see the sketch after this paragraph). We are also envisioning an arrangement with the Teensy 4.0 placed nearly flush with the sensor board for a more compact form factor. There is still discussion on how to balance weight (less is better) with physical robustness (thicker PCBs are better)! Finally, I am envisioning firmware examples for other applications, such as general robotics and environment monitoring. I am happy to discuss the above with anyone interested, privately or publicly.
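The derotation itself is simple to first order: near the optical axis, a rotation rate about an axis perpendicular to the viewing direction sweeps the image at essentially that same angular rate, so the gyro-predicted flow can be subtracted before the remainder is treated as translation. A sketch under those assumptions (flow and gyro rates both in rad/s; the sign conventions here would have to be matched to the actual sensor and IMU axes):

```cpp
// First-order optical flow derotation near the optical axis.
// Assumes flow and gyro rates are both angular rates in rad/s and
// that axes/signs have been matched to the real hardware.
struct Flow { float x, y; };   // angular flow components, rad/s

Flow derotate(Flow measured, float gyroX, float gyroY) {
  Flow t;
  t.x = measured.x - gyroY;    // rotation about y sweeps the image in x
  t.y = measured.y + gyroX;    // rotation about x sweeps the image in y
  return t;                    // residual ~ translational flow
}
```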
