(More info and full post here)
I've been experimenting with putting 360 degree vision, including stereo vision, onto a Crazyflie nano quadrotor to assist with flight in near-Earth and indoor environments. Four stereo boards, each holding two image sensor chips and lenses, together see in all directions except up and down. We developed the image sensor chips and lenses in-house for this work, since nothing suitable for platforms of this size is available elsewhere. The control processor (on the square PCB in the middle) uses optical flow for position control and stereo vision for obstacle avoidance. The system uses a "supervised autonomy" control scheme in which the operator gives high-level commands via control sticks (e.g. "move in this general direction") and the control system implements the maneuver while avoiding nearby obstacles. All sensing and processing is performed on board. The Crazyflie itself was unmodified other than a few lines of code in its firmware to receive the target Euler angles and throttle from the vision system.
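The supervised-autonomy idea can be illustrated with a minimal sketch. This is not the actual flight code; the function name, gains, and the obstacle-distance representation (a map from bearing angle to nearest range) are all hypothetical, chosen only to show the blending of an operator's stick command with repulsive obstacle terms before mapping the result to target tilt angles.

```python
import math

def supervised_autonomy_step(stick_cmd, obstacle_dist, safe_dist=1.0, gain=0.5):
    """Blend the operator's commanded horizontal velocity (vx, vy) with
    repulsive terms from per-direction obstacle distances.

    stick_cmd:     (vx, vy) desired velocity from the control sticks
    obstacle_dist: {bearing_rad: range_m} from the stereo boards (hypothetical format)
    Returns (roll, pitch) target angles in radians for the flight controller.
    """
    vx, vy = stick_cmd
    for angle, d in obstacle_dist.items():
        if d < safe_dist:
            # Push away from any obstacle inside the safety radius,
            # harder the closer it is.
            push = gain * (safe_dist - d) / safe_dist
            vx -= push * math.cos(angle)
            vy -= push * math.sin(angle)
    # Map the blended velocity to small target tilt angles (illustrative gain).
    k_tilt = 0.2
    pitch = -k_tilt * vx  # pitch forward to move in +x
    roll = k_tilt * vy
    return roll, pitch
```

For example, with an obstacle half a meter dead ahead, the forward stick command is partially cancelled, so the vehicle pitches forward less aggressively than the operator requested.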
Below is a video from a few flights in an indoor space. It is best viewed on a laptop or desktop computer so you can see the annotations in the video. The performance is not perfect, but it is much better than the pure "hover in place" systems I had flown in the past, since obstacles are now avoided. I would not have been able to fly in the last room without the vision system to assist me! There are still obvious shortcomings- for example, the stereo vision currently does not respond to blank walls- but we'll address this soon...
Comments
Hello Geoffrey,
Have you made progress on this project?
Can we expect a commercial product soon?
Best regards
Fantastic project. Any updates on testing or availability?
Awesome! I'm making a small aircraft platform (nano UAS), a coaxial helicopter UAV with a rotor diameter of about 20 cm; it can fly level, stably and flexibly, at low altitude. This kind of vehicle only flies at altitudes of approximately 40 m, so I don't plan to use GPS. Currently the most common solution is the open-source smart camera PX4FLOW, but I don't think its performance is good enough; maybe because it only has one downward-looking camera?
I am looking forward to new applications for small and nano UAS, and I hope this could be used on my aircraft. Could I buy your sensors or the related software? Where can I buy them? Looking forward to your reply, thank you :D
A major milestone Geoffrey. Great work. You have a solid foundation for some serious indoor exploration autonomy. Do let us all know how you get on with the white walls, low light and avoidance above/below. I, for one, am very interested.
Put me down for at least 3, maybe 5.
keep it up
@Hugues- Thank you, and you do bring up a good question. The prototype shown wasn't intended for commercialization, but rather to see how far we can push vision at this scale. And that version came after only a few months of tweaking, so there is a lot of room for improvement. (I hope people appreciate me trying to be sober about the tech by showing entire flights, revealing both strengths and weaknesses, rather than just 5-second segments where things happen to work!)
Yes, I am working on modular versions that are better suited for commercialization and for use on a variety of drones. But I first want to get one working really well before getting 1000 made!
Incredible! Very impressed. This has to be the future of drones because:
-it is self-reliant (no dependence on weather, and no environments blocking GPS signals)
-it is small and light. Regulations in most countries will only let people fly drones freely if they stay under 1 or 2 pounds. That also makes it safer
-it is a step toward flying swarms of these things, aggregating their individual small processing power to achieve more complex tasks (like bees or termites)
Do you intend to commercialize or publicize your stereo camera boards, or what are your plans? It would be a crime against our community to keep this to yourself!
@Fnoop- I agree. The near-Earth problem for drones is analogous to the "last mile problem" for the Internet that was a big focus of the 90's and 00's. The notional Amazon delivery drone can't rely on GPS to land on your doorstep! We're not quite yet at Skynet though...
@Laser- Good question. I think the two techniques are complementary. One way to describe a visual environment is with two numbers- the contrast level of the visible texture, and the ambient light level. Stereo is good when there is contrast. If it is dark, you can add illumination yourself and have something like a LeapMotion (provided the image sensor chips are adequately sensitive- another of my obsessions). However, when there is no texture, stereo won't work. This is a good domain for laser or structured light ranging. There are other differences- stereo has the benefit of obtaining multiple depth measurements, up to a full depth map, without any moving parts. Laser ranging, on the other hand, can get precise measurements in a specified direction, potentially pretty fast.
For lasers, though, there is a potential problem- if ambient light levels are too high, the laser can get washed out at longer distances. You can address this with a more powerful laser, better optics, or narrow pass-band filters, but these can get impractical as the size goes down- the laws of physics only allow so many photons generated per joule of energy. However, we have been experimenting with laser ranging as well (triangulation type, not TOF) and it will make its way onto this platform. For platforms of the scale most people fly here, and for many anticipated applications of drones (deliveries, etc.), the two techniques complement each other nicely.
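The stereo depth measurements mentioned above come from classic triangulation: depth = focal length × baseline / disparity. Here is a minimal sketch of that relationship (illustrative only; the function name and parameters are not from the actual system). Note that when there is no texture to match, the disparity is undefined, which is exactly the blank-wall failure mode discussed in the post.

```python
def depth_from_disparity(focal_px, baseline_m, disparity_px):
    """Classic stereo triangulation: depth = f * B / d.

    focal_px:     lens focal length expressed in pixels
    baseline_m:   distance between the two cameras, in meters
    disparity_px: horizontal shift of a feature between the two images
    Returns depth in meters, or None when disparity is zero or negative
    (no valid match, e.g. a textureless wall or a feature at infinity).
    """
    if disparity_px <= 0:
        return None
    return focal_px * baseline_m / disparity_px
```

With a 100-pixel focal length, a 5 cm baseline, and a 5-pixel disparity, this gives a depth of 1 m; halving the disparity doubles the estimated depth, which is why precision degrades with range.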
Awesome!
This is great!