Laurent Eschenauer's Posts (6)


Since Fleye is safe to touch and push, we are experimenting with innovative ways of interacting with it. You have already seen in past videos that it can be thrown into the air, or stopped simply by catching it and performing a 90° rotation.

In this video, we show the 'selfie mode', in which the user just needs to push the drone to trigger an automated move: the drone steps back to capture a selfie.

The drone detects the push, along with its strength and direction, using the accelerometer. It flies away along the push direction, over a distance that depends on the strength of the push, then turns around to face its starting point and returns to its original position. Positioning is achieved using optical flow tracking.
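For the curious, here is a minimal Python sketch of the push-to-selfie logic described above. The threshold, gain, and function names are illustrative assumptions, not our actual flight code:

```python
import numpy as np

# Illustrative tuning values, not Fleye's real parameters.
ACCEL_THRESHOLD = 2.0  # m/s^2: lateral acceleration below this is not a push
PUSH_GAIN = 0.5        # meters of travel per m/s^2 of push strength

def detect_push(accel_xy):
    """Return (strength, direction) if the lateral acceleration looks
    like a push, otherwise None. accel_xy is a 2-vector in m/s^2 with
    gravity already removed."""
    accel_xy = np.asarray(accel_xy, dtype=float)
    strength = np.linalg.norm(accel_xy)
    if strength < ACCEL_THRESHOLD:
        return None
    return strength, accel_xy / strength  # unit vector along the push

def selfie_target(position_xy, strength, direction):
    """Where to fly: along the push direction, a distance proportional
    to how hard the drone was pushed."""
    return position_xy + PUSH_GAIN * strength * direction
```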

This kind of control, where a machine 'mimics' physics, is called impedance control. It is often used in robotics for human-machine interaction, but I haven't seen it done with drones yet. Other ideas we have are to mimic lunar gravity when launching Fleye, or to let the operator move Fleye around simply by grabbing it and placing it at another location.
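As a sketch of the impedance idea, the drone can track a virtual mass-spring-damper: an external push accelerates the virtual mass, and the position controller simply follows it. The parameters below are illustrative, not our actual tuning:

```python
class VirtualImpedance:
    """Simulate m*a = F_ext - b*v - k*x and feed the resulting offset
    to the position controller as its setpoint (one axis shown)."""

    def __init__(self, mass=1.0, damping=2.0, stiffness=4.0):
        self.m, self.b, self.k = mass, damping, stiffness
        self.x = 0.0  # virtual offset from the hover setpoint (m)
        self.v = 0.0  # virtual velocity (m/s)

    def step(self, external_force, dt):
        accel = (external_force - self.b * self.v - self.k * self.x) / self.m
        self.v += accel * dt
        self.x += self.v * dt
        return self.x  # new position setpoint offset
```

Tuning these three numbers changes the 'feel': a stiff spring snaps the drone back to its hover point, while near-zero stiffness and damping let it glide away as if in reduced gravity.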

Below is another demo, filmed with a 360 camera, so you can see there really was no pilot in the loop! As always, I'm around, happy to answer questions and looking forward to your feedback! 

Read more…

Hello everyone,

Over the past six months I’ve shared with you the progress of our ‘flying ball’ project, Fleye. So I imagine you’ll be interested to know that we are now live on Kickstarter. The video tells our story, and the page explains the technical specifications in more detail.

In addition to the form factor, Fleye is also interesting on the computing side. We have a hybrid autopilot: a Cortex-M4 runs the time-sensitive control systems, while an onboard ARM Cortex-A9 Linux computer runs custom drone applications. We will provide an API and an SDK enabling developers to write custom applications running directly on the drone.

Our platform is based on an i.MX6, which has a GPU supporting OpenGL/OpenCL, and we support OpenCV, the popular computer vision library. This means you can write applications that act on the video feed, such as color or tag detection and tracking, face detection, etc.
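As a taste of what such an application could look like, here is a short OpenCV sketch (in Python) that finds a colored object in a frame; the HSV range is an arbitrary example you would tune for your own marker:

```python
import cv2

LOWER_HSV = (100, 120, 70)   # example blue-ish target range
UPPER_HSV = (130, 255, 255)

def find_target(frame):
    """Return the (x, y) pixel centroid of the largest blob matching
    the color range, or None if nothing is detected (OpenCV 4 API)."""
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, LOWER_HSV, UPPER_HSV)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    moments = cv2.moments(max(contours, key=cv2.contourArea))
    if moments["m00"] == 0:
        return None
    return (int(moments["m10"] / moments["m00"]),
            int(moments["m01"] / moments["m00"]))
```

A drone application could then turn the offset between this centroid and the image center into a tracking command.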

Fleye is thus quite different from a classic RC-controlled drone. We prefer to call it a ‘flying robot’, since it opens many possibilities for safely experimenting with a fully autonomous flying platform.

What do you think? I would love to get your feedback, and of course I remain available to answer your questions.

For the Fleye team,

Laurent

Read more…

Hello everyone,

It has been a while since I last updated you on the progress of our flying camera ball. Here are a few short videos showing our key achievements; I'm curious to hear your feedback on this!

First, we have integrated the onboard computer (quad-core ARM Cortex-A9, GPU, 1 GB RAM) and the camera (5 MP, 1080p at 30 fps) into the drone, and written a first version of the mobile app. In the video above you can see the 'virtual tripod' mode, where the user focuses on managing altitude and orientation and doesn't need to worry about drift.
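To make the idea concrete, here is a simplified sketch of how the mode splits the work: the pilot's inputs only drive yaw and altitude, while a small PD loop cancels lateral drift. Gains, axis mapping, and interfaces are illustrative assumptions:

```python
KP, KD = 1.2, 0.4  # illustrative position and velocity gains

def virtual_tripod_step(user_yaw_rate, user_climb_rate, drift_xy, velocity_xy):
    """drift_xy is the estimated offset from the locked position (m),
    velocity_xy the estimated lateral velocity (m/s); axis mapping is
    simplified to one tilt command per axis."""
    tilt_x = -(KP * drift_xy[0] + KD * velocity_xy[0])  # cancel X drift
    tilt_y = -(KP * drift_xy[1] + KD * velocity_xy[1])  # cancel Y drift
    return tilt_x, tilt_y, user_yaw_rate, user_climb_rate
```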

This second video shows the progress of our autopilot (a 100% custom development, using a model-based design (MBD) approach), which enables stable and smooth flight. Watch the video to see how it reacts to collisions.

Finally, here is a third video showing the drone flying outside, along with some camera footage. We still need to work on this; in particular, the lens needs to be upgraded. Also note that we don't have a gimbal, and we don't have electronic image stabilization yet: this footage was stabilized offline while editing the video.

What do you think? How do you like this little flying ball? As always, I'm happy to read your comments and answer your questions below!

Thanks,

-Laurent

Read more…

Last month I shared with you our 'spherical drone' concept and promised to come back with more technical content. Well, I have just published a blog post explaining how we achieve in-place hovering and how the drone reacts to external disturbances. I'm sure this is 'control 101' for some of you, but hopefully others will find it interesting and learn something new. I'll summarize the key points and graphs below.

The first step to recover from a collision like the one in the video is to be aware of the collision :-) This is known as 'attitude estimation' and is done by fusing a lot of sensor data together. In Fleye, we use a 3D IMU (accelerometer, gyroscope, magnetometer) together with an ultrasound sensor and optical flow tracking. They are all fused using various kinds of filters to maximize the quality and robustness of the estimate.
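The simplest instance of this fusion is a complementary filter: integrate the gyro for smooth short-term attitude, and slowly pull the estimate toward the gravity direction seen by the accelerometer. Our filters fuse more sensors than this, but the Python sketch below (with an assumed blending factor) shows the principle for the pitch axis:

```python
import math

ALPHA = 0.98  # assumed blending factor: trust in the gyro vs. the accelerometer

def fuse_pitch(pitch, gyro_pitch_rate, accel, dt):
    """One filter step. gyro_pitch_rate is in rad/s; accel is the raw
    3-axis accelerometer reading used as a gravity reference."""
    ax, ay, az = accel
    pitch_from_gyro = pitch + gyro_pitch_rate * dt           # smooth, but drifts
    pitch_from_accel = math.atan2(-ax, math.hypot(ay, az))   # noisy, no drift
    return ALPHA * pitch_from_gyro + (1.0 - ALPHA) * pitch_from_accel
```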

As an example, here is a diagram showing the (X, Y) position estimated by the drone during our complete push-and-bump sequence (watch here). You can see that during stable in-place hovering, the drone stays within a 20 cm × 20 cm area, and that it tracks its deviation nicely when pushed and bumped into.

[Figure: estimated (X, Y) position of the drone during the push-and-bump sequence]

Based on the attitude estimation, we can then detect rapid changes in the drone's orientation and perform the actual control. We use three loops in our controller: a first, rapid loop controls the rotational speed, and thus the overall stability; a second, slower loop controls the position; and a third loop handles navigation and trajectory planning. The following figure answers the question 'what happened when the drone got bumped into?': it shows how a sudden change in pitch (blue line) triggered the actuation of one control vane (red) to bring the drone back to stable hovering.

[Figure: pitch (blue) and control vane actuation (red) during a bump]
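Schematically, the cascade looks like the Python sketch below: each loop produces the setpoint of the next, faster one. Gains, rates, and the vane interface are illustrative assumptions:

```python
def p_controller(gain):
    """Simple proportional controller; the real loops are richer."""
    return lambda error: gain * error

position_loop = p_controller(1.5)  # slow loop: position error -> desired rate
rate_loop = p_controller(0.8)      # fast loop: rate error -> vane deflection
# (the navigation loop, not shown, feeds position_setpoint from the trajectory)

def control_step(position_setpoint, position, gyro_rate):
    """One pass through the cascade for a single axis."""
    desired_rate = position_loop(position_setpoint - position)
    vane_cmd = rate_loop(desired_rate - gyro_rate)
    return vane_cmd  # deflection sent to one control vane
```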

I hope you find this kind of content interesting. Please don't hesitate to ask if you have questions, and share your ideas for technical topics you would like us to discuss!

Read more…

We are working on a flying camera that differs from regular quadcopters by using a ducted-fan UAV design. This allows us to build a much safer and more robust device that the user can easily hold, push, grab, etc. Fleye is about the same size and weight as a soccer ball (22 cm wide, 350 g). It is powered by a single, fully shielded propeller. You can safely hold it in your hands, throw it, catch it, and fly it near people.

We are starting to communicate and gather feedback now, with a tentative crowdfunding launch this fall. You can read more details on our blog.

I know some of your usual questions, so let me try to answer them in advance :-)

  • This is not APM but our own autopilot, entirely developed with a model-based design approach for the core sensor fusion and control, running on a Cortex-M4.
  • Smartphone controlled over Wi-Fi; no RC version is planned at this stage.
  • We have a dual-core ARM Cortex-A9 on board for managing Wi-Fi, video streaming, and some computer vision tasks.

I would really like to get your feedback on this!

Thanks,

Laurent

Read more…

I've developed an autonomous flight library for the Parrot AR.Drone 2.0, entirely in JavaScript. It implements things like an extended Kalman filter, a PID controller, tag detection, and mission planning. It is built on top of the popular nodecopter Node.js library. Being JavaScript, it is really easy to play with and hack the code; I think it is an excellent platform for people eager to learn more about probabilistic robotics and computer vision.
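The library itself is JavaScript, but the PID controller at its heart is only a few lines in any language; here is a generic Python sketch of the concept (illustrative only, not the library's API):

```python
class PID:
    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = None

    def step(self, error, dt):
        """error is setpoint minus measurement; returns the command."""
        self.integral += error * dt
        derivative = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative
```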

I would love to get feedback from this community, and to add support for other drone platforms if it makes sense.

The video above is a basic example of an autonomous flight following a square pattern and landing back on target.

http://eschnou.github.io/ardrone-autonomy/

Read more…