Hello everyone,

Over the past six months I’ve shared with you the progress of our ‘flying ball’ project, Fleye. So I imagine you’ll be interested to know that we are now live on Kickstarter. The video tells our story and the page explains the technical specifications in more detail.

In addition to the form factor, Fleye is also interesting on the computing side. We use a hybrid autopilot: a Cortex-M4 handles the time-sensitive control systems, while an ARM Cortex-A9 on-board Linux computer runs custom drone applications. We will provide an API and SDK enabling developers to write custom applications that run directly on the drone.

Our platform is based on an i.MX6, which has a GPU supporting OpenGL/OpenCL, and we support OpenCV, the popular computer vision library. This means you can write applications that act on the video feed, such as color/tag detection and tracking, face detection, etc.
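To give a concrete feel for the kind of vision application this enables, here is a minimal color-tracking sketch. Everything in it is illustrative: the real Fleye API and frame format are not published here, so the frame is just a plain list of RGB tuples. With OpenCV on the drone you would do the same job on real frames with `cv2.inRange` and `cv2.moments`.

```python
# Minimal sketch of a color-tracking step (hypothetical: the actual Fleye
# SDK, frame format, and thresholds are assumptions for illustration).
# A frame is a list of rows of (r, g, b) tuples; we look for bright-red
# pixels and return the centroid of the match, i.e. where to steer.

def find_red_target(frame, r_min=200, g_max=80, b_max=80):
    """Return the (x, y) centroid of red-ish pixels, or None if none found."""
    xs, ys = [], []
    for y, row in enumerate(frame):
        for x, (r, g, b) in enumerate(row):
            if r >= r_min and g <= g_max and b <= b_max:
                xs.append(x)
                ys.append(y)
    if not xs:
        return None
    return (sum(xs) / len(xs), sum(ys) / len(ys))

# Tiny 4x4 test frame: one red blob in the top-right corner.
black, red = (0, 0, 0), (255, 0, 0)
frame = [[black] * 4 for _ in range(4)]
frame[0][3] = red
frame[1][3] = red
print(find_red_target(frame))  # (3.0, 0.5)
```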

Fleye is thus quite different from a classic RC-controlled drone. We prefer to call it a ‘flying robot’, since it opens many possibilities to experiment safely with a fully autonomous flying platform.


What do you think? I would love to get your feedback and I remain of course available to answer your questions.


For the Fleye team,






  • So why is the wind tolerance so low? Is it endemic to the design?
  • Laurent,

    Congrats! Looks very promising. We spent some time on our "Flying Bucket" project a few years ago and shelved it because we found it to be about 50% less efficient than a quad of similar weight, and it used a lot of energy outdoors in wind. However, we never got to an optimum duct. Good luck with your campaign. Despite the dismal track record that those before us have established, I backed your Kickstarter, for the simple reasons that I like to support innovation in this space and you were honest about your wind limits in the specs.




  • Congrats on using in-house developed control system.

    Btw., regarding the battery: have you determined the approximate relation between capacity and flight time?
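    As a rough illustration of that relation (not Fleye's published numbers, which are not given here), hover time scales with usable battery energy divided by hover power. All the example figures below are guesses:

```python
# Back-of-the-envelope hover-time estimate (illustrative only: the real
# Fleye battery capacity and power draw are not published here).

def hover_time_minutes(capacity_mah, voltage_v, hover_power_w,
                       usable_fraction=0.8):
    """Estimated hover time: usable battery energy divided by hover power."""
    energy_wh = capacity_mah / 1000.0 * voltage_v
    return energy_wh * usable_fraction / hover_power_w * 60.0

# e.g. a hypothetical 1500 mAh 2S (7.4 V) pack against a 50 W hover draw:
print(round(hover_time_minutes(1500, 7.4, 50), 1))  # 10.7 (minutes)
```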

  • http://www.freescale.com/products/arm-processors/i.mx-applications-...

    A nice architecture indeed - we have this in mind for other applications as well.

  • MR60

    Congratulations, my compatriots! Go!

  • Developer

    Sorry, I missed the Cortex M4 part. I quoted from that technote you posted.

    I suppose this makes it technically the second 'smartdrone', with Octo Linux running on a dual- (or quad-?) core ARM A9 800 MHz processor and an ARM Cortex-M4 flight controller. Similar to the 3DR Solo setup with onboard ARM Linux (single core) and PH2 (ARM32 Cortex-M4).

  • Thanks Bill. We use a hybrid architecture: the autopilot and all safety-critical work run on a separate Cortex-M4, which means the operating system can freeze and the drone keeps on flying (and gently lands if it realizes the CPU is dead).

    As for OpenCV on the i.MX6, it can indeed be accelerated by the GPU, using OpenGL and/or OpenCL. Here is an application note from Freescale on this topic: https://cache.freescale.com/files/32bit/doc/app_note/AN4629.pdf
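    The "OS can freeze and the drone keeps flying" behavior implies a heartbeat between the two processors. Here is a sketch of that idea (hypothetical: Fleye's actual M4/A9 protocol is not described here):

```python
# Heartbeat watchdog sketch (hypothetical: not Fleye's actual protocol).
# The Linux/A9 side pings the microcontroller; if pings stop arriving,
# the flight-control side switches to an autonomous landing mode.

import time

class FlightController:
    """Stand-in for the Cortex-M4 side: keeps flying while pings arrive."""

    def __init__(self, timeout_s=0.5):
        self.timeout_s = timeout_s
        self.last_ping = time.monotonic()
        self.mode = "FLY"

    def ping(self):                      # called by the Linux/A9 side
        self.last_ping = time.monotonic()

    def tick(self):                      # called from the M4 control loop
        if time.monotonic() - self.last_ping > self.timeout_s:
            self.mode = "LAND"           # heartbeat lost: land gently
        return self.mode                 # landing stays latched once triggered

fc = FlightController(timeout_s=0.05)
fc.ping()
print(fc.tick())        # FLY while the heartbeat is fresh
time.sleep(0.1)         # simulate the Linux side freezing
print(fc.tick())        # LAND once the heartbeat times out
```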

  • Developer

    It's great that it supports the OpenCV library, but as a cautionary note: on the i.MX6 board, the application note only uses GPU acceleration for image capture. OpenCV's processing features are (as the examples show) CPU-based implementations, and that is not going to work well if that CPU is also doing flight control.

    re: http://cache.freescale.com/files/32bit/doc/app_note/AN4629.pdf

    "Lastly, note that use of OpenCV is not mandatory. In this application note, the author decided to use it only for convenience, since it's very easy to capture feed from the camera and use its IplImage structure for decoding and analysis. However, OpenCV's advanced math or image processing features are not used because they are all CPU-based. In order to speed up the image processing, avoid using all CPU-based features."

    That calls into question the ability to do higher-order image processing for object avoidance as-is. You will need to implement optimized routines on the GPU for it to be successful. (The same applies to the RPi/Solo/Odroid etc. to achieve the same goals.)

    That said, it does look like a project that can deliver on what they promise. (It's expensive, but that seems to be an indicator that they understand the costs.)

  • I'll have to admit that I pretty much think this: https://docs.google.com/spreadsheets/d/1JssYSCiJ2d9QP5egDwL1fGiPMFn...

    But hopefully yours will be the project that changes my mind :)

  • You have to compile the code yourself for these vehicles, but they are called the Coax and Single frame types.

