In October I helped fund the Parallella single-board computer. There are many such boards around right now (the Raspberry Pi, the UDOO, the BeagleBone Black), but this one is different, because it mates a super-powerful Xilinx Zynq 7020 application processor, a 1 GHz dual-core ARM Cortex-A9 (the same as in the iPad 2), with an FPGA fabric the size of a medium FPGA chip. That makes it a highly flexible processor, one that by itself could surpass most of the ARM-based flight controllers on the market right now. The FPGA fabric alone could prove very useful: it handles signal processing with ease, and filtering, PID loops and similar real-time processes come naturally to an FPGA, with minimal latency.
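To make the real-time angle concrete, here is a minimal sketch of the kind of fixed-point PID update that maps naturally onto FPGA fabric (pure adds and multiplies, no floats, easy to pipeline). It is illustrative C++, not Parallella-specific code, and the Q16.16 format and gain handling are arbitrary choices for the example:

```cpp
#include <cstdint>

// Minimal fixed-point PID step: the kind of arithmetic that pipelines
// well in FPGA fabric. Q16.16 fixed point; gains are placeholders and
// the sample period is folded into ki and kd for simplicity.
struct Pid {
    int32_t kp, ki, kd;      // gains in Q16.16
    int32_t integral = 0;    // accumulated error
    int32_t prev_error = 0;  // error from the previous sample

    int32_t step(int32_t setpoint, int32_t measured) {
        int32_t error = setpoint - measured;
        integral += error;
        int32_t derivative = error - prev_error;
        prev_error = error;
        // 64-bit intermediates avoid overflow; shift back to Q16.16.
        int64_t out = (int64_t)kp * error
                    + (int64_t)ki * integral
                    + (int64_t)kd * derivative;
        return (int32_t)(out >> 16);
    }
};
```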


But what's really exciting about the Parallella is its Epiphany 16-core or 64-core (there are two models) parallel RISC coprocessor. This thing is perfect for computer vision, optical flow, processing point clouds for 3D scanning, and on-the-fly GIS processing. And with its super-low power requirements, it's perfect for battery-powered robotics.


But the best part is the price: at only $99, it's something you can lose in a crash without losing too much sleep over it!


On the back of the board there are connectors for 48 FPGA pins, UARTs and more. We now need to make a daughter board with sensors and connectors for servos, ESCs, the RX, and all the other stuff we need, and I want everyone to chime in and voice the needs of their specific applications: which sensors do you think are best? I've started a discussion at the Parallella forums; with a community effort we can make this something really great!


Join our discussion at the Parallella forum!


Comments

  • Developer

    The obvious choice for "easy" optimization of algorithms on a platform like Parallella is the OpenCL framework. And since it is already supported, we are off to a running start.
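    For instance, a minimal host-side vector add using the standard OpenCL C++ wrapper shows the shape of offloading data-parallel work. This is a generic sketch, not Epiphany-specific code, and it assumes an installed OpenCL SDK with a working platform driver:

    ```cpp
    // Generic OpenCL vector add using the C++ wrapper (cl.hpp).
    #include <CL/cl.hpp>
    #include <vector>
    #include <iostream>

    const char* kSrc = R"(
    __kernel void vadd(__global const float* a,
                       __global const float* b,
                       __global float* c) {
        size_t i = get_global_id(0);
        c[i] = a[i] + b[i];
    })";

    int main() {
        const size_t n = 1024;
        std::vector<float> a(n, 1.0f), b(n, 2.0f), c(n);

        cl::Context ctx(CL_DEVICE_TYPE_DEFAULT);      // first available device
        cl::CommandQueue queue(ctx);
        cl::Program prog(ctx, kSrc, /*build=*/true);  // JIT-compile the kernel
        cl::Kernel vadd(prog, "vadd");

        cl::Buffer da(ctx, a.begin(), a.end(), /*readOnly=*/true);
        cl::Buffer db(ctx, b.begin(), b.end(), /*readOnly=*/true);
        cl::Buffer dc(ctx, CL_MEM_WRITE_ONLY, n * sizeof(float));

        vadd.setArg(0, da);
        vadd.setArg(1, db);
        vadd.setArg(2, dc);
        queue.enqueueNDRangeKernel(vadd, cl::NullRange, cl::NDRange(n));
        cl::copy(queue, dc, c.begin(), c.end());      // read result back

        std::cout << c[0] << std::endl;               // expect 3
    }
    ```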

  • I think there'll be compilers available to port large parts of the Point Cloud Library to run partially on the Epiphany fairly easily. One cool application I've thought about is absolute positioning in known environments: first you scan an environment, then you divide it into sectors based on GPS data; by accessing the data for the sector you're currently in, you can determine your position with very high precision. It would be like detecting people but the other way around: detecting the environment (see the sketch below). Check out the PCL, it's really awesome, and the tutorials are great:

    http://pointclouds.org/documentation/tutorials/

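    As a rough sketch of that matching step (assuming PCL is installed; the file names and parameters here are hypothetical), aligning a live scan against the stored cloud of the GPS-selected sector with ICP would look something like this:

    ```cpp
    // Sketch: localize by aligning a live scan against the stored point
    // cloud of the GPS-selected sector with ICP. File names are made up.
    #include <pcl/io/pcd_io.h>
    #include <pcl/point_types.h>
    #include <pcl/registration/icp.h>
    #include <iostream>

    int main() {
        pcl::PointCloud<pcl::PointXYZ>::Ptr sector(new pcl::PointCloud<pcl::PointXYZ>);
        pcl::PointCloud<pcl::PointXYZ>::Ptr scan(new pcl::PointCloud<pcl::PointXYZ>);
        pcl::io::loadPCDFile("sector_map.pcd", *sector);  // pre-scanned sector
        pcl::io::loadPCDFile("live_scan.pcd", *scan);     // current depth frame

        pcl::IterativeClosestPoint<pcl::PointXYZ, pcl::PointXYZ> icp;
        icp.setInputSource(scan);     // cloud to be moved
        icp.setInputTarget(sector);   // reference map
        icp.setMaximumIterations(50);

        pcl::PointCloud<pcl::PointXYZ> aligned;
        icp.align(aligned);           // GPS could seed an initial guess here

        if (icp.hasConverged())
            // This transform is the vehicle's pose within the known sector.
            std::cout << icp.getFinalTransformation() << std::endl;
        return 0;
    }
    ```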
  • > I see what you're getting at, but if you look at the STM32 processors, you can't really expect them to handle something like the Point Cloud Library on real-time depth data

    Completely agree. The STM32 gives us enough elbow room that we don't have to simplify the state estimation (with a bit of room left over), but it comes nowhere near the processing needed to take full advantage of modern computer-vision tools.

    The Parallella looks awesome.

  • Hi John, I agree; if the sensor and servo stuff were the only value, it would be easier to just interface to a PX4.

    But the real point was the frame buffer hardwired to the parallel processor; that's what has the cool capabilities.

    And Jack, of course you're right, but if it's really depth-image stuff you want to deal with, you really need that parallel clout.

    I'm telling you this like you don't already know, right?!

    Of course, it's probably better to just put the whole depth-image acquisition and processing system on one board.

    Interesting times are definitely coming.

  • It takes a lot of one-off programming for just one device to use those things. When you're talking about dozens of different multicore boards coming out, it gets hard to commit six months to one.

  • Developer

    Gary, there's no need to make yet another specialized sensor board. An APM with simple software to just dump sensor data to the Parallella over UART or I2C, and to generate the resulting servo outputs, would do (see the sketch below). It wouldn't require any PCB design, just some simple wiring between the boards. Heck, even a simple $30 KK board would probably do the job.
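    As a rough sketch of that serial link on the Parallella's Linux side (the device path, baud rate, and 16-byte packet layout are all made-up placeholders, not an APM protocol):

    ```cpp
    // Sketch: read fixed-size sensor packets from a UART-attached flight
    // board and write servo commands back. Device path, baud rate and the
    // packet layout are hypothetical; no framing/resync is handled here.
    #include <fcntl.h>
    #include <termios.h>
    #include <unistd.h>
    #include <cstdint>
    #include <cstdio>

    int open_uart(const char* dev) {
        int fd = open(dev, O_RDWR | O_NOCTTY);
        if (fd < 0) return -1;
        termios tio{};
        tcgetattr(fd, &tio);
        cfmakeraw(&tio);             // raw mode: no echo or translation
        cfsetispeed(&tio, B115200);
        cfsetospeed(&tio, B115200);
        tcsetattr(fd, TCSANOW, &tio);
        return fd;
    }

    int main() {
        int fd = open_uart("/dev/ttyUSB0");  // hypothetical device path
        if (fd < 0) { perror("open"); return 1; }

        uint8_t packet[16];                  // assumed sensor frame size
        for (;;) {
            ssize_t n = read(fd, packet, sizeof(packet));
            if (n == sizeof(packet)) {
                // ... attitude/vision processing would go here ...
                uint8_t servo_cmd[4] = {0x55, 0, 0, 0};  // placeholder reply
                write(fd, servo_cmd, sizeof(servo_cmd));
            }
        }
    }
    ```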

  • Hi Goran, you're absolutely right: the 3D point cloud (depth image) is what matters most for the future.

    And the Parallella is uniquely well designed to do the sort of processing required to make sense of it.

    It could probably even manage a reasonable take on real-time SLAM for a lot of uses.

    I went to the Parallella forum and suggested a flight-controller add-on with all the usual suspects (servos, accels, gyros, mag, baro and a GPS port).

    But more importantly, a dual frame buffer (configurable for 3D point clouds or color) directly interfaced to the Epiphany chip, with inputs for various external imaging devices.

    Think of the possibilities.

  • Sergiu, I see what you're getting at, but if you look at the STM32 processors, you can't really expect them to handle something like the Point Cloud Library on real-time depth data. To create multirotors that can be around people and not hurt them, by autonomously avoiding them, you need something like the new PrimeSense Capri sensor (a Kinect shrunk to the size of a smartphone camera sensor) together with a fast enough processor. That will also be very useful for real-time 3D scanning, neural-network-based maneuvering, and so on.

    And hopefully a lot of the work that goes into the ARM developments can be reused on this platform, with the Epiphany used as a coprocessor. Personally, I'm not too excited about the STM32 stuff; I think it's really the bare minimum.

    I think it comes down to this: do we need to keep moving forward? The answer will always have to be yes. I won't be satisfied until I can ask my flying robot to go pick up some groceries for me, feed the cat while I'm away, and so on.

    Here's the Capri: http://www.primesense.com/news/capri-1-25-demo-at-gdc13-video/

    And of course there is this: https://www.youtube.com/watch?v=eWmVrfjDCyw which runs on a 1.6 GHz Intel Atom and shows what could be possible.

  • That is really cool. I've wanted an FPGA on the FC for a few years. You don't mention anything about sensors, though. It would be ideal to pipe the sensors directly into the FPGA, let the lower-level stabilization algorithms run there, and have it drive the PWM outputs directly. IO might start to get a bit tight for the daughter board then.

  • I simply don't understand. Next year, for sure, somebody will give us a link to yet another possible autopilot: 256 processors, 128 GB of RAM, 12 GHz clock speed, all in one chip.

    Theoretically, a super-autopilot like that would be able to work with 10 video cameras, etc., etc.

    But who will actually write the software to keep such a powerful autopilot occupied? How long will it take?

    And by the time some firmware is ready, yet another super-autopilot with 512 processors will already be available...

    Let's be honest: look at the MikroKopter (one ATmega plus one ARM for navigation), the AutoQuad (only one STM32 at "only" 168 MHz), the DJI WooKong, and so on.

    For the STM32F4, software is currently under development by many folks here, and it is far from pushing the chip to "100% load".

    Regards,

    Sergiu
