3D Robotics

Parrot releases Ubuntu-powered SLAM development board


From Canonical:

Parrot collaborates with Canonical to launch the Parrot S.L.A.M.dunk, a new development kit for the creation of autonomous, obstacle-avoiding drones and robots. Powered by Ubuntu and ROS (Robot Operating System), it gives developers a familiar environment to prototype solutions such as autonomous driving, 3D mapping, or simply using the on-board stereo camera and sensors for data gathering.

Just attach the Parrot S.L.A.M.dunk to a drone, plug it into the power source and flight controller, and you’ve transformed your drone into an intelligent robot.


The Parrot S.L.A.M.dunk is particularly suited to environments with no GPS or with numerous obstacles, where its S.L.A.M. (Simultaneous Localization and Mapping) software can help the drone understand and navigate its surroundings.
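
To give a rough sense of what such a pipeline works with, the sketch below turns a rectified stereo pair into a disparity map and then a depth estimate using OpenCV. This is a minimal illustration only: the matcher settings, image file names, focal length and baseline are assumptions, not Parrot's actual implementation.

```python
# Sketch: depth from a rectified stereo pair, the kind of intermediate
# product a SLAM front end consumes. All parameters below are placeholders.
import cv2
import numpy as np

left = cv2.imread("left.png", cv2.IMREAD_GRAYSCALE)    # assumed file names
right = cv2.imread("right.png", cv2.IMREAD_GRAYSCALE)

# Semi-global block matching; disparity is returned as fixed-point (x16).
matcher = cv2.StereoSGBM_create(minDisparity=0, numDisparities=64, blockSize=9)
disparity = matcher.compute(left, right).astype(np.float32) / 16.0

# Depth from disparity: Z = f * B / d. Focal length (pixels) and baseline
# (metres) are illustrative values, not the S.L.A.M.dunk's calibration.
focal_px, baseline_m = 700.0, 0.20
with np.errstate(divide="ignore"):
    depth_m = np.where(disparity > 0, focal_px * baseline_m / disparity, 0.0)
```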

With support for Ubuntu and ROS, it uses the most popular and versatile robotics development environment. This means that whilst drones are the primary market for the Parrot S.L.A.M.dunk, it can be used for a much wider set of “robots”: flying wings, articulated arms and roving robots, among others. Ubuntu and ROS are the preferred choices of robotics developers and researchers, which is why Parrot chose to offer the combination as a key component of its development kit.
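
As an illustration of what that familiar environment looks like in practice, here is a minimal rospy node that listens to a pose estimate and a depth image stream. The topic names and message types are assumptions chosen for illustration, not the S.L.A.M.dunk's documented interface.

```python
#!/usr/bin/env python
# Minimal ROS listener sketch. "/pose" and "/depth_map/image" are assumed
# topic names; substitute whatever the board actually publishes.
import rospy
from geometry_msgs.msg import PoseStamped
from sensor_msgs.msg import Image

def on_pose(msg):
    p = msg.pose.position
    rospy.loginfo("pose: x=%.2f y=%.2f z=%.2f", p.x, p.y, p.z)

def on_depth(msg):
    rospy.loginfo("depth frame: %dx%d (%s)", msg.width, msg.height, msg.encoding)

if __name__ == "__main__":
    rospy.init_node("slamdunk_listener")
    rospy.Subscriber("/pose", PoseStamped, on_pose)
    rospy.Subscriber("/depth_map/image", Image, on_depth)
    rospy.spin()
```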

From a hardware point of view, the Parrot S.L.A.M.dunk packs an impressive spec into just 140g, including:

  • NVIDIA Tegra K1
  • Fish-eye stereo camera with a 1500×1500 resolution at 60fps
  • Inertial-measurement unit (IMU)
  • Ultrasound sensor
  • Magnetometer
  • Barometer

 

To top off the list, the Parrot S.L.A.M.dunk boasts an HDMI port: just plug it into a screen and you’ll get one of the oddest-shaped Ubuntu 14.04 computers you can find! Being able to run an Ubuntu desktop directly from the device is a great way for developers to do quick iterative development directly on the board and test their results literally on the fly.





Comments

  • hi gary,

    yes, the sonars are special in the way they should be handled.

    in sw as well as hw.

    one reason to concentrate a little more on other sensors.

    need to fly more ... :-/

    cheers robert

  • Hi Marc - fellow traveler,

    Yeah, after much work you can get SONAR to sort of work OK for some specific uses, but it is a struggle and even then it is marginal.

    Laser range sensors work a lot better and in fact with that lovely stereo camera on the SLAM Dunk a simple laser line or hatch illumination pattern could permit quick and easy extraction of range information across the grid.

    Speaking of SLAM, using a TK1 and simple stereo camera I will be interested to see just what level of SLAM they can reasonably achieve.

    SLAM (Simultaneous Localization and Mapping) is one of the most computationally intensive things there is; combine that with navigation and pathfinding, using the simplest but most computationally demanding sensing mechanism (a stereo camera), and that is going to be one really busy little TK1.

    I guess the Sonar is to keep you from running into the wall while you're trying to process all that data.

    Best Regards,

    Gary

  • I struggled to make a sonar work with Arducopter. Eventually, after much effort, it sort of, kind of, half worked. It may have its uses though. I can tell you I can hover a Phantom 3 Pro indoors and have sonar and optical flow keep it impressively still -- I have used a P3 Pro for interior video in bigger houses -- so you could say that is a non-toy application. That is probably analogous to the Parrot's use of sonar, as you say.

  • Hi Robert,

    Bugs in code were not the problem; the inherent limitations of the Maxim Sonars we support are the problem.

    RFI, EMF, vibration, sound, air rushing past - all of these create huge problems for the SONAR.

    I was one of the very few people who actually got the SONAR working well enough on an Arducopter to use it.

    I had to use vibration mounts, an aluminum box internally coated with graphite paint, a cone that blocked out most external acoustic noise, and shielded wiring so that no stray RFI got in.

    And then it worked OK.

    Parrot has been using SONAR for a long time and has designed it specifically for its chassis; it works OK in the existing Parrots.

    But here you have a module designed to be added on to various undefined aircraft, which is not the same thing at all, and it is going to be a lot more problematic.

    It was never a problem with code; SONAR is intrinsically problematic, doesn't really like working in air at all, and has no business in a modern UAS.

    SONAR may be good for toys, like old Parrot.

    Just my opinion, but I have actually done it and know how it works, so I am sticking to it.

    Best,

    Gary

  • @gary

    you said: There are very good reasons why we abandoned Sonar.

    i am not sure how many bugs you had in the code ...

    in general the parrot drones do fly well. their altitude hold is not comparable with the raw readings from a sonar, so there must be something different ...

  • Gary, I think it's going to miss the basket on price alone; I read 1249€...

  • Fourth Quarter 2016 and no pricing.

    The stereo camera is interesting; unfortunately, extracting a 3D point cloud, or even basic object navigation information for that matter, from a stereo camera is a really computationally intensive business.

    I am afraid the little TK1 will be straining its GPU trying to unravel that.

    The TX1 would have been much better.

    And Sonar - Really!

    Sonar is hugely problematic: every kind of noise imaginable affects it negatively, it responds very differently to various surfaces, and its usefulness, accuracy and range in air are very limited.

    There are very good reasons why we abandoned Sonar.

    Still, this is an interesting kit of parts, if seriously sub-optimal, and I suppose some worthwhile use can be made of it.

    At this point I would be paying a lot more attention to Intel's RealSense-based robotics and flight controller efforts, especially when they start incorporating the Movidius SoC.

    I'd like to see that with a TX1 too; Movidius's SoC seems very nice and low-power, but the real GPU grunt of the TX1 seems like it will still serve real 3D vision needs better.

    I am afraid the Parrot S.L.A.M. Dunk is going to miss the basket by the time it is actually available.

    Best Regards,

    Gary

  • @jab

    there is no need to get an ethernet port ;-)

    you have usb 3.0

    and it's nice to see what ivan djelic has achieved.
    linux as a base is set - for sure when you know his activities for ubifs.

  • amazed but not satisfied. i will try to port it to tx1 and build a pibop

  • This is really good! But I am also disappointed because it is not TX1-based. We (at Neo-Robotix) have been exclusively developing TX1-based apps for professionals for 6 months now, using various sensors. It would be good to have everything in one box.
