Víctor Mayoral's Posts (28)

Towards ROS-native drones


Originally published in Medium:

Announcing alpha support for the PX4 flight stack in a path towards drones that speak ROS natively.

The drone field is an interesting one to analyze from a robotics perspective. While capable flying robots are relatively new, RC hobbyists have been building flying machines for much longer, developing communities around the so-called flight stacks, or software autopilots.

Among these, there’re popular options such as the Paparazzi, the APM (commonly known asardupilot) or the PX4. These autopilots matured to the point of acquiring autonomous capabilities and turning these flying machines into actualdrones. Many of these open source flight stacks provide a general codebase for building basic drone behaviors howevermodifications are generally needed when one has the intention of tackling traditional problems in robotics such as navigation, mapping, obstacle avoidance and so on. These modifications are not straightforward when performed directly in the autopilot code thereby, in an attempt to enhance (or sometimes just simplify) the capabilities of autopilots, abstraction layers such as DroneKit started appearing.

For a roboticist, however, the common language is the Robot Operating System (ROS). Getting ROS to talk to these flight stacks natively would require a considerable amount of resources and effort, so roboticists generally use a bridge such as the mavros ROS package to talk to the flight stacks.

We at Erle Robotics have been offering services with flying robots using such an architecture, but we've always wondered what the path towards a ROS-native drone would be. In order to explore this possibility we've added support for the PX4 Pro flight stack.

Supporting the PX4 Pro flight stack


The PX4 Pro drone autopilot is an open source flight control solution for drones that can “fly anything from a racing to a cargo drone — be it a multicopter, plane or VTOL”. PX4 has been built with a philosophy similar to ROS: it is composed of different software blocks, each of which communicates with the others using a publish/subscribe architecture (currently, a simplified pub/sub middleware called uORB).

In an internal attempt to research the path towards ROS-native flight stacks, and to open this work up to the community, I'm happy to announce official alpha support for the PX4 Pro in all our products meant for developers, such as the PXFmini, Erle-Brain 2 or Erle-Copter. Our team has put together a new set of Operating System images for our products that will help you switch between flight stacks easily.

To install PX4 Pro, just type the following:

sudo apt-get purge -y apm-* # e.g.: apm-copter-erlebrain
sudo apt-get update 
sudo apt-get install px4-erle-robotics

ROS-native flight stacks

Using the PX4 Pro flight stack as a starting point, our team will be dedicating resources to prototype the concept of a drone autopilot that speaks ROS natively, that is, that uses ROS nodes to abstract each submodule within the autopilot’s logic (attitude estimator, position control, navigator, …) and ROS topics/services to communicate with the rest of the blocks within the autopilot.
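The decomposition described above can be sketched in a framework-independent way: each autopilot submodule is an isolated block that only talks to the others through named topics, exactly as ROS nodes/topics (or uORB) would. The module and topic names below are illustrative placeholders, not PX4's actual ones.

```python
# Toy publish/subscribe decomposition of an autopilot, mirroring the
# ROS-native idea: submodules never call each other directly, they only
# publish and subscribe to topics. All names here are hypothetical.
from collections import defaultdict

class Bus:
    """Minimal stand-in for a pub/sub middleware such as uORB or ROS topics."""
    def __init__(self):
        self.subscribers = defaultdict(list)

    def subscribe(self, topic, callback):
        self.subscribers[topic].append(callback)

    def publish(self, topic, msg):
        for cb in self.subscribers[topic]:
            cb(msg)

def attitude_estimator(bus):
    # Consumes raw IMU samples, publishes an attitude estimate.
    def on_imu(sample):
        estimate = {"roll": sample["gyro_x"] * 0.1}  # placeholder filter
        bus.publish("attitude", estimate)
    bus.subscribe("imu", on_imu)

def position_control(bus, log):
    # Consumes attitude estimates; a real module would output actuator setpoints.
    bus.subscribe("attitude", lambda est: log.append(est))

bus, log = Bus(), []
attitude_estimator(bus)
position_control(bus, log)
bus.publish("imu", {"gyro_x": 2.0})
print(log)  # -> [{'roll': 0.2}]
```

Swapping this toy bus for real ROS topics/services is the essence of the initiative: the module boundaries stay the same, only the transport changes.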

Ultimately, this initiative should deliver a software autopilot capable of creating a variety of drones that merges nicely with all the traditional ROS interfaces that roboticists have been building for over a decade now.

If you’re interested in participating with this initiative, reach us out.

Read more…


 

Our team at Erle Robotics is glad to announce that the simulation for the Erle-Rover has been launched and open sourced.

For those that don't know it yet, Gazebo is a powerful real-time physics simulator that allows developers to test their algorithms in a safe and agile way. It runs along ROS (Robot Operating System), which is probably what makes it perfect for the job, and it offers a variety of sensors (such as lidars, cameras and range finders) in the form of plugins that can be added to any robot model. We expect the APM community to benefit from this work and simulate their autonomous rovers using these tools.

The plugin has been developed on top of Aurelien Roy's plugin which, in a nutshell, is an interface between Gazebo and APM:Rover (ardupilot). Erle Robotics added support for its Rover to this plugin.

 

To show the power of this plugin, here are some behaviours we developed and tested:

  • Obstacle avoidance: a simple but powerful enough algorithm to avoid obstacles. This behaviour was first tested under Gazebo and then brought to a real-life scenario. It uses a 270º Hokuyo lidar sensor; all the details and the source code can be found here.

  • Line following : line follower algorithm using the camera. Again, all details as well as the source code can be found here.

  • SLAM: a SLAM application using the hector_mapping node, a powerful SLAM approach that can be used with only a lidar sensor. All the details, including source code, are available here. This was also taken to a real scenario.
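The obstacle-avoidance behaviour above can be reduced to a very small decision rule over the lidar ranges. Here is a toy version of that rule, independent of ROS/Gazebo; the threshold and the left/right sector split are illustrative, not the actual tuning used on the Rover.

```python
# Toy obstacle-avoidance decision over a lidar scan: given a list of
# range readings (metres) sweeping left-to-right across the field of
# view, go straight if everything is clear, otherwise steer toward the
# side with more room. Threshold and split are hypothetical values.
def avoid(ranges, safe_dist=1.0):
    mid = len(ranges) // 2
    left = min(ranges[:mid])   # closest obstacle on the left half
    right = min(ranges[mid:])  # closest obstacle on the right half
    if min(left, right) > safe_dist:
        return "forward"
    return "left" if left > right else "right"

print(avoid([5.0, 4.0, 0.4, 3.0]))  # obstacle on the right -> left
print(avoid([2.0, 2.0, 2.0, 2.0]))  # all clear -> forward
```

In the real node the `ranges` list would come from the Hokuyo's laser-scan message and the returned command would be translated into steering/throttle outputs.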

Additional information and resources:

Read more…

First flight with the Raspberry Pi 3

We managed to get our hands on a Raspberry Pi 3 and decided to give APM a try with it using the PXFmini. Here's a walkthrough of some of the tests that our team conducted:

Benchmarking

So, down to the benchmarks: we performed three types of tests using sysbench and the default Raspbian images (no APM running for now). SysBench is a modular, cross-platform and multi-threaded benchmark tool for evaluating OS parameters that are important for a system running a database under intensive load. The output of sysbench looks like this:

sysbench --test=cpu --cpu-max-prime=20000 run

sysbench 0.4.12:  multi-threaded system evaluation benchmark

Running the test with following options:
Number of threads: 1

Doing CPU performance benchmark
Threads started!
Done.

Maximum prime number checked in CPU test: 20000

Test execution summary:
    total time:                          477.3324s
    total number of events:              10000
    total time taken by event execution: 477.3236
    per-request statistics:
         min:                                 47.69ms
         avg:                                 47.73ms
         max:                                 85.54ms
         approx.  95 percentile:              47.72ms

Threads fairness:
    events (avg/stddev):           10000.0000/0.00
    execution time (avg/stddev):   477.3236/0.00

 

The output is graphed against the Raspberry Pi 2 (note that smaller bars indicate better results):


sysbench --test=cpu --num-threads=1 run




sysbench --test=memory run --memory-total-size=2G


sysbench --test=memory run --memory-total-size=2G --num-threads=1 --memory-oper=read

The 64-bit CPU of the Pi 3 dramatically improves the results of these particular tests over the Pi 2: it's 40% to 60% faster.

Mounting the autopilot

We mounted the shield as follows which proved to be robust enough. Vibrations were kept at a reasonable level as will be shown later in the log analysis:


Flying with APM and log analysis

After a short flight we ran some checks. The official APM documentation tells us:

“Check the scale on the left and ensure that your vibration levels for the AccX and AccY are between -3 and +3.  For AccZ the acceptable range is -15 to -5.“


Fine on this front. We also felt that the drone was responding really well, so we made a few plots that convinced us about the autopilot responses with the Raspberry Pi 3:



Conclusion

The Raspberry Pi 3 is a great candidate for building APM Linux autopilots using the PXFmini! Having Bluetooth and WiFi on board is a great asset, removing the need for additional USB dongles. Unfortunately, the WiFi is b/g/n only and does not support the 5 GHz band (which comes in handy when flying with RC controllers that work in the 2.4 GHz band).

Read more…


Following up on the PixHawk Fire Cape (PXF) series that was started a while ago, I'm happy to announce the PXFmini, an open autopilot shield for the Raspberry Pi. This autopilot shield allows anyone to create ready-to-fly autopilots with support for Dronecode's APM flight stack, priced at only 69 €. The shield has been designed with low cost in mind and especially for the Raspberry Pi Zero (it is also compatible with other Raspberry Pi boards). Find below some of its features:

 

A tiny yet powerful design

The PXFmini shield weighs only 15 grams and, on its 31 mm x 71 mm footprint, embeds all the power electronics necessary to work with most of the existing components for drones through its 2x I2C and UART ports.

The design is based on previous iterations with proper APM upstream support and provides 8 PWM output channels as well as a PPMSUM input.

 

A shield for sensing

PXFmini includes a 9 axes IMU (MPU9250), a digital barometer (MS5611) and an ADC for voltage measurements.

 

An improved experience

Forget about breaking those DF13 connectors. We’ve decided to bet on the new JST GH connectors (adopted by the Dronecode Foundation) to provide an amazing new experience. We’ve also partnered with manufacturers to provide DF13 to JST GH converters.

 

Open design

All the schematics are open for you to hack around. They’ll be released as soon as the boards start shipping. Shipping will start in early February 2016.

 

We made a short clip closing the year where we show the board:

https://www.youtube.com/watch?v=tXZb2gN9SEg

 

Thanks everyone and Merry Christmas!

Read more…

Erle Robotics "blanco" for Erle-Brain 2.0


Following our previous iteration, and after receiving feedback from the community, our team is glad to present "blanco", a Debian Jessie image from Erle Robotics. The changes in this image, valid for Erle-Brain 2, include:

  • Matplotlib installed
  • MAVProxy (docs)
  • Dronekit 2.0.0rc8 (dronekit page)
  • WiFi boot time upgrades
  • Added scripts for switching vehicles in Desktop
  • Added Picture script
  • Added ROSimple, a multiplatform simple way to program robots using ROS.
  • Disabled graphical target at boot; the Desktop can now be launched with sudo systemctl start lightdm

We're especially excited about ROSimple, which we open sourced here. This work came out of the training experience we've been having over the last months:

For several weeks we taught different groups (ranging from high-school level to PhD level) how to make use of our robots. While most people quickly understand the different mechanical parts of a robot, understanding the underlying software took quite a bit of effort.

In our training sessions, one of the first things we try to introduce is the concept of the Robot Operating System (ROS). While there are many resources on this topic, it's a fact that learning ROS takes some effort even at the PhD level, so we started prototyping ideas to make this process as simple as possible.

We wanted to reach high-school students, so we realized that we had to remove the assumption of "coding skills" from the equation. This made us look into systems like Scratch for robot programming. The output of these prototypes has become ROSimple: a multiplatform web-based tool for programming robots and drones that use ROS. In fact, ROSimple itself is a ROS package.

ROSimple support for APM (through mavros) is currently being explored:

https://www.youtube.com/watch?v=XPNr4d6ZwMo

"blanco" can be obtained from here.

Read more…


A new Raspberry Pi has been announced:

Today, I’m pleased to be able to announce the immediate availability of Raspberry Pi Zero, made in Wales and priced at just $5. Zero is a full-fledged member of the Raspberry Pi family, featuring:

  • A Broadcom BCM2835 application processor
    • 1GHz ARM11 core (40% faster than Raspberry Pi 1)
  • 512MB of LPDDR2 SDRAM
  • A micro-SD card slot
  • A mini-HDMI socket for 1080p60 video output
  • Micro-USB sockets for data and power
  • An unpopulated 40-pin GPIO header
    • Identical pinout to Model A+/B+/2B
  • An unpopulated composite video header
  • Our smallest ever form factor, at 65mm x 30mm x 5mm

With this, shields like the PXF 2.0, NavIO, Raspilot and the one from VirtualRobotix may have a way to cut down costs.

Read more…


Erle Robotics has taken much from the open hardware environment that has been created around the Dronecode Foundation. There are many individuals and groups that helped or contributed in some way, and among them I believe it's worth mentioning two:

  • The PX4 team led by Lorenz Meier, which has truly been an inspiration for many. Their hardware designs have had a huge impact, and the PixHawk Fire Cape 2.0 carries the spirit of the PX4 designs.
  • 3DRobotics as the enabler of this open drone reality. 

 

With all this in mind, I am happy to announce the PixHawk Fire Cape 2.0 (PXF 2.0): an open hardware multi-platform shield to build Linux-based drones. The name has been selected to honour Philip Rowse, who initially designed the first PXF concept (which he, in turn, based on the PixHawk itself). Although we won't be retailing this board directly, we will be offering Erle-Brain 2, a commercial solution based on this board. Sources are freely available at https://github.com/erlerobot/PXF2.

Designs are out there for people to hack and contribute back.

Read more…

Erle Robotics "azul" for Erle-Brain 2.0


Listening to the feedback provided recently, we at Erle Robotics have decided to start a new batch of releases of the Debian-based operating system that our brains wear. The image includes:

  • Latest APM binaries for copter, plane and rover included (service launches copter by default)
  • mavros preinstalled and launched at init setting up different bridges for additional ground control stations.
  • ROS Indigo preinstalled and launched at init
  • Integrated camera images available directly through a ROS topic.
  • Different lasers, lidars and RGB cameras supported (additional devices supported).
  • WiFi hotspot mode preconfigured on wlan0 network interface with DHCP. The default SSID is erle-brain-2 (more)
  • WiFi infrastructure mode preconfigured on wlan1
  • Zeroconf established so that the brain can be reached at erle-brain-2.local from different connections (more).
  • Graphical User Interface (GUI) (more)
  • ROS 2.0 support (read more).
Read more…

Crawl around our offices with Erle-Spider

As some of you may know, we’ve designed Erle-Spider to deliver an open, low cost and Linux-based robot powered by ROS that hopes to bring robotics to many. To show it to the world we’ve put together a little game. Here’s how it works:
Erle-Spider is crawling around our offices and live streaming video of what it sees at https://www.youtube.com/c/ErleRobotics/live. The robot can be moved around with the following command, actioned through Twitter:
@ErleRobotics make #erlespider go [forward | backwards | right | left] http://igg.me/at/erle-spider/x/7817776
For example, if I were to move the Spider forward to meet part of the team behind it, I'd go and tweet: "@ErleRobotics make #erlespider go forward http://igg.me/at/erle-spider/x/7817776" (note that YouTube imposes about 30 seconds of delay on the video feed).
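The command format above is simple enough that the bot's side of the game can be sketched as a one-line pattern match. This is a hypothetical illustration of how such a tweet could be parsed, not the actual code running behind @ErleRobotics.

```python
# Hypothetical parser for the tweet-command format described above:
# extract the motion keyword from a tweet, ignoring trailing links.
import re

COMMAND = re.compile(
    r"@ErleRobotics make #erlespider go (forward|backwards|right|left)"
)

def parse_tweet(tweet):
    """Return the motion command in the tweet, or None if it isn't one."""
    match = COMMAND.search(tweet)
    return match.group(1) if match else None

print(parse_tweet(
    "@ErleRobotics make #erlespider go forward http://igg.me/at/erle-spider/x/7817776"
))  # -> forward
print(parse_tweet("just saying hi"))  # -> None
```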
We'll be making this little game available during our office hours (CEST). If you're interested in getting one for Christmas, go to our indiegogo campaign and support us.
Thanks,
Read more…

Erle-Spider crowdfunding campaign


About a month ago we announced Erle-Spider, a drone with legs powered by Ubuntu, with the Robot Operating System at its core and support/compatibility with MAVLink and other Dronecode Foundation tools. Over the last weeks we've been working very hard to launch Erle-Spider at an affordable price, and today I'm happy to announce its crowdfunding campaign.


Erle-Spider is available starting at $399 and is based on the Robot Operating System (ROS). ROS packages will be available to connect Erle-Spider with existing MAVLink devices and interfaces.


Updated link: Erle-Spider can be supported at https://www.indiegogo.com/projects/erle-spider-the-ubuntu-drone-with-legs/#/story.

Read more…


I'm happy to introduce Erle-Spider, a new drone with legs powered by Ubuntu that uses the Robot Operating System (ROS) at its core. 

Erle-Spider shows just how easy it is to build an inherently complex robot when powered by our robotic brain (Erle-Brain), adding functionality from the Ubuntu-based app store. This new robot is literally a computer with legs, with 18 degrees of freedom in total. It will be released as an open source and cost-effective platform, with the aim of being the premier legged drone for education, research and inspection tasks, where we see huge potential in hard-to-reach places such as pipes or disaster areas.


Many behaviors implemented on Erle-Spider have been inspired by the fantastic work that APM developers do, and it's our goal to make it compatible with many of the existing tools that this community and the Dronecode Foundation support. It will be possible to program autonomous missions with it through "Mission Planner" or "APM Planner 2.0", while simulation will be possible using several of the tools supported by the Open Source Robotics Foundation (the folks behind ROS).

A short clip is available at https://www.youtube.com/watch?v=mWrvqt4ZCj0.

Erle-Spider will be launched soon, so if you're interested we'd recommend subscribing to stay tuned. More information is available at the Erle-Spider one pager.

Read more…


As someone who came from robotics to the RC side of drones, I discovered that many things were not that well documented. The learning experience of the last two to three years taught my team and me many things, so a few weeks ago we decided to launch the Ubuntu DIY drone kit: a low-cost kit, available in red/white and yellow/black, aimed at people who want to build themselves a Linux (Ubuntu) drone, with instructions and material to do everything step by step.

Today we present here our experience. Everything is based on technologies supported and sponsored by the Dronecode Foundation:


The assembly guide has been written with few assumptions in mind so that pretty much anyone can build their own smart drone (referring to the Linux side of it) and the operation manual provides an introduction to the basic concepts of autopilots and APM.

We were excited to see how many people started purchasing this kit, so we thought it would make sense to organize a local session on "Building your own Linux drone", teaching people the basics of drone assembly and maintenance, APM, GCSs and safety aspects.


The course lasted two weeks and it has been a complete success. We had 15 students and they all were really satisfied. We closed the course with a visit to Erle Robotics and a brief introduction to the engineering side of what we do.

Many thanks to Lander and David for their fantastic work supporting the course. Those interested in reproducing this experience can get in touch with us at https://erlerobotics.com/blog/dronedu/. We'd be happy to share material and additional content.

I'm personally quite interested in what the community thinks. Are there any previous experiences? Any advice on this side? Anyone interested to take over this initiative and reproduce it somewhere else?

Links:

Read more…

Learning ROS series


Post updated to include new videos.

The Robot Operating System (ROS) is an open-source framework for robot application development maintained by the Open Source Robotics Foundation (OSRF).

A ROS system comprises a number of independent nodes, each of which communicates with the other nodes using a publish/subscribe messaging model that can be deployed over different computers. ROS was originally developed in 2007 by the Stanford Artificial Intelligence Laboratory (SAIL) with the support of the Stanford AI Robot project. From 2008 development continued primarily at Willow Garage, a robotics research institute, with more than 20 institutions collaborating within a federated development model. In February 2013, ROS stewardship transitioned to the Open Source Robotics Foundation. ROS is released under the terms of the BSD (Berkeley Software Distribution) license and is open source software.

At Erle Robotics we believe that ROS is the key tool for the future development of robotics. In just a few years ROS has changed the field, unifying universities and industry around the world and enhancing collaboration, the sharing of algorithms and the reuse of code. With the feedback from our users and community, we've decided to put together a series about learning ROS.

In our last episode (the third one) we explain how to create a ROS package that allows a drone to autonomously take off, do stuff (or idle, as in our case) and land (source code). Watch the last part of the video (minute 8:58), where we show a live demo of the code developed during the session.
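The takeoff → do stuff → land sequence from that episode can be sketched as a plain-Python state machine. The real package drives these transitions through ROS topics/services; the action names and arming logic below are an illustrative simplification, not the actual package code.

```python
# Minimal sketch of the mission logic from the episode: the vehicle must
# be armed (taken off) before it can do anything else, and landing
# disarms it. Names and rules here are hypothetical simplifications.
def run_mission(actions=("takeoff", "idle", "land")):
    flown = []
    armed = False
    for action in actions:
        if action == "takeoff":
            armed = True
        elif action == "land":
            armed = False
        elif not armed:
            # e.g. trying to "idle" before taking off is rejected
            raise RuntimeError("cannot %r while disarmed" % action)
        flown.append(action)
    return flown, armed

flown, armed = run_mission()
print(flown, armed)  # -> ['takeoff', 'idle', 'land'] False
```

In the ROS version, each of these transitions would map to a service call or a setpoint published through mavros.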

This work is developed using:

- Erle-Brain http://erlerobotics.com/blog/product/erle-brain
- Erle-Copter Ubuntu drone http://erlerobotics.com/blog/product/erle-copter-ubuntu

List of episodes:

Read more…

Learning ROS series - 1


The Robot Operating System (ROS) is an open-source framework for robot application development maintained by the Open Source Robotics Foundation (OSRF). A ROS system comprises a number of independent nodes, each of which communicates with the other nodes using a publish/subscribe messaging model that can be deployed over different computers.

ROS was originally developed in 2007 by the Stanford Artificial Intelligence Laboratory (SAIL) with the support of the Stanford AI Robot project. From 2008 development continued primarily at Willow Garage, a robotics research institute, with more than 20 institutions collaborating within a federated development model. In February 2013, ROS stewardship transitioned to the Open Source Robotics Foundation. ROS is released under the terms of the BSD (Berkeley Software Distribution) license and is open source software.

At Erle Robotics we believe that ROS is the key tool for the future development of robotics. In just a few years ROS has changed the field, unifying universities and industry around the world and enhancing collaboration, the sharing of algorithms and the reuse of code. With the feedback from our users and community, we've decided to put together a series about learning ROS using our Linux drones and autopilots. Here's the first episode:

https://www.youtube.com/watch?v=d5YAJh6Z2B0&list=PL39WpgKDjDfVfiNVG47DBi93wsh2XHKVO&index=1

We'll be showing how to make autonomous behaviors through the Robot Operating System so stay tuned for more.

Read more…

An app store for drones


As introduced previously, we've been working over the last months with Canonical to push an app store based on Snappy Ubuntu Core, the new distribution from Ubuntu that includes a marketplace where people are encouraged to publish their algorithms and behaviors (even for sale).

Erle-Copter was previewed last week at IoT World as the first drone with apps, and through Erle-Brain one can easily install snaps that correspond to the different vehicles (Copter, Plane or Rover for now).

Besides packaging APM or ROS for the App Store, many other apps are starting to show up that add further functionality to the robots and drones built with our robotic brains:


Docs are available in our website that explain how to create an app step-by-step:


For those wishing to make applications using ROS we'd recommend checking these tutorials as well as the ros2snap repository.

Read more…


I’m delighted to introduce Erle-Brain v1.1 as the most complete Linux autopilot:


Erle-Brain is being adopted by schools, universities and research centers around the world.

The journey of Erle-Brain started back in late 2013 with the BeaglePilot project, giving birth to the first APM Linux-based drones and finishing up with the publication “Towards an Open Source Linux autopilot for drones“.
Much has happened since and the BeagleBone Black was just featured at the Embedded Linux Conference as a great platform for building Linux drones.

 

Our love for Linux drones, however, goes far beyond making them fly and accomplish autonomous missions. As roboticists we are already thinking about embedding more intelligence in our flying robots (obstacle avoidance, SLAM, image and speech recognition, …) and for that we are putting a lot of development time into the Robot Operating System (ROS). We are releasing demos such as this one. Expect more to come.

Just a few days ago we decided to launch a crowdfunding campaign: BeagleUAV. Through BeagleUAV, you'll be able to get the Erle-Brain for a discounted price of 150 €.

Go ahead and get yourself into Linux drones with the BeagleBone Black and the PixHawk Fire Cape ;).

Read more…

PixHawk Fire cape crowdfunding campaign


Here it is! Finally, the PixHawk Fire Cape (PXF) is ready for a wider audience, and we've launched a crowdfunding campaign so that everyone can get theirs. We've been iterating through different generations over the last year and, with the help of many, we are finally proud to announce the PXF starting at $100.

This board was born out of Philip's hands and pushed forward by members of this community, so it's only reasonable to try to make it as affordable and accessible as possible. Some technical details:

Sensors

  • MPU6000: 3-axis gyroscope, 3-axis accelerometer and temperature sensor.
  • MPU9250: 3-axis gyroscope, 3-axis accelerometer, 3-axis magnetometer and temperature sensor
  • LSM9DS0: 3-axis gyroscope, 3-axis accelerometer, 3-axis magnetometer and temperature sensor.
  • MS5611-01BA03: Barometer that includes pressure and temperature sensors.

Connectors

  • 3x LED indicators (Green, Amber and Blue)
  • 2x serial UART ports (ttyO0 is not active)
  • 1x CAN connector and transceiver
  • 3x I2C ports
  • 1x Buzzer 
  • 1x Safety switch
  • 9x PWM output channels
  • PPM/S.Bus in
  • 1x Spektrum 
  • 1x Power brick connector
  • 1x Battery backup (1 LiPo cell)
  • 1x ADC
  • 2x GPIOs exposed (IO)
  • 1x analog pressure sensor (AIR)

Mechanical characteristics

  • Size: 88.6 x 54.73 x 20.69 mm 
  • Layers: 6 
  • PCB Thickness: 1.62 mm
  • Weight: 31 grams

Support the crowdfunding campaign of the PXF here.

Read more…

An autopilot with ROS support


We've been working towards improving ROS support and creating packages that people can use to build applications. Here's our current status:

Through mavros one can control the autopilot from ROS. Similarly, the pwm, buzzer, statusled and ubled nodes (and topics) give direct access to the hardware.

In the following video, Alex provides a walkthrough for some of the packages involved:

Packages are available at our github repositories and documented at the ROS Wiki:

We'd love to hear your opinion about which ROS abstractions you'd like to see in Erle-Brain.
Regards,

Read more…

"hello drone" in the app store


A few weeks ago we announced a partnership with Canonical to support the next generation of drones, which will be connected to the internet, update automatically and have access to an app store for drones. Following up on our passion for bringing Linux-based drones to the market, we are happy to share that the first apps are starting to show up in the store and are freely available.

Our goal is to create computers with the ability to fly, allowing a big community of Linux experts and hackers to jump into robotic application development using the best tools and frameworks (ROS). We plan to release soon an image for Erle-Brain that will allow you to play with the store and make Linux drones, so if you are interested go and fetch yours from the store.

The “hello drone” app is torn down and explained in this article:

hello drone app

Support for hardware access in Snappy Ubuntu Core has recently been announced, so expect more apps interacting with hardware to be released over the following weeks. Our approach will be focused on ROS.

Read more…

ROSsifying an APM-based autopilot

Great stuff happening at Erle Robotics while we push our Linux autopilot, Erle-Brain, towards deeper and deeper ROS integration.

Here's Alex sharing with you some of the first packages we coded (video):

Some links:

- PWM: https://github.com/erlerobot/ros-hydr...
- Buzzer: https://github.com/erlerobot/ros-hydr...
- Ubled: https://github.com/erlerobot/ros-hydr...
- StatusLed: https://github.com/erlerobot/ros-hydr...

Read more…