Chris Anderson's Posts (2718)

3D Robotics


These aren't drones per se, but I can't help but post this story by Owen Churchill in Sixth Tone, which exemplifies the DIY spirit.

CHONGQING, Southwest China — Dismembered remote-controlled airplanes lie strewn across an unmade mattress: a motor glued crudely into an improvised wooden housing, landing gear fashioned out of a metal ruler, and sheets of foam advertising board that will become wings, fuselages, and flaps.

This is the spare room of Hu Bo, an 18-year-old from a village two hours from Chongqing who has turned his hand to making flyable models of China’s home-grown aircraft. He started with decades-old military planes and has recently worked on the new flagbearer of the country’s civil aviation industry, the C919 passenger jet. Using self-taught techniques, open-source plans downloaded from the internet, and cheaply acquired or improvised parts, Hu sells his finished models online — unless he crashes them during the test flight. “My technique isn’t so good,” he tells Sixth Tone with a smile as he tinkers with his latest build, a 1.4-meter-wide, twin-propeller plane modeled on China’s Y-12 utility aircraft.

Like millions of other Chinese children, Hu grew up under the guardianship of his grandparents while his parents traveled in search of work, dividing his time between doing homework, playing with the family’s dog, and making intricate paper airplanes at the family home in Yangliu Village, a tiny hamlet around 100 kilometers west of Chongqing.

But now, having scraped together money to buy some basic tools, Hu has joined the ranks of China’s rising number of amateur aviation enthusiasts, spurred on by a huge yet inconsistently regulated drone industry and inspired by the increasing prowess of the country’s home-grown fleet of both military and civilian aircraft. A number of fifth-generation fighter jets are slated to enter service in the next few years; the maiden flight of the world’s widest seaplane — the AG600 — is scheduled for this year; and the country’s first large passenger jet in decades, the C919, took to the skies for the first time on May 5, catalyzing the emergence of a new generation of patriotic plane spotters despite its plethora of foreign parts.

For poor people like us, we have time but no money, so we have to make it ourselves.

Hu has never been on a plane, nor has he ever purchased a complete remote-controlled plane. While he was inspired to start building planes after seeing local friends discussing the latest and best modeling equipment on social media, he has little respect for people who throw money at the hobby. “They are all renminbi fliers,” he says, referring to China’s currency. “For poor people like us, we have time but no money, so we have to make it ourselves. For them, they have money but no time, so they just buy everything outright.”

Buyers on Xianyu — the secondhand version of China’s premier online marketplace, Taobao — have already praised the caliber of Hu’s models, especially the C919. But at least for now, Hu has little interest in making a profit from his planes — if he believes the buyer is a genuine plane enthusiast who will cherish his models, he doesn't charge much more than the cost of the materials, making around 60 yuan (just under $9) for up to a week’s work.

It’s barely a living wage, and it does little to ease the acute financial pressure that his family currently faces. Six months ago, Hu and 22-year-old Xu Xifang became parents. Hu introduces Xu as his wife, but you won’t find their names on any of China’s marriage records: At 18, Hu is still four years away from the country’s marriageable age for men. Also in the family home is his 8-year-old sister and their 65-year-old grandmother, who works 12-hour shifts at a local waste collection plant for 50 yuan per day.

He may barely be covering his costs, but for Hu, building and flying planes has provided a welcome respite from the desperate pursuit of livelihood that has defined his childhood. At 14, he quit school to join his parents in China’s southern Guangdong province to work in a brick factory, where he stayed for four years before returning to his home village to settle down and start a family.

“I don’t want my son to be like me,” Hu says, reflecting on his own experience of living without his parents. “No matter where I am, whether I do manual work or business, whether I have nothing at all, I will always stay beside him.”

Indeed, Hu’s pastime-cum-business has become something of a family affair, with his wife, Xu, helping out when she can with printing and cutting the plans. “At the beginning, I didn’t like that he was doing this — I didn’t think he had any hope,” she says as she slices around the cockpit of a C919. “But then I watched him doing it and saw that he was so happy, so I just let him keep going.”

As Hu squats on the floor putting the final touches on his Y-12, his 6-month-old son An An looks on, engrossed, even reaching out from time to time to prod a wing or grab a propeller. A cool breeze one recent afternoon brought a brief lull in the stifling heat and meant that Hu could take his son up into the hills that surround their home to witness the plane’s maiden flight. “The look in his eyes when he sees a model plane is different to when he sees other toys,” says Hu, back in the house after an accident-free test flight. “Today, I saw that look.”

My grandfather made wheelbarrows, my dad made little toys, and now I make airplanes. I think my son will take the next step and make satellites.

Hu has high hopes for his son. By the time Hu is 30, he wants to have built a plane that he himself can fly in, a project he says he will need his son’s help with. But his aerospace aspirations for An An don’t stop there. “My grandfather made wheelbarrows, my dad made little toys, and now I make airplanes,” he says. “I think my son will take the next step and make satellites.”

For now, Hu may have to concern himself with matters closer to home. Until recently, Hu had been free to fly what he wanted, where he wanted. But that looks set to change, as local authorities around the country scramble to limit the use of remote-controlled aircraft following a series of close encounters between drones and passenger jets. One of the most recent occurred at nearby Chongqing’s Jiangbei Airport, where two instances of drone interference were reported in a single evening.

Hu calls it a “day of sadness” for model plane enthusiasts like him. Since the incident, local authorities have stipulated that all model plane fliers must submit their personal information and aircraft specifications to a local government representative. Such restrictions appear to be more stringent than the drone-focused regulations currently being rolled out nationwide, which do not mention fixed-wing model airplanes. Flying zones are also being restricted, and now the nearest designated area to Hu’s home is 50 kilometers away.

Despite his sadness, Hu calls the restrictions reasonable and vows that he would never flagrantly violate them. His recent test flight of the Y-12 in the nearby hills was an exception, he says, explaining that he made sure to control his height, speed, and distance. “Even a bird can be dangerous if it strikes an aircraft,” he says, “let alone a drone weighing 10, 20 kilograms.”

Hu Bo carries his model Y-12 aircraft after taking it out for a flight in Dazu County, Chongqing, May 16, 2017. Wu Yue/Sixth Tone


Regulations aside, there are other matters that stand in the way of Hu’s passion. His parents recently moved from Guangdong to Chengdu in Chongqing’s neighboring Sichuan province to join a relative’s wholesale grocery company. Hu has been earmarked as one of the company’s delivery drivers, and so has spent the last few weeks preparing for his driving test.

The day after the maiden flight of Hu’s Y-12, he and Xu made the 230-kilometer bus ride to Chengdu, leaving behind Hu’s grandmother and sister. True to his vow never to leave his son, Hu has taken An An with him; true to his passion, Hu has also taken some basic tools and the plans for a massive, 2.4-meter-wide model of China’s new military transporter, the Y-20, despite the fact that he’ll struggle to find anywhere in the metropolis of Chengdu to fly it.

“If I go a day without making planes, then I feel like a smoking addict who hasn’t had a cigarette,” says Hu. One day, he says, he’ll harness that addiction and open a small workshop with a few employees making several planes a day. “That way, my family can have a better life.”

3D Robotics

From New Atlas, a writeup on new research from ETH and MIT:

Thanks to new research from MIT and ETH Zurich, however, it may soon be possible for drones to autonomously follow along with an actor, keeping their face framed "just right" the whole time – while also avoiding hitting any obstacles.

To utilize the experimental new system, operators start by using a computer interface to indicate who the drone should be tracking, how much of the screen their face or body should occupy, where in the screen they should be, and how they should be oriented toward the camera (choices include straight on, profile, three-quarter view from either side, or over the shoulder).

Once the drone is set in action, the computer wirelessly sends it control signals that allow it to fly along with the actor as they walk, adjusting its flight in order to maintain the shot parameters. This means that if the actor were to start turning their back on the drone, for instance, it would automatically fly around in front of them, to keep their face in the shot. Likewise, if they started walking faster, the drone would also speed up in order to keep them the same distance from the camera.

It's additionally possible for the aircraft to follow small groups of actors, working to keep that group framed a certain way within the shot. The user can stipulate one of those actors as the main subject, ensuring that the drone moves in order to keep other actors from blocking the camera's view of them.

The system utilizes algorithms that predict the actor's trajectory about 50 times a second, allowing the aircraft to effectively stay one step ahead of the action. This also allows it to correct its own flight path if its onboard sensors detect that it's heading toward a stationary obstacle, or if a moving obstacle (such as an actor) is on a collision course with it.
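The predict-then-frame loop described above can be sketched in a few lines. This is a hypothetical illustration, not the researchers' code: `predict_actor` does a simple linear extrapolation of the actor's motion, and `camera_setpoint` places the camera at a chosen distance and viewing angle relative to the actor.

```python
import math

def predict_actor(position, velocity, dt):
    """Linearly extrapolate the actor's (x, y) position dt seconds ahead."""
    return (position[0] + velocity[0] * dt,
            position[1] + velocity[1] * dt)

def camera_setpoint(actor_pos, actor_heading, view_angle, distance):
    """Place the camera `distance` metres from the actor, offset from the
    actor's heading by `view_angle` (0 = straight on, pi/2 = profile)."""
    angle = actor_heading + view_angle
    return (actor_pos[0] + distance * math.cos(angle),
            actor_pos[1] + distance * math.sin(angle))

# Actor walking east at 1.5 m/s; keep a straight-on shot from 3 m away.
actor = (0.0, 0.0)
vel = (1.5, 0.0)
future = predict_actor(actor, vel, 1 / 50)   # one 50 Hz prediction step
setpoint = camera_setpoint(future, 0.0, 0.0, 3.0)
```

Run at 50 Hz, each iteration would feed `setpoint` to the drone's position controller, subject to the obstacle-avoidance corrections the article mentions.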

A team led by MIT's Prof. Daniela Rus will be presenting a paper on the research later this month at the International Conference on Robotics and Automation. The system is demonstrated in the video below.

Source: MIT

3D Robotics

Drone learns to fly by crashing (a lot)

Excerpt from the IEEE Spectrum article:

“Learning to Fly by Crashing,” a paper from CMU roboticists Dhiraj Gandhi, Lerrel Pinto, and Abhinav Gupta, has such a nice abstract that I’ll just let them explain what this research is all about:

[T]he gap between simulation and real world remains large especially for perception problems. The reason most research avoids using large-scale real data is the fear of crashes! In this paper, we propose to bite the bullet and collect a dataset of crashes itself! We build a drone whose sole purpose is to crash into objects [. . .] We use all this negative flying data in conjunction with positive data sampled from the same trajectories to learn a simple yet powerful policy for UAV navigation.

Cool, let’s get crashing!

One way to think of flying (or driving or walking or any other form of motion) is that success is simply a continual failure to crash. From this perspective, the most effective way of learning how to fly is by getting a lot of experience crashing so that you know exactly what to avoid, and once you can reliably avoid crashing, you by definition know how to fly. Simple, right? We tend not to learn this way, however, because crashing has consequences that are usually quite bad for both robots and people. 

The CMU roboticists wanted to see if there are any benefits to using the crash approach instead of the not crash approach, so they sucked it up and let an AR Drone 2.0 loose in 20 different indoor environments, racking up 11,500 collisions over the course of 40 hours of flying time. As the researchers point out, “since the hulls of the drone are cheap and easy to replace, the cost of catastrophic failure is negligible.” Each collision is random, with the drone starting at a random location in the space and then flying slowly forward until it runs into something. After it does, it goes back to its starting point, and chooses a new direction. Assuming it survives, of course.

During this process, the drone’s forward-facing camera records images at 30 Hz. Once a collision happens, the images from the trajectory are split into two parts: the part where the drone was doing fine, and the part just before it crashes. These two sets of images are fed into a deep convolutional neural network (with ImageNet-pretrained weights as initialization), which uses them to learn, essentially, whether a given camera image means that going straight is a good idea or not. After 11,500 collisions, the resulting algorithm is able to fly the drone autonomously, even in narrow, cluttered environments, around moving obstacles, and in the midst of featureless white walls and even glass doors. The algorithm that controls the drone is simple: it splits the image from the AR Drone’s forward camera into a left image and a right image, and if one of those two images looks less collision-y than going straight, the drone turns in that direction. Otherwise, it continues moving forward.
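The steering rule in the excerpt can be sketched as follows. This is an illustrative reconstruction, not the CMU code; `predict_collision` is a stand-in for the trained convolutional network, mapping an image (here a list of pixel rows) to a collision score, where lower means safer.

```python
def steer(image, predict_collision):
    """Choose an action from a forward-camera frame.

    `predict_collision` stands in for the trained network: it maps an
    image crop to a score for how likely flying toward it ends in a crash.
    """
    h, w = len(image), len(image[0])
    left = [row[: w // 2] for row in image]
    right = [row[w // 2 :] for row in image]
    p_straight = predict_collision(image)
    p_left = predict_collision(left)
    p_right = predict_collision(right)
    # Keep going straight unless one side clearly looks safer.
    if p_straight <= min(p_left, p_right):
        return "forward"
    return "left" if p_left < p_right else "right"
```

With a toy predictor that just averages pixel intensity (bright = obstacle), a frame whose right half is "blocked" sends the drone left.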

3D Robotics


Full review at our sister site, DIY Robocars, here, but this is how it starts:

I’m a huge fan of the OpenMV cam, which is a very neat $65 integrated camera and processor with sophisticated built-in computer vision libraries, a MicroPython interpreter, and a very slick IDE (a car powered entirely by it came in 2nd in the Thunderhill DIY Robocars race). Now there is a competitor on the block, JeVois, which offers even more power at a lower cost. I’ve now spent a week with it and can report back on how it compares.


I tested it on an autonomous rover, to see how it performed in a stand-alone embedded environment (as opposed to being connected via USB to a PC):


Read the rest here...

3D Robotics

Free webinars on drone design simulation


Two years ago SimScale ran a good series of free webinars on using simulation tools for drone design. Now they're doing another series on May 11, 18, and 25. From the announcement:

The growing community of DIY drone designers and manufacturers has inspired us to create a workshop series focusing on the simulation of a drone design. The series is directed at makers and drone enthusiasts who want to learn how to modify and optimize their own drone designs.

Participants will receive a hands-on, interactive introduction to the application of engineering simulation in DIY Drone Design, and will learn from top experts how to leverage the free, cloud-based SimScale platform for their own projects and designs.

Every webinar session comes with an optional simulation homework assignment, and submitting all of them will qualify you for certification and prizes. No prior knowledge or software is required to join this webinar series. All participants will get free access to SimScale with all the required simulation features.

 

Overview:

  • The Drone Design Workshop will consist of a series of 3 one-hour webinar sessions
  • The live, online sessions will take place on Thursdays at 5:00 p.m. Central European Time, with the first session starting on May 11th
  • Learn how to virtually test and optimize drone designs
  • Get dedicated and individual support to learn how to perform fluid dynamics, structural, or thermodynamics simulations.

Session 1 – 11th of May, 2017

AERODYNAMICS & PROPELLER DESIGN

 

In the first session, the trainer will cover the fundamentals of aerodynamics and propeller design. You will learn how to predict the performance of your design using fluid flow simulations. Following this, you will design your own propeller as part of the homework.



Session 2 – 18th of May, 2017

STRUCTURAL SIMULATION

 

The second session of the drone workshop introduces participants to structural simulations of assemblies, which can be very challenging. Besides this, we will also discuss strategies for lightweight design and how to apply them to your own models.



Session 3 – 25th of May, 2017

DROP ANALYSIS


The third and last session is dedicated to the simulation of a drop crash. Simulating the impact of the drone at several velocities will help you understand the critical falling velocity of your design.
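As a rough guide to what such a drop analysis involves, impact speed and drop height are related by basic free-fall kinematics. This is a simplification of my own, not SimScale's method: it ignores drag, which matters for light airframes.

```python
import math

def impact_velocity(height_m, g=9.81):
    """Free-fall impact speed (m/s) from a given drop height, ignoring drag."""
    return math.sqrt(2 * g * height_m)

def drop_height(velocity_ms, g=9.81):
    """Drop height (m) that produces a given impact speed, ignoring drag."""
    return velocity_ms ** 2 / (2 * g)

# A 5 m drop hits at roughly 9.9 m/s; the structural simulation would
# then check whether the frame survives that impact speed.
v = impact_velocity(5.0)
```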

 

Interested? Register for free and attend the SimScale Drone Design Workshop.

 

3D Robotics

More powerful Pixhawk 3 coming soon


The PX4/Dronecode team and Drotek have been working on the next generation of Pixhawk autopilots, and you can now see a preview of that with the Pixhawk 3 Pro. It's based on the new PX4 FMUv4-PRO standard, which includes a full suite of next-generation sensors and the more powerful STM32F469 processor. It's designed for the Dronecode/PX4 flight software, which is the current official Pixhawk standard.

The board is currently in developer release, but will be taken out of beta after testing is complete in the next month or two.

All details are here (and below):

------------------

Introduction

FMUv4-PRO takes input from all of the Pixhawk stakeholders: end users, developers, researchers, and manufacturing partners. Goals for this iteration of the platform are:

  • An integrated, single-board flight controller for space-constrained applications
  • A modular multi-board flight controller for professional applications
  • Sufficient I/O for most applications without expansion
  • Improved ease of use
  • Improved sensor performance
  • Improved microcontroller resources (384 KB RAM, 2 MB flash)
  • Increased reliability and reduced integration complexity
  • Reduced BoM and manufacturing costs

 

Key design points

  • All-in-one design with integrated FMU and optional IO, and lots of I/O ports
  • Improved manufacturability, designed for simpler mounting and case design
  • Separate power supplies for FMU and IO (see power architecture section)
  • Onboard battery backup for FMU and IO SRAM / RTC
  • Integration with two standard power bricks

 

Technology upgrades

  • Microcontroller upgrade to STM32F469; flash increases from 1 MiB to 2 MiB, RAM increases from 256 KiB to 384 KiB, more peripheral ports
  • ICM-20608 and MPU-9250 integrated gyro / accelerometer / magnetometer
  • LIS3MDL compass (the HMC5983 is now obsolete)
  • Sensors connected via two SPI buses (one high-rate and one low-noise bus)
  • Two I2C buses
  • Two CAN buses
  • Voltage / battery readings from two power modules
  • FrSky inverter
  • User-friendly JST GH connectors

 

I/O ports

  • 6-14 PWM servo outputs (8 from IO, 6 from FMU)
  • R/C inputs for CPPM, Spektrum / DSM, and S.Bus
  • Analog / PWM RSSI input
  • S.Bus servo output
  • 6 general-purpose serial ports, 2 with full flow control, 1 with a separate 1A current limit, 1 with FrSky protocol inverter
  • Two I2C ports
  • Two external SPI ports (unbuffered, for short cables only)
  • Two CANBus interfaces
  • Analog inputs for voltage / current of two batteries
  • Piezo buzzer driver for on-ground use
  • Sensor upgrade connector scheme
  • High-power RGB LED
  • Safety switch / LED

 

Mechanical Form Factor

  • 71 x 49 x 23 mm (with case)
  • 45 g (with case)
  • Standardized micro-USB connector location
  • Standardized RGB LED location
  • Standardized connector locations

 

System architecture

FMUv4-PRO continues the PX4FMU+PX4IO architecture from the previous generation, incorporating the two functional blocks in a single physical module.

 

PWM Outputs

Eight PWM outputs are connected to IO and can be controlled by IO directly via R/C input and onboard mixing, even if FMU is not active (failsafe / manual mode). Multiple update rates can be supported on these outputs in three groups: one group of four and two groups of two. PWM signal rates up to 400Hz are supported.

Six PWM outputs are connected to FMU and feature reduced update latency. These outputs cannot be controlled by IO in failsafe conditions. Multiple update rates can be supported on these outputs in two groups: one group of four and one group of two. PWM signal rates up to 400Hz are supported.

All PWM outputs are ESD-protected, and they are designed to survive accidental mis-connection of servos without being damaged. The servo drivers are specified to drive a 50pF servo input load over 2m of 26AWG servo cable. PWM outputs can also be configured as individual GPIOs. Note that these are not high-power outputs – the PWM drivers are designed for driving servos and similar logic inputs only, not relays or LEDs.
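As a sanity check on the 400Hz figure above, the maximum PWM frame rate follows directly from servo pulse timing: each frame must fit the longest pulse plus a gap before the next frame starts. A small illustrative calculation (the 500µs margin is my assumption, not from the spec):

```python
def max_update_rate_hz(max_pulse_us, margin_us=500):
    """Highest PWM frame rate that still fits the longest pulse plus an
    inter-frame gap (all times in microseconds)."""
    period_us = max_pulse_us + margin_us
    return 1_000_000 / period_us

# Standard servo pulses run 1000-2000 us; with a 500 us gap the frame
# period is 2500 us, so the rate tops out at 400 Hz.
rate = max_update_rate_hz(2000)  # 400.0
```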

 

Peripheral Ports

FMUv4-PRO recommends separate connectors for each of the peripheral ports (with a few exceptions). This avoids the issues many users reported connecting to the 15-pin multi-IO port on the original PX4FMU and allows single-purpose peripheral cables to be manufactured.

Five serial ports are provided. TELEM 1, 2, and 3 feature full flow control. TELEM 4 can be switched into inverted mode by hardware and has no flow control. Serial ports are 3.3V CMOS logic level, 5V tolerant, buffered, and ESD-protected.

The SPI ports are not buffered; they should only be used with short cable runs. Signals are 3.3V CMOS logic level, but 5V tolerant.

Two power modules (voltage and current for each module) can be sampled by the main processor.

The RSSI input supports either PWM or analog RSSI. CPPM, S.Bus, and DSM/Spektrum now share a single port and are auto-detected in software.
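Such shared-port auto-detection might work along these lines. This is a purely hypothetical sketch, not the PX4 implementation: each protocol's decoder (stand-ins for real CPPM / S.Bus / DSM decoders) is tried until one validates several consecutive frames.

```python
def autodetect_rc_protocol(frames, decoders, required=3):
    """Return the name of the first protocol whose decoder validates
    `required` consecutive frames.

    `decoders` maps protocol name -> validate(frame) -> bool; the
    validators here are stand-ins for real pulse/UART frame decoders.
    """
    for name, validate in decoders.items():
        if all(validate(f) for f in frames[:required]):
            return name
    return None  # keep listening; no protocol recognized yet
```

For example, a toy S.Bus validator might just check for the 0x0F start byte; a real decoder would also check frame length, timing, and flag bits.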

The CAN ports are standard CANBus; termination for one end of the bus is fixed onboard.

 

Sensors

The new ICM-20608 has been positioned by InvenSense as a higher-end successor to the MPU-6000 series. The software also supports the MPU-9250, which allows a very cost-effective 9-axis solution.

Data-ready signals from all sensors (except the MS5611, which does not have one) are routed to separate interrupt and timer capture pins on FMU. This will permit precise time-stamping of sensor data.

The two external SPI buses and six associated chip-select lines allow additional sensors and SPI-interfaced payloads to be added as needed.

The IMU is isolated from vibrations.

Power Architecture

Key features of the FMUv4-PRO power architecture:

  • Single, independent 5V supply for the flight controller and peripherals
  • Integration with two standard power bricks, including current and voltage sensing
  • Low power consumption and heat dissipation
  • Power distribution and monitoring for peripheral devices
  • Protection against common wiring faults: under/over-voltage protection, overcurrent protection, thermal protection
  • Brown-out resilience and detection
  • Backup power for IO in the case of FMU power supply failure
  • Split digital and analog power domains for FMU and sensors

 

FMU and IO Power Supplies

Both FMU and IO operate at 3.3V, and each has its own private dual-channel regulator. In order to address issues seen with PX4v1 and noisy power supply connections, each regulator features a power-on reset output tied to the regulator’s internal power-up and drop-out sequencing.

The second channel of each dual regulator is switchable under software control. For FMU this is used to permit power-cycling the sensors (in case of sensor lockup), and for IO this will make it possible to perform the Spektrum binding sequence.

 

Power Sources

Power may be supplied to FMUv4-PRO via USB (no peripherals in this mode) or via the power brick ports. Each power source is protected against reverse-polarity connections and back-powering from other sources. Power spikes observed on the servo bus (up to 10V) led to the removal of the power-from-servo option; users wanting this feature can connect the servo rail to the second power brick port with a cable containing a Zener diode.

The FMU + IO power budget is 250mA, including all LEDs and the Piezo buzzer. Peripheral power is limited to 2A total.

 

Power Brick Port

The brick port is the preferred power source for FMUv4-PRO, and brick power will always be selected if it is available.

 

Servo Power

FMUv4-PRO supports both standard (5V) and high-voltage (up to 10V) servo power with some restrictions. IO will accept power from the servo connector up to 10V. This allows IO to fail-over to servo power in all cases if the main power supply is lost or interrupted. FMUv4-PRO and peripherals combined may draw up to 2A total.

Power is never supplied by FMUv4 to servos.

 

USB Power

Power from USB is supported for software update, testing, and development purposes. USB power is supplied to the peripheral ports for testing purposes; however, total current consumption must typically be limited to 500mA, including peripherals, to avoid overloading the host USB port.

 

Multiple Power Sources

When more than one power source is connected, power will be drawn from the highest-priority source with a valid input voltage.

In most cases, FMU should be powered via the power brick or a compatible offboard regulator via the brick port or servo power rail.

In desktop testing scenarios, taking power from USB avoids the need for a BEC or similar servo power source (though servos themselves will still need external power).
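The source-selection rule ("highest-priority source with a valid input voltage") can be sketched as follows. The priority order and voltage windows are taken from this spec's summary; the code itself is an illustration, not firmware.

```python
POWER_SOURCES = [                 # highest priority first
    ("brick", 4.0, 5.7),          # power brick port, preferred
    ("servo_rail", 4.0, 10.0),    # IO fail-over path
    ("usb", 4.0, 5.7),            # bench testing only
]

def select_power_source(voltages):
    """Return the highest-priority source whose measured voltage lies
    inside its valid window; `voltages` maps source name -> volts."""
    for name, lo, hi in POWER_SOURCES:
        v = voltages.get(name)
        if v is not None and lo <= v <= hi:
            return name
    return None  # no valid supply present

# A dead brick input falls through to the servo rail.
select_power_source({"brick": 0.0, "servo_rail": 5.1, "usb": 5.0})
# returns "servo_rail"
```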

 

Summary

For each of the components listed, the input voltage ranges over which the device can be powered from each input are shown.

                  Brick ports        Servo rail             USB port
    FMU           4 – 5.7V           no                     yes
    IO            4 – 5.7V           4 – 10V                yes
    Peripherals   4 – 5.7V, 2A max   4 – 5.7V, 250mA max    yes, 500mA max

 

Peripheral Power

FMUv4-PRO provides power routing, over/under voltage detection and protection, filtering, switching, current-limiting and transient suppression for peripherals.

Power outputs to peripherals feature ESD and EMI filtering, and the power supply protection scheme ensures that no more than 5.5V is presented to peripheral devices. Power is disconnected from the peripherals when the available supply voltage falls below 4V, or rises above approximately 5.7V.

Peripheral power is split into two groups:

  • TELEM 1 has a private 1A current limit, intended for powering a telemetry radio. This output is separately EMI filtered and draws directly from the USB / brick inputs. Due to the noise induced by radios, powering a radio from this port is not advised.
  • All other peripherals share a 1A current limit and a single power switch.

Each group is individually switched under software control.

The Spektrum / DSM R/C interface draws power from the same sources as IO, rather than from either of the groups above. This port is switched under software control so that Spektrum / DSM binding can be implemented. Spektrum receivers generally draw ~25mA, and this is factored into the IO power budget. S.Bus and CPPM receivers are supported on this rail as well.

 

Battery Backup

Both the FMU and IO microcontrollers feature battery-backed realtime clocks and SRAM. The onboard backup battery has capacity sufficient for the intended use of the clock and SRAM, which is to provide storage to permit orderly recovery from unintended power loss or other causes of in-air restarts. The battery is recharged from the FMU 3.3V rail. 

 

Voltage, Current and Fault Sensing

The battery voltage and current reported by the power brick can be measured by FMU. In addition, the 5V unregulated supply rail can be measured (to detect brown-out conditions). IO can measure the servo power rail voltage.

Over-current conditions on the peripheral power ports can be detected by the FMU. Hardware lock-out prevents damage due to persistent short-circuits on these ports. The lock-out can be reset by FMU software.

The under/over voltage supervisor for FMU provides an output that is used to hold FMU in reset during brown-out events.

 

EMI Filtering and Transient Protection

EMI filtering is provided at key points in the system using high-insertion-loss passthrough filters. These filters are paired with TVS diodes at the peripheral connectors to suppress power transients.

Reverse polarity protection is provided at each of the power inputs.

USB signals are filtered and terminated with a combined termination/TVS array.

Most digital peripheral signals (all PWM outputs, serial ports, I2C port) are driven through series blocking resistors to reduce the risk of damage due to transients or accidental mis-connections.

Inputs / Outputs

Input/output pinout diagram: Pixhawk 3 Pro, by Drotek

3D Robotics


From DARPA. Have you ever seen so many swarming Iris+s?

Service Academies Swarm Challenge Live-Fly Competition Begins

U.S. Army, U.S. Navy, and U.S. Air Force academy teams compete in education-focused experiment to pave the way for future offensive and defensive swarm tactics for warfighters

Image Caption: This data visualization portrays the swarms of unmanned aerial vehicles that flew today over Camp Roberts, Calif., while implementing autonomous swarm tactics designed by Cadets at the U.S. Military Academy and the U.S. Air Force Academy for the DARPA Service Academies Swarm Challenge. U.S. Army, U.S. Navy, and U.S. Air Force academy teams are competing in an education-focused experiment designed to pave the way for future offensive and defensive swarm tactics for warfighters.

Small unmanned aerial vehicles (UAVs) and other robots have become increasingly affordable, capable, and available to both the U.S. military and adversaries alike. Enabling UAVs and similar assets to perform useful tasks under human supervision—that is, carrying out swarm tactics in concert with human teammates—holds tremendous promise to extend the advantages U.S. warfighters have in field operations. A persistent challenge in achieving this capability, however, has been scalability: enabling one operator to oversee multiple robotic platforms and have them perform highly autonomous behaviors without direct teleoperation.

To help make effective swarm tactics a reality, DARPA created the Service Academies Swarm Challenge, a collaboration between the Agency and the three U.S. military Service academies—the U.S. Military Academy, the U.S. Naval Academy, and the U.S. Air Force Academy. An experiment at its heart, the research effort is designed to encourage students to develop innovative offensive and defensive tactics for swarms of small UAVs. Today the effort started its three-day Live-Fly Competition at Camp Roberts, a California Army National Guard post north of Paso Robles, Calif., which is hosting more than 40 Cadets and Midshipmen to demonstrate the highly autonomous swarm tactics they have developed since work started in September.

“In less than eight months, you have shown yourselves to be dedicated and talented participants in a complex and timely research effort,” Timothy Chung, the DARPA program manager leading the Swarm Challenge, told the teams. “DARPA is proud to have you—our future warfighters and Service leaders—participating in this endeavor to explore offensive and defensive swarm tactics. Now is your chance to show each other, DARPA, and our invited Defense Department guests your precedent-setting work toward an important goal: helping future U.S. forces maintain superiority in tomorrow’s technological and mission environments.”

“Not to mention your chance to claim bragging rights over your rival academies,” he added.

Teams

The Service Academies Swarm Challenge is the most recent example of how DARPA works to ensure the technological superiority of U.S. military forces by periodically engaging the U.S. military Service academies in research-oriented competitions. These competitions aim to cultivate the great potential of young officers-to-be and encourage their career-long collaboration with DARPA. In both 2014 and 2015, for example, DARPA conducted the DARPA Service Academy Innovation Challenge and the Service Academies Cyber Stakes.

A fourth-year capstone design course, the Swarm Challenge has pushed students to achieve “zero to swarm in eight months.” The goal: help the academies go from having little swarm-related expertise to developing capabilities with potentially near-term applicability for operational training and fielding—all within one academic year.

“This is one of the first opportunities for the next generation of operators and tacticians to explore and understand swarm-versus-swarm interactions,” Chung said. “It's not just about the platforms or the links or the communications—it’s about behaviors. That's one of the key takeaways for the Service Academies Swarm Challenge: that we're really zeroing in on swarm tactics as a battle skill. That's where advances will reap innovative benefits for future warfighters.”

Ranging in size from 11 to 21 students, the teams bring cross-disciplinary expertise in diverse technical and nontechnical fields, from computer science, robotics, and systems engineering to military strategy and operations. The students also bring fresh insights that DARPA and the U.S. military can learn from—and potentially expand upon—to enhance the tactical effectiveness of swarm systems.

DARPA provided all the hardware, much of the software, and a lot of know-how to get the teams started. The Agency also developed new support infrastructure to enable the teams to practice and compete in a virtual environment in preparation for this week. DARPA initially provided some example swarm tactics and the teams have since designed a number of their own to debut at the Live-Fly Competition.

Gameplay

The Service Academies Swarm Challenge is testing cutting-edge swarm tactics through a time-honored game that is all about tactics: Capture the Flag. Two teams at a time play inside the Battle Cube, a cubic airspace 500 meters on a side, 78 meters above the ground. Each team has been given 20 fixed-wing UAVs and 20 quad-rotor UAVs and, under the rules of play, can field a mixed fleet of up to 25 UAVs for each of two 30-minute battle rounds. Each team protects its “flag” (a large, inflatable ground target) while trying to score the most points before time runs out.

Teams seek to score the most points in the following ways:

  • Air-to-air “tags” by using a simulated (virtual) weapon to hit a sensor on an opponent’s UAV in flight
  • Air-to-ground “tags” by physically landing a UAV on the opponent’s “flag” located on the ground
  • Accomplishments in swarm logistics by launching as many UAVs as quickly as possible and keeping them aloft as long as possible
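The three scoring routes above can be sketched as a simple tally. Note that the announcement does not give actual point values, so the weights below are invented placeholders, not DARPA's real scoring rules:

```python
# Hypothetical score tally for the Capture the Flag rounds described above.
# The point weights are assumed for illustration only.
AIR_TO_AIR_POINTS = 2        # simulated-weapon "tag" on an opponent's UAV (assumed)
AIR_TO_GROUND_POINTS = 10    # landing a UAV on the opponent's flag (assumed)
LOGISTICS_POINTS_PER_MIN = 1 # per UAV-minute kept aloft (assumed)

def round_score(air_tags, ground_tags, uav_minutes_aloft):
    """Total a team's points for one 30-minute battle round."""
    return (air_tags * AIR_TO_AIR_POINTS
            + ground_tags * AIR_TO_GROUND_POINTS
            + uav_minutes_aloft * LOGISTICS_POINTS_PER_MIN)
```

Under these assumed weights, a team with 3 air-to-air tags, 1 flag landing, and 120 UAV-minutes aloft would total 3×2 + 10 + 120 = 136 points.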

As scheduled over the next three days, the first match pits Air Force against Army; the second, Army against Navy; and the third, Navy against Air Force. The team that wins both of its matches wins the competition and takes home a trophy and bragging rights.

Technical Challenges

While people often think about swarms as simply being large collections of robots, swarms in fact have five defining characteristics: number, agent complexity, collective complexity, heterogeneity, and human-swarm interaction. The Service Academies Swarm Challenge is designed to explore these characteristics as they apply to both offensive and defensive swarm tactics, and to harness and leverage them for warfighter benefit.

Operationalizing swarm-related capabilities poses many significant challenges, from technology (e.g., battery size and life, user interfaces) to logistics (e.g., transport, upkeep) to understanding the limitations of what swarms can do. The Swarm Challenge provides an experimental testbed that manages these challenges so the academy students can focus on creating innovative swarm tactics.

DARPA’s interest in developing breakthrough swarm capabilities for national security extends beyond the Swarm Challenge to a number of current programs exploring autonomy, communications, and other enabling technologies.

Looking Ahead

The results of the Live-Fly Competition remain to be seen but the Service Academies Swarm Challenge has already planted the seeds for future growth in the increasingly important domain of swarm tactics. Thanks to the past eight months of work, all three academies now have basic curricula and frameworks for conducting unique, accelerated research and field experiments on swarm tactics. The institutions also have dozens of students who have worked side-by-side with leading-edge researchers and operators experienced with advancing the state of knowledge of swarm systems.

“The Cadets and Midshipmen participating in the Service Academies Swarm Challenge will soon be officers, where they can further hone their swarm-tactics skills and share their know-how with fellow Service members,” Chung said. “Building on the lessons they have learned, they can help accelerate us toward a near future in which Service members are able to quickly take advantage of anticipated advances in unmanned aerial system technologies and apply their skills in new and ever more creative ways. No matter which academy comes out on top here, U.S. warfighters will be the ultimate winners.”

Read more…
3D Robotics

OpenMV is my favorite computer vision camera: it combines a built-in OpenCV-like machine vision library with a MicroPython interface, all on one little board for $65. It also speaks MAVLink, so Patrick Poirier was able to do the above by adding just these extra lines to the standard built-in color-tracking example:

while True:
    clock.tick()
    img = sensor.snapshot()
    # Find color blobs matching the selected threshold, then mark and report each one
    for blob in img.find_blobs([thresholds[threshold_index]], pixels_threshold=100, area_threshold=20, merge=True):
        img.draw_rectangle(blob.rect())
        img.draw_cross(blob.cx(), blob.cy())
        send_landing_target_packet(blob, img.width(), img.height())
        print("TRACK %f %f" % (blob.cx(), blob.cy()))

and then send the result to Pixhawk via MAVLink:


def send_landing_target_packet(blob, w, h):
    global packet_sequence
    # Pack a MAVLink LANDING_TARGET payload: uint64 timestamp, five floats
    # (angle_x, angle_y, distance, size_x, size_y), then two single bytes
    # (target number and frame). The blob's pixel offset from the image
    # center is scaled by the camera's field of view to get the angles.
    temp = struct.pack("<qfffffbb",
                       0,
                       (((blob.cx() / w) - 0.5) * h_fov) / 2.25,
                       (((blob.cy() / h) - 0.5) * v_fov) / 1.65,
                       0,  # int(z_to_mm(tag.z_translation(), tag_size) / -10),
                       0.0,
                       0.0,
                       0,
                       MAV_LANDING_TARGET_frame)

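As a sanity check on the payload layout, the format string `"<qfffffbb"` packs the standard MAVLink LANDING_TARGET fields (an 8-byte timestamp, five floats, and two single bytes) into 30 bytes with no padding. A minimal desktop sketch, with illustrative field values rather than real sensor data:

```python
import struct

# Pack an example LANDING_TARGET payload the same way the OpenMV script does:
# uint64 time_usec, five floats (angle_x, angle_y, distance, size_x, size_y),
# then two single bytes (target_num, frame). '<' means little-endian, no padding.
payload = struct.pack("<qfffffbb",
                      0,         # time_usec (unused here)
                      0.10,      # angle_x: horizontal offset to target, radians
                      -0.05,     # angle_y: vertical offset, radians
                      0.0,       # distance (not measured in this example)
                      0.0, 0.0,  # size_x, size_y (unused)
                      0,         # target_num
                      8)         # frame (illustrative value)

print(len(payload))  # 8 + 5*4 + 2 = 30 bytes
```

On the OpenMV board itself, this payload would still need to be wrapped in a MAVLink frame (header, sequence number, checksum) and written to the UART, steps the excerpt above omits.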
Read more…
3D Robotics

DIY Drones at 84,000 members!

It's customary and traditional that we celebrate the addition of every 1,000 new members here and share the traffic stats. We've now passed 84,000 members!

Thanks as always to all the community members who make this growth possible, and especially to the administrators and moderators who approve new members, blog posts and otherwise respond to questions and keep the website running smoothly.

Read more…
3D Robotics

Our sister site, DIY Robocars, is running a joint outdoor event with Self Racing Cars at the Thunderhill Raceway in Willows, CA. You can sign up here. We're going to be running a course for smaller autonomous cars (~1/10th scale), which can use cameras, GPS, LIDAR and other sensors to navigate the course. As with our monthly Oakland events, the track for 1/10th scale cars will have white lines and obstacles. Unlike our indoor events, GPS can be used. Course dimensions and positions will be posted closer to the event.

600_459512473.jpeg

Power and WiFi will be provided. We have some tables, chairs, and shade, but not enough for everyone, so if you have a folding table, chairs, or shade of some sort, please bring them.

Schedule:

Sat 4/1: 

• Gates open at 9:00am for setup. Testing and training until 12:00

• 12:00 - 1:00pm: Lunch break

• 1:15 - 3:00pm: Competitive rounds

• 3:00 - 5:00pm: Hack, test, prepare for tomorrow

Sun 4/2:

• Gates open at 9:00am

• 10:00am - 12:00: Competitive rounds

• 12:00 - 1:00pm: Lunch break

• 1:00 - 3:00: Hack and test

Here's the announcement for the overall event:

Following the success of the inaugural event last year, Self Racing Cars returns with a new track day designed to push autonomous cars towards true race-ready shape. We're expecting more cars, higher speeds, and some very clever engineering. We will be running a testing and competition event at Thunderhill West on April 1-2, 2017.

Testing and Demoing

Please apply to attend if you have any of the following things to test or demo:

• Autonomous vehicles

• Drivetrain innovation (electric or otherwise)

• Sensors

• Cameras

• Software and Algorithms

• Teleoperation

• Connected Cars (v2v or v2i)

Independent and hobbyist teams are especially welcome. Smaller vehicles and robots can be tested and demoed in the paddock.

We can help match you with a driver and vehicle if you are testing systems, especially sensors. 

Competition

We will be running timing and scoring for the best autonomous laps of the full-sized 2.1-mile track. Any vehicle that can make the full lap is welcome to attempt it. Awards will be given for top performers in each class, with competition classes (e.g. "full-sized vehicle", "gas-powered go-kart", "teleoperated", etc.) to be determined as the event unfolds. Comma.ai will be sponsoring a prize for "fastest full track by comma neo vehicle" -- details to follow.

Udacity will be sponsoring a prize for “fastest full track by an independent or small company team” and “fastest 1/10th scale lap” -- details to follow.

If you are interested in sponsoring a challenge, please contact us at contact@selfracingcars.com.

We are currently working on a simple vehicle build based on a CIK competition go-kart so that there is a starting point for a future competition class. We are also working on a simple radio flag system. Both of these will be documented as they develop.

Data

As with our last event, we will be sharing data collected with various sensors and systems. Developers building on your systems will be able to see what the output looks like in a real-world yet controlled situation.

Confirmed Attendees

• Renovo Motors (Vehicle Platform for Autonomous/Connected/Electric)

• Comma.ai (Autonomous Vehicles)

• Point One Navigation (Spatial Localization for Autonomous Cars)

• Righthook (Vehicle simulation and testing)

• Revl (Action Camera)

• AutonomouStuff (Autonomous Hardware and Development Platform)

• Swift Navigation, Inc. (GPS for Autonomous Vehicles)

• PolySync (Autonomous Vehicle Software Platform)

• Xsens (IMU, AHRS, GNSS/INS)

• Udacity (Autonomous Vehicle Education)

• Compound Eye (Vision)

• NVIDIA (Self-Driving Car Platforms)

• Velodyne LIDAR (Sensing)

• CivilMaps (Cognition and Autonomous Mapping)

• "Right Turn Clyde" (Autonomous GoKart)

• Vector AI (Deep Learning Intelligence Vehicle)

• Sanborn (HD Maps for Autonomous Driving)

Read more…
3D Robotics

I love the OpenMV board, a $65 open source computer vision module that combines a camera and powerful M7 processor with a nice built-in MicroPython interpreter and OpenCV-like computer vision library. It's also got a fantastic IDE that makes using it the simplest computer vision experience I've ever tried. You can see in the video below how I used the board all by itself (it can also control servos) to drive an autonomous car to follow a track:

Because the board can be easily scripted in Python, it can be configured to be pretty much any computer vision sensor you want it to be, from optical flow and line detection to full-on object recognition and tracking. Even better, it can now instantiate as a MAVLink sensor, so you can add it to any MAVLink-compatible autopilot as a sensor. 

Here's the code for making it a MAVLink optical flow sensor. 

And here's the code for making it a MAVLink April Tag sensor. April tags are the best way to recognize codes from afar with computer vision. Demo below and more info here:

Read more…
3D Robotics

RaspberryPi hits 12.5m boards sold

3689713015?profile=original

From The Verge:

The official Raspberry Pi magazine just announced that over 12.5 million of the affordable little Linux boards have been sold since the original Pi was launched in 2012. As The MagPi points out, this puts the Raspberry Pi past Commodore 64 sales, according to some estimates. That would make the Pi the third best-selling "general purpose computer" ever, behind Apple Macintosh and Microsoft Windows PCs. As commenters have pointed out, this isn't a precisely fair comparison, because there were other Commodore models than just the 64, but it's still a nice milestone all the same.

It's interesting that the best-selling model is the latest, the RaspberryPi 3, which is in my mind the first one that really reached the potential of a single-board Linux computer (built-in WiFi and decent processing power). So that bodes well for continued accelerating growth.

By comparison, as of four years ago, the Arduino team had estimated that there were 1.4m Arduinos and clones sold, and presumably it's several times that now (Arduino was launched in 2007, five years earlier than Raspberry Pi). The two don't really compete -- Arduino is best for physical computing and interfacing with sensors, while RPi is best for video and running Python code fast -- but I do find that I almost never use Arduinos anymore. There are enough good RPi "hats" that can read and control other hardware that I rarely find the need to add an Arduino into projects now.

Read more…
3D Robotics

Nice to see a 3DR Iris+ there, but I'm slightly afraid it got totally destroyed afterward ;-) Check out DroneClash for more mayhem. Looks fun!

In a real life video game of sorts, four teams will battle simultaneously. They can use as many drones as they like, but each team is only allowed one FPV video stream to their drones. This means, in practice, only one drone can be tele-operated per team. But, teams may switch between drones or create autonomous drones, and anything in between. As long as it flies.

Read more…
3D Robotics

There's much to like in the new Beaglebone Blue board. Along with the usual Beaglebone processing power and Linux support, it has built-in IMU sensors, good WiFi, proper connectors and (for rovers) motor drivers and wheel encoders. It's also just $79, and supports ArduPilot out of the box!

But before you toss your RaspberryPi 3/Navio 2 combo (which is awesome but costs more than twice as much), here are a few things to keep in mind. 

  • No built-in GPS (not a big deal, since outboard GPS is the standard)
  • Not really designed for video: there's no HDMI in or out, and no GPU. You can connect a USB camera, but it's not as well supported as video is on the RaspberryPi 3
  • Community support for Beaglebone is not as strong as RaspberryPi

That said, for ground rovers I think it's a very interesting choice because of the built-in motor driver and encoders. I'm going to give one a shot. 

Read more…
3D Robotics

3689712741?profile=original

Some highlights from the testimony before Congress yesterday by Brendan Schulman of DJI. I think he's talking about us!

3689712750?profile=original

Summary from Dronelife:

Schulman’s spoken testimony asked for consideration of the increasing proliferation of state and local laws, many of which limit commercial drone operations. His written testimony went further, asking legislators to reconsider a “micro” category, and asking that lawmakers move forward on expanding regular BVLOS flight, flight over people, and flight at night. “Research and rulemaking priorities should focus on rules that enable the broadest range of beneficial applications that can be achieved within today’s technological capabilities. For example, a rule for routine part 107 night operations would enable search and rescue operations during critical hours when time is of the essence. That’s just one example of an immediate benefit that can be realized through nothing more than rulemaking,” writes Schulman. “Delays in the rulemaking process will have a negative economic impact, and curtail public safety operations, including those that save lives.”

Schulman points out that while technology features such as geofencing offer safety options, they aren’t designed to be used as a sledgehammer solution.  “The notion that airports and drones never mix is an oversimplification,” he points out.  “We have many customers doing important work at airports, enhancing the safety of the national airspace system. Similarly, our live geofencing system can help prevent drones from entering wildfire locations, but we also know that firefighters are using our drones to keep themselves safe and to help fight fires. Completely disabling a drone in such locations would result in a net detriment to public safety.”

“Here is the lesson we have learned that only comes with actual operational experience across hundreds of thousands of customers: while geofencing is a great feature that helps prevent inadvertent operations, it will always require a balanced approach involving exceptions. Requiring drones to simply turn off when they are near airports is not the right solution to safety concerns.”

Schulman’s testimony points out that identifying a single technology in rulemaking is a mistake, locking the industry into a soon to be outdated solution.  “…locking in any specific technology mandate will discourage DJI and our colleagues in the industry from continuing to rapidly develop new safety technologies,” he writes.   “A requirement to implement the best technology available today discourages manufacturers from developing the even better safety technologies of tomorrow.”

Finally, Schulman emphasized that recreational operators should not be forgotten in the rush to support commercial applications.  “Of key importance to the future of innovation in our industry is maintaining a pathway for people to experiment with technology on a personal level,” he points out.  “Today’s hobbyist is tomorrow’s inventor, and tomorrow’s inventor is next year’s technology industry pioneer.”

Read more…
3D Robotics

From The Verge:

Of all of the weird and wonderful ways nature creates resilient bodies, perhaps the most underappreciated is simply being so squidgy your problems just bounce off you. Well, a group of biomimicry researchers from Swiss university EPFL are taking note, and have created a soft, crash-proof drone modeled after the flexible bodies of insects.

According to IEEE Spectrum, the scientists were inspired by a unique joint found in wasp wings. These joints allow the creatures’ wings to stay stiff during flight, but “reversibly crumple” in the case of a crash — saving the thin membranes from tearing. The EPFL researchers didn’t copy this system exactly for their quadcopters, but paired a flexible external frame with a magnetic coupling system that joins the frame to the central body. In the event of a crash the body pops out of the frame before clicking back into place.

The foam arms of the drone are attached via a magnetic joint to the central body. Image: EPFL

“The uniqueness of the proposed design lies in the fact that the frame is rigid during flight, but softens during collisions,” wrote the researchers in a paper published in the journal IEEE Robotics and Automation Letters. “This allows combination of the advantages of both rigid and soft systems: stability and rapid response to user commands during flight [and] crash resilience like a soft system.”

The test drone withstood 50 crashes with no permanent damage. The researchers say one main benefit of their system (apart from durability) is keeping humans safe from drone crashes. 

Read more…
3D Robotics

From GPS World:

3DroneMapping completed a project under tight time and space constraints — surveying a tiny tropical island without disturbing guests.

The 15-hectare island nine kilometers from the Zanzibar coast is isolated from the rest of the world. Surrounded by coral reefs and sandbars, the island is home to an exclusive resort, but its limited space is threatened by erosion from changing currents.

Developers are concerned that proposed structures will be washed out to sea in a few years. Because no plans or maps of the island have ever been drawn or surveyed, they felt it was important to provide scale and dimension to architects for a master plan.


The survey needed to include existing structures, pathways, major trees, visible services, high-tide marks, levels and contours. It needed to be done in a tight timespan, before the island closed for renovations in three months. Also, the survey could not disturb any guests.

Using a custom-built multi-rotor drone with a high-resolution camera allowed 3DroneMapping to obtain images with good detail but taken far enough from guests to not bother them. Control points were located strategically, in places not visible to the public.


Luke Wijnberg, CEO of 3DroneMapping, conducted the survey with the L1 Reach by Emlid. “Such a survey could not have been possible without drones and Reach kit,” Wijnberg wrote in a blog. “Using this technology kept the pricing low for the customer, kept time on the ground and disturbance to guests to a minimum and provided a very quick turnabout time.”


After fieldwork was completed, the photogrammetric process was a fairly simple affair, with 600 images collected and control added to the model. A high overlap and sidelap were required to obtain ground strikes between the vegetation.

The ground strikes were then extracted from the dense point cloud using specialized 3D point cloud editing and classification software. Other features were exported to a CAD program.

All files were handed to the client via an online GIS platform with AutoCAD files for the master planners.

Learn more about the project on the 3DroneMapping website.

Read more…