Chris Anderson's Posts (2718)

3D Robotics

The flying umbrella project

That's a tough controls problem! From Popular Science:

The sight of flying umbrellas, changing altitude with a fluttering rhythm, looks more like an animated Disney scene than graduate work by a student engineer.

“I wanted to push the envelope of coordinating drones in the sky,” says the project’s creator Alan Kwan, a student in MIT’s “ACT” (Art, Culture and Technology) program. He wanted his drones to act almost alive, “not like things to be controlled by an algorithm,” he says, “but flying creatures that take on a synchronous life.”

A Hong Kong native, Kwan, 25, has explored scientific art before. He won an award for his Beating Clock project, a reanimated pig heart that keeps time. He’s also exhibited numerous virtual reality projects, like terrifying simulations of mental hospitals and alien abductions. Perhaps the most memorable is Bad Trip, an explorable 3D dreamscape filled with scenes from Kwan’s daily life: ten hours of first-person “life logging” videos that he has been recording since November 2011.

Beating Clock (image courtesy of Alan Kwan). Kwan's interest in blurring the mechanical and the organic is visible in his Beating Clock project.

Coming off his virtual pursuits, Kwan wanted to explore a physical, real-world project. He found an opportunity in February 2015, when the University of Applied Arts in Vienna commissioned him to explore synesthetic works for their Digital Synesthesia Research Project (DSRP).

Traditional synesthesia is a rare neurological condition where sensory experiences overlap—a person might associate sounds with colors or words with flavors. Digital synesthesia, according to the DSRP’s website, is “an artistic research area which focuses on the possibilities of digital art to create translational and cross-modal sensory experiences and to provide synesthetic experiences for ‘non-synesthetes.’”

Kwan decided to create a synesthesia-like experience by flipping the audience’s perception and giving a typically inanimate object a life of its own: He would make umbrellas behave like jellyfish.

“I think flying robots really embody our fascination with unreal things flying in the sky,” Kwan says. The DSRP agreed—they accepted his proposal for umbrella jellies, and Kwan received additional funding from the MIT Council for the Arts.

The trouble started as soon as Kwan began building: the project was immediately beset by technical setbacks. Umbrellas are not particularly aerodynamic—neither are the jellyfish Kwan wanted them to imitate.

First, Kwan tried to rig umbrellas to commercial drones, but that method failed quickly. The drones Kwan wanted to use concentrated all their mass in the center of the flying body. This meant that the shaft of the umbrella had nowhere to go, and placing it in an off-centered position threw off the drone’s balance. He would have to build a drone from scratch.

Custom Drone (image courtesy of Alan Kwan). Kwan's design employs a custom drone that is center-mounted to the umbrella, with the battery serving as a stabilizing weight.

Kwan’s classmate, Bjorn Sparrman, helped him to fabricate the custom carbon fiber and metal components of the new drone. The pair also received input from interested peers and faculty.

The umbrella did not just need a holder—the custom drones needed to compensate for the weight of the umbrella, its aerodynamic drag, and its disruption of a free airstream to the drone’s propellers.

“It generates a lot of resistance,” Kwan says. “Airplanes and helicopters are much more aerodynamic. This is, however, a performance. It’s not about drones. It’s about umbrellas—the drones are a tool that we use.”

Sustained flight was becoming possible, but takeoff was not. Kwan developed a launching platform that would accommodate the awkwardly shaped aircraft and let it land more precisely.

Within three months, the drone was successfully airborne indoors. But it had not yet stood up to the rigors of open skies, and it did not yet move like a jellyfish. Kwan tried to devise a mechanical method for the umbrella to pulsate like the real thing. This idea literally would not fly: The servo motors necessary to power such a system would weigh the drones down too much.

Servo Control (image courtesy of Alan Kwan). Kwan's original concept for mimicking jellyfish used servo motors to open and close the umbrellas.

But as it turned out, the servos were not necessary—nothing was. When the drone’s propellers hit just the right speed, the rotors actually changed the air pressure above and below the umbrella. This pulled the umbrella open and shut, just slightly, which created a rippling jellyfish effect. This method is actually similar to the way real jellyfish move. Engineers have compared jellyfish propulsion to helicopter propulsion, finding that a helicopter’s thrust is much less efficient than the biological version.

By February 2016, Kwan’s flying umbrellas took off outdoors and unassisted. Synchronized to music, the bizarre concept achieves the artist’s goal: Technology has transformed a mechanical object into a living—and flying—creature from the sea.

Digital synesthesia is not the only goal of Kwan’s project. “People have this perception of drones as weapons,” he says, “and I’m trying to push this work in the direction of the poetic. I think that contrast is interesting.”

Read more…
3D Robotics

It's customary and traditional that we celebrate the addition of every 1,000 new members here and share the traffic stats. We've now passed 81,000 members!

This year our hosting network, Ning, has been through a lot, including horrible performance issues in the first part of the year that caused us to start looking for a replacement, and then the bankruptcy of their parent company, Mode Media. But now it's in new hands and we're cautiously optimistic that with proper focus and the resources to start addressing some long-time tech issues, it might still be able to turn the corner and remain a viable host for us. We'll give them another month or two to prove this, or we'll restart the process to migrate to a better platform.

Thanks as always to all the community members who make this growth possible, and especially to the administrators and moderators who approve new members, blog posts and otherwise respond to questions and keep the website running smoothly.

Read more…
3D Robotics

Airbus Cargo Drone using Dronecode/PX4 stack

Team prepares quarter-scale drone for takeoff.

From the Local Motors blog, a report from the first bench testing of the 1/4-scale prototype of the Airbus Cargo Drone Challenge winner, which will debut at the Commercial UAV Expo in Las Vegas next month. That's me at right (with safety glasses).

Two months ago our co-creation community was drawing sketches on napkins to determine wing taper ratios and materials for what would ultimately become a finished cargo drone. The build has now progressed to the point that Local Motors and Airbus engineers have completed scale models and experimental test flights.

The nose cone, fuselage, tail boom and several other parts of the full-scale Zelator-28 are nearing completion.

Pre-epoxy full-scale build.

The experimental quarter-scale model took flight this past weekend in Henderson, Nev. Local Motors engineers and Jon Daniels of Praxis Aerospace Concepts International (PACI) teamed with Chris Anderson of 3D Robotics (3DR) on the integration and demonstration of his flight control system for vertical take-off and transitional forward flight, known as PX4. Anderson’s flight system served as an interface for the motors, speed controllers and servomechanisms.


The test model completed a hover and touchdown, but a few mechanical issues prevented forward flight. The experiment was very beneficial to the build team, however, as it helped identify potential failure modes before stepping up to the full scale aircraft.

“They are simple fixes for the most part. There was a hardware conflict with the servos and an issue with the motor pods in the quarter-scale model,” said Local Motors engineer Alex Palmer.  “[The quarter-scale model] has a different attachment design than the full scale so that issue will resolve itself.”

The team is now focused on completing two full-scale models of the Zelator-28 that should be ready for flight testing in the next two weeks. Airbus engineers are planning to join the team for the first flights of the prototype. There are still active discussions on the project page for the co-creation community to provide ideas and feedback pertaining to wing design and the landing gear. Make your mark on the present and future of unmanned aerial vehicles by getting in on the discussion.

Read more…
3D Robotics

demo gif

The blue line is what the model thinks it should do; the green line is what I actually did when steering it by hand.

From Hackaday, a great project showing how to create a self-driving R/C car that can follow a complex road pattern without human intervention. It uses TensorFlow running on an Intel processor onboard. Click through to read more about the importance of polarizing filters and how to implement TensorFlow.

Unexpectedly they have eschewed the many ARM-based boards, instead going for an Intel NUC mini-PC powered by a Core i5 as the brains of the unit. It’s powered by a laptop battery bank, and takes input from a webcam. Direction and throttle can be computed by the NUC and sent to an Arduino which handles the car control. There is also a radio control channel allowing the car to be switched from autonomous to human controlled to emergency stop modes.
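
The write-up doesn't include the control code, but the NUC-to-Arduino link is simple enough to sketch. Here's a minimal, hypothetical example using pyserial; the port name, baud rate, and the "steering,throttle" message format are assumptions for illustration, not the project's actual protocol.

import time
import serial  # pyserial

# Assumed serial device and baud rate for the Arduino that handles car control.
arduino = serial.Serial("/dev/ttyUSB0", 115200, timeout=1)
time.sleep(2)  # many Arduinos reset when the port opens; give the sketch time to boot

def send_command(steering, throttle):
    """Send one control frame; both values are normalized to [-1, 1]."""
    steering = max(-1.0, min(1.0, steering))
    throttle = max(-1.0, min(1.0, throttle))
    arduino.write("{:.3f},{:.3f}\n".format(steering, throttle).encode("ascii"))

# Example: gentle left turn at quarter throttle.
send_command(-0.3, 0.25)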

They go into detail on the polarizing and neutral density filters they used with their webcam, something that may make interesting reading for anyone interested in machine vision. All their code is open source, and can be found linked from their write-up. Meanwhile the video below the break shows their machine on their test circuit, completing it with varying levels of success.

Read more…
3D Robotics

Just a few years ago, this was PhD-level stuff at Stanford, and now you can do it at home for $100. These are awesome times for robotics. From Lukas Biewald at O'Reilly:

Object recognition is one of the most exciting areas in machine learning right now. Computers have been able to recognize objects like faces or cats reliably for quite a while, but recognizing arbitrary objects within a larger image has been the Holy Grail of artificial intelligence. Maybe the real surprise is that human brains recognize objects so well. We effortlessly convert photons bouncing off objects at slightly different frequencies into a spectacularly rich set of information about the world around us. Machine learning still struggles with these simple tasks, but in the past few years, it’s gotten much better.

Deep learning and a large public training data set called ImageNet have made an impressive amount of progress toward object recognition. TensorFlow is a well-known framework that makes it very easy to implement deep learning algorithms on a variety of architectures. TensorFlow is especially good at taking advantage of GPUs, which in turn are also very good at running deep learning algorithms.

Building my robot

I wanted to build a robot that could recognize objects. Years of experience building computer programs and doing test-driven development have turned me into a menace working on physical projects. In the real world, testing your buggy device can burn down your house, or at least fry your motor and force you to wait a couple of days for replacement parts to arrive.

Figure 1. Architecture of the object-recognizing robot. Image courtesy of Lukas Biewald.

The new third-generation Raspberry Pi is perfect for this kind of project. It costs $36 on Amazon.com and has WiFi, a quad core CPU, and a gigabyte of RAM. A $6 microSD card can load Raspbian, which is basically Debian. See Figure 1 for an overview of how all the components worked together, and see Figure 2 for a photo of the Pi.

Figure 2. Raspberry Pi running in my garage. Image courtesy of Lukas Biewald.

I love the cheap robot chassis that Sain Smart makes for around $11. The chassis turns by spinning the wheels at different speeds, which works surprisingly well (see Figure 3).

Figure 3. Robot chassis. Image courtesy of Lukas Biewald.

The one place I spent more money when cheaper options were available is the Adafruit motor hat (see Figure 4). The DC motors run at a higher current than the Raspberry Pi can provide, so a separate controller is necessary, and the Adafruit motor hat is super convenient. Using the motor hat required a tiny bit of soldering, but the hardware is extremely forgiving, and Adafruit provides a nice library and tutorial to control the motors over I2C. Initially, I used cheaper motor controllers, but I accidentally fried my Pi, so I decided to order a better quality replacement.

Figure 4. Raspberry Pi with motor hat and camera. Image courtesy of Lukas Biewald.
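
For reference, driving one of the DC motors through Adafruit's Python library looks roughly like this; a minimal sketch assuming the classic Adafruit_MotorHAT library at its default I2C address of 0x60, not the project's actual code.

import atexit
import time
from Adafruit_MotorHAT import Adafruit_MotorHAT

hat = Adafruit_MotorHAT(addr=0x60)  # default I2C address for the motor hat

def all_stop():
    # Release every motor channel so the robot doesn't keep driving after the script exits.
    for channel in range(1, 5):
        hat.getMotor(channel).run(Adafruit_MotorHAT.RELEASE)

atexit.register(all_stop)

left_wheel = hat.getMotor(1)                 # assumed channel for the left wheel
left_wheel.setSpeed(150)                     # PWM duty cycle, 0-255
left_wheel.run(Adafruit_MotorHAT.FORWARD)    # spin forward for one second, then coast
time.sleep(1)
left_wheel.run(Adafruit_MotorHAT.RELEASE)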

A $15 camera attaches right into the Raspberry Pi and provides a real-time video feed I can use to recognize objects. There are tons of awesome cameras available. I like the infrared cameras that offer night vision.

The Raspberry Pi needs about 2 amps of current, but 3 amps is safer with the speaker we’re going to plug into it. iPhone battery chargers work awesomely for this task. Small chargers don’t actually output enough amps and can cause problems, but the Lumsing power bank works great and costs $18.

A couple of HC-SR04 sonar sensors help the robot avoid crashing into things—you can buy five for $11.
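
Reading one of those sensors from the Pi's GPIO header takes only a few lines; a sketch assuming RPi.GPIO and arbitrary BCM pin numbers (your wiring will differ).

import time
import RPi.GPIO as GPIO

TRIG, ECHO = 23, 24  # assumed BCM pins for the sensor's trigger and echo lines

GPIO.setmode(GPIO.BCM)
GPIO.setup(TRIG, GPIO.OUT)
GPIO.setup(ECHO, GPIO.IN)

def read_distance_cm():
    """Fire one ping and convert the echo pulse width to centimeters."""
    GPIO.output(TRIG, True)
    time.sleep(0.00001)           # 10 microsecond trigger pulse
    GPIO.output(TRIG, False)
    pulse_start = pulse_end = time.time()
    while GPIO.input(ECHO) == 0:  # wait for the echo pulse to start
        pulse_start = time.time()
    while GPIO.input(ECHO) == 1:  # measure how long the echo pin stays high
        pulse_end = time.time()
    return (pulse_end - pulse_start) * 17150  # half the speed of sound, in cm per second

print("{:.1f} cm".format(read_distance_cm()))
GPIO.cleanup()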

I added the cheapest USB speakers I could find, and used a bunch of zip ties, hot glue, and foam board to keep everything together. As an added bonus, I cut up some of the packaging materials the electronics came with and drew on them to give the robots some personality. I should note here that I actually built two robots (see Figure 5) because I was experimenting with different chassis, cameras, sonar placement, software, and so forth, and ended up buying enough parts for two versions.

Figure 5. My 4WD robot (right) and his 2WD older sister. Image courtesy of Lukas Biewald.

Once the robot is assembled, it’s time to make it smart. There are a million tutorials for getting started with a Raspberry Pi online. If you’ve used Linux, everything should be very familiar.

For streaming the camera, the RPi Cam Web interface works great. It’s super configurable and by default puts the latest image from the camera in a RAM disk at /dev/shm/mjpeg/cam.jpg.

If you want to stream the camera data to a webpage (very useful for debugging), you can install Nginx, an extremely fast open source webserver/proxy. I configured Nginx to pass requests for the camera image directly to the file location and everything else to my webserver.

http {
    server {
        location / {
            proxy_pass http://unix:/home/pi/drive.sock;
        }
        location /cam.jpg {
            root /dev/shm/mjpeg;
        }
    }
}

I then built a simple Python webserver to spin the wheels of the robot based on keyboard commands that made for a nifty remote control car.
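
The server itself isn't listed in the article, but a stripped-down equivalent fits in a dozen lines. This sketch uses Flask (my choice for brevity) and the motor-hat setup from the earlier example; the routes, key bindings, and wheel channels are invented for illustration.

from flask import Flask
from Adafruit_MotorHAT import Adafruit_MotorHAT

app = Flask(__name__)
hat = Adafruit_MotorHAT(addr=0x60)

# Map a keyboard key (sent by the browser) to a motor action for both wheels.
ACTIONS = {
    "w": Adafruit_MotorHAT.FORWARD,
    "s": Adafruit_MotorHAT.BACKWARD,
    "x": Adafruit_MotorHAT.RELEASE,
}

@app.route("/key/<k>")
def key(k):
    action = ACTIONS.get(k, Adafruit_MotorHAT.RELEASE)
    for channel in (1, 2):          # assumed left and right wheel channels
        motor = hat.getMotor(channel)
        motor.setSpeed(150)
        motor.run(action)
    return "ok"

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8000)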

As a side note, it’s fun to play with the sonar and the driving system to build a car that can maneuver around obstacles.
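
Combining the sonar and motor sketches above gives a toy obstacle-avoidance loop. The threshold, timings, and channel assignments below are made-up values, and read_distance_cm() is the helper defined in the sonar sketch earlier; this is an illustration, not the author's code.

import time
from Adafruit_MotorHAT import Adafruit_MotorHAT

hat = Adafruit_MotorHAT(addr=0x60)
SAFE_DISTANCE_CM = 30  # arbitrary threshold; tune for your chassis and speed

def drive(left_dir, right_dir, speed=150):
    """Set both wheel channels (assumed 1 = left, 2 = right) in one call."""
    for channel, direction in ((1, left_dir), (2, right_dir)):
        motor = hat.getMotor(channel)
        motor.setSpeed(speed)
        motor.run(direction)

while True:
    # read_distance_cm() comes from the HC-SR04 sketch above.
    if read_distance_cm() < SAFE_DISTANCE_CM:
        # Something is close: back up briefly, then pivot away before continuing.
        drive(Adafruit_MotorHAT.BACKWARD, Adafruit_MotorHAT.BACKWARD)
        time.sleep(0.5)
        drive(Adafruit_MotorHAT.FORWARD, Adafruit_MotorHAT.BACKWARD)
        time.sleep(0.3)
    else:
        drive(Adafruit_MotorHAT.FORWARD, Adafruit_MotorHAT.FORWARD)
    time.sleep(0.05)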

Programming my robot

Finally, it’s time to install TensorFlow. There are a couple of ways to do the installation, but TensorFlow actually comes with a makefile that lets you build it right on the system. The steps take a few hours and have quite a few dependencies, but they worked great for me.

TensorFlow comes with a prebuilt model called “inception” that performs object recognition. You can follow the tutorial to get it running.

Running tensorflow/contrib/pi_examples/label_image/gen/bin/label_image on an image from the camera will output the top five guesses. The model works surprisingly well on a wide range of inputs, but it’s clearly missing an accurate “prior,” or a sense of what things it’s likely to see, and there are quite a lot of objects missing from the training data. For example, it consistently recognizes my laptop, even at funny angles, but if I point it at my basket of loose wires it consistently decides that it’s looking at a toaster. If the camera is blocked and it gets a dark or blurry image it usually decides that it’s looking at nematodes—clearly an artifact of the data it was trained on.

Figure 6. Robot plugged into my keyboard and monitor. Image courtesy of Lukas Biewald.

Finally, I connected the output to the Flite open source software package that does text to speech, so the robot can tell everyone what it’s seeing (see Figure 6).
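
Gluing those pieces together can be as simple as a subprocess pipeline. This is a sketch, not Biewald's code: it assumes the label_image binary path from above, the RPi Cam Web interface's default /dev/shm/mjpeg/cam.jpg frame, and that flite is on the PATH; the output parsing in particular may need adjusting to the binary's real log format.

import subprocess

LABEL_IMAGE = "tensorflow/contrib/pi_examples/label_image/gen/bin/label_image"
CAMERA_JPEG = "/dev/shm/mjpeg/cam.jpg"  # latest frame written by the RPi Cam Web interface

# Run the Inception model on the newest camera frame and capture everything it prints.
result = subprocess.run(
    [LABEL_IMAGE, "--image=" + CAMERA_JPEG],
    capture_output=True,
    text=True,
)
output = result.stdout + result.stderr  # the binary logs its guesses; grab both streams

# Naive parse: take the first non-empty line as the top guess (adjust to the real format).
lines = [line.strip() for line in output.splitlines() if line.strip()]
top_guess = lines[0] if lines else "nothing"

# Speak the result through the USB speakers with Flite.
subprocess.run(["flite", "-t", "I think I see " + top_guess])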

Testing my robot

Here are my two homemade robots running deep learning to do object recognition.

Final thoughts

From 2003 to 2005, I worked in the Stanford Robotics lab, where the robots cost hundreds of thousands of dollars and couldn’t perform object recognition nearly as well as my robots. I’m excited to put this software on my drone and never have to look for my keys again.

I’d also like to acknowledge all the people that helped with this fun project. My neighbors, Chris Van Dyke and Shruti Gandhi, helped give the robot a friendly personality. My friend, Ed McCullough, dramatically improved the hardware design and taught me the value of hot glue and foam board. Pete Warden, who works at Google, helped get TensorFlow compiling properly on the Raspberry Pi and provided amazing customer support.

Read more…
3D Robotics

From The Verge:

Verizon will soon start offering data plans for drones, allowing pilots to connect their devices to the telecom's network to stream video and transmit other data to a computer or smartphone on the ground. The plans will start at 1GB for $25 a month and 10GB for $80 a month, according to The Wall Street Journal. While media transmission is one obvious use case, having working LTE mobile chips on drones could also in the future help companies pilot them remotely. Verizon says it expects commercial operations to be particularly interested in its drone data plans, including industries like energy, agriculture, and nature and wildlife preservation. Verizon will also test drones as aerial cell towers to patch holes in its network coverage.

The company is calling its new drone business Airborne LTE Operations (ALO) and hopes unmanned aerial vehicles could provide a revenue stream when FAA regulations are more lenient and piloting becomes a more mainstream commercial and hobbyist activity.

AT&T is also gearing up to test drones on its LTE 4G network, which are powered by Qualcomm's Snapdragon Flight platform

Read more…
3D Robotics

What DIY Drones was a half-decade ago, DIY autonomous cars are now. I'm part of the Self Racing Cars group in the SF Bay Area, which is building and racing autonomous vehicles from 1/10th scale to go-karts to full-size cars (the Ford Fusion is the go-to vehicle, since it's the cheapest car that's fully drive-by-wire and can easily be made autonomous).

Above is the autonomous go-kart that Autodesk CEO Carl Bass and I built, with autonomy provided by a Pixhawk and the APM:Rover code, plus custom computer vision processing on a Raspberry Pi. (The gorilla is for ballast -- and looks.)
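
For the curious, one common way for a companion computer like the Raspberry Pi to steer a Pixhawk running APM:Rover is MAVLink RC-channel overrides. The pymavlink sketch below is a hypothetical illustration of that pattern; the serial device, baud rate, and channel mapping are all assumptions, and it isn't the kart's actual code.

from pymavlink import mavutil

# Connect to the Pixhawk over the Pi's serial port (device and baud rate assumed).
master = mavutil.mavlink_connection("/dev/ttyAMA0", baud=57600)
master.wait_heartbeat()  # block until the autopilot is heard from

def steer(steering_pwm, throttle_pwm):
    """Override RC channel 1 (steering) and channel 3 (throttle) with PWM values in microseconds."""
    channels = [0] * 8            # 0 leaves a channel under normal RC control
    channels[0] = steering_pwm
    channels[2] = throttle_pwm
    master.mav.rc_channels_override_send(
        master.target_system, master.target_component, *channels)

# Example: centered steering, gentle forward throttle.
steer(1500, 1600)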

If you're in the Bay Area, you can join the Meetup group here. First hackathon/meetup is on Oct 30 in Berkeley. 

Read more…
3D Robotics

A drone with insect-inspired folding wings

From Robohub:

When designing robots to help in the search for victims after a natural disaster, a number of features are important: robustness, long battery life and ease of transport. With this latest constraint in mind, a team from Floreano Lab, EPFL and NCCR Robotics will present their new drone with insect-inspired folding wings at IROS 2016.

What makes these wings different from previous solutions is the origami technique used to produce them, creating the perfect folding structure. First, the research team looked for examples from nature that exhibit folding patterns with a high size reduction and a single degree of freedom, so the wing can be folded with one simple, intuitive movement in a short amount of time. Coleopterans (beetles) were found not only to have the perfect wings, but also to control wing deployment from the base of the wing, making them easier to artificially replicate.

Through prototyping and modelling, the original coleopteran blueprints were adapted and updated. The artificial crease pattern achieves a significant size reduction. In the stowed configuration the wingspan is 43% and the surface is 26% of the respective dimensions in the deployed configuration. Despite the complexity of the patterns, the wing has a single degree of freedom and can be folded using only one simple movement.

Finding the crease pattern was only one of the issues the research team hoped to solve. When using paper for origami, the thickness is negligible; when creating a wing, however, a thicker material must be employed in order to sustain the stresses created during flight. The thicker material is accounted for by creating a 3D folding pattern with tiles of different thickness. The addition of compliant and bistable folds made of pre-stretched latex ensures maximum durability and a smooth deployment.

The presented wings weigh 26 g, with dimensions of 115 x 215 x 40 mm when folded and 200 x 500 x 16 mm when deployed, giving 160 cm² of surface area and 989 cm³ of volume when folded, and 620 cm² of surface area and 1600 cm³ when deployed. The resulting drone has been tested against a comparable rigid wing in a wind tunnel, and showed only marginally lower lift/drag performance.

The ability to create a lightweight, durable drone that is capable of being easily transported and quickly deployed moves us not only closer to commonplace use of robots in locating victims after natural disasters, but also in land and space exploration, aeronautics and civil inspections.

Read more…
3D Robotics

Open source Iridium modem

From sUASNews:

If over-the-horizon satellite communications are a requirement, you are in luck: an Outback Challenge team from Holland has just helped you out.
To participate in the https://uavchallenge.org/medical-express/ competition, the http://mavlab.tudelft.nl/ team had to design a cheap, lightweight Iridium satellite communication module for their www.delftAcopter.nl.
Because there is huge interest in the UAV community in lightweight, reliable satellite communication, the http://mavlab.tudelft.nl/ team has made the Iridium development open source so the UAV community can benefit from this achievement.
For the open-source hardware and software, see:

Read more…
3D Robotics

From IEEE Spectrum:

Just a few weeks ago, we posted about some incredible research from Vijay Kumar’s lab at the University of Pennsylvania getting quadrotors to zip through narrow gaps using only onboard localization. This is a big deal, because it means that drones are getting closer to being able to aggressively avoid obstacles without depending on external localization systems. The one little asterisk to this research was that the quadrotors were provided the location and orientation of the gap in advance, rather than having to figure it out for themselves.

Yesterday, Davide Falanga, Elias Mueggler, Matthias Faessler, and Professor Davide Scaramuzza, who leads the Robotics and Perception Group at the University of Zurich, shared some research that they’ve just submitted to ICRA 2017. It’s the same kind of aggressive quadrotor maneuvering, except absolutely everything is done on board, including obstacle perception. It doesn’t get any more autonomous than this.

Let’s be clear about this: to autonomously fly through these gaps (which are only 1.5 times the size of the robot, with a mere 10 centimeters of clearance on each side), the quadrotor is using a 752 x 480-pixel monochrome camera with a 180-degree field-of-view lens, and a PX4FMU autopilot with an IMU and a smartphone-class single-board Odroid XU4 computer running ROS. That’s it. After passing through the gap, the quadrotor also uses a downward-facing distance sensor and camera to stabilize itself. All of the sensing and computation is taken care of on board the quadrotor, meaning that you could do this exact same thing in your house.

Photo: Robotics and Perception Group at the University of Zurich. The quadrotor platform used in the experiments. (1) Onboard computer. (2) Forward-facing fisheye camera. (3) TeraRanger One distance sensor and (4) downward-facing camera, both used solely during the recovery phase. (5) PX4 autopilot. The motors are tilted by 15º to provide three times more yaw-control action, while only losing 3 percent of the collective thrust.

Most of this hardware is relatively standard stuff, although the overall platform is custom. The notable tweak is that the rotors have been tilted by 15 degrees, which triples the amount of yaw control without significantly affecting the available thrust. It’s critical to have powerful yaw control since the quadrotor reaches angular speeds of up to 400 degrees per second while approaching a gap. 

The actual process of flying through the window goes like this: first, the quadrotor locates the gap (of a known size*) with its onboard camera. It then computes a trajectory to pass through the gap that keeps the quadrotor as far as possible from the edges, which is what you want to do to not run into stuff, while also trying to keep the gap itself in view of the quadrotor’s camera as much as possible. This trajectory is very focused on gap traversal, so the starting point for it involves the quadrotor moving at high speed and possibly oriented a little bit sideways, so the system also needs to come up with a second trajectory that takes the robot from a stable hover into the gap traversal trajectory. Put these two trajectories together, and you’ve got your path through the gap.

Once the robot starts heading for the gap, it does its best to keep its camera pointed at the frame of the gap to continually update its state estimation relative to the space it’s trying to squeeze through and dynamically replan its trajectory as necessary. Assuming it makes it through successfully (which happens about 80 percent of the time), the final step is for the quadrotor to “catch” itself, recovering from whatever crazy speed and orientation it ends up in. It seems like this would be particularly tricky (and it is), but the researchers had already solved this problem in the context of stabilizing quadrotors that are thrown into the air to launch them.

Just in case you were thinking that this kind of thing is easy, the researchers asked two Swiss professional drone-racing pilots to give it a try:

Skills like this are very cool (for humans and robots alike), and it’s hard to overstate the importance of being able to run all sensing and computation on board the robot itself. This is a requirement for actually using these skills in a way that’s practical: it’s not just about getting drones to fly through windows, but more about teaching them to be able to reliably maneuver around all kinds of different obstacles, in any environment, from forests to urban areas to your living room.

For more details on this research, we spoke with Professor Scaramuzza:

IEEE Spectrum: What was the biggest challenge of this work, and how did you solve it?

Davide Scaramuzza: The biggest challenge was to couple perception and control, which are typically considered separately. Indeed, for the robot to localize with respect to the gap, a trajectory should be selected, which guarantees that the quadrotor always faces the gap and should be re-planned multiple times during its execution to cope with the varying uncertainty of the state estimate (the uncertainty increases quadratically with the distance from the gap) while respecting the vehicle dynamics. Furthermore, during the traverse, the quadrotor should maximize the distance from the edges of the gap to avoid collisions. It is not trivial to combine all these constraints into a single path-planning problem, since the set of "feasible trajectories" reduces significantly as the quadrotor approaches the gap. Also, the gap is no longer visible when the vehicle is close to the gap, which makes it necessary to execute the traverse without any visual feedback (i.e., blindly). 

We solved these problems using a two step approach. To traverse the gap, we compute a trajectory which can be executed blindly thanks to its short duration and to the fact that it requires precomputed constant inputs (namely, a collective thrust of given magnitude and zero angular velocities). To approach the gap, we use a trajectory generation method that allows us to evaluate a very large set of candidate trajectories; for each candidate trajectory, we compute the optimal vehicle orientation that allows the quadrotor to align its onboard camera as much as possible with the gap direction. Within a short amount of time then we select the best trajectory as the one that guarantees that the gap is always visible and the center of the gap is as close as possible to the center of the image. This method has the additional benefit of being extremely fast, allowing us to replan the approach trajectory during its execution to exploit the more accurate pose estimate we get when we are closer to the gap.

Can the drone fly through several gaps in a row? What are the constraints on its performance?

Yes, if we modify the approach to allow the drone to continue flying after the traverse rather than locking into a hovering position. The main constraint is the agility of the vehicle, which can be translated into constraints on its weight/inertia. Using a small and very agile quadrotor, it gets easier to stabilize after passing through a gap before immediately approaching another one. On the other hand, if the vehicle is heavy, recovering after such an aggressive maneuver takes longer before the drone is ready to fly again towards the next gap. To summarize: as the vehicle mass shrinks, we can reduce the distance between the gaps.

Can this research be applied to other kinds of high speed maneuvering, like avoiding tree branches, or urban obstacles like lamp posts?

Of course! Avoiding obstacles such as tree branches or lamp posts is our next challenge. These scenarios are conceptually similar and our approach can be adapted such that, besides being perception aware (i.e., the pole or tree always visible in the image), the distance from the pole is now minimized (to minimize the overall flight time) while ensuring that there is no collision.

How difficult will it be for commercial or recreational drones to take advantage of this research?

It will not be difficult. Hardware-wise, we only need an onboard camera, an IMU, and computer. Nowadays, almost any commercially-available drone has this hardware and with the right algorithms one can take advantage of our research to let autonomous quadrotors safely avoid obstacles or, as in our case, literally pass through them.

What are you working on next?

We are planning to extend our work in different directions. One is about passing through multiple gaps without any stop, instead allowing a smooth and fast behavior to traverse one gap after the other. Other possible extensions are passing through a swinging gap or passing through a static gap along with a suspended payload. Finally, also slalom among trees, poles, or other similar obstacles.

“Aggressive Quadrotor Flight through Narrow Gaps with Onboard Sensing and Computing,” by Davide Falanga, Elias Mueggler, Matthias Faessler and Davide Scaramuzza from the Robotics and Perception Group, University of Zurich, Switzerland, has been submitted to ICRA 2017 which will take place next May in Singapore.

[ RPG @ UZH ]

* So, I guess technically it could get slightly more autonomous than this.

Read more…
3D Robotics

From Hackaday: A history of drones

Out of respect for a great job, a verbatim history on drones from Hackaday:

In the early 1930s, Reginald Denny, an English actor living in Los Angeles, stumbled upon a young boy flying a rubber band-powered airplane. After attempting to help the boy by adjusting the rubber and control surfaces, the plane spun into the ground. Denny promised he would build another plane for the boy, and wrote to a New York model manufacturer for a kit. This first model airplane kit grew into his own hobby shop on Hollywood Boulevard, frequented by Jimmy Stewart and Henry Fonda.

The business blossomed into Radioplane Co. Inc., where Denny designed and built the first remote controlled military aircraft used by the United States. In 1944, Captain Ronald Reagan of the Army Air Forces’ Motion Picture unit wanted some film of these new flying targets and sent photographer David Conover to the Radioplane factory at the Van Nuys airport. There, Conover met Norma Jeane Dougherty and convinced her to go into modeling. She would later be known as Marilyn Monroe. The nexus of all American culture from 1930 to 1960 was a hobby shop that smelled of balsa sawdust and airplane glue. That hobby shop is now a 7-Eleven just off the 101 freeway.

Science historian James Burke had a wonderful TV show in the early 90s – Connections – where the previous paragraphs would be par for the course. Unfortunately, the timbre of public discourse has changed in the last twenty years and the worldwide revolution in communications allowing people to instantaneously exchange ideas has only led to people instantaneously exchanging opinions. The story of how the Dutch East India Company led to the rubber band led to Jimmy Stewart led to remote control led to Ronald Reagan led to Death of a Salesman has a modern fault: I’d have to use the word ‘drone’.

The word ‘propaganda’ only gained its negative connotation in the late 1930s – it’s now ‘public relations’. The phrase ‘global warming’ doesn’t work with idiots in winter, so now it’s called ‘climate change’. Likewise, quadcopter pilots don’t want anyone to think their flying machine can rain hellfire missiles down on a neighborhood, so ‘drone’ is verboten. The preferred term is quadcopters, tricopters, multicopters, flying wings, fixed-wing remote-controlled vehicles, unmanned aerial systems, or toys.

I’m slightly annoyed by this and by the reminder I kindly get in my inbox every time I use the dreaded d-word. The etymology of the word ‘drone’ has nothing to do with spying, firing missiles into hospitals, or illegally killing American civilians. People like to argue, though, and I need something to point to when someone complains about my misuse of the word ‘drone’. Instead of an article on Hollywood starlets, the first remote control systems, and model aviation, you get an article on the etymology of a word. You have no one else to blame but yourself, Internet.

An Introduction, and Why This Article Exists

This article is purely about the etymology of the word ‘drone’. Without exception, every article and blog post I read while researching this topic failed to consider whether an unmanned or remotely piloted aircraft was called a ‘drone’ before its maiden flight, or while it was being developed. For example, numerous articles refer to the Hewitt-Sperry Automatic Airplane as the first ‘drone’. For the purposes of this article, this is patently untrue. The word ‘drone’ was first applied to unmanned aircraft in late 1934 or early 1935, and a World War I-era experiment could never be considered a drone by contemporaneous sources. Consider this article a compendium of the evolution of the word ‘drone’ over time.

Why this article belongs on Hackaday should require no explanation. This is one of the Internet’s largest communities of grammar enthusiasts, peculiarly coming from a subculture where linguistic play (and exceptionally dry sarcasm) is encouraged. Truthfully, I am so very tired of hearing people complain about the use of the word ‘drone’ when referring to quadcopters and other remote-controlled toys. To me, this article simply exists as something I can point to when telling off offended quadrotor pilots. I am considering writing a bot to do this automatically. Perhaps I will call this bot a ‘drone’.

The Source of ‘Drone’ c. 1935

Before the word was used to describe aircraft, ‘drone’ had two meanings. First as a continuous low humming sound, and second as a male bee. The male bee does no work, gathers no honey, and only exists for the purpose of impregnating the queen. It’s not hard to see why ‘drone’ is the perfect word to describe a quadcopter — a Phantom is mindless, and sounds like a sack full of bees. Where then did the third definition of ‘drone’ come from, a flying machine without a pilot on board?

The most cited definition of ‘drone’ comes from a 2013 Wall Street Journal article [1] from linguist and lexicographer Ben Zimmer, tracing the first use of the word to 1935. In this year, US Admiral William H. Standley witnessed a British demonstration of the Royal Navy’s new remote-controlled aircraft for target practice. The aircraft used was based on the de Havilland Tiger Moth, a biplane trainer built in huge numbers during the interwar period, redesignated as the Queen Bee. The implication of Zimmer’s article is that the word ‘drone’ comes from the de Havilland Queen Bee. This etymology is repeated in a piece in the New York Times Magazine published just after World War II [2]:

Drones are not new; inventors were experimenting with them twenty-five years ago. Before the war, small specially built radio-controlled planes were used for anti-aircraft purposes – widely in England, where the name “drone” originated, less extensively here…. The form of radio control used in the experimental days was developed and refined so that it could be applied to nearly any type of conventional plane.

I found this obvious primary source for Ben Zimmer’s etymology of drone in five minutes, but it doesn’t tell anyone if the Queen Bee designation of a remote-controlled biplane came about from the word ‘drone’ or vice versa. This etymology doesn’t really give any information about the technical capabilities or the tactical use of these drones. The unmanned aircraft discussed in the New York Times article would be better called a cruise missile, not a drone. Was the Queen Bee an offensive drone, or was it merely a device built for target practice? These are questions that need to be answered if we’re going to tell the people flying Phantoms to buzz off with their drones.

The Queen Bee, with Churchill

Biology sometimes mirrors linguistics, and the best place to look for the history of ‘drone’, then, is to look into the history of the Queen Bee. The Queen Bee – not its original name – was born out of a British Air Ministry specification 18/33. At the time, the Air Ministry issued several specifications every year for different types of aircraft. The Supermarine Spitfire was originally known to the military as F.37/34; a fighter, based on the thirty-seventh specification published in 1934. Therefore, the specification for a ‘radio-controlled fleet gunnery target aircraft’ means the concept of what a ‘drone’ would be was defined in 1933. Drones, at least in the original sense of a military aircraft, are not offensive weapons. They’re target practice, with similar usage entering the US Navy in 1936, and the US Air Force in 1948. The question remains, did ‘drone’ come before the Queen Bee, or is it the other way around?

The first target drone was built between late 1933 and early 1935 at RAF Farnborough by combining the fuselage of the de Havilland Moth Major with the engine, wings, and control surfaces of the de Havilland Tiger Moth [3]. The aircraft was tested from an airbase, and later launched off the HMS Orion for target practice. Gunnery crews noticed a particularly strange effect. This aircraft never turned, never pitched or rolled, and never changed its throttle position: this aircraft droned. It made a loud, low hum as it passed overhead. Drones are named for the hum, and the Queen Bee is just a clever play on words.

The word ‘drone’ does not come from the de Havilland Queen Bee, because the Queen Bee was originally a de Havilland Moth Major and Tiger Moth. ‘Queen Bee’, in fact, comes from ‘drone’, and ‘drone’ comes from the buzzing sound of an airplane flying slowly overhead. There’s a slight refinement of the etymology for you: the Brits brought the bantz, and a de Havilland was deemed a drone.

A ‘Drone’ is for Target Practice, 1936-1959

The word ‘drone’ entered the US Navy’s lexicon in 1936 [4] shortly after US Admiral William H. Standley arrived back from Europe, having viewed a Queen Bee being shot at by gunners on the HMS Orion. This would be the beginning of the US Navy’s use of the phrase, a term that would not officially enter the US Army and US Air Force’s lexicon for another decade.

Beginning in 1922, the US Navy would use an aircraft designation system to signify the role and manufacturer of any aircraft in the fleet. For example, the fourth (4) fighter (F) delivered to the Navy built by Vought (U) was the F4U Corsair. The first patrol bomber (PB) delivered by Consolidated (Y) was the PBY Catalina. In this system, ‘Drone’ makes an appearance in 1936, but only as ‘TD’, target drone, an airplane designed to be shot at for target practice.

A QB-17 drone, similar to what was used in Operation Aphrodite, at Holloman AFB, 1959. Source: United States Air Force.

For nearly twenty years following the introduction of the word into military parlance, ‘drone’ meant only a remote controlled aircraft used for target practice. B-17 and PB4Y (B-24) bombers converted to remote control under Operation Aphrodite and Operation Anvil were referred to as ‘guided bombs’. Just a few years after World War II, quite possibly using the same personnel and the same radio control technology that was developed during Operation Aphrodite, war surplus B-17s would be repurposed for use as target practice, where they would be called target drones. Obviously, ‘drone’ meant only target practice until the late 1950s.

If you’re looking for a proper etymology and definition of the most modern sense of the word ‘drone’, there you have it. It’s a remote-controlled plane designed for target practice. For the quadcopter pilots who dabble in lexicography, have an interest in linguistic purity, and are utterly offended by calling their flying camera platform a ‘drone’, there’s the evidence. A ‘drone’ has nothing to do with firing weapons down on a population or spying on civilians from forty thousand feet. In the original sense of the word, a drone is simply a remote-controlled aircraft designed to be shot at.

Language changes, though, and to successfully defend against all critics of my use of the word ‘drone’ as applying to all remote controlled aircraft, I’ll have to trace the usage of the word drone up to modern times.

The Changing Definition of ‘Drone’, 1960-1965

A word used for a quarter century will undoubtedly gain a few more definitions, and in the early 1960s, the definition of ‘drone’ was expanding from an aerial target used by British forces in World War II to a word that could be retroactively applied to the German V-1, an aerial target used by the British forces in World War II.

The next evolution of the word ‘drone’ can be found in the New York Times, November 19, 1964 edition [5], again from Pulitzer Prize-winning author Hanson W. Baldwin. Surely the first reporter on the ‘drone’ beat has more to add to the linguistic history of the word. In the twenty years that passed since Mr. Baldwin introduced the public to the word ‘drone’, a few more capabilities have been added to these unmanned aircraft:

Drone, or unpiloted aircraft, have been used for military and experimental purposes for more than a quarter of a century.

Since the spectacular German V-1, or winged missile, in World War II, advances in electronics and missile-guidance systems have fostered the development of drone aircraft that appear to be almost like piloted craft in their maneuverability.

The description of the capabilities of drones continues on to anti-submarine warfare, battlefield surveillance, and the classic application of target practice. Even in the aerospace industry, the definition of ‘drone’ was changing ever so slightly from a very complex clay pigeon to something slightly more capable.

In the early 1960s, NASA was given the challenge of putting a man on the moon. This challenge requires docking spacecraft, and at the time Kennedy issued this challenge, no one knew how to perform this feat of orbital mechanics. Martin Marietta solved this problem, and they did it with drones.


US patent 3,201,065 solves the problem of docking two spacecraft and does it with a drone.

Orbital docking was a problem NASA needed to solve before getting to the moon, and the solution came from the Gemini program. Beginning with the Gemini program, astronauts would perform an orbital rendezvous and dock with an unmanned spacecraft launched a few hours or days earlier. Later missions used the engine on the Agena to boost their orbit to world altitude records. The first experiments in artificial gravity came from tethering the Gemini capsule to the Agena and spinning the spacecraft around a common point.

The unmanned spacecraft used in the Gemini program, the Agena Target Vehicle, was not a drone. However, years before these rendezvous and docking missions would pave the way to a lunar landing, engineers at Martin Marietta would devise a method of bringing two spacecraft together with a device they called a ‘drone’ [6].

Martin Marietta’s patent 3,201,065 used an autonomous, remote-controlled spacecraft tethered to the nose of a Gemini spacecraft. Laden with a tank of pressurized gas, a few thrusters, and an electromagnet, an astronaut would fly this ‘docking drone’ into a receptacle in the target vehicle, activate the electromagnet, and reel in the tether bringing two spacecraft together. Here the drone was, like the target drones of World War II, remote-controlled. This drone spacecraft never flew, but it does show the expanding use of the word ‘drone’, especially in the aerospace industry.

If you’re looking for an unimaginably cool drone that actually took to the air, you need only look at the Lockheed D-21, a reconnaissance aircraft designed to fly over Red China at Mach 3.

The M-21 carrier aircraft and D-21 drone. The M-21 was a variant of the A-12 reconnaissance aircraft, predecessor to the SR-71 reconnaissance aircraft.

The ‘D’ in ‘D-21’ means ‘daughter’, and the carrier aircraft for this unmanned spy plane is the M-21, ‘M’ meaning ‘mother’. Nevertheless, the D-21 was referred to in contemporary sources as a drone. The D-21 was perhaps the first drone referred to as such that was a pure observation aircraft, meant to spy on the enemy.

The 1960s didn’t just give drones the ability to haul a camera over the enemy. 1960 saw the first offensive drone – the first drone called as such that was able to drop homing torpedoes into the ocean above enemy submarines.

The Gyrodyne QH-50 – also known as DASH, the Drone Anti-Submarine Helicopter – was the US Navy’s answer to a problem. At the time, the Soviets were building submarines faster than the United States could build anti-submarine frigates. Older ships were available, but these ships weren’t large enough for a full-sized helicopter. The solution was a drone that could launch off the deck, fly a few miles to an interesting ping on the sonar, and drop a torpedo. The solution was the first offensive drone, the first unmanned aircraft capable of delivering a weapon.

The QH-50 was a relatively small coaxial helicopter piloted by remote control. It was big enough to haul one torpedo twenty miles away from a ship and have this self-guided torpedo take care of the rest.

The QH-50 was a historical curiosity born from two realities. The US Navy had anti-submarine ships that could detect Soviet subs dozens of miles away. These anti-submarine ships didn’t have torpedoes with that range and didn’t have a flight deck to launch larger helicopters. The QH-50 was the result, but new ships and more capable torpedoes made this drone obsolete in less than a decade. An otherwise entirely unremarkable weapons platform, the QH-50 has one claim to fame: it was the first drone, referred to as such in contemporary sources, that could launch a weapon. It was the first offensive drone.

The Confusion of the Tongues, c. 1965-2000

On June 13, 1963, a Reuters article reported a joint venture between Britain and Canada to build an unmanned spy plane, specifically referred to as an ‘unmanned aerial vehicle’ [7]. The reporter, with full knowledge of the previous two decades of unmanned aerospace achievement, said this new project was ‘commonly referred to as a drone.’ By the mid-60s, the word ‘drone’ had its fully modern definition: it was simply any unmanned aerial vehicle, used for any purpose, ostensibly controlled in any manner. This definition was being supplanted by several competing terms, including ‘unmanned aerial vehicle’ and ‘remotely piloted vehicle’.

The term ‘drone’ would be usurped in common parlance for the newer, clumsier term, ‘unmanned aerial vehicle.’ A word that once referred to everything from flying targets to spacecraft subsystems would now be replaced. The term ‘unmanned aerial vehicle’ would make its first public military appearance in the Department of Defense report on Appropriations for 1972. The related term ‘remotely piloted vehicle’ or RPV, would first appear in government documents in the late 1980s. From the word drone, a thousand slightly different terms are born in the 60s, 70s, and 80s. Even today, ‘unmanned aerial system’ is the preferred term used by the FAA. This phrase was created less than a decade ago.

Engineers built drones to surveil the Communist Chinese at Mach 3. Engineers patented a drone to dock two spacecraft together. Engineers built drones to hunt and sink submarines. The Air Force took old planes, painted them orange, and called them target drones. So the Lord scattered them abroad over the face of all the Earth, and they ceased calling their aircraft drones.

In the 1970s, 80s, and 90s, the term ‘drone’ would still be applied to target aircraft, and even today is still the preferred term for unmanned military aircraft used for target practice. Elsewhere in the military, the vast array of new and novel applications of unmanned aircraft have meant new terms have cropped up.

Why these new terms were created is open to debate and interpretation. The military and aerospace companies have never shied away from a plague of acronyms, and a dizzying array of random letters thrown into a report is the easiest way of ensuring operational security. How can the enemy know what we’re doing if we don’t know ourselves? It’s questionable if the improved capabilities of drones, such as dropping torpedos or relaying video, can account for the vast array of acronyms — it appears these new acronyms were simply the creation of a few captains, majors, and engineers either at the Pentagon or one of a dozen aerospace companies. By the 1990s, the word drone was in a state of disuse, replaced by ‘UAV’, ‘RPV’, ‘UAS’, and a dozen other phrases synonymous with the word drone.

The Era of the Modern Drone, October 21, 2001 – Present


The definitive image of the modern drone is that of the General Atomics MQ-1 Predator laden with a Hellfire anti-tank missile on each wing. The Predator is an unmistakable aircraft featuring a bulbous nose just barely large enough to house the satellite antennae underneath. A small camera pod hangs off its chin. The long, thin wings appear as if they were stolen off a glider. A small propeller is mounted directly on the tail, and the unique inverted v-tail gives the impression this aircraft can never land, lest it be destroyed.

The Predator program began in the mid-1990s and was from the get-go referred to as an Unmanned Aerial Vehicle, or UAV. This changed on October 21, 2001, in a Washington Post article from Bob Woodward. In the article, CIA Told to Do ‘Whatever Necessary’ to Kill Bin Laden, Woodward reintroduced the word ‘drone’ into the vernacular [8]. The drone in question was a CIA-operated Predator equipped with “Hellfire antitank missiles that can be fired at targets of opportunity.” Woodward, either through conversations with military officials, remembering the old term for this type of aircraft, needing a new word to describe this weapon delivery system, or simply being fed up with the alphabet soup of acronyms, chose to use the word ‘drone’.

If you’re angry at the word ‘drone’ being applied to a Phantom quadcopter, you have two people to blame. The first is Hanson W. Baldwin, military editor of the New York Times. Over a career of forty years, he introduced the word ‘drone’ to describe everything from target aircraft to cruise missiles. The second is Bob Woodward of the Washington Post. The man who broke Watergate also reintroduced the word ‘drone’ into the American consciousness.

A Briefer History Of ‘Drone’, and an Argument for its Use

The word ‘drone’ was first applied to unmanned aircraft in late 1934 or early 1935 because biplanes flying low overhead sound like a cloud of bees. For twenty-five years, ‘drone’ applied only to aircraft used as target aircraft. Beginning in the late 1950s and early 1960s, the definition of ‘drone’ expanded to included all unmanned aircraft, from cruise missiles to spaceships. Around 1965, acronyms such as ‘UAV’, and ‘RPV’ took over as being either more descriptive or as a function of the military aerospace industry’s obsession with acronyms. In the late 1990s, the US Air Force and CIA began experimenting with Predator UAVs and Hellfire missiles. The first use of this weapons platform was mere weeks after the 9/11 attacks. This weapons platform became known as a Predator ‘drone’ in late 2001 thanks to Bob Woodward. Colloquially, the term ‘drone’ now applies to everything from unmanned military aircraft to quadcopters that fit in the palm of your hand.

The most frequently cited reason for not using the word ‘drone’ to describe everything from racing quadcopters to remote-controlled fixed wing aircraft orbiting a point for hours is linguistic purity. Words have meaning, so the argument goes, and it’s much better to use precise language to describe individual aircraft. A quadcopter is just that — a quadcopter. An autonomous plane used for inspecting pipelines is an unmanned aircraft system.

The argument of linguistic purity fails immediately, as the word ‘drone’ was applied to every conceivable aircraft at some time in history. In the 1960s, a ‘drone’ could mean a spaceship or spy plane. In the 1940s, a ‘drone’ simply meant an aircraft that was indistinguishable in characteristics from a balsa wood, gas powered remote controlled airplane of today. Even accepting the argument of linguistic purity has consequences: ‘drone’ originally meant ‘target drone’, an aircraft flown only for target practice. Sure, keep flying, I’ll go get my 12 gauge.

The argument of not using the word ‘drone’ to apply to what are effectively toys, on the basis of language being defined by common parlance, fails by tautology. ‘Drone’, critics say, applies only to military aircraft used for spying or raining Hellfires down on the enemy. It’s been this way since 2001, and since language is defined by common usage, the word ‘drone’ should not be applied to a Phantom quadcopter. This argument fails to consider that the word ‘drone’ has been applied to the Phantom since its introduction, and if language is defined by common usage then surely a quadcopter can be called a drone.

Instead of linguistic trickery, I choose to argue for the application of ‘drone’ on a philosophical basis. You are now reading this article on Hackaday, and for the last thirty years, a ‘hacker’ has been someone who breaks into computer systems, steals money from banks, leaks passwords to the darknet, and commits other illegal acts. Many other negative appellations apply to these activities: ‘crackers’ are those who simply break stuff, and ‘script kiddies’ are responsible for the latest DDOS attack. Overall, though, ‘hackers’ is the collective that causes the most damage, or so the dictionary definition goes.

Obviously, the image of ‘hacking’ being only illegal or immoral is not one we embrace. The word is right there at the top of every page, and every word written here exudes the definition we want. ‘Hacking’, to us, is firmware tomfoolery and electronic exploration of what should be possible but isn’t available to the public. We own the word ‘hack’ in every word we publish by extolling the virtues of independent study and discovery.

Everyone here learned a very long time ago that you don’t impress people with pedantry. You won’t convert anyone from believing hackers stole Aunt Mable’s identity to believing ‘hack’ is an inherently neutral term simply by telling them. Be the change you want to see in the world, or some other idiotic phrase from a motivational poster, but the point remains: it’s always better to own a term than to insufferably deny it. It’s a lesson we’ve learned over the last decade, and hopefully one the drone community will soon pick up.

Read more…
3D Robotics

I was one of the backers of the Pozyx ultrawideband positioning system Kickstarter campaign, and after a pretty bumpy start (although they shipped the product, the software and tech support were pretty terrible for six months), they've finally started delivering on their promise with new firmware and active support, and it's now worth considering for indoor (no-GPS) drone or rover use. 

From the video above:

In the demo, the Pozyx developer kit is used to accurately track the position of the tag with 6 fixed anchors, at an update rate of 40 Hz. The resulting positioning data is read in over the serial port and used directly by the Unity Engine to move the ball.
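
If you want to try something similar, here is a minimal sketch of the serial-reading side in Python. The port name and the one-"x,y,z"-line-per-update message format are assumptions for illustration only, not the actual Pozyx protocol (Pozyx's own libraries expose the real API):

```python
# Minimal sketch: read position updates from a positioning tag over a serial port.
# Assumptions (not from the article): the tag firmware prints one line per update
# in the form "x,y,z" with coordinates in millimetres, and the tag shows up as
# /dev/ttyACM0. Adapt to whatever format your firmware actually emits.
import serial

def read_positions(port="/dev/ttyACM0", baud=115200):
    with serial.Serial(port, baud, timeout=1) as ser:
        while True:
            line = ser.readline().decode("ascii", errors="ignore").strip()
            if not line:
                continue
            try:
                x_mm, y_mm, z_mm = (float(v) for v in line.split(","))
            except ValueError:
                continue  # skip malformed lines
            yield (x_mm, y_mm, z_mm)

if __name__ == "__main__":
    for x, y, z in read_positions():
        print(f"tag at x={x:.0f} mm, y={y:.0f} mm, z={z:.0f} mm")
```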

Read more…
3D Robotics

Lego kit makes DIY drones easier for kids

What's interesting about the new Flybrix kit ($149) is that it has its own Arduino-based flight controller. Pretty hard-core code (Kalman filters and such), so it might be a little gnarly for younger kids, but the frame looks fun to put together. 

Flybrix Kits come complete with all the components you need to make and fly rebuildable, crash-friendly quadcopters, hexacopters and octocopters from one kit. Flybrix is designed for anyone who is interested in building and flying their own creations. There's nothing like the thrill of seeing what you made take to the skies. The included Flybrix Flight Control App for Android and iOS makes getting your designs airborne as easy as 1, 2, 3!

So much more than a toy.

Highly capable hardware and open-source software make for a building, flying, crashing and rebuilding experience that's not only fun but also highly flexible, no matter your level of expertise. That's what makes Flybrix different from typical ready-to-fly drone toys: it's the ability to experiment, create, customize, and learn in a hands-on way.

Friendly for non-technical makers and new pilots. 

The basic snap-and-go experience has deep play-value.  Simply building and flying stays fresh and engaging time after time, especially as a project that parents can do with kids.  The more you play the more you learn about engineering, physics, design, and geometry. You'll be airborne in a snap!

Tech savvy builders and amateur aviators can get creative - quickly.

Intermediates can dive deeper into the creative experience by designing new airframes and tuning them for flight. The Flybrix Chrome Extension Configuration Software makes adjusting settings and motor tuning easy.  It opens the path of creative problem solving, deeper tuning, flight logic, and control theory.

For true techies and pro pilots, the sky is the limit.

The Flybrix brain is a 96 MHz ARM® Cortex®-M4 processor that's Arduino compatible. Our custom PCB includes a barometer, a magnetometer, several indicator LEDs, analog-to-digital converters, an SD card slot and Bluetooth. It's also possible to add Wi-Fi and GPS modules.

The Flybrix code is all open-source, so it’s infinitely tweakable. Add a GPS module, experiment with LQR control theory and alternative state estimation, or wire up some extra LEDs and do some flight-path light painting.  It's fair to say, the sky is the limit. Check out our code on Github.
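
To give a flavor of the "Kalman filters and such" mentioned above, here is a minimal, illustrative scalar Kalman filter of the kind flight controllers use to smooth a noisy sensor. It's a Python sketch for readability, not the actual Flybrix firmware, which is Arduino C++:

```python
# Illustrative sketch only: a scalar Kalman filter of the kind flight controllers
# use for state estimation (here, smoothing noisy barometer altitude readings).
# This is NOT the Flybrix code; see their GitHub repo for the real thing.

class ScalarKalman:
    def __init__(self, q=0.01, r=0.5, initial=0.0):
        self.q = q          # process noise variance (how fast the true altitude drifts)
        self.r = r          # measurement noise variance (barometer jitter)
        self.x = initial    # state estimate (altitude, metres)
        self.p = 1.0        # estimate variance

    def update(self, measurement):
        # Predict: state unchanged, uncertainty grows by the process noise.
        self.p += self.q
        # Update: blend prediction and measurement via the Kalman gain.
        k = self.p / (self.p + self.r)
        self.x += k * (measurement - self.x)
        self.p *= (1.0 - k)
        return self.x

# Example: smooth a jittery barometer stream.
readings = [10.2, 9.8, 10.5, 10.1, 9.9, 10.3]
kf = ScalarKalman(initial=readings[0])
for z in readings:
    print(f"raw {z:.2f} m -> filtered {kf.update(z):.2f} m")
```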

Flybrix is for the young and young at heart!

Flybrix is designed for pilots 14 years and older. As with any drone, adult supervision, following general safety guidelines such as eye protection and safe battery handling, and using common sense are advised.  

Parents often ask if kids younger than 14 can play with Flybrix.  The answer is it's up to the parent to decide.  With younger kids one-on-one parent participation, not just supervision, is required. Flybrix is a fantastic "together time" project where everyone has fun and everyone learns together.   

Read more…
3D Robotics

From NewAtlas:

We have seen a few different takes on Vertical TakeOff and Landing (VTOL) drones over the last few years. The idea behind such approaches is to harness the typically longer range and greater payload capacity of fixed-wing drones and combine them with the superior agility of multicopters, allowing the aircraft to take off and land in tight spaces.

Some of these have been developed for military purposes, such as the HQ UAV and the Batwing-like AirMule, but others, like the VTOL Kestrel and SkyProwler are aimed more at hobbyists. In developing the delftAcopter, the researchers have set out to build a drone that can be used to carry medical supplies to tough-to-reach areas.

The electric drone takes the form of a miniature biplane, an aircraft design that uses two wings stacked on top of one another and that became popular in the early years of aviation following its success at the hands of the Wright brothers. While some VTOL drones use tilting propellers to switch from vertical to horizontal movement, the delftAcopter changes its whole orientation as it makes that transition.

Prior to takeoff, it sits upright with the propeller spinning horizontally, just like a helicopter. Then, as it reaches the desired height, it pitches over by 90 degrees so that the propeller faces forward and thrust is directed that way, allowing it to zip along at up to 107 km/h (66 mph). With a 10,000 mAh battery onboard, the aircraft can fly for up to 60 minutes on a single charge.
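
Those specs imply an average current draw of roughly 10 A. A quick back-of-envelope check (the pack voltage below is an assumption; the article only gives capacity and endurance):

```python
# Back-of-envelope endurance check from the quoted specs.
# The 10,000 mAh capacity and 60 min endurance are from the article;
# the pack voltage (assumed ~14.8 V nominal, e.g. a 4S LiPo) is a guess.
capacity_ah = 10.0      # 10,000 mAh
endurance_h = 1.0       # up to 60 minutes
pack_voltage = 14.8     # assumed nominal voltage, not given in the article

avg_current_a = capacity_ah / endurance_h      # ~10 A average draw
avg_power_w = avg_current_a * pack_voltage     # ~148 W under the voltage assumption

print(f"average current ≈ {avg_current_a:.0f} A, average power ≈ {avg_power_w:.0f} W")
```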

The delftAcopter is capable of entirely autonomous flight, including takeoff, the transition to forward flight, and landing. It can travel beyond the operator's line of sight and maintain a link through an Iridium satellite connection, which the researchers claim allows it to be controlled from anywhere on the planet.

It uses Parrot's S.L.A.M.dunk developer kit along with a fish-eye stereo camera to gather video, and uses an inertial measurement unit (IMU) and GPS to track its position during flight. The craft weighs 4 kg (8.8 lb) and also features obstacle avoidance and the ability to pick out safe landing zones.

The team is set to put the delftAcopter through its paces at the upcoming 2016 Outback Medical Challenge. The event takes place in Australia and tasks competitors with building an autonomous aircraft capable of retrieving a blood sample from a stranded person located at an inaccessible site around 30 km (18 mi) away.

Drones have emerged as tools with great potential when it comes to search, rescue and disaster relief situations. Various drones have been tested for these purposes in the US, the Swiss Alps and across Africa, a particularly suitable candidate given its rough terrain and the lack of paved roads and infrastructure to move cargo by land. The delftAcopter will have its chance to demonstrate its wherewithal at the Outback Medical Challenge from September 27 to 29.

Read more…
3D Robotics

Wirelessly powered quadcopter

There have been a lot of quadcopters that get power wirelessly with lasers, but this is the first one I've seen at small scale that uses radio. From Hackaday:

[Sam M] wrote in with a quick proof-of-concept demo that blows our socks off: transferring enough power wirelessly to make a small quadcopter take flight. Wireless power transfer over any real distance still seems like magic to us. Check out the videos embedded below and you’ll see what we mean.

What’s noteworthy about this demo is that neither the transmitter nor the receiver are particularly difficult to make. The transmitting loop is etched into a PCB, and the receiver is made of copper foil tape. Going to a higher frequency facilitates this; [Sam M] is using 13.56 MHz instead of the kilohertz that most power-transfer projects use. This means that all the parts can be smaller and lighter, which is obviously important on a miniature quadrotor.
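
The frequency choice falls straight out of the LC resonance condition: for a given coil, tuning to a higher frequency needs far less capacitance, and the resonant parts can be smaller and lighter. A minimal sketch of that arithmetic, using an arbitrary 1 µH coil value that is not taken from [Sam M]'s paper:

```python
# Resonant coupling tradeoff: for an LC tank, f = 1 / (2*pi*sqrt(L*C)),
# so the tuning capacitance for a given coil is C = 1 / ((2*pi*f)^2 * L).
# The 1 uH receiver-coil inductance is an arbitrary illustration, not a
# value from [Sam M]'s paper.
import math

def tuning_capacitance(f_hz, l_henry):
    return 1.0 / ((2 * math.pi * f_hz) ** 2 * l_henry)

l_coil = 1e-6  # assumed 1 uH foil-tape receiver coil
for f in (100e3, 13.56e6):
    c = tuning_capacitance(f, l_coil)
    print(f"f = {f/1e6:g} MHz -> C = {c*1e9:.2f} nF")
```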

High-frequency power switching puts real demands on the transistors, though, and the one [Sam M] is using is cutting-edge and specifically designed for this application. You’re not going to get far with junk-bin parts at high frequencies. In fact, the whole inverter that drives the coil is a custom design, and is extremely well detailed in [Sam]’s research paper, available here. (PDF)

High-power and high-frequency can still benefit from having a wire to run along, but transmitting a few watts across thin air like this is a sweet demo. Thanks for sharing, [Sam]!

Read more…
3D Robotics

Training autonomous cars with Grand Theft Auto

This is super cool -- training with scalable simulations is the future of robotic AI. From Hackaday:

For all the complexity involved in driving, it becomes second nature to respond to pedestrians, environmental conditions, even the basic rules of the road. When it comes to AI, teaching machine learning algorithms how to drive in a virtual world makes sense when the real one is packed full of squishy humans and other potential catastrophes. So, why not use the wildly successful virtual world of Grand Theft Auto V to teach machine learning programs to operate a vehicle?

Half-and-half GTA V frame annotation

The hard problem with this approach is getting a large enough sample for the machine learning to be viable. The idea is this: the virtual world provides a far more efficient solution to supplying enough data to these programs compared to the time-consuming task of annotating object data from real-world images. In addition to scaling up the amount of data, researchers can manipulate weather, traffic, pedestrians and more to create complex conditions with which to train AI.

It’s pretty easy to teach the “rules of the road” — we do it with 16-year-olds all the time. But those earliest drivers have already spent a lifetime observing the real world and watching parents drive. The virtual world inside GTA V is fantastically realistic. Humans are great pattern recognizers, and fickle gamers would cry foul at anything that doesn’t mirror real life. What we’re left with is a near-perfect source of test cases for machine learning to be applied to the hard part of self-driving: understanding the vastly variable world every vehicle encounters.

A team of researchers from Intel Labs and Darmstadt University in Germany created a program that automatically indexes the virtual world (as seen above), creating useful data for a machine learning program to consume. This isn’t a complete substitute for real-world experience, mind you, but the freedom to make a few mistakes before putting an AI behind the wheel of a vehicle has the potential to speed up development of autonomous vehicles. Read the paper the team published, Playing for Data: Ground Truth from Video Games.
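
To make the "useful data" concrete, here is a rough sketch of how such frame/label pairs might be loaded for training a segmentation model. The directory layout and file names are hypothetical, and this is not the authors' actual pipeline from the paper:

```python
# Rough illustration only: a dataset pairing rendered game frames with the
# per-pixel label masks extracted from the game. Directory layout and file
# names are hypothetical; this is not the "Playing for Data" pipeline.
import os
from PIL import Image
import numpy as np
import torch
from torch.utils.data import Dataset

class GameFrameSegmentation(Dataset):
    def __init__(self, root):
        self.frame_dir = os.path.join(root, "frames")   # RGB screenshots
        self.label_dir = os.path.join(root, "labels")   # class-index PNGs
        self.names = sorted(os.listdir(self.frame_dir))

    def __len__(self):
        return len(self.names)

    def __getitem__(self, i):
        name = self.names[i]
        frame = np.array(Image.open(os.path.join(self.frame_dir, name)).convert("RGB"))
        label = np.array(Image.open(os.path.join(self.label_dir, name)))
        # HWC uint8 image -> CHW float tensor in [0, 1]; labels stay as class indices.
        x = torch.from_numpy(frame).permute(2, 0, 1).float() / 255.0
        y = torch.from_numpy(label.astype(np.int64))
        return x, y
```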

Before you think this could go horribly wrong, check out this mini-rally truck that taught itself how to powerslide and realize that this method of teaching AI how to drive could actually be totally awesome. Also realize that this research is just characterizing still images at about 7 seconds per image. This is more than a couple of orders of magnitude faster than annotating real-world images — great for learning, but we’re still very far away from real-time and real-world implementation.

We really hope that a team of research assistants were paid to play a lot of GTA V in a serious scientific effort to harvest this data set. That brings to mind one serious speed-bump. This game is copyrighted and you can’t just do anything you want with recordings of gameplay, and the researchers do mention that gameplay footage is allowed under certain non-commercial circumstances. That means that Uber, Google, Apple, Tesla, every major auto company, and anyone else developing autonomous vehicles as a business model will be locked out of this data source unless Rockstar Games comes up with a licensing model. Maybe your next car will have a “Powered by GTA” sticker on it.

[MIT Technology Review via Popular Science]

Read more…
3D Robotics

Great piece from NPR about the use of Pixhawk-powered fixed-wing drones to deliver drugs and medical samples. Last year the blood transport was tested and deemed safe. Now the microbial transport has also been tested and passed. 

Three years ago, Geoff Baird bought a drone. The Seattle dad and hobby plane enthusiast used the 2.5-pound quadcopter to photograph the Hawaiian coastline and film his son's soccer and baseball games.

But his big hope is that drones will soon fly tubes of blood and other specimens to Harborview Medical Center, where he works as a clinical pathologist running the hospital's chemistry and toxicology labs. In the near future, Baird and others say, drones could transform health care — not only in rural areas by bringing critical supplies into hard-to-reach places, but also in crowded cities where hospitals pay hefty fees to get medical samples across town during rush hour. By providing a faster, cheaper way to move test specimens, drones could speed diagnoses and save lives. "It's super exciting to me," Baird says.

The technology seems to be there. Drones are delivering pizza in New Zealand and taking condoms to parts of Ghana that lack reliable roads or access to birth control. Tech giants and big retailers, including Amazon and Wal-Mart, are testing drones for deliveries and pickups.

However, "blood specimens are not like a book or a shoe," Timothy Amukele, an assistant professor of pathology at Johns Hopkins School of Medicine, said in a TED talk earlier this year. No one knew whether bumpy flights would hurt cells or otherwise make biological samples unsuitable for lab tests.

Drones In Flight


Amukele and his colleagues transport donated blood samples by drone in this video. The drones climbed to over 328 feet above ground and circled the field for six to 38 minutes.

So Amukele and co-workers conducted several experiments to find out. In their first study, published in PLOS ONE last July, the team collected several hundred blood samples from healthy volunteers. They drove the samples to a flight field an hour northwest of Baltimore, packed half of them into foam containers and flew them around in a drone for up to 40 minutes. The other samples sat. All specimens went back to the lab for 33 routine tests. The results were the same for each group, suggesting samples stay intact during drone flights.

In follow-up analyses, drone transport also seemed safe for samples containing microbes and for donated blood. The microbial study was published in August in the Journal of Clinical Microbiology; a manuscript on the blood products study is under review. (Videos of each experiment can be found here.)

"The results don't surprise me," says Bill Remillard, chief technical officer at TriCore Reference Laboratories in Albuquerque, N.M. "But until you do the science, you just don't know."

TriCore handles nearly three-quarters of New Mexico's clinical lab testing. And in a sparsely populated state, moving samples over large distances is expensive. TriCore spends $3.5 million per year. So after Remillard heard the results of Amukele's first drone experiment at a meeting last summer, the two started discussing a possible pilot study using drones to transport lab samples in New Mexico.

While Amukele's experiments show it's feasible to move lab specimens with drones, pilot studies in real clinical settings are still needed to work out logistics. Questions include how to request a drone, where it would land, who would pick up the samples and how often a drone would need new batteries.

Safety is another concern. Some drones drop cargo with parachutes or other release mechanisms, making it harder for people to tamper with the vehicles. But as far as how safe drones are, "those data don't yet exist," Amukele says. Though millions of drones have been sold worldwide, "we don't know how many crashes happen and how many are due to operator error," he says. The Federal Aviation Administration is starting to collect this data.

It's a promising development for an industry where legislation has lagged behind the fast-advancing technology. For years, the FAA had imposed a near-ban on commercial drones, only allowing them to fly if businesses applied for an exemption. But in June the agency announced a set of rules for companies to operate drones in the United States, and on Aug. 29 those regulations took effect. The FAA expects the number of registered commercial drones to jump 30-fold, from 20,000 to 600,000, within months.

"The rules had not been well defined. This is an attempt to define them," says Lawrence Williams, who heads business development at Zipline, a Silicon Valley startup making drones for medical applications. Zipline is focusing much of its effort in Rwanda, where less crowded skies, relative to the U.S., make it easier to negotiate drone delivery of blood samples.

Another drone startup, Vayu, whose CEO is a co-author on the PLOS ONE drone study, is also dipping into the international arena. In July, the Michigan-based company did a demo flight in Madagascar, carrying specimens from a remote village to a lab for testing. Vayu makes a quadcopter plane capable of vertical takeoff — an appealing feature for hospitals with limited landing space.

While it's easy to see how drones could improve health care in poor countries, Amukele thinks medical drone delivery could make a bigger splash in the U.S. Compared to Africa and developing countries, the U.S. does much more testing per person, he says, and many of the country's 200,000 medical labs are collection-only sites that rely on central labs for testing. So "there are likely to be more [medical drone users] in the U.S. than anywhere else," Amukele says.

As Zipline prepares to launch blood delivery drones in Rwanda, the company is also seeking regulatory approval for three projects using drones to bring medical supplies to underserved communities in the U.S.

One project would integrate drone delivery of medications with telemedicine appointments at a small clinic in rural Maryland. Another would use Zipline drones to link a large health care distribution center to hospitals and tribal clinics around Reno, Nev. And for the third project, the company would partner with a regional blood bank in Washington state, creating a plan to distribute blood to various hospitals and clinics in the event of earthquakes and other natural disasters.

Johns Hopkins was initially skeptical of Amukele's experiments — the review board thought his first proposal was a joke — but now the university is giving the pathologist space and funds to hire a drone engineer and continue researching medical delivery drones.

In Seattle, Baird is working with Amukele and aeronautical engineers at the University of Washington on their own drone proposal. Ideally their test flights would take samples from Seattle Children's to Harborview, a bustling facility that runs thousands of tests each day. However, that flight path would violate the FAA rule requiring drones to stay within the pilot's line of sight. So the initial plan is to run 2-mile line-of-sight flights between the children's hospital and UW Medical Center, Baird says.

Drones could be a huge help in poison emergencies, Baird says. In a typical scenario, a child gets rushed to the emergency room after accidentally swallowing some pills. Though routine tests can rule out some things, clinics often send samples to a centralized toxicology lab for confirmation and further testing. This can take hours. A drone could zip samples downtown in five to 10 minutes, Baird says, helping a child get diagnosed and receive medications more quickly.

He also envisions drones collecting samples from patients' homes and taking them to the hospital. You could prick your finger and rub the blood onto a card that a drone could fly back to the lab for testing, Baird says.

In the meantime, though, Zipline's U.S. projects remain on hold, awaiting the regulatory go-ahead, and the Seattle team continues studying maps and sketching flight routes for the small drone test it hopes to launch. The team has presented the plan to grant agencies and gotten positive responses — but no funding yet.

Baird suspects that, like other technological advances, drones for medicine will "wait, wait, wait and then go very quickly."

Esther Landhuis is a freelance science journalist in the San Francisco Bay Area. Follow her at @elandhuis.

Read more…