Chris Anderson's Posts (2716)

3D Robotics

[Slide: DroneCore architecture]

At this week's Interdrone conference, Yuneec and Dronecode announced the new DroneCore SDK, which is now shipping on the new Dronecode-based Yuneec H520 commercial hexacopter. The above slide shows how the architecture works, but basically DroneCore replaces the old DroneKit SDK and provides an easy-to-use mobile (Android and iOS) and onboard (C++ and Python) interface to Dronecode/PX4-based vehicles. Of which there are many!


The library provides a simple core API for managing one or more vehicles, providing programmatic access to vehicle information and telemetry, and control over missions, movement and other operations.

Developers can extend the library using plugins in order to add any other required MAVLink API (for example, to integrate PX4 with custom cameras, gimbals, or other hardware over MAVLink).

DroneCore can run on a vehicle-based companion computer or on a ground-based GCS or mobile device. These devices have significantly more processing power than an ordinary flight controller, enabling tasks like computer vision, obstacle avoidance, and route planning.
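The core API is meant to make this kind of control a few lines of code. As a hedged sketch of the flow in Python (the class and method names below are illustrative placeholders, not the published DroneCore reference):

# Illustrative sketch only: names are placeholders, not the actual
# DroneCore reference API (see the full reference linked below).
from dronecore import DroneCore  # hypothetical import

dc = DroneCore()
dc.add_udp_connection("udp://:14540")  # discover a PX4 vehicle over MAVLink

vehicle = dc.device()                  # first discovered vehicle
print(vehicle.telemetry.position())    # programmatic telemetry access

vehicle.action.arm()                   # control over basic operations
vehicle.action.takeoff()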

The full reference is here.

Read more…
3D Robotics

In this episode, the Roswell Flight Test Crew speaks with John Leipper, the Solutions Architecture Manager for drone manufacturer Insitu. At the Future Farm Drone Rodeo in Pendleton, Oregon, Insitu conducted a drone swarm demonstration using three 3DR Solos – all controlled by a single pilot using a computer. Of course, to stay in compliance with FAA regulations, an individual pilot for each aircraft was on standby should immediate human intervention be required. The long-term goal is to make drones more efficient through automation, requiring less direct human input to gather data more quickly than would be possible with a single drone. Such a control system would also allow the drones to be operated remotely via the Internet or other networks.

Read more…
3D Robotics

Introducing OpenSolo!


Big news! Reposting from the 3DR blog. Also see the ArduPilot Team announcement here

When we launched Solo back in 2015, one of its selling points was that it was based on the open source ArduPilot software, the project that Jordi Munoz and I launched as a side-project way back in 2007 and then grew beyond our imagination in the able hands of the community.  The point of Solo was to package up this open stack in a polished, easy-to-use consumer product (like the DJI Phantom), treating the ArduPilot stack as an “open core” and extending its functionality with proprietary features much as companies do with Linux-based devices.

This worked very well as a product (Solo had some really innovative features, some of which are still unequaled) but less well as a business (we couldn’t make it cheaply enough to keep up with the rapid price declines in the consumer market, so we stopped making them at the end of 2015).  Now, two years later, 3DR has shifted its focus to the commercial market that exploded after the FAA launched its Part 107 commercial operator licensing program last year. But there are lots of Solos still out there, with great untapped potential — it’s just not our core business anymore.

So what to do? Open source the rest of it! We’ve heard loud and clear that the community wants a tried-and-true Ardupilot platform that can be extended without limit. The Ardupilot team has already embraced Solo and ported the latest flight code to it. But the custom 3DR WiFi control, telemetry, and video streaming technology, the “Artoo” controller and the “Shot Manager” mission control stack that runs on the onboard Linux processor were not open source, so the full potential of the drone remained locked.

No more. I’m delighted to announce that we’re now open sourcing almost all of the remaining code, including the SoloLink wireless stack, ShotManager, the high-level onboard mission scripting layer that gave Solo all of its “smart shots”, and a range of other packages include the code for the controller and the build tools.

The code has now been released in a new OpenSolo organization on Github, licensed under the permissive Apache 2.0 license.

More details about what’s been released here:

solo-builder – scripts for configuring a virtual machine to build the Solo software

meta-3dr – the build recipes that assemble the complete Linux system for the Solo and Controller i.MX6 processors.

shotmanager – implementation of Solo’s Smart Shots.

sololink – 3DR software that runs on the i.MX6 processors, implementing things like video streaming, control, telemetry, pairing, logging, etc.

artoo – firmware for the STM32 microcontroller in the controller responsible for the inputs and screen.

solo-gimbal (coming soon) – firmware for the microcontrollers in the Solo Gimbal

Read more…
3D Robotics

Lots of news and updates from the Dronecode team this month: 

1) The Aerotenna OcPoC autopilot (video above) now supports the Dronecode/PX4 software stack!

● FPGA and dual-core ARM processors in OcPoC allow for real-time signal processing and for executing complicated algorithms, enabling exciting new possibilities for artificial intelligence, deep learning, and a truly autonomous and intelligent UAV

● With more than 30 programmable I/Os supporting most standard interfaces, OcPoC is incredibly flexible, allowing free rein for your creativity

● OcPoC features industrial-grade redundancy, ensuring you can always count on your key systems such as GPS, IMU, and more

● Flawless integration with Aerotenna microwave radar sensors, including uLanding radar altimeter and uSharp collision-avoidance sensor.

2) QGroundControl 3.2 is out!


Many improvements and new features:

  • Settings
    • File Save path - Specify a save path for all files used by QGC.
    • Telemetry log auto-save - Telemetry logs are now automatically saved without prompting.
    • AutoLoad Plans - Used to automatically load a Plan onto a vehicle when it first connects.
    • RTK GPS - Specify the Survey in accuracy and Minimum observation duration.
  • Setup

    • ArduPilot only
      • Pre-Flight Barometer and Airspeed calibration - Now supported
      • Copy RC Trims - Now supported
  • Plan View

    • Plan files - Missions are now saved as .plan files which include the mission, geo-fence and rally points.
    • Plan Toolbar - New toolbar which shows you mission statistics and Upload button.
    • Mission Start - Allows you to specify values such as flight speed and camera settings to start the mission with.
    • New Waypoint features - Adjust heading and flight speed for each waypoint as well as camera settings.
    • Visual Gimbal direction - Gimbal direction is shown on waypoint indicators.
    • Pattern tool - Allows you to add complex patterns to a mission.
      • Fixed Wing Landing (new)
      • Survey (many new features)
    • Fixed Wing Landing Pattern - Adds a landing pattern for fixed wings to your mission.
    • Survey - New features
      • Take Images in Turnarounds - Specify whether to take images through entire survey or just within each transect segment.
      • Hover and Capture - Stop vehicle at each image location and take photo.
      • Refly at 90 degree offset - Add an additional pattern at a 90 degree offset to the original to get better image coverage.
      • Entry location - Specify entry point for survey.
      • Polygon editing - Simple on screen mechanism to drag, resize, add/remove points. Much better touch support.
  • Fly View

    • Arm/Disarm - Available from toolbar.
    • Guided Actions - New action toolbar on the left. Supports:
      • Takeoff
      • Land
      • RTL
      • Pause
      • Start Mission
      • Resume Mission - after battery change
      • Change Altitude
      • Land Abort
      • Set Waypoint
      • Goto Location
    • Remove mission after vehicle lands - Prompt to remove mission from vehicle after landing.
    • Flight Time - Flight time is shown in instrument panel.
    • Multi-Vehicle View - Better control of multiple vehicles.
  • Analyze View - New

    • Log Download - Moved to Analyze view from menu
    • Mavlink Console - NSH shell access
  • Support for third-party customized QGroundControl

    • Standard QGC supports multiple firmware types and multiple vehicle types. There is now support in QGC which allows a third-party to create their own custom version of QGC which is targeted specifically to their custom vehicle. They can then release their own version of QGC with their vehicle.

3) Other Dronecode updates from our monthly newsletter:

DOCUMENTATION UPDATES.

Since our last newsletter we’ve added high level sidebar links between all our documentation libraries so, for example, it is much easier to find QGroundControl documentation when you’re in the PX4 User Guide (see the documentation update blog!)

We’ve also made some other significant additions, including:

WORKING GROUP UPDATES.

UX Working Group
Last month the UX WG started classifying members according to clusters (in order to guide the evolution of our platform roadmap):

  • Our initial classification document is here. This is a work in progress, but we would love your feedback (please comment within the document).
  • The UX WG created a survey to help understand what Dronecode members and the community are looking for from the project. The survey closed Friday, June 30. Results are being analyzed and will be presented at the next UX WG meeting at the end of July.
  • The WG is also looking at adding Google Analytics for the website and integrating with the PX4 analytics
  • We also intend to present a proposal for DC and project branding.

Camera API Working Group

SDK Working Group

  • The SDK WG is looking at a Cloud SDK and an on-device SDK for building applications that run on the target device or on a mobile device.
  • The WG wiki is now updated at https://wiki.dronecode.org/workgroup/dronecodesdk/start, and includes the first step of a comparative evaluation of our requirements and options.

Messaging Working Group

  • Collaboration between eProsima, Dronecode member companies, and the PX4 community is working well
  • The UART bridge and UDP bridge are working so PX4 ORB topics can now be shared with external processes:
    • If PX4 is running on Linux, then the UDP bridge can be used to advertise topics via RTPS
    • If PX4 is running on a separate flight controller, the companion computer can get ORB topics over USB that are advertised via RTPS
  • 1st release of code scheduled for July 13, 2017
  • Code is at: https://github.com/eProsima/Firmware

Safety Working Group

  • Dronecode has been asked to join the FAA's UAS Safety Team and has accepted
  • Progress continues on an Intel BVLOS application using Dronecode with Airmap extensions

Code Quality Working Group

  • Lots of progress on improving the quality of the code via tools and scanning
  • New investigations into ways to improve code and reduce unit mismatch errors such as https://github.com/nholthaus/units
  • Current goals of the Code Quality WG are:
    • Get improved code coverage in real missions
    • Add ROS tests
    • Add comprehensive tests that can be run for each PR but that are not merge gating
    • HIL
      • Mission tests should also work in HIL (may need restructuring)
    • Improve awareness of the testing already being done
    • Add summary page to tests being uploaded
    • Measure test coverage of the code base
      • Consider code restructuring to provide more clarity about what code is in a particular build and the level of coverage of that code

 

CONTRIBUTIONS.

This month two new point releases were made to the new PX4 v1.6 release. The project pulse shows we’ve merged 49 PRs (+4) and closed 57 issues (-170). More than 5110 lines were added and 2431 deleted.


 

FLIGHT TESTING.

Flight Testing Stats (Jun 07 – Jul 04)

We built a QAV250 with a Snapdragon board and PWM-based ESCs

 

DRONECODE PLATFORM IN THE REAL WORLD.

The following posts from the PX4 blog show new Dronecode Platform builds, features and uses.

Read more…
3D Robotics


It's customary and traditional that we celebrate the addition of every 1,000 new members here and share the traffic stats. We've now passed 85,000 members! We're also more than ten years old!

Rather than simply give the usual monthly traffic snapshot, I thought I'd give the data for the whole decade, which tells quite a story. 

  • First, some amazing totals:
    • More than 20 million users and 117 million pageviews over the decade. 
    • 13,400 blog posts
    • More than 60,000 discussion threads
    • Nearly a million comments
  • Second, the ups and downs of this industry. Over the ten years, we've gone from one of the few drone communities around to today, when there are hundreds of sites, most of them commercial, and drone users and developers are scattered amongst them. In the early 2010s, DIY Drones was in the top three results on Google for "drones". Now there are pages and pages of commercial sites before it. That's a natural thing and demonstrates classic maturing of an industry. The amateurs have given way to the pros.
  • Third, the related rise and fall of "DIY" in the drone industry. With the triumph of DJI and its Phantom (and now Mavic and Spark) lines, it's no longer necessary to build your own drone. This is a good thing (the same happened with PCs and all sorts of electronics before it), and many people still choose to do so anyway for fun (as they still do with PCs), but it's clearly gone back to a niche activity or one for developers, much as it was in the early days. 

Today, we're still a big community with healthy traffic (about 20,000 visitors and 35,000 page views a day). And we'll continue just as we are for many years to come. We won't be the biggest site in this space, but we'll continue to be one of the most interesting, and a friendly, high-quality place to talk about ideas and projects that extend the potential of drones to change the world. And have fun doing it!

Read more…
3D Robotics

When we got started ten years ago, the annual AUVSI student drone competition was dominated by commercial autopilots, such as Piccolo. Now it's almost entirely open source autopilots, led by ArduPilot (14 of the top 20) and Dronecode/PX4 (3 of the top 20). I'm super proud of this, having co-founded ArduPilot and now leading Dronecode. Only one commercial autopilot remains in the top twenty -- next year they will be gone entirely!

From sUAS News


Read more…
3D Robotics


I noticed that Digikey is now selling Honeywell's newest aerospace-grade IMUs, which cost $1,328 each (note that's just for the IMU; it's not a full autopilot). How do the specs of these aerospace IMUs compare to those we use here? Are they worth the extra money? 

In terms of overall approach, the Honeywell IMUs seem very similar to modern autopilots such as the Pixhawk 2.x and 3.x: they both have MEMS sensors with internal environmental isolation and temperature compensation.

As for the sensors themselves, I'm no expert on specs, so I'll just post the basics here, comparing the Honeywell sensor to the Pixhawk 3:

[Image: sensor spec comparison table]

On the face of it, the Invensense and ST sensors in the Pixhawk 3 appear at least as good, if not better. But I imagine that there are some other factors that may be more important, such as gyro drift and vibration filtering. The Honeywell specs in drift are shown here: 

[Image: Honeywell gyro drift specifications]

Meanwhile the Invensense ICM-20602 sensor in the Pixhawk 3 gives its drift in different units: ±4 mdps/√Hz. I really don't know how to compare those.
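One back-of-the-envelope way to put them on a common footing: a white-noise density in dps/√Hz converts to angle random walk (ARW) in deg/√hr by multiplying by 60. That only captures the white-noise term, not the bias instability that aerospace vendors usually emphasize, but it gives a rough feel:

import math

noise_density_dps = 0.004    # ±4 mdps/√Hz = 0.004 dps/√Hz (ICM-20602 figure above)
arw = noise_density_dps * 60 # 1 dps/√Hz = 60 deg/√hr, so ARW ≈ 0.24 deg/√hr

# White-noise-driven drift grows with the square root of time:
t_hours = 0.25               # a 15-minute flight
print("1-sigma drift ~ %.2f deg" % (arw * math.sqrt(t_hours)))   # ~0.12 deg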

Finally, I'm sure that a lot of the performance depends on the software running on the Pixhawk boards, be it PX4 or APM, both of which use GPS to augment the raw IMU data to compensate for drift, along with a lot of other smart filtering. 

So for those IMU experts out there: how do you think these two approaches compare? Are aerospace-grade IMUs worth the extra money?

Read more…
3D Robotics

Intel cancels Edison, Joule boards


It was well known that Edison was going to be discontinued this year, but Joule, which was just released, is a surprise. This is bad news for any autopilot board that uses Edison, such as Pixhawk 2.1, which will now have to move to another companion computer. (I'd suggest Raspberry Pi). From Hackaday:

Sometimes the end of a product’s production run is surrounded by publicity, a mix of a party atmosphere celebrating its impact either good or bad, and perhaps a tinge of regret at its passing. Think of the last rear-engined Volkswagens rolling off their South American production lines for an example.

Then again, there are the products that die with a whimper, their passing marked only by a barely visible press release in an obscure corner of the Internet. Such as this week’s discontinuances from Intel, in a series of PDFs lodged on a document management server announcing the end of their Galileo (PDF), Joule (PDF), and Edison (PDF) lines. The documents in turn set out a timetable for each of the boards; for now they are still available, but the last will have shipped by the end of 2017.

It’s important to remember that this does not mark the end of the semiconductor giant’s forray into the world of IoT development boards, there is no announcement of the demise of their Curie chip, as found in the Arduino 101. But it does mark an ignominious end to their efforts over the past few years in bringing the full power of their x86 platforms to this particular market, the Curie is an extremely limited device in comparison to those being discontinued.

Will the departure of these products affect our community, other than those who have already invested in them? It’s true to say that they haven’t made the impression Intel might have hoped, over the years only a sprinkling of projects featuring them have come our way compared to the flood featuring an Arduino or a Raspberry Pi. They do seem to have found a niche though where there is a necessity for raw computing power rather than a simple microcontroller, so perhaps some of the legion of similarly powerful ARM boards will plug that gap.

So where did Intel get it wrong, how did what were on the face of it such promising products fizzle out in such a disappointing manner? Was the software support not up to scratch, were they too difficult to code for, or were they simply not competitively priced in a world of dirt-cheap boards from China? 

Read more…
3D Robotics

Is it flattering that NASA uses a 3DR Y6 to teach "crash management" techniques? I'm going with yes! Register here

ASA’s Langley Research Center is offering a free informational webinar on its  autonomous crash management system for small UAVs that enables landing a malfunctioning unit to a safe and clear ditch site. The webinar will take place on July 25th @ 2PM (EDT).

The mission of the system, called Safe2Ditch, is emergency management to get the vehicle safely to the ground in the event of an unexpected critical flight issue, for example a drone delivery flight that loses battery power before reaching its destination.

Safe2Ditch uses intelligent algorithms, knowledge of the local area, the remaining control authority, and battery life to select the safest landing location for a crippled UAV and steer it to the ground. The system helps minimize the risk of UAVs to people and property. This mission is performed autonomously, without any assistance from a safety pilot or ground station, all while residing on a small onboard processor.

During this free webinar, lead inventors Patricia Glaab and Louis Glaab will discuss this technology and its potential uses, followed by an open Q&A session.

Read more…
3D Robotics

From Nvidia: Here's the full paper.

Most drones would be lost without GPS. Not this one.

A drone developed by NVIDIA researchers navigates even the most far-flung, unmapped places using only deep learning and computer vision powered by NVIDIA Jetson TX1 embedded AI supercomputers.

Although initially designed to follow forest trails to rescue lost hikers or spot fallen trees, the low-flying autonomous drone could work far beyond the forest — in canyons between skyscrapers or inside buildings, for example — where GPS is inaccurate or unavailable.

“This works when GPS doesn’t,” said Nikolai Smolyanskiy, the NVIDIA team’s technical lead. “All you need is a path the drone can recognize visually.”

To keep costs low, researchers built their drone with off-the-shelf components. The drone navigates without GPS and relies instead on deep learning.

No GPS? No Problem

Although the technology is still experimental, it could eventually search for survivors in damaged buildings, inspect railroad tracks in tunnels, check stock on store shelves, or be adapted to examine communications cables underwater, Smolyanskiy said.

The team’s already trained it to follow train tracks and ported the system to a robot-on-wheels to traverse hallways. The drone also avoids obstacles like people, pets or poles.

“We chose forests as a proving ground because they’re possibly the most difficult places to navigate,” he said. “We figured if we could use deep learning to navigate in that environment, we could navigate anywhere.”

Unlike a more urban environment, where there’s generally uniformity to, for example, the height of curbs, shape of mailboxes and width of sidewalks, the forest is relatively chaotic. Trails in the woods often contain no markings. Light can be filtered through leaves; it also varies from bright sunlight to dark shadows. And trees vary in height, width, angle and branches.

Flight Record

To keep costs low, the researchers built their device using an off-the-shelf drone equipped with the NVIDIA Jetson TX1 and two cameras.

“Our whole idea is to use cameras to understand and navigate the environment,” Smolyanskiy said. “Jetson gives us the computing power to do advanced AI onboard the drone, which is a requirement for operating in remote environments.”

The NVIDIA team isn’t the first to pursue a drone that navigates without GPS, but the researchers achieved what they believe is the longest and most stable flight of its kind. Their fully autonomous drone flies along the trail for a kilometer (about six-tenths of a mile), avoiding obstacles and maintaining a steady position in the center of the trail.

Team member Alexey Kamenev played a big role in making this happen. He developed deep learning techniques that allowed the drone to smoothly fly along trails without sudden movements that would make it wobble. He also reduced the need for massive amounts of data typically needed to train a deep learning system.

In the video above, the drone follows a trail in the forest near the researchers’ Redmond, Wash., office. The areas in green are where the robot decided to fly and the red areas are those it rejected.

No Breadcrumbs Needed

The drone learned to find its way by watching video that Smolyanskiy shot along eight miles of trails in the Pacific Northwest. He took the video in different lighting conditions with three wide-angle GoPro cameras mounted on the left, center and right of a metal bar on a mini Segway.

In addition to their own footage, researchers trained their neural network — called TrailNet — on video recorded on trails in the Swiss Alps by AI researchers at the Istituto Dalle Molle di Studi sull’Intelligenza Artificiale (IDSIA) in Switzerland.

In fact, IDSIA’s work on drone forest navigation was one inspiration for NVIDIA’s autonomous drone team. The other inspiration was NVIDIA’s self-driving car, BB8.

Next Steps

The team now plans to create downloadable software for Jetson TX1 and Jetson TX2 so others can build robots that navigate based on visual information alone.

Long term, the idea is to tell the robot to travel between two points on any map — whether it’s a Google map or a building plan — and have it successfully make the trip, avoiding obstacles along the way.

For more information about the team’s work, see “Toward Low-Flying Autonomous MAV Trail Navigation using Deep Neural Networks for Environmental Awareness” or watch their talk at the GPU Technology conference.

 

Read more…
3D Robotics

New two-motor VTOL from Horizon

This kind of 2-motor vertical take-off plane was a PhD thesis 2 years ago, a TED talk 1 year ago & now it's a $150 toy. From Horizon Hobby:

Key Features

  • Multirotor versatility and sport plane agility
  • Takes off and lands vertically in small areas
  • Fly slow or fast and perform aerobatics in airplane mode
  • Can be hand launched and belly-landed like a conventional wing
  • Simple tail-sitter design and SAFE technology make VTOL flying easy
  • Stability and Acro modes that provide a wide range of flight performance
  • Optional and patent-pending FPV camera and servo-driven mechanism (sold separately)
  • 280-size brushless motors compatible with 2S 450-800mAh LiPo batteries
  • Outstanding speed and climb performance
  • Lightweight and extremely durable EPO airframe
  • Colorful decal sheet with multiple trim scheme options
  • Ready to fly within minutes of opening the box
  • Propeller guards and vertical fins that are easy to install or remove
Needed to Complete
  • Full-range, 6+ Channel DSMX®/DSM2® transmitter
  • 450-800mAh 2S LiPo flight battery
  • 2S compatible LiPo charger
What's in the box?
  • (1) X-VERT VTOL Airplane
  • (1) 3-in-1 Receiver/ESC/Flight Controller Unit
  • (2) BL280 2600Kv Brushless Outrunner Motor
  • (4) Decal Sheets
  • (1) User Manual

Overview

The X-VERT™ VTOL gives you all the fun and versatility of a Vertical Take Off and Landing aircraft without the need for complex mechanics or fancy programming. It also makes the transition between multirotor and airplane flight as easy as flipping a switch. You can also take your flight experience to a whole different level using the optional and patent-pending FPV camera and servo-driven mechanism that transition automatically when the X-VERT does (FPV gear sold separately).

Sleek and Simple Design

A lot of VTOL aircraft require complex mechanisms like tilting wings and motors to achieve vertical and forward flight. The X-VERT park flyer's simple, tail-sitter design and SAFE® stabilization technology allow it to fly like an airplane or a multirotor using nothing more than differential thrust and its elevons. The simplicity of this design also makes the lightweight, EPO airframe remarkably durable.

Wide, Pilot-Friendly Flight Envelope

The light wing loading and efficient aerodynamics inherent in the aircraft's design play a big role in making it easy to fly, especially in airplane mode. Fast or slow, pilots will enjoy smooth, predictable response at any speed.

SAFE® Flight Control Software Makes it Easy

At the heart of it all is exclusive SAFE (Sensor Assisted Flight Envelope) flight control software that has been expertly tuned so almost any RC pilot can experience the fun of VTOL flight.

Automated Transition

Making the transition between multirotor and airplane flight is as simple as flipping a switch. The flight controller will automatically transition the aircraft from one to the other using SAFE technology to stabilize everything so you can relax and have fun.

3 Flight Modes

The advanced flight control software features three flight modes that, along with the model's light wing loading and efficient aerodynamics, give you a wide range of performance.

-   Multirotor Stability Mode

This mode allows you to take off and land vertically like a multirotor aircraft. It's also great for indoor flight. In this mode the aircraft's tail remains pointed at the ground while the flight controller uses a combination of differential thrust and elevons to control pitch, bank and rotation. Pitch and bank angles are limited, and SAFE technology will work to keep the model in a stable hover whenever you release the sticks.

-   Airplane Stability Mode

In this mode the aircraft responds to pitch, roll and yaw commands like a typical airplane. SAFE technology limits pitch and bank angles so new pilots can experience airplane flight without accidentally rolling upside down or losing control. It will also return the wings to level whenever the sticks are released.

-   Airplane Acro Mode with AS3X® Technology

In Airplane Acro Mode the model becomes an agile, fully aerobatic flying wing. There are no angle limits or self-leveling. The large, long-throw elevons will allow you to perform incredibly tight turns as well as a wide range of aggressive aerobatic maneuvers. And because you have the forward thrust of two brushless motors working for you, there is plenty of speed and power to spare. You can even use the differential thrust of the motors for yaw control to perform wild spinning and tumbling maneuvers.

As you fly, AS3X technology works behind the scenes to smooth out the effects of wind and turbulence. It feels completely natural, too. It won't interfere with or limit your control in any way. You simply enjoy a sense of stability and precision that makes you feel like you're flying a much bigger aircraft.

Customize Your Trim Scheme

The included decal sheet gives you multiple trim scheme themes to choose from - a military theme with bomb and rocket decals, and different sport themes with vibrant colors. You can even personalize your trim scheme by mixing and matching decals from different themes.

FPV Ready

You can also take your flight experience to a whole different level using the optional and patent-pending FPV camera and servo-driven mechanism that transition automatically when the X-VERT does (FPV gear sold separately).

Super-Simple Transmitter Setup

The model comes equipped with a Spektrum receiver that is built into the flight controller. It can be flown with any full-range, 6+ channel DSMX/DSM2 aircraft transmitter. No complex programming or setup is required. All you have to do is assign switches for the flight mode/transition changes and throttle arming.

Read more…
3D Robotics

Dronecode/PX4 1.6 code released!


From the Dronecode release post:

We’re very excited to announce the release of PX4 v1.6, the latest version of the Dronecode Flight Stack (PX4 Pro). This firmware represents a huge increase in both usability, functionality, and stability/robustness since our last significant delivery back in August 2016 (PX4 v1.5.0).

Just a few of the new features and enhancements in this release are:

  • New flight modes for Fixed Wing – Acro and Rattitude
  • New uLog logging format that directly logs uORB topics for more accurate time stamping. This is already supported for review and analysis here: http://review.px4.io
  • Improvements to camera triggering to make it easier to use and provide better real-time feedback
  • Support for survey flights in multicopter and fixed wing with an intuitive UI
  • Temperature calibration and compensation
  • Support for MAVLink and PWM controlled gimbals
  • Support for generic helicopters and Blade 130 mixer
  • Improved robustness in EKF2 and hardening against marginal GPS reception.
  • Significant improvements to user experience for both the Qualcomm Snapdragon Flight and the Intel® Aero Ready to Fly Drone
  • Support for STM32F7, and an update of NuttX to a recent release
  • New hardware support including the Crazyflie v2, FMUv4 PRO and FMUv5 (Special thanks to Drotek and Team Blacksheep for donating the FMUv4 and FMUv5 hardware!)

This is also the most tested and hardened PX4 release to date. A dedicated test team has done hundreds of hours of testing on all the major vehicle platforms, using all the main reference flight controller hardware.

A breakdown of the testing since the last stable release (1.5.5) is listed below:

  • 2257 commits tested.
  • 847 total flights on 12 different vehicles and 6 different flight controllers:
    • Pixhawk mini (DJI F450): 554
    • Pixhawk mini (Generic Quad): 11
    • Pixhawk 1 (DJI F450): 15
    • Pixhawk mini (Hexa): 11
    • Pixhawk mini (Phantom FW): 17
    • Pixhawk mini (QAV 250): 28
    • Pixracer (DJI F450): 34
    • Pixracer (Flipsport): 140
    • Pixhawk 3 Pro (DJI F450): 27
    • Dropix (VTOL): 1
    • Intel® Aero Ready to Fly Drone: 6
    • Snapdragon (200qx): 3
  • 6 releases tested: 1.6.0-rc1, 1.6.0-rc2, 1.6.0-rc3, 1.6.0-rc4, 1.6.0, 1.6.1
  • 22 PR’s tested: 6362643864406505663367566777686268636920700370097017703670957260726572687274728172877346

The firmware is already available in QGroundControl (for access to the best UI you may choose to use the “daily build” here). We owe a huge debt of gratitude to the whole PX4 Development Team for this outstanding work.

Check out the release notes for more information

Read more…
3D Robotics

Awesome post from Patrick Poirier in the OpenMV forums (OpenMV is my favorite computer vision board, and what I use on our DIY Robocars autonomous racers).

--------

This project is a variation of Randy's original Red Balloon Finder implementation.
Based on this blog: http://diydrones.com/profiles/blogs/red-balloon-finder, I modified the Python scripts, making it possible to test on any ArduPilot-based quadcopter with a low-budget, relatively easy-to-implement controller.

Code:

    #This is the configuration script, allowing usage of the configuration file
    import balloon_config
    #We need these modules (installed with dronekit) to control the vehicle
    from pymavlink import mavutil
    from dronekit import connect, VehicleMode, LocationGlobal
    # connect to vehicle with dronekit

    #MAIN
                # only process images once home has been initialised
                if self.check_home():
                    # check if we are controlling the vehicle
                    self.check_status()
                    # look for balloon in image
                    self.analyze_image()
                    # search or move towards balloon
                    if self.search_state > 0:
                        # search for balloon
                        self.search_for_balloon()
                    else:
                        # move towards balloon
                        self.move_to_balloon()

    # move_to_balloon - velocity controller to drive vehicle to balloon
    # calculate change in yaw since we began the search
    # get speed towards balloon based on balloon distance
    # apply min and max speed limit
    # apply acceleration limit
    # calculate yaw correction and final yaw movement
    # calculate pitch correction and final pitch movement
    # calculate velocity vector we wish to move in
    # send velocity vector to flight controller
    send_nav_velocity(pitch_final, yaw_final, speed)

    # complete - balloon strategy has somehow completed so return control to the autopilot
    # stop the vehicle and give up control
    # if in GUIDED mode switch back to LOITER
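The send_nav_velocity() call above is where vision output becomes vehicle motion. A minimal sketch of how such a velocity controller can be implemented with dronekit (this is the standard SET_POSITION_TARGET_LOCAL_NED pattern; the repository's actual function works in pitch/yaw/speed terms but sends the same kind of message):

from dronekit import connect
from pymavlink import mavutil

vehicle = connect('/dev/ttyUSB0', baud=115200, wait_ready=True)

def send_nav_velocity(vx, vy, vz):
    # Build a SET_POSITION_TARGET_LOCAL_NED message with only the
    # velocity fields enabled (the type_mask ignores position/accel/yaw).
    msg = vehicle.message_factory.set_position_target_local_ned_encode(
        0, 0, 0,                                   # time_boot_ms, target system, target component
        mavutil.mavlink.MAV_FRAME_BODY_OFFSET_NED, # velocities relative to vehicle heading
        0b0000111111000111,                        # enable vx, vy, vz only
        0, 0, 0,                                   # position (ignored)
        vx, vy, vz,                                # velocity in m/s
        0, 0, 0,                                   # acceleration (ignored)
        0, 0)                                      # yaw, yaw_rate (ignored)
    vehicle.send_mavlink(msg)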

OpenMV Script
The Red Balloon Finder is a typical colored BLOB Detector/Tracker. 
We are adding a serial output to PORT 3 so the x-y location and blob width & height can be transmitted to the RPI Zero.
uart_baudrate = 9600
uart = pyb.UART(3, uart_baudrate, timeout_char = 1000)
uart.write("%d ; %d ; %d ; %d \r\n " % (blob.cx(), blob.cy(),blob.w(),blob.h()))
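Putting those pieces together, a minimal OpenMV-side sketch might look like the following (the LAB threshold below is a generic red and has to be tuned in the IDE, as described later; the framing is mine, not necessarily the exact script):

import sensor, pyb

sensor.reset()
sensor.set_pixformat(sensor.RGB565)
sensor.set_framesize(sensor.QVGA)            # 320 x 240, matching x_res/y_res below
sensor.skip_frames(time=2000)

red_threshold = (30, 100, 15, 127, 15, 127)  # (L min, L max, A min, A max, B min, B max)
uart = pyb.UART(3, 9600, timeout_char=1000)

while True:
    img = sensor.snapshot()
    blobs = img.find_blobs([red_threshold], pixels_threshold=100,
                           area_threshold=100, merge=True)
    if blobs:
        b = max(blobs, key=lambda blob: blob.pixels())   # track the largest red blob
        uart.write("%d ; %d ; %d ; %d \r\n" % (b.cx(), b.cy(), b.w(), b.h()))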


Some theory
In vision-based systems, there are many types of hardware/software configurations tailored for specific applications: Visual Servoing, Visual Odometry, and Visual Simultaneous Localization And Mapping (SLAM). In this project we are using the first type, Visual Servoing, which is designed for:
• Takeoff and landing
• Obstacle avoidance/tracking
• Position and attitude control
• Stabilization over a target

The main idea of Visual Servoing is to regulate the pose {Cξ,T} (position and orientation) of a robotic platform relative to a target, using a set of visual features {f} extracted from the sensors.

[Image source: Survey on Computer Vision for UAVs: Current Developments and Trends]

Randy's target tracker is an Image Based Visual Servoing (IBVS) system, where 2D image features are used to compute the control values. We exploit a hybrid method where the size of the object is known a priori, making the estimation of distance along the Z axis possible. In the example below, where the system is following a moving target at a fixed distance, we can relate the target position to the camera's projected plane.

[Image source: 3D Object following based on visual information for Unmanned Aerial Vehicles]

In this tracker, we apply color and shape (blob) filtering in order to extract a location on the camera plane.
We need to extract the position and estimate the distance by combining the known object size with the camera parameters:
lens_mm = 2.8
lens_to_camera_mm = 22
sensor_w_mm = 3.984
sensor_h_mm = 2.952
x_res = 320
y_res = 240

The field of view derives both from the focal length of the lens and the size of the image sensor.
h_fov = 2 * math.atan((sensor_w_mm / 2) / lens_mm)
v_fov = 2 * math.atan((sensor_h_mm / 2) / lens_mm)
In degrees: h_fov = 70.85, v_fov = 55.60
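As a quick sanity check, those numbers can be reproduced in a few lines:

import math

lens_mm = 2.8
sensor_w_mm, sensor_h_mm = 3.984, 2.952

h_fov = 2 * math.atan((sensor_w_mm / 2) / lens_mm)
v_fov = 2 * math.atan((sensor_h_mm / 2) / lens_mm)
print(math.degrees(h_fov), math.degrees(v_fov))   # ~70.85, ~55.60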

[Image: field-of-view geometry, from http://chrisjones.id.au/FOV/fovtext.htm]


The values corresponding to the object position and size are transmitted from the OpenMV to the RPI Zero using serial communication. The Python scripting module openmv.py reads these values and passes them to the analyze_image() process within balloon_strategy_2.py in order to transform the position and distance. Here is how we compute the distance:

Code:

    # get distance
    self.balloon_distance = get_distance_from_pixels(size, balloon_finder.balloon_radius_expected)

    # ==== FROM find_balloon.py ====
    # define expected balloon radius in meters
    self.balloon_radius_expected = balloon_config.config.get_float('balloon','radius_m',0.3)

    # get_distance_from_pixels - returns distance to balloon in meters given number of pixels in image
    #    size_in_pixels : diameter or radius of the object on the image (in pixels)
    #    actual_size : diameter or radius of the object in meters (0.3)
    def get_distance_from_pixels(size_in_pixels, actual_size):
        # avoid divide by zero by returning 9999.9 meters for zero sized object
        if (size_in_pixels == 0):
            return 9999.9
        # convert num_pixels to angular size
        return actual_size / balloon_video.pixels_to_angle_x(size_in_pixels)

    # ==== FROM balloon_video.py ====
    # get image resolution
    self.img_width = balloon_config.config.get_integer('camera','width', 320)
    # define field of view
    self.cam_hfov = balloon_config.config.get_float('camera','horizontal-fov', 70.85)

    # pixels_to_angle_x - converts a number of pixels into an angle in radians
    def pixels_to_angle_x(self, num_pixels):
        return num_pixels * math.radians(self.cam_hfov) / self.img_width
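A worked example of that small-angle math, using the parameters above:

import math

cam_hfov, img_width = 70.85, 320
radius_m = 0.3                            # expected balloon radius in meters

def pixels_to_angle_x(num_pixels):
    return num_pixels * math.radians(cam_hfov) / img_width

# a balloon whose radius spans 20 pixels in the image:
print(radius_m / pixels_to_angle_x(20))   # ~3.9 m away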

Building the Tracker System
Depending on the type and size of UAV you are planning to use, the implementation may vary. This build is for a 330 mm quadcopter with a Pixracer flight controller and a single 5 V, 3 A BEC. It is assumed that the UAV is completely functional, tested, and tuned to be "rock solid" in Loiter & Guided modes.




Loading the code, setting parameters & testing 
Test & development environment:
⦁ RPI Zero connected to a powered USB hub, keyboard, mouse, WiFi, FTDI serial, and HDMI monitor
⦁ Windows-based PC for OpenMV & Mission Planner
⦁ Linux (or VM) based PC for SITL


OpenMV script
Open the IDE
Load the script
Adjust the color filter using Tools/Machine Vision/Threshold editor
Make sure you are using the LAB color space
Cut the values & paste them into the OpenMV script. These values must be inserted in the appropriate thresholds filter code (generic red thresholds for our example):
Run the test and confirm that it is tracking steadily
When satisfied, Tools/Save open script to OpenMV

Python Script
On your RPI, open a terminal window and clone the program
git clone https://github.com/patrickpoirier51/OpenTracker

_The code is still a WIP, sorry for the mess ;-)_

Move to OpenTracker/script and, with an editor (simply open the file with the file manager), adjust the parameters in balloon_finder.cnf if required. You can test whether the OpenMV-to-RPI-Zero connection is working correctly by launching openmv.py from the command line (sudo python openmv.py) or by running the script in the IDLE2 editor. Remove the # before the print command in order to see the values on the console. You can run commands without sudo, but you may sometimes lack the privileges to access code and devices; it's up to each user to test with or without sudo.

Hint:
Check for the activity light on the FTDI (show/hide the object); knowing the pattern will be helpful once you are testing live.

Once satisfied, comment out the print command, save, and you are ready for testing.


Testing in SITL
Practice makes perfect, and using SITL may save your day ;-)
You can connect the RPI Zero's FTDI USB-to-serial converter to a second FTDI USB-to-serial adapter on a Linux-based computer (don't forget to cross TX to RX).
Launch SITL:
/home/Ardupilot/ArduCopter/$ sim_vehicle.py --console --map --aircraft=balloon --out udp:mission.planner.pc.address:14550 --out /dev/ttyUSB0,115200

On the RPI, start the tracker within a terminal session with sudo python balloon_strategy.py, or run the script in the IDLE2 editor. Make sure it connects using MAVLink over serial-USB:
connect('/dev/ttyUSB0', baud=115200)

You should see on the RPI console the connection to the vehicle, the parameter initialisation sequence, and the tracker waiting for a command.

On SITL you can initiate this sequence:
⦁ mode loiter
⦁ arm Throttle
⦁ rc 3 1800
⦁ takeoff 30 (wait until it reaches this altitude)
⦁ rc 3 1500 (keep altitude in LOITER)
⦁ mode guided

Once in GUIDED, the RPI takes command and you will see the quadcopter start turning, searching for the object.
You can test by ''showing'' the object to the camera for a brief period of time; you should see the "Found Balloon at heading:%f Alt:%f Dist:%f" message appear. Hide the object and wait until the "Balloon was near expected heading, finalizing yaw to %f" message appears, starting the final alignment to the object. Then ''show'' the object again and start the chase when you see "Finalized yaw to %f, beginning run".

Now the simulated vehicle will follow the object on the camera. Looking at the OpenMV IDE, you will be able to judge whether the simulator reacts according to the target position on the camera screen. Basically, the vehicle will move left-right and up-down (altitude) according to where the object is located relative to the center of the camera; if it's dead center, the simulated quadcopter should go straight with no altitude change.

If you hide the object:

Code:

    # if we lose sight of a balloon for this many seconds we will consider it lost and give up on the search
    self.lost_sight_timeout = 3

The tracker will then give up ("Lost Balloon, Giving up") and go through this sequence:

    # complete - balloon strategy has somehow completed so return control to the autopilot
    # stop the vehicle and give up control
    # if in GUIDED mode switch back to LOITER

Once in LOITER you can run another test just by switching back to GUIDED.




Quadcopter Integration
Here is a picture showing the typical installation:


ArduPilot
Adjust the serial port parameters (UART, speed, mode) to match the actual speed (115200) and the MAVLink protocol.

Configure the flight modes so it's easy for you to switch from LOITER to GUIDED on the transmitter.


Headless configuration on the RPI Zero
In order to get maximum speed and power, disable desktop mode (via the configuration menu or raspi-config).
There are many ways to get the script running automatically; the simplest is launching it from /etc/rc.local.
Edit the file with sudo nano /etc/rc.local and add this:
python /home/pi/OpenTracker/blob/scripts/balloon_strategy_2.py &

The "&" makes Python run in the background.
Save and reboot 
Check for activity light on FTDI (Show / Hide object) 



Live Testing
Choose the best conditions: field, wind, brightness, midday

Choose the appropriate object for the target == red balloon
Place the target at least 2 meters from the ground
Make the final adjustment of the target with the OpenMV IDE using the Threshold utility
Set the best color filter mask and blob size
Scan around and make sure you don't get "false positives"

Start the quadcopter
Check for activity light on FTDI (Show / Hide Target)

Arm
Take off in Loiter
Switch to Guided (ready to switch back)

Check the sequence, analyze the results, repeat.... Have fun!!!

Read more…
3D Robotics


This week the Dronecode/PX4 team will release its biggest and best update, version 1.6. You can see a glimpse of the testing that went into it with our public test log server here

Along with all the automated code and flight simulator testing, there are more than 100 hours of real-world flight testing (on top of many times that figure from the beta testers) by the full-time Dronecode Test Team, as shown here.


Read more…
3D Robotics

Comparing two low-cost scanning Lidars

[Image: RP Lidar A2 (left) and Scanse Sweep (right)]

Excerpted from a new post at our DIY Robocars sister site

It’s now possible to buy small scanning 2D Lidars for less than $400 these days, which is pretty amazing, since they were as much as $30,000 a few years ago. But how good are they for small autonomous vehicles?

I put two to the test: the RP Lidar A2 (left above) and the Scanse Sweep (right). The RP Lidar A2 is the second lidar from Slamtec, a Chinese company with a good track record. Sweep is the first lidar from Scanse, a US company, and was a Kickstarter project based on the Lidar-Lite 3 1D laser range finder unit that was also a Kickstarter project a few years ago (I was an adviser for that) and is now part of Garmin.

The good news is that both work. But in practice, the difference between them becomes very stark, the biggest being the four times higher resolution of the RP Lidar A2 (4,000 points per second versus Sweep’s 1,000), which makes it actually useful outdoors in a way that the Sweep is not. Read on for the details.

First, here are the basic spec comparisons:

[Image: spec comparison table]

Bottom line: RP Lidar A2 is smaller, much higher resolution, and better range indoors (it’s notable that the real-world RP Lidar performance was above the stated specs, while the Scanse performance was below its stated specs). The Scanse desktop visualization software is better, with lots of cool options such as line detection and point grouping, but in practice you won’t use it since you’ll just be reading the data via Python in your own code. Sadly the Scanse code that does those cool things does not appear to be exposed as libraries or APIs that you can use yourself.

In short, I recommend the RP Lidar A2.

I tested them both in small autonomous cars, as shown below (RP Lidar at left)

[Image: the two lidars mounted on small autonomous cars]

Both have desktop apps that allow you to visualize the data. Here’s a video of the two head-to-head scanning the same room (RP Lidar is the window on the right).

You can see the difference in resolution pretty clearly in that video: the RP Lidar just has four times as many points, and thus four times higher angular resolution. That means it can not only see smaller objects at a distance, but the objects it does see have four times as many data points, making it much easier to differentiate them from background noise.

As far as using them with our Raspberry Pi autonomous car software goes, it’s a pretty straightforward process of plugging them into the Raspberry Pi via the USB port (the RP Lidar should be powered separately; see the notes below) and reading the data with Python. My code for doing this is in my Github repository here. We haven’t decided how best to integrate this data with our computer vision and neural network code, but we’re working on that now — watch this space.
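Reading the RP Lidar from Python is the kind of thing the community rplidar package (pip install rplidar) makes easy; here is a sketch under that assumption (the code in my repository may do it differently):

from rplidar import RPLidar

lidar = RPLidar('/dev/ttyUSB0')
try:
    for scan in lidar.iter_scans():
        # each scan is a list of (quality, angle_deg, distance_mm) tuples
        nearest = min(scan, key=lambda m: m[2])
        print("nearest return: %.2f m at %.0f deg" % (nearest[2] / 1000.0, nearest[1]))
finally:
    lidar.stop()
    lidar.disconnect()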

Read the rest here

Read more…
3D Robotics

An Australian IT services company combined an Amazon Alexa with a service called "what3words," which translates GPS coordinates into three unique words, to create a MAVLink-compatible drone that can be commanded entirely by voice. The video can't be embedded, so see it here.

DXC Labs has created an experimental voice-activated (Amazon Alexa) and cloud-controlled (AWS IoT) drone that uses three-word identifiers from what3words to provide precise location directions to within three square meters, anywhere in the world. This means the drone operator (such as a first responder or maintenance worker) can easily give a voice command such as “go to location: public warns artist” that will send the drone to a specific place on earth — in this case the historic shipwreck of the HMVS Cerberus, south of Melbourne, Australia. 

DXC Labs: Voice- and Cloud-Controlled Drone

Getting to Hard-to-Reach Places

 
The what3words service allows user-friendly routing of the computer-controlled drone to locations that may not have a conventional street address, such as plant and equipment locations, a missing person in a national park, a fire in a large campus, or any location whose street address is inaccurate or ambiguous. 

This greatly improves activities such as identifying disaster zone locations for first responders, inspecting power lines and oil rigs, making deliveries to hard-to-reach places, and traveling to any point in the world.
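DXC hasn't published the internals, but the word-to-coordinates half is easy to sketch against the public what3words REST API (the API key and the MAVLink glue are assumptions here, not details from the demo):

import requests

def words_to_latlon(words, api_key):
    # what3words v3 endpoint: resolve a three-word address to lat/lng
    r = requests.get("https://api.what3words.com/v3/convert-to-coordinates",
                     params={"words": words, "key": api_key})
    r.raise_for_status()
    c = r.json()["coordinates"]
    return c["lat"], c["lng"]

# e.g. the voice command "go to location: public warns artist" becomes:
#   lat, lon = words_to_latlon("public.warns.artist", "MY_API_KEY")
# and that lat/lon can then be sent to the vehicle as a MAVLink goto.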

Read more…
3D Robotics

3689716615?profile=original

These aren't drones per se, but I can't help but post this story by Owen Churchill in Sixth Tone, which exemplifies the DIY spirit. 

CHONGQING, Southwest China — Dismembered remote-controlled airplanes lie strewn across an unmade mattress: a motor glued crudely into an improvised wooden housing, landing gear fashioned out of a metal ruler, and sheets of foam advertising board that will become wings, fuselages, and flaps.

This is the spare room of Hu Bo, an 18-year-old from a village two hours from Chongqing who has turned his hand to making flyable models of China’s home-grown aircraft. He started with decades-old military planes and has recently worked on the new flagbearer of the country’s civil aviation industry, the C919 passenger jet. Using self-taught techniques, open-source plans downloaded from the internet, and cheaply acquired or improvised parts, Hu sells his finished models online — unless he crashes them during the test flight. “My technique isn’t so good,” he tells Sixth Tone with a smile as he tinkers with his latest build, a 1.4-meter-wide, twin-propeller plane modeled on China’s Y-12 utility aircraft.

Like millions of other Chinese children, Hu grew up under the guardianship of his grandparents while his parents traveled in search of work, dividing his time between doing homework, playing with the family’s dog, and making intricate paper airplanes at the family home in Yangliu Village, a tiny hamlet around 100 kilometers west of Chongqing.

But now, having scraped together money to buy some basic tools, Hu has joined the ranks of China’s rising number of amateur aviation enthusiasts, spurred on by a huge yet inconsistently regulated drone industry and inspired by the increasing prowess of the country’s home-grown fleet of both military and civilian aircraft. A number of fifth-generation fighter jets are slated to enter service in the next few years; the maiden flight of the world’s widest seaplane — the AG600 — is scheduled for this year; and the country’s first large passenger jet in decades, the C919, took to the skies for the first time on May 5, catalyzing the emergence of a new generation of patriotic plane spotters despite its plethora of foreign parts.

For poor people like us, we have time but no money, so we have to make it ourselves.

Hu has never been on a plane, nor has he ever purchased a complete remote-controlled plane. While he was inspired to start building planes after seeing local friends discussing the latest and best modeling equipment on social media, he has little respect for people who throw money at the hobby. “They are all renminbi fliers,” he says, referring to China’s monetary currency. “For poor people like us, we have time but no money, so we have to make it ourselves. For them, they have money but no time, so they just buy everything outright.”

Buyers on Xianyu — the secondhand version of China’s premier online marketplace, Taobao — have already praised the caliber of Hu’s models, especially the C919. But at least for now, Hu has little interest in making a profit from his planes — if he believes the buyer is a genuine plane enthusiast who will cherish his models, he doesn't charge much more than the cost of the materials, making around 60 yuan (just under $9) for up to a week’s work.

It’s barely a living wage, and it does little to ease the acute financial pressure that his family currently faces. Six months ago, Hu and 22-year-old Xu Xifang became parents. Hu introduces Xu as his wife, but you won’t find their names on any of China’s marriage records: At 18, Hu is still four years away from the country’s marriageable age for men. Also in the family home is his 8-year-old sister and their 65-year-old grandmother, who works 12-hour shifts at a local waste collection plant for 50 yuan per day.

He may barely be covering his costs, but for Hu, building and flying planes has provided welcome respite to the desperate pursuit of livelihood that has defined his childhood. At 14, he quit school to join his parents in China’s southern Guangdong province to work in a brick factory, where he stayed for four years before returning to his home village to settle down and start a family.

“I don’t want my son to be like me,” Hu says, reflecting on his own experience of living without his parents. “No matter where I am, whether I do manual work or business, whether I have nothing at all, I will always stay beside him.”

Indeed, Hu’s pastime-cum-business has become something of a family affair, with his wife, Xu, helping out when she can with printing and cutting the plans. “At the beginning, I didn’t like that he was doing this — I didn’t think he had any hope,” she says as she slices around the cockpit of a C919. “But then I watched him doing it and saw that he was so happy, so I just let him keep going.”

As Hu squats on the floor putting the final touches on his Y-12, his 6-month-old son An An looks on, engrossed, even reaching out from time to time to prod a wing or grab a propeller. A cool breeze one recent afternoon brought a brief lull in the stifling heat and meant that Hu could take his son up into the hills that surround their home to witness the plane’s maiden flight. “The look in his eyes when he sees a model plane is different to when he sees other toys,” says Hu, back in the house after an accident-free test flight. “Today, I saw that look.”

My grandfather made wheelbarrows, my dad made little toys, and now I make airplanes. I think my son will take the next step and make satellites.

Hu has high hopes for his son. By the time Hu is 30, he wants to have built a plane that he himself can fly in, a project he says he will need his son’s help with. But his aerospace aspirations for An An don’t stop there. “My grandfather made wheelbarrows, my dad made little toys, and now I make airplanes,” he says. “I think my son will take the next step and make satellites.”

For now, Hu may have to concern himself with matters closer to home. Until recently, Hu had been free to fly what he wanted, where he wanted. But that looks set to change, as local authorities around the country scramble to limit the use of remote-controlled aircraft following a series of close encounters between drones and passenger jets. One of the most recent occurred at nearby Chongqing’s Jiangbei Airport, when two instances of drone interference were reported in one evening.

Hu calls it a “day of sadness” for model plane enthusiasts like him. Since the incident, local authorities have stipulated that all model plane fliers must submit their personal information and aircraft specifications to a local government representative. Such restrictions appear to be more stringent than the drone-focused regulations currently being rolled out nationwide, which do not mention fixed-wing model airplanes. Flying zones are also being restricted, and now the nearest designated area to Hu’s home is 50 kilometers away.

Despite his sadness, Hu calls the restrictions reasonable and vows that he would never flagrantly violate them. His recent test flight of the Y-12 in the nearby hills was an exception, he says, explaining that he made sure to control his height, speed, and distance. “Even a bird can be dangerous if it strikes an aircraft,” he says, “let alone a drone weighing 10, 20 kilograms.”

Hu Bo carries his model Y-12 aircraft after taking it out for a flight in Dazu County, Chongqing, May 16, 2017. Wu Yue/Sixth Tone


Regulations aside, there are other matters that stand in the way of Hu’s passion. His parents recently moved from Guangdong to Chengdu in Chongqing’s neighboring Sichuan province to join a relative’s wholesale grocery company. Hu has been earmarked as one of the company’s delivery drivers, and so has spent the last few weeks preparing for his driving test.

The day after the maiden flight of Hu’s Y-12, he and Xu made the 230-kilometer bus ride to Chengdu, leaving behind Hu’s grandmother and sister. True to his vow never to leave his son, Hu has taken An An with him; true to his passion, Hu has also taken some basic tools and the plans for a massive, 2.4-meter-wide model of China’s new military transporter, the Y-20, despite the fact that he’ll struggle to find anywhere in the metropolis of Chengdu to fly it.

“If I go a day without making planes, then I feel like a smoking addict who hasn’t had a cigarette,” says Hu. One day, he says, he’ll harness that addiction and open a small workshop with a few employees making several planes a day. “That way, my family can have a better life.”

Read more…
3D Robotics

From New Atlas, a writeup on new research from ETH and MIT:

Thanks to new research from MIT and ETH Zurich, however, it may soon be possible for drones to autonomously follow along with an actor, keeping their face framed "just right" the whole time – while also avoiding hitting any obstacles.

To utilize the experimental new system, operators start by using a computer interface to indicate who the drone should be tracking, how much of the screen their face or body should occupy, where in the screen they should be, and how they should be oriented toward the camera (choices include straight on, profile, three-quarter view from either side, or over the shoulder).

Once the drone is set in action, the computer wirelessly sends it control signals that allow it to fly along with the actor as they walk, adjusting its flight in order to maintain the shot parameters. This means that if the actor were to start turning their back on the drone, for instance, it would automatically fly around in front of them, to keep their face in the shot. Likewise, if they started walking faster, the drone would also speed up in order to keep them the same distance from the camera.

It's additionally possible for the aircraft to follow small groups of actors, working to keep that group framed a certain way within the shot. The user can stipulate one of those actors as the main subject, ensuring that the drone moves in order to keep other actors from blocking the camera's view of them.

The system utilizes algorithms that predict the actor's trajectory about 50 times a second, allowing the aircraft to effectively stay one step ahead of the action. This also allows it to correct its own flight path if its onboard sensors detect that it's heading toward a stationary obstacle, or if a moving obstacle (such as an actor) is on a collision course with it.
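The paper has the real algorithms; as a toy illustration of the idea only (none of this is taken from the MIT system), a constant-velocity predictor ticking 50 times a second looks like this:

DT = 1.0 / 50.0          # prediction period in seconds

def predict(position, prev_position, horizon_s=1.0):
    # Extrapolate the actor's 2D position horizon_s seconds ahead,
    # assuming they keep moving at their most recent velocity.
    vx = (position[0] - prev_position[0]) / DT
    vy = (position[1] - prev_position[1]) / DT
    return (position[0] + vx * horizon_s, position[1] + vy * horizon_s)

# The drone then plans toward a pose that keeps the *predicted* position
# framed as requested, rather than chasing the current one.
print(predict((1.0, 2.0), (0.99, 2.0)))   # actor drifting +x at 0.5 m/s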

A team led by MIT's Prof. Daniela Rus will be presenting a paper on the research later this month at the International Conference on Robotics and Automation. The system is demonstrated in the video below.

Source: MIT

Read more…