Thomas J Coyle III's Posts (237)

Sort by
Admin

A big step forward for ArduSub


For those of you who are building your own UROV and would like to use ArduSub to control it, here is the path forward from the Developers of ArduSub:

 

By Rusty at Blue Robotics

ArduSub is the software at the heart of the BlueROV2. It's based on the solid foundation of the ArduPilot code, which has been under development for years. ArduSub is open-source, fully featured, and growing rapidly.

Today we want to share some in-progress news that's been in the works for a long time: we're working on merging the ArduSub code into the main ArduPilot repository at github.com/ardupilot/ardupilot. What does that mean? Well, up to this point, ArduSub has been developed in our own "branch" of the ArduPilot project. By merging into the main project, we'll join the list of official ArduPilot vehicle types: ArduPlane, ArduCopter, and ArduRover. We'll continue developing and maintaining the code ourselves, but we'll be assisted by the awesome developers at the ArduPilot organization. This will also allow us to stay up to date with the latest features, improvements, and bugfixes contributed by the many maintainers.


At the moment, there is a pull-request for merging ArduSub into ArduPilot. You can keep track of that here.

For those of you interested in lots of details, here's the text of the pull request, which explains a lot of the work we've done on ArduSub in the past year:


ArduSub has been in development for just over a year. In that time, we have come a long way. It started by simply copying the ArduCopter directory and poking around to see what we needed to change in order to make our vehicle move around underwater. Once we had accomplished that, and as we became accustomed to the extensive codebase, we progressed by increasing and improving functionality. We had our first stable release right at the end of 2016. We versioned the release as 3.4, in line with where we picked up from Copter. We are currently working on 3.5-dev.

We ship our BlueROV2 running ArduSub on a Pixhawk, and the response from professionals in the marine industry has been overwhelmingly positive. In addition to the BlueROV2, we’ve designed ArduSub to be very flexible, and we have DIY ROV users around the world with different ROV designs and motor configurations. ArduSub is thoroughly documented at ArduSub.com, and we have a very active ArduSub Gitter Channel.

From ArduCopter to ArduSub

The first hurdle was in figuring out how to make our vehicle actually move around underwater. The original development platform, the BlueROV1, has 6DOF, and while it can pitch and roll, it does not need to do so in order to translate in the x and y axes. Our solution was to subclass AP_MotorsMatrix with AP_Motors6DOF, overriding add_motor_raw to include the forward and lateral DOF that multicopters lack.
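
To make the mixer idea concrete, here is a small standalone sketch (not the actual AP_Motors6DOF code, and the thruster factors are invented rather than real BlueROV2 values) showing how per-motor forward and lateral factors extend a multicopter-style mixing matrix:

// Illustrative only: not the actual ArduSub AP_Motors6DOF implementation.
// Shows the idea of extending a multicopter-style mixer with forward and
// lateral contribution factors for each thruster.
#include <array>
#include <cstdio>

struct MotorFactors {
    float roll, pitch, yaw;            // rotational DOF, as in AP_MotorsMatrix
    float throttle, forward, lateral;  // translational DOF added for an ROV
};

int main() {
    // Hypothetical factors for a 6-thruster vectored frame (not real BlueROV2 values).
    std::array<MotorFactors, 6> motors = {{
        { 0.0f, 0.0f, -1.0f, 0.0f,  1.0f,  1.0f},   // vectored horizontal thrusters
        { 0.0f, 0.0f,  1.0f, 0.0f,  1.0f, -1.0f},
        { 0.0f, 0.0f,  1.0f, 0.0f, -1.0f,  1.0f},
        { 0.0f, 0.0f, -1.0f, 0.0f, -1.0f, -1.0f},
        { 1.0f, 0.0f,  0.0f, 1.0f,  0.0f,  0.0f},   // vertical thrusters
        {-1.0f, 0.0f,  0.0f, 1.0f,  0.0f,  0.0f},
    }};

    // Pilot/controller demands, each normalized to [-1, 1].
    float roll = 0.0f, pitch = 0.0f, yaw = 0.2f;
    float throttle = 0.1f, forward = 0.8f, lateral = 0.0f;

    for (size_t i = 0; i < motors.size(); ++i) {
        const MotorFactors &m = motors[i];
        float out = m.roll * roll + m.pitch * pitch + m.yaw * yaw
                  + m.throttle * throttle + m.forward * forward + m.lateral * lateral;
        // Clamp to the normalized range before conversion to PWM.
        if (out > 1.0f) out = 1.0f;
        if (out < -1.0f) out = -1.0f;
        std::printf("thruster %zu -> %+.2f\n", i + 1, out);
    }
    return 0;
}

A multicopter's mixer is the same weighted sum with the forward and lateral columns fixed at zero, which is why the subclassing approach works.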

The second hurdle was achieving the tantalizing prospect of holding depth with a positively or negatively buoyant vehicle. The onboard barometer is in a sealed compartment, and the pressure will obviously not correspond with altitude. The Bar30 pressure sensor incorporates the MS5837 waterproof pressure sensor from Measurement Specialties, the same people who brought you the familiar MS5611. This sensor has almost exactly the same interface as the MS5611, which was a welcome coincidence in the very early stages of development, when we were still learning how everything in ArduPilot worked. We use the MS5611 driver to drive the external MS5837, and we added a few members to the AP_Baro class in order to distinguish between an 'air' barometer and a 'water' barometer. Fortunately for us (and thanks to you guys), there was already support for multiple barometers and an option to set the primary barometer to use with the EKF. We also added a method to the EKF in order to internally set the baro_alt_noise parameter to a low value, because the pressure measurements underwater are very precise.
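
For reference, depth follows from the external pressure reading via the standard hydrostatic relationship; the sketch below uses nominal constants rather than anything taken from the ArduSub source:

// Sketch of converting external water pressure to depth, the quantity the
// depth-hold controller ultimately cares about. Constants are nominal values,
// not the ones ArduSub uses internally.
#include <cstdio>

int main() {
    const float surface_pressure_mbar = 1013.25f;  // atmospheric pressure at the surface
    const float rho = 1025.0f;                     // seawater density, kg/m^3 (~997 for fresh water)
    const float g = 9.80665f;                      // m/s^2

    float measured_mbar = 1216.0f;                 // hypothetical MS5837 reading at depth
    // 1 mbar = 100 Pa, so depth = (P - P0) * 100 / (rho * g)
    float depth_m = (measured_mbar - surface_pressure_mbar) * 100.0f / (rho * g);
    std::printf("estimated depth: %.2f m\n", depth_m);  // about 2 m for this reading
    return 0;
}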

We have three supported flight modes: Manual (no stabilization), Stabilize, and Depth Hold. We have made progress in implementing more advanced position-enabled modes; we've even executed short missions in auto mode. We have also managed to create a working rudimentary model in SITL.

GPS receivers will not work underwater, so we have added an AP_GPS_MAVLINK class in order to support marine industry localization sensors. This class inherits AP_GPS_NMEA, and works by receiving raw NMEA sentence data from the telemetry connection in the form of the GPS_INJECT_DATA message. This was implemented before the AP_GPS_MAV type was added, and there is some overlap in terms of functionality. The advantage of AP_GPS_MAVLINK over AP_GPS_MAV is that the serial data (in the form of NMEA sentences) from a GPS system connected to a topside or companion computer can be sent directly over the MAVLink connection to the vehicle and parsed by the autopilot, with no need to parse the data at the origin before finally formatting the output as a GPS_INPUT MAVLink message. AP_GPS_MAVLINK also eliminates the requirement of reserving a UART for GPS input.
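
As a rough sketch of the topside half of that scheme (not the actual implementation; the 110-byte chunk size follows the common MAVLink definition of GPS_INJECT_DATA, and the NMEA sentence is just an example):

// Take a raw NMEA sentence from a surface GPS/USBL system and split it into
// GPS_INJECT_DATA-sized chunks for the MAVLink link. Serialization of the
// MAVLink packet itself is omitted.
#include <algorithm>
#include <cstdio>
#include <cstring>
#include <string>
#include <vector>

struct GpsInjectChunk {
    unsigned char len;
    unsigned char data[110];  // payload size of GPS_INJECT_DATA
};

std::vector<GpsInjectChunk> chunk_nmea(const std::string &sentence) {
    std::vector<GpsInjectChunk> out;
    size_t pos = 0;
    while (pos < sentence.size()) {
        GpsInjectChunk c{};
        size_t n = std::min<size_t>(sizeof(c.data), sentence.size() - pos);
        std::memcpy(c.data, sentence.data() + pos, n);
        c.len = static_cast<unsigned char>(n);
        out.push_back(c);
        pos += n;
    }
    return out;
}

int main() {
    // Example GGA sentence as it might arrive from a topside positioning system.
    std::string nmea = "$GPGGA,123519,4807.038,N,01131.000,E,1,08,0.9,545.4,M,46.9,M,,*47\r\n";
    for (const auto &c : chunk_nmea(nmea))
        std::printf("GPS_INJECT_DATA chunk: %u bytes\n", static_cast<unsigned>(c.len));
    return 0;
}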

There are a few other minor additions to note:


  • The AP_JSButton library was added to handle joystick button mapping to various vehicle functions; it is supported by QGC as well (a minimal sketch of the mapping idea follows this list).

  • PosControl and Fence: added a minimum z limit in order to limit maximum depth

  • Added a leak detector library

  • Added a temperature sensor library
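
Not part of the pull request itself, but to illustrate the button-mapping idea mentioned in the first bullet, here is a minimal sketch (the function set and mapping are hypothetical; the real AP_JSButton library is considerably richer):

// Toy joystick-button-to-function dispatch in the spirit of AP_JSButton.
#include <cstdint>
#include <cstdio>

enum class ButtonFunction : uint8_t {
    None,
    ArmToggle,
    ModeManual,
    ModeDepthHold,
    LightsBrighter,
    LightsDimmer,
};

// Hypothetical mapping: index = joystick button number reported by the GCS.
static const ButtonFunction button_map[16] = {
    ButtonFunction::ArmToggle,     ButtonFunction::ModeManual,
    ButtonFunction::ModeDepthHold, ButtonFunction::LightsBrighter,
    ButtonFunction::LightsDimmer,  // remaining buttons default to None
};

void handle_button_press(unsigned button) {
    if (button >= 16) return;  // ignore out-of-range buttons
    switch (button_map[button]) {
        case ButtonFunction::ArmToggle:      std::printf("toggle arm state\n"); break;
        case ButtonFunction::ModeManual:     std::printf("switch to Manual\n"); break;
        case ButtonFunction::ModeDepthHold:  std::printf("switch to Depth Hold\n"); break;
        case ButtonFunction::LightsBrighter: std::printf("lights up\n"); break;
        case ButtonFunction::LightsDimmer:   std::printf("lights down\n"); break;
        default: break;
    }
}

int main() {
    handle_button_press(0);  // e.g. the GCS reports button 0 pressed
    handle_button_press(2);
    return 0;
}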

Hardware

ArduSub is used in conjunction with a hard-wired telemetry connection over a tether. This connection is implemented via an RS422 interface directly to the autopilot, or via UDP with MAVProxy running on a companion computer. Pilot input is expected to come over MAVLink via MANUAL_CONTROL messages, and RC input is not supported because RC signals will not penetrate water. Support for ArduSub has been integrated into QGroundControl, and we continue to contribute to QGroundControl in order to improve support for ArduSub as well as other features common to all vehicles.
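
The MANUAL_CONTROL message carries four signed axes (x, y, z, r) plus a button bitmask; a sketch of turning that into normalized pilot demands might look like the following (the axis-to-DOF assignment here is an assumption for illustration, not the exact ArduSub mapping):

// Turn a MAVLink MANUAL_CONTROL payload into normalized pilot demands.
#include <cstdint>
#include <cstdio>

struct ManualControl {   // subset of the MAVLink MANUAL_CONTROL fields
    int16_t x, y, z, r;  // nominally -1000..1000 (z is often treated as 0..1000)
    uint16_t buttons;    // bitmask of pressed joystick buttons
};

struct PilotDemand { float forward, lateral, throttle, yaw; };

PilotDemand map_manual_control(const ManualControl &mc) {
    PilotDemand d;
    d.forward  = mc.x / 1000.0f;
    d.lateral  = mc.y / 1000.0f;
    d.throttle = (mc.z - 500) / 500.0f;  // assume 500 is the neutral stick position
    d.yaw      = mc.r / 1000.0f;
    return d;
}

int main() {
    ManualControl mc{600, -200, 500, 100, 0x0001};
    PilotDemand d = map_manual_control(mc);
    std::printf("fwd %.2f lat %.2f thr %.2f yaw %.2f, button 0 %s\n",
                d.forward, d.lateral, d.throttle, d.yaw,
                (mc.buttons & 1) ? "pressed" : "released");
    return 0;
}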

We have tested ArduSub primarily on the Pixhawk 1, but we have some users on other autopilots including the Navio2 and BBBmini.

Where We’re Headed

ArduSub is being actively developed with a full time developer and several contributors around the world. We plan to continue adding new features and improvements and it’s very important to us to stick with ArduPilot’s original goal of being open source and highly capable. We think that ArduSub is already more capable and extensible than most other ROV control systems.

Read more…
Admin

Pixhawk 2.1 set to fly off shelves

3689703427?profile=original

By Gary Mortimer

From SUAS News

Not just flying, but driving and floating as well. Pixhawk 2.1 running Ardupilot software can be used to command all types of autonomous vehicle.

Shipping has started from Australia and distributors around the world.

Designed by Philip Rowse of ProfiCNC, the Pixhawk 2.1 is a tale of two halves.

A cube contains an isolated and dampened IMU that is heated by a thermal resistor. This allows for consistent operations across a wide temperature range. It is particularly important in cold weather conditions.

The IMU solution is triple redundant, having:

3 Accelerometers

3 Gyroscopes

3 Magnetometers

2 Barometers

The cube is essentially the part that will keep your flying platform the right way up. It connects to a carrier board that provides the inputs and outputs to flying controls/motors and command and control (C2) links.

Cube and carrier board are sold as a combined kit for $238 or individually.

This allows end users to have the power of a triple redundant heated IMU in their own carrier boards for specialist applications.

The standard carrier board has a built in Intel Edison port to easily add a powerful companion computer. It has standard radio control in and out along with SBUS support.

You can connect two GPS units for that extra sense of in-flight security; better still, they can be RTK GPS.

Very unusual in this space, the Pixhawk 2.1 has two power inputs, allowing for redundancy if one power system goes down.

The Pixhawk 2 Suite comes as standard with:

  1. The Cube…. the brains behind the operation.
  2. A full carrier board.
  3. 1 Power Brick (two power bricks can be fitted for redundant power).
  4. Cable set that allows you to connect to your old Telemetry module, GPS, and sensors

I was lucky enough to receive an alpha unit, and quite honestly lost my mind over the new connectors. A vast improvement over Pixhawk 1.

Read more…
Admin

US Forest Service Clarifies Drone Use

No Drones Sign

From UAS Vision

In light of the increasing popularity of recreational drones, the U.S. Forest Service has released specific guidelines for recreational unmanned aircraft use over public lands:

– Avoid flying over federally-designated wilderness or primitive areas.

– As drones are considered both “motorized equipment” and “mechanical transport,” they cannot take off from, land in, or be operated from federally-designated wilderness areas.

– Avoid flying over noise-sensitive areas or populated sites, including rivers, campgrounds, trail heads and visitor centers.

– A drone may not be used to disturb or harass wildlife.

– No interference with official aerial activities over national forests, such as wildfire detection and suppression.

– Drone pilots must obey state privacy laws.

These guidelines only apply to hobby or recreation operations. Commercial operations include filming, still photography, survey, or any other endeavor for profit that involves use of a drone. These ventures may be allowable through a special use permit issued by the Forest Service.

Source: Ticker News

Read more…
Admin

Which came first, the drone or the PowerEgg?

The PowerEgg unfolds to reveal a fully-functional quadcopter (Credit: PowerVision)

From Gizmag

By:  DAVID SZONDY  FEBRUARY 12, 2016

Conventional drones are often billed as portable, though they're also often a collection of rods, rotors, and other bits and pieces that are perfect for catching on things and getting tangled. To make taking drones into the backcountry a bit less onerous, Beijing-based Powervision Robot has taken the gubbins of a quadcopter and built them into a giant PowerEgg that folds up into one smooth package shaped like a cackleberry for transport.


The product of 18 months of development, the PowerEgg is PowerVision's first mainstream commercial drone and draws on technology developed for the company's industrial drones. According to the developers, the egg design is not only to allow the quadcopter to act as its own carrying case, but also for compactness and stability.

When switched off, the PowerEgg folds up into a smooth ovoid shell, but when ready for flight the sides split and unfold into landing gear and arms for the folding rotors. Meanwhile, the bottom of the egg opens to reveal a 360-degree panoramic 4K HD camera on a three-axis gimbal. According to PowerVision, the rotors are larger than usual for comparable drones, which required a degree of re-engineering.

Full article and video here

Read more…
Admin

FAA RESCINDS DRONE BAN AROUND DC


From HACKADAY

by: Brian Benchoff

Late last year, the FAA expanded a Special Flight Rule Area (SFRA) that applied to Unmanned Aerial Systems, drones, and RC airplanes around Washington DC. This SFRA was created around the year 2000 – for obvious reasons – and applies to more than just quadcopters and airplanes made out of foam. Last December, the FAA expanded the SFRA from 15 nautical miles around a point located at Reagan National to 30 nautical miles. No remote-controlled aircraft could fly in this SFRA, effectively banning quadcopters and drones for six million people.

Today, the FAA has rescinded that ban, bringing the area covered under the Washington DC SFRA back to 15 nautical miles around a point inside Reagan National. This area includes The District of Columbia, Bethesda, College Park, Alexandria, and basically everything inside the beltway, plus a mile or two beyond. Things are now back to the way they were a few weeks ago.

The 30-mile SFRA included a number of model flying clubs that were shuttered because of the ban. DCRC is now back up. The Capital Area Soaring Association worked with the FAA and AMA to allow club members to fly.

Of course, limitations on remote-controlled aircraft still exist. For the most part, these are rather standard restrictions: aircraft must weigh less than 55 pounds, fly below 400 feet within line of sight, and must avoid other aircraft.

Full article here

Read more…
Admin

In field tests, a quadcopter was slightly better than humans at finding and following a previously-unseen trail

(Credit: University of Zurich)

From gizmag

By BEN COXWORTH   FEBRUARY 10, 2016

It's becoming increasingly likely that in the not-too-distant future, a robot may be what finds you if you're trapped in rubble at a disaster site. Now, it's also looking like a drone might come to your aid if you should get lost in the woods. That's because scientists have developed machine learning-based software that already allows quadcopters to follow forest paths better than humans.

The program was created by researchers at the University of Zurich, the Università della Svizzera italiana, and the University of Applied Sciences and Arts of Southern Switzerland.

In the course of its development, team members spent several hours hiking along trails in the Swiss Alps, taking over 20,000 photos with a helmet-mounted camera as they did so. The software analyzed these images, using a deep neural network to teach itself the distinct (and sometimes subtle) features that differentiate a trail from the surrounding environment.

The system was then used in a quadcopter, which was equipped with two video cameras for stereoscopic computer vision. When placed on a previously-unseen trail, the drone was able to autonomously orient itself and follow the trail with an 85 percent accuracy rate – humans, by contrast, scored 82 percent.
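
The article doesn't describe how the network's output becomes a steering command; one common formulation, assumed here purely for illustration, is a three-class left/straight/right classifier whose winning class drives the yaw command:

// Toy decision step once a trained classifier has produced per-direction
// probabilities for the current camera frame. The three-class formulation and
// the gain are assumptions, not details from the article.
#include <array>
#include <cstdio>

int main() {
    // Hypothetical softmax output from the trail classifier: {left, straight, right}.
    std::array<float, 3> prob = {0.15f, 0.70f, 0.15f};

    int best = 0;
    for (int i = 1; i < 3; ++i)
        if (prob[i] > prob[best]) best = i;

    // Map the winning class to a yaw-rate command, scaled by confidence.
    const float yaw_gain = 0.5f;  // rad/s at full confidence
    float yaw_rate = (best == 0 ? -1.0f : best == 2 ? 1.0f : 0.0f) * yaw_gain * prob[best];
    std::printf("class %d, yaw-rate command %.2f rad/s\n", best, yaw_rate);
    return 0;
}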

It is hoped that eventually, multiple drones could be combined with human search-and-rescue teams, allowing a greater area to be covered within the same amount of time. Additionally, the aircraft could be used to check hazardous trails, minimizing the risk to human searchers.

Before that can happen, however, the system needs to be developed further. In its current form, for instance, it's still not capable of identifying objects as being humans when it finds them.

A paper on the research was recently published in the journal IEEE Robotics and Automation Letters.

Source: University of Zurich

Full article here

Read more…
Admin


From Hackaday

by:  James Hobson

Arguably, drones are one of the next big things that will revolutionize many industries. We’ve already seen them portrayed in many movies and TV shows in not-so-distant futures, and what with Amazon planning drone deliveries, we can’t imagine it’ll be long before they are a common sight flying around cities.

While drone racing remains underground in many places, the Drone Racing League is hoping to change that — and turn it into a real sport. In a recent article by The New Stack, they compare drone racing to the beginning of skateboarding back in the 90's:

With a small group of people pushing the envelope and inventing every day.

Not for long though. DRL is making a huge push to turn this into a mainstream sport, and we gotta admit — we don’t mind. After all, this is like pod-racing on crack. Just take a look at the following promo video for their course the Gates of Hell: the Dream Takes Form.

That is some exciting stuff. We love the use of colored lights to indicate the map path — this could become an industry in itself, designing race tracks for drones! Imagine ones where the route changes mid-race, it’s all possible with a huge network of lighted gates… there’s so many possibilities…

[Thanks for the tip Destinyland!]

Full article here

Read more…
Admin

The sea-going robots are made using digital manufacturing techniques

(Credit: Biomachines Lab)

From Giz Magazine

 DAVID SZONDY   FEBRUARY 2, 2016

Robots may be the wave of the future, but it will be a pretty chaotic future if they don't learn to work together. This cooperative approach is known as swarm robotics and in a first in the field, a team of engineers has demonstrated a swarm of intelligent aquatic surface robots that can operate together in a real-world environment. Using "Darwinian" learning, the robots are designed to teach themselves how to cooperate in carrying out a task.

A major problem facing the navies of the world is that as ships become more sophisticated they also become much more expensive. They are packed with highly trained personnel that cannot be put at risk, except in the most extreme circumstances, and even the most advanced ship suffers from not being able to be in two places at once.

One solution to this dilemma is to augment the ships with swarms of robot boats that can act as auxiliary fleets at much lower cost and without risk of life. The tricky bit is figuring out how to get this swarm to carry out missions without turning into a robotic version of the Keystone Cops. The approach being pursued by a team from the Institute of Telecommunications at University Institute of Lisbon and the University of Lisbon in Portugal is to rely on self-learning robots.

Led by Dr. Anders Christensen, the team recently demonstrated how up to ten robots can operate together to complete various tasks. The small robots are made of CNC-machined polystyrene foam and 3D-printed components at a materials cost of about €300 (US$330). The electronics pack includes GPS, compass, Wi-Fi, and a Raspberry Pi 2 computer. However, the key is their decentralized programming.

"Swarm robotics is a paradigm shift: we rely on many small, simple, and inexpensive robots, instead of a single or a few large, complex, and expensive robots," says Christensen. "Controlling a large-scale swarm of robots cannot be done centrally. Each robot must decide for itself how to carry out the mission, and coordinate with its neighbors."

Instead of using a central computer or programming each robot individually, the swarm operates on what the team calls a Darwinian approach. In other words, each robot is equipped with a neural network that mimics the operations of a living brain. The robots are given a simple set of instructions about how to operate in relationship to one another as well as mission goals.

The robots are then allowed to interact with one another in a simulated environment and those that display successful mission behavior are allowed to proceed. The "fittest" robots from the simulations are then tested in the real world.
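
The "evolve in simulation, deploy the fittest" loop can be pictured with a toy example like this (the task and numbers are invented, not the team's actual mission objectives or code):

// Toy evolutionary loop: controllers are scored in a stand-in simulation, the
// fittest half survives, and mutated copies refill the population.
#include <algorithm>
#include <cmath>
#include <cstdio>
#include <random>
#include <vector>

struct Controller { float gain; float fitness; };

// Stand-in "mission": drive a 1-D state from 0 toward 1 in 50 steps.
float simulate(float gain) {
    float x = 0.0f;
    for (int t = 0; t < 50; ++t) x += gain * (1.0f - x) * 0.1f;
    return -std::fabs(1.0f - x);  // higher (closer to zero) is better
}

int main() {
    std::mt19937 rng(42);
    std::uniform_real_distribution<float> init(0.0f, 2.0f);
    std::normal_distribution<float> mutate(0.0f, 0.2f);

    std::vector<Controller> pop(20);
    for (auto &c : pop) c.gain = init(rng);

    for (int gen = 0; gen < 30; ++gen) {
        for (auto &c : pop) c.fitness = simulate(c.gain);
        std::sort(pop.begin(), pop.end(),
                  [](const Controller &a, const Controller &b) { return a.fitness > b.fitness; });
        // Keep the top half; refill the rest with mutated copies of survivors.
        for (size_t i = pop.size() / 2; i < pop.size(); ++i)
            pop[i].gain = pop[i - pop.size() / 2].gain + mutate(rng);
    }
    std::printf("best controller gain %.3f, fitness %.4f\n", pop[0].gain, pop[0].fitness);
    return 0;
}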

According to the team, the clever bit about the swarm is that, like schools of fish or flocks of birds, none of the robots know of or "care" about the other robots beyond their immediate neighbors. Instead, they react to what their immediate neighbors do as they determine the best way to fulfill their mission objectives such as area monitoring, navigation to waypoint, aggregation, and dispersion. In a sense, they learn to cooperate with one another.

The team is currently working on the next generation of aquatic robots with more advanced sensors and the ability to handle longer missions. Eventually, they could be used in swarms numbering hundreds or thousands of robots for environmental monitoring, search and rescue, and maritime surveillance.

The team's research is being peer reviewed and is available here.

The video below describes how the sea swarm works.

Full article here

Read more…
Admin

GoPro is in a massive tailspin


From CNN Money

  @DavidGoldmanCNN

Layoffs, new cameras and a strategy shift. Nothing seems to be working for GoPro, which has plummeted to yet another all-time low.

After posting a huge loss last quarter, the company fired its chief financial officer and slashed its product lineup to just three different cameras.

On a conference call with investors, GoPro CEO Nick Woodman acknowledged concerns that GoPro can't expand beyond its niche market of people who want action cameras. He said that he's not concerned about GoPro being a niche -- instead, he says the company is focused on helping make GoPros easier to use for existing customers.

"We recognize the need to develop software solutions that make it easier for our customers to offload, access and edit their GoPro content," he said.

He also said GoPro would sell just three cameras in the future: Hero 4 Black, Hero 4 Silver, and Hero 4 Session. It's killing off its cheapest camera, the $130 Hero. It will also stop making the Hero+ LCD and the Hero+.

But the company isn't totally giving up on expansion. GoPro will continue with its plans to launch the Hero 5 this year as well as its new Karma drone. It also is working on virtual reality products.

"In 2016 we are committed to delivering a breakthrough -- the breakthrough experience we've all be waiting for," Woodman promised. "It's against this commitment that you can judge our performance this year."

Full article and video here

Read more…
Admin

DIY Drones at 75,000 members!


It's customary and traditional that we celebrate the addition of every 1,000 new members here and share the traffic stats. This time it's a big 75,000!!!!

There were approximately 1.4 million page views in the last month! (we now get around 45,000 page views a day on average). It took us just 30 days to add these latest 1,000 members--we're averaging one new member every 43 minutes!

Thanks as always to all the community members who make this growth possible, and especially to the administrators and moderators who approve new members, blog posts and otherwise respond to questions and keep the website running smoothly.

Regards,

TCIII Admin


Read more…
Admin

FAROS navigates a smoke-filled and obstructed hallway (Credit: KAIST)

From Gizmag

 BEN COXWORTH   JANUARY 20, 2016

Fires in high-rises can be particularly deadly. This is partially because of the buildings' "chimney" effect, along with the fact that it's just plain difficult for firefighters to reach the flames. With that in mind, researchers at the Korea Advanced Institute of Science and Technology (KAIST) have created the flying, wall-climbing, fire-resistant FAROS quadcopter. It's designed to ascertain the source of a fire as soon as possible, along with the locations of people trapped within the building.

A new-and-improved version of a 2014 project, FAROS (Fireproof Aerial RObot System) is intended to scout the interiors of buildings before firefighters enter, allowing them to save time and minimize risk by knowing in advance where to concentrate their efforts. It can autonomously fly down hallways, guided by a 2D laser scanner, an altimeter, and an inertial measurement unit containing accelerometers and gyroscopes. These sensors also allow it to estimate its location within the building.

In situations where its flight path is partly blocked, it can turn sideways and press itself feet-first up against the wall, then use its propellers to push itself along until it clears the obstacle – it's not unlike the wall-climbing system used by Disney Research's VertiGo wheeled robot.

Using a thermal-imaging camera along with a dedicated image-processing system, FAROS can detect people, plus – if things aren't too far along – it can ascertain the fire's ignition point. This data is transmitted back to firefighters.

Additionally, FAROS features an aramid outer skin, protecting its electric and mechanical components from flames. There's also a thermoelectrically-cooled air gap between that skin and the drone itself, helping to isolate it from intense heat. In lab tests, the copter withstood butane gas and ethanol aerosol flames reaching over 1,000 °C (1,832 °F) for more than one minute.

It can be seen in action in the video below.

Full article here

Read more…
Admin

Amazon's new Prime Air Delivery drone (Credit: Amazon)

From Gizmag

 ERIC MACK   JANUARY 19, 2016

You've heard of the Internet of Things – the generic name given to all the various networked sensors, machines, devices and even buildings in the world – but most of those "things" stay in one place for the most part. The world is primed for an explosion of autonomous ambulatory devices, which led a team of engineers from the University of Waterloo in Canada to draft a conceptual framework for an "Internet of Drones."

The authors of a paper on the concept (linked at the bottom of the page) lay out what is essentially a structure for how drone traffic could be managed. It combines elements of the current air traffic control system, cellular networks and the internet.

The paper proposes terminology for key components of the system, with airspace divided up into "zones," each managed by a "zone service provider" (ZSP) that operates its own section of airspace.

The zone service provider, which could be software-based rather than an actual human operator, is sort of like a combination of a cell tower and an air traffic controller for a specific airport. Drones and zone service providers communicate via the cloud to ensure that autonomous traffic flows through that zone safely, and according to whatever rules have been established for that zone. When a drone passes into a new zone, it is handed off in much the same way that a wireless device is transferred to a new cell tower as it travels.
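
To make the handoff idea concrete, here is a toy sketch (zone names, geometry, and the registration step are all invented for illustration, not taken from the paper):

// Toy handoff model: zones own circular regions of airspace and a drone is
// re-registered with a new zone service provider (ZSP) when it crosses a boundary.
#include <cmath>
#include <cstdio>
#include <string>
#include <vector>

struct Zone { std::string zsp; double cx, cy, radius; };

const Zone *find_zone(const std::vector<Zone> &zones, double x, double y) {
    for (const auto &z : zones)
        if (std::hypot(x - z.cx, y - z.cy) <= z.radius) return &z;
    return nullptr;  // uncontrolled airspace
}

int main() {
    std::vector<Zone> zones = {
        {"zsp-downtown", 0.0, 0.0, 5.0},
        {"zsp-suburb",   9.0, 0.0, 5.0},
    };
    const Zone *current = nullptr;
    for (double x = -4.0; x <= 12.0; x += 2.0) {         // drone flying east
        const Zone *z = find_zone(zones, x, 0.0);
        if (z != current) {
            std::printf("x=%.0f: handoff to %s\n", x, z ? z->zsp.c_str() : "none");
            current = z;                                  // register with the new ZSP
        }
    }
    return 0;
}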

The infrastructure can also allow for third parties outside the zone (such as administrators, retailers on either end of a delivery or possibly even consumers) to communicate with drones in flight. This would be particularly useful for proposed drone delivery services like those that Amazon, Google and others are working on.

The paper suggests that the existing cell network base stations could be used to actually deploy the system.

"Since these base stations are already deployed, the physical space is available and they are capable of running the ZSP software," it reads. "Therefore, they seem well positioned to implement ZSPs and provide wide network coverage for [the Internet of Drones]."

Within each zone, defined airways, intersections and nodes are established, which can be thought of as being similar to the system of roads, intersections and destinations that cars currently use on the ground. Even though drones could theoretically fly anywhere in the three-dimensional airspace, the idea is to establish designated airways and regulate traffic through them to avoid collisions. Drones would also be responsible for avoiding collisions with objects outside the system (such as birds) on their own, and keeping the ZSP advised of those maneuvers.

Plenty of others are also working on how the coming drone-filled world will fly. NASA has been working with Exelis on another drone-tracking system, but it's not immediately clear if it could integrate with the newly-proposed architecture.

Full article here

Read more…
Admin

The technique was demonstrated within a "simulated forest"

From Gizmag

 NICK LAVARS    JANUARY 19, 2016

Drone technology sure is promising plenty, but before the public can really warm to the idea of unmanned vehicles zipping around in all directions they want to feel pretty confident that they won't crash into things. Among the many computer scientists working on this problem is a team of researchers from MIT, who have developed route-planning software for drones that allows them to make intricate turns to autonomously navigate tight spaces.

Scientists at MIT's Computer Science and Artificial Intelligence Laboratory (CSAIL) made some headway in this area last year, when they showed off a fixed-wing drone capable of zooming through trees at 30 mph (48 km/h). But while this and other crash-avoidance systems aim to guide an aircraft through a busy environment by steering it away from obstacles, the new solution instead guides them toward more favorable airspace, resulting in a drone better equipped to handle denser environments, albeit at slower speeds.

"Rather than plan paths based on the number of obstacles in the environment, it's much more manageable to look at the inverse: the segments of space that are 'free' for the drone to travel through," says Benoit Landry, lead author on a the new research paper. "Using free-space segments is a more 'glass-half-full' approach that works far better for drones in small, cluttered spaces."

To achieve this, Landry's team fitted a quadcopter with motion-capture optical sensors and an onboard inertial measurement unit to monitor the exact position of obstacles. They then devised an algorithm that detects free spaces within the environment, links them together and assigns a chain of flight maneuvers to culminate in a complete flight plan.
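
A stripped-down version of that idea, planning over free-space regions instead of obstacles, can be sketched as a graph search (this is a toy illustration, not the CSAIL planner, which solves a much richer optimization):

// Free-space regions are axis-aligned boxes; two regions are linked if they
// overlap, and a breadth-first search yields the chain of regions a flight
// plan would thread through.
#include <cstdio>
#include <queue>
#include <vector>

struct Box { float xmin, xmax, ymin, ymax; };

bool overlaps(const Box &a, const Box &b) {
    return a.xmin <= b.xmax && b.xmin <= a.xmax &&
           a.ymin <= b.ymax && b.ymin <= a.ymax;
}

int main() {
    // Hypothetical free-space regions detected in a cluttered room.
    std::vector<Box> regions = {
        {0.0f, 2.0f, 0.0f, 2.0f}, {1.5f, 4.0f, 1.0f, 3.0f},
        {3.5f, 6.0f, 2.0f, 4.0f}, {5.0f, 8.0f, 0.0f, 2.0f}};
    int start = 0, goal = 3;

    std::vector<int> parent(regions.size(), -1);
    std::queue<int> q;
    q.push(start);
    parent[start] = start;
    while (!q.empty()) {
        int cur = q.front(); q.pop();
        for (int i = 0; i < (int)regions.size(); ++i)
            if (parent[i] < 0 && overlaps(regions[cur], regions[i])) {
                parent[i] = cur;
                q.push(i);
            }
    }
    if (parent[goal] < 0) { std::printf("no free-space chain found\n"); return 1; }

    std::printf("region chain (goal back to start):");
    for (int r = goal; ; r = parent[r]) {
        std::printf(" %d", r);
        if (r == start) break;
    }
    std::printf("\n");
    return 0;
}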

The technique was then demonstrated within a simulated forest, with the custom quadcopter darting in and around an obstacle course constructed from PVC pipe and strings. Measuring 3.5 in (8.9 cm) from rotor to rotor, the drone was capable of zipping through 10-square-foot gaps at speeds of up to one meter per second.

The technology offers an exciting glimpse of how drones may one day autonomously navigate everything from collapsed buildings to thick forests, but in its present form it is unable to plan its path in real time, requiring an average of ten minutes to chart its route prior to takeoff. But Landry says this preparation time can be reduced with a few modifications.

"For example, you could define 'free-space regions' more broadly as links between areas where two or more free-space regions overlap," he says. "That would let you solve for a general motion-plan through those links, and then fill in the details with specific paths inside of the chosen regions. Currently we solve both problems at the same time to lower energy consumption, but if we wanted to run plans faster that would be a good option."

Meanwhile, a separate project also carried out at CSAIL demonstrated a new approach to crash avoidance using a fixed-wing plane. While the obstacle course was a little less challenging, with only a single set of obstacles to fly through before hitting a safety net on the other side, the aircraft was able to chart its path in real-time, without any prior knowledge of the obstacles.

The researchers approached this by loading the plane with a set of 40 to 50 trajectories that it could fly along, which they describe as funnels. When the aircraft is fired from the launcher, it sifts through this preloaded catalogue in around 0.02 seconds to stitch together funnels that are free of obstacles and determine a safe route through.
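
The sifting step can be pictured like this (funnels reduced to sequences of bounding discs, with invented geometry; the real system reasons about full dynamic funnels):

// Toy funnel-library check: each precomputed trajectory is approximated as a
// sequence of discs bounding where the plane could be, and the first funnel
// whose discs clear all known obstacles is flown.
#include <cmath>
#include <cstdio>
#include <vector>

struct Disc { float x, y, r; };
using Funnel = std::vector<Disc>;
struct Obstacle { float x, y; };

bool clear(const Funnel &f, const std::vector<Obstacle> &obs) {
    for (const auto &d : f)
        for (const auto &o : obs)
            if (std::hypot(d.x - o.x, d.y - o.y) < d.r) return false;
    return true;
}

int main() {
    std::vector<Funnel> library = {
        {{1.0f,  0.0f, 0.5f}, {2.0f,  0.0f, 0.7f}, {3.0f,  0.0f, 0.9f}},  // straight ahead
        {{1.0f,  0.4f, 0.5f}, {2.0f,  0.9f, 0.7f}, {3.0f,  1.5f, 0.9f}},  // bank left
        {{1.0f, -0.4f, 0.5f}, {2.0f, -0.9f, 0.7f}, {3.0f, -1.5f, 0.9f}},  // bank right
    };
    std::vector<Obstacle> obstacles = {{2.0f, 0.1f}};  // pole just left of center

    for (size_t i = 0; i < library.size(); ++i)
        if (clear(library[i], obstacles)) {
            std::printf("fly funnel %zu\n", i);
            return 0;
        }
    std::printf("no collision-free funnel; abort\n");
    return 0;
}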

Landry's research paper can be found here, while the second research paper detailing the "funnel libraries" can be found here.

Both drones can be seen in action in the videos below.

Full article here

Read more…
Admin

Landing on a roof of a moving car offers several advantages, say German Aerospace Centre engineers

From Engineering and Technology Magazine

January 18, 2016

German engineers have managed to autonomously land a drone on a car travelling at 75km/h.

The car was fitted with a large roof rack carrying an elastic net, allowing the 20 kg UAV with a three-metre wingspan to gently lower itself onto the roof-mounted platform without causing any damage to itself or the car.

The accomplishment was the first of its kind and required engineers from the German Aerospace Centre to place a number of optical markers on the landing platform to allow the drone’s tracking system to synchronise the plane's speed automatically with that of the car. The whole landing was controlled by a computer running a sophisticated set of algorithms.

The drone navigated into the landing position with an accuracy of 50cm. Once the aircraft and the car adjusted their speeds, the plane softly lowered itself into the net.
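
The speed-matching part of such a landing can be sketched as a simple proportional controller on the tracked gap (gains and numbers here are illustrative, not the German Aerospace Centre's values):

// Toy speed-matching controller: command an airspeed that closes the gap to
// the roof platform, then hold station above the car before descending.
#include <cstdio>

int main() {
    const float car_speed = 75.0f / 3.6f;  // 75 km/h in m/s
    const float kp = 0.4f;                 // proportional gain on along-track error
    float gap = 12.0f;                     // drone starts 12 m behind the platform
    const float dt = 0.1f;                 // control step, seconds

    for (int i = 0; i < 300; ++i) {
        float airspeed_cmd = car_speed + kp * gap;   // fly faster than the car to close the gap
        gap -= (airspeed_cmd - car_speed) * dt;      // simple along-track kinematics
        if (i % 50 == 0)
            std::printf("t=%4.1fs  gap=%6.2f m  cmd=%5.2f m/s\n", i * dt, gap, airspeed_cmd);
    }
    return 0;
}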

While the experiment, carried out at a Bavarian airfield, required a human driver to control the car based on instructions provided by the computer, in future, such an interaction could be carried out between a driverless car and a fully autonomous aircraft.

The researchers originally developed the technique to allow aircraft to land without landing gear. They were particularly interested in solar-powered planes used to survey the stratosphere, for which every extra pound carried is a huge challenge.

Without the landing gear, such research planes could carry more scientific instruments and better communication equipment, as well as stay airborne longer.

However, other types of unmanned planes could benefit from the technology as well, as it is easier to land this way in crosswind conditions. A drone equipped with this technology would be less dependent on weather for landing.

Full article here

Read more…
Admin

California Prepares New Drone Laws


From UAS VISION

Two California lawmakers have introduced two separate bills this week that would further regulate drones in America’s most populous state. If passed, one of the new state laws would require “tiny physical or electronic license plates” and inexpensive insurance, among other requirements. A second bill would compel drone pilots who are involved in incidents that damage property or injure people to leave their contact information—similar to what drivers must do following auto accidents.

The proposed laws are in response to a series of unfortunate mishaps involving drones across the Golden State in 2015: several unmanned aerial vehicles got in the way of firefighting efforts, one crashed into power lines in Hollywood, and yet another hit a baby in Pasadena.

The first bill, which was authored by Assemblyman Mike Gatto (D-Glendale), would require drone pilots to hold “inexpensive ($1, or so) insurance policies sold at the point-of-sale”—a press release compared it to automobile insurance.

Gatto’s bill, which has yet to be formally introduced with actual legislative text in the state assembly, would also require that all GPS-enabled drones “of a certain size” have an “automatic shut-off technology that would activate if approaching an airport.”

“I think 2015 showed us that in the era of democratized aviation, certain types of incidents will be fairly common,” he told Ars. “More and more people are buying these and that’s great. This is just like the 1920s when more and more people were buying cars, but I just think that we need some basic rules going forward.”

He expects the bill to be introduced next week.

The second bill, written by Assembly member Ed Chau (D-Monterey Park), aims to counter “hit and run” drone accidents by ordering drone pilots to leave their identifying information in a conspicuous place at the scene of the accident. “Unfortunately, as the number of drones in the air will only increase in the coming years, we are going to see more and more accidents,” Chau said in a statement. “And even with world-class safety features and training, accidents are still going to happen, just like on our roadways. If a drone breaks down, runs out of power or crashes into something, the operator needs to do the responsible thing and come forward and identify himself to the victim and to the police. This bill will make that responsibility the law.”

Full article here

Read more…
Admin

With a body crafted from carbon fiber, Bird-X says the drone has been designed to mimic the appearance of a real-life predator bird

From Gizmag

By  NICK LAVARS  JANUARY 17, 2016

Scarecrows and other visually intimidating props placed on the ground can be useful in deterring birds within a certain radius, but one company is looking further afield by battling the pests in the sky. Designed to resemble a real-life bird of prey, the ProHawk UAV can be programmed to autonomously fly over a property and emit menacing predatory cries as it goes.

Pest control company Bird-X has been in the bird control game for more than 50 years, and the newly released ProHawk UAV is not the first time it has turned to the skies for its novel approach. In 2011 it launched BirdXPeller, a remote controlled aircraft that blasts bird-repelling sounds during flight to scare pests away from golf courses, crops and vineyards. But the latest addition to its fleet is the first time it has enlisted an autonomous quadcopter for the cause.

Fitted with GPS, the ProHawk UAV can be programmed to follow a specific flight path along waypoints and features a built-in sonic sound unit. This device stores a range of predator calls, prey bird distress cries and Canada goose cries, which are emitted en route to clear the area of unwanted pests.

With a body crafted from carbon fiber, Bird-X says the drone has been designed in part to mimic the appearance of a real-life predator bird, further adding to its ability to leave terrified smaller birds running for cover.

The drone can be controlled with a handheld remote control if the user so wishes, but its potential to cut man hours by way of automated flight plans seems to be where its value lies. While the drone is capable of launching, patrolling and landing on its own, users will still need to plug it in to charge it up. It does seem like the perfect candidate for a landing pad that doubles as a wireless charging station like the one launched by SkySense in 2014 – such an accessory could make the entire process autonomous.

The ProHawk UAV can be ordered now as an ongoing maintenance contract from Bird-X, with contact details available via the source link.

You can see the drone in action in the video below

Source: Bird-X

Full article here

Read more…
Admin

From IEEE SPECTRUM

By Shahin Farshchi

This is a guest post. The views expressed here are solely those of the author and do not represent positions of IEEE Spectrum or the IEEE.

Science fiction authors love the robot sidekick. R2-D2, Commander Data, and KITT—just to name a few—defined “Star Wars,” “Star Trek,” and “Knight Rider,” respectively, just as much as their human actors. While science has brought us many of the inventions dreamed of in sci-fi shows, one major human activity has remained low tech and a huge source of frustration: household chores. Why can’t we have more robots helping us with our domestic tasks? That’s a question that many roboticists and investors (myself included) have long been asking ourselves. Recently, we’ve seen some promising developments in the home robotics space, including Jibo’s successful financing and SoftBank’s introduction of Pepper. Still, a capable, affordable robotic helper—like Rosie, the robot maid from “The Jetsons”—remains a big technical and commercial challenge. Should robot makers focus on designs that are extensions of our smartphones (as Jibo seems to be doing), or do we need a clean-sheet approach towards building these elusive bots?

Take a look at the machines in your home. If you remove the bells and whistles, home automation hasn’t dramatically changed since the post–World War II era. Appliances, such as washing machines, dishwashers, and air conditioners, seemed magical after WWII. Comprised primarily of pumps, motors, and plumbing, they were simply extensions of innovations that came to bear during the industrial revolution. It probably doesn’t come as a surprise that industrial behemoths such as GE, Westinghouse, and AEG (now Electrolux) shepherded miniature versions of the machines used in factories into suburban homes. At the time, putting dirty clothes and dishes into a box from which they emerged clean was rather remarkable. To this day, the fundamental experience remains the same, with improvements revolving around reliability and efficiency. Features enabled by Internet-of-Things technologies are marginal at best, i.e., being able to log into your refrigerator or thermostat through your phone.

But before wondering when we’ll have home robots, it might be fair to ask: Do we even need them? Consider what you can already do just by tapping on your phone, thanks to a host of on-demand service startups. Instacart brings home the groceries; Handy and Super send professionals to fix or clean your home; Pager brings primary care, while HomeTeam does elderly care. (Disclosure: my company, Lux Capital, is an investor in Super, Pager, and HomeTeam.) So, again, why do we need robots to perform these services when humans seem to be doing them just fine? I don’t think anyone has a compelling answer to that question today, and home robots will probably evolve and transform themselves over and over until they find their way into our homes. Indeed, it took decades of automobiles until the Model T was born. The Apple IIs and PC clones of the early 1980s had difficulty justifying their lofty price tags to anyone who wasn’t wealthy, or a programmer. We need to expect the same from our first home bots.

So it might be helpful to examine what problems engineers need to crack before they can attempt to build something like Rosie the robot. Below I discuss five areas that I believe need significant advances if we want to move the whole home robot field forward.

The full article here goes on to discuss the following critical challenges: 1) We need human-machine interfaces, 2) Cheap sensors need to get cheaper, 3) Manipulators need to get a grip, 4) Robots need to handle arbitrary objects, and 5) Navigating unstructured environments needs to become routine.

Read more…
Admin

ROBOTIC SUITCASE FOLLOWS YOU AROUND

From Hackaday

by: Richard Baguley

I have something that follows me around all the time: my dog Jasper. His cargo-carrying capability is limited, though, and he requires occasional treats. Not so this robotic suitcase. All it needs, the designers claim, is an occasional charge and a Bluetooth device to follow.

Designed by NUA Robotics, this suitcase is equipped with powered wheels and a certain amount of smarts: enough to figure out the direction of a Bluetooth signal, such as your cell phone's, and follow it. This is also accompanied by proximity sensors so it doesn't bump into you or other people. When the built-in battery runs out, just pop out the handle and pull it yourself, and the regenerative motors will recharge the battery. There's no indication of price, battery life, or how much space is left to actually carry stuff yet, but the designers claim it could be out within the year. As someone who uses a walking stick, this sounds like a great idea. And if they can work out how to get it to walk the dog for me, that would be even better.

Now, who will be the first to build a clone of this in their basement? Bonus points if it’s a two-wheeled self-balancer.

Full article here

Read more…
Admin

From ROS.org

By Tully Foote on January 13, 2016

From Lentin Joseph

Here is a new book for mastering your skills in the Robot Operating System (ROS). The book is titled "Mastering ROS for Robotics Programming" and is one of the more advanced books on ROS currently available.
The book discusses advanced concepts in robotics and how to implement them using ROS. It starts with a deep overview of the ROS framework, which can give you a clear idea of how ROS really works. Over the course of the book, you will learn how to build models of complex robots, and how to simulate and interface them using MoveIt! and the ROS navigation stack.
After discussing robot manipulation and navigation, you will get to grips with interfacing I/O boards, sensors, and actuators to ROS.
Vision sensors are an essential ingredient of robots, and an entire chapter is dedicated to vision sensors and their interfacing in ROS.
You will also see the hardware interfacing and simulation of complex robots in ROS and ROS-Industrial.
Finally, you will learn the best practices to follow when programming in ROS.
The book has 12 chapters and 481 pages. The main contents are given below:
  1. Introduction to ROS and its package management
  2. Working with 3D robot modeling in ROS
  3. Simulating robots using ROS and Gazebo
  4. Using the ROS MoveIt! and navigation stack
  5. Working with Pluginlib, Nodelets, and Gazebo plugins
  6. Writing ROS controllers and visualization plugins
  7. Interfacing I/O boards, sensors, and actuators to ROS
  8. Programming vision sensors using ROS, OpenCV, and PCL
  9. Building and interfacing differential drive mobile robot hardware in ROS
  10. Exploring the advanced capabilities of ROS MoveIt!
  11. ROS for Industrial Robots
  12. Troubleshooting and best practices in ROS
The book is written by Lentin Joseph, CEO/founder of a robotics startup called Qbotics Labs in India. He is also the author of "Learning Robotics using Python", which is likewise about ROS.
The book uses ROS Indigo installed on the latest Ubuntu LTS, 14.04.3. The code is also compatible with ROS Jade.
The book is designed so that even beginners can take on all of the topics. If you are a robotics enthusiast or researcher who wants to learn more about building robot applications using ROS, this book is for you. To learn from this book, you should have basic knowledge of ROS, GNU/Linux, and C++ programming concepts. The book will also be useful for professionals who want to explore more features of ROS.
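
For readers brand new to ROS, a node is only a few lines of C++; this minimal publisher (not taken from the book) shows the flavor of what the early chapters build on:

// Minimal roscpp publisher node: publishes a string on the "chatter" topic once per second.
#include <ros/ros.h>
#include <std_msgs/String.h>

int main(int argc, char **argv) {
    ros::init(argc, argv, "talker");
    ros::NodeHandle nh;
    ros::Publisher pub = nh.advertise<std_msgs::String>("chatter", 10);
    ros::Rate rate(1);  // 1 Hz
    while (ros::ok()) {
        std_msgs::String msg;
        msg.data = "hello from a minimal ROS node";
        pub.publish(msg);
        ros::spinOnce();
        rate.sleep();
    }
    return 0;
}
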
The book is published by PACKT, and here are the links to buy the book.
You will get complete information about the book from the book's website.
Full article here
If you want to master the Robot Operating System, this book seems like a good resource to start with.
Read more…
Admin

A quadcopter drone, equipped with a prototype of the range extender (Credit: Intelligent Energy)

From Gizmag

By BEN COXWORTH  JANUARY 11, 2016

Last May, we heard about how Horizon Energy Systems was developing a hydrogen-powered quadcopter that could (hopefully) stay airborne for hours at a time. Soon, however, that drone may not be the only one doing so. That's because Intelligent Energy has unveiled a fuel cell/battery range extender that could become standard equipment on third-party drones.

The idea behind the system is that an onboard hydrogen fuel cell generates electricity, and that electricity is in turn used to charge a linked battery. Not only should this greatly extend how long the drone can go between recharges, but those recharges should only take a couple of minutes – the fuel cell will simply need to be refilled, using a dedicated cartridge.

According to Intelligent Energy's head of PR, Debbie Hughes, flight times of drones using the system will vary depending on the size of the aircraft. "You could have more fuel on a larger drone meaning a longer flight time, but we believe in the region of two hours," she tells us.
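
The arithmetic behind an endurance estimate like that is just usable energy divided by average power draw; here is a sketch with purely hypothetical numbers (none of them supplied by Intelligent Energy):

// Back-of-the-envelope endurance estimate: usable energy / average power.
// Every number below is a hypothetical placeholder.
#include <cstdio>

int main() {
    const float cartridge_energy_wh = 800.0f;  // energy content of the hydrogen cartridge
    const float fuel_cell_efficiency = 0.5f;   // fraction converted to electrical energy
    const float hover_power_w = 200.0f;        // average power draw of a small quadcopter

    float usable_wh = cartridge_energy_wh * fuel_cell_efficiency;
    float endurance_h = usable_wh / hover_power_w;
    std::printf("estimated endurance: %.1f hours\n", endurance_h);
    return 0;
}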

In flight tests of the system, it delivered power consistently enough to allow an onboard camera to record video continuously, with no interruptions.

The range extender was unveiled last week at CES, as was Walkera's methanol-fueled range extender for drones. Intelligent Energy is now seeking commercial partners who would be interested in integrating the technology into their aircraft.

Full article here

Read more…