Shannon Morrisey's Posts (42)


CadSoft Eagle V6 XML Support

For those into board design, Eagle is a nice free (for non-commercial use) option which has, until now, lacked XML structure. Eagle will soon support importing and translating from other formats, among other productivity improvements. This compatibility issue has prevented me, and likely many others, from using Eagle as a primary PCB design tool. I believe the DIYD crew also uses Eagle for designing Arduboards.
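
One practical upside of an XML format is scriptability: board files become something you can inspect and transform with ordinary tools. As a rough illustration (assuming the element and attribute names of the Eagle 6 .brd layout sketched in the comments, which you should verify against a file actually exported by V6), component placements could be pulled out with Python's standard library:

```python
# Minimal sketch: reading component placements from an Eagle 6 XML board file.
# The structure assumed here (root <eagle>, then <drawing>/<board>/<elements>/
# <element .../>) should be checked against a real V6 export.
import xml.etree.ElementTree as ET

tree = ET.parse("myboard.brd")          # hypothetical board file
root = tree.getroot()                   # <eagle version="6.0">

for element in root.iter("element"):    # each placed part on the board
    print(element.get("name"),          # e.g. "R1"
          element.get("package"),
          element.get("x"),             # position, in mm
          element.get("y"))
```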

From CadSoftUSA via Make

Read more…

Industrial Robotics 2.0

ABB, the world's biggest industrial automation company, is promoting a very impressive humanoid concept platform called FRIDA.  I want it real bad.  From IEEE Spectrum Automation Blog

 

"Traditional industrial robots are big, expensive, and hard to integrate into existing manufacturing processes. They're also difficult to reprogram when production changes become necessary and they can't safely share spaces with human workers. This barrier to entry has kept small and medium companies "robot-less" -- at a time when robots, more than ever, could boost productivity and ameliorate labor shortages…

To make it even safer, its motors have limited drive power and soft pads cover its body. The robot has 7-axis arms, each with a servo gripper for small-part handling. Inside the torso is a control system based on ABB's IRC5 industrial controller.  So what can FRIDA do? One scenario ABB envisions is using it to bring more automation to the fast-paced, and mostly human-powered, assembly lines found in the electronics industry."

Read more…

Wireless mobile robot design integrates motor controller and Intel Atom motherboard

From Design News: Roboteq Inc., a developer of motor controllers for the mobile robotics industry, announced the publication of a WiFi robot design platform featuring the Roboteq AX3500 dc motor controller and an Intel Atom processor-based Mini-ITX motherboard. 

The robot is a battery-operated, 4 wheel-drive unit built on a 1.5 x 2 feet (46 x 61 cm) aluminum frame with WiFi connectivity and a video camera. The robot can feed live video and can be remotely operated via the Internet. The robot is a technology platform that users interested in robotics can easily replicate to add functionality and intelligence. 

Use of the Intel Atom motherboard in the design allows robotics software written for the PC to run on the robot. Microsoft, for example, has released free development tools that can be downloaded to develop this type of robotics application. The Microsoft Robotics Developer Studio 2008 R3 (Microsoft RDS) is a Windows-based environment for academic, hobbyist and commercial developers to easily create robotics applications across a wide variety of hardware. RDS 2008 R3 can be downloaded at no charge at www.microsoft.com/robotics. 

Detailed assembly instructions for the robot, plus mechanical CAD drawings, wiring diagrams and software can be downloaded free of charge from Roboteq's web site. No license or royalties are needed for their use, and a 3D animation illustrates the step-by-step construction of the chassis. 

WiFi Robot Design Platform: step-by-step instructions show how to use a Roboteq motor controller and an Intel Atom low-power motherboard to build a wireless-LAN remotely operated mobile robot. Source: Roboteq Inc.


The AX3500 motor controller uses two channel outputs to control the motors that power and steer the robot by varying the speed and direction of the motors at each side of the chassis. The controller also has outputs for up to eight RC servos, allowing the control of simple robotic arms and other accessories. The motor controller connects to the Intel Atom motherboard via its RS232 port.
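
For a rough sense of what driving such a controller involves, here is a sketch of standard differential-drive mixing feeding a serial write over the RS232 link. The mixing math is generic; the "!A"/"!B" command strings are an assumption loosely based on Roboteq's AX-series documentation, so consult the actual AX3500 manual before using anything like this:

```python
# Differential-drive mixing plus a hedged serial command write (pyserial).
import serial

def mix(throttle, steering):
    """Map throttle/steering in [-1, 1] to per-side motor commands."""
    left = max(-1.0, min(1.0, throttle + steering))
    right = max(-1.0, min(1.0, throttle - steering))
    return left, right

def to_hex(cmd):
    """Scale [-1, 1] to the 0x00-0x7F range the AX series expects (assumed)."""
    return "%02X" % int(abs(cmd) * 0x7F)

port = serial.Serial("/dev/ttyS0", 9600)    # the controller's RS232 link
left, right = mix(0.5, 0.1)                 # forward with a gentle right turn
# Upper-case channel letter = forward, lower-case = reverse (assumed syntax).
port.write(("!%s%s\r" % ("A" if left >= 0 else "a", to_hex(left))).encode())
port.write(("!%s%s\r" % ("B" if right >= 0 else "b", to_hex(right))).encode())
```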

The Intel D510M motherboard was selected because of its 100 percent passive cooling, low power consumption, balanced feature set, excellent performance and low cost. Measuring 17 x 17cm, the Mini-ITX form factor is ideally suited to mobile robotic designs. The motherboard runs Windows 7 booting from a SATA hard drive or solid state drive, but alternate operating systems such as Linux can also be used. The PC-compatible platform enables significant computational functionality and flexible software development options.

The motherboard consumes only 800mA from the robot's 24V batteries, ensuring several hours of continuous operation depending on motor usage. A power converter ensures proper operation whether the batteries are fully charged or partially depleted.

Another important element of the design is the power supply. An adapter plugs into the ATX power slot on the motherboard, so users can feed in 6-30V dc, which is regulated into a clean supply for the motherboard, disk drive and the RC outputs driving the servos. The motherboard, adapter and controller combination provides an integrated solution from an electronics point of view.

Because the design platform offers ample compute power, the ability to control eight motors, and integrated vision, it provides a portable and flexible system that can be adapted for a wide range of applications.

 

Read more…

DIY Transistors and OLEDs on a Makerbot

From MrKimRobotics: When you can print electric traces and semiconductors, a lot of things change. Obviously, this is a long way from printing out the kind of high-density computronium you get from TSMC, IBM and Intel, but you do get some nice benefits. For one, no fab plant to send parts to. While it's entirely possible to fab low-tech devices in, say, a pizza oven, organic semiconductors are going to be a better path for DIYers, due to the dramatically reduced overhead (if higher per-unit cost) and due to the lack of an anneal, which can be pretty hard to do with a pizza oven. Also, all-low-temperature manufacture means you can mix the fabbing process with plastics and other delicate materials.

Read more…

Visual Autolanding System for Rotorcraft

Yet another visual autopilot system looking beyond the limitations of accelerometers and gyroscopes.

From The Engineer: The technology of unmanned aerial vehicles (UAVs) has advanced so far that they're now mainstream, particularly in the military arena. Despite this, they are still developing, and one of the major gaps in the use of one of the most versatile varieties of UAV may soon be closed.

Helicopter UAVs (HUAVs) have all the advantages of manned helicopters, such as hovering flight and the vertical take-off and landing which allows them to be operated from virtually any area. However, landing an unmanned helicopter isn’t easy, and it’s the one part of their operation which cannot be carried out autonomously. A trained operator has to land the craft by remote control, and remote landing of a helicopter is particularly tricky. A cushion of air builds up underneath the craft as it descends, and this has to be spilled, by slowing down the rotors at a controlled rate, before it can settle on the ground. Any mistakes tend to lead to the helicopter tipping over, which is catastrophic, as the large amount of energy stored in the spinning blades will then be dissipated by smashing the whole craft (and probably any equipment on board) to pieces.

Engineering electronics developer Roke Manor has been developing a system to automate HUAV landing, and it’s based on a technology which is familiar to millions — but not from where they might expect. Who could think that the system which tells tennis umpires whether a ball is in or out, or shows TV viewers whether a delivery is LBW, might guide a drone helicopter in to land?

The Hawk-Eye system was developed by Roke in the 1990s, and works by analysing the data from fixed cameras to extrapolate information about how a ball is moving in three dimensions. The HUAV landing system works in exactly the same way, but in reverse — the camera is mounted on the moving helicopter, and its on-board data processing uses the camera's images to work out the motion of the helicopter in relation to its landing position, which is generally (but not always) fixed.

In fact, as Roke’s Future Technology Manager Peter Lockhart explained, the system was developed for use on any UAVs, as they all need to be remote-piloted for landing, unless they are to be recovered by parachute. ‘But we were testing this on our own site at Roke, and it just isn’t big enough to fly fixed-wing UAVs. As it happens, helicopters are the hardest to land anyway, so that suited both our purposes — we could control the experimental and testing phase without going offsite and tackle the most challenging aspect of the problem.’

Ed Sparks, consultant engineer at Roke and one of the original developers of Hawk-Eye, said that the relationship between the two systems is direct: ‘Hawk-Eye tells you where the ball is, we look at the landing pad and work out where we are in relation to it.’

The visual processing system works out the helicopter’s roll, pitch and yaw in relation to the ground. There are distinct advantages to using this system rather than accelerometers and gyroscopes, which give an absolute measurement of orientation, Sparks explained. ‘With accelerometers, gravity is a very large acceleration which is applied constantly while the craft is flying, so to prevent you from confusing gravity with your own motion, you need an extremely accurate accelerometer,’ he said. ‘Also, the accelerometer tells you your attitude relative to where you started, so if it’s running throughout an hour’s flight, that’s an hour’s worth of errors it could have accumulated.’

The visual system, on the other hand, is unaffected by these sorts of errors. ‘You turn it on when you’re starting to make your landing approach, and you see exactly what it sees on the ground,’ Sparks said. ‘The landing system measures your position relative to the specified landing spot, from exactly where you are to exactly where you want to be, so it’s minimising the errors from the word go.’
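
As a rough illustration of the underlying idea (recovering the craft's pose relative to a landing target of known geometry from a single image), here is a sketch built on OpenCV's generic solvePnP. This is not Roke's Hawk-Eye-derived pipeline; the pad size, the detected corner pixels and the camera intrinsics are all made-up placeholders:

```python
import numpy as np
import cv2

# Four pad corners in the pad's own frame (metres) -- a 1 m square is assumed.
pad_corners_3d = np.array([[0, 0, 0], [1, 0, 0], [1, 1, 0], [0, 1, 0]],
                          dtype=np.float32)
# Where those corners were detected in the image (pixels) -- placeholders.
corners_2d = np.array([[310, 240], [420, 235], [430, 345], [305, 350]],
                      dtype=np.float32)
# Pinhole camera intrinsics -- placeholder calibration values.
K = np.array([[700, 0, 320], [0, 700, 240], [0, 0, 1]], dtype=np.float32)

ok, rvec, tvec = cv2.solvePnP(pad_corners_3d, corners_2d, K, None)
R, _ = cv2.Rodrigues(rvec)      # rotation taking pad frame to camera frame
cam_pos = -R.T @ tvec           # camera (helicopter) position in the pad frame
print("position relative to pad (m):", cam_pos.ravel())
```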

One of the most important criteria for developing the system was that it had to be entirely self-contained on board the HUAV. ‘We don’t want any reliance at all from information passed up from the ground,’ Sparks said. This meant that all the image processing hardware had to be on board as well as the camera itself. The camera and the UAV itself were off-the-shelf products, and Roke brought in SME UAV manufacturer Blue Bear Systems, which developed a new variant of a lightweight computing platform with bespoke video units to house Roke’s image processing software. The team also worked with the aeronautics department of Bristol University, which has a long history of working with autonomous systems, to work on the control theory for the system, in particular the algorithms which take the visual measurements and turn those into guidance commands for the helicopter.

Another partner in the collaboration was MBDA, a large aerospace manufacturer, which brought its expertise on flight control algorithms to bear on solving the problem of what happens when the landing platform is moving as well as the UAV — if it has to land on a flat-bed truck, for example. ‘They do a lot of work on controlling two platforms to optimum use,’ Sparks said. Roke acted as the system integrator as well as providing the UAV itself and the image processing know-how.

The result is a system which allows the UAV to land in any conditions where the ground is visible. ‘Basically, we can operate in any conditions in which a piloted helicopter can operate,’ said Lockhart. ‘Landing at night isn’t a problem. Thick fog would be, but you wouldn’t be flying in those conditions anyway.’

The system requires no human intervention at all to land, and in many cases the UAV will already have a camera trained on the ground, as many UAV flights are for reconnaissance purposes. However, among the possible applications for this system is the unmanned, autonomous resupply of troops in difficult-to-reach locations, reducing the risk to helicopter pilots and other personnel.

The next phase of the research is aimed at making the system even more user-friendly, with features such as point-and-click navigation. ‘An unskilled operator could just click on an area he or she wanted to investigate, or wanted to designate as a landing area,’ Lockhart said.

The Roke team was particularly pleased with the speed of the project. ‘We’ve brought a lot together in a very short time,’ Sparks said. ‘We started in the spring of 2009, and we were landing the helicopter by the summer. In the autumn we did our first landing on a moving target. We’re now in a position to start selling the system, and we have a number of leading UAV vendors who have it in trials to decide whether they want to install it on their platforms. These UAVs are multi-use, so the first application depends on who buys it, but it’s likely to be defence or police.’

Read more…

Autopilot imitates honey bees for aircraft aerobatics

From The University of Queensland: Australian scientists have developed a novel autopilot that guides aircraft through complex aerobatic manoeuvres by watching the horizon like a honey bee.

Allowing aircraft to quickly sense which way is “up” by imitating how honeybees see, engineers and researchers at The Vision Centre, Queensland Brain Institute and the School of Information Technology and Electrical Engineering at The University of Queensland have made it possible for planes to guide themselves through extreme manoeuvres, including the loop, the barrel roll and the Immelmann turn, with speed, deftness and precision.

“Current aircraft use gyroscopes to work out their orientation, but they are not always reliable, as the errors accumulate over long distances,” said Vision Centre researcher Saul Thurrowgood.

“Our system, which takes 1000ths of a second to directly measure the position of the horizon, is much faster at calculating position, and more accurate.”

“With exact information about the aircraft's surroundings delivered in negligible time, the plane can focus on other tasks.”

The group first “trained” the system to recognise the sky and the ground by feeding hundreds of different landscape images to it and teaching it to compare the blue colour of the sky with the red-green colours of the ground.

Simple, low-resolution cameras similar to a bee's visual system are then attached to the aircraft, allowing the plane to take its own landscape pictures to identify the horizon while flying.

“Imagine a plane that has eyes attached to each side at the front – the wide-angle camera lenses provide a view of 360 degrees.”

Mr Thurrowgood says that the challenge was to figure out the optimal resolution of images that will allow the system to both locate the horizon quickly and not compromise the accuracy of its information.

“The measurement process can certainly be quickened – we only have to adjust the cameras to take images with a smaller resolution,” he says. “However, it won't produce the same quality of data, so the key is to find an optimal resolution where you have both speed and quality.”
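
A bare-bones sketch of the sky/ground idea (nothing like the trained classifier and wide-angle optics the UQ team actually uses) is to threshold on "more blue than red-green", find the sky/ground boundary in each column, fit a line, and read roll off the slope. The threshold and the boundary heuristic below are arbitrary assumptions:

```python
import numpy as np
import cv2

img = cv2.imread("frame.jpg")                  # hypothetical camera frame
b, g, r = cv2.split(img.astype(np.float32))
sky = (b - np.maximum(r, g)) > 20              # "bluer than it is red-green"

# For each image column, the lowest sky pixel gives one horizon sample.
cols = np.arange(img.shape[1])
rows = np.array([np.where(sky[:, c])[0].max() if sky[:, c].any() else -1
                 for c in cols])
valid = rows >= 0

# Least-squares line fit row = m*col + c; roll follows from the slope.
m, c = np.polyfit(cols[valid], rows[valid], 1)
print("approximate roll: %.1f deg" % np.degrees(np.arctan(m)))
```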

In tests at an airfield, the unmanned plane was directed to perform three aerobatic manoeuvres: the barrel roll, the Immelmann turn and a full loop.

“We had two pieces of evidence that it worked out – first, the plane didn't crash and second, the system's identification of the horizon matched with what we measured ourselves.”

Mr Thurrowgood says that the system can potentially be adapted for all types of aircraft – including military, sporting and commercial planes.

“We have created an autopilot that overcomes the errors generated from gyroscopes by imitating a biological system – the honeybees,” says Professor Mandyam Srinivasan.

“Although we don't fully understand how these insects work, we know that they are good at stabilising themselves while making complicated flight manoeuvres by watching the horizon.”

“This project required tremendous effort, as separating the sky from the ground visually is not always as easy as we imagine – it can be difficult to pick out the horizon, so my hat's off to Mr Thurrowgood for achieving this.”

The group will be presenting their paper, ‘UAV attitude control using the visual horizon’, today at the Eleventh Australasian Conference on Robotics and Automation. Videos of the test flights are also available from the group.
Read more…

Creepy/Cool Animal Inspired Robotic Designs


Festo is a German robotics company promoting a number of weird and wonderful robots inspired by nature. My personal favorite is this particularly unsettling elephant trunk - brings to mind sentinels from The Matrix et al.

From Singularity Hub: "Robots are ripping off nature! …Good idea. Festo, a multinational robotics firm based in Germany, has made some of the most amazing looking and fun biologically inspired robots out there. We’ve shown you their Air-Penguins and their Elephant-Arms but there are so many other Festo creations yet to be seen. Luckily, the robotics company seems to be going through their back catalog, and they just released videos of their research efforts from 2006 to 2008. These bots may be a few years old but they are absolutely cool to watch. Check out the Air-ray, the Bionic Air-fish, the Aqua-jelly and more in the videos below. Why do I get the feeling that Festo is building the robotic equivalent of Noah’s Ark?

Nature is one of the best engineers around, so it’s no surprise that some of the smartest robot experts are looking to biology to inspire their innovations. Festo is one of the world leaders in automation, with millions of parts installed in factories all over the globe. Their animal inspired robots are created by the efforts of their Bionic Learning Network. This collection of research groups from academia and industry is part advanced research initiative, part education organization. While many of their bionic bots have practical applications, they definitely seem willing to explore far off the beaten path even if there’s not much monetary incentive. That’s totally fine by me. I love the grace with which their robots fly through the air and water, and I can’t wait to see what they copy from nature next. Hopefully they’ll stay away from the predators of the world. Sharks are bad enough, but robot sharks? That would just be asking for a robot-apocalypse.

The first video is a little long, so here’s a guide:
1:30 Air-ray
2:20 Bionic Air-fish
3:05 Humanoid
4:56 Air-acuda
5:48 Aqua-ray





The next video recaps some things from the first. Go ahead and skip to 1:14 to check out the Aqua-jelly.



Here’s more of the Humanoid, so you can see how all of its pneumatic muscles move and work.
Read more…

ROS Celebrates Its Third Anniversary

From FastCompany: "This week marks the third anniversary of ROS (Robot Operating System), an open-source software platform for the robotics industry developed by Stanford and Silicon Valley robotics research lab Willow Garage. In that short time, ROS has skyrocketed in popularity. Robot hardware manufacturers, commercial research labs, and software companies are all adopting the platform. And ROS is just getting started.


"We set out at the beginning with a commitment to open source," says Steve Cousins, President and CEO of Willow Garage. "In order to get an industry going in personal robotics, it's going to take the ability for a lot of people to experiment. An open platform makes it easy for people to tinker and innovate."

ROS has succeeded beyond Willow Garage's wildest dreams. There are more than 50 public ROS repositories featuring open-source libraries and tools, more than 1,600 software packages, and at least 50 robots around the world using the platform, including underwater vehicles, boats, space rovers, lawnmowers, helicopters, cars, indoor robots, outdoor robots, and more (the Anybots QB robot recently covered by Fast Company doesn't use ROS, however).
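
Part of what makes the platform easy to tinker with is how little code a working node takes. For flavour, here is the classic "talker" publisher, following the standard rospy tutorial pattern (the topic name and rate are arbitrary):

```python
#!/usr/bin/env python
import rospy
from std_msgs.msg import String

rospy.init_node("talker")
pub = rospy.Publisher("chatter", String, queue_size=10)
rate = rospy.Rate(10)                          # publish at 10 Hz

while not rospy.is_shutdown():
    pub.publish(String(data="hello, robots"))  # any node can subscribe
    rate.sleep()
```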

The platform probably won't stop growing anytime soon. "When you're growing exponentially, the future is really hard to predict. One of the really powerful things about open source is that by giving up control, you allow the community to do much more than you could possibly do yourself," says Ken Conley, a senior software engineer at Willow Garage.

Willow Garage does have one big hope for ROS: that it will take on a life of its own, outside of the nurturing Willow Garage environment. The company is in the beginning stages of developing an independent ROS Foundation inspired by the Mozilla Foundation, Apache Software Foundation, and the GNOME Foundation. "We're talking about this with a number of government agencies and robotics companies," Cousins says. "It will be an independent organization funded by the community, chartered with moving ROS forward."


Read more…

LaserMotive and Ascending Technologies Set Laser-Powered Flight Record

From Xconomy: "A couple of startup companies set a world aviation record last night.

But they were pretty low-key about it. As I walked into the Future of Flight Aviation Center in Mukilteo, WA, a half hour north of Seattle, I saw little activity. It was after hours, and the hangar-like building was nearly deserted except for the futuristic planes suspended from the ceiling—Burt Rutan’s “Quickie” and a Beechcraft Starship—and part of a Boeing 787 Dreamliner fuselage on the display floor. It was a bit like “Star Wars” meets “Night at the Museum.”

Tom Nugent, the co-founder and president of Kent, WA-based LaserMotive, greeted me and said they were almost ready for showtime. A small team of engineers divided its attention between the back of a command truck and the adjacent trailer that held the laser optics equipment that would make the show possible. Two German guys who hadn’t slept in days (and were still on Munich time) were sprawled out on deck chairs in front of computer monitors like they were playing a video game. One held a remote controller that he used to guide a “quadrocopter”—a small, 1-kilogram, square-shaped flying contraption with blinking lights and four spinning rotors—made by their company, Ascending Technologies.

Jan Stumpf and Michael Achtelik, the co-CEOs of Ascending Technologies, partnered with LaserMotive to perform this feat last night. The goal: to use a laser to power an aircraft in continuous flight for about 12 hours (far longer than its battery would last without recharging, which is only about five minutes). That would be a world record, by a long shot, for the longest free flight of an electric vehicle.

Indeed, this demonstration is a big deal for the future of electric planes, said Barry Smith, the executive director of the Future of Flight facility. Imagine putting a laser on top of every cellular tower, he said, so that certain types of unmanned aerial vehicles (UAVs) would never need to land to recharge or refuel. That could potentially revolutionize communications, surveillance, and security and defense applications. Longer term, it could even impact the long-held dream of powering manned aircraft with electricity instead of jet fuel—though that is very far off.

For now, Nugent says, “The significance is we’re going to show this quadrocopter, and any aerial vehicle [of this size], will be able to fly effectively forever. It’s no longer limited by battery capacity.”

LaserMotive has done smaller flight tests before, but not on a free-flying vehicle like this. The company is best known for winning the $900,000 NASA Power Beaming Challenge last year, in one of the levels of the “Space Elevator Games.” That involved using a laser to power a climbing robot up a cable to a certain height (1 kilometer) at a certain speed (about 9 mph). But lately the company has been targeting UAVs as a big commercial application of its wireless power technology. (The next level of the NASA challenge, which was supposed to happen later this year, is still up in the air, so to speak.)

“Goggles on!” someone shouted, and we all complied. That meant the infrared laser, which puts out about 200 watts of light power, was switching on. The beam was directed using a series of mirrors and optics and shot out the top of the trailer. You couldn’t see it with the naked eye except for a reddish halo on the 50-foot ceiling. At the same time, the quadrocopter lifted off (under its own battery power), guided by Stumpf, and floated up to meet the beam, about 30 feet off the ground (see left).

“Not centered,” Nugent said. Then the computer vision system of LaserMotive’s setup kicked in. Software and cameras aligned with the path of the laser beam tracked the vehicle’s position, and positioned the beam so it hit the photovoltaic cells on the underside of the craft; those solar cells transformed the laser’s energy into electricity to continuously charge the quadrocopter’s battery.
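
Conceptually, that tracking loop is a camera boresighted with the laser measuring the craft's offset from the beam axis, with the mirror angles nudged to re-centre it. Everything in the sketch below (the gain, the brightest-blob detection, the mirror interface) is an invented placeholder rather than LaserMotive's actual system:

```python
import numpy as np
import cv2

KP = 0.002  # proportional gain: radians of mirror tilt per pixel of error

def craft_offset_px(frame, beam_centre_px):
    """Pixel offset of the craft from the beam axis. Placeholder detection:
    brightest blob after blurring; a real tracker would be far more robust."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    _, _, _, max_loc = cv2.minMaxLoc(cv2.GaussianBlur(gray, (21, 21), 0))
    return np.array(max_loc, dtype=float) - beam_centre_px

def control_step(frame, beam_centre_px, mirror_angles):
    """One tick: measure the error, apply a proportional correction."""
    error = craft_offset_px(frame, beam_centre_px)
    return mirror_angles - KP * error    # steer the beam toward the craft
```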

With that, all human corrections fell away, and it was just a drone hovering eerily in space, rotors humming quietly. It swayed a few feet from side to side, and the laser tracked it. It was about 7:40 pm.

This is the boring part, Nugent said. And boring is good. Exciting is bad. For the next 12 hours, if all went well, nothing more would happen. The craft would stay up all night (as would the crew), and sometime after 7:30 am, it would come in for a choreographed landing in front of 50-odd media and dignitaries. But anything could happen overnight—mirrors in the optical system could overheat and malfunction, or something in the craft or its solar cells could break, or software could crash. There’s no way to know except to do it.

In the meantime, Nugent filled me in on the business prospects of LaserMotive, which he co-founded in 2007. The company is out fundraising—talking with angel investors, angel groups, and venture capitalists—as well as trying to land more contracts with corporate and government partners. One new market has emerged: beaming power to cellular communication towers in places where running a new power line or otherwise upgrading power equipment is too expensive. As for UAVs, Nugent said, the plan is to show potential customers (presumably UAV companies and government labs) that the power-beaming approach works in flight—perhaps at distances up to a kilometer or two. The first applications might be in disaster relief or military scouting operations.

I also took the opportunity to ask Jordin Kare, the co-founder of LaserMotive and a laser expert who worked on the “Star Wars” missile defense system in the 1980s, about the broader significance of what he was watching. “This is the first combination of power and control and duration,” Kare said. “What it really marks is being able to take an off-the-shelf vehicle and power it with a laser so it can do a lot more…The prospect of being able to keep airplanes and communication systems up in the sky forever is an amazing thing.”

On the practical side, Kare said an important factor in all this is how efficient laser systems have become. Although the current demo only converts about 10 percent of the power needed to drive the laser into flying the quadrocopter, it could be more like 20 percent once the team optimizes the technology. And beyond that, Kare thinks there might be some new way, some approach he hasn’t thought of yet, to make the craft’s solar cells better at squeezing more electricity out of the beam.
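
A back-of-the-envelope budget shows how such numbers multiply together. The roughly 200 watts of optical output and the roughly 10 percent end-to-end figure come from the article; the wall-plug and photovoltaic efficiencies below are my assumptions, chosen only so the pieces are consistent:

```python
laser_optical_w = 200.0   # optical output quoted above
wallplug_eff = 0.40       # electrical -> optical (assumed)
pv_eff = 0.25             # optical on the cells -> electrical (assumed)

laser_input_w = laser_optical_w / wallplug_eff   # 500 W from the wall
craft_power_w = laser_optical_w * pv_eff         # 50 W reaching the craft
end_to_end = craft_power_w / laser_input_w       # 0.10, matching the article
print("laser input %.0f W, craft power %.0f W, end-to-end %.0f%%"
      % (laser_input_w, craft_power_w, end_to_end * 100))
```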

Until then, this world aviation record will have to do. This morning, in a quintessentially rainy Northwest setting, the quadrocopter came in for its landing a little after 8 am to a chorus of applause. Now maybe these guys can get some sleep—and get ready for the next big challenge in power beaming, whatever that might be."


Read more…

MIT Landing System Lets Planes Land Like Birds

A team of researchers from the Massachusetts Institute of Technology (MIT) has recently demonstrated an innovative landing control system that allows planes to land similarly to birds. Since this method allows quick changes of speed and fast landings, it might improve future aircraft...



From The Future of Things: "Airplanes’ landing procedure is common knowledge. First, the plane maneuvers slowly into an approach pattern; then, there is the long descent, and finally, the wheels touch the ground and the pilot applies the brakes until the plane comes to a full stop. In comparison, birds switch from barreling forward at full speed to lightly touching down on a target as narrow as a telephone wire. Now, a team of scientists tried to understand this difference in order to improve landing technique for airplanes.

According to this recent study, birds’ ability to land precisely depends on a complicated physical phenomenon called "stall." In fluid dynamics, a stall is a reduction in the lift coefficient generated by an airfoil as angle of attack increases. This occurs when the critical angle of attack of the airfoil is exceeded; typically, about 15 degrees, but it may vary.
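
A toy model of that behaviour: thin-airfoil theory gives a lift coefficient of roughly 2πα below the critical angle of attack, after which lift collapses. The post-stall drop used below is a crude placeholder, not real aerodynamics:

```python
import math

ALPHA_CRIT = math.radians(15)     # "typically, about 15 degrees"

def lift_coefficient(alpha_rad):
    if alpha_rad <= ALPHA_CRIT:
        return 2 * math.pi * alpha_rad       # attached flow: CL ~ 2*pi*alpha
    return 0.6 * 2 * math.pi * ALPHA_CRIT    # stalled: abrupt loss (assumed)

for deg in (5, 10, 14, 16, 20):
    print("%2d deg -> CL = %.2f" % (deg, lift_coefficient(math.radians(deg))))
```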

When a commercial airplane is changing altitude or banking, its wings are never more than a few degrees away from level. Within that narrow range of angles, the airflow over the plane's wings is smooth and regular, like the flow of water around a small, smooth stone in a creek bed. However, when a bird approaches its perch, it tilts its wings back at a much sharper angle; this makes the airflow over the wings turbulent and creates large vortices — whirlwinds — behind the wings. The effects of the vortices are hard to predict; for instance, if a plane tilts its wings back too far, it can stall and fall out of the sky (hence the name).

The newly designed control system is based on a mathematical model of stall, developed by MIT associate professor Russ Tedrake, a member of the Computer Science and Artificial Intelligence Laboratory, and Rick Cory, a PhD student in Tedrake's lab. Their challenge was to describe the stall phenomenon mathematically; although most engineers understand it, modeling it is computationally time-consuming.

The model enabled Professor Tedrake and Cory to guide a foam glider to its perch, but executing a fluid landing was not that simple. "It gets this nominal trajectory," Cory explains. "It says, 'If this is a perfect model, this is how it should fly.' But, because the model is not perfect, if you play out that same solution, it completely misses."

The imperfections of the initial model drove Cory and Tedrake to develop a set of error-correction controls that could nudge the glider back onto its trajectory when location sensors determined that it had deviated from it. By using innovative techniques developed at MIT's Laboratory for Information and Decision Systems, they were able to calculate precisely the degree of deviation that the controls could compensate for.

The control system ends up being a bunch of tubes pressed together like a fistful of straws. The addition of the error-correction controls makes the trajectory look like a tube snaking through space; the center of the tube is the trajectory calculated using Cory and Tedrake's model, and the radius of the tube describes the tolerance of the error-correction controls. Once the glider launches, it just keeps checking its position and executing the command that corresponds to the tube in which it finds itself. The resulting solution keeps the glider always within one of the possible courses; if it goes so far off course that it leaves one tube, it will still find itself in another.
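
In code terms, the "fistful of straws" might look like a library of tubes, each pairing a nominal trajectory with a feedback controller and a tolerance radius; at every step the glider finds a tube containing its state and executes that tube's command. The data structures below are invented for illustration and are not Cory and Tedrake's actual formulation:

```python
import numpy as np

class Tube:
    def __init__(self, nominal_traj, controller, radius):
        self.nominal_traj = nominal_traj  # (T, state_dim) array of waypoints
        self.controller = controller      # maps (state, waypoint) -> command
        self.radius = radius              # tolerance the controller can absorb

    def contains(self, state):
        dists = np.linalg.norm(self.nominal_traj - state, axis=1)
        return dists.min() <= self.radius

def command_for(state, tubes):
    for tube in tubes:            # leaving one tube should land you in another
        if tube.contains(state):
            i = np.argmin(np.linalg.norm(tube.nominal_traj - state, axis=1))
            return tube.controller(state, tube.nominal_traj[i])
    raise RuntimeError("state outside every tube: no recovery guarantee")
```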

A cruising plane tries to minimize its drag coefficient—the measure of air resistance against a body in flight. Usually, when an aircraft is trying to slow down, it tilts its wings back in order to increase drag. Ordinarily, the wings cannot be tilted back too far, for fear of stall. However, because Cory and Tedrake's control system takes advantage of stall, the glider has a drag coefficient four to five times that of other aerial vehicles.

There are several potential applications for the new system. For one, the U.S. Air Force has been interested in the possibility of unmanned aerial vehicles (UAVs) that could land in confined spaces; therefore, it has been funding and monitoring Tedrake and Cory’s research. "What Russ and Rick and their team is doing is unique; I don't think anyone else is addressing the flight control problem in nearly as much detail," says Gregory Reich of the Air Force Research Laboratory. More accolades come from Boeing, who gave Cory the 2010 Engineering Student of the Year Award for the system’s development.

Still, the current design interferes with the military's plans: in the experiments, Cory and Tedrake used data from wall-mounted cameras to gauge the glider's position, and the control algorithms ran on a computer on the ground that transmitted instructions to the glider. "The computational power that you may have on board a vehicle of this size is really, really limited," Reich says.

Despite the drawbacks mentioned, Tedrake is optimistic, saying that in a few years' time computer processors will be powerful enough to handle the control algorithms. In the meantime, his lab has already begun to address the problem of moving the glider's location sensors onboard. On a humorous note, Cory concludes: "I visited the Air Force, and I visited Disney, and they actually have a lot in common: the Air Force wants an airplane that can land on a power line, and Disney wants a flying Tinker Bell that can land on a lantern."

TFOT has also covered CyberQuad, a UAV capable of vertical take-off and landing, and Deep Throttling, a technology developed by NASA to make smoother landings.

For more information about the technology that allows planes to land like birds, see MIT’s press release."
Read more…

Latest UAV Replaces Flaps With Jets

A novel UAV has successfully demonstrated 'flapless flight' in the UK.

From Dailymail: "A British unmanned plane that uses jets of air to fly instead of conventional ‘flaps’ has made aviation history.

The experimental unmanned air vehicle (UAV), called DEMON, uses blown jets of air to control the plane’s movement in flight rather than conventional mechanical elevators and ailerons.

Experts say this will make the aircraft much easier to maintain, as there are far fewer moving parts, and will give it a stealthier profile.

DEMON made its historic flight at Walney Island in Cumbria on Friday 17th September and was developed by Cranfield University with BAE Systems and nine other UK universities.

DEMON’s trial flights were the first ‘flapless flights’ ever to be authorised by the UK Civil Aviation Authority."


Read more…

Unusual UAV Designs Granted US Patents

All patent graphics: USPTO

Some unusual new design schemes for UAVs were recently granted US patents.

"Designs for morphing and articulating UAVs are among some of the latestest patents approved by the US Patent and Trademark Office. One is a Boeing patent for a lifting-body UAV with telescoping wing.

[patent figure]

Concentric wing sections extend for a low-speed, high-lift configuration and retract flush with the airframe for high speed and low lift. Russian aircraft designer Ivan Makhonine flew the telescoping-wing MAK-10 in 1931, but Boeing's patent goes a step further and envisions extendable foreplanes, vertical tails - and variable-geometry telescoping wings...

[patent figure]

Another Boeing patent is for a solar-powered UAV capable of continuous operation at northern latitudes and during winter months, when sunlight is in short supply.

[patent figure]

The aircraft has a planar "solar sail", with solar cells on one side, mounted so it can rotate around the aircraft's roll axis to track the elevation of the Sun while the vehicle remains horizontal. The X-tail also has solar cells on one side and rotates to track the Sun while providing pitch and yaw control of the vehicle.
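
The geometry the sail must track is simply solar elevation. A quick sketch using the standard formula sin(el) = sin(lat)sin(dec) + cos(lat)cos(dec)cos(h) shows why high-latitude winter operation is the hard case; this is generic astronomy, not anything taken from the patent:

```python
import math

def solar_elevation_deg(lat_deg, decl_deg, hour_angle_deg):
    lat, dec, h = map(math.radians, (lat_deg, decl_deg, hour_angle_deg))
    return math.degrees(math.asin(math.sin(lat) * math.sin(dec) +
                                  math.cos(lat) * math.cos(dec) * math.cos(h)))

# Solar noon at 60 deg N on the winter solstice (declination ~ -23 deg): the
# sun barely clears the horizon, so the sail must tilt nearly edge-on.
print("%.1f deg" % solar_elevation_deg(60, -23, 0))   # ~7 deg
```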

[patent figure]

One final patent, awarded to Virginia-based Geoffrey Summer, is for a "skybase" system of forming a high-altitude UAV from multiple smaller aircraft. These "modular flyers" would be air-launched individually and would join up, wingtip to wingtip, to form a larger "articulated-wing" vehicle - the more that join, the higher it can fly.

[patent figure]

As illustrated above (right), individual flyers could fail and the skybase would reform and keep flying. Additionally, individual flyers could be detached from the formation and despatched to take a closer look at a target before returning to rejoin the skybase."

Read more…

Onboard Visual Navigation on a Quadrotor


Swiss engineers demonstrate an onboard visual navigation scheme on the Pixhawk quadrotor. Rapid disturbance response is achieved through low CPU usage and a 30 ms latency from shutter to control output.


Here, UPenn brings it to life with indoor navigation - another ROS-compatible project. This demo involves processing much larger data sets and relies on some remote computing to crunch the numbers: http://diydrones.com/profiles/blogs/autonomous-multifloor-indoor

Read more…