This was my project for today: connecting the ArduStation to an XBee and the laptop at the same time, displaying flight data from an ArduPilot with a simulated GPS, and also performing a three-waypoint mission and testing the antenna tracker feature. I had a great time today. Here is the video:
Read more…
3D Robotics

Controlling the AR.Drone with Urbi

From I Heart Robotics: "Urbi is a parallel and event-driven scripting language and is being used for a variety of robots, including the Nao. As an example of what Urbi can do, here is a neat demo of programming a ball tracker for the AR.Drone in 25 lines of code.

If you have an extra hour, this Google TechTalk presents some of the ideas behind Urbi, though there have been some improvements since the talk. For example, the Urbi code base was recently open-sourced under the GNU AGPL license.



Overall Urbi presents some interesting ideas for enabling control of the scheduler, and improving parallelism in a scripting language. The GUI tools look interesting and provide a graphical visualization of the state machines.

It also supports several library bindings enabling building of QT applications and integration with ROS."
Read more…
3D Robotics

InvenSense releases IMU software for Android

From an Invensense press release this morning:

"InvenSense, Inc., the leading solution provider of MotionProcessors for consumer electronics, today announced its MPL 3.2 motion processing library software support for Android 2.3 Gingerbread. InvenSense MPL 3.2 software and the companion MotionProcessor product families provide a complete solution that delivers nine-axis sensor fusion data encompassing 3-axis gyroscopes, accelerometers and magnetometers to new Application Programming Interface (API) structures in Android Gingerbread. The new APIs (quaternion, rotation matrix, linear acceleration and gravity) for the first time allow application developers to fully leverage the benefits of the gyroscope together with the accelerometer and magnetometer. The MPL 3.2 software eliminates the challenges of integrating multiple motion sensors into Android by connecting directly to the Gingerbread sensor hardware abstraction layer (HAL) and delivering 9-axis sensor fusion data to the new APIs without the need for complex, processor-intensive motion algorithm processing on the application processor. This is accomplished by pairing the MPL with a companion MotionProcessor device with its embedded DMP and integrated motion algorithms to offload 9-axis sensor fusion processing from the application processor. The InvenSense MPL software provides the fastest time-to-market MotionProcessing solution for Android platforms."

Here's a video explaining how it works:
Read more…
Elbit Systems Successfully Completes Skylark® I-LE UAS Test Flight with Horizon's New AEROPAK Fuel Cell Power System. Skylark® I-LE is one of only a handful of UAS to perform a fully operational test flight using the AEROPAK fuel cell, including an operational payload, take-off and recovery.

Singapore, December 7, 2010: Horizon Energy Systems announced the successful test flight of its AEROPAK fuel cell power system onboard Elbit Systems' Skylark® I-LE UAS (Unmanned Aircraft System). An official test flight was recently carried out in Israel. The test flight was without precedent, since it is the first ever test of a fully operational system using the Horizon AEROPAK fuel cell power system, including take-off and recovery with an operational payload integrated onboard. Equipped with the AEROPAK fuel cell system, Skylark® I-LE will offer its users enhanced flight duration, doubling the current endurance of the UAS. (...) Read more

http://www.hes.sg/files/AEROPAKELBIT.pdf
Read more…

I have noticed that many people aren't aware of the full ARC/FAA document released in April 2009. It is 74 pages long, as opposed to the handful of pages on the AMA website that many seem to be using. There is lots of info on what may be coming our way, especially since I doubt that recreational users will have fewer limitations imposed than commercial operators (i.e., a 400 ft ceiling and 1,500 ft lateral distance).

Get it HERE.

Read more…

AMA official response to NYC FPV video

New York City First Person View Video

A recent video posted to YouTube depicts a First Person View (FPV) video flight of an unmanned aircraft over New York Bay; Long Island; the Brooklyn and Manhattan Bridges; Liberty Island and the Statue of Liberty; over and in close proximity to buildings, occupied vehicles and water vessels; and directly over unprotected people. The nature of the flight was outside the realm of recreational aeromodeling activity as defined by the AMA Safety Code and posed a significant threat to people and property.

Although AMA recognizes the ingenuity and creativity of this activity, it does not condone the manner in which this flight was conducted and the threat it posed to the public.

AMA has provided specific guidelines for FPV activity for its members. These guidelines and related safety considerations can be found in the AMA Safety Code and in AMA document #550, “First Person View (FPV) Operations.”

– Rich Hanson, AMA Government Relations and Regulatory Affairs Representative

First Person View (FPV) Operations

1. An FPV-equipped model must be flown by two AMA members utilizing a buddy-box system. The pilot in command must be on the primary transmitter, maintain visual contact, and be prepared to assume control in the event of a problem.
2. The operational range of the model is limited to the pilot in command’s visual line of sight as defined in the Official AMA National Model Aircraft Safety Code (see Radio Control, item 9).
3. The flight path of model operations shall be limited to the designated flying site and approved overfly area.
4. The model weight and speed shall be limited to a maximum of 10 pounds and 60 miles per hour.


Read more…

From this interview in 2009, a totally unambiguous statement on model aircraft (recreational UAS):

“The FAA certainly doesn’t want to regulate model aircraft,” Talbert said. "Model aircraft are not to exceed 55 pounds or fly at an altitude of more than 400 feet. They must remain within the line of sight and not operate within three nautical miles of an airport or heliport", he said.

This seems to have been (and still is) the consensus at the FAA for years, regardless of the amateur arguments that AC 91-57 is to be ignored. Note that he gives a weight limit as well.

Read more…
Developer

Testing a DIYDrones magnetometer in HK's GCS

Testing a DIYDrones magnetometer in HK's GCS from Pete Hollands on Vimeo.


I test the DIY Drones magnetometer (HMC5843 magnetometer breakout board) using Happy Killmore's Ground Control Station, connected via the SERIAL_UDB_EXTRA telemetry format to a UAV DevBoard V3 running MatrixPilot (revision 729 of trunk).

Photos:
picasaweb.google.com/peter.hollands/Twinstar2Build#5547623052830239122

UAVDevBoard / MatrixPilot:
code.google.com/p/gentlenav/wiki/Home?tm=6

HK GCS:
code.google.com/p/happykillmore-gcs/

DIY Drones Store:
store.diydrones.com/HMC5843_Triple_Axis_Magnetometer_p/br-hmc5843-01.htm

Self-calibrating mathematics and I2C integration software, written by William Premerlani:
diydrones.com/profile/WilliamPremerlani

Source code:
code.google.com/p/gentlenav/source/browse/trunk/libUDB/magneto_udb.c



Read more…
3D Robotics

PhuBar: an open source heli gyro

Here's a nice little open source project that replaces the flybar in helis with a little gyro board. From the writeup:


"PhuBar is a stabilizer device for micro-sized R/C helicopters. It is based on the Parallax Propeller processor and the ITG-3200 gyro chip. It is not intended to be a UAV controller or inertial navigation unit. It simply provides stabilization similar to that of a physical weighted flybar, making small helicopters easier to fly, and making the physical flybar unnecessary."

Read more…
3D Robotics

Autonomous quadrotor navigation with a Kinect

We knew this day was coming, but it's great to see it so soon: UC Berkeley has a Kinect onboard a Quad doing optical obstacle avoidance.

From the video description:

"This work is part of the STARMAC Project in the Hybrid Systems Lab at UC Berkeley (EECS department): http://hybrid.eecs.berkeley.edu/

Researcher: Patrick Bouffard
PI: Prof. Claire Tomlin

Our lab’s Ascending Technologies [1] Pelican quadrotor, flying autonomously and avoiding obstacles.

The attached Microsoft Kinect [2] delivers a point cloud to the onboard computer via the ROS [3] kinect driver, which uses the OpenKinect/Freenect [4] project’s driver for hardware access. A sample consensus algorithm [5] fits a planar model to the points on the floor, and this planar model is fed into the controller as the sensed altitude. All processing is done on the on-board 1.6 GHz Intel Atom based computer, running Linux (Ubuntu 10.04).

A VICON [6] motion capture system is used to provide the other necessary degrees of freedom (lateral and yaw) and acts as a safety backup to the Kinect altitude: in case of a dropout in the altitude reading from the Kinect data, the VICON-based reading is used instead. In this video, however, the safety backup was not needed.

[1] http://www.asctec.de
[2] http://www.microsoft.com
[3] http://www.ros.org/wiki/kinect
[4] http://openkinect.org
[5] http://www.ros.org/wiki/pcl
[6] http://www.vicon.com"


(via Trossen Robotics)

Read more…
Developer

Improved Altitude Estimate

In most cases the barometric sensor alone provides noisy, drift-prone measurements. I have been trying to come up with a simple way to implement a fixed-gain filter that utilizes the acceleration measurements from the accelerometers to help clean up the noise and also add a sanity check via sensor fusion.


The cool part about the filter is that it also estimates vertical velocity, which is useful for altitude damping and control. The basic idea behind the filter is that the Z (vertical) component of the acceleration is derived using the body rotations and compared with the barometric measurement. The noise of the barometric sensor is rejected whenever the accelerometers disagree. The downside to this approach is that it depends on the accelerometers being less noisy than the barometric sensor, and on the vehicle not being vibration-intensive. The theory works, but the actual implementation is somewhat hardware dependent.

The real plus of this filter is that the velocity estimate can be used for altitude damping and control. Instead of differentiating the barometric measurement, which significantly increases the noise in the derivative term, you get a much cleaner vehicle state estimate with this approach.

The code is fairly simple but may require tuning based on units and hardware. I took the fixed-gain approach to keep it simple; the result looks like the following:

The thing to note here is that these are all relative to the NED reference frame (i.e., acc_z is not the body Z acceleration; it is the NED Z acceleration).

Enjoy-

-Beall

Read more…

ArduCopter Parts List (BOM) Example

ArduCopter individual components - $524.38
* 1x Frame - Buy here for $150.00 (This Frame includes Power distribution board)
* 1x Power distribution board
* 4x Motors 850kv - Buy here for $16.00 ea. ($22 option available)
* 4x ESCs 20Amp - Buy here for $16.00 ea.
* 4x Propellers 10x45 - Buy here for $9.00 for 2
* 1x ArduPilot Mega Flight controller board - Buy here for $59.95
* 1x ArduPilot Mega OilPan IMU board - Buy here for $159.95
* 2x 3 row pins for APM - Buy here for $1.99 ea
* 2x Conn Header Male 36. Straight - Buy here for $2.25 ea.

ArduCopter Quad v1.0 KIT (I believe this kit includes the above components along with the GPS option 1 below)
* ArduCopter Quad v1.0 KIT, Full Electronics - Buy here for $499
* Requires TX RX and Lipo Battery

Magnetometer - More info

Option 1 - $46.85
* 1x Magnetometer - Buy here for $44.90
* 1x 15cm cable - Buy here for $1.95

GPS - More info

Option 1 (MediaTek) - $39.90
* 1 x MediaTek 3329 GPS - Buy here for $37.95
* 1 x 10 cm Cable (for GPS) - Buy here for $1.95

Telemetry - More info

Option 1 (XBee 900 MHz) - $130.80
* 1x Xbee pro 900 w/wire antenna - Buy here for $42.95
* 1x XBee-PRO 900 RF module - Buy here for $44.95
* 1x 900MHz Duck Antenna RP-SMA - Buy here for $7.95
* 1x XBee Adapter kit - Buy here for $10.00
* 1x XBee Explorer USB - Buy Here for $24.95

Accessories

* FlySky FS-TH9X 2.4G 9CH Radio (TX / RX) - Buy here for $59.99
* Lipo Battery - Buy here for $23.18
* Lipo Battery charger

Read more…

Hortens and APM

Hi,

I want ArduPilot Mega to control a Horten airplane. My aim is to use APM in this 1:5 semi-scale model of a Schapel SA-882 (see http://www.twitt.org/schapel.htm for a description of the original).





The model has a span of about 2.08 m and is made of plywood with silk covering. It was originally a microturbine airplane (with Martin Lambert's 20 N Kolibri) and had a TOW of about 4.9 kg (including 1.2 litres of petroleum). I replaced the microturbine with an electric motor, a Scorpion S-4025-01 driving a 13 x 8" propeller on a 6S LiPo. Now it has a TOW of about 4 kg. I still have to do the maiden flight with that setup...

Before I go into using APM as an autopilot, I want to use it as a logger to analyse the design. I've been doing some work on the code to include the changes I need (I also added functionality for the AttoPilot current sensor) and wrote a program to extract the data from the logs. It's the first time I have ever worked with a microcontroller, but thanks to Arduino it's the same as programming in C++ :D

Yesterday I installed the APM into the smaller sister of the above airplane, namely a 1:8 model of the Schapel. This small model was designed to be a speed model, in contrast to the larger one, which was designed for more acrobatic maneuvers. Nevertheless, the small one reaches speeds of about 250 km/h on an 800 W setup (a friend's, not mine), while the large one also reached speeds well above 200 km/h with the turbine.



As always, one makes errors the first time. I installed the APM flipped, so the yaw information is mirrored. Nevertheless, yesterday was a rough, windy ride and I was happy to make 4 logs during powered and gliding flight. I was also somewhat "rusty"; due to the bad weather here in Germany, it had been about two months since I last flew. Visibility was also bad and it was freezing (-3 °C). So I was very happy not to have broken either the airplane or the APM. Here is one plot:



The figure displays a gliding situation. The phugoid oscillation is clearly visible (a long-period oscillation in altitude and velocity). In theory, its period should depend only on gravity and flight speed, which is more or less confirmed by the data. A strong oscillation in pitch is seen up to 880 s; I was probably still correcting the plane's attitude after turning. Afterwards we find two oscillations: one has a period of about 1.5 s (data between 885 and 890 s) and the other is faster, with a period of about 0.7 to 0.75 s (see data between 898 and 900 s). The faster one is probably the so-called short-period oscillation in angle of attack, and the slower one is the second harmonic of this oscillation. To resolve the faster one better, a rate higher than 10 Hz would be needed; I would have to turn on the 50 Hz logging.


BTW, the measurement shows a lift coefficient CL of about 0.6 and a glide ratio of about 6 (i.e., CD = 0.1), which is not fantastic, but for a small airplane it is also not too bad. I guess I was flying the plane near stall, as my simulations tell me that stall is at about CL = 0.6. Essentially the measurement confirms my simulations, as the glide ratio was calculated to be 9 at stall (not including the cabin). Next time I'll trim the airplane to 60-65 km/h, where it should theoretically have a glide ratio of 16 (realistically probably 12 or 13).


I will keep you informed of further progress ;)


Regards,


Andrés





Read more…

Visual Autolanding System for Rotorcraft

Yet another visual autopilot system looking beyond the limitations of accelerometers and gyroscopes.

From The Engineer: The technology of unmanned aerial vehicles (UAVs) has advanced so far that they're now mainstream, particularly in the military arena. Despite this, they are still developing, and one of the major gaps in the use of one of the most versatile varieties of UAV may soon be closed.

Helicopter UAVs (HUAVs) have all the advantages of manned helicopters, such as hovering flight and the vertical take-off and landing which allows them to be operated from virtually any area. However, landing an unmanned helicopter isn’t easy, and it’s the one part of their operation which cannot be carried out autonomously. A trained operator has to land the craft by remote control, and remote landing of a helicopter is particularly tricky. A cushion of air builds up underneath the craft as it descends, and this has to be spilled, by slowing down the rotors at a controlled rate, before it can settle on the ground. Any mistakes tend to lead to the helicopter tipping over, which is catastrophic, as the large amount of energy stored in the spinning blades will then be dissipated by smashing the whole craft (and probably any equipment on board) to pieces.

Engineering electronics developer Roke Manor has been developing a system to automate HUAV landing, and it’s based on a technology which is familiar to millions — but not from where they might expect. Who could think that the system which tells tennis umpires whether a ball is in or out, or shows TV viewers whether a delivery is LBW, might guide a drone helicopter in to land?

The Hawk-Eye system was developed by Roke in the 1990s and works by analysing the data from fixed cameras to extrapolate information about how a ball is moving in three dimensions. The HUAV landing system works in exactly the same way, but in reverse: the camera is mounted on the moving helicopter, and its on-board data processing uses the images to work out the motion of the helicopter in relation to its landing position, which is generally (but not always) fixed.

In fact, as Roke’s Future Technology Manager Peter Lockhart explained, the system was developed for use on any UAVs, as they all need to be remote-piloted for landing, unless they are to be recovered by parachute. ‘But we were testing this on our own site at Roke, and it just isn’t big enough to fly fixed-wing UAVs. As it happens, helicopters are the hardest to land anyway, so that suited both our purposes — we could control the experimental and testing phase without going offsite and tackle the most challenging aspect of the problem.’

Ed Sparks, consultant engineer at Roke and one of the original developers of Hawk-Eye, said that the relationship between the two systems is direct: ‘Hawk-Eye tells you where the ball is, we look at the landing pad and work out where we are in relation to it.’

The visual processing system works out the helicopter’s roll, pitch and yaw in relation to the ground. There are distinct advantages to using this system rather than accelerometers and gyroscopes, which give an absolute measurement of orientation, Sparks explained. ‘With accelerometers, gravity is a very large acceleration which is applied constantly while the craft is flying, so to prevent you from confusing gravity with your own motion, you need an extremely accurate accelerometer,’ he said. ‘Also, the accelerometer tells you your attitude relative to where you started, so if it’s running throughout an hour’s flight, that’s an hour’s worth of errors it could have accumulated.’

The visual system, on the other hand, is unaffected by these sorts of errors. ‘You turn it on when you’re starting to make your landing approach, and you see exactly what it sees on the ground,’ Sparks said. ‘The landing system measures your position relative to the specified landing spot, from exactly where you are to exactly where you want to be, so it’s minimising the errors from the word go.’

One of the most important criteria for developing the system was that it had to be entirely self-contained on board the HUAV. ‘We don’t want any reliance at all from information passed up from the ground,’ Sparks said. This meant that all the image processing hardware had to be on board as well as the camera itself. The camera and the UAV itself were off-the-shelf products, and Roke brought in SME UAV manufacturer Blue Bear Systems, which developed a new variant of a lightweight computing platform with bespoke video units to house Roke’s image processing software. The team also worked with the aeronautics department of Bristol University, which has a long history of working with autonomous systems, to work on the control theory for the system, in particular the algorithms which take the visual measurements and turn those into guidance commands for the helicopter.

Another partner in the collaboration was MBDA, a large aerospace manufacturer, which brought its expertise on flight control algorithms to bear on solving the problem of what happens when the landing platform is moving as well as the UAV (if it has to land on a flat-bed truck, for example). 'They do a lot of work on controlling two platforms to optimum use,' Sparks said. Roke acted as the system integrator as well as providing the UAV itself and the image processing know-how.

The result is a system which allows the UAV to land in any conditions where the ground is visible. ‘Basically, we can operate in any conditions in which a piloted helicopter can operate,’ said Lockhart. ‘Landing at night isn’t a problem. Thick fog would be, but you wouldn’t be flying in those conditions anyway.’

The system requires no human intervention at all to land, and in many cases the UAV will have a camera trained on the ground anyway, as many UAV flights are for reconnaissance purposes. However, among the possible applications for this system is the unmanned, autonomous resupply of troops in difficult-to-reach locations, reducing the risk to helicopter pilots and other personnel.

The next phase of the research is aimed at making the system even more user-friendly, with features such as point-and-click navigation. 'An unskilled operator could just click on an area he or she wanted to investigate, or wanted to designate as a landing area,' Lockhart said.

The Roke team was particularly pleased with the speed of the project. 'We've brought a lot together in a very short time,' Sparks said. 'We started in the spring of 2009, and we were landing the helicopter by the summer. In the autumn we did our first landing on a moving target. We're now in a position to start selling the system, and we have a number of leading UAV vendors who have it in trials to decide whether they want to install it on their platforms. These UAVs are multi-use, so the first application depends on who buys it, but it's likely to be defence or police.'

Read more…