
From this interview in 2009, a totally unambiguous statement on model aircraft (recreational UAS):

“The FAA certainly doesn’t want to regulate model aircraft,” Talbert said. “Model aircraft are not to exceed 55 pounds or fly at an altitude of more than 400 feet. They must remain within the line of sight and not operate within three nautical miles of an airport or heliport,” he said.

This seems to have been (and still is) the consensus at the FAA for years, regardless of the amateur arguments that AC 91-57 is to be ignored. Note that he gives a weight limit as well.

Read more…

Testing a DIYDrones magnetometer in HK's GCS

Testing a DIYDrones magnetometer in HK's GCS from Pete Hollands on Vimeo.


I test the DIY Drones magnetometer (HMC5843 magnetometer breakout board) using Happy Killmore's Ground Control Station, connected via the SERIAL_UDB_EXTRA telemetry format to a UAV DevBoard V3 running MatrixPilot (revision 729 of trunk).

photos:
picasaweb.google.com/peter.hollands/Twinstar2Build#5547623052830239122

UAVDevBoard / MatrixPilot:
code.google.com/p/gentlenav/wiki/Home?tm=6

HK GCS:
code.google.com/p/happykillmore-gcs/

DIY Drones Store:
store.diydrones.com/HMC5843_Triple_Axis_Magnetometer_p/br-hmc5843-01.htm

Self-calibrating mathematics, and software to integrate via I2C, written by William Premerlani:
diydrones.com/profile/WilliamPremerlani

Source Code:
code.google.com/p/gentlenav/source/browse/trunk/libUDB/magneto_udb.c



Read more…

PhuBar: an open source heli gyro

Here's a nice little open source project that replaces the flybar in helis with a little gyro board. From the writeup:


"PhuBar is a stabilizer device for micro-sized R/C helicopters. It is based on the Parallax Propeller processor and the ITG-3200 gyro chip. It is not intended to be a UAV controller or inertial navigation unit. It simply provides stabilization similar to that of a physical weighted flybar, making small helicopters easier to fly, and making the physical flybar unnecessary."
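The core idea, gyro rate damping, can be sketched in a few lines. This is an illustrative Python sketch, not the PhuBar Propeller firmware; the function names and the gain are placeholders:

```python
# Illustrative flybar-style rate damping: gyro roll/pitch rates are fed
# back into the cyclic commands. A physical flybar does the same job
# mechanically, resisting rotation and phasing that back into the
# swashplate. Gain and names are hypothetical.

def stabilize(pilot_roll, pilot_pitch, gyro_roll_rate, gyro_pitch_rate,
              gain=0.5):
    """Mix pilot cyclic input (normalized to [-1, 1]) with gyro rate
    feedback (rad/s); returns clamped roll and pitch commands."""
    roll_cmd = pilot_roll - gain * gyro_roll_rate
    pitch_cmd = pilot_pitch - gain * gyro_pitch_rate
    clamp = lambda x: max(-1.0, min(1.0, x))  # keep within servo range
    return clamp(roll_cmd), clamp(pitch_cmd)
```

An unwanted roll rate thus produces an opposing cyclic command, which is exactly the damping a weighted flybar provides.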

Read more…

Autonomous quadrotor navigation with a Kinect

We knew this day was coming, but it's great to see it so soon: UC Berkeley has a Kinect onboard a Quad doing optical obstacle avoidance.

From the video description:

"This work is part of the STARMAC Project in the Hybrid Systems Lab at UC Berkeley (EECS department). http://hybrid.eecs.berkeley.edu/

Researcher: Patrick Bouffard
PI: Prof. Claire Tomlin

Our lab’s Ascending Technologies [1] Pelican quadrotor, flying autonomously and avoiding obstacles.

The attached Microsoft Kinect [2] delivers a point cloud to the onboard computer via the ROS [3] kinect driver, which uses the OpenKinect/Freenect [4] project’s driver for hardware access. A sample consensus algorithm [5] fits a planar model to the points on the floor, and this planar model is fed into the controller as the sensed altitude. All processing is done on the on-board 1.6 GHz Intel Atom based computer, running Linux (Ubuntu 10.04).

A VICON [6] motion capture system is used to provide the other necessary degrees of freedom (lateral and yaw) and acts as a safety backup to the Kinect altitude: in case of a dropout in the altitude reading from the Kinect data, the VICON-based reading is used instead. In this video, however, the safety backup was not needed.

[1] http://www.asctec.de
[2] http://www.microsoft.com
[3] http://www.ros.org/wiki/kinect
[4] http://openkinect.org
[5] http://www.ros.org/wiki/pcl
[6] http://www.vicon.com"
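The floor-fitting step described above can be sketched as a plain RANSAC-style plane fit. The real system uses the PCL sample consensus implementation [5]; the Python below is only an illustration of the idea, and the iteration count and tolerance are placeholders:

```python
import random

def fit_floor_plane(points, iters=200, tol=0.02, seed=0):
    """RANSAC-style plane fit. points: list of (x, y, z) tuples in the
    camera frame. Returns (normal, d) with n.p + d = 0 for points on
    the plane and the normal unit-length."""
    rng = random.Random(seed)
    best, best_inliers = None, -1
    for _ in range(iters):
        p1, p2, p3 = rng.sample(points, 3)
        u = [p2[i] - p1[i] for i in range(3)]
        v = [p3[i] - p1[i] for i in range(3)]
        # Plane normal from the cross product of two edge vectors
        n = [u[1]*v[2] - u[2]*v[1],
             u[2]*v[0] - u[0]*v[2],
             u[0]*v[1] - u[1]*v[0]]
        norm = sum(c * c for c in n) ** 0.5
        if norm < 1e-9:          # degenerate (collinear) sample
            continue
        n = [c / norm for c in n]
        d = -sum(n[i] * p1[i] for i in range(3))
        # Count points within tol of this candidate plane
        inliers = sum(1 for p in points
                      if abs(sum(n[i] * p[i] for i in range(3)) + d) < tol)
        if inliers > best_inliers:
            best, best_inliers = (n, d), inliers
    return best

# With a unit normal, the sensed altitude is the distance from the
# camera origin to the plane, i.e. |d|.
```

The consensus step is what makes this robust: a sample containing an obstacle point produces a tilted plane with few inliers, so the dominant floor plane wins.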


(via Trossen Robotics)

Read more…

Improved Altitude Estimate

The barometric sensor alone in most cases provides noisy and drift-prone measurements. I have been trying to come up with a simple way to implement a fixed-gain filter that uses the acceleration measurements from the accelerometers to help clean up the noise and also adds a sanity check via sensor fusion.


The cool part about the filter is that it also estimates vertical velocity, which is useful for altitude damping and control. The basic idea behind the filter is that the Z (vertical) component of acceleration, derived using the body rotations, is compared with the barometric measurement. The noise of the barometric sensor is rejected whenever the accelerometers disagree. The downside to this approach is that it depends on the accelerometers being less noisy than the barometric sensor, and on the vehicle not being vibration-intensive. The theory works, but the actual implementation is somewhat hardware dependent.

The real plus of this filter is that the velocity estimate can be used for altitude damping and control. Instead of taking the derivative of the barometric sensor, which significantly increases the noise in the derivative term, you get a much cleaner vehicle state estimate with this approach.

The code is fairly simple but may require tuning based on units and hardware. I took the fixed gain approach to keep it simple and as a result looks like the following:

The thing to note here is that these values are all relative to the NED reference frame (i.e. acc_z is not the body z acceleration; it is the NED z acceleration).

Enjoy-

-Beall

Read more…

ArduCopter Parts List (BOM) Example

ArduCopter individual components - $524.38
* 1x Frame - Buy here for $150.00 (This Frame includes Power distribution board)
* 1x Power distribution board
* 4x Motors 850kv - Buy here for $16.00 ea. ($22 option available)
* 4x ESCs 20Amp - Buy here for $16.00 ea.
* 4x Propellers 10x45 - Buy here for $9.00 for 2
* 1x ArduPilot Mega Flight controller board - Buy here for $59.95
* 1x ArduPilot Mega OilPan IMU board - Buy here for $159.95
* 2x 3 row pins for APM - Buy here for $1.99 ea
* 2x Conn Header Male 36. Straight - Buy here for $2.25 ea.
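As a quick sanity check, the quoted component total can be reproduced by summing the line items (quantities and prices as listed above):

```python
# Check that the per-item prices add up to the quoted $524.38.
# The frame includes the power distribution board, so that line
# contributes nothing extra.
items = [
    (1, 150.00),  # frame (incl. power distribution board)
    (4, 16.00),   # motors, 850 kV
    (4, 16.00),   # ESCs, 20 A
    (2, 9.00),    # propellers: $9.00 per pair, 4 props = 2 pairs
    (1, 59.95),   # ArduPilot Mega flight controller
    (1, 159.95),  # ArduPilot Mega OilPan IMU
    (2, 1.99),    # 3-row pins for APM
    (2, 2.25),    # 36-pin straight male headers
]
total = sum(qty * price for qty, price in items)
print(f"${total:.2f}")  # $524.38
```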

ArduCopter Quad v1.0 KIT (I believe this kit includes the above components along with the GPS option 1 below)
* ArduCopter Quad v1.0 KIT, Full Electronics - Buy here for $499
* Requires TX RX and Lipo Battery

Magnetometer - More info

Option 1 - $46.85
* 1x Magnetometer - Buy here for $44.90
* 1x 15cm cable - Buy here for $1.95

GPS - More info

Option 1 (MediaTek) - $39.90
* 1 x MediaTek 3329 GPS - Buy here for $37.95
* 1 x 10 cm Cable (for GPS) - Buy here for $1.95

Telemetry - More info

Option 1 (XBee 900 MHz) - $130.80
* 1x Xbee pro 900 w/wire antenna - Buy here for $42.95
* 1x XBee-PRO 900 RF module - Buy here for $44.95
* 1x 900MHz Duck Antenna RP-SMA - Buy here for $7.95
* 1x XBee Adapter kit - Buy here for $10.00
* 1x XBee Explorer USB - Buy Here for $24.95

Accessories

* FlySky FS-TH9X 2.4G 9CH Radio (TX / RX) - Buy here for $59.99
* Lipo Battery - Buy here for $23.18
* Lipo Battery charger

Read more…

Hortens and APM

Hi,

I want ArduPilot Mega to control a Horten-style airplane. My aim is to use APM in this 1:5 semi-scale model of a Schapel SA-882 (see http://www.twitt.org/schapel.htm for a description of the original).





The model has a span of about 2.08 m and is made of plywood with silk covering. It was originally a microturbine airplane (with Martin Lambert's 20 N Kolibri) and had a TOW of about 4.9 kg (including 1.2 litres of kerosene). I replaced the microturbine with a Scorpion S-4025-01 electric motor driving a 13x8" propeller on 6S LiPo. Now it has a TOW of about 4 kg. I still have to do the maiden flight with that setup...

Before I go into using APM as an autopilot, I want to use it as a logger to analyse the design. I've been doing some work on the code to include the changes I need (I also added functionality for the AttoPilot current sensor) and wrote a program to extract the data from the logs. It's the first time I have ever worked with a microcontroller, but thanks to Arduino it's the same as programming in C++ :D.

Yesterday I installed the APM in the smaller sister of the above airplane, namely a 1:8 model of the Schapel. This small model was designed as a speed model, in contrast to the larger one, which was designed for more acrobatic flying. Nevertheless, the small one reaches speeds of about 250 km/h on an 800 W setup (a friend's, not mine), while the large one also reached speeds well above 200 km/h with the turbine.



As always, one makes errors the first time. I installed the APM flipped, so the yaw information is mirrored. Nevertheless, despite yesterday's rough, windy ride I was happy to make four logs during powered and gliding flight. I was also somewhat "rusty": due to the bad weather here in Germany, it had been about two months since I last flew. Visibility was also poor and it was freezing (-3 °C). So I was very happy not to have broken either the airplane or the APM. Here is one plot:



The figure displays a gliding phase. The phugoid oscillation is clearly visible (the long-period oscillation in altitude and velocity). In theory, its period should depend only on gravity and flight speed, which is more or less confirmed by the data. A strong oscillation in pitch is seen up to 880 s; I was probably still correcting the plane's attitude after turning. Afterwards we find two oscillations: one with a period of about 1.5 s (data between 885 and 890 s) and a faster one with a period of about 0.7 to 0.75 s (data between 898 and 900 s). The faster one is probably the so-called short-period oscillation in angle of attack, and the slower one is the second harmonic of this oscillation. To display the faster one better, a rate higher than 10 Hz would be needed; I would have to turn on the 50 Hz logging.
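For reference, the classical Lanchester approximation ties the phugoid period to airspeed and gravity alone, which is the dependence mentioned above. A quick calculation (the 20 m/s airspeed is only an example, since the trim speed for this glide is not stated):

```python
import math

def phugoid_period(v_ms, g=9.81):
    """Lanchester approximation T = pi * sqrt(2) * V / g: the phugoid
    period depends only on airspeed and gravity (assumes constant
    angle of attack and negligible drag)."""
    return math.pi * math.sqrt(2) * v_ms / g

# e.g. at a glide speed of 20 m/s (72 km/h):
print(round(phugoid_period(20.0), 1))  # ~9.1 s
```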


BTW, the measurement shows a lift coefficient CL of about 0.6 and a glide ratio of about 6 (i.e., CD = 0.1), which is not fantastic, but for a small airplane it is not too bad either. I guess I was flying the plane near stall, as my simulations tell me that stall occurs at about CL = 0.6. Essentially the measurement confirms my simulations, as the glide ratio was calculated to be 9 at stall (not including the cabin). Next time I'll trim the airplane to 60-65 km/h, where it should theoretically have a glide ratio of 16 (realistically probably 12 or 13).
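The relations behind these numbers can be checked with a short sketch. The wing area argument below is a placeholder, since the model's wing area is not stated in the post:

```python
RHO = 1.225  # sea-level air density, kg/m^3

def lift_coefficient(mass_kg, v_ms, wing_area_m2, g=9.81):
    """CL from steady-flight lift = weight: CL = 2 m g / (rho V^2 S)."""
    return 2 * mass_kg * g / (RHO * v_ms ** 2 * wing_area_m2)

def glide_ratio(cl, cd):
    """In a steady glide, L/D = CL/CD = ground distance per unit sink."""
    return cl / cd

# CL = 0.6 and CD = 0.1 give the glide ratio of 6 quoted above:
print(round(glide_ratio(0.6, 0.1), 2))  # 6.0
```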


I will keep you informed on further progress ;)


Regards,


Andrés





Read more…

Visual Autolanding System for Rotorcraft

Yet another visual autopilot system looking beyond the limitations of accelerometers and gyroscopes.

From The Engineer: The technology of unmanned aerial vehicles (UAVs) has advanced so far that they're now mainstream technology, particularly in the military arena. Despite this, they are still developing, and one of the major gaps in the use of one of the most versatile varieties of UAV may soon be closed.

Helicopter UAVs (HUAVs) have all the advantages of manned helicopters, such as hovering flight and the vertical take-off and landing which allows them to be operated from virtually any area. However, landing an unmanned helicopter isn’t easy, and it’s the one part of their operation which cannot be carried out autonomously. A trained operator has to land the craft by remote control, and remote landing of a helicopter is particularly tricky. A cushion of air builds up underneath the craft as it descends, and this has to be spilled, by slowing down the rotors at a controlled rate, before it can settle on the ground. Any mistakes tend to lead to the helicopter tipping over, which is catastrophic, as the large amount of energy stored in the spinning blades will then be dissipated by smashing the whole craft (and probably any equipment on board) to pieces.

Engineering electronics developer Roke Manor has been developing a system to automate HUAV landing, and it’s based on a technology which is familiar to millions — but not from where they might expect. Who could think that the system which tells tennis umpires whether a ball is in or out, or shows TV viewers whether a delivery is LBW, might guide a drone helicopter in to land?

The Hawk-Eye system was developed by Roke in the 1990s, and works by analysing the data from fixed cameras to extrapolate information about how a ball is moving in three dimensions. The HUAV landing system works in exactly the same way, but in reverse: the camera is mounted on the moving helicopter, and its on-board data processing uses the camera's images to work out the motion of the helicopter in relation to its landing position, which is generally (but not always) fixed.

In fact, as Roke’s Future Technology Manager Peter Lockhart explained, the system was developed for use on any UAVs, as they all need to be remote-piloted for landing, unless they are to be recovered by parachute. ‘But we were testing this on our own site at Roke, and it just isn’t big enough to fly fixed-wing UAVs. As it happens, helicopters are the hardest to land anyway, so that suited both our purposes — we could control the experimental and testing phase without going offsite and tackle the most challenging aspect of the problem.’

Ed Sparks, consultant engineer at Roke and one of the original developers of Hawk-Eye, said that the relationship between the two systems is direct: ‘Hawk-Eye tells you where the ball is, we look at the landing pad and work out where we are in relation to it.’

The visual processing system works out the helicopter’s roll, pitch and yaw in relation to the ground. There are distinct advantages to using this system rather than accelerometers and gyroscopes, which give an absolute measurement of orientation, Sparks explained. ‘With accelerometers, gravity is a very large acceleration which is applied constantly while the craft is flying, so to prevent you from confusing gravity with your own motion, you need an extremely accurate accelerometer,’ he said. ‘Also, the accelerometer tells you your attitude relative to where you started, so if it’s running throughout an hour’s flight, that’s an hour’s worth of errors it could have accumulated.’
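Sparks' point about accumulated error is easy to quantify: a constant accelerometer bias, double-integrated into position, grows quadratically with time. A quick illustration (the bias value is hypothetical):

```python
def drift_from_bias(bias_ms2, t_s):
    """Position error from double-integrating a constant accelerometer
    bias: e(t) = 0.5 * b * t^2."""
    return 0.5 * bias_ms2 * t_s ** 2

# Even a tiny 0.01 m/s^2 bias, integrated over an hour's flight:
print(round(drift_from_bias(0.01, 3600.0)))  # 64800 m of drift
```

This quadratic growth is why inertial-only navigation needs either extremely accurate sensors or an external correction, and why a vision system that measures position relative to the landing spot sidesteps the problem entirely.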

The visual system, on the other hand, is unaffected by these sorts of errors. ‘You turn it on when you’re starting to make your landing approach, and you see exactly what it sees on the ground,’ Sparks said. ‘The landing system measures your position relative to the specified landing spot, from exactly where you are to exactly where you want to be, so it’s minimising the errors from the word go.’

One of the most important criteria for developing the system was that it had to be entirely self-contained on board the HUAV. ‘We don’t want any reliance at all from information passed up from the ground,’ Sparks said. This meant that all the image processing hardware had to be on board as well as the camera itself. The camera and the UAV itself were off-the-shelf products, and Roke brought in SME UAV manufacturer Blue Bear Systems, which developed a new variant of a lightweight computing platform with bespoke video units to house Roke’s image processing software. The team also worked with the aeronautics department of Bristol University, which has a long history of working with autonomous systems, to work on the control theory for the system, in particular the algorithms which take the visual measurements and turn those into guidance commands for the helicopter.

Another partner in the collaboration was MBDA, a large aerospace manufacturer, which brought its expertise on flight control algorithms to bear on solving the problem of what happens when the landing platform is moving as well as the UAV — if it has to land on a flat-bed truck, for example. ‘They do a lot of work on controlling two platforms to optimum use,’ Sparks said. Roke acted as the system integrator as well as providing the UAV itself and the image processing know-how.

The result is a system which allows the UAV to land in any conditions where the ground is visible. ‘Basically, we can operate in any conditions in which a piloted helicopter can operate,’ said Lockhart. ‘Landing at night isn’t a problem. Thick fog would be, but you wouldn’t be flying in those conditions anyway.’

The system requires no human intervention at all to land, and in many cases the UAV will have a camera trained on the ground in any case, as many UAV flights are for reconnaissance purposes. However, among the possible applications for this system is the unmanned, autonomous resupply of troops in difficult-to-reach locations, reducing the risk to helicopter pilots and other personnel.

The next phase of the research is aimed at making the system even more user-friendly, with features such as point-and-click navigation. ‘An unskilled operator could just click on an area he or she wanted to investigate, or wanted to designate as a landing area,’ Lockhart said.

The Roke team was particularly pleased with the speed of the project. ‘We’ve brought a lot together in a very short time,’ Sparks said. ‘We started in the spring of 2009, and we were landing the helicopter by the summer. In the autumn we did our first landing on a moving target. We’re now in a position to start selling the system, and we have a number of leading UAV vendors who have it in trials to decide whether they want to install it on their platforms. These UAVs are multi-use, so the first application depends on who buys it, but it’s likely to be defence or police.’

Read more…

Here is a project for a Coanda Effect Saucer (CES) stabilized with a 9DOF IMU. The CES UAV, propelled by an electric motor, uses the Coanda effect to take off vertically, fly, hover and land vertically (VTOL). There is no big rotor as on a helicopter, and the flight is very stable and safe for the surroundings.

The Coanda effect was discovered in 1930 by the Romanian aerodynamicist Henri-Marie Coanda (1885-1972). He observed that a stream of air (or other fluid) emerging from a nozzle tends to follow a nearby curved surface, if the curvature of the surface, or the angle the surface makes with the stream, is not too sharp.

I use my AutoStab v4.0 firmware installed on an ArduIMU+ V2, mounted flat, with a special mixer for this device.

Stay tuned on this blog, more to come soon,

Jean-Louis


Read more…

Quad Flight Testing

Here is a bit of video I have put together of my quad flight testing. I have been getting about 7 minutes of flight time with the very inefficient props I have at the moment, and will be putting larger two-blade props on it when they turn up.



The flight I did after this video was a little less successful. Doing some more aggressive flying, I found out that a heavy quad does not like high rates of descent: it seemed to get a lot of prop wash spilling around and being drawn back into the props, stalling them (not very fun :-o). It got away from me after that; APM could not keep it level, nor could I, and I dumped it into a tree at about 30-40 km/h :-(
The result was this:


It looks pretty bad, but after checking it all I only broke one prop and the plastic joiner in the middle.

So if you have a heavy or larger quad, be careful with descent rates, particularly at no or slow forward speed.


Read more…

DIY Radio Altimeter ::Updated::




Here is a DIY radio altimeter designed by Matjaž Vidmar. It is a good start for anyone looking at a non-GPS method of finding altitude while landing.

There are no schematics or source code available on the site, but it is a good design reference, and it does give the PCB layout of the microwave circuit. That alone helps immensely, as that is the most difficult part of an RF design.

::Update::
Found another vertical radar project on his website with much more documentation.
Read more…


Hi,

This might be a bit off from the main interest on this forum, but then, you never know.

For my part, the thrill of UAVs was originally the interest in what they could be used for, not the UAVs themselves. But like most newcomers, I soon learned to tune my expectations for small UAVs down a level or two. Strangely enough, it wasn't all that easy. Lots of the requested functionality isn't there (yet), more skills than initially thought are needed, and so on.

After the T3-4 (map a quarter of a mile) competition, I started to explore what software was around for image processing and the creation of digital elevation models (DEMs) and orthophotos. I discovered that there were quite a few packages, but with price tags usually starting at about 5K USD or more, and not necessarily particularly easy to use. There were also open source and low-cost packages, some with limited or specialized functionality, and often with high demands on the user's qualifications.

Well, not to make this an essay, I will jump to the point. After I became aware of MS Photosynth and similar software, things started to brighten. With Photosynth (and similar tools), it is basically possible to extract the same kind of information as from high-cost laser scans: not the same amount of data, but in many respects with comparable possibilities for utilization. The problem, however, seemed to be that Photosynth point clouds are not georeferenced, and can therefore not be directly imported into other software for further processing.

Therefore, when I finally got some spare time, I started to develop a software tool for exactly this: georeferencing of point clouds. By now I have functional software, but still need to do more testing. So I want to get in touch with others who have the same interest in processing aerial photos, and who use UAVs for this purpose.

If so, and if you have some time to spend on experimenting with new software, please send me a PM.

So for the software, what does it do?

PC-GeoRef (Point Cloud GeoReferencing)

PC-GeoRef is, as the name suggests, a tool for georeferencing 3D point clouds. I have only tested the software on point clouds derived with MS Photosynth so far, but in theory it should work on point clouds from other sources as well (probably requiring new routines for reading the different file formats).

Basically, the process consists of two parts:

1. Identifying points in the point cloud that correspond to a set of points with known coordinates, GCPs (Ground Control Points).

2. Performing a coordinate transformation on the point cloud in order to georeference it.

The idea is to use the (georeferenced) point clouds as a basis for building real-world digital elevation models and orthophotos, and for further processing in other software, e.g. LIBLAS tools, GRASS GIS, AutoCAD, ArcGIS, etc. (Points are simple objects and can relatively easily be exported to most file formats.)

The process:

The process starts by acquiring a point cloud, e.g. by using Photosynth, and some tool like PhotosynthPointCloudExporter for retrieving the points. The synth shown below was made as part of the T3-4 competition here at DIYDrones and can be viewed here: http://photosynth.net/view.aspx?cid=1d31dfca-a6b5-41ef-9e08-7717b5bf1a49


Notice the red dot on the roof of the small house in the picture below (added in MS Paint after the competition). That is one of the GCPs.



By using a bright color that differs from the surroundings, there is a good chance for Photosynth to notice it and store it during the image-matching process. Since Photosynth also stores the color of the registered points, that can be used to find the GCP in the point cloud, even if the point cloud consists of several thousand points.

This is where PC-GeoRef comes into the picture. The process starts by opening a picture with a GCP marker in it and clicking on the marker in order to learn its color by RGB value.


Then the point cloud file is read and filtered by color. These points then become candidates for automatic matching with GCPs with known coordinates. (The red lines to the left have the same colour as the filtered points from the point cloud.)
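The color-filtering step might look like the following. This is an illustrative Python sketch, not the actual PC-GeoRef code, and the tolerance is a placeholder:

```python
def filter_by_color(points, target_rgb, tol=30.0):
    """Keep points whose color lies within tol (Euclidean RGB distance)
    of the marker color learned by clicking in the picture.

    points: iterable of (x, y, z, r, g, b) tuples, as stored by
    Photosynth alongside the geometry."""
    tr, tg, tb = target_rgb
    return [p for p in points
            if ((p[3] - tr) ** 2
                + (p[4] - tg) ** 2
                + (p[5] - tb) ** 2) ** 0.5 <= tol]
```

The surviving points are the candidates that then get matched against the known GCP coordinates.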


Next, a test transformation is performed on the matched points, and finally a transformation of the whole point cloud is carried out. This is commonly called a Helmert or affine transformation. I am using a custom method for this transformation. It currently uses 7 parameters (translation in x, y, z, rotation about the x-, y- and z-axes, and one scale for all directions). The method can, however, easily be extended to individual scaling of each axis. (For that I will need more matched points than the three I found in my first successful test synth.)
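Applying such a 7-parameter (Helmert) transform can be sketched as follows. This is illustrative Python, not the custom method from the post, and the rotation order Rz·Ry·Rx is an assumption, as the post does not specify it:

```python
import math

def helmert_apply(points, tx, ty, tz, rx, ry, rz, s):
    """Apply a 7-parameter similarity (Helmert) transform: one scale s,
    rotations rx, ry, rz (radians) about the axes, and a translation
    (tx, ty, tz). Rotation order assumed to be R = Rz @ Ry @ Rx."""
    cx, sx = math.cos(rx), math.sin(rx)
    cy, sy = math.cos(ry), math.sin(ry)
    cz, sz = math.cos(rz), math.sin(rz)
    R = [
        [cz * cy, cz * sy * sx - sz * cx, cz * sy * cx + sz * sx],
        [sz * cy, sz * sy * sx + cz * cx, sz * sy * cx - cz * sx],
        [-sy,     cy * sx,                cy * cx],
    ]
    out = []
    for x, y, z in points:
        # p' = s * R * p + t
        out.append((s * (R[0][0]*x + R[0][1]*y + R[0][2]*z) + tx,
                    s * (R[1][0]*x + R[1][1]*y + R[1][2]*z) + ty,
                    s * (R[2][0]*x + R[2][1]*y + R[2][2]*z) + tz))
    return out
```

Estimating the seven parameters from the matched GCP pairs (e.g. by least squares) is the other half of the job; the sketch only shows the forward transform.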



So for the results: as can be seen from the example above (all numbers are in meters), I found an almost incredibly high accuracy in the point cloud derived from Photosynth. In this first test, I actually found the differences in distance measurements to be in the range of plus/minus 30 cm, for GCPs spaced more than 100 m apart. I will of course need to do more testing before I can tell what accuracy can be expected, but the first results were indeed interesting.

What's next:

- More testing (alpha/beta testers wanted...),

- Writing file-export to a few formats,

- Cleaning up the code,

- Demonstrating production and use of DEMs,

- etc,

- Then possibly selling the software (low cost ;-)



Read more…
It will appear on the right shortly, but it's too good not to repost here for those who might be interested but don't read the feeds.

Certainly something to think about over the holidays.

Ascending Technologies GmbH from Germany is looking to sponsor an international UAV research team in participating in any international UAV Contest (e.g. IARC) in 2011, or in demonstrating their groundbreaking developments in the field of UAVs in any public event in 2011.

The sponsored team will receive 15,000 Euros of hardware, software, individual modifications and support from Ascending Technologies GmbH’s multi-rotor product line.

The nominee shall be at the forefront, setting the benchmark in UAV research in 2011. This sponsorship will enable the team to focus on developing their own ideas instead of trying to get a UAV flying.

A short description of the mission objectives (of the challenge the team plans to attend, or of what the project is about) and a brief abstract for publication on www.asctec.de are required. The full application must be sent via email to Ascending Technologies by 13 February 2011.

Ascending Technologies is a young and highly innovative manufacturer of flying multi-rotor platforms for university and research applications. The company is known for its experience in the design and production of high-quality UAVs, which are used worldwide in many research laboratories and R&D departments working on aerial robots.

In 2008, Ascending Technologies won several awards at the European Micro Air Vehicle Conference (EMAV). Last year, the winner of the International Aerial Robotics Competition (IARC) used the AscTec Pelican aerial robot. In October 2010, a new endurance world record in laser-powered flight was achieved in cooperation with LaserMotive LLC, Seattle: the AscTec UAV flew for over 12 hours.


They are top-banana people, so this would be a great way to get to work with them.

Read more…

QGroundControl with Google Earth 3D Trees


It was just one of those casual quick hacks (about 10 lines of code), but QGroundControl can now also use Google Earth for visualization, including the new trees feature. We're just at the beginning of the integration; a binary including this feature should be available next week.

While Google Earth is very shiny, it does not allow adding additional visualization components with the same degrees of freedom as OpenSceneGraph and osgEarth. We will therefore push the OpenSceneGraph/osgEarth integration further, as osgEarth also allows for better offline use without an internet connection. As our MAVs collect aerial images, osgEarth allows rendering them directly into the scene (as 3D mapping), a feature missing in Google Earth. But since we now support the best of both worlds, QGroundControl users have all choices at hand.


Read more…

In this FAA UAS risk analysis they state unambiguously that AC 91-57 MUST be complied with for model aircraft (recreational UAVs). So much for all that "it is a request, not a law" argument. Apparently the AMA rules have been telling people to break the law for the last 30 years.

Read page 115 of this 2008 FAA document.

"Unmanned aircraft flown for hobby purposes or recreation. Model aircraft must comply with Advisory Circular (AC) 91-57, Model Aircraft Operating Standards, published in 1981"

Read more…