The guys at iFixit have done it again: a great step-by-step teardown of the Parrot AR.Drone. Here's just one sample page:
Front of motherboard:
- Parrot P6 ARM9 468 MHz processor.
- Atheros AR6102G-BM2D b/g Wi-Fi module.
- Micron OGA17 D9HSJ memory.
- Vertical camera.
On the back:
- Micron 29F1G08AAC flash.
If you're at all handy with 3D drafting tools, you can have the most amazing objects printed for you at Shapeways. Above is just one example, a cockpit for a P-51 Mustang.
You can print in a wide range of materials, from flexible plastic to stainless steel. And it's remarkably cheap, usually around $2-$3 per cubic centimeter.
Shapeways can accept output from many 3D authoring tools, but I prefer Alibre, which is designed for physical objects and is affordable ($99). Lots of people use the free Sketchup, too, although because it's designed for virtual objects it can be a little tricky to ensure that your design will print properly. Others use everything from Blender (open source, crazy hard to use) to Solidworks ($5,000!).
Here's another example: an EasyStar FPV cockpit mount that Jason Short designed and 3D printed. Pretty cool, huh?
Coming out of the 3D printer:
Then there's MAVlink. Their Comm Protocol page http://pixhawk.ethz.ch/wiki/mavlink/#mavlink_packet_documentation links to an SAE AS5669A document, found here: http://www.sae.org/servlets/productDetail?PROD_TYP=STD&PROD_CD=AS5669A, for which they want $63 for a standards PDF that still won't tell me what I need to know to decode their checksum. Apparently they're using a CRC16 checksum that involves bit shifting, a task not easily handled by .NET. So I've downloaded and converted 4 different checksum functions (1 in C, 2 in C# and 1 in VB6) to VB.NET. None of them successfully generates the checksums I'm getting from the latest MAVlink APM output.
So I refer back to the MAVlink Comm Protocol document. Perhaps I'm looking at the wrong data. In the case of NMEA, they don't include the $ or * in the checksum calculation. In uBlox, they don't include the leading uB sync bytes in the checksum... but what do you know: the specification makes no mention of what they don't include. Is it everything? Do they leave off the "U"? No way to tell, so I guess we assume everything is included. But so far, no luck in cracking this nut... I've got about 8 hours in it so far.
So some might say: skip the checksum for now. Just read the "U" as the header, take the next byte as the payload length, and go from there. Well, this is an EPIC FAIL of the MAVlink specification. In a purely homogeneous environment where ONLY MAVlink messages exist, worst case (assuming no transmission fails) you'll only lose the first message to a false positive on the header. Then it gets pitched on the next pass and you're off and running: "U" is recognized, length of message, checksum matches, all is good. Decode, discard, on to the next. Trouble is, if you're not in a 100% MAVlink environment, or you have to be ready to handle other protocols, then a "U" has a 1 in 256 chance of occurring in a non-MAVlink sentence. With a 2-character header (like SiRF, MediaTek and uBlox), there is a 1 in 65,536 chance of a false positive. With a 3-character header (like ArduPilot, NMEA and UDB), there's a 1 in 16,777,216 chance of a false positive.
Does anyone have any clue how to correctly calculate a CRC16 checksum in VB.NET? There are lots of "lightning fast" CRC16 routines out there that make use of external C DLLs or some assembler code. I'm not interested in lightning fast... I just need it to work.
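As far as I can tell from MAVlink's reference implementation, the checksum is the X.25 CRC-16 (also catalogued as CRC-16/MCRF4XX): seed 0xFFFF, reflected polynomial 0x8408, no final XOR, accumulated over every byte of the packet after the 0x55 ("U") start marker. In C it's only a few lines, and a VB.NET port is a straight transliteration of the shifts:

```c
#include <stdint.h>
#include <stddef.h>

/* One accumulate step of the X.25 / CRC-16-MCRF4XX checksum,
   matching MAVlink's reference crc_accumulate() routine. */
static void crc_accumulate(uint8_t data, uint16_t *crc)
{
    uint8_t tmp = data ^ (uint8_t)(*crc & 0xff);
    tmp ^= (uint8_t)(tmp << 4);
    *crc = (uint16_t)((*crc >> 8) ^ ((uint16_t)tmp << 8)
                      ^ ((uint16_t)tmp << 3) ^ ((uint16_t)tmp >> 4));
}

/* CRC over a buffer, seeded with 0xFFFF. For a MAVlink packet,
   feed it every byte after the 0x55 start marker (length byte
   through the end of the payload). */
uint16_t crc_calculate(const uint8_t *buf, size_t len)
{
    uint16_t crc = 0xFFFF;
    while (len--)
        crc_accumulate(*buf++, &crc);
    return crc;
}
```

A sanity check: the standard test vector "123456789" should come out as 0x6F91 for this CRC variant. If the checksums still don't match, make sure the start byte itself is being excluded from the calculation.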
My father, who is a county land surveyor, pointed this article out to me. Looks like a neat system.
"asked our readers to select the one they believed would have the most impact on the surveying and mapping professions. The results are in. Out of a total of 697 votes, the Gatewing X100 Unmanned Airborne Vehicle is the winner with 48 percent of the votes."
http://www.pobonline.com/Articles/Features/BNP_GUID_9-5-2006_A_10000000000000950297
Some Specs:
Weight: 2100 g (4.6 lb) without flight battery
Span: 102 cm (40")
Airframe: black anodized aluminium square tube
Motors: Turnigy 42-50 600 kV
Props: Master Airscrew 12x6 3-blade
ESCs: 60 A HobbyKing SS ESC
Batteries: 3- or 4-cell 5 Ah LiPo flight battery and a 2-cell 1800 mAh LiPo for avionics
I hope to get some flying videos in the next day or two.
It arrived today, and I'm afraid it's another case of "close, but no cigar". It does get some things right: ailerons, a brushless motor and a decent-sized rudder, which are all missing features with the EasyStar.
It also avoids the errors of some of the other EasyStar clones, such as the AXN Floater, and at least has the servo bays on the outside of the body, not taking up room in the cockpit. (Sadly it doesn't come with servos).
But it fails badly in the design of the fuselage, which is considerably narrower than the EasyStar's. The useful width of the cockpit is 43mm, as opposed to the EasyStar's 55mm. According to flight reports, this thin nose is also vulnerable in crashes and breaks off a lot.
Here's a picture of the EasyStar (a very battered one), the Dynam HawkSky and the Phoenix nose-to-nose:
Of all the EasyStar clones, I still prefer the HawkSky, despite it having a kind of funky plastic power pylon that vibrates and makes noise in flight. It's got lots of room and comes with servos already installed. Shame you can't buy it without the 72 MHz 4-channel RC gear, which you simply have to throw out when it arrives and replace with proper 7-channel 2.4 GHz stuff.
Someday either Multiplex will finally update the EasyStar and ship it with a proper brushless motor and prop, or the cloners will get it right. In the meantime, I'll keep looking.
By imitating how honeybees see, engineers and researchers at The Vision Centre, the Queensland Brain Institute and the School of Information Technology and Electrical Engineering at The University of Queensland have allowed aircraft to quickly sense which way is “up”. This makes it possible for planes to guide themselves through extreme manoeuvres, including the loop, the barrel roll and the Immelmann turn, with speed, deftness and precision.
“Current aircraft use gyroscopes to work out their orientation, but they are not always reliable, as the errors accumulate over long distances,” said Vision Centre researcher Saul Thurrowgood.
“Our system, which takes thousandths of a second to directly measure the position of the horizon, is much faster at calculating position, and more accurate.”
“With exact information about the aircraft's surroundings delivered in negligible time, the plane can focus on other tasks.”
The group first “trained” the system to recognise the sky and the ground by feeding hundreds of different landscape images to it and teaching it to compare the blue colour of the sky with the red-green colours of the ground.
Simple, low resolution cameras that are similar to a bee's visual system are then attached to the aircraft, allowing the plane to take its own landscape pictures to identify the horizon while flying.
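The article doesn't include the group's actual algorithm, but the basic idea can be sketched: label each pixel sky or ground by comparing its blue channel against red and green, then compare how much sky sits in each half of the frame to sense bank. Everything below (the blue-dominance threshold, the `roll_indicator` helper, the interleaved-RGB layout) is an assumed illustration, not the researchers' method:

```c
#include <stdint.h>
#include <stddef.h>

/* Hypothetical pixel classifier: treat a pixel as sky (1) when blue
   dominates the average of red and green, otherwise ground (0).
   A trained system would learn this boundary from labelled images. */
int is_sky(uint8_t r, uint8_t g, uint8_t b)
{
    return (int)b > ((int)r + (int)g) / 2;
}

/* Rough bank indicator for an interleaved-RGB frame of w x h pixels:
   counts sky pixels in the left and right halves of the image and
   returns left minus right. A positive value suggests the horizon is
   tilted one way, negative the other, zero roughly wings-level. */
long roll_indicator(const uint8_t *rgb, size_t w, size_t h)
{
    long left = 0, right = 0;
    for (size_t y = 0; y < h; y++) {
        for (size_t x = 0; x < w; x++) {
            const uint8_t *p = rgb + 3 * (y * w + x);
            if (is_sky(p[0], p[1], p[2])) {
                if (x < w / 2)
                    left++;
                else
                    right++;
            }
        }
    }
    return left - right;
}
```

The point of the low resolution mentioned below is visible here: the cost is one comparison per pixel, so halving the resolution in each axis quarters the work.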
“Imagine a plane that has eyes attached to each side at the front – the wide-angle camera lenses provide a view of 360 degrees.”
Mr Thurrowgood says that the challenge was to figure out the optimal resolution of images that will allow the system to both locate the horizon quickly and not compromise the accuracy of its information.
“The measurement process can certainly be quickened – we only have to adjust the cameras to take images with a smaller resolution,” he says. “However, it won't produce the same quality of data, so the key is to find an optimal resolution where you have both speed and quality.”
In testing at an airfield, the unmanned plane was directed to perform three aerobatic manoeuvres: the barrel roll, the Immelmann turn and a full loop.
“We had two pieces of evidence that it worked out – first, the plane didn't crash and second, the system's identification of the horizon matched with what we measured ourselves.”
Mr Thurrowgood says that the system can potentially be adapted for all types of aircraft – including military, sporting and commercial planes.
“We have created an autopilot that overcomes the errors generated from gyroscopes by imitating a biological system – the honeybees,” says Professor Mandyam Srinivasan.
“Although we don't fully understand how these insects work, we know that they are good at stabilising themselves while making complicated flight manoeuvres by watching the horizon.”
“This project required tremendous effort, as separating the sky from the ground visually is not always as easy as we imagine – it can be difficult to pick out the horizon, so my hat's off to Mr Thurrowgood for achieving this.”
The group will be presenting their paper, “UAV attitude control using the visual horizon”, today at the Eleventh Australasian Conference on Robotics and Automation. Videos of the test flights are also available from the group.
Once the system developers incorporate these changes into the software release, power efficiency in watts per unit distance could be displayed, and, if the battery capacity is entered, an "est. time remaining" could be made available.
I used the AttoPilot 90A/50V Voltage/Current Sensor with Connectors
AttoPilot Voltage/Current Sensor with connectors
The sensor is also available from SparkFun http://www.sparkfun.com/products/9028
SparkFun Power sensor board (Same as AttoPilot board)
The sensor provides a scaled output for battery voltage and current. The outputs are scaled for a 12 bit 3.3V ADC. Full range is 51.8 Volts and 89.4 Amps.
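Those numbers are where the divider ratios in the config below come from: 51.8 V at the battery appears as 3.3 V at the sensor pin, so the voltage ratio is 51.8 / 3.3 ≈ 15.7. A sketch of the arithmetic the `BATTERY_VOLTAGE()` macro performs, assuming the APM's 10-bit ADC with a 5.0 V reference (the `INPUT_VOLTAGE` setting; the function name here is mine):

```c
/* Sketch of the BATTERY_VOLTAGE() conversion from defines.h,
   assuming a 10-bit ADC (0..1023) and a 5.0 V reference. */
#define INPUT_VOLTAGE  5.0f
#define VOLT_DIV_RATIO 15.7f  /* 51.8 V full scale / 3.3 V at the pin */

float battery_voltage_from_adc(int count)
{
    /* ADC count -> volts at the pin -> volts at the battery */
    float pin_volts = count * (INPUT_VOLTAGE / 1024.0f);
    return pin_volts * VOLT_DIV_RATIO;
}
```

For example, an ADC count of 676 corresponds to about 3.3 V at the pin, which scales back up to roughly 51.8 V of battery. The current channel works the same way, just with its own ratio.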
The connector that is wired into the AttoPilot sensor is pinned for a direct connection to the APM shield.
To prep the shield I left the first two resistor positions open (no resistors installed) and shorted the last two positions.
I installed a 3 pin header. The 4th pin from the bottom of the row is ground (the black lead). The next pin up is battery voltage (the red lead). The 3rd pin is the battery current (white lead).
I noticed that during "test -> battery" only one reading was displayed, and it was inaccurate.
I changed the 'sensors' file so that the battery read continues until you hit Enter to exit.
To incorporate the Power Sensor I had to modify these files:
sensors
defines.h
test
config.h
APM_Config.h
ArduPilotMega
Here are the changes I made:
sensors
#if POWER_SENSOR == 1
void read_battery(void)
{
    battery_voltage1 = BATTERY_VOLTAGE(analogRead(BATTERY_PIN1)) * .1 + battery_voltage1 * .9; // reads power sensor voltage pin
    battery_voltage2 = BATTERY_CURRENT(analogRead(BATTERY_PIN2)) * .1 + battery_voltage2 * .9; // reads power sensor current pin
    battery_current = battery_voltage2;
}
#endif

#if BATTERY_EVENT == 1
void read_battery(void)
{
    battery_voltage1 = BATTERY_VOLTAGE(analogRead(BATTERY_PIN1)) * .1 + battery_voltage1 * .9;
    battery_voltage2 = BATTERY_VOLTAGE(analogRead(BATTERY_PIN2)) * .1 + battery_voltage2 * .9;
    battery_voltage3 = BATTERY_VOLTAGE(analogRead(BATTERY_PIN3)) * .1 + battery_voltage3 * .9;
    battery_voltage4 = BATTERY_VOLTAGE(analogRead(BATTERY_PIN4)) * .1 + battery_voltage4 * .9;

#if BATTERY_TYPE == 0
    if (battery_voltage3 < LOW_VOLTAGE)
        low_battery_event();
    battery_voltage = battery_voltage3; // set total battery voltage, for telemetry stream
#endif

#if BATTERY_TYPE == 1
    if (battery_voltage4 < LOW_VOLTAGE)
        low_battery_event();
    battery_voltage = battery_voltage4; // set total battery voltage, for telemetry stream
#endif
}
#endif
defines.h
#define BATTERY_VOLTAGE(x) (x*(INPUT_VOLTAGE/1024.0))*VOLT_DIV_RATIO
#define BATTERY_CURRENT(x) (x*(INPUT_VOLTAGE/1024.0))*CURR_DIV_RATIO
test
static int8_t
test_battery(uint8_t argc, const Menu::arg *argv)
{
    print_hit_enter();

#if POWER_SENSOR == 1
    while (1) {
        for (int i = 0; i < 20; i++) {
            delay(20);
            read_battery();
        }
        Serial.printf_P(PSTR("Volts:"));
        Serial.print(battery_voltage1, 2);   // power sensor voltage pin
        Serial.print(" Amps:");
        Serial.println(battery_voltage2, 2); // power sensor current pin
        if (Serial.available() > 0) {
            return (0);
        }
    }
#else
    Serial.printf_P(PSTR("Power Sensor Not enabled\n"));
#endif
    delay(3000);

#if BATTERY_EVENT == 1
    while (1) {
        for (int i = 0; i < 20; i++) {
            delay(20);
            read_battery();
        }
        Serial.printf_P(PSTR("Volts: 1:"));
        Serial.print(battery_voltage1, 4);
        Serial.print(" 2:");
        Serial.print(battery_voltage2, 4);
        Serial.print(" 3:");
        Serial.print(battery_voltage3, 4);
        Serial.print(" 4:");
        Serial.println(battery_voltage4, 4);
        if (Serial.available() > 0) {
            return (0);
        }
    }
#else
    Serial.printf_P(PSTR("Battery Event Not enabled\n"));
#endif
    delay(3000);
}
config.h
//////////////////////////////////////////////////////////////////////////////
// Battery monitoring
//
#ifndef POWER_SENSOR
# define POWER_SENSOR DISABLED
#endif
#ifndef BATTERY_EVENT
# define BATTERY_EVENT DISABLED
#endif
#ifndef BATTERY_TYPE
# define BATTERY_TYPE 0
#endif
#ifndef LOW_VOLTAGE
# define LOW_VOLTAGE 11.4
#endif
#ifndef VOLT_DIV_RATIO
# define VOLT_DIV_RATIO 3.0
#endif
#ifndef CURR_DIV_RATIO
# define CURR_DIV_RATIO 3.0
#endif
APM_Config.h
// Battery monitoring
#define POWER_SENSOR ENABLED
#define BATTERY_EVENT DISABLED
#define BATTERY_TYPE 0
#define LOW_VOLTAGE 9.6
#define VOLT_DIV_RATIO 15.7  // AttoPilot sensor voltage ratio (minor adjustments to these values
#define CURR_DIV_RATIO 30.35 // AttoPilot sensor current ratio  allow calibration to desired accuracy)
ArduPilotMega
// Sensors
// --------
float airpressure_raw; // Airspeed Sensor - is a float to better handle filtering
int airpressure_offset; // analog air pressure sensor while still
int airpressure; // airspeed as a pressure value
float battery_voltage = LOW_VOLTAGE * 1.05; // Battery Voltage of total battery, initialized above threshold for filter
float battery_voltage1 = LOW_VOLTAGE * 1.05; // Battery Voltage of cell 1, initialized above threshold for filter
float battery_voltage2 = LOW_VOLTAGE * 1.05; // Battery Voltage of cells 1+2, initialized above threshold for filter
float battery_voltage3 = LOW_VOLTAGE * 1.05; // Battery Voltage of cells 1+2+3, initialized above threshold for filter
float battery_voltage4 = LOW_VOLTAGE * 1.05; // Battery Voltage of cells 1+2+3+4, initialized above threshold for filter
float battery_current;
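The `* .1 + ... * .9` pattern in read_battery() above is a first-order low-pass filter (exponential smoothing): each call blends 10 percent of the new ADC reading into the running value, which damps ADC noise but takes a few dozen samples to settle, which is why the voltage variables are initialised at LOW_VOLTAGE * 1.05, safely above the threshold, so the filter can't trip a false low-battery event at startup. A standalone sketch (the function name is mine):

```c
/* First-order low-pass filter as used in read_battery():
   new running value = 0.1 * sample + 0.9 * previous value. */
float lowpass(float previous, float sample)
{
    return sample * 0.1f + previous * 0.9f;
}
```

Starting from 12.6 V and feeding a constant 11.0 V reading, the output moves about 10 percent of the remaining gap per call, so it reaches 11.0 V only asymptotically, after roughly 40 to 50 samples it is within 0.01 V.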
Tip 1. ID the ArduPilotMega files that you are working with.
I put this at the top of the files:
//
// FILE: file-name
//
When you edit files in the Arduino IDE there is no indication of the file name you are working on. The added preamble helps a lot.
Tip 2. Add identifying information to the system file so that the CLI header will reflect the current condition of the system software.
The original CLI header shows:
The 'system' file segment that places the header file on the CLI window is:
(In the Arduino window, use Edit --> Find... "Init" to get to the right place.)
My modified header shows the serial port baud rates that are set, which GPS and GCS protocols have been chosen, and any special conditions that have been set:
The changes made to the system file were:
Tip 3. If you get the dreaded:
I get this message when I try to change the Sketchbook location to a different release version.
The solution is easy!
Just reboot your computer, then start Arduino again.
I know that's a pain, but it will solve the problem.
I hope these tips help.
Earl
This week's Robots Podcast has a great interview with Alan Winfield, co-founder of the Bristol Robotics Lab. Although it's not about UAVs per se, it is about the challenges of swarming robotics. He makes some excellent points:
--There are no real swarming robotic deployments in the world outside of lab simulations.
--One of the problems with designing swarming strategies is that in the absence of natural evolution and selection pressure, we can't measure how "good" they are.
--My favorite line: "There are no robot ethics. Only ethical roboticists." Like him, I've always thought that Asimov's Three Laws of Robotics were silly. Robots aren't people and can't have ethics. Roboticists can.
Recent additions to my GCS include the APM binary stream and the UavDevBoard (MatrixPilot/Serial UDB Extra) data stream, plus playback of UDB text files on the Data File tab. A quad model is also a new addition (many more models coming soon). The screen is now resizable, and depending on the instrument size, either the glass cockpit or the 3D model can be clicked and selected as the "big" instrument. Minimum screen size is now 800x400, which should work well on older laptops and netbooks.
Next on the To-Do list is APM 2-way with MAVlink support, Installer and more models. Still hoping to add AttoPilot support shortly.
If you're having problems installing, make sure to run the DirectX download EVEN IF you have the latest DirectX installed. These include additional drivers MISSING from the standard DX installer.
Minimum requirement still includes .NET 2.0, Google Earth and the Google Earth API.
Download the latest here: http://code.google.com/p/happykillmore-gcs/downloads/list?saved=1&ts=1291154906
Links to minimum requirements here: http://code.google.com/p/happykillmore-gcs/
"These aerial images of White Sands National Monument [top] and Glen Canyon Dam [left] were taken by a Canon SD30 carried on a radio-controlled model airplane [right], using CHDK to operate the shutter."
IEEE Spectrum magazine on the great CHDK software, which allows you to control Canon cameras remotely. The whole piece is long and interesting, so read it all, but here's a bit about what CHDK is:
"The CHDK firmware resides on the camera's memory card, but the original Canon firmware remains on the camera's internal flash memory. So you're not likely to "brick" your camera by using CHDK inappropriately. Indeed, you can return your camera to its stock configuration merely by restarting it without CHDK on its memory card or by switching the locking tab on the card to its unlocked position. (CHDK loads only if the card is locked, and once this firmware is loaded, the camera can still record images.) The CHDK firmware is described fully on the wiki at http://www.chdk.wikia.com, which includes a "CHDK for Dummies" section and plenty of pointers for getting up and running."