All Posts (14049)

3D Robotics


The Congressional Research Service has released a new report on US drone policy. You can read an overview of it here from the Federation of American Scientists, which originally obtained the report.

Here's the abstract:

Under the FAA Modernization and Reform Act of 2012, P.L. 112-95, Congress has tasked the Federal Aviation Administration (FAA) with integrating unmanned aircraft systems (UASs), sometimes referred to as unmanned aerial vehicles (UAVs) or drones, into the national airspace system by September 2015. Although the text of this act places safety as a predominant concern, it fails to establish how the FAA should resolve significant, and up to this point, largely unanswered legal questions. 

For instance, several legal interests are implicated by drone flight over or near private property. Might such a flight constitute a trespass? A nuisance? If conducted by the government, a constitutional taking? In the past, the Latin maxim cujus est solum ejus est usque ad coelum (for whoever owns the soil owns to the heavens) was sufficient to resolve many of these types of questions, but the proliferation of air flight in the 20th century has made this proposition untenable. Instead, modern jurisprudence concerning air travel is significantly more nuanced, and often more confusing. Some courts have relied on the federal definition of “navigable airspace” to determine which flights could constitute a trespass. Others employ a nuisance theory to ask whether an overhead flight causes a substantial impairment of the use and enjoyment of one’s property. Additionally, courts have struggled to determine when an overhead flight constitutes a government taking under the Fifth and Fourteenth Amendments. 

With the ability to house surveillance sensors such as high-powered cameras and thermal-imaging devices, some argue that drone surveillance poses a significant threat to the privacy of American citizens. Because the Fourth Amendment’s prohibition against unreasonable searches and seizures applies only to acts by government officials, surveillance by private actors such as the paparazzi, a commercial enterprise, or one’s neighbor is instead regulated, if at all, by state and federal statutes and judicial decisions. Yet, however strong this interest in privacy may be, there are instances where the public’s First Amendment rights to gather and receive news might outweigh an individual’s interest in being let alone. 

Additionally, there are a host of related legal issues that may arise with this introduction of drones in U.S. skies. These include whether a property owner may protect his property from a trespassing drone; how stalking, harassment, and other criminal laws should be applied to acts committed with the use of drones; and to what extent federal aviation law could preempt future state law. 

Because drone use will occur largely in federal airspace, Congress has the authority to set, or can permit various federal agencies to set, federal policy on drone use in American skies. This may include the appropriate level of individual privacy protection, the balancing of property interests with the economic needs of private entities, and the appropriate safety standards required.

Read more…
3D Robotics

From Hackaday:

This gentleman is using electrical impulses from his neck muscles to fly a toy helicopter around the room. The project is a demonstration of the AsTeRICS project, which seeks to reduce the complexity of adapting the set of skills a disabled person can use to do a wide range of functions. In this case, controlling the helicopter could easily be switched to other tasks without changing the user interface hardware.

One of the plugins for the AsTeRICS project uses the OpenEEG library. This reads the signals coming from a pair of electrodes on top of each shoulder. In the video after the break you can see that as he flexes these muscles the changes in signal are mapped to the altitude of the helicopter. This is just one example of a wide range of inputs that include things like building a webcam-based mouse or using  facial recognition.

The toy itself is being driven by an Arduino sending IR commands. We've seen quite a few projects where the helicopter communications protocols are laid bare.


Read more…
3D Robotics

Great ABC News profile of Jordi

ABC News/Univision has a great profile of Jordi today. The video segment is here (embedding isn't working here) and the text is below. It's quite long, but here's how it starts:

At 20 years old, Jordi Muñoz found himself in Riverside, Calif., an hour east of Los Angeles, with not much to do. He'd settled there to wait for a green card after leaving Mexico to start a family with his wife, a U.S. citizen. He was away from friends and family, and he could not work or continue school in the U.S. either. So, with an internet connection and his programming skills, he began tinkering with a remote control helicopter.

"I was extremely bored," he said. "I could only watch TV or program something, so I decided to program."

In his bedroom he tweaked his toy, splicing motion sensors of a Nintendo Wii controller with the mini copter. He calls those eight months "the most productive months of my life." That's because his pet project put him at the front of a budding industry: the personal drone market.

His eureka moment came when he was able to stabilize the helicopter's flight using computer code. He posted his progress on DIY Drones, an online community started by Chris Anderson, who was editor of Wired magazine at the time. Muñoz also acquired more and better sensors, like a geographical positioning system, or GPS, probably the most important component on a drone, he says.

Anderson took an interest in Muñoz's work. He found Muñoz's accomplishments "impressive" and began collaborating with him virtually.

"He was able to quickly learn about very advanced technology. He did all that by teaching himself on the internet," Anderson said. "He's that generation of people who don't know what they don't know. He didn't know he was supposed to have a PhD to invent a drone. He just did it."

Read the rest of the article here

Read more…

DC Area Drone User Group Flight Training


The DC Area Drone User Group held a flight training for new users at the George Mason University Field House on Sunday, February 3. Instructor Christopher Vo reviewed safety, basic controls, and best practices with a group of 10 flyers. Miraculously, we had the whole facility to ourselves as other people were distracted by some other sporting event that evening. This training was a great opportunity for those who had built new drones during the January build party to learn how to operate them.  

Our group is always looking for new members, so if you are interested in learning more about us, check out our website at dcdrone.org. We have an exciting series of events coming up in the next two months.

We hope to see some of you at our future events if you live in the Washington, DC area or are ever passing through.

Read more…

Integration of air traffic data?

[Image: a small ADS-B receiver that plugs into a laptop]

I wonder if it is practical to integrate air traffic data with all the other data we track in the GCS and the OSD. The data seems to be available. I don't see a complete solution in hand, but some pieces are there. There is a growing standard for position reporting (ADS-B), and you can buy a receiver that will let you track air traffic in the area. This will only track aircraft that are equipped with ADS-B, but the number of equipped aircraft is growing, and coverage is more prevalent in Europe. There is other data available too, from the FAA, covering aircraft tracked by radar and transponders (I think); that feed has a roughly five-minute delay, which limits its usefulness.

Pictured above is a nifty item, reminds me of the 3DR radios!

This is an example of a receiver that can be connected to a laptop and display a map of local traffic. I'm sure it could easily be hacked and the data turned into waypoints.
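As a very rough sketch of what "turning the data into waypoints" could start from: many ADS-B decoders can emit the plain-text BaseStation/SBS CSV stream (whether the microADSB software does is an assumption on my part, so check what it actually exports), and a few lines of C++ are enough to pull the position reports out of it:

    // Read BaseStation/SBS-format lines (e.g. piped in from an ADS-B decoder)
    // and print the airborne position reports; "MSG,3" lines carry lat/lon/altitude.
    #include <iostream>
    #include <sstream>
    #include <string>
    #include <vector>

    int main() {
        std::string line;
        while (std::getline(std::cin, line)) {
            std::vector<std::string> f;
            std::stringstream ss(line);
            std::string field;
            while (std::getline(ss, field, ',')) f.push_back(field);
            // Standard SBS layout: field 4 = ICAO hex ID, 11 = altitude (ft),
            // 14 = latitude, 15 = longitude.
            if (f.size() > 15 && f[0] == "MSG" && f[1] == "3" && !f[14].empty()) {
                std::cout << "aircraft " << f[4] << "  alt " << f[11]
                          << " ft  lat " << f[14] << "  lon " << f[15] << "\n";
            }
        }
        return 0;
    }

From there, each report is just a position and an altitude, so feeding it to the GCS or converting it into waypoints is mostly bookkeeping.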

There are several networks and software packages that let you contribute data as well as receive data even if you do not have a receiver. This can work similarly to how the findu.com network works for hams using APRS. To me, this seems to be the next logical step for hobbyists who are flying at altitudes higher than a few hundred feet.

I wonder if the drone community as a whole has an interest in this and sees it as practical, as something that could increase safety, or is this a "dud" idea?

If you would like to learn more, here are a few links I found:

http://planefinder.net

http://www.microadsb.com

http://www.flightradar24.com

Read more…
3D Robotics

Lego Mindstorms blimp drone

I love Lego Mindstorms (it's how I got my start with drones), but it's pretty heavy. You need a lot of helium to lift all that plastic.  Via Makezine:

This blimp uses two 55″ helium balloons, Dexter Industries NXTBee wireless modules, a servo, and two DC motors. One of the creators, Tyler Westmoreland, shared the RobotC code.

We're hoping to relaunch our own blimp project later this year. I loved our old Blimpduino, but the sensor technology didn't exist back then to really let it autonomously navigate indoors. Now it does ;-)

Read more…

Snowy first flight with Radian Pro

[Image: the RTL portion of the flight track]

First of all, sorry for the unexciting video. I'm waiting on my new GoPro, so I made do with an old keychain cam.

Flight went well! I was very impressed with the RTL performance straight out of the box. The Radian displayed much less of a tendency to wander than the 6' Telemaster I had flown with APM before. I will take a stab at tweaking the gains. This first image is just the RTL flight and the next one is all the flight data. 

Before landing, the plane didn't want to come out of RTL mode, so a lot of frantic switch-toggling happened and eventually it came back. Not sure what the issue was. On the plane, I have the auxiliary antenna set up orthogonal to the main RX on board, so antenna nulls shouldn't have been an issue...

[Image: the full flight data]

Like I said, onboard video isn't great. RTL starts at about 2:17. 

All in all, not a bad morning!

Read more…


I have been asked a few times now how I got my GCS basics up and running. MAVLink is a rather complicated protocol to implement. The documentation is rather thin and partially outdated... not an easy thing if you are not an experienced developer.

I'm no pro developer by a long shot. I speak C reasonably well and I have done some smaller projects in Java. All in all, I can hold my own, but I can't do or understand really fancy stuff. Without the help of Pat Hickey, who answered many of my questions on the mailing list, I wouldn't have gotten very far, so first of all, thanks again to him and the others!

Now, how do we get a MAVLink interpreter going?

The first thing to do is to snatch a copy of the GCS_MAVLink library from the Arducopter project. You will also need AP_Common, AP_Math and FastSerial as dependencies. FastSerial must be the first one included in your sketch.
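In the sketch, the include order would look roughly like this (header names as used by the ArduPilot-era libraries; adjust to your copies):

    // FastSerial has to come before the other APM libraries.
    #include <FastSerial.h>
    #include <AP_Common.h>
    #include <AP_Math.h>
    #include <GCS_MAVLink.h>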

Then go and grab the code of ArduStationII. When you have that, open two windows: one with the code of MAVLink.pde from ArduStation2, the other with the GCS_MAVLink library subdirectory include/mavlink/v1.0/common/.

Then, in MAVLink.pde, look at void gcs_handleMessage(mavlink_message_t* msg). This function is called by gcs_update and does the main decoding work. Basically, it's a huge switch-case block that switches on the message ID of the received packet. Let's have a look at one case block:

  case MAVLINK_MSG_ID_ATTITUDE:
  {
      // decode
      mavlink_attitude_t packet;
      mavlink_msg_attitude_decode(msg, &packet);
      pitch = toDeg(packet.pitch);
      yaw = toDeg(packet.yaw);
      roll = toDeg(packet.roll);
      break;
  }

That doesn't look so complicated, does it? ArduStationII works mainly with global variables for all the interesting parameters. So for everything they want, they write a case block that matches the right message ID, calls the respective decode function, and assigns the values to their global variables.
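For context, the loop that feeds gcs_handleMessage looks roughly like this (a minimal sketch of the usual MAVLink receive pattern, not a copy of ArduStationII's actual gcs_update; Serial stands in for whatever port your telemetry radio is on):

    // Read bytes from the telemetry port, feed them to the MAVLink parser,
    // and hand every complete, checksum-valid message to gcs_handleMessage().
    void gcs_update(void)
    {
        mavlink_message_t msg;
        mavlink_status_t status;

        while (Serial.available() > 0) {
            uint8_t c = Serial.read();
            if (mavlink_parse_char(MAVLINK_COMM_0, c, &msg, &status)) {
                gcs_handleMessage(&msg);
            }
        }
    }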

Let's have a look at something that isn't included in ArduStationII. In the common directory, search for mavlink_msg_vfr_hud.h and open it in an editor.

The very first thing in this file is the definition of what's in a MAVLINK_MSG_VFR_HUD packet:

typedef struct __mavlink_vfr_hud_t
{
    float airspeed;    ///< Current airspeed in m/s
    float groundspeed; ///< Current ground speed in m/s
    float alt;         ///< Current altitude (MSL), in meters
    float climb;       ///< Current climb rate in meters/second
    int16_t heading;   ///< Current heading in degrees, in compass units (0..360, 0=north)
    uint16_t throttle; ///< Current throttle setting in integer percent, 0 to 100
} mavlink_vfr_hud_t;

Let's say we want the heading... If you scroll down a bit, you find the decode function:

static inline void mavlink_msg_vfr_hud_decode(const mavlink_message_t* msg, mavlink_vfr_hud_t* vfr_hud)

Seems like we have everything. Now you just need to define a global variable for the heading...

int heading;

and write the respective case block:

    case MAVLINK_MSG_ID_VFR_HUD:
    {
        mavlink_vfr_hud_t packet;
        mavlink_msg_vfr_hud_decode(msg, &packet);
        heading = packet.heading;
        break;
    }

Now there is only one more thing to do. Normally, the APM sends the VFR_HUD message rather rarely, so you have to tell it to send it more often. In MAVLink.pde, scroll down until you find the start_feeds function. VFR_HUD is a stream in the EXTRA2 category ;). You can find out which message stream is in which category from GCS_MAVLink.pde in the Arducopter project :).
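If you want to request a stream rate yourself, the common message set has REQUEST_DATA_STREAM for exactly this. A minimal sketch, assuming the usual APM system/component IDs of 1/1, a 10 Hz rate, and the same MAVLink convenience send functions ArduStationII already uses:

    // Ask the APM to send the EXTRA2 stream (which carries VFR_HUD) at 10 Hz.
    mavlink_msg_request_data_stream_send(MAVLINK_COMM_0,
                                         1,                       // target system ID
                                         1,                       // target component ID
                                         MAV_DATA_STREAM_EXTRA2,  // which stream
                                         10,                      // rate in Hz
                                         1);                      // 1 = start, 0 = stop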

Have fun coding!

Read more…


I'm sure everyone who is into aerial videography with the wonderful GoPros has struggled with getting jello-free (rolling shutter) video footage from multicopters. I have tried it all: using ND filters to lower the shutter speed of the GoPro (which worked somewhat but ruined image quality); shooting in 720p at 60 fps when I would have preferred full HD; all sorts of foam and vibration-absorbing materials, ear plugs, and rubber foam mounts; and hours balancing my propellers, a laser mirror and a phone seismograph app to iron out any vibrations, all to no avail. Despite the GoPro Hero 3 B.E. shooting full HD at 60 fps, I still get some rolling shutter.

Recently I saw this video by Matt Hall, whose T-copter design produces GoPro footage devoid of any rolling shutter effect. I followed his hardware spec recommendations, using 1400 kV SunnySky motors and SimonK-flashed F30A ESCs. I took his frame design, modified it a little, and used a cradled motor yaw mechanism inspired by this design. Here is the result: below is a conceptual design I created in Blender 3D to test out the yaw mechanism and also to get the dimensions.

[Image: Blender 3D concept render]

Here is the actual craft, built last Friday.

[Images: the completed craft]

And here's the video. I believe the wooden frame and the small, high-speed, manually balanced GWS-style HK propellers (8x4), combined with the high kV rating of the motors (1400), generate vibration frequencies high enough not to be noticeable on the GoPro Hero 3 B.E.'s CMOS sensor. All my previous multicopters' motors were rated at 750 kV and below, spinning 11-inch props. Balancing those, especially the 11x4.7 Gemfans, is a nightmare, so it was very difficult for me to get a perfectly balanced setup, and that resulted in rolling shutter. So the key I discovered is: use high-kV motors rated 1000 kV and above, together with smaller, high-speed props. There is a catch, however: flight times are reduced, as this setup is not as efficient as one with lower-kV motors, but I think that is a small trade-off for the footage I'm able to get. Right now, on a 2200 mAh 4S Nanotech LiPo, I get around 7 minutes of flight time. All-up weight with camera and battery is 1.3 kg, or nearly 3 pounds.

Read more…
3D Robotics

Connecting to APM 2.5 via Bluetooth


If you'd like to connect to APM 2.5 wirelessly but don't need the long-distance reach of the 3DR radios, you can use the Bluetooth connection built in to your laptop, along with a Bluetooth module. 

Here's what you'll need to do this:

Once you get the Bluetooth module, you need to set it to the right baud rate. A guide to all the commands it can use is here, but here's the short form:

  1. If you have a USB Xbee adapter or an FTDI cable, use that to connect the Bluetooth adapter directly to your PC. If you don't have those, you can also do this wirelessly over Bluetooth using the Serial-over-Bluetooth process described below, but I haven't tried that myself.
  2. With any terminal program (TeraTerm, HyperTerm, etc), select the COM port assigned to the cable/adapter the Bluetooth module is plugged into, and set the baud rate to 115k.
  3. Within 60 seconds of powering on the Bluetooth module, enter "$$$". This should produce a "CMD" command prompt.
  4. Type "SU,57"[Return]. This will set the baud rate to 57k. It will not take effect until you power cycle the module.
  5. Now you can unplug it from the USB Xbee adapter or FTDI cable. Plug it into the Xbee adapter board, and with the APM adapter cable, plug it into the APM telem port.
  6. Power APM with the Power Module. (Don't power it via USB, since you can't use wireless and USB at the same time)
  7. On your PC, go into the Windows Control Panel/Hardware and Sound/Devices and Printers. Select "Add a Bluetooth device". It should see the Bluetooth module, which will be called "FireFly". Connect to it. Say OK to any boxes that come up about pairing codes.
  8. Now if you look in your Control Panel/Device Manager/Ports (COM & LPT), you should see two new ports called "Standard Serial over Bluetooth link" with COM numbers after them. You'll be using the first (lower) of the two.
  9. Go into the APM Mission Planner and select that port, with the baud rate of 57k.
  10. You should now be able to connect to APM over MAVLink via Bluetooth!
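Putting steps 2-4 together, the session in the terminal program might look roughly like this (the CMD prompt is what the module prints when it enters command mode; the AOK acknowledgement is what the FireFly firmware typically returns, so treat the exact responses as an assumption):

    $$$           (sent within 60 seconds of power-up, no line ending)
    CMD
    SU,57         (followed by Return; sets the UART to 57600 baud)
    AOK

Then power cycle the module so the new baud rate takes effect.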

 

 

 

Read more…

Just another quick technology preview

We are still making some nice progress with the project. I finally got the Small Protocol running, which is a proprietary but well-documented communications protocol with checksums and such for the display. Result: rock-solid performance and no trash on the screen if the Arduino sends too fast or some bits get lost.

At this point, a HUUUGE thank you goes to Pat Hickey, who helped me out quite a bit on the mailing list. The MAVLink and other libs are really complicated if you are not a pro dev, and Pat and others were nice enough to point me in the right direction quite a few times over the last few days.

Now, back to the question of why I chose the relatively expensive eDIPTFT from Electronic Assembly...

  1. EA is a manufacturer specializing in industrial embedded systems, which means the products are a bit more robust.
  2. The eDIPTFT series is pretty intelligent and takes quite a bit of load off the developer and the microcontroller.

For example: to create this instrument in the video, you quickly (5 minutes) design it in the instrument designer of the (cost-free) IDE and upload it to the display. Then it's only three lines of code: when the screen is called, one line to define the instrument; then, repeatedly, one line to update it, and that is just 3 bytes plus instrument ID plus value; and when you leave the screen, another command to destroy the instrument. All the painting work is done by the display.
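As a rough sketch of what those three lines can look like behind convenience wrappers (the function names below are hypothetical placeholders, not actual EA commands; they stand for the kind of library mentioned next):

    // Hypothetical wrappers around the eDIPTFT Small Protocol commands.
    edip_define_instrument(1, 10, 10);    // once, when the screen opens: instrument ID 1 at x=10, y=10
    edip_update_instrument(1, airspeed);  // repeatedly: instrument ID plus the new value, a few bytes on the wire
    edip_destroy_instrument(1);           // once, when leaving the screen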

One big part of this project is writing a library of convenience functions for the display.

I have been tinkering with EA displays for seven or so years... The functionality makes them really cool for all kinds of projects. You only need one free serial, I²C, or SPI port.

Read more…

Ladybird autopilot


These were the first flights good enough to get on video. It's extremely unstable, but there is hope. The MPU9150 was the only way to get the sensors small enough. It might also be the smallest thing currently flown by a computer, anywhere in the world.

There was some chance that noisy stick voltages were making it unstable, but the oscilloscope showed the noise was down near 20 mV, while the flying deflection was near 500 mV in pitch and 100 mV in roll. The Y position has more error because the camera can't sense depth.

[Oscilloscope captures: roll & pitch stick voltages while flying (three traces), inactive stick noise, and the effect of changing a stick by 0.01]

After a flimsy hot-gluing of the board on top, the MPU9150 gave drastically improved results over having one board with the magnetometer under the battery and a second board on top for the gyro. There's still a bit of error because the board isn't perfectly flat, but it manages to recover. Either the current or the flexing from having it under the battery was a problem.

There was one source file for the MPU9150 at http://permalink.gmane.org/gmane.linux.kernel.iio/4339, revealing the undocumented register REG_YGOFFS_TC. The author never uploaded the header files.

With the full Marcy 3 software & the crystal, the MPU9150 only reads at 80Hz.




A wishful crystal to get the PIC as fast as possible added some more size. If the PIC does nothing else, it can get 140 readings of all 9 sensors per second. Without the crystal, it can get 80 Hz. 50 Hz is probably enough for the application.

Being the low-end PIC that it is, only software I2C was justified. If it were handling the full attitude stabilization, it would need hardware I2C for the gyro/accel part. The trend now is to use STM32F ARMs even for these nano quads.



Took some doing to override the stick voltages.

Automating the Ladybird is a big deal since it requires many, many voltages, leading to a pile of LM317s on the verge of cracked-resistor circuit board incineration. Mercifully, the remote control has reversible directions, eliminating the need to make a new table of stick directions or to go through multiple steps to initialize the stick voltages.
Read more…

Getting ready to fly Radian Pro with APM 2.5


Set up with airspeed sensor and telemetry. A little worried about propwash on the pitot-static probe but we'll have to see! 


The APM frame is 3D printed and uses nylon standoffs to space everything out. The flight battery is under there, but the balancing leads exit on the right side of the fuselage so it can be charged while staying inside. It's also wired up to a switch, so there's no more fussing with battery connectors. The battery was moved aft to counteract the extra weight up front. CG is pretty dead on.

I can post STLs of my frame if anyone is interested. It's stuck onto the fuselage with VHB. Hopefully I will get some good video soon!

Read more…