Drone Con today

Broadcast starts at 10am Mountain time.

http://www.ustream.tv/channel/3d-robotics

9:30 - 10am - Registration / Coffee
10:00 - 10:45 - Pat Hickey - SMACCMPilot - 45 min
10:45 - 11:30 - Doug Weibel - Advanced UAV control system design - 45 min
11:30 - 12:15 - Brandon Jones - L1 Control Algorithm in Arduplane - 45 min
LUNCH
1:15 - 2:00 - Chris Miser - Falcon UAV / Mapping - 45 min
2:00 - 2:30 - Ryan Beall - Embedded GPS INS - 30 min
2:30 - 3:00 - Robert Lefebvre - Arducopter for TradHeli - 30 min
3:00 - 3:45 - Leonard Hall - Control system design of Arducopter V3 - 45 min
3:45 - 4:30 - Jason Short - How we got here...hacking for non-engineers - 45 min
4:30 - 5:00 - Wrap-up / flex time

Read more…

Abstract: This project was executed by SBL for a mining company with operations in Africa. The study area covered approximately 32 sq.km of land in the south-western part of Congo. The project was awarded to SBL based on the excellent quality and timeliness shown during an earlier project for land-use and land-cover mapping. The client requirement was the creation of a high-accuracy DTM edited to 0.2m, a DSM of 0.2m, and orthophotos of 10cm GSD from UAV imagery.

Client: The client is a mining company based in the UK with mining operations in various parts of the African continent.

Geography: ~32 sq.km of land in the south-western part of Congo

Industry: Mining.

Challenges:
• The GCP distribution was uneven, making it difficult to obtain a good-quality stereo model. SBL’s experts therefore suggested suitable areas for collecting additional GCPs, which proved very useful in the end.
• Fixing the terrain height and water bodies was a challenge, as the area is dense forest with large swathes of swampy land.
• At 0.2m resolution, the data was difficult to handle and view.
• Editing a 0.2m DTM was a major challenge.

Approach: SBL used state-of-the-art UAV software to carry out the aerial triangulation of the UAV images. After point cloud densification and DSM creation, undistorted images were generated by the UAV software. The aerial triangulation reports were imported into Inpho Match AT, where stereo models were created. DTMs were then generated and edited using the Inpho DT Master software. SBL’s highly experienced professionals produced a high-accuracy DTM through the correct use of soft and hard breaklines. All man-made structures and vegetated areas were interpolated to terrain height after plotting breaklines fixed to terrain heights. The edited DTM points were exported in XYZ format, edge matched with adjacent DTM tiles, and dispatched to the client.

Benefits: The client benefited from this work through:
• High-accuracy 0.2m DTMs produced with cutting-edge UAV software.
• Very quick turnaround time.
• The combined use of photogrammetry software and UAV software, which helped produce a good DTM output.

Read more…


I am pleased that my friend's panorama was made with the help of my flight controller and frame :)

Stanislav Sedov (pilot) with hexacopter:

Also a photo from 515m with the 5D MkII:

Hardware setup:

FC: acFC5.1 (special edition for airpano.ru)
ESC: HK BlueSeries 30A
motors: axi2820/14
propellers: 13" apc
Battery: 2x5800 Zippy
RC: Futaba T8FG s.bus
Camera: Canon 5D mkII
1-axis (yaw) gimbal

Read more…

FPV groundstation

I have wanted to share my quad transport/storage/FPV groundstation for some time already..

An ordinary tool box, the biggest one from a 3-in-1 special-offer pack, €15.99 at the local hobby store:


When opened, my quadcopter is folded inside with batteries, charger, etc.:


The plan is to also keep the RC transmitter in the same box; there is still room for it. I will have to make a divider for it so it does not get destroyed while being transported in a car.

This is how the copter looks unfolded:


I have used Flamewheel-clone arms, some plywood (I will use carbon fiber eventually, but did not bother for now..) and aluminium square tube for the fuselage. The tube is drilled to save weight. The flight controller is in the mid section, inside the fuselage.


The unfolded arms are not locked in place in the picture, so it looks a bit wobbly, but in fact it is quite stiff.. I used bamboo sushi sticks as landing gear to lift the frame so I can mount the camera on the bottom.

The frame is a bit on the heavy side; I will replace it with something lighter when I am in the mood.. But for now it works, and it does not look like a flying spaghetti monster..

Now to the more interesting stuff: I used the removable compartment which came with the box to store my FPV gear. The compartment is locked to the lid with two screws and nuts and opens with the lid. When open, it is held in place by a 12AWG wire on the right-hand side:


In the upper-left corner there are 3 antenna connectors; hidden behind them are 3 separate video receiver modules (cheap 5.8GHz modules for some 12-14 USD excl. postage), each with its own antenna. There is an RSSI comparator which I made using a "standalone arduino": just an Atmega168 chip mounted on a prototyping board with a 16MHz crystal, some capacitors, and a hidden spaghetti of wiring. Based on the measured RSSI, the comparator selects the video signal from the receiver with the best quality and sends only that output to the 8-inch LCD TV.

The TV was some 75 USD from DealExtreme; I chose it because Google told me that it switches to a blue screen only when there really is little to no signal (in the picture, the video transmitter was off). The signal switching itself is done by a 74HC4052 multiplexer which is controlled by the arduino comparator. The nice thing is that it could be scaled up to 8 separate receivers with very few changes; maybe I will do it one day, but for now there was no room left on the protoboard :-) There is also a composite video-out connector in the lower-left corner, to connect video goggles or a recorder.

The whole thing runs off a 3S LiPo (voltage on the right LCD). The video modules, comparator, and signal selector all work on 5V, so there is a hidden UBEC, and its voltage is displayed on the left LCD. The current consumption is shown on the analog meter in the middle. It is there only because I really liked how it looks; I have no idea how accurate it is.

The little OLED display just shows the measured RSSI values and which receiver is selected. It is not really needed for the thing to work, but it was quite handy for debugging. You can also see which receiver/antenna is in use from the LED below each antenna connector. The 3 LEDs flash together while the video transmitter is off.


Picture with antennas: pictured are the "stock" omnidirectional whip and flat-panel directional linearly polarized antennas.

I will use one skew-planar circularly polarized omnidirectional antenna and two helical directional antennas of 3 to 5 turns, at an angle to each other, so that I cover a wider area without the need for an antenna tracker. If I ever scale it up to use all 8 video receivers, I could combine one omnidirectional antenna with 7 directional antennas of different turn counts to cover 360 degrees around myself. I do not need that for now though..

This is how the wiring looks:


I know it is a mess, but normally you don't see it :-)


I wanted to make a custom PCB to get rid of all the wires.. and maybe I will, eventually. But then none of my friends would believe that I made it myself if it didn't look like a mess of wiring and hot glue :-) Anyway, normally it is covered and not visible, and there is no stress on the wires, so I do not expect it to fail because of a loose wire or connection.

To end this post I just want to say thank you to all the nice people sharing their knowledge all over the internet - very little of this comes from my own head. I found tons of material about video diversity receivers (I just don't know why people usually combine only two receivers, one omni and one directional.. the receivers are so cheap and it costs nothing to scale up a little and cover a bit more ground), many articles about the standalone arduino, about video buffers, etc.. I just love the internet :-)

Cheers, I hope I have inspired someone to try something like this.

Read more…


By Evan Ackerman

Posted 6 Jun 2013 | 14:48 GMT
We hear about lots of robots that could potentially be used for "search and rescue" or "disaster relief," because that's kind of what you say when you've made a robot that doesn't have a commercial or military application but you still need to come up with some task that it might be useful for. It's much rarer that we see these robots actually performing search and rescue or disaster relief tasks, which is why it's especially nice to see this firefighting robot from UCSD doing something that firefighters would find immediately useful.

The UCSD robot is called FFR for "firefighting robot," although FLR for "firelocating robot" might be more technically correct. The robot uses a stereo camera and a thermal camera to generate 3D point clouds with thermal overlays, allowing it to autonomously generate maps showing hot spots and humans even through smoke. The sensor hardware on board doesn't look especially complex, meaning that the 'bots might ultimately become inexpensive (and replaceable) enough to deploy in swarms. So, instead of running around burning buildings looking for people, firefighters can just deploy a bunch of robots first and rapidly build up a thermal map telling them where to go.

Incidentally, that nifty stair climbing system is something we first wrote about back in 2009, and it's great to see that it's been turned into something useful. Now, if they'd just give iFling some water balloons, it really could be a firefighting robot.

Read more…

Fresh from Hobbyking is this video switch, which allows you to switch between a maximum of 3 cameras on your aircraft (or boat or car, for that matter) at the flick of a switch. It features one receiver input, 3 camera inputs, and one video output.

With dimensions of only 30x18x6mm and a weight of just 4 grams, this is a small and light device.

Available from Hobbyking

Read more…

This is a screenshot from Ecosynth's brand-new 500 by 500 meter photo acquisition method, taken at Patapsco State Park!

This is Wolfgang, our newest octocopter. It flew the 500 by 500 meter mission. Wolfgang is our biggest and heaviest copter; it can carry four LiPo batteries plus its payload and can stay aloft for 30 minutes of safe flying time. The camera is the new Ecosynth standard, the Canon PowerShot ELPH 520. The camera is mounted in a card case for protection and ease of attachment to the copter; it points straight down to collect photos for aerial mapping.

Wolfgang uses the frame, legs, motors, and propellers of a Mikrokopter Miko XL. However, the brain is an APM 2.5, and the ESCs are jDrones 30 amp (those Mikrokopter ESC boards are too finicky and fragile). The electronics are covered by a Tupperware container (not the most glamorous, but it is sturdy and gets the job done). It carries four Venom 5000mAh 4-cell LiPos. Control is via a seven-channel Spektrum. A Garmin Astro dog tracker is attached for locating the copter in an emergency.

Wolfgang's planned route in Mission Planner (created by Michael Oborne). The flight path is actually 550 by 550 meters, so that the 500 by 500 meter collection area is surrounded by 25 meters of buffer on all sides. The tracks are 50 meters apart. The copter flies at 120 meters above the ground, so that pictures taken along adjacent tracks overlap by 50 percent at the edges. This overlap is important for a seamless point cloud product. 550 by 550 is an unprecedented collection size for Ecosynth. Laid end to end, the flight is 7.8 km long, more than twice as long as our previous flight standard of 275 by 275 meters.

This screenshot from Google Earth displays the actual path followed by Wolfgang while it was gathering pictures. It managed to follow its route with great precision; I estimate it deviated no more than two meters from its planned track at any given time. In addition, according to the telemetry it very rarely dipped below 119.5 meters or rose above 120.5 meters, so the altitude was very consistent. The reported groundspeed was between 7 and 9 m/s along the tracks, which is well within desirable parameters. The photo collection took 20 minutes over all 12 tracks, and the entire flight took 25 minutes including takeoff and landing.

An example photo from the set. Wolfgang recorded 2250 pictures in the collection area, all of which were as sharp and detailed as this one. The sun was bright and unclouded, so the lighting was consistent throughout the flight. These favorable conditions and outstanding copter performance resulted in a very consistent and detailed point cloud. The pictures were run through Photoscan to produce this point cloud:

(View in HD and use fullscreen to gain the full effect. At least, the fullest effect that can be gained without manipulating the cloud yourself.)

Blog post cross-posted from the Ecosynth blog.

Read more…

APM 2.5 in a Nurf

We had a few people ask how we install the APM in the Nurf, so here's a short video of the install. It could use a bit of tweaking, but it works for us. We do not like to put stuff in the wings, as we do a bunch of testing and it makes it a bit harder to transfer gear. But there is plenty of room in the wings if you want to go there.

 

Read more…

A low-cost drone as a SAR project.

This test flight was the first one with WP route navigation.

Height was set to 100m (328 feet).

The flight was made LOS; max distance was around 950 m.

Test platform: Bixler II

Autopilot: FY31AP (without datalink)

Battery: 4S lipo 4500mAh

Wind: 8-10m/s (very shaky in winds)

 

OSD: EzOSD (just what is needed)

Videolink: ImmersionRC 600mW 5.8GHz with antenna tracker, diversity Rx,

Antennas from Circular Wireless



 

 

Read more…

See video

Brain-controlled aircraft: students control a helicopter in flight using only brain waves

A remote-controlled helicopter has been flown through a series of hoops around a college gymnasium in Minnesota. It sounds like your everyday student project, but there is one caveat: the helicopter was controlled using just the power of thought. It was controlled via a noninvasive technique called electroencephalography (EEG), which recorded the electrical activity of the students’ brains through a cap fitted with sixty-four electrodes.

 

Controlling a virtual helicopter with brainwaves // Source: University of Minnesota via youtube.com

The experiments were performed by researchers hoping to develop robots that can help restore autonomy to paralyzed patients or those suffering from neurodegenerative disorders.

An Institute of Physics (IOP) release reports that the study was published yesterday, 4 June 2013, in IOP Publishing’s Journal of Neural Engineering.

Five students took part in the study, and each was able to successfully control the four-blade helicopter, also known as a quadcopter, quickly and accurately for a sustained amount of time.

Lead author of the study, Professor Bin He from the University of Minnesota College of Science and Engineering, said: “Our study shows that for the first time, humans are able to control the flight of flying robots using just their thoughts, sensed from noninvasive brain waves.”


Facing away from the quadcopter, the students were asked to imagine using their right hand, left hand, and both hands together; this would instruct the quadcopter to turn right, left, lift, and then fall, respectively. The quadcopter flew with a pre-set forward velocity and was controlled through the sky by the subjects’ thoughts.

The students were positioned in front of a screen which relayed images of the quadcopter’s flight through an on-board camera, allowing them to see which direction it was travelling in. Brain signals were recorded by the cap and sent to the quadcopter over WiFi.

“In previous work we showed that humans could control a virtual helicopter using just their thoughts. I initially intended to use a small helicopter for this real-life study; however, the quadcopter is more stable, smooth and has fewer safety concerns,” continued Professor He.

The release notes that after several different training sessions, the students were required to fly the quadcopter through two foam rings suspended from the gymnasium ceiling and were scored on three aspects: the number of times they sent the quadcopter through the rings; the number of times the quadcopter collided with the rings; and the number of times they went outside the experiment boundary.

A number of statistical tests were used to calculate how each subject performed.

A group of students also directed the quadcopter with a keyboard in a control experiment, allowing for a comparison between a standardized method and brain control.

This process is just one example of a brain-computer interface where a direct pathway between the brain and an external device is created to help assist, augment or repair human cognitive or sensory-motor functions; researchers are currently looking at ways to restore hearing, sight and movement using this approach.

“Our next goal is to control robotic arms using noninvasive brain wave signals, with the eventual goal of developing brain–computer interfaces that aid patients with disabilities or neurodegenerative disorders,” continued Professor He.

— Read more in Karl LaFleur et al., “Quadcopter control in three-dimensional space using a noninvasive motor imagery-based brain–computer interface,” Journal of Neural Engineering 10 (4 June 2013) (doi:10.1088/1741-2560/10/4/046003)

Read more…

Hexa back in the air

After a couple of crashes and some time getting things back in order, we have updated our www.airviewmedia.com website with blogs, images, and a follow-up on my crashes and recovery.

Enjoy http://www.airviewmedia.com


My ground control station and set up

  • Ammo box for transporting batteries
  • Spektrum DX8
  • Arducopter hexacopter with APM 2.5
  • Laptop with ground control station
  • Eliminator Portable power station
  • Turnigy accucel charger
  • Turnigy 6000mah, 4S
Read more…

Australian Techpods Now Shipping

Hi,  

I am pleased to confirm that the Techpods have arrived in Australia and have now cleared customs. Shipping commences this week.

Australian Techpod

Please check boltrc.com for great prices on many of the parts required for your new Techpod; however, as this is a new store, our quantities are limited! Any parts you order with your new Techpod will be delivered free with your plane.

Kickstarter backers, please contact us at admin@boltrc.com if you have not already received an email from us.

Read more…