I attached some additional sensors today: a pair of MaxBotix EZ0 ultrasonic ranging modules and a Honeywell HMC6352 I2C compass. I also added an extender to the camera module so it points downward about 30 degrees, and I finally adjusted the focus. I think the camera angle is pretty good now - the wide-angle lens has a field of view of approximately 120 degrees, so the bottom of the frame captures objects almost directly below.

I haven't yet filtered the sensor data. The ultrasonic data is pretty solid - the forward-looking readings bounce around a bit, but the downward readings are fairly consistent. However, I am thinking about switching to a narrower-beam module, probably the EZ1. I would also like to add a couple of side-looking modules to enable some degree of mapping capability.

The compass is less consistent, but I suspect this may be due to magnetic field interference from the motors. When I was recording, I wasn't paying attention to the readings, so I will have to run some tests to see whether the compass readings change depending on whether or not the rotors are firing.

Here's a snapshot of the gondola with the additional sensors -

I'll work on cleaning up the data a bit, and then I'll start writing a script (there's an onboard C interpreter) to let the blimp wander around on its own, perhaps following a course based on heading. After that, I will add some logic to use the camera to lock onto an object and follow it around while maintaining altitude and avoiding collisions - that will be a bit more challenging.

Here's a short video clip of the latest test -
This blimp would not work outside - the motors do not have enough power. For outdoor blimps, you might contact MiniZepp in Switzerland - http://www.minizepp.com/. They have some good designs and I think their prices are reasonable. You would be able to use their blimps for aerial photography.
Where can one buy this device, and how much does it cost? Is it possible to fit it with an autopilot, and could it be converted for outdoor use, for taking aerial photographs?
I've seen a few entry-level vision systems based on ARM7, e.g. CMUcam3 and Georgia Tech's Fluke (you probably saw that one in Atlanta), but this is the first I've heard of anyone else using Blackfin, though the Blackfin is a much better processor for vision applications. If you want to play around with host-based vision software, take a look at RoboRealm - it's free and very capable.
When I was at the FIRST Competition in Atlanta I met with a company that's also developing a Blackfin-based vision system. They were showing it integrated into Mindstorms NXT, with a small camera/Blackfin/Bluetooth board and PC-based image recognition software. Basically you could mount their board on your NXT bot and it would do some image processing onboard and some object recognition offboard, then turn that into structured data ("object found at x,y") that was sent back to the NXT bot as the output of a regular NXT-G sensor block in the software. Pretty cool. Here's the software that it's based on.
Glad to hear that this approach is working for you. The other issue with my setup is the servo which vectors the rotor thrust is less than 3" away from my original compass placement, and the servo is continuously powered, so there's no way around relocating the compass.
My blimp has been gathering dust for the past few weeks while I have been working on robot firmware. The next step (for a while now) has been to switch over from teleoperation to autonomous flight, and I have held off from writing some onboard C code in favor of either hooking up the onboard neural net code or employing the Lisp interpreter I've been integrating into the firmware. However, it would be easy to script some autonomy in C, and while there would be nothing adaptive in the robot blimp's behavior, this would be a useful exercise of the sensors and motion control strategies.
The compass is on the main board with the CPU, motor drivers etc. The motors are on a rod that takes them about six inches out on each side from the mainboard. Have not plotted the results, but the bottom line is that the blimp flies where it's supposed to.
I thought about that approach, but had some concern about magnetic field distortion caused by the permanent magnets in close proximity to the compass. However, the fixed magnet distortion may be small in magnitude compared to the field generated by the charged motor coils.
Have you made a plot of the compass heading in all directions with the motors turned off to see if there's any remaining distortion?
We had a similar problem of motor interference with the compass with one of our blimps (the "maximum" one we're doing with HiTechnic). The solution was what we called the "WWI machine gun" fix. You know how the WWI biplanes had hardware that synchronized the machine gun with the prop so it would only shoot when the prop wasn't in the way? Well, we do the same thing with the compass and motor. We run the motor in pulses, and only sample the magnetometer in between motor pulses. Just 10ms is enough of a pause to let the magnetometer settle for a good reading.