I know that some commercial UAVs have a feature that lets the pilot select a vehicle, and the UAV then tracks/follows it. On the ArduPilot Mega, would I need any hardware beyond a camera to program such a feature?
My partner and I are trying to do the same thing. We were going to do it a different way (see our forum thread on object tracking). Our plan was to have a live video downlink sent to the ground station, where the image processing would occur.
The image processing would:
figure out the target's GPS location,
work out a set of pan and tilt commands for the camera, and
estimate the target's speed.
When it was done, it would send a signal to the ArduMega over the XBee link, giving the UAV a new waypoint (the target's location, but at the UAV's current altitude) along with the appropriate pan and tilt commands for the camera.
There's a lot of delay because of the wireless transmission and the off-board calculation, but if you think about it, when you're below 400 ft a moving car isn't going to leave your field of view very fast, so the tracking adjustments won't need to be that sudden anyway. It just seemed like an easy approach.
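As a sketch of the ground-station side of that plan: the function names and the flat-earth geometry below are my own illustrative assumptions, not code from the project.

```python
import math

M_PER_DEG_LAT = 111_320.0  # metres per degree of latitude (approximate)

def follow_waypoint(target_lat, target_lon, uav_alt_m):
    # New waypoint: the target's position, but at the UAV's current altitude.
    return {"lat": target_lat, "lon": target_lon, "alt": uav_alt_m}

def pan_tilt_angles(uav_lat, uav_lon, uav_alt_m, target_lat, target_lon):
    # Rough pointing angles from the UAV toward a ground target, using a
    # flat-earth approximation (fine at the short ranges involved here).
    d_north = (target_lat - uav_lat) * M_PER_DEG_LAT
    d_east = (target_lon - uav_lon) * M_PER_DEG_LAT * math.cos(math.radians(uav_lat))
    pan = math.degrees(math.atan2(d_east, d_north))            # bearing from north
    ground_range = math.hypot(d_north, d_east)
    tilt = -math.degrees(math.atan2(uav_alt_m, ground_range))  # negative = look down
    return pan, tilt
```

In the plan above, the ground station would run this after each processed frame and push the waypoint and servo angles out over the XBee link.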
I believe Martin Seven gave an obvious and simple solution to the problem: place a Bluetooth-enabled GPS in the first vehicle and feed that GPS signal to the following vehicle. All that would then be needed is a receiver for the GPS signal in the follower, plus some small adjustments to the navigation routines and a distance-hold algorithm.
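A minimal sketch of such a distance-hold step: given the leader's received GPS fix, aim for a point a fixed number of metres short of the leader. The flat-earth conversion and the hold radius are illustrative assumptions.

```python
import math

M_PER_DEG_LAT = 111_320.0  # metres per degree of latitude (approximate)

def standoff_point(follower, leader, hold_m):
    # follower, leader: (lat, lon) tuples in degrees.
    # Return a waypoint `hold_m` metres short of the leader along the
    # follower->leader line, so the follower keeps a fixed trailing distance.
    cos_lat = math.cos(math.radians(follower[0]))
    d_north = (leader[0] - follower[0]) * M_PER_DEG_LAT
    d_east = (leader[1] - follower[1]) * M_PER_DEG_LAT * cos_lat
    dist = math.hypot(d_north, d_east)
    if dist <= hold_m:
        return follower  # already inside the hold radius: stay put
    scale = (dist - hold_m) / dist
    new_lat = follower[0] + (d_north * scale) / M_PER_DEG_LAT
    new_lon = follower[1] + (d_east * scale) / (M_PER_DEG_LAT * cos_lat)
    return (new_lat, new_lon)
```

Each new fix from the lead vehicle just replaces the follower's current waypoint with this standoff point.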
CMUcam: Open Source Programmable Embedded Color Vision Sensors (http://www.cmucam.org/)
It has an onboard processor capable of image-recognition features (with a camera resolution of 352x288 at 26 frames per second) and communicates via a serial port (of which the ArduPilot Mega has four). All the software is open source too.
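On the autopilot side, handling the camera amounts to parsing its ASCII tracking packets from the serial port. The `M mx my ...` field layout and the gain value below are assumptions based on CMUcam-style output; check the manual for the exact format your firmware emits.

```python
def parse_m_packet(line):
    # Parse an ASCII tracking packet assumed to look like:
    #   "M mx my x1 y1 x2 y2 pixels confidence"
    # Returns the blob centroid (mx, my), or None if it isn't an M packet.
    parts = line.split()
    if len(parts) < 3 or parts[0] != "M":
        return None
    return int(parts[1]), int(parts[2])

def pan_tilt_correction(mx, my, width=352, height=288, gain=0.1):
    # Proportional correction (scaled by `gain`) that drives the tracked
    # blob toward the centre of the 352x288 frame.
    return (mx - width / 2) * gain, (my - height / 2) * gain
```

This kind of per-packet integer parsing is light enough to run on the Mega itself, which is the point of offloading the vision work to the camera.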
The Gumstix Overo hardware is a good starting point. The Pixhawk project uses two of them for a high-end quadrocopter.
The Gumstix integrates an ARM Cortex-A8 with the SIMD NEON extension and a DSP (plus a GPU for GPGPU if that isn't enough).
This gives you a lot of power for imaging, and the power consumption is very low (< 2 W).
I have used several imaging libraries on this platform (OpenCV, siftfast, pan-o-matic), but unfortunately all these libraries are optimized for x86 SSE, and you have to write assembly (NEON or DSP) to get decent performance (see Pixhawk).
The price of admission to master the build environment (OpenEmbedded, Linux, ...) is high, and it's definitely not a one-man project, but the result is stunning.
The Atom platform is very interesting and will get very good performance, as all the libraries are already optimized for SSE/SSE3.
The power budget is still high (expect 15-20 W), but that might be acceptable.
The Blackfin is also an interesting platform (see Surveyor, for example), but I have no experience with it.
Um, the Mega has nowhere near enough processing power or data bandwidth for image recognition. It's off by several orders of magnitude.
You'd need something like an ARM board running Linux and an image-processing program, perhaps written using OpenCV.
Or - and this could be done even with a Mega - put a GPS beacon on the vehicle you want to follow and follow that: a GPS module and an XBee, or something like that.
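To show the shape of the computation such a Linux board would run, here is a toy, pure-Python stand-in for one step of an OpenCV-style colour tracker (threshold, then centroid); a real implementation would use OpenCV's optimized routines on camera frames instead.

```python
def track_bright_blob(frame, threshold=200):
    # `frame` is a 2-D list of pixel intensities (0-255), standing in for
    # one channel of a camera image. Return the centroid (x, y) of all
    # above-threshold pixels, or None if nothing exceeds the threshold.
    xs, ys = [], []
    for y, row in enumerate(frame):
        for x, value in enumerate(row):
            if value >= threshold:
                xs.append(x)
                ys.append(y)
    if not xs:
        return None
    return sum(xs) / len(xs), sum(ys) / len(ys)
```

The centroid's offset from the image centre is what you would turn into pan/tilt or steering corrections, exactly as in the ground-station plan discussed above.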