Enjoy!
This is a civilian UAV developed by Integrated Dynamic Pakistan. http://www.idaerospace.com
The ROVER is a pioneering effort to create an affordable civilian scientific and ENG (Electronic News Gathering) UAV system. With an operational range in excess of 5 km, it is designed for the researcher, academic professional or news crew needing information quickly and reliably. Able to operate at altitudes of over 2000 feet (600 m) with a noiseless electric propulsion module, the ROVER is a robust UAV system. Its compact autopilot system takes the hassle out of programming and calibration. Supplied with a PTZ camera system and telemetry downlink, the ROVER can stay in the air for over 1 hour.
Weighing less than 5 kg, the ROVER is hand-launched and recovered by a deep stall landing making it available and responsive on short notice. The simplicity and ease of operation greatly reduce operator training and improve aircraft survivability.
A complete system consists of four ROVER UAVs; a laptop PC ground control station with programming and moving-map mission display software; a DSS (Digital Spread Spectrum) telecommand link; and antennas, cables and operational spares.
A new lake and the story:
After having surveyed my good old diving lake to nearly its full extent without finding the 6000 m abyss, a new one was due on the list. A friend of mine is a Red Cross volunteer who works in his free time as a rescue swimmer on a nearby lake (no family relationship to David Hasselhoff, I swear).
So, on the evening of one of the less rainy days in August, we went to that lake to try out how the workflow on a new lake would go.
Mission planning with Google Earth was not directly possible, because GE's imagery of the lake dated back to 1999, and since then the lake has received some "modifications" in the form of pontoons and a pier. To avoid unwanted encounters with those objects, we first had to do a pre-survey. For that, we took two kayaks and fixed the catamaran to one of them with a string. In that configuration we made one tour around the lake with a little standoff from the shoreline to get its outline. Then we circled the swimming pontoons, the pier and the small island in the middle of the lake.
Early the next morning, we went back to the lake with the mission prepared on the ArduPilot.
To keep the adrenaline level low, we decided to follow the boat with the kayaks. A really relaxed survey!
It is nice to see how the boat behaves when you can follow it on the water. A totally new perspective! All went fine: no crashes, and the ship found its way back to shore after a 30-minute trip. The straight-line behaviour could have been better, but the measured depth values were more than OK.
The total time spent for this survey mission was less than 5 hours.
And this is the result produced by DrDepth:
The KMZ file can be found here:
Next on this blog:
Ardupilot goes into water, episode one -reloaded-
The return to the mountain-lakes.
I recently started building my own UAV.
At the moment I'm building the wings (3 meters wingspan).
General description (pictures soon):
Wingspan: 3 meters, approx. 80 dm² of wing surface
Engine: 40 cc, pusher, 3-blade propeller
Tricycle landing gear
Conventional tail
Intended for 1 hr of flight and 2 kg of payload
I'm also soldering the ArduPilot (I think I'm going to need the MEGA version).
Yesterday I received my ArduPilot with the ArduIMU and the GPS.
Known as the Terrestrial Artificial Lunar and Reduced Gravity Simulator, or Talaris, the three-foot-wide vehicle is a smaller version of a hopper that would be used in space. It is designed to go about 20 meters per hop; space-based hoppers might cover tens of kilometers--or possibly more--in a single bound. The team that built Talaris wants to use it on Earth to test guidance, navigation, and control software developed by Draper that would allow the space-based hopper to navigate autonomously.
The prototype was developed as part of MIT's effort to win the Google Lunar X Prize, a $30 million competition to get a privately funded spacecraft to reach the moon, travel 500 meters across its surface, and transmit video, images, and other data back to Earth. Both MIT and Draper are members of Next Giant Leap, one of about 20 teams registered in the competition.
Pretty big article, although nothing you don't already know:
"Drones get ready to fly, unseen, into everyday life"
Here's the bit that mentions us, giving away a future T3 Competition challenge!
"The ability to share software and hardware designs on the Internet has sped drone development, said Christopher Anderson, founder of the website DIY Drones, a clearinghouse for the nearly 12,000 drone hobbyists around the world.
A coming DIY Drone competition will challenge members to walk a mile with a drone following from above. The goal is to make a drone that can stabilize itself and track its target. Given the rapid evolution of technology, Mr. Anderson said, "that's now a technically trivial task.""
Today, my TrIMUpter v1.0 successfully made its first outdoor flight. This is a VTOL tricopter fully stabilized by the flat ArduIMU+ V2 board (with my firmware TriStab v1.0). No external gyros here, only the ArduIMU on board... The TrIMUpter is able to fly both outdoors and indoors.
More photos at: http://diydrones.com/photo/albums/trimupter-a-vtol-tricopter
Stay tuned on: http://diydrones.com/profile/JeanLouisNaudin
Company:
Procerus Technologies is the developer of the Kestrel autopilot system that is known in the UAV world for its small size, light weight, and feature rich capabilities. More information about our products can be found online at http://www.procerus.com.
Job Summary:
Procerus is looking for a highly motivated and experienced individual to fill the role of Flight Engineer. This is a senior position responsible for conducting safe flight operations using the Kestrel autopilot system on a variety of airframes. A strong background in radio-controlled airplanes and helicopters as well as excellent piloting abilities are required. This is a customer-facing role that requires good communication skills and a professional, detail-oriented demeanor. This position will involve a fair amount of travel, including international travel.
Essential Job Duties:
• Thoroughly understand the Kestrel Autopilot System
• Assist customers with troubleshooting issues related to autopilot integration and flight
• Conduct safe flight operations in restricted airspace
• Act as a safety pilot for high-value airframes
• Integrate the Kestrel autopilot and subsystems into customer airframes
• Tune gains for smooth, stable flight
• Integrate and troubleshoot payloads
• Troubleshoot and solve communications issues (video, comms, etc.)
• Troubleshoot and solve GPS interference issues
• Field customer emails and support calls
• Represent Procerus at customer locations
• Supervise airplane building techniques and subsystems
• Troubleshoot avionics subsystems (video, servos, etc.)
• Knowledge of airplane and helicopter control laws
• Provide feedback to software engineers to improve Procerus products
Desired Qualifications:
• BS Degree in a technical field
• Experienced RC airplane and helicopter pilot
• US Citizen
If you are interested, please email info@procerus.com (address the email to Damon).
I hope this doesn't come off as a marketing pitch; my intent here is to make these opportunities available to you, not to be an annoying solicitor. AUVSI is a not-for-profit association whose mission is to advance the unmanned systems/robotics industry.
Abstracts are currently being collected for AUVSI's Unmanned Systems North America 2011 which will be held August 16-19, 2011 in Washington, DC. We expect over 6,000 people and 400 exhibitors; it's "the place to be" for unmanned systems and robotics! The submission period closes on November 10, so submit your abstract today. Details can be found here: http://www.auvsi.org/2011CFP
Also, AUVSI is holding its annual "Program Review" on February 1-3, 2011 at the Omni Shoreham in Washington, DC. Providing the latest information on government and industry programs for air, ground and maritime systems, this annual event is one of the most important to the unmanned systems community. Details can be found here: http://www.auvsi.org/uspr11.
Thanks for reading!
I'm learning to love things open source! To see what the fuss was about, I obtained an Arduino Duemilanove board from Sparkfun and decided to play around with it. It didn't take very long for me to assemble a (very) simple optical flow sensor using this board and one of my 16x16 Tam vision chips.
The circuit is very simple: the only electronic components are the Arduino board, a 16x16 Tam vision chip I developed at Centeye, and a single bypass capacitor. The vision chip and the bypass capacitor reside on a one-inch (25.4 mm) square breakout board. This particular Tam chip is pretty simple to operate: aside from the power signals, it requires two digital inputs, clock and reset counter, and generates one analog pixel output. A counter on the chip determines which pixel (by row/column) is presented at the analog output line. Every time the clock is pulsed, the counter increments and the next pixel is selected. Pixels are read out one row at a time. The pixel circuits themselves operate in continuous time and are always generating a voltage in response to light; the counter merely determines which pixel's voltage is sent to a voltage buffer before being sent off-chip.
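To make the readout protocol concrete, here is a minimal sketch that simulates it in Python. The chip model, class and method names are my own invention for illustration; on the real hardware the Arduino would pulse the clock and reset pins digitally and sample the chip's analog output with its ADC.

```python
# Hypothetical simulation of a Tam-style counter-addressed image sensor.
# Reset zeroes the on-chip counter; each clock pulse advances it to the
# next pixel, row by row; the "analog output" is the selected pixel value.

class SimulatedTamChip:
    def __init__(self, pixels):
        self.pixels = pixels      # 16x16 grid of pixel "voltages"
        self.counter = 0

    def reset(self):
        self.counter = 0          # pulse the reset-counter input

    def clock(self):
        self.counter = (self.counter + 1) % 256   # pulse the clock input

    def read_analog(self):
        row, col = divmod(self.counter, 16)       # row-major addressing
        return self.pixels[row][col]

def read_frame(chip):
    """Reset the counter, then clock through all 256 pixels one row at a time."""
    chip.reset()
    frame = []
    for _ in range(16):
        row = []
        for _ in range(16):
            row.append(chip.read_analog())
            chip.clock()          # advance to the next pixel
        frame.append(row)
    return frame
```

The same loop structure maps directly onto an Arduino sketch, with the clock and reset calls replaced by digital pin writes and the pixel read by an ADC conversion.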
A simple Arduino program reads out the pixel signals, digitizes them with the Arduino/Atmel's ADC, and computes a simple one-dimensional optical flow measurement. For the optical flow algorithm, I chose a variation of the Hassenstein-Reichardt algorithm, a venerable algorithm from the 1950s that was one of the first proposed neural models for visual motion sensing. The Arduino program then dumps a simple running graph of the optical flow to the serial terminal.
The optical flow algorithm is very simple. Let pA and pB be the signals output by pixels A and B respectively. Let lp( ) be a simple low-pass filter function, which can be implemented as a running average. The optical flow estimated from pixels A and B is merely lp(pA*lp(pB)-pB*lp(pA)), with the outer low pass filter having a longer time constant than the inner low pass filters. If we have an array of pixels A, B, C, D, and so on, then we compute this algorithm once for pA and pB, then again for pB and pC, and again for pC and pD, and so on, and average the results. This certainly isn't the best algorithm one could use, but it was very simple to throw together and I was actually curious to see how it would work.
For this implementation, I'm only reading in the middle 8x8 block of pixels and row-averaging them to form an eight-pixel line image. Thus the optical flow output you see is really from eight pixels worth of image data, or seven individual optical flow measurements averaged together as described in the last paragraph.
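The algorithm in the last two paragraphs can be sketched in a few lines. This is a reconstruction, not the actual Arduino code: the exponential-running-average filters, their constants, and the class layout are my own choices; only the pairwise correlation lp(pA*lp(pB) - pB*lp(pA)) and the 8x8 row-averaging come from the description above.

```python
# Sketch of a 1-D Hassenstein-Reichardt correlator over an array of pixels.
# Low-pass filters are exponential running averages; the alphas are made up.

ALPHA_INNER = 0.3   # faster inner low-pass filters (shorter time constant)
ALPHA_OUTER = 0.05  # slower outer low-pass filter (longer time constant)

class HassensteinReichardt1D:
    def __init__(self, n_pixels):
        self.inner = [0.0] * n_pixels        # lp(p) for each pixel
        self.outer = [0.0] * (n_pixels - 1)  # lp(correlation) per adjacent pair

    def update(self, p):
        """Feed one line image; return the averaged optical flow estimate."""
        # Update each pixel's inner low-pass filter.
        for i, x in enumerate(p):
            self.inner[i] += ALPHA_INNER * (x - self.inner[i])
        # Correlate each adjacent pair, then low-pass the result.
        for i in range(len(p) - 1):
            corr = p[i] * self.inner[i + 1] - p[i + 1] * self.inner[i]
            self.outer[i] += ALPHA_OUTER * (corr - self.outer[i])
        # Average the pairwise measurements into a single flow value.
        return sum(self.outer) / len(self.outer)

def row_average_8(frame):
    """Collapse the middle 8x8 block of a 16x16 frame into an 8-pixel line image."""
    return [sum(frame[r][c] for r in range(4, 12)) / 8.0 for c in range(4, 12)]
```

The sign of the averaged output distinguishes leftward from rightward motion, which is all this sensor is trying to report.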
The first video above shows the response when the 16x16 Tam chip is exposed to light and a moving card casts a moving shadow across the chip. The second video shows the response when a lens is placed over the chip, so that the image of my moving hand is tracked. The pictures below show the two complete sensor setups, with and without lens, and a close-up of the Tam chip on its breakout board.
The purpose of this experiment was to see how easy it would be to throw together a simple optical flow sensor using an Arduino board and a simple image sensor chip. The results are certainly crude, but the concept works. I think with some more work a decent Arduino-based sensor can be made, and it could be as easy to hack as pretty much any other Arduino project. (Arduino rocks!)
For those who are curious, I have another post on another forum that shows simple ASCII images taken from the image sensor and discusses the operation of the chip in greater detail.
(Note: The "Tam" chip here is similar to but not the same as the "Tamalpais" chip used in the recent post on the 125mg sensor. Both are 16x16, but the Tam has larger pixels and is simpler to operate while the Tamalpais is smaller and has better on-board amplification. There is a story behind both names...)
Source: http://asmaraaerospace.yolasite.com/uav-airframe.php
RJX1
Airframe Specifications
Propulsion Specifications
Payload Allowance
Aircraft Performance
Stall Speed: 40 kts (approx.)
Cruise Speed: 45 kts (approx.)
Max Speed: 80 kts
Crosswind Takeoff/Landing: maximum 30 kts crosswind
Takeoff Distance: 50 m (3 kg payload) with flaps set to 0°
Landing Distance: 30 m (3 kg payload) with 30° flap
------------------------------------------------------------
PT1
PT1 Specifications
Wing Span: 280 cm
Length: 200 cm
Empty Weight: 5 kg
MTOW: 9 kg
Landing Gear Configuration: Tricycle
Actuators: Servos (8 pieces)
Propulsion: 45 cc modified piston petrol engine
Propeller: 20 x 10"
Fuel Tank: 1500 cc, complete with rubber tubing
Takeoff Distance: 60-80 m
Landing Distance: 50-70 m
Stall Speed: 40 kts
Cruise Speed: 50 kts
Maximum Speed: 100 kts
Minimum Banking Radius: 25 m
Endurance: 1.5 hours (approx.)
VTOL Tricopter v2.0 Flight demonstration from Jean-Louis Naudin on Vimeo.
More info at: http://diydrones.com/profile/JeanLouisNaudin
ArduCopter team member Jose Julio presenting the project at the InterQue conference in Spain over the weekend.
Another shot here:
(both from the conference's Flickr stream)
Hello friends,
I have now taken some photos of the Mega housing glued together.
Coming soon in the store.
As an exercise in size reduction, we have prototyped a complete optical flow sensor in a 125 milligram and 7mm x 7mm package. This mass includes optics, image sensing, and all processing. Below is a video and two close-up photographs. In the video, note the green vector indicating measured optical flow as a result of image motion.
Image sensor: Centeye Tamalpais 16x16 pixel image sensor (only an 8x8 block is being used), 1.3mm x 4.1mm, focal plane about 0.3mm x 0.3mm.
Optics: Proprietary printed pinhole, about 25 microns wide
Processor: Atmel ATtiny84
Optical flow algorithm: Modified "Image Interpolation" algorithm, originally developed by Prof. Mandyam Srinivasan (well known for his research on honey bee vision and navigation).
Frame rate: About 20Hz.
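For the curious, here is a rough 1-D sketch of the image interpolation idea, reconstructed from Srinivasan's published description. It is an illustration only, not Centeye's actual ATtiny implementation; the function names, the zero-padded shifts and the reference-shift size delta are all my own choices.

```python
# 1-D image interpolation: model the new frame as the old frame plus a
# fraction of the difference between reference shifts of +/- delta pixels,
# then solve for the displacement s in the least-squares sense.

def shift(img, k):
    """Shift a 1-D image by k pixels (k > 0 moves content right), zero-padded."""
    n = len(img)
    out = [0.0] * n
    for i in range(n):
        j = i - k
        if 0 <= j < n:
            out[i] = img[j]
    return out

def image_interpolation_displacement(f0, f1, delta=1):
    """
    Estimate the displacement s of f1 relative to f0 using the model
        f1 ~ f0 + (s / (2*delta)) * (f_right - f_left)
    where f_right, f_left are f0 shifted by +/-delta pixels.
    """
    f_right = shift(f0, delta)
    f_left = shift(f0, -delta)
    num = 0.0
    den = 0.0
    # Skip the borders, where zero padding corrupts the shifted references.
    for i in range(delta, len(f0) - delta):
        d = f_right[i] - f_left[i]
        num += (f1[i] - f0[i]) * d
        den += d * d
    return 2.0 * delta * num / den if den else 0.0
```

Because the model is linear in s, the whole estimate reduces to two multiply-accumulate sums and one division, which is why this family of algorithms fits on a tiny 8-bit microcontroller.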
This work is being performed as part of Centeye's participation in the Harvard University Robobees project, an NSF-funded project to build a robotic bee. The final target mass for the complete vision system (including processing) will be on the order of 10 mg to 25 mg, and it will include omnidirectional sensing as well as algorithms to detect flowers. Obviously we still have some more work to do!
We documented the construction of this sensor, with lots of photographs, in case anyone is interested.
Sorry, guys, for spamming the site with all these UAV frames and platforms.
But somebody's got to do it.
My newest find is a nice UAV platform made in Asia, but not Chinese-made.
Specs.
Wing span: 2200 mm
Fuselage length: 1525 mm
Flying weight: up to 10 kg
Wing area: 80 dm²
Wing material: composite
Fuselage material: fibreglass
Engine: 30 cc
__________________________________________________________
I plan to build this UAV with the FY3-zt autopilot; the autopilot is ready to install, with a 433 MHz data link for some long-range flying. (Electric motor setup - I don't like all the noise pollution in the air.) ;-)
If anyone should be interested in the platform please send me a message. The platform will not be posted on my website before it's tested.
Price for this beauty is just under 2K with the 30cc gas engine, and 1.8K without. (Yes, it's a little pricey, but when you want things that smell right, they're gonna cost right.)
BTW: Is everyone flying quadcopters now, and is fixed-wing getting old-fashioned? ;-)
Today we hit 12,000 members, and growth continues to accelerate. Despite a pretty rigorous admissions process (applicants need to answer questions about themselves, do a quick quiz to prove that they're human, and then they need to be approved by a moderator), we're now adding 1,000 members every seven weeks (a month ago that was every eight weeks).
As you can see from the traffic stats above (click for a larger picture), we're doing nearly 30,000 page views every day. Over the past year, we got 2.5 million visits and 8 million page views, and more than doubled our traffic.
At this point we may be the largest robotics community in the world. The other community normally described (well, describes itself) as the World's Largest Community of Robot Builders is Let's Make Robots, and it looks like we're now bigger than they are: