Today at the Intel Developer Forum, CEO Brian Krzanich announced both the company's Aero drone development board and a full ready-to-fly drone based on Aero, built around the company's RealSense sense-and-avoid solution, which is already used on the Yuneec Typhoon H drone. Both use the Dronecode PX4 flight stack.
Both will be available in Q4 2016. The Aero board is $399 and the price for the whole drone has not been set. More details are here.
IDF San Francisco 2016 – Drones: Intel Reveals UAV Developments and Availability of New Technologies at IDF

Aug. 17, 2016 – Intel Corporation today announced its involvement in the development of multiple best-in-class unmanned aerial vehicles (UAVs), commonly called drones, showcasing how they interact with their environment, solve problems and thrill users by helping them explore and interact with their world like never before.
Intel® Aero Platform for UAVs

Intel's Aero Platform is available today for developers to build their own drones. This purpose-built UAV developer kit, powered by an Intel® Atom™ quad-core processor, combines compute, storage, communications and flexible I/O all in a form factor the size of a standard playing card. When matched with the optional Vision Accessory Kit, developers will have tremendous opportunities to launch sophisticated drone applications into the sky. Aero supports several “plug and play” options, including a flight controller with Dronecode PX4 software, Intel® RealSense™ technology for vision, the AirMap SDK for airspace services, and will support LTE for communications. The Intel Aero Platform is available for preorder now on click.intel.com – the Intel Aero compute board is $399, the Intel Aero Vision Accessory Kit is $149, and the Intel Aero Enclosure Kit is $69.
A separate Intel Aero Platform Ready-to-Fly Drone will be available in Q4.

Yuneec Typhoon H* with Intel RealSense Technology

Now publicly available, the Yuneec Typhoon H is the most advanced, compact aerial photography and videography platform available, featuring Intel RealSense technology. With an intelligent obstacle navigation system, the drone can see objects and self-navigate around them. The drone has an Intel RealSense camera and an Intel Atom processor, while the ground station is also equipped with an Intel Atom processor. The Typhoon H with Intel RealSense technology is available for purchase for $1,899.

AscTec Falcon 8*

The AscTec Falcon 8 drone went into serial production in 2009 and has since been used globally for professional applications, most recently as an aerial inspection and surveying tool for Airbus*. The patented V-form octocopter is designed for precision and safety with the reliable AscTec HighPerformance GPS and the new AscTec Trinity control unit. It weighs only 2.3 kilograms on takeoff and works with maximum efficiency in the air, onshore and offshore, even in challenging conditions.
Intel and Drone Policy Advocacy

Intel CEO Brian Krzanich was recently appointed by the Federal Aviation Administration (FAA) to chair the Drone Advisory Council, a committee focused on addressing “integration strategies” regarding drones. In August, Brian addressed The White House Office of Science and Technology Policy, which includes experts in government, academia and industry, to discuss airspace integration, public and commercial uses, and ways to ensure safety, security and privacy in this emerging field. On Tuesday afternoon, Anil Nanduri (Vice President and General Manager, UAV Segment and Perceptual Computing Group at Intel), Earl Lawrence (Director, Unmanned Aircraft Systems Integration Office at the Federal Aviation Administration), Art Pregler (UAS Director at AT&T*), Ronnie Gnecco (Innovation Manager for UAVs at Airbus), and Shan Phillips (USA CEO at Yuneec) discussed how new drone capabilities and regulatory changes present new opportunities for drone developers.
There is an interesting device that was presented at the same show (taken from here: http://hackerboards.com/intel-euclid-a-brain-vision-sensors-and-hot...):
At the Intel Developer Forum in San Francisco this week, Intel showed off a prototype of an Intel Euclid robotics controller, equipped with a stereo depth-sensing Intel RealSense camera and running an Ubuntu/ROS stack. Designed for researchers, makers, and robotics developers, the device is a self-contained, candy-bar sized compute module ready to pop into a robot. It’s augmented with a WiFi hotspot, Bluetooth, GPS, and IR, as well as proximity, motion, and barometric pressure sensors. There’s also a snap-on battery.
Euclid robotics compute module (left) and its snap-on battery pack
(click images to enlarge)
According to Sarang Borude, an Interaction Designer on the Experience Design/Development Team of Intel’s Perceptual Computing unit, the device is preinstalled with Ubuntu 14.04 with Robot Operating System (ROS) Indigo. When it’s released in Q1 of 2017, it will likely run Ubuntu 16.04 with the latest Kinetic ROS version.
Euclid module and battery pack, snapped together
(click image to enlarge)
On top of this OS layer, there’s a software stack that “really makes the device easy to use,” said Borude. “You can use this device without any other software installation. Usually a PC is married to the robot, but what we’re bringing is plug and play.”
Euclid module’s USB 3.0 and micro-HDMI ports (left) and battery pack interface
(click images to enlarge)
The Euclid module’s built-in Intel ZR300 RealSense camera features a wide-FoV 640 x 480-pixel RGB camera element, along with depth and accelerometer-gyroscope motion sensors. These features enable the acquisition of high-quality, high-density depth data at up to 60fps, says Intel. Other features include USB 3.0 and micro-HDMI ports, as well as a separate charging port for the battery.
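As a rough illustration of what consuming such depth frames looks like downstream (this is a generic NumPy sketch with made-up values, not the RealSense SDK API), a raw 16-bit depth image is typically scaled to meters and invalid zero pixels are masked out:

```python
import numpy as np

def depth_to_meters(raw_depth, depth_scale=0.001):
    """Convert a raw 16-bit depth frame to meters.

    raw_depth:   HxW uint16 array, as delivered by a depth camera
    depth_scale: meters per raw unit (0.001 is a common convention;
                 the real value must be queried from the device)
    Zero pixels mean "no depth measured" and become NaN.
    """
    depth_m = raw_depth.astype(np.float32) * depth_scale
    depth_m[raw_depth == 0] = np.nan
    return depth_m

# Example: a fake 640x480 frame reading 1.5 m everywhere,
# with one sensor-dropout pixel
frame = np.full((480, 640), 1500, dtype=np.uint16)
frame[0, 0] = 0
meters = depth_to_meters(frame)
```

The NaN masking matters in practice: averaging raw frames that still contain zero "holes" silently drags distance estimates toward the camera.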
"LOL, all I see is another missed product launch. 8 months later and... ?"
Call me cynical but I don't think there was ever so much as a drawing or a product to be launched. What benefit would 3DR have to hint at partnership with Qualcomm 3 weeks before CES and then show or say nothing? Of course, I'm thinking logically and that isn't always the impetus behind these statements.
The whole promise of Snapdragon (marketing) was the cost factor for what you got.
But it's hard to see the ZeroTech thingy as being more of a value than a $450 DJI - for aerial pics.
What it does have is portability and I suppose there will be some market for close-in portable camera drones. But how big?
So many of these supposed drone markets are small - and then 10 companies jump into them, making them even smaller. I predict GoPro may be finding this out quite soon if their Karma is a "sports" drone. Check in with the Hexo, the Airdog and some others - not selling to beat the band.
The Universe of wakeboarders isn't as big as the wakeboarders think it is.
Absolutely, at this time, gimbals are the only way to go.
I think solving the camera axial roll problem in software is seriously not easy and demands really serious software grunt and large oversize, high speed camera sensors.
I don't think Parrot, Qualcomm, Sony, Ambarella or Intel or anybody else for that matter realized just how hard that was going to be.
But the phone people do have absolutely huge incentive to solve it, so they will solve it - eventually.
They just seem a bit slow.
I am afraid a Beagle Cam generally ends up with better video than I do anyway.
Actually, I'd argue that the Tencent drone is a massive step back from what we have now. Until I see video from one of these fixed-camera things that does NOT look like you smeared vaseline onto the lens of a GoPro and strapped it onto the head of a beagle...
We will get there eventually. But not yet. I think it's being oversold.
Yeah, Qualcomm hasn't been exactly racing ahead with its "Snapdragon Flight Platform".
They really do seem to be basing their approach on a flying smart phone.
And that may yet turn out to be a significant market.
But reality is that serious stuff is going to eventually have to do a serious job of "visually" navigating around in a complex moving 3D environment and the only serious hardware that can actually handle that currently is made by Nvidia.
And in many ways, Nvidia isn't even really trying.
For them, right now, flight controllers or even robotic controllers are a secondary or even tertiary market.
The Intel Aero in this article is cute, but my guess is that its intrinsic limitations and narrowness of scope will scuttle it before it even seriously gets off the ground. (Pun intended.)
On the other hand Intel does have the raw silicon design power to actually make an appropriate (3D vision capable) processor should they choose to do so.
The Aero and Snapdragon are good for slightly more advanced toys than what we have now, but once you get into the real world where you actually need to NOT run into things, the whole scope changes.
BTW, I actually got my Solo working pretty well. I wish it had one-inch-longer arms, two-inch-longer props and slower motors, but that's a minor complaint. Documentation would help too: some documentation, any documentation.
Still letting me concentrate on learning how to shoot decent video rather than keeping the copter flying, so all good.
LOL, all I see is another missed product launch. 8 months later and... ?
I agree with you that there is good potential for building an affordable, functional SLAM system; that alone justifies the $150 for the R200. As a side note, you also get a fairly large programmer base that can push the product to its limits, and a solid corporation to (hopefully) support its long-term evolution.
Like I wrote before, the Autonomous Intelligent Systems Group at the University of Bonn has demonstrated a fully autonomous obstacle avoidance and real-time planning system using the components described in my previous comment. They accomplished this by using visual odometry and LIDAR to build a MAV-centric multiresolution map. I invite everyone interested in visual odometry and VSLAM as a means to control autonomous MAVs to read their papers.
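The multiresolution idea can be sketched in a few lines (this is my own toy simplification for illustration, not the Bonn group's code or parameters): cells near the vehicle are stored at fine resolution, and cell size doubles in each ring farther out, so the map stays small while still covering long ranges.

```python
import math

class MultiresMap:
    """Toy MAV-centric map: obstacle points are binned into rings,
    with cell size doubling in each ring farther from the vehicle.
    Purely illustrative; real systems use 3D octree-like structures."""

    def __init__(self, base_cell=0.25, ring_width=4.0):
        self.base_cell = base_cell    # finest cell size (m), innermost ring
        self.ring_width = ring_width  # radial extent of each ring (m)
        self.cells = set()

    def _cell_size(self, dist):
        ring = int(dist // self.ring_width)
        return self.base_cell * (2 ** ring)

    def insert(self, x, y):
        """Bin an obstacle point, coordinates relative to the vehicle."""
        size = self._cell_size(math.hypot(x, y))
        self.cells.add((round(x / size), round(y / size), size))

    def occupied(self, x, y):
        size = self._cell_size(math.hypot(x, y))
        return (round(x / size), round(y / size), size) in self.cells

m = MultiresMap()
m.insert(1.0, 1.0)   # close obstacle -> stored in a fine 0.25 m cell
m.insert(10.0, 0.0)  # far obstacle  -> stored in a coarse 1 m cell
```

Because the map is vehicle-centric, it must be re-centered (points shifted and re-binned) as the MAV moves, which is cheap compared with maintaining a full fixed-resolution world map.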
As for the RPi balloon popper experiment, discarding the RPi is purely based on performance: at the current state of development, an RPi 3 running a 32-bit Linux OS, without any decent GPU driver working, cannot process more than 8 FPS on a very basic OpenCV color filter/blob detector, making it unqualified (or very risky, if not dangerous) as a controller for a UAV using the DroneKit API.
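For reference, the color filter/blob detection the RPi was choking on boils down to something like the following (a NumPy-only sketch with made-up thresholds; the actual balloon popper uses OpenCV and filters in HSV space):

```python
import numpy as np

def find_blob(rgb, lower=(150, 0, 0), upper=(255, 80, 80)):
    """Return the centroid (x, y) of pixels inside an RGB color band,
    or None if too few pixels match. Thresholds are illustrative;
    a real detector would threshold in HSV and filter the mask."""
    lo, hi = np.array(lower), np.array(upper)
    mask = np.all((rgb >= lo) & (rgb <= hi), axis=-1)
    ys, xs = np.nonzero(mask)
    if len(xs) < 20:            # reject isolated noise pixels
        return None
    return float(xs.mean()), float(ys.mean())

# Synthetic 100x100 frame with a red square "balloon" at x=40..59, y=30..49
img = np.zeros((100, 100, 3), dtype=np.uint8)
img[30:50, 40:60] = (200, 30, 30)
centroid = find_blob(img)   # -> (49.5, 39.5)
```

Even this trivial per-frame mask-and-centroid pass is a few full-image operations, which is exactly what overwhelms a GPU-less 32-bit Pi at camera frame rates.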
I'm not sure that ("An internal clock triggers all 3 image sensors as a group and this library provides matched frame sets") means you cannot integrate sensor fusion.
Presuming you can either gain immediate access to the matched frame sets, get a timestamp of when they were acquired, or rely on a specified delay before they become available, some sensor fusion should still be feasible.
I definitely do understand the desire to integrate with additional sensors in real time.
But since this is clearly a device for real time data acquisition and response I would also expect it to be fast enough to be able to allow at least some useful real time sensor fusion.
In fact, I also envision it as a possible component in a more robust vision system.
The fact is that with stereo devices that work like this one, accuracy degrades very rapidly with increasing distance, whereas TOF technology (laser scanners or IR flash TOF cameras) maintains the same high accuracy out to its maximum range.
So using the two together could permit a TOF device to close-focus on objects of potential interest identified by the lower-resolution stereo device.
They each also work more reliably under different conditions, so could cross compensate for each other.
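The quadratic falloff of stereo accuracy is easy to quantify: for focal length f (pixels), baseline b (meters) and disparity uncertainty Δd (pixels), depth error is roughly ΔZ ≈ Z²·Δd/(f·b). A quick sketch with made-up R200-ish numbers (these are illustrative assumptions, not official specs):

```python
def stereo_depth_error(z, focal_px, baseline_m, disparity_err_px=0.25):
    """Approximate stereo depth uncertainty: dZ ~ Z^2 * dd / (f * b).
    All parameter values used below are hypothetical."""
    return (z ** 2) * disparity_err_px / (focal_px * baseline_m)

f, b = 580.0, 0.07                        # assumed focal length (px), baseline (m)
err_1m = stereo_depth_error(1.0, f, b)    # roughly 6 mm at 1 m
err_4m = stereo_depth_error(4.0, f, b)    # 16x worse at 4 m, ~10 cm
```

A TOF sensor's error, by contrast, is essentially a fixed fraction of its timing precision regardless of range, which is exactly why the two complement each other.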
All that said, this is really a rather inexpensive system and in my opinion definitely worth having a closer look at.
I will probably get either the robot or flight controller development kit and see what I can get it to do.
I know that balloon popper challenge looks easy, but I know that in practice it is a really difficult thing to accomplish.
You can probably do a variety of obstacle avoidance or direct pathfinding with assorted high-performance CPUs, but if you really want to do real-time SLAM, I think the TX1 is the minimum starting point; you really need a serious matrix of GPU cores for that.
For larger robotic projects, the new Nvidia GTX 1080 would be a good starting point.
(Need lots of batteries).
I am still skeptical about the R200's capability as a contender for an efficient visual odometry/SLAM system. It lacks speed and resolution, and I cannot find any external trigger ("An internal clock triggers all 3 image sensors as a group and this library provides matched frame sets"), so you cannot integrate sensor fusion to generate an accurate pose.
That is the eternal quest we DIYers face: we try to build a serious solution with low-priced components and end up with a sub-optimal system that, at the end of the day, sums to the price of the real thing.
My last episode in this series was trying to get Randy's balloon popper to run on an RPi 2... then an RPi 3... and I finally ended up buying what Randy used in the first place: an ODROID XU4! Well, so far this exercise was not too costly, and I can reuse most of the components in other projects. Unfortunately, the next step is much more demanding: a TX1 or an embedded i7, a pair of global-shutter, high-resolution, externally triggered USB3 cameras, and a fast, lightweight LIDAR, for a total cost that can easily run into the thousands of dollars...