In the near future you should be able to find a paper detailing how this was done on Daniel's website: http://fling.seas.upenn.edu/~dmel/wiki/index.php?n=Main.Quadrotor

Motion capture system used/mentioned: http://vicon.com/

Help Daniel break into the 1M YouTube views club: send this to all your friends!

Comments

  • Thanks for interviewing Mr. Mellinger. I was hoping he'd provide some additional technical details.

    I don't know if you were covering for some technical difficulties, but it seems to me that more time could have been spent talking to Daniel than talking to each other and hyping the site. Don't get me wrong - I'm very grateful for and impressed with the services you are providing to the community. But, at least at first listen, the extraneous chatter seemed excessive. I don't listen to many of these podcasts, so perhaps this is no different from the others.

    - Roy
  • Very cool and interesting podcast; thanks very much to Daniel Mellinger for all the inner details of his research project, and of course thanks to Chris and Tim. The only real question I had when I first watched the video some time ago was whether the path and behaviour of the quad were computed in real time, with real-time analysis of a dynamic obstacle configuration (cameras figuring out an unknown, moving target and computing a path for the quad accordingly). It took a while for Daniel to tell us; oddly there was no direct question about that, apart from the "freaked out" remark, which hinted at no onboard intelligence and hence precomputed paths: "they just fly along the trajectory I tell them to fly along". So it seems the system is using precomputed flight paths for now (a toy sketch of what tracking such a path might look like follows these comments).
    A very cool feature though, being able to design a sub-centimetre flight pattern in a given environment. I guess it would be very easy for Daniel to set up a full obstacle course with several windows for the quad to fly through in a single flight (actually several quads; I noticed the six of them lying on the ground, plus the transceivers on the table). He should do that for the next video. Please, please, Daniel! With that kind of precision and a few LEDs on the quads, it should now be easy to get the quads doing formation flying and loops/barrel rolls in a precisely choreographed airshow. Coming from the video-games industry myself, it's funny to see that all the vectors and physics/aerodynamics models we use in your everyday FPS (and more) are used here.
    You were saying, Chris and Tim, that "what we do with this technology is left to the people to invent". Well, I can definitely say that running a Havok physics engine on an ARM9 is a reality. With enough onboard sensors characterizing a given area, and density models estimating what kinds of components, materials and weights that area is made of (which will take quite a while to get accurate), such a platform will be able to model its real environment and the behaviour of the objects populating it, with enough data to compute possible moves and outcomes unimaginably faster than us: predict that if I push this object with this velocity at this angle, it will land there at this given time, giving me enough time to do this and this (like a certain "oracle" did).
    It's not science fiction. Seeing what the Aibo here at home has been able to do with so few sensors for some years now, and now seeing how a quad can move,
    I'm willing to bet this is going to happen sooner than many expect. And that will be freaking awesome, and freaky :)
    Thanks again for this podcast!
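
For readers wondering what "flying along a precomputed trajectory" boils down to, here is a minimal, hypothetical sketch (not Daniel Mellinger's actual code): an offline-designed path is sampled at each control step, and a simple PD position loop steers the vehicle back onto it using an external position fix of the kind a motion-capture system such as Vicon provides. The 1-D point-mass dynamics, the waypoint function, and the gains are all assumptions made purely for illustration.

```python
import math

def precomputed_trajectory(t):
    """Desired position and velocity at time t -- a fixed sinusoid standing in
    for an offline-designed flight path (assumption for this sketch)."""
    return math.sin(t), math.cos(t)

def mocap_measurement(true_pos):
    """Stand-in for a motion-capture position fix (assumed noise-free here)."""
    return true_pos

def pd_tracking_demo(duration=10.0, dt=0.01, kp=25.0, kd=8.0):
    """Toy 1-D point mass tracking the precomputed trajectory with a PD loop."""
    pos, vel = 0.0, 0.0            # vehicle state
    t = 0.0
    while t < duration:
        des_pos, des_vel = precomputed_trajectory(t)
        meas_pos = mocap_measurement(pos)
        # PD law: commanded acceleration from position and velocity error
        acc_cmd = kp * (des_pos - meas_pos) + kd * (des_vel - vel)
        vel += acc_cmd * dt        # integrate the toy dynamics
        pos += vel * dt
        t += dt
    return abs(pos - precomputed_trajectory(duration)[0])  # final tracking error

if __name__ == "__main__":
    print(f"final tracking error: {pd_tracking_demo():.4f} m")
```

Swap the sinusoid for any designed trajectory and the same loop applies; the interesting part on a real quadrotor is doing this in 3-D with attitude control layered underneath, which this sketch deliberately leaves out.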