What do you want from motion capture?

Hi Everyone,


Ever seen the series of quadrotor videos posted online by the GRASP Lab? Or the piano-playing UAV from ETH?


The real-time tracking system (motion capture) used in those videos is developed by Vicon Motion Systems.

My name is Alex and I work for Vicon. We're new to this community and very keen to hear your thoughts on how we can become a more active member. Feel free to share your ideas. If there is technical information we can provide, or perhaps trial software, please don't hesitate to contact me.

Our T-Series systems are our latest technology and can cost big dollars. What you might not know is that we have also developed a cost-effective camera called the Bonita. It's based on the same technology and software as our higher-end cameras, but is available at a more affordable price. A full Bonita system (suitable for UAV tracking) with software is available for under $20,000.

Happy to be involved in the community and look forward to more awesome vids!


Alex


Comments

  • Vicon is a really beautiful system with lots of capability but is probably overkill for the DIY community. Even for research labs that are getting off the ground, there are systems that come close to the capability for a fraction of the cost.

    NaturalPoint's OptiTrack system is about $6,000 for a six-camera system with software (about $600 / camera and $2,000 for the tracking software).

    While the Vicon system can run at 250 fps, even that is overkill for a research lab. From reading up on the GRASP Lab, Berkeley, and ETH Zürich, they're running their systems at 120 fps. The OptiTrack system above runs at 100 fps.

    With a concerted effort, the DIY community can likely come up with an IR localization system that is reliable over a 15 × 15 × 15 ft volume for under $1,000 (see the sketch below).
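    For anyone curious what the core math of such a system would look like, here is a minimal sketch of the standard approach: triangulating one IR marker seen by two calibrated cameras with linear (DLT) triangulation. The camera matrices and pixel coordinates are made-up placeholders, not values from any real system.

    ```python
    import numpy as np

    def triangulate(P1, P2, uv1, uv2):
        """Linear (DLT) triangulation of one 3D point from two views.

        P1, P2   : 3x4 camera projection matrices (from calibration)
        uv1, uv2 : (u, v) pixel coordinates of the same marker in each view
        """
        A = np.vstack([
            uv1[0] * P1[2] - P1[0],
            uv1[1] * P1[2] - P1[1],
            uv2[0] * P2[2] - P2[0],
            uv2[1] * P2[2] - P2[1],
        ])
        # Solve A x = 0 for the homogeneous point via SVD.
        _, _, Vt = np.linalg.svd(A)
        X = Vt[-1]
        return X[:3] / X[3]  # back to Euclidean coordinates

    # Placeholder calibration: two identical cameras 3 m apart.
    K = np.array([[800.0, 0, 320], [0, 800.0, 240], [0, 0, 1]])
    P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
    P2 = K @ np.hstack([np.eye(3), np.array([[-3.0], [0.0], [0.0]])])
    print(triangulate(P1, P2, (720, 260), (320, 260)))  # marker position in metres
    ```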

  • Thanks, Alex. The video is clear, but it looks like there were many more flights than what made it on screen. I understand that any stereo camera pair is enough to produce a single-plane 3D projection, and that the minimum number of projections is 360 / min_surface_angle. As for processing power: if we decrease the resolution, the per-frame cost drops with its square.
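    To put rough numbers on that reasoning (the 45° minimum surface angle and the VGA resolution below are illustrative assumptions, not values from the video):

    ```python
    # Camera count from the angular-coverage argument above.
    min_surface_angle_deg = 45              # assumed worst-case viewing angle
    print(360 / min_surface_angle_deg)      # -> 8 projections around the volume

    # Per-frame pixel cost scales with the square of linear resolution.
    print((640 * 480) / (320 * 240))        # -> 4.0: halving resolution is ~4x cheaper
    ```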

  • Hi George - Vicon's sister company 2d3 is able to extract 3D information from a moving video camera. This is used in visual FX all the time and is basically match moving. You don't need a Kinect, just a video camera; however, the image storage and processing are hard problems to solve and you need lots of computing power. If you are interested, I can ask one of my colleagues from 2d3 to follow up with a post explaining their technology further.

    However, I think what you're asking about is shown in the following video around 4:07, and there are a couple of examples towards the end of the video.

  • Can motion capture technology be reversed using drones?

    I mean, for example, a drone flying in the city or forest and collecting 3D data of all the surroundings.

    A flight track (position, acceleration, GPS), along with 3-4 mounted MS Kinects, should give enough data to scan almost any landscape or room in 3D.
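    As a rough sketch of what fusing those streams involves: each Kinect depth point has to be rotated and translated by the drone's pose at capture time before it can be merged into one world-frame cloud. The pose values and mount offset below are assumed purely for illustration.

    ```python
    import numpy as np

    def depth_points_to_world(points_sensor, R_drone, t_drone, R_mount, t_mount):
        """Transform Kinect points (N x 3, sensor frame) into the world frame.

        R_drone, t_drone : drone attitude/position from the flight log (GPS + IMU)
        R_mount, t_mount : fixed transform of the Kinect relative to the airframe
        """
        points_body = points_sensor @ R_mount.T + t_mount   # sensor -> body frame
        return points_body @ R_drone.T + t_drone            # body -> world frame

    # Toy example: drone at (10, 5, 2) m with level attitude, Kinect mounted at
    # the body origin, one depth return 3 m along the sensor's z-axis.
    I = np.eye(3)
    cloud = depth_points_to_world(np.array([[0.0, 0.0, 3.0]]),
                                  I, np.array([10.0, 5.0, 2.0]), I, np.zeros(3))
    print(cloud)  # that return placed in world coordinates
    ```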

  • Hi Luke - we are working on it. This is four cameras for facial motion capture; hopefully this could be inside a cockpit soon. The logger is small enough for a UAV (the weight of two iPhones plus a battery), but the real-time processing isn't there yet for UAV use.


    http://www.dexigner.com/news/23611

  • As already said, the applications of traditional motion capture are not that broad, since it would only work in environments where the cameras are installed. Perhaps in the future, when more processing power is available in smaller packages, something like this could be available for UAV use?

    http://drp.disneyresearch.com/projects/mocap/


    Wishful thinking I suppose, but it sure would be a huge improvement over just GPS for information about the position of the vehicle.

  • That's some cool stuff!

  • The 2d3 link didn't embed for some reason so here it is again.

    http://www.engineeringtv.com/video/Managing-Imagery-on-UAVs-from-2

  • Thanks for all the questions; I was hoping for a lively discussion and I will do the best I can to answer them.

    Chris - for a smaller 4-5 camera system, the split of hardware to software cost is probably 75%/25%. The reality is that we as a company probably spend more time (and money) developing our software than our hardware. However, these days it is very hard to justify selling software at its 'real' cost. Not to compare us to Apple, but their OS X Lion is on sale for $29.99, and I am sure that price doesn't come close to covering the development. As for open source, I am not sure anyone would benefit from our software without having the hardware; I guess that becomes a deeper conversation about whether there is off-the-shelf hardware we could integrate with. The Kinect (or similar technology) is an idea for the future, but I don't think it is accurate or fast enough just yet.

    That being said, our Tracker software installs with a 30-day trial license, along with our system emulator, some trial data and our Real Time SDK, so if someone wanted to evaluate our software and see how it works, I am sure we would be open to that.
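    As a generic illustration of what consuming a real-time pose stream looks like (a hypothetical UDP listener with an invented packet layout - this is not the actual Real Time SDK API, which ships with its own documented calls):

    ```python
    # Hypothetical sketch: receive a streamed rigid-body pose over UDP.
    # The port number and packet layout (six little-endian doubles) are made up.
    import socket
    import struct

    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("0.0.0.0", 9000))

    while True:
        packet, _ = sock.recvfrom(1024)
        x, y, z, roll, pitch, yaw = struct.unpack("<6d", packet[:48])
        print(f"pose: ({x:.3f}, {y:.3f}, {z:.3f}) m")
    ```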


    Joe - as I mentioned, I am new here, and I guess you have a lot more experience with how this technology and research can be used. I know there is the 'bad guy' scenario, but having a real-time tracking system helps with part of the problem. Knowing the real-time position of a UAV means that research into UAV co-operation can run in parallel with the research into localisation and control.


    Last week I was in Japan, where the government is looking heavily into UAVs and UGVs for the maintenance and repair of the Fukushima Power Station. Having a UAV land on a moving UGV to recharge/refuel is a difficult problem that they must first solve in the lab. I understand that an indoor tracking system isn't ever going to be the full solution.


    Paul - I understand that the first issue in designing a control system is knowing where your plane is and being able to predict where it will be next. Again, I think flying through windows or a moving hula hoop is a 'sexy' way to show that these control systems are moving forward, and very rapidly at that. The ability to interpret, identify and share information gained from UAVs through imagery is managed by our sister company 2d3. Here is a quick interview with Jon Damush, the MD of 2d3, if you are interested.
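    To make the prediction step concrete: the simplest thing a controller can do with tracked positions is a constant-velocity extrapolation. Real systems use proper state estimators such as a Kalman filter, and the 100 Hz update rate below is just an assumed figure.

    ```python
    def predict_position(p_now, p_prev, dt=0.01, horizon=0.05):
        """Constant-velocity prediction of where the vehicle will be.

        p_now, p_prev : last two tracked positions (x, y, z) in metres
        dt            : tracker update period (assumed 100 Hz here)
        horizon       : how far ahead the controller looks, in seconds
        """
        velocity = [(a - b) / dt for a, b in zip(p_now, p_prev)]
        return [p + v * horizon for p, v in zip(p_now, velocity)]

    # Two consecutive samples moving at 1 m/s along x.
    print(predict_position((1.01, 0.0, 1.5), (1.00, 0.0, 1.5)))  # ~(1.06, 0.0, 1.5)
    ```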


    Also, r691175002 was able to answer your questions. The only thing I would add is that Vicon cameras now work outdoors in full sunlight.


  • Thank you for explaining that. I wasn't aware that there were multiple cameras, or even IR-reflective spots on the quads in the video. I figured it was a single camera actually recognizing the quad as the object of interest and tracking its movement through space via that single camera feed. So in fact, this technology is much like the actor/athlete motion capture used for video games, with all the white balls taped to a black suit grabbing the data.

    I have to admit, the first time I saw the video of the quads flying through the open window at high speed, I thought that was pretty cool and would have both military and police applications. Then I found out it was camera-based and would have the problems I wrote about in my first post, making it unusable for a military or police application for the reasons you describe above.

    My thought had to do with tracking via a camera. My hope was that your software would be able to select an object of interest and then track it on the screen. Thanks for your response!
