What do you want from motion capture?

Hi Everyone,

 

Ever seen the series of quadrotor videos posted online by the GRASP Lab? Or the piano-playing UAV from ETH?

 

 

The real-time tracking system (motion capture) used in those demonstrations was developed by Vicon Motion Systems.

 

My name is Alex and I work for Vicon. We’re new to this community and very keen to hear your thoughts on how we can become a more active member. Feel free to share your ideas. If there is technical information we can provide, or perhaps trial software, please don’t hesitate to contact me.

 

Our T Series systems are the latest technology and can cost big dollars. What you might not know is that we have also developed a cost-effective camera called the Bonita. It’s based on the same technology and software as our higher-end cameras, but is available at a more affordable price. A full Bonita system (suitable for UAV tracking) with software is available for under $20,000.

 

Happy to be involved in the community and look forward to more awesome vids!

 

Alex

Comments

  • Paul, you have combined many different technologies in your post, which makes your questions confusing to answer.
    Motion capture refers to using many (at least three) stationary cameras to track IR markers glued to a rigid object.  By combining the views from multiple cameras you can determine the location of the markers to within 1 mm (a rough sketch of that triangulation idea is at the end of this comment).  Note that this is not stereo vision, since only reflective markers can be tracked.  This system requires a room full of cameras - picking up or moving a camera would just make the system blow up.  Going outside would saturate the IR cameras and make the system worthless.
    You seem to have confused motion capture with visual odometry.  Visual odometry is using cameras to estimate distance traveled.  A step further is visual SLAM, which stores features your robot has seen.  If you travel in a circle, visual odometry will accumulate error, whereas SLAM will recognize that it has visited a location before and remain accurate.

    So what benefit is there to flying in a lab with motion tracking?
    When flying a UAV there are essentially two problems: localization and controls.  Researchers who use motion capture are given perfect localization data and thus can focus on controlling flips and such.
    There are, of course, other researchers who focus only on localization.  The thing is, to test UAV controls you need to actually fly, whereas someone testing localization can just walk around with a box of electronics and see how accurately it guesses its own position.  The latter is not as popular on YouTube.
    In an academic setting the two fields remain relatively separate, which is why the quadrotor guys use motion capture and the localization guys don't actually build robots.
    Of course, getting millimeter accuracy outside is impossible with today's technology.  You could probably get 5 cm of drift in a hover, but as soon as the quadrotor starts performing aggressive maneuvers, getting accurate localization becomes hard.
    Currently, controls have been solved far better than localization, but if the technology appears, all the lab research might suddenly become viable outdoors.
    There is also research into aggressive maneuvers without motion capture, but to be honest the problem is so hard that most of it is military, so it doesn't get thrown on YouTube.
    This is a huge topic, so my explanations in this post are very simplified.  If you are interested, Wikipedia has some information and I can elaborate on certain topics.
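    Since the triangulation step above might sound like magic, here is a very rough sketch of the core idea: every camera that sees a reflective marker defines a ray in space, and the marker's 3D position is the point that comes closest to all of those rays in a least-squares sense.  This is a toy illustration only - real systems (Vicon's included) also deal with camera calibration, lens distortion, marker labelling and outlier rejection, none of which is shown here.

    ```python
    import numpy as np

    def triangulate_marker(centers, directions):
        """Least-squares intersection of camera rays.

        centers    -- 3D positions of the cameras that saw the marker
        directions -- vectors from each camera toward the detected marker blob
        Returns the 3D point closest to all of the rays.
        """
        A = np.zeros((3, 3))
        b = np.zeros(3)
        for c, d in zip(centers, directions):
            d = d / np.linalg.norm(d)
            P = np.eye(3) - np.outer(d, d)  # projector onto the plane normal to the ray
            A += P
            b += P @ c
        return np.linalg.solve(A, b)

    # Toy setup: three cameras on the ceiling looking at a marker near (1.0, 2.0, 1.5).
    centers = [np.array([0.0, 0.0, 3.0]),
               np.array([4.0, 0.0, 3.0]),
               np.array([2.0, 5.0, 3.0])]
    marker = np.array([1.0, 2.0, 1.5])
    # A real system gets these directions from 2D blob detections; here we fake them.
    directions = [marker - c for c in centers]
    print(triangulate_marker(centers, directions))  # ~[1.0, 2.0, 1.5]
    ```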

  • Honestly, if motion tracking is going to become a part of the hobbyist scene, I see it happening with consumer motion tracking devices like the Kinect and its future revisions. It's not accurate or fast yet, but it's cheap, it has a ton of support from the community, and it can provide a lot of the information needed to keep a quad stable in the air if the right visibility markers are used. It's also cheap enough for a hobbyist to try, fail with, and not feel too bad about the investment.

  • Alex, thank you for posting here! I have some questions!

     

    How does motion capture technology translate from an indoor lab to chasing bank robbers through open windows? Or if bank robbers and terrorists aren't the reason to be able to fly through a square or a hoop, then what is the target market? Is there a group of buyers out there who want to buy robots that can play ping pong or juggle? Maybe put on some sort of Las Vegas style show? I just wonder how you move from indoors to outdoors with your technology.

     

    What are the camera requirements? Can it fit on a 1-2 lb quad or plane? Do you have a software package that can use a low-res FPV camera's data stream to track via a ground station computer, with the data then fed back up to the UAV? The UAV is not typically going to carry a laptop for any processing that isn't done on the camera itself, so I'm wondering how the tracking works exactly.

     

    In our applications, the camera will need to handle very heavy vibrations and the camera itself needs to be somewhat durable as soft landings are not a sure thing. So when the camera is not sitting on a tripod in a lab, how well does it work?

     

    If your system requires the camera, it is not going to be useful to our community for several reasons. The software, however, might be useful if you've got an API of some sort that can handle any video input, "recognize" something in the video feed to follow, and then output some kind of coordinate data based on the current FOV. That way the camera can know which way to turn to keep the target in the center of the screen, and the plane can then turn based on where the camera needs to point.
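    To make concrete what I mean by "output some kind of coordinate data based on the current FOV", here is a rough sketch of the kind of math involved.  The frame size, field of view and gain below are made-up numbers, and none of this is Vicon's API - it just shows how a pixel offset from a tracker could become a pan/tilt correction.

    ```python
    # Assumed (made-up) camera parameters for a low-res FPV feed.
    FRAME_W, FRAME_H = 640, 480      # pixels
    HFOV_DEG, VFOV_DEG = 60.0, 45.0  # horizontal / vertical field of view
    GAIN = 0.5                       # proportional gain on the correction

    def pointing_error(px, py):
        """Angular offset (degrees) of the tracked target from the image centre.

        px, py -- pixel coordinates reported by the tracker (y grows downward).
        """
        yaw_err = (px - FRAME_W / 2.0) / FRAME_W * HFOV_DEG     # + means target is right of centre
        pitch_err = (py - FRAME_H / 2.0) / FRAME_H * VFOV_DEG   # + means target is below centre
        return yaw_err, pitch_err

    def gimbal_correction(px, py):
        """Pan/tilt adjustment (degrees) that moves the target toward the centre."""
        yaw_err, pitch_err = pointing_error(px, py)
        pan_cmd = GAIN * yaw_err      # pan right when the target is to the right
        tilt_cmd = -GAIN * pitch_err  # tilt up when the target is above the centre
        return pan_cmd, tilt_cmd

    # Target detected at pixel (520, 140): pan right a little, tilt up a little.
    print(gimbal_correction(520, 140))
    ```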

  • A hobbyist really wouldn't have a use for motion tracking anyway. We use OptiTrack (starts at $800) in our lab, although the university has other systems (including a Vicon MX) should additional precision be necessary.

    I'm personally more interested in reducing dependence on this kind of stuff, though. No matter how impressive your work is, if it requires $500,000 of cameras tracking its trajectory, your work is only useful in an academic context.
  • The sorts of demos referenced are impressive, but I find them very misleading.  When people see a quadrotor flying through a narrow slot, they can imagine this platform sneaking into a terrorist camp and finding the "bad guys", but these demos only work in one room with $100K of external sensors.  The demos are the complete opposite of autonomous; the real brains are external to the platform. That means these platforms can never leave that instrumented room.

     

    This leaves only two options:

    1) we need to invite the terrorists into university research labs and into these instrumented rooms

    2) we need to pay the terrorists to instrument their hideouts so we can spy on them

     

    If a platform can't fly under its own power/sensors, then it is nothing more than a parlor trick.

  • Unfortunately I think a motion capture system is, and will stay, out of the reach of most of the members here, even an extremely affordable one like the Vicon Bonita. And yes, $20,000 in this context can unfortunately be considered extremely affordable. Motion capture systems are a great tool for detailed trajectory tracking, but what hobbyist will ever have that need? They are an extremely valuable tool, but due to the price they will primarily be available to research institutions.

    An example: we recently purchased a motion capture system for an upcoming graduate course on quadrotor trajectory planning. Due to a limited budget, the full-featured Vicon system was not an option, so we purchased a competitor's simpler version for about $15,000. It probably lacks performance in precision and many other areas compared to the Vicon MX, but it gets the job done for us.

    Alex, I'd be extremely interested to start an offline discussion with you about the Bonita system. Thanks

  • John and Randy are right: that's well within the budget of many university research labs.

    The point is that we've described Vicon systems as costing $500,000 before (and the high-end ones do), but there are also much cheaper solutions as Alex notes.

    What I'd like to know is how much of that cost is due to hardware and how much to software. If there were an open source alternative, how much would it cost?
  • $20K might seem like a lot of money for a DIY hobbyist, but for someone with a commercial interest (or a research grant) it is peanuts.

  • DIYDrones is getting pretty big, though.  Perhaps there are some university people kicking around who have some ideas...

  • Mostly, what we do here is a non-profit hobby. Maybe a $20,000 system is more affordable than some others; however, I think for most of the members of this community it's still impossible to pay such a price. I can hardly imagine who could afford to pay that much for hobby stuff.
