Source: Tested.com - http://www.tested.com/tech/528736-steamvrs-lighthouse-virtual-reali...
Norm from Tested.com interviews Alan Yates from Valve about the tracking technology behind the new SteamVR head-mounted display.
The tracking technology is being opened up to the maker community, and it has the potential to deliver high-precision, low-latency location data to a flight controller. I expect we'll see a few projects testing this approach.
The SteamVR system may also work as an alternative to the expensive motion-capture systems currently used by several research groups for drone algorithm development.
A slow-motion video of an early Lighthouse Base Station in operation.
Note the IR LED sync pulses emitted globally into the tracking volume by the sync blinker. Each sync pulse tells the receiver when the rotors are pointing in a known reference direction. The time between a sync pulse and the flash of a rotor beam sweeping past a sensor on a tracked object is used to compute the angle of that sensor with respect to the rotor's axis. From these bearing angles (the azimuth and elevation of each sensor as seen from the base station), and with the known configuration of sensors on the tracked object, the object's pose (its 6-DOF position and orientation) can be determined with high resolution.
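The timing-to-angle relation described above can be sketched in a few lines. Because the rotor spins at a constant rate, elapsed time since the sync pulse maps linearly to swept angle. The 60 Hz rotor rate and the function name here are illustrative assumptions, not details of Valve's actual implementation:

```python
import math

# Assumed rotor spin rate of 60 Hz (one full revolution per period).
ROTOR_PERIOD_S = 1.0 / 60.0


def sweep_angle(t_sync: float, t_hit: float,
                period: float = ROTOR_PERIOD_S) -> float:
    """Convert sync-to-sweep timing into a bearing angle in radians.

    The sync pulse marks the instant the rotor points in a known
    reference direction; the beam then sweeps at a constant angular
    rate, so the elapsed time maps linearly to angle.
    """
    dt = t_hit - t_sync
    return 2.0 * math.pi * dt / period


# A sensor hit one quarter-period after the sync pulse sits a quarter
# revolution (90 degrees) from the reference direction.
quarter_turn = sweep_angle(0.0, ROTOR_PERIOD_S / 4.0)
print(math.degrees(quarter_turn))  # → 90.0
```

One such angle per rotor gives the azimuth and elevation of a sensor as seen from the base station; combining several sensors' bearings with the known sensor layout is what lets a pose fitter solve for the full 6-DOF pose.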
The sync pulses are also modulated to communicate base station identification and calibration data to receivers in the tracking volume. The calibration data describe the measured nonidealities of the base station's optical emissions and improve the global accuracy of tracking by allowing the pose fitter to compensate for them.
The video was made by slightly aliasing the camera frame rate against the spin rate of the rotors. There are some artifacts in the image caused by the brightness of the rotor beams interacting with the camera image sensor before read-out.