Nick McCarthy's Posts (2)


From http://www.rtl-sdr.com/monitoring-drone-fpv-frequency-usage-with-a-usrp-software-defined-radio/


Balint Seeber, a researcher at Ettus Research (creators of the USRP line of software-defined radios), has uploaded a video showing how he uses his USRP to help with frequency management at FPV time-trial racing events.

One important technical challenge at these events is frequency management. FPV drones use frequencies around 2.4 GHz for control and around 5.8, 2.4, or 1.3 GHz for video. With many drones in the air, frequencies must be managed appropriately so that pilots do not jam each other's signals.
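As a toy illustration of the overlap problem, a frequency manager just needs to flag pairs of assigned channels closer together than their occupied bandwidth. The channel frequencies and the ~20 MHz bandwidth below are illustrative assumptions, not measured values:

```python
from itertools import combinations

def overlapping_pairs(channels_mhz, bandwidth_mhz=20.0):
    """Return pairs of channel centers closer than one occupied bandwidth.

    channels_mhz: assigned video channel centers in MHz.
    bandwidth_mhz: rough occupied bandwidth of an analog FPV video
                   signal (~20 MHz is an illustrative figure).
    """
    return [(a, b) for a, b in combinations(sorted(channels_mhz), 2)
            if b - a < bandwidth_mhz]

# Example: four pilots on 5.8 GHz band channels; the two close
# assignments are flagged as potential jammers of each other.
assigned = [5740.0, 5760.0, 5800.0, 5745.0]
print(overlapping_pairs(assigned))  # -> [(5740.0, 5745.0), (5745.0, 5760.0)]
```

A real event manager would of course also account for receiver selectivity and transmit power, but the pairwise-spacing check is the core of it.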


To address this problem, Balint has been using GNU Radio coupled with a USRP X310 software-defined radio to get very wideband RF spectrum waterfall views of the 2.4 and 5.8 GHz bands. In the waterfalls he can see when control and video signals are transmitted and at what frequency, and he can tell if any are overlapping and jamming each other.
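GNU Radio's waterfall sink does this with optimized signal-processing blocks; as a rough illustration of what a waterfall view actually computes, here is a minimal NumPy sketch (the FFT size and hop length are arbitrary choices, not Balint's settings):

```python
import numpy as np

def waterfall(iq, fft_size=1024, hop=1024):
    """Compute a waterfall: rows are time slices, columns are
    frequency bins, values are power in dB."""
    n_rows = (len(iq) - fft_size) // hop + 1
    rows = []
    for i in range(n_rows):
        seg = iq[i * hop : i * hop + fft_size] * np.hanning(fft_size)
        spectrum = np.fft.fftshift(np.fft.fft(seg))  # DC in the middle
        rows.append(10 * np.log10(np.abs(spectrum) ** 2 + 1e-12))
    return np.array(rows)

# Example: a complex test tone at 0.1x the sample rate shows up as a
# bright vertical stripe at the corresponding frequency bin.
t = np.arange(8192)
iq = np.exp(2j * np.pi * 0.1 * t)
wf = waterfall(iq)
peak_bin = wf[0].argmax()  # near bin 1024/2 + 0.1*1024 ≈ 614
```

Overlapping transmitters appear as stripes occupying the same columns at the same time, which is exactly what the waterfall makes easy to spot by eye.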


The technology allows monitoring of the full frequency spectrum used by both 5.8 GHz analog video transmitters and 2.4 GHz RC links. I imagine the software could also be expanded to allow a single receiver to decode and display all of the FPV video streams simultaneously for spectating and documenting races.

Read more…

Source: Tested.com - http://www.tested.com/tech/528736-steamvrs-lighthouse-virtual-reality-and-beyond/

Norm from Tested.com interviews Alan Yates from Valve about the tracking technology behind the new SteamVR head-mounted display.

The tracking technology is being opened up to the maker community and has the potential to deliver high-precision, low-latency location data to a flight controller. I predict we'll see a few projects planning to test this approach.

The SteamVR system may also work as an alternative to expensive motion capture systems currently used by several research groups in drone algorithm development.

A slow-motion video of an early Lighthouse Base Station in operation.

Note the IR LED sync pulses emitted by the sync blinker globally into the tracking volume. The sync pulses tell the receiver when the rotors are pointing in a known direction. The time between a sync pulse and the flash from a rotor beam sweeping past a sensor on a tracked object is used to compute the angle of that sensor with respect to the axis of the rotor. From these bearing angles (the azimuth and elevation of the sensor as seen from the base station), and knowing the configuration of sensors on the tracked object, the pose of the object (its 6-DOF position and orientation) can be determined with high resolution.
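The timing-to-angle relation described above can be sketched in a few lines. This is an illustration of the principle, not Valve's implementation; the 60 Hz rotor rate and the function names are assumptions:

```python
import math

ROTATION_HZ = 60.0           # assumed rotor spin rate
PERIOD_S = 1.0 / ROTATION_HZ

def sweep_angle(t_sync, t_hit):
    """Angle of the rotor beam when it crossed the sensor, in radians.

    t_sync: time of the omnidirectional IR sync flash, when the rotor
            points in a known reference direction.
    t_hit:  time the sweeping laser beam crossed the photodiode.
    The rotor turns at a constant rate, so elapsed time maps linearly
    to angle: angle = 2*pi * (t_hit - t_sync) / rotation_period.
    """
    return 2.0 * math.pi * (t_hit - t_sync) / PERIOD_S

# One horizontal and one vertical sweep per rotation give the azimuth
# and elevation of a sensor as seen from the base station:
az = sweep_angle(0.0, PERIOD_S / 4)  # quarter turn -> pi/2 rad
el = sweep_angle(0.0, PERIOD_S / 8)  # eighth turn  -> pi/4 rad
```

With bearings like these to several sensors at known positions on the tracked object, the pose fitter can solve for the object's full 6-DOF pose, much like a camera pose is recovered from known point correspondences.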

The sync pulses are also modulated to communicate base station identification and calibration data to receivers in the tracking volume. The calibration data describes the measured non-idealities of the base station's optical emissions and improves the global accuracy of the tracking by allowing the pose fitter to compensate for them.

The video was made by slightly aliasing the camera frame rate against the spin rate of the rotors. There are some artifacts in the image caused by the brightness of the rotor beams interacting with the camera image sensor before read-out.

Read more…