I want to forward the parsed telemetry data (such as roll, pitch, yaw, and altitude) from a ground control station over a UDP or UART port so that I can receive it in another piece of software on the same PC. Is there a way to do this? Which GCS do I need? (Windows-based.)

Note that I do not want to forward the raw telemetry stream, as I can't properly parse the stream to extract the data; I want the extracted data.

Any help would be much appreciated!




  • Hi,

    here is a sample of how to read telemetry from UgCS: https://github.com/ugcs/ugcs-java-sdk/blob/master/ucs-client/src/ma...

  • Hi,

    Mission Planner secretly (shhhh!) outputs subsets of its internal data in response to HTTP requests, in JSON. Take a look while Mission Planner is connected to a real plane or to a SITL aircraft (results below). It returns lots of useful stuff in JSON, although it lacks SYSID and battery info. I have some Python code somewhere that parses this in an amateurish manner if you want me to dig it out, but it ain't hard.

    And for other hidden gems of Mission Planner, poke this into a browser while it is running:

    Some vague pseudo-Python code that might point you in the right direction (no guarantees, it is late at night here!):

    import json
    import urllib2

    url = ""  # the JSON URL mentioned above

    request = urllib2.Request(url)
    contents = urllib2.urlopen(request, timeout=0.05).read(4096)

    parsed_json = json.loads(contents)
    alt = parsed_json["VFR_HUD"]["msg"]["alt"]
    airspeed = parsed_json["VFR_HUD"]["msg"]["airspeed"]
    throttle = parsed_json["VFR_HUD"]["msg"]["throttle"]
    heading = parsed_json["VFR_HUD"]["msg"]["heading"]
    climb = parsed_json["VFR_HUD"]["msg"]["climb"]
    roll = parsed_json["ATTITUDE"]["msg"]["roll"]
    pitch = parsed_json["ATTITUDE"]["msg"]["pitch"]
    target_bearing = parsed_json["NAV_CONTROLLER_OUTPUT"]["msg"]["target_bearing"]
    lat = parsed_json["GPS_RAW_INT"]["msg"]["lat"]
    lon = parsed_json["GPS_RAW_INT"]["msg"]["lon"]

    print "Beware the target bear!!!!"

    Parsing the telemetry stream itself is not trivial (although I am currently trying to work that out via pymavlink), but I hope something like the above should do the job for you. A sample of what it returns:

    "META_LINKQUALITY":{"msg":{"master_in":1631,"mav_loss":0,"mavpackettype": "META_LINKQUALITY","master_out":81,"packet_loss":0.0},"index":0,"time_usec":0}}
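To get values like these into another program over a local UDP socket (the original question), here is a minimal Python 3 sketch. It assumes the receiving application listens on 127.0.0.1:14560 (an arbitrary port, not a Mission Planner default) and that the JSON has the `{"ATTITUDE": {"msg": {...}}}` shape shown above:

```python
import json
import socket

# Hypothetical target for the second application on the same PC;
# 14560 is an arbitrary port choice, not a GCS default.
UDP_ADDR = ("127.0.0.1", 14560)

def extract_telemetry(parsed_json):
    """Pick roll/pitch/yaw/altitude out of the JSON snapshot shown above."""
    att = parsed_json["ATTITUDE"]["msg"]
    vfr = parsed_json["VFR_HUD"]["msg"]
    return {"roll": att["roll"], "pitch": att["pitch"],
            "yaw": att["yaw"], "alt": vfr["alt"]}

def forward_once(parsed_json, sock):
    """Serialise the extracted fields and send them as one UDP datagram."""
    payload = json.dumps(extract_telemetry(parsed_json)).encode("utf-8")
    sock.sendto(payload, UDP_ADDR)

if __name__ == "__main__":
    # Stand-in for one JSON snapshot fetched from the GCS.
    sample = {"ATTITUDE": {"msg": {"roll": 0.01, "pitch": -0.02, "yaw": 1.57}},
              "VFR_HUD": {"msg": {"alt": 123.4}}}
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
        forward_once(sample, s)
```

On the receiving side, a plain `socket.recvfrom` followed by `json.loads` is enough to get the values back as a dict.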

  • No, I found nothing. I stopped looking for answers long ago, though; there might be new ways to do this now after many updates and changes.

    Good luck!

  • Any luck finding an answer on this?


