3D Robotics

Very cool! From the project repo:

AirSim is a simulator for drones (and soon other vehicles) built on Unreal Engine. It is open-source, cross-platform, and supports hardware-in-the-loop with popular flight controllers such as Pixhawk for physically and visually realistic simulations. It is developed as an Unreal plugin that can simply be dropped into any Unreal environment you want.

Our goal is to develop AirSim as a platform for AI research to experiment with deep learning, computer vision, and reinforcement learning algorithms for autonomous vehicles. For this purpose, AirSim also exposes APIs to retrieve data and control vehicles in a platform-independent way.


Development Status

This project is under heavy development. While we are working through our backlog of new features and known issues, we welcome contributions! Our current release is in beta and our APIs are subject to change.

How to Get It


To get the best experience you will need a Pixhawk or compatible device and an RC controller. This enables "hardware-in-the-loop" simulation for a more realistic experience. Follow these instructions to get it, set it up, and explore other alternatives.


There are two ways to get AirSim working on your machine. Click the links below and follow the instructions.

  1. Build it and use it with Unreal
  2. Use the precompiled binaries


The official Linux build is expected to arrive in a couple of weeks. All our current code is cross-platform and CMake-enabled, so please feel free to play around on other operating systems and report any issues. We would love to make AirSim available on other platforms as well.

How to Use It

Manual flights

Follow the steps above to build and set up the Unreal environment. Plug your Pixhawk (or compatible device) into a USB port, turn on the RC and press the Play button in Unreal. You should be able to control the drones in the simulator with the RC and fly around. Press the F1 key to view the available keyboard shortcuts.

More details

Gathering training data

There are two ways you can generate training data from AirSim for deep learning. The easiest way is to simply press the record button in the lower right corner. This will start writing the pose and images for each frame.

(Screenshot: the record button in the lower right corner)

If you would like more data logging capabilities and other features, file a feature request or contribute changes. The data logging code is pretty simple and you can modify it to your heart's desire.
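To make the per-frame logging idea concrete, here is a minimal sketch of a pose-plus-image logger in Python. The CSV file name, column layout, and helper function are illustrative assumptions, not AirSim's actual recording format:

```python
import csv
import os
import tempfile

def log_frame(writer, timestamp, position, orientation, image_path):
    """Append one frame's pose (position + quaternion) and image reference to the CSV log."""
    x, y, z = position
    qw, qx, qy, qz = orientation
    writer.writerow([timestamp, x, y, z, qw, qx, qy, qz, image_path])

# Write the log into a throwaway directory for this demo.
log_dir = tempfile.mkdtemp()
log_path = os.path.join(log_dir, "flight_log.csv")  # hypothetical file name

with open(log_path, "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["timestamp", "x", "y", "z", "qw", "qx", "qy", "qz", "image"])
    # Two fabricated frames standing in for per-frame simulator callbacks.
    log_frame(writer, 0.00, (0.0, 0.0, -10.0), (1.0, 0.0, 0.0, 0.0), "img_0.png")
    log_frame(writer, 0.03, (0.5, 0.0, -10.0), (1.0, 0.0, 0.0, 0.0), "img_1.png")

with open(log_path) as f:
    rows = list(csv.reader(f))
print(len(rows))  # header + 2 frames → 3
```

A real extension would add whatever sensor channels your training pipeline needs as extra columns, which is the kind of modification the logging code invites.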

A more complex way to generate training data is by writing client code that uses our APIs. This gives you full control of how, what, where, and when you want to log data. See the next section for more details.

Programmatic control

AirSim exposes easy-to-use APIs to retrieve data from the drones, including ground truth, sensor data, and various images. It also exposes APIs to control the drones in a platform-independent way. This allows you to use your code to control different drone platforms, for example Pixhawk or DJI Matrice, without making changes and without having to learn internal protocol details.
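The platform-independence idea can be sketched as a thin interface that client code targets, with per-platform backends behind it. The class and method names below are illustrative assumptions, not AirSim's actual API:

```python
from abc import ABC, abstractmethod

class DroneClient(ABC):
    """Hypothetical platform-independent control interface."""

    @abstractmethod
    def arm(self) -> None: ...

    @abstractmethod
    def move_to(self, x: float, y: float, z: float, speed: float) -> None: ...

    @abstractmethod
    def position(self) -> tuple: ...

class SimulatedDrone(DroneClient):
    """Stand-in backend; a real one might speak MAVLink to a Pixhawk or use the DJI SDK."""

    def __init__(self):
        self.armed = False
        self._pos = (0.0, 0.0, 0.0)

    def arm(self):
        self.armed = True

    def move_to(self, x, y, z, speed):
        # Teleport instead of flying a trajectory; this is only a sketch.
        assert self.armed, "call arm() before flying"
        self._pos = (x, y, z)

    def position(self):
        return self._pos

# Client code only sees DroneClient, so swapping in a different backend
# (simulator, Pixhawk, DJI) requires no changes to the mission logic.
drone = SimulatedDrone()
drone.arm()
drone.move_to(10.0, 0.0, -5.0, speed=3.0)
print(drone.position())  # (10.0, 0.0, -5.0)
```

The design point is that the mission code above never mentions a concrete platform, which is what lets the same script drive different drones or the simulator.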

These APIs are also available as part of a separate, independent cross-platform library, so you can deploy them on an offboard computer on your vehicle. This way you can write and test your code in the simulator and later execute it on real drones. Transfer learning and related research is one of our focus areas.

More details


You can get additional technical details in our paper (work in progress). Please cite this as:

@techreport{MSR-TR-2017-9,
  author = {Shital Shah and Debadeepta Dey and Chris Lovett and Ashish Kapoor},
  title = {Aerial Informatics and Robotics Platform},
  institution = {Microsoft Research},
  number = {MSR-TR-2017-9},
  year = {2017}
}


We welcome contributions to help advance research frontiers.


This project is released under the MIT License. Please review the License file for more details.



  • So we will use a simulator to train our AIs, but who will train the simulator?

  • Intel Edison and Windows 10 IoT, please.

  • Hi Chris

    One question: would it also work to use the 3D map feature from Bing for a very similar purpose? Especially the 3D cities? Could that be beneficial too? Maybe as an alternative to Unreal.

    Hint: the next version of my flightzoomer (as a UWP app) will make use of 3D map views.



  • Hi Chris!

    Thanks for all the hard work! I love how Microsoft is involved in making OSS recently. Can't wait to see how the AirSim will grow. Are there any plans to support helicopters and fixed wings?
    Best regards,


  • That sounds awesome, can I smell a pull request in the works ? :-)  We've been super impressed by the Unreal engine.  We looked around and couldn't believe someone hadn't already done this.  So, well, we did it.  Actually Shital did most of the real work, the guy is a programming machine.  If you don't know anything about Unreal, check out this awesome demo.  And with Microsoft AirSim I can now drop into these worlds any time and fly my PX4 based drone around.  We could practically charge money for this thing.  I have a real hard time putting down the controller when I'm testing a new build...  But it's free... even better.  Party on!

  • Hi Chris - - -,

    This is a really great project, not just for the simulator but for getting into the operational capability of the Unreal engine.

    Very well presented.

    I'm working on a Laser Rangefinder based 3D perception system and have been playing around with Unity, but this gives a good insight into Unreal so may try it too.

    Best Regards,


  • If I had time, I would get started with Q learning on this simulator LIKE RIGHT NOW. This simulator is a big (very) deal.

    Great job Chris L. and team !

  • Thanks Chris, I'm actually working on this in Microsoft Research.  Super fun project.  Finally a really nice simulator that works nicely with the PX4 stack.  It can also work with DJI SDK.   It can also work with PX4 SITL mode, so Pixhawk hardware is optional.  We just find that testing fancy flight algorithms on the real flight hardware ensures the algorithms are more likely to work on a real drone.  The other PC requirements are because you want the Unreal Game engine to shine.  The Unreal game engine is absolutely amazing, I've had people walk into my office asking where I shot the video outside... 

  • Amazing, it looks real!

  • The requirements are to be expected when you consider it's a visual simulator for training computer vision, where you need realistic graphics to get algorithms that will work in the real world.
