AirSim
Open source simulator for autonomous vehicles built on Unreal Engine / Unity, from Microsoft AI & Research
Top Related Projects
The Unity Machine Learning Agents Toolkit (ML-Agents) is an open-source project that enables games and simulations to serve as environments for training intelligent agents using deep reinforcement learning and imitation learning.
A toolkit for developing and comparing reinforcement learning algorithms.
OpenSpiel is a collection of environments and algorithms for research in general reinforcement learning and search/planning in games.
Open-source simulator for autonomous driving research.
Bullet Physics SDK: real-time collision detection and multi-physics simulation for VR, games, visual effects, robotics, machine learning etc.
Google Research Football: a reinforcement learning environment where agents learn to play football.
Quick Overview
AirSim is an open-source simulator for autonomous vehicles built on Unreal Engine / Unity. It offers a platform for AI research to experiment with deep learning, computer vision, and reinforcement learning algorithms for autonomous vehicles. AirSim provides realistic environments, hardware-in-loop protocols, and APIs to seamlessly interact with autonomous vehicle platforms.
Pros
- Highly realistic simulation environments for autonomous vehicles
- Support for both Unreal Engine and Unity game engines
- Extensive APIs for various programming languages (C++, Python, C#, Java)
- Integration with popular robotics frameworks like ROS and MAVLINK
Cons
- Steep learning curve for beginners
- Resource-intensive, requiring powerful hardware for optimal performance
- Limited documentation for advanced features
- Occasional stability issues and bugs
Code Examples
- Connecting to the AirSim client:

```python
import airsim

# Connect to the AirSim simulator
client = airsim.MultirotorClient()
client.confirmConnection()
```

- Taking off and moving a drone:

```python
# Arm the drone and take off
client.enableApiControl(True)
client.armDisarm(True)
client.takeoffAsync().join()

# Move the drone to a specific position (NED coordinates, 5 m/s)
client.moveToPositionAsync(-10, 10, -10, 5).join()
```

- Capturing images from the drone's camera:

```python
# Get camera images
responses = client.simGetImages([
    airsim.ImageRequest("0", airsim.ImageType.DepthVis),
    airsim.ImageRequest("1", airsim.ImageType.Scene)
])

# Save the images
for idx, response in enumerate(responses):
    airsim.write_file(f'output_{idx}.png', response.image_data_uint8)
```
Getting Started
- Install AirSim:

```shell
pip install airsim
```

- Download and set up Unreal Engine or Unity.

- Clone the AirSim repository:

```shell
git clone https://github.com/Microsoft/AirSim.git
```

- Build AirSim:

```shell
cd AirSim
./setup.sh
./build.sh
```

- Launch an AirSim environment in Unreal Engine or Unity.

- Use the Python API to interact with the simulator:

```python
import airsim

client = airsim.MultirotorClient()
client.confirmConnection()
```
Competitor Comparisons
The Unity Machine Learning Agents Toolkit (ML-Agents) is an open-source project that enables games and simulations to serve as environments for training intelligent agents using deep reinforcement learning and imitation learning.
Pros of ml-agents
- Broader application scope, suitable for various game and simulation scenarios
- Tighter integration with Unity, leveraging its extensive ecosystem
- More accessible for developers already familiar with Unity
Cons of ml-agents
- Less specialized for robotics and autonomous vehicle simulations
- May require more setup for complex physics simulations
Code Comparison
ml-agents:

```python
from mlagents_envs.environment import UnityEnvironment
from mlagents_envs.side_channel.engine_configuration_channel import EngineConfigurationChannel

channel = EngineConfigurationChannel()
env = UnityEnvironment(file_name="MyEnvironment", side_channels=[channel])
```

AirSim:

```python
import airsim

client = airsim.MultirotorClient()
client.confirmConnection()
client.enableApiControl(True)
client.armDisarm(True)
```
Both repositories provide Python APIs for interacting with their respective simulation environments. ml-agents focuses on creating and managing Unity-based environments, while AirSim offers more specific controls for drone and vehicle simulations.
ml-agents is better suited for general game development and simulations within the Unity ecosystem, while AirSim excels in realistic physics-based simulations for robotics and autonomous vehicles. The choice between them depends on the specific requirements of your project and your familiarity with the respective platforms.
A toolkit for developing and comparing reinforcement learning algorithms.
Pros of Gym
- Broader range of environments, including classic control, robotics, and Atari games
- Simpler setup and installation process
- More extensive documentation and community support
Cons of Gym
- Less realistic physics simulation compared to AirSim
- Limited support for autonomous vehicle and drone simulations
- Fewer built-in sensors and perception capabilities
Code Comparison
Gym example:

```python
import gym

env = gym.make('CartPole-v0')
observation = env.reset()
for _ in range(1000):
    action = env.action_space.sample()
    observation, reward, done, info = env.step(action)
```

AirSim example:

```python
import airsim

client = airsim.MultirotorClient()
client.enableApiControl(True)
client.armDisarm(True)
client.takeoffAsync().join()
client.moveToPositionAsync(-10, 10, -10, 5).join()
```
The Gym code demonstrates creating a simple environment and running a basic loop, while the AirSim code shows connecting to a drone and executing flight commands. AirSim provides more detailed control over vehicle dynamics, while Gym offers a wider variety of simpler environments for reinforcement learning tasks.
OpenSpiel is a collection of environments and algorithms for research in general reinforcement learning and search/planning in games.
Pros of OpenSpiel
- Broader scope: Focuses on reinforcement learning and game theory across various game types
- Extensive game library: Includes a wide range of board games, card games, and abstract games
- Easier to set up: Simpler installation process and fewer dependencies
Cons of OpenSpiel
- Less realistic simulations: Lacks the high-fidelity 3D environments provided by AirSim
- Limited to game-based scenarios: Not suitable for robotics or autonomous vehicle research
- Fewer built-in visualization tools: AirSim offers more advanced rendering capabilities
Code Comparison
OpenSpiel (Python):

```python
import numpy as np
import pyspiel

game = pyspiel.load_game("tic_tac_toe")
state = game.new_initial_state()
while not state.is_terminal():
    legal_actions = state.legal_actions()
    action = np.random.choice(legal_actions)
    state.apply_action(action)
```

AirSim (C++):

```cpp
#include "vehicles/multirotor/api/MultirotorRpcLibClient.hpp"

msr::airlib::MultirotorRpcLibClient client;
client.enableApiControl(true);
client.armDisarm(true);
client.takeoffAsync(5)->waitOnLastTask();
client.moveToPositionAsync(-10, 10, -10, 5)->waitOnLastTask();
```
Open-source simulator for autonomous driving research.
Pros of CARLA
- More realistic urban environments and traffic scenarios
- Better support for multi-agent simulations
- Larger and more active community, with frequent updates
Cons of CARLA
- Higher system requirements and more resource-intensive
- Steeper learning curve for beginners
- Limited support for aerial vehicles compared to AirSim
Code Comparison
CARLA example (Python):

```python
import carla

client = carla.Client('localhost', 2000)
world = client.get_world()
blueprint = world.get_blueprint_library().find('vehicle.tesla.model3')
spawn_point = world.get_map().get_spawn_points()[0]
vehicle = world.spawn_actor(blueprint, spawn_point)
```

AirSim example (Python):

```python
import airsim

client = airsim.CarClient()
client.confirmConnection()
client.enableApiControl(True)
car_controls = airsim.CarControls()
car_controls.throttle = 1.0
client.setCarControls(car_controls)
```
Both simulators offer Python APIs for vehicle control, but CARLA's API is more focused on urban scenarios and multi-agent interactions, while AirSim provides a simpler interface for basic vehicle control and supports both cars and drones.
Bullet Physics SDK: real-time collision detection and multi-physics simulation for VR, games, visual effects, robotics, machine learning etc.
Pros of Bullet3
- More versatile physics engine, suitable for various applications beyond robotics and drones
- Longer development history and wider community support
- Lighter weight and easier to integrate into existing projects
Cons of Bullet3
- Less specialized for autonomous vehicle and drone simulations
- Lacks built-in camera and sensor simulation capabilities
- Requires more setup and configuration for specific use cases
Code Comparison
AirSim example (Python):

```python
import airsim

client = airsim.MultirotorClient()
client.enableApiControl(True)
client.armDisarm(True)
client.takeoffAsync().join()
```

Bullet3 example (C++):

```cpp
#include "btBulletDynamicsCommon.h"

btDefaultCollisionConfiguration* collisionConfiguration = new btDefaultCollisionConfiguration();
btCollisionDispatcher* dispatcher = new btCollisionDispatcher(collisionConfiguration);
btBroadphaseInterface* overlappingPairCache = new btDbvtBroadphase();
btSequentialImpulseConstraintSolver* solver = new btSequentialImpulseConstraintSolver;
btDiscreteDynamicsWorld* dynamicsWorld = new btDiscreteDynamicsWorld(dispatcher, overlappingPairCache, solver, collisionConfiguration);
```
AirSim is more focused on drone and autonomous vehicle simulation, providing a higher-level API for these specific use cases. Bullet3 offers a more general-purpose physics engine, requiring more low-level setup but providing greater flexibility for various applications.
Google Research Football: a reinforcement learning environment where agents learn to play football.
Pros of Football
- Focuses on a specific domain (soccer), making it more accessible for researchers in sports AI and game strategy
- Provides a rich, multi-agent environment with complex dynamics, ideal for studying cooperative and competitive behaviors
- Offers a lightweight simulation that can run faster than real-time, enabling rapid experimentation and training
Cons of Football
- Limited to soccer scenarios, less versatile than AirSim's broader robotics and autonomous systems applications
- Lacks the photorealistic graphics and physics simulation capabilities of AirSim
- May not be as suitable for transferring learned behaviors to real-world robotic systems
Code Comparison
Football example:

```python
import gfootball.env as football_env

env = football_env.create_environment(
    env_name="academy_empty_goal_close",
    representation="raw",
    stacked=False,
    logdir='/tmp/football',
    write_goal_dumps=False,
    write_full_episode_dumps=False,
    render=False
)
```

AirSim example:

```python
import airsim

client = airsim.MultirotorClient()
client.confirmConnection()
client.enableApiControl(True)
client.armDisarm(True)
client.takeoffAsync().join()
client.moveToPositionAsync(-10, 10, -10, 5).join()
```
README
AirSim announcement: This repository will be archived in the coming year
In 2017 Microsoft Research created AirSim as a simulation platform for AI research and experimentation. Over the span of five years, this research project has served its purpose, and gained a lot of ground, as a common way to share research code and test new ideas around aerial AI development and simulation. Additionally, time has yielded advancements in the way we apply technology to the real world, particularly through aerial mobility and autonomous systems. For example, drone delivery is no longer a sci-fi storyline; it's a business reality, which means there are new needs to be met. We've learned a lot in the process, and we want to thank this community for your engagement along the way.
In the spirit of forward momentum, we will be releasing a new simulation platform in the coming year and subsequently archiving the original 2017 AirSim. Users will still have access to the original AirSim code beyond that point, but no further updates will be made, effective immediately. Instead, we will focus our efforts on a new product, Microsoft Project AirSim, to meet the growing needs of the aerospace industry. Project AirSim will provide an end-to-end platform for safely developing and testing aerial autonomy through simulation. Users will benefit from the safety, code review, testing, advanced simulation, and AI capabilities that are uniquely available in a commercial product. As we get closer to the release of Project AirSim, there will be learning tools and features available to help you migrate to the new platform and to guide you through the product. To learn more about building aerial autonomy with the new Project AirSim, visit https://aka.ms/projectairsim.
Welcome to AirSim
AirSim is a simulator for drones, cars and more, built on Unreal Engine (we now also have an experimental Unity release). It is open-source, cross platform, and supports software-in-the-loop simulation with popular flight controllers such as PX4 & ArduPilot and hardware-in-loop with PX4 for physically and visually realistic simulations. It is developed as an Unreal plugin that can simply be dropped into any Unreal environment. Similarly, we have an experimental release for a Unity plugin.
Our goal is to develop AirSim as a platform for AI research to experiment with deep learning, computer vision and reinforcement learning algorithms for autonomous vehicles. For this purpose, AirSim also exposes APIs to retrieve data and control vehicles in a platform independent way.
Check out the quick 1.5 minute demo
Drones in AirSim
Cars in AirSim
How to Get It
Windows
Linux
macOS
For more details, see the use precompiled binaries document.
How to Use It
Documentation
View our detailed documentation on all aspects of AirSim.
Manual drive
If you have a remote control (RC) as shown below, you can manually control the drone in the simulator. For cars, you can use the arrow keys to drive manually.
Programmatic control
AirSim exposes APIs so you can interact with the vehicle in the simulation programmatically. You can use these APIs to retrieve images, get state, control the vehicle, and so on. The APIs are exposed through RPC and are accessible from a variety of languages, including C++, Python, C#, and Java.
These APIs are also available as part of a separate, independent cross-platform library, so you can deploy them on a companion computer on your vehicle. This way you can write and test your code in the simulator, and later execute it on the real vehicles. Transfer learning and related research is one of our focus areas.
Note that you can use the SimMode setting to specify the default vehicle, or the new ComputerVision mode, so you don't get prompted each time you start AirSim.
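As a minimal sketch of what these APIs look like in practice, here is a state query over the Python client (this assumes the `airsim` package and a running simulator; `summarize_position` is a helper invented for this example, not part of the API):

```python
# Minimal sketch: querying vehicle state through the AirSim Python API.
# `summarize_position` is a helper defined for this example, not an AirSim API.

def summarize_position(x, y, z):
    """Format a NED position (meters); in NED, negative z is up."""
    return f"N={x:.1f}m E={y:.1f}m altitude={-z:.1f}m"

def print_state():
    # Requires the `airsim` package and a running simulator, so the import
    # is kept local to this function.
    import airsim

    client = airsim.MultirotorClient()
    client.confirmConnection()

    # Full kinematic state: position, orientation, linear/angular velocity
    state = client.getMultirotorState()
    pos = state.kinematics_estimated.position
    print(summarize_position(pos.x_val, pos.y_val, pos.z_val))

# With the simulator running: print_state()
```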
Gathering training data
There are two ways you can generate training data from AirSim for deep learning. The easiest way is to simply press the record button in the lower right corner. This will start writing pose and images for each frame. The data logging code is pretty simple and you can modify it to your heart's content.
A better way to generate training data exactly the way you want is by accessing the APIs. This allows you to be in full control of how, what, where and when you want to log data.
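The API-driven approach could look roughly like the sketch below, which logs one (pose, image) pair per frame. It assumes the `airsim` package and a running simulator; the `frame_filename` helper and the `data/` output layout are choices made for this example, not part of AirSim:

```python
# Sketch: logging a (pose, image) pair per frame through the APIs instead of
# the built-in record button. `frame_filename` and the output layout are
# illustrative choices, not part of AirSim itself.
import os

def frame_filename(out_dir, idx, kind):
    """Zero-padded, sortable filenames, e.g. data/000042_scene.png."""
    return os.path.join(out_dir, f"{idx:06d}_{kind}.png")

def record_frames(out_dir="data", num_frames=10):
    import airsim  # requires a running simulator; kept local so the helper stays importable

    client = airsim.MultirotorClient()
    client.confirmConnection()
    os.makedirs(out_dir, exist_ok=True)

    log = []
    for idx in range(num_frames):
        # Vehicle pose at this frame
        pose = client.simGetVehiclePose()
        log.append((idx, pose.position.x_val, pose.position.y_val, pose.position.z_val))

        # One scene image from camera "0" (PNG-compressed by default)
        responses = client.simGetImages([airsim.ImageRequest("0", airsim.ImageType.Scene)])
        airsim.write_file(frame_filename(out_dir, idx, "scene"), responses[0].image_data_uint8)
    return log

# With the simulator running: record_frames()
```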
Computer Vision mode
Yet another way to use AirSim is the so-called "Computer Vision" mode. In this mode, you don't have vehicles or physics. You can use the keyboard to move around the scene, or use APIs to position available cameras in any arbitrary pose, and collect images such as depth, disparity, surface normals or object segmentation.
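A sketch of a Computer Vision mode "turntable" capture is below, assuming SimMode is set to "ComputerVision" in settings.json and the `airsim` package is installed; `yaw_sweep` is a small helper defined here, not an AirSim API:

```python
# Sketch: re-posing the camera and grabbing images in Computer Vision mode
# (SimMode "ComputerVision", so there is no vehicle or physics).
# `yaw_sweep` is a helper invented for this example.
import math

def yaw_sweep(steps):
    """Evenly spaced yaw angles (radians) covering a full turn."""
    return [2 * math.pi * i / steps for i in range(steps)]

def capture_turntable(steps=8):
    import airsim  # requires a simulator running in ComputerVision mode

    client = airsim.VehicleClient()
    client.confirmConnection()

    for i, yaw in enumerate(yaw_sweep(steps)):
        # Place the camera rig 2 m up (NED: negative z) at the next yaw angle
        pose = airsim.Pose(airsim.Vector3r(0, 0, -2), airsim.to_quaternion(0, 0, yaw))
        client.simSetVehiclePose(pose, True)

        responses = client.simGetImages([
            airsim.ImageRequest("0", airsim.ImageType.DepthVis),
            airsim.ImageRequest("0", airsim.ImageType.Segmentation),
        ])
        airsim.write_file(f"cv_{i}_depth.png", responses[0].image_data_uint8)
        airsim.write_file(f"cv_{i}_seg.png", responses[1].image_data_uint8)

# With the simulator running in ComputerVision mode: capture_turntable()
```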
Weather Effects
Press F10 to see various options available for weather effects. You can also control the weather using APIs. Press F1 to see other options available.
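Controlling the weather from the API could look like this sketch (assuming the `airsim` package and a running simulator; `clamped` is a helper defined here for illustration):

```python
# Sketch: driving weather effects from the API rather than the F10 menu.
# `clamped` is a helper defined for this example, not an AirSim API.

def clamped(intensity):
    """Weather intensities are fractions; keep them in [0.0, 1.0]."""
    return max(0.0, min(1.0, intensity))

def make_it_rain(rain=0.75, fog=0.2):
    import airsim  # requires a running simulator

    client = airsim.MultirotorClient()
    client.confirmConnection()

    client.simEnableWeather(True)  # weather effects are off by default
    client.simSetWeatherParameter(airsim.WeatherParameter.Rain, clamped(rain))
    client.simSetWeatherParameter(airsim.WeatherParameter.Fog, clamped(fog))

# With the simulator running: make_it_rain()
```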
Tutorials
- Video - Setting up AirSim with Pixhawk Tutorial by Chris Lovett
- Video - Using AirSim with Pixhawk Tutorial by Chris Lovett
- Video - Using off-the-shelf environments with AirSim by Jim Piavis
- Webinar - Harnessing high-fidelity simulation for autonomous systems by Sai Vemprala
- Reinforcement Learning with AirSim by Ashish Kapoor
- The Autonomous Driving Cookbook by Microsoft Deep Learning and Robotics Garage Chapter
- Using TensorFlow for simple collision avoidance by Simon Levy and WLU team
Participate
Paper
More technical details are available in the AirSim paper (FSR 2017 conference). Please cite it as:
```bibtex
@inproceedings{airsim2017fsr,
  author = {Shital Shah and Debadeepta Dey and Chris Lovett and Ashish Kapoor},
  title = {AirSim: High-Fidelity Visual and Physical Simulation for Autonomous Vehicles},
  year = {2017},
  booktitle = {Field and Service Robotics},
  eprint = {arXiv:1705.05065},
  url = {https://arxiv.org/abs/1705.05065}
}
```
Contribute
Please take a look at open issues if you are looking for areas to contribute to.
Who is Using AirSim?
We are maintaining a list of a few projects, people and groups that we are aware of. If you would like to be featured in this list please make a request here.
Contact
Join our GitHub Discussions group to stay up to date or ask any questions.
We also have an AirSim group on Facebook.
What's New
- Cinematographic Camera
- ROS2 wrapper
- API to list all assets
- movetoGPS API
- Optical flow camera
- simSetKinematics API
- Dynamically set object textures from existing UE material or texture PNG
- Ability to spawn/destroy lights and control light parameters
- Support for multiple drones in Unity
- Control manual camera speed through the keyboard
For a complete list of changes, view our Changelog.
FAQ
If you run into problems, check the FAQ and feel free to post issues in the AirSim repository.
Code of Conduct
This project has adopted the Microsoft Open Source Code of Conduct. For more information see the Code of Conduct FAQ or contact opencode@microsoft.com with any additional questions or comments.
License
This project is released under the MIT License. Please review the License file for more details.