We are thrilled to present the CARLA 0.9.16 release! This version brings major upgrades to the Unreal Engine 4.26 version of CARLA that enhance your CARLA workflow and increase the diversity of your simulation data!
CARLA 0.9.16 integrates powerful new rendering technologies and export options from NVIDIA. The new Cosmos Transfer1 foundational style transfer model generates multiple style variations of CARLA simulation output from text prompts, including variations in architecture, vehicles, weather and lighting conditions. Cosmos Transfer1 enables massive diversity augmentation of CARLA-generated training datasets from a single set of simulation input data! NVIDIA NuRec brings neural reconstruction into the CARLA engine, allowing a 3D scene learned from real-world sensor data to be re-rendered with variations in viewpoint, camera configuration or perturbations in trajectory! NVIDIA’s SimReady Converter Tool exports SimReady assets in the Universal Scene Description (USD) format for hassle-free transfer to Omniverse and other USD-compatible applications.
Connectivity with ROS 2 is now natively supported by the CARLA server, removing the need for a bridge tool and providing a lower-latency connection to ROS nodes for smoother simulations.
CARLA 0.9.16 includes support for Left-Handed Traffic! This long-awaited feature is finally here, allowing the simulation of traffic in countries that drive on the left, such as the UK or Japan. A new wheelchair model enables the inclusion of vulnerable road users in CARLA-generated training datasets.
Whether you’re building complex autonomy stacks, experimenting with digital twins, or exporting CARLA content to other platforms — this release brings you closer to production-grade simulation workflows.
This CARLA release features NVIDIA Cosmos Transfer1 integration. Cosmos Transfer1 is a foundational style transfer model designed to augment simulation outputs.
With Transfer1, users can generate endless hyper-realistic video variations from CARLA sequences using simple text prompts. This capability is ideal for augmenting the diversity of CARLA-generated training datasets from a single set of simulation inputs.
Furthermore, by using this feature in combination with Inverted AI’s DRIVE API, users can produce realistic behaviors enhanced with endless visual variations. The perfect combination for training AV stacks! You can read more about this topic here: NVIDIA: Accelerating AV Simulation with Neural Reconstruction and World Foundational Models.
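On the CARLA side, the starting point for this workflow is ordinary sensor capture. The sketch below is a minimal, illustrative example (the vehicle choice, camera placement and cosmos_input output path are assumptions, not part of the integration itself) of recording an RGB sequence that can then be handed to Cosmos Transfer1 together with a text prompt such as “heavy rain at dusk”.

```python
import carla

client = carla.Client('localhost', 2000)
world = client.get_world()
bp_lib = world.get_blueprint_library()

# Spawn an ego vehicle and attach a front-facing RGB camera
vehicle_bp = bp_lib.filter('vehicle.*')[0]
vehicle = world.spawn_actor(vehicle_bp, world.get_map().get_spawn_points()[0])
vehicle.set_autopilot(True)

camera_bp = bp_lib.find('sensor.camera.rgb')
camera_tf = carla.Transform(carla.Location(x=1.5, z=2.0))
camera = world.spawn_actor(camera_bp, camera_tf, attach_to=vehicle)

# Save each frame to disk; the resulting image sequence is the simulation
# input that the style-transfer prompt is applied to
camera.listen(lambda image: image.save_to_disk('cosmos_input/%06d.png' % image.frame))
```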
This release also introduces support for NVIDIA NuRec 25.07, a state-of-the-art neural rendering pipeline.
With this integration, CARLA scenes can be rendered using a learned representation of light and geometry, enabling the re-rendering of scenes reconstructed from real-world sensor data with variations in viewpoint, camera configuration or trajectory.
This is a major step toward photorealistic simulation, especially for perception training and evaluation. Expect improvements over the next few releases as the neural rendering pipeline evolves.
Learn more about NuRec and how to use it in the CARLA documentation!
We’ve heard you: ROS 2 is here.
CARLA 0.9.16 ships with native ROS 2 integration, opening the door to direct, low-latency connections between the simulator and ROS 2-based autonomy stacks.
You can now connect CARLA directly to ROS 2 Foxy, Galactic, Humble and more, with sensor streams and ego vehicle control, all without the latency of a bridge tool.
Autonomy teams, rejoice.
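As a sketch of the consumer side, here is a minimal rclpy node that subscribes to a camera stream published by the simulator. The topic name is an assumption for illustration only; in practice it depends on how the sensor is named in your setup.

```python
import rclpy
from rclpy.node import Node
from sensor_msgs.msg import Image


class CarlaCameraListener(Node):
    def __init__(self):
        super().__init__('carla_camera_listener')
        # Topic name is illustrative; it depends on the ROS name given to the sensor
        self.create_subscription(Image, '/carla/ego/front_camera/image', self.on_image, 10)

    def on_image(self, msg):
        self.get_logger().info(f'Received a {msg.width}x{msg.height} frame')


def main():
    rclpy.init()
    rclpy.spin(CarlaCameraListener())


if __name__ == '__main__':
    main()
```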
Need to export your CARLA environments or assets to other simulators or visualization platforms?
Our new USD SimReady Exporter lets you package your CARLA scenes and assets in the SimReady format — making them portable to the OpenUSD ecosystem and beyond.
Highlights:
Whether you’re simulating, rendering, or training agents across multiple platforms, this tool reduces friction and increases reusability. Thanks to the NVIDIA team for this contribution!
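As an illustration of downstream use, an exported stage can be opened with the standard OpenUSD Python bindings in any USD-aware pipeline; the file name below is purely illustrative.

```python
from pxr import Usd, UsdGeom

# Open an exported SimReady stage (illustrative file name)
stage = Usd.Stage.Open('carla_town10_simready.usdc')

# Walk the prim hierarchy and list the meshes contained in the export
for prim in stage.Traverse():
    if prim.IsA(UsdGeom.Mesh):
        print(prim.GetPath())
```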
The CARLA 0.9.16 API includes support for Left-Handed Traffic. The simulator now supports OpenDRIVE roads labelled with left-handed traffic rules, meaning that CARLA can accurately simulate traffic in left-handed driving countries such as the United Kingdom, India or Japan. We’re delighted to deliver this long-requested feature and hope you find it useful!
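In OpenDRIVE, the driving side is declared per road through the rule attribute (RHT or LHT). As a minimal sketch, assuming you have an OpenDRIVE file authored with left-handed rules (the uk_town.xodr file name is purely illustrative), the map can be loaded directly into the simulator:

```python
import carla

client = carla.Client('localhost', 2000)

# Load a map from an OpenDRIVE file whose roads are labelled rule="LHT"
# ('uk_town.xodr' is an illustrative file name)
with open('uk_town.xodr') as f:
    xodr = f.read()

world = client.generate_opendrive_world(xodr)
```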
CARLA 0.9.16 includes a wheelchair model compatible with most of the existing pedestrian models, enabling the inclusion of vulnerable road users in CARLA-generated training data. This helps avoid the misclassification of wheelchair users as bicycle riders or users of other types of personal transport. Many thanks to Marc Teuber, Andreas Graf and Hamza Ben Haj Ammar from Itemis for this excellent community contribution!
import carla

# Connect to the simulator and retrieve the blueprint library
client = carla.Client('localhost', 2000)
world = client.get_world()
blueprint_library = world.get_blueprint_library()
spawn_point = world.get_map().get_spawn_points()[0]

# Choose a pedestrian blueprint
pedestrian_bp = blueprint_library.find('walker.pedestrian.0041')

# Check if the pedestrian model supports the wheelchair option
if pedestrian_bp.has_attribute('can_use_wheelchair'):
    # Set the use_wheelchair attribute to true and spawn
    pedestrian_bp.set_attribute('use_wheelchair', 'True')
    pedestrian = world.spawn_actor(pedestrian_bp, spawn_point)
We’ve completely re-engineered the Digital Twin Tool to make it easier than ever to generate full environments from real-world map data. Leveraging OpenStreetMap as the primary data source, this new version streamlines the entire workflow from map data to a ready-to-use CARLA environment.
With just a few clicks, users can generate a CARLA environment grounded in the real world, ideal for geo-specific experiments, regulatory testing and sim-to-real pipelines. It now ships as a standalone, easy-to-use tool.
This is a step toward bridging digital twins with procedural simulation. And it’s just the beginning.