Software

Welcome to our lab's software section! We are a team of innovative researchers and developers who believe in the power of technology to change the world. Our lab is dedicated to inventing new software that solves real-world problems for autonomous vehicles and making it available to everyone.

We are proud to be a part of the open-source community, where knowledge is shared freely and collaboration drives progress. All of the teaching material we create and all of the software we develop are released open source, meaning that anyone can access, use, and build upon our work.

On this page, you will find a comprehensive list of all the software that our lab has developed and released, ranging from cutting-edge tools for data analysis and simulation environments for autonomous vehicles to algorithms that enable safe and trustworthy autonomy for a wide range of highly integrated autonomous vehicle applications.

At our lab, we believe that software has the power to make a difference. Whether you are a researcher, a developer, or simply someone interested in technology, we invite you to join us in our mission to develop the next generation of intelligent autonomous vehicles.

Visit our GitHub page

 

TUM EDGAR: Digital Twin of an Autonomous Vehicle

A digital twin is a virtual representation of a physical object or system. In this repository, the digital twin serves as the digital counterpart of our autonomous research vehicle: it captures the vehicle's behavior, performance, and characteristics in a virtual environment. With this information, you can simulate the EDGAR vehicle in various 2D and 3D simulation environments.
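
As a minimal illustration of what such a digital-twin description enables, the sketch below steps a simple kinematic bicycle model using vehicle parameters of the kind a digital twin provides. All parameter names and values are placeholders for this example, not EDGAR's actual specification or the repository's interface.

```python
# Minimal sketch (not the repository's actual interface): stepping a kinematic
# bicycle model with vehicle parameters of the kind a digital twin provides.
# The parameter values below are placeholders, not EDGAR's real specification.
from dataclasses import dataclass
import math

@dataclass
class VehicleParams:
    wheelbase_m: float    # distance between front and rear axle
    max_steer_rad: float  # steering angle limit

@dataclass
class State:
    x: float
    y: float
    yaw: float
    v: float

def step(state: State, accel: float, steer: float, p: VehicleParams, dt: float) -> State:
    """Propagate the vehicle state one time step with a kinematic bicycle model."""
    steer = max(-p.max_steer_rad, min(p.max_steer_rad, steer))
    return State(
        x=state.x + state.v * math.cos(state.yaw) * dt,
        y=state.y + state.v * math.sin(state.yaw) * dt,
        yaw=state.yaw + state.v / p.wheelbase_m * math.tan(steer) * dt,
        v=state.v + accel * dt,
    )

params = VehicleParams(wheelbase_m=3.1, max_steer_rad=0.6)  # placeholder values
s = State(x=0.0, y=0.0, yaw=0.0, v=10.0)
for _ in range(100):                      # simulate 1 s at 100 Hz
    s = step(s, accel=0.0, steer=0.05, p=params, dt=0.01)
print(f"x={s.x:.2f} m, y={s.y:.2f} m, yaw={s.yaw:.3f} rad")
```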

Link to GitHub Repository

Link to Paper

FlexMap Fusion: Multi-LiDAR Localization and Mapping Pipeline

Autonomous vehicles require accurate and robust localization and mapping algorithms to navigate safely and reliably in urban environments. We present a novel sensor-fusion-based pipeline for offline mapping and online localization that leverages four LiDAR sensors. The mapping and localization algorithms build on KISS-ICP, enabling real-time performance and high accuracy.
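
To illustrate the core building block of such a pipeline, the sketch below aligns two LiDAR scans with point-to-point ICP using Open3D. This is only an illustration of scan registration on synthetic data; the actual pipeline builds on KISS-ICP and its own implementation.

```python
# Illustrative sketch only: aligning two LiDAR scans with point-to-point ICP via
# Open3D. The released pipeline builds on KISS-ICP; this just shows the core idea
# of estimating the rigid transform between consecutive scans.
import numpy as np
import open3d as o3d

def make_cloud(points_xyz: np.ndarray) -> o3d.geometry.PointCloud:
    cloud = o3d.geometry.PointCloud()
    cloud.points = o3d.utility.Vector3dVector(points_xyz)
    return cloud

# Synthetic data: a random scan and the same scan shifted by 0.5 m in x.
rng = np.random.default_rng(0)
scan_prev = rng.uniform(-20.0, 20.0, size=(2000, 3))
scan_curr = scan_prev + np.array([0.5, 0.0, 0.0])

result = o3d.pipelines.registration.registration_icp(
    make_cloud(scan_curr),                 # source: current scan
    make_cloud(scan_prev),                 # target: previous scan (or local map)
    max_correspondence_distance=1.0,
    init=np.eye(4),
    estimation_method=o3d.pipelines.registration.TransformationEstimationPointToPoint(),
)
# 4x4 rigid transform aligning the current scan to the previous one
print(result.transformation)
```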

Link to GitHub Repository

Link to Paper

FRENETIX: Motion Planner for Autonomous Vehicles

Autonomous vehicles need a dedicated software module for safe and efficient trajectory calculation. Here we release a Frenet trajectory planning algorithm that generates trajectories with a sampling-based approach. The repository provides both a Python-based and a C++-accelerated motion planner; both sample polynomial trajectories in a curvilinear coordinate system and evaluate them for feasibility and cost.
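
The sketch below illustrates the basic idea of sampling-based Frenet planning in a strongly simplified form: several quintic lateral polynomials are sampled towards different end offsets, checked against a crude feasibility limit, and the cheapest one is kept. It is not the repository's API; all limits and cost weights are placeholders.

```python
# Strongly simplified sketch of sampling-based Frenet planning (not the
# repository's API): sample quintic lateral polynomials d(s), reject infeasible
# ones, and keep the cheapest.
import numpy as np

def quintic_coeffs(d0, dd0, ddd0, d1, dd1, ddd1, S):
    """Quintic d(s) on [0, S] matching offset, slope and curvature at both ends."""
    a0, a1, a2 = d0, dd0, 0.5 * ddd0
    A = np.array([[S**3,     S**4,      S**5],
                  [3 * S**2, 4 * S**3,  5 * S**4],
                  [6 * S,   12 * S**2, 20 * S**3]])
    b = np.array([d1 - (a0 + a1 * S + a2 * S**2),
                  dd1 - (a1 + 2 * a2 * S),
                  ddd1 - 2 * a2])
    return np.concatenate([[a0, a1, a2], np.linalg.solve(A, b)])

def evaluate(coeffs, s):
    """Lateral offset d(s) and its second derivative d''(s) on a grid s."""
    d   = np.sum([c * s**i for i, c in enumerate(coeffs)], axis=0)
    ddd = np.sum([i * (i - 1) * c * s**(i - 2)
                  for i, c in enumerate(coeffs) if i >= 2], axis=0)
    return d, ddd

S = 30.0                                    # planning horizon along the reference path [m]
s_grid = np.linspace(0.0, S, 60)
ds = s_grid[1] - s_grid[0]
best_cost, best_offset = np.inf, None
for d_end in np.linspace(-3.0, 3.0, 13):    # candidate lateral end offsets [m]
    coeffs = quintic_coeffs(d0=0.5, dd0=0.0, ddd0=0.0,
                            d1=d_end, dd1=0.0, ddd1=0.0, S=S)
    d, ddd = evaluate(coeffs, s_grid)
    if np.max(np.abs(ddd)) > 0.015:         # crude feasibility check (placeholder limit)
        continue
    cost = np.sum(ddd**2) * ds + 0.1 * np.sum(d**2) * ds  # smoothness + centerline deviation
    if cost < best_cost:
        best_cost, best_offset = cost, d_end
print(f"selected lateral end offset: {best_offset:.2f} m (cost {best_cost:.3f})")
```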

Link to GitHub Repository

Link to Paper

ROS 2 Latency Analysis

ROS 2 software systems usually consist of a large number of publisher and subscriber modules. Our software tool uses static code analysis to create a data flow graph (DFG) for a ROS 2 software stack for autonomous driving. This allows data dependencies of a C++-based ROS 2 system to be found and visualized more easily, without manual annotation.
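
To illustrate what such a data flow graph looks like once the publisher/subscriber relations have been extracted, the sketch below builds and traverses a small DFG with networkx. The node and topic names are made up for this example; the actual tool derives these relations from the C++ sources via static analysis.

```python
# Illustrative sketch only: once publisher/subscriber relations have been
# extracted (the released tool does this via static analysis of the C++
# sources), a data flow graph can be built and traversed, e.g. with networkx.
# Node and topic names below are made up for the example.
import networkx as nx

# (node, publishes, subscribes) triples as they might come out of the analysis
nodes = [
    ("lidar_driver",    ["/points_raw"],   []),
    ("object_detector", ["/objects"],      ["/points_raw"]),
    ("localization",    ["/localization"], ["/points_raw"]),
    ("planner",         ["/trajectory"],   ["/objects", "/localization"]),
    ("controller",      ["/control_cmd"],  ["/trajectory"]),
]

dfg = nx.DiGraph()
for name, publishes, subscribes in nodes:
    for topic in subscribes:
        dfg.add_edge(topic, name)   # data flows from the topic into the node
    for topic in publishes:
        dfg.add_edge(name, topic)   # the node produces data on the topic

# All processing chains from the raw sensor topic to the control command:
for path in nx.all_simple_paths(dfg, "/points_raw", "/control_cmd"):
    print(" -> ".join(path))
```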

Link to GitHub Repository

Link to Paper

CamRaDepth: Semantic Guided Depth Estimation Using Monocular Camera and Sparse Radar

Since cameras output only 2D images and active sensors such as LiDAR or radar produce sparse depth measurements, information-rich depth maps must be estimated. This software leverages vision transformers to fuse monocular camera images, semantic segmentations, and projected radar measurements for robust monocular depth estimation.
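
As a rough illustration of the input fusion, the sketch below stacks a camera image, a semantic segmentation map, and a sparse radar depth channel into a single network input. The shapes and channel layout are assumptions made for this example, not CamRaDepth's actual preprocessing.

```python
# Illustrative sketch only: how the three input modalities might be stacked into
# a single network input. Shapes and channel layout are assumptions for the
# example, not CamRaDepth's actual preprocessing.
import numpy as np

H, W = 384, 640
rgb       = np.zeros((H, W, 3), dtype=np.float32)  # monocular camera image, [0, 1]
semantics = np.zeros((H, W, 1), dtype=np.float32)  # per-pixel semantic class id
radar     = np.zeros((H, W, 1), dtype=np.float32)  # sparse radar depth, 0 = no return

# Scatter a handful of projected radar returns (pixel u, pixel v, depth in metres)
for u, v, depth in [(320, 210, 24.5), (402, 215, 31.0), (118, 220, 12.3)]:
    radar[v, u, 0] = depth

net_input = np.concatenate([rgb, semantics, radar], axis=-1)  # (H, W, 5)
print(net_input.shape)
```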

Link to GitHub Repository

Link to Paper

ORB-SLAM3-RGBL

In this work, we add an RGB-L (LiDAR) mode to the well-known ORB-SLAM3, which allows LiDAR depth measurements to be integrated directly into the visual SLAM. In this way, we achieve an accuracy very close to that of a stereo camera setup while greatly reducing computation time.
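
The sketch below illustrates the underlying idea of turning a LiDAR point cloud into a camera-frame depth image, the kind of input an RGB-D/RGB-L front end consumes. The camera intrinsics and LiDAR-to-camera extrinsics are placeholders rather than a real calibration, and the code is not part of the repository.

```python
# Illustrative sketch only: projecting a LiDAR point cloud into a sparse
# camera-frame depth image. Intrinsics and the LiDAR-to-camera extrinsics
# below are placeholders, not real calibration values.
import numpy as np

H, W = 480, 640
K = np.array([[525.0,   0.0, 320.0],   # placeholder camera intrinsics
              [  0.0, 525.0, 240.0],
              [  0.0,   0.0,   1.0]])
T_cam_lidar = np.eye(4)                # placeholder LiDAR-to-camera extrinsics

def lidar_to_depth_image(points_lidar: np.ndarray) -> np.ndarray:
    """Project Nx3 LiDAR points into a sparse depth image (0 = no measurement)."""
    depth = np.zeros((H, W), dtype=np.float32)
    pts_h = np.hstack([points_lidar, np.ones((len(points_lidar), 1))])
    pts_cam = (T_cam_lidar @ pts_h.T).T[:, :3]
    pts_cam = pts_cam[pts_cam[:, 2] > 0.1]              # keep points in front of the camera
    uv = (K @ pts_cam.T).T
    u = (uv[:, 0] / uv[:, 2]).astype(int)
    v = (uv[:, 1] / uv[:, 2]).astype(int)
    valid = (0 <= u) & (u < W) & (0 <= v) & (v < H)
    depth[v[valid], u[valid]] = pts_cam[valid, 2]       # z in the camera frame
    return depth

# Synthetic point cloud in the (placeholder) LiDAR frame
scan = np.random.uniform([-10, -2, 2], [10, 2, 40], size=(5000, 3))
print(np.count_nonzero(lidar_to_depth_image(scan)), "pixels with depth")
```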

Link to GitHub Repository

Link to Paper