Our team has been active in a wide range of research projects, including autonomous vehicle benchmarking, road user trajectory forecasting, simulation environments for autonomous driving, and automatic calibration. At the moment, we have open positions for undergraduate and graduate students in the areas of simulation, real-time dynamic planning, hardware synchronization of sensors, edge computing, and robust sensor fusion using radars, lidars, and vision. To learn more or get involved in our research, feel free to contact us.
Motivated by recent developments in semantic scene modeling, we are exploring dynamic methods for trajectory generation to address the scalability constraints of existing HD maps. Our approach aims to tackle intersection navigation and settings in which multi-modal trajectory generation is required. Learn more about the approach and our data.
During early mail-delivery deployment missions, AVL logged vehicle control and state signals to characterize overall system performance and robustness. This work has been submitted for publication and showcases appropriate metrics for benchmarking autonomous vehicles. We expect that these tools will raise awareness of the performance of state-of-the-art autonomous vehicle technology, helping the community better understand the shortcomings of today's systems and collectively design better performing ones.
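As an illustration of the kind of metric such logs support, the sketch below computes cross-track error, i.e. the distance between each logged vehicle position and the nearest point on a reference path. This is a generic benchmarking metric, not necessarily one of the metrics in the submitted work, and the signal names are hypothetical.

```python
import math

def cross_track_errors(logged_xy, path_xy):
    """For each logged (x, y) position, return the distance to the
    nearest point on the reference path (a list of (x, y) waypoints)."""
    return [min(math.dist(p, q) for q in path_xy) for p in logged_xy]

# Example: a vehicle log slightly offset from a straight reference path.
reference = [(float(i), 0.0) for i in range(10)]
log = [(0.0, 0.2), (1.0, 0.1), (2.0, -0.3)]
errors = cross_track_errors(log, reference)
max_error = max(errors)  # worst-case lateral deviation over the run
```

Summary statistics of such errors (mean, max, percentiles) give a compact picture of path-tracking performance across a deployment.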
To achieve autonomy for self-driving vehicles, it is essential to understand the intent of other road users operating in proximity to the vehicle, such as passing through an intersection, queuing, or navigating a crosswalk. While many methods have been proposed for detecting and tracking cars and pedestrians over time, less effort has been devoted to recognizing the intent of other road users. Will the pedestrian cross the road? Will the driver stop at the intersection or turn left? These are the kinds of problems that the AVL at UC San Diego is actively exploring and addressing.
While High-Definition (HD) and dense point cloud maps considerably facilitate navigation and path tracking of autonomous cars, these maps often hinder generalization and scalability. To address these constraints, AVL is currently exploring alternative architectures that leverage semantic information extracted in real-time to characterize environments dynamically.
Using aerial imagery, elevation data, and point cloud maps of the UCSD campus, AVL is building a simulation environment with realistic 3D scenes for simulating and testing automated driving systems.
For intelligent vehicle applications, calibration is often an important component of sensor fusion, depth estimation, and scene understanding. However, in many scenarios, the calibration parameters drift over time as a result of temperature changes and vibration. For this reason, we are actively developing tools and methods that leverage road furniture and geometric shapes, such as stop signs, to dynamically calibrate the cameras on board our vehicles in real time. Learn more about the approach and our data.
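To give a flavor of how a known roadside shape can reveal calibration drift, the sketch below uses a simple pinhole-camera relation: if a stop sign of known physical width appears with a measured pixel width at a known range (e.g. from lidar), the implied focal length can be recovered and compared against the nominal calibration. This is a minimal illustration under an assumed pinhole model, not AVL's actual method; the nominal sign width and signal names are assumptions.

```python
def estimate_focal_length(pixel_width, range_m, real_width_m=0.75):
    """Pinhole model: pixel_width = f * real_width / range,
    so f = pixel_width * range / real_width (in pixels).
    real_width_m defaults to an assumed nominal stop-sign width."""
    return pixel_width * range_m / real_width_m

def drift_ratio(estimated_f, nominal_f):
    """Relative deviation of the estimated focal length from
    the nominal calibration; large values suggest recalibration."""
    return abs(estimated_f - nominal_f) / nominal_f

# Example: a 0.75 m wide sign measured 100 px wide at 7.5 m range.
f_est = estimate_focal_length(pixel_width=100.0, range_m=7.5)
drift = drift_ratio(f_est, nominal_f=1000.0)
```

A real system would aggregate many such observations and solve for full intrinsics and extrinsics (e.g. via a PnP formulation over detected sign corners), but the drift check above captures the core idea of using known road geometry as a calibration reference.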
© 2020 Autonomous Vehicle Laboratory