- Project title: iLIDS4SAM
- FFG "IKT der Zukunft" flagship project (Leitprojekt)
- Project duration: 36 months
- Requested project volume: 5.67 M€ | requested funding volume: 3.49 M€
- Project coordinator: Infineon Technologies Austria AG
- Website: https://www.ilids4sam.at
Current Advanced Driver Assistance Systems (ADAS) focus on comparatively simple scenarios with objects behaving in predictable ways, such as highway traffic or parking assistance without pedestrians, cyclists or crossing traffic. Yet there is an urgent need to extend ADAS and Automated Driving (AD) to handle urban traffic scenarios. The bmvit's Aktionsplan "Automatisiert-vernetzt-mobil" (automated, connected, mobile) specifies seven use cases, three of which are assigned priority. They include urban traffic scenarios and call for 360° awareness and anticipatory hazard recognition. Euro NCAP, too, now defines test cases that involve Vulnerable Road Users (VRUs) such as adult pedestrians, children and cyclists in urban scenarios.
iLIDS4SAM will address this need by developing novel LiDAR-based systems for the predictive assessment of hazardous situations involving VRUs in an urban setting. Improving the field of view and the resolution in parallel requires innovations in all components of LiDAR sensors: a hybrid laser source providing shorter and more intense pulses at a higher repetition rate, a new scanning-mirror and packaging design with a larger mirror area at higher oscillation angles, and a receiver with a larger detector array as well as more efficient and more accurate pulse detection and timing.
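The role of accurate pulse timing can be illustrated with the basic time-of-flight relation behind pulsed LiDAR ranging: the range is half the pulse round-trip time multiplied by the speed of light, so timing jitter translates directly into range uncertainty. The following is a minimal sketch of that relation (illustrative only, not project code; function names are assumptions):

```python
C = 299_792_458.0  # speed of light in vacuum, m/s

def tof_distance(round_trip_s: float) -> float:
    """Range in metres for a measured pulse round-trip time in seconds."""
    return C * round_trip_s / 2.0

def range_resolution(timing_jitter_s: float) -> float:
    """Range uncertainty (m) caused by a given pulse-timing jitter (s)."""
    return C * timing_jitter_s / 2.0

# A target at 100 m returns the pulse after roughly 667 ns;
# 1 ns of timing jitter corresponds to about 15 cm of range error,
# which is why shorter pulses and more accurate timing matter.
```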
The resulting point clouds will be the input for 3D object detection and classification. Objects will be segmented, identified and classified as vehicles, pedestrians, cyclists, stationary objects, etc. using machine-learning algorithms, and deep learning in particular, in a secure, subsidiary data-processing environment. These LiDAR data will subsequently be fused with data from other sensors, e.g. radar sensors and cameras. The hardware for sensor data processing and fusion includes an embedded computer with standard sensor interfaces to support various LiDAR, radar and ultrasound sensors as well as cameras, and to provide the required network connectivity. This will allow comprehensive data collection from all sensors during test runs. The collected data will be labelled and used for training, testing, and evaluation of the object classifiers and algorithms. It is planned to make a subset of the data publicly available to support and foster future research in the area; a data management plan will provide details on how to access these data. Furthermore, the hardware for sensor fusion will serve as a development platform that will also be available for future research projects. In addition, the project plans to contribute to a new ISO standard for standard sensor interfaces in ISO TC22/SC31 "Road vehicles – data communication". The Open Simulation Interface shall be used to pass the fused objects, free-space information, etc. to the scene-understanding algorithms, where objects are tracked and their behavior is predicted to derive a hazard assessment.
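The segmentation-then-classification step described above can be sketched in miniature: group nearby points into clusters, then assign each cluster a coarse class. The sketch below uses greedy single-link clustering on a 2-D bird's-eye-view point set and a simple footprint-size rule as a stand-in for the learned classifiers; all names, thresholds and classes are illustrative assumptions, not project code:

```python
from itertools import combinations

def cluster(points, max_gap=1.0):
    """Single-link clustering via union-find: points closer than
    max_gap (metres) end up in the same cluster."""
    parent = list(range(len(points)))

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]  # path halving
            i = parent[i]
        return i

    for (i, (xi, yi)), (j, (xj, yj)) in combinations(enumerate(points), 2):
        if (xi - xj) ** 2 + (yi - yj) ** 2 <= max_gap ** 2:
            parent[find(i)] = find(j)

    groups = {}
    for i, p in enumerate(points):
        groups.setdefault(find(i), []).append(p)
    return list(groups.values())

def classify(cluster_pts):
    """Very coarse label from the cluster's bounding-box footprint,
    standing in for the deep-learning classifier."""
    xs = [p[0] for p in cluster_pts]
    ys = [p[1] for p in cluster_pts]
    extent = max(max(xs) - min(xs), max(ys) - min(ys))
    if extent > 2.5:
        return "vehicle"
    if extent > 1.2:
        return "cyclist"
    return "pedestrian"
```

In a real pipeline, learned models replace the footprint rule and operate on full 3D clusters, but the segment-then-classify structure is the same.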
For the simulation and validation of driver-assistance and autonomous systems in urban environments, new test and reference systems based on the high-resolution LiDAR sensor will be developed. Finally, several selected use cases, covering road and rail as well as agricultural applications, will be implemented and tested to demonstrate the practical relevance and capability of the approach.
Dr. Andreas Tortschanoff
Staff Scientist Photonic Systems