Time-of-Flight camera simulation
Silicon Austria Labs
High Tech Campus Villach
9524 Villach, Austria
Lidar sensors are widely used for reliable 3D perception and classification of the environment. Locally resolved brightness and distance information is obtained either by point-by-point scanning or by directly imaging the whole scene onto a detector array (Time-of-Flight camera). The modelling of such a direct-imaging sensor together with a 3D scene in OpticStudio is presented. OpticStudio’s raytracing capabilities allow a detailed analysis of various multipath interference effects, i.e. arrangements where a ray can take several paths within the scene before arriving at the detector. Python scripts are used for further raytracing data analysis and for the per-pixel distance calculation. The developed model is finally used to investigate how environmental effects, namely rain and fog, influence the sensor’s distance accuracy.
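The per-pixel distance calculation mentioned above can be illustrated with a minimal sketch. The abstract does not specify the authors' actual algorithm, so the following is an assumption-based example: each traced ray's total source-to-detector path length corresponds to a round trip, and when several multipath rays hit the same pixel their apparent distances are combined as an intensity-weighted mean, which mimics the distance bias that multipath interference introduces. The function name and inputs are hypothetical, not taken from the paper.

```python
import numpy as np


def per_pixel_distance(path_lengths, intensities):
    """Estimate the distance reported by one ToF pixel from raytrace data.

    path_lengths -- total source-to-detector path length of each ray [m]
    intensities  -- detected intensity of each ray (arbitrary units)

    A ToF camera measures round-trip delay, so a single ray's apparent
    distance is half its total path length. Multiple rays on one pixel
    (multipath) are merged here as an intensity-weighted average -- an
    illustrative assumption, not the paper's actual reconstruction.
    """
    path_lengths = np.asarray(path_lengths, dtype=float)
    intensities = np.asarray(intensities, dtype=float)
    distances = path_lengths / 2.0  # round trip -> one-way distance
    return np.average(distances, weights=intensities)


# Direct return from a target at 4 m plus a weaker multipath return
# travelling 10 m in total: the estimate is biased beyond 4 m.
d = per_pixel_distance([8.0, 10.0], [1.0, 0.25])
```

In this toy case the weighted mean is (4.0·1.0 + 5.0·0.25) / 1.25 = 4.2 m, showing how even a weak secondary path shifts the measured distance away from the true 4 m.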