Last year, OTSL released the Advanced Millimeter Wave Radar Simulator (AMMWR Simulator) and the Advanced Laser Radar/LIDAR Simulator (ALR Simulator), entering the business of real-time simulators for autonomous driving.
OTSL now adds the Advanced Infrared/uBolometer Simulator (AIRB Simulator), the Advanced Camera Image Sensor Simulator (ACIS Simulator), and the Advanced Ultrasonic Simulator (AUS Simulator) to its product lineup, offering a total of five simulators that cover all the sensor systems used in autonomous driving.
AIRB Simulator superimposes far-infrared (temperature) data for every road object on a 3D map and visualizes them in real time, while the objects move freely in any direction and at any speed. Because infrared/uBolometer sensors can recognize objects at night and are hardly affected by bad weather such as rain and snow, they are expected to be adopted mainly in Europe and America, where lighting infrastructure on expressways is often underdeveloped.
AIRB Simulator derives the material properties of objects on a 3D map from their molecular structures and performs sophisticated real-time calculations to bring quantities such as absorption spectra, thermal radiation based on the blackbody radiation model, and temperature rises caused by solar radiation as close as possible to real far-infrared (temperature) data. AIRB Simulator is the world's first infrared sensor simulator for autonomous driving that enables dynamic real-time simulation.
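The blackbody-radiation step mentioned above can be sketched with the standard physics: Planck's law gives the spectral radiance at a given wavelength and temperature, and the Stefan-Boltzmann law gives the total emitted power. The sketch below is a generic illustration of that model, not OTSL's actual implementation; the function names and the asphalt emissivity value are assumptions.

```python
import math

# Physical constants (SI units, CODATA values)
H = 6.62607015e-34      # Planck constant, J*s
C = 2.99792458e8        # speed of light, m/s
K_B = 1.380649e-23      # Boltzmann constant, J/K
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W/(m^2*K^4)

def planck_spectral_radiance(wavelength_m: float, temperature_k: float) -> float:
    """Blackbody spectral radiance B(lambda, T) in W / (m^2 * sr * m)."""
    a = 2.0 * H * C ** 2 / wavelength_m ** 5
    b = H * C / (wavelength_m * K_B * temperature_k)
    return a / (math.exp(b) - 1.0)

def radiated_power(temperature_k: float, emissivity: float = 1.0) -> float:
    """Total emitted power per unit area (Stefan-Boltzmann law), W/m^2."""
    return emissivity * SIGMA * temperature_k ** 4

# A road surface near 300 K emits most strongly in the 8-14 um band
# that microbolometer sensors detect (Wien peak ~9.7 um at 300 K).
peak = planck_spectral_radiance(9.7e-6, 300.0)
power = radiated_power(300.0, emissivity=0.95)  # assumed asphalt-like emissivity
```

Rendering each object's temperature through a model like this is what lets an infrared simulator produce physically plausible images rather than repainted camera frames.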
ACIS Simulator enables autonomous-driving simulation not only of the camera itself but also of its lens configuration. Because camera-based autonomous driving technology can be implemented at low cost, it is an effective solution in Japan and other regions where road conditions are relatively good thanks to widespread night-lighting infrastructure.
However, common camera simulators for autonomous driving do not let you configure the detailed optical characteristics that differ from lens to lens, such as ghosting, where light reflected at the lens surfaces appears in the image, or distortion, where the image is deformed by refraction inside the lens. As a result, simulated images remain different from real-world images.
ACIS Simulator lets you configure per-lens optical characteristics such as aberration and depth of field, as well as lens settings such as enabling or disabling the anti-reflection coating, allowing a more accurate real-time simulation.
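Per-lens distortion of the kind described above is commonly modeled with radial distortion coefficients, as in the Brown-Conrady model used by camera-calibration tools. The sketch below illustrates that general technique; it is an assumption for illustration, not ACIS Simulator's actual lens model.

```python
def apply_radial_distortion(x: float, y: float, k1: float, k2: float) -> tuple:
    """Distort normalized image coordinates (x, y) with a two-term
    radial polynomial (Brown-Conrady style): r' = r * (1 + k1*r^2 + k2*r^4)."""
    r2 = x * x + y * y
    factor = 1.0 + k1 * r2 + k2 * r2 * r2
    return x * factor, y * factor

# k1 < 0 gives barrel distortion (points pulled toward the image center);
# k1 > 0 gives pincushion distortion (points pushed outward).
xd, yd = apply_radial_distortion(0.5, 0.5, k1=-0.2, k2=0.05)
```

A simulator that exposes coefficients like these per lens can reproduce the characteristic deformation of each real lens instead of assuming an ideal pinhole camera.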
AUS Simulator is designed for the ultrasonic sensors mainly used in autonomous parking and parking-assist technology. Because the number of sensors, their mounting height and angle, the detection capability as affected by the material of the mounting location and the mounting method, and the ultrasonic irradiation angle and intensity can all be set freely, it enables real-time simulation of various situations in autonomous driving as well as parking.
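At its core, ultrasonic ranging of the kind these sensors perform is time-of-flight: distance equals the speed of sound times half the round-trip echo time, with a cone-shaped sensing area set by the irradiation angle. A minimal sketch of that geometry follows; the function names and the linear speed-of-sound approximation are illustrative assumptions, not AUS Simulator's internals.

```python
import math

def speed_of_sound(temp_c: float) -> float:
    """Approximate speed of sound in air (m/s) at a given air temperature."""
    return 331.3 + 0.606 * temp_c

def echo_distance(round_trip_s: float, temp_c: float = 20.0) -> float:
    """Distance to an obstacle from the round-trip echo time (seconds)."""
    return speed_of_sound(temp_c) * round_trip_s / 2.0

def in_detection_cone(dx: float, dy: float, half_angle_deg: float) -> bool:
    """Check whether a point (dx forward, dy lateral from the sensor axis)
    lies inside the sensor's irradiation cone."""
    return abs(math.degrees(math.atan2(dy, dx))) <= half_angle_deg

# A 6 ms echo at 20 C corresponds to roughly 1.03 m.
d = echo_distance(0.006)
```

Sweeping parameters like the cone half-angle and mounting pose over many simulated scenes is what replaces trial-and-error placement of physical sensors on a test vehicle.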
OTSL’s 3D real-time sensor simulator lineup for autonomous driving, COSMOsim, is the world’s only platform that can run all five simulators (millimeter-wave radar, LIDAR, camera, infrared/uBolometer, and ultrasonic) simultaneously in real time on a single screen. Automotive manufacturers can simulate driving situations with sensor-based models, check the recognition and control of autonomous driving, and efficiently verify sensor mounting positions on vehicles, eliminating the need for test drives with real vehicles.
System component suppliers (vehicle sensor manufacturers) can review design parameters and check detection range and sensing area more efficiently by visualizing the behavior of vehicle sensors. Semiconductor manufacturers developing sensor devices can model a device under development and verify it through high-speed simulation.
Source: Press Release