
AEye’s Perception Software Runs Inside the Sensors of Autonomous Vehicles

AEye has made commercially available perception software designed to run inside the sensors of autonomous vehicles.

This 2D/3D perception system enables basic perception to be distributed to the edge of the sensor network. It allows autonomous vehicle designers to use sensors not only to search for and detect objects, but also to acquire them, and ultimately to classify and track them. Collecting this information in real time both enables and enhances existing centralized perception software platforms by reducing latency, lowering costs and supporting functional safety.
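
To illustrate the bandwidth and latency argument, here is a minimal, hypothetical sketch in Python. The types, names and byte counts are assumptions for illustration only, not AEye’s actual interfaces: the point is that an in-sensor pipeline can emit compact object-level detections instead of streaming every raw point to a central perception computer.

```python
# Hypothetical illustration only: these types, names and byte counts are
# assumptions, not AEye's actual interfaces. The point is the data-volume
# difference between shipping raw points and shipping object summaries.
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class Detection:
    """Compact object-level output produced inside the sensor."""
    centroid: Tuple[float, float, float]  # meters, sensor frame
    size: Tuple[float, float, float]      # width, height, depth (m)
    label: str                            # e.g. "car", "pedestrian"
    confidence: float

def centralized_frame_bytes(num_points: int) -> int:
    # A raw LiDAR frame ships every point (x, y, z, intensity, each a
    # 4-byte float) to the central perception computer.
    return num_points * 4 * 4

def edge_frame_bytes(detections: List[Detection]) -> int:
    # An in-sensor pipeline ships only object-level summaries, assumed
    # here to serialize to roughly 64 bytes per object.
    return len(detections) * 64

# Rough comparison: a 100,000-point frame versus 20 detected objects.
print(centralized_frame_bytes(100_000))                # 1,600,000 bytes
print(edge_frame_bytes([Detection((0.0, 0.0, 0.0),
                                  (1.0, 1.0, 1.0),
                                  "car", 0.9)] * 20))  # 1,280 bytes
```

Even under these rough assumptions, the object-level output is roughly three orders of magnitude smaller per frame, which is the basis of the latency and cost claims above.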

This in-sensor perception system is intended to accelerate the availability of autonomous features in vehicles across all SAE levels of driving automation, allowing automakers to enable the right amount of autonomy for any desired use case, including edge cases, in essence providing autonomy “on demand” for ADAS, mobility and adjacent markets.

The perception system is based on AEye’s iDAR platform, whose perception advancements the company will make broadly available via a software reference library. That library includes the following features, which will be resident in AEye’s AE110 (Mobility) and AE200 (ADAS) sensors:

• Detection: Identification of objects (e.g., cars, pedestrians) in the 3D point cloud and camera imagery. The system accurately estimates each object’s centroid, width, height and depth to generate a 3D bounding box for it.

• Classification: Determining the type of each detected object. This helps in further understanding the object’s motion characteristics.

• Segmentation: Further classifying each point in the scene to identify the specific object it belongs to. This is especially important for accurately identifying finer details, such as lane divider markings on the road.

• Tracking: Tracking objects through space and time. This helps keep track of objects that could intersect the vehicle’s path.

• Range/Orientation: Identifying where the object is relative to the vehicle, and how it’s oriented relative to the vehicle. This helps the vehicle contextualize the scene around it.

• True Velocity: Leveraging the benefits of agile LiDAR to capture the speed and direction of the object’s motion relative to the vehicle. This provides the foundation for motion forecasting.

• Motion Forecasting: Forecasting where the object will be at different times in the future. This helps the vehicle assess the risk of collision and chart a safe course. (A simplified sketch of several of these stages follows this list.)
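
To make the detection, tracking, true velocity and motion forecasting stages concrete, here is a minimal sketch. Every class and field name is invented for illustration; AEye has not published the reference library’s interfaces, and the forecast shown is a simple constant-velocity model standing in for whatever the library actually uses.

```python
# Hypothetical sketch of the listed stages; all names are invented.
from dataclasses import dataclass
from typing import Tuple

Vec3 = Tuple[float, float, float]

@dataclass
class BoundingBox3D:
    centroid: Vec3   # Detection/Range: estimated center, vehicle frame (m)
    size: Vec3       # width, height, depth (m)
    yaw: float       # Orientation: heading relative to the vehicle (rad)
    label: str       # Classification: e.g. "car", "pedestrian"

@dataclass
class Track:
    box: BoundingBox3D
    velocity: Vec3   # True Velocity: motion relative to the vehicle (m/s)

def estimate_velocity(prev: Vec3, curr: Vec3, dt: float) -> Vec3:
    """Tracking: finite-difference velocity between two tracked centroids."""
    return tuple((c - p) / dt for p, c in zip(prev, curr))

def forecast(track: Track, horizon_s: float) -> Vec3:
    """Motion Forecasting: constant-velocity position `horizon_s` ahead."""
    return tuple(c + v * horizon_s
                 for c, v in zip(track.box.centroid, track.velocity))

# Example: a car detected 20 m ahead that has drifted toward the ego lane
# between two frames 0.1 s apart.
box = BoundingBox3D(centroid=(20.0, 3.0, 0.0), size=(4.5, 1.8, 1.5),
                    yaw=0.1, label="car")
track = Track(box, estimate_velocity((20.5, 3.4, 0.0), box.centroid, dt=0.1))
print(track.velocity)        # approximately (-5.0, -4.0, 0.0) m/s
print(forecast(track, 1.0))  # predicted centroid in 1 s: about (15.0, -1.0, 0.0)
```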

AEye’s achievement is the result of its flexible iDAR platform, which enables intelligent and adaptive sensing. The iDAR platform is based on biomimicry, replicating the elegant perception design of human vision through a combination of agile LiDAR, fused camera and artificial intelligence. The system takes a fused approach to perception, leveraging iDAR’s Dynamic Vixels, which combine 2D camera data (pixels) with 3D LiDAR data (voxels) inside the sensor. This unique software-definable perception platform allows disparate sensor modalities to complement each other, enabling the camera and LiDAR to work together to make each sensor more powerful, while providing “informed redundancy” that helps ensure a functionally safe system.
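
As a rough illustration of the Dynamic Vixel idea, the hypothetical sketch below projects a 3D LiDAR point into a camera image with a standard pinhole model and attaches the pixel’s color to it, yielding an element that carries both geometry and appearance. The camera intrinsics are made up for the example, and AEye’s actual in-sensor fusion is not publicly specified.

```python
# Hypothetical sketch of the Dynamic Vixel idea: attach a camera pixel's
# color to a LiDAR point via a standard pinhole projection. The intrinsic
# matrix K is made up; AEye's in-sensor fusion is not publicly specified.
import numpy as np

# Assumed pinhole intrinsics: focal lengths fx = fy = 1000 px, principal
# point at the center of a 1280x720 image.
K = np.array([[1000.0,    0.0, 640.0],
              [   0.0, 1000.0, 360.0],
              [   0.0,    0.0,   1.0]])

def fuse_point(point_xyz: np.ndarray, image: np.ndarray):
    """Return (x, y, z, r, g, b) if the LiDAR point is visible in the image."""
    x, y, z = point_xyz
    if z <= 0:                       # behind the camera
        return None
    u, v, w = K @ point_xyz          # project to homogeneous pixel coords
    col, row = int(u / w), int(v / w)
    height, width = image.shape[:2]
    if not (0 <= row < height and 0 <= col < width):
        return None                  # projects outside the frame
    r, g, b = image[row, col]
    return (x, y, z, int(r), int(g), int(b))

# Example: one point 10 m ahead and 1 m to the right, in a blank 720p frame.
frame = np.zeros((720, 1280, 3), dtype=np.uint8)
print(fuse_point(np.array([1.0, 0.0, 10.0]), frame))
```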

Source: Press Release
