AEye has introduced a new sensor data type called “Dynamic Vixels”, which are designed to more intelligently acquire and adapt data for the company’s iDAR (Intelligent Detection and Ranging) perception system.
In simple terms, Dynamic Vixels combine pixels from digital 2D cameras with voxels from AEye’s Agile 3D LiDAR (Light Detection and Ranging) sensor into a single super-resolution sensor data type.
The pixel and voxel data are integrated in real time into a data type that can be dynamically controlled and optimized by the artificial perception system at the point of data acquisition.
This advancement in AEye technology further strengthens its biomimicry approach to visual perception, essentially enabling vehicles to see and perceive more like humans to better evaluate potential driving hazards and adapt to changing conditions.
Dynamic Vixels create content that can both be evaluated with the entire existing library of 2D computer vision algorithms and capture 3D and 4D data, covering not only location and intensity but also deeper insights such as the velocity of objects.
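AEye has not published the internal layout of a Dynamic Vixel, but the article's description suggests a fused record pairing a camera pixel with its corresponding LiDAR return. The following is a minimal illustrative sketch under that assumption; every field name here is hypothetical, not AEye's actual format.

```python
from dataclasses import dataclass

@dataclass
class DynamicVixel:
    """Hypothetical fused sample: one 2D camera pixel aligned with one
    3D LiDAR voxel. All field names are illustrative assumptions."""
    u: int             # image column (2D pixel coordinate)
    v: int             # image row
    rgb: tuple         # camera color, usable by existing 2D CV algorithms
    x: float           # 3D position from the LiDAR return (meters)
    y: float
    z: float
    intensity: float   # LiDAR return intensity
    velocity: float    # radial velocity estimate (the "4D" component)

# Example: a fused sample for a point 12.5 m ahead, closing at 1.4 m/s
sample = DynamicVixel(u=640, v=360, rgb=(128, 64, 32),
                      x=12.5, y=0.2, z=1.1,
                      intensity=0.83, velocity=-1.4)
```

Keeping the 2D and 3D measurements in one record is what lets a downstream classifier run color-based vision algorithms and range/velocity reasoning over the same object without a separate association step.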
AEye’s iDAR perception system mimics how a human’s visual cortex evaluates a scene and calculates potential driving hazards. Using embedded artificial intelligence within a distributed architecture, iDAR employs Dynamic Vixels to critically and actively assess general surroundings to maintain situational awareness, while simultaneously tracking targets and objects of interest.
As a core data element for a scalable, integrated system, Dynamic Vixels enable iDAR to act reflexively, delivering more accurate, longer-range, and more intelligent information faster. Dynamic Vixels can also be encrypted.
Simply put, this new way of collecting and inspecting data, using the iDAR system’s at-the-edge processing, enables an autonomous vehicle to more intelligently assess and respond to situational changes within a frame, thereby increasing the safety and efficiency of the overall system.
Now iDAR can identify and differentiate objects of the same color, a capability that can be leveraged to detect changing weather and automatically increase sensor power during fog, rain, or snow.
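The article does not describe how the power adjustment is controlled; a minimal sketch of weather-conditioned power scaling might look like the following, where the scale factors, weather labels, and safety cap are all illustrative assumptions rather than AEye's values.

```python
# Hypothetical weather-adaptive LiDAR power control.
# Scale factors and the cap below are illustrative assumptions.
POWER_SCALE = {
    "clear": 1.0,
    "rain": 1.5,
    "fog": 2.0,
    "snow": 2.0,
}

def laser_power(base_watts: float, weather: str) -> float:
    """Scale emitted power up in degraded weather, capped at an
    assumed eye-safety / hardware limit."""
    MAX_WATTS = 4.0  # assumed cap, not a published AEye figure
    return min(base_watts * POWER_SCALE.get(weather, 1.0), MAX_WATTS)

print(laser_power(1.2, "fog"))    # doubled power in fog
print(laser_power(1.2, "clear"))  # unchanged in clear conditions
```

The cap matters in practice: any real system would bound emitted power by eye-safety regulations regardless of how degraded the weather is.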
iDAR’s heightened sensory perception allows an autonomous vehicle to detect contextual changes, such as the direction a child is facing, calculate the probability of the child stepping out onto the street, and prepare the car to stop accordingly.
The iDAR perception system includes inventions covered by recently awarded foundational patents, including 71 intellectual property claims on the definition, data structure, and evaluation methods of Dynamic Vixels. These patented inventions contribute to significant performance benefits, including 16x greater coverage, a 10x faster frame rate, and 7-10x more relevant information that boosts object classification accuracy while using 8-10x less power.
AEye’s first iDAR-based product, the AE100 artificial perception system, will be available this summer to OEMs and Tier 1s launching autonomous vehicle initiatives.