Posted 2024-11-06 00:00:00 +0000 UTC
According to foreign media reports, AEye has released the first commercially available 2D/3D perception system for autonomous-vehicle sensors. For the first time, basic perception is performed at the edge of the sensor network, so vehicle designers can use the sensors not only to search for and detect objects, but also to classify and track them. This real-time information capability reduces latency and cost, supports functional safety, and complements existing centralized perception software platforms.

(Image source: AEye official website)

The 2D/3D perception system is built on AEye's iDAR™ platform, which supports intelligent, adaptive sensing. The platform takes a biomimetic approach, combining lidar, a fused camera and artificial intelligence so that vehicles can "see" the world more the way humans do and perceive their surroundings. iDAR's Dynamic Vixels, which combine 2D camera data (pixels) with 3D lidar data (voxels), make it the first perception system to take this fusion approach. The software-defined sensor platform lets different sensing modalities complement one another: interpreting camera and lidar data together makes each sensor more capable and provides "informed redundancy" that supports the system's functional safety.

AEye's approach improves the reliability of detection and classification for autonomous vehicles and extends the range at which objects can be detected, classified and tracked. The earlier an object is classified and its trajectory accurately predicted, the more time the vehicle has to brake, steer or accelerate to avoid a collision.

First-generation robotic vision systems tried to capture as much data as possible to solve the challenge of fully autonomous driving, but doing so demands a great deal of time and processing power. Second-generation systems are designed to collect and manage data intelligently and turn it into actionable information. The iDAR platform supports a range of ADAS safety applications, such as collision avoidance, selective autonomy (for example, highway lane changes), and fully autonomous driving in geofenced or open environments.

Engineers can now use the new software-defined sensors for testing without having to wait for the next generation of hardware. The sensor configuration can be adjusted in under a second and its effect simulated to find the optimal setup. The modular design also allows functionality and power consumption to be tailored: for example, using only a smaller laser and no camera to build a targeted ADAS system for under $1,000, or combining short- and long-range lidar, cameras and radar to build a more advanced 360-degree system for under $15,000. Unlike previous generations of sensors, OEMs and Tier 1 suppliers can now move their algorithms onto the new sensors when it suits them.

When the system acquires an object, it uses speed and heading information to validate and classify it, so it can search, detect and classify more quickly and accurately, and predict the object's behavior, including inferring its intent. The result is perception that is more accurate, timely and reliable than traditional sensing solutions, while consuming less power.
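AEye does not publish the internal format of its Dynamic Vixels, so the snippet below is only a minimal sketch of the idea described above: a fused sample that carries camera color (pixel) and lidar geometry (voxel) together with an estimated velocity, which a downstream tracker could use to validate a detection and extrapolate its position. All names and fields here are assumptions for illustration, not AEye's data structures.

```python
from dataclasses import dataclass

@dataclass
class Vixel:
    """Hypothetical fused sample: camera pixel (color) + lidar voxel (position) + motion."""
    x: float   # 3D position from lidar, metres in the vehicle frame
    y: float
    z: float
    r: int     # color from the co-registered camera pixel
    g: int
    b: int
    vx: float  # estimated velocity relative to the ego vehicle, m/s
    vy: float

def predict_position(v: Vixel, dt: float) -> tuple[float, float]:
    """Constant-velocity extrapolation: where the sample will be after dt seconds."""
    return (v.x + v.vx * dt, v.y + v.vy * dt)

# Example: a return 20 m ahead, closing at 2 m/s and drifting left at 1.5 m/s,
# extrapolated half a second into the future.
sample = Vixel(x=20.0, y=0.0, z=0.5, r=200, g=30, b=30, vx=-2.0, vy=1.5)
print(predict_position(sample, dt=0.5))  # -> (19.0, 0.75)
```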
The 2D/3D perception system, built on AEye's iDAR platform, exposes its perception capabilities through a software reference library, which includes the following functions:

Detection: recognizing objects, such as cars and pedestrians, in 3D point clouds and camera images.
3D bounding boxes: estimating an object's center, width, height and depth and generating a 3D bounding box around it (a sketch of this step follows the list).
Classification: identifying the type of each detected object, which helps in understanding how it is likely to move.
Segmentation: classifying every point in the scene and identifying which object it belongs to; this is especially important for picking out finer details such as lane markings.
Tracking: following objects through space and time, which helps track objects that may cross the vehicle's path.
Orientation: determining an object's position and heading relative to the vehicle, helping the vehicle understand the surrounding scene.
Velocity: using lidar to measure how fast, and in which direction, an object is moving relative to the vehicle, which helps predict its motion.
Prediction: estimating where an object will be at future points in time, so the vehicle can assess collision risk and plan a safe path.
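The article gives no implementation detail for these library functions, but the geometry behind two of them is simple to illustrate. The following Python sketch, which is not AEye's API and uses entirely hypothetical names and values, shows how a 3D bounding box (center plus width, depth and height) can be derived from a clustered group of lidar points, and how a constant-velocity forecast of the object's center could feed a collision-risk check.

```python
import numpy as np

def bounding_box_3d(points: np.ndarray):
    """Axis-aligned 3D box for one clustered object: center plus (width, depth, height)."""
    mins, maxs = points.min(axis=0), points.max(axis=0)
    return (mins + maxs) / 2.0, maxs - mins

def future_centers(center: np.ndarray, velocity: np.ndarray,
                   horizon_s: float, step_s: float = 0.1) -> np.ndarray:
    """Constant-velocity forecast of the object's center over the planning horizon."""
    steps = np.arange(step_s, horizon_s + step_s, step_s)
    return center + steps[:, None] * velocity

# A few lidar returns assigned to one object by segmentation/tracking (x, y, z in metres).
cluster = np.array([[10.2, 1.0, 0.4],
                    [10.8, 1.3, 0.9],
                    [10.5, 1.1, 1.6]])
center, size = bounding_box_3d(cluster)
print("center:", center, "size:", size)

# Where the object's center would be about one second from now if it keeps
# moving toward the ego lane at the assumed velocity.
print("in ~1 s:", future_centers(center, velocity=np.array([-0.5, -2.0, 0.0]), horizon_s=1.0)[-1])
```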