In autonomous driving, environment perception is the fundamental task for intelligent vehicles: it provides the environmental information required by all downstream applications. The main issues in existing environment perception fall into two categories. On the one hand, all sensors are prone to measurement errors and failures. On the other hand, in complex driving environments, vehicles encounter a variety of blind spots caused by occlusions, overlaps, and harsh weather, which cause sensors to produce low-quality data or to miss crucial environmental information. To cope with these issues, a multivehicle and multisensor (MVMS) cooperative perception method is presented that constructs an occupancy grid map (OGM) from a global view for the environment perception of autonomous driving. Distinct from existing environment perception methods, the proposed MVMS-OGM not only provides continuous geographical information but also captures and fuses that information as soft occupancy probabilities, yielding more comprehensive and less-processed environmental information. Simulations and real-world experiments demonstrate that the proposed approach both expands the perception range compared with single-vehicle sensing and better captures the uncertainty of sensor data by fusing occupancy probabilities as soft information.
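The abstract does not specify the fusion rule, but fusing per-vehicle soft occupancy probabilities into one grid is commonly done with a Bayesian log-odds (independent opinion pool) update. The sketch below is a minimal, hypothetical illustration of that idea, assuming each vehicle contributes an aligned grid of occupancy probabilities in [0, 1]; the function name `fuse_ogms` and the uniform prior are assumptions, not the paper's implementation.

```python
import numpy as np

def log_odds(p):
    """Convert an occupancy probability to its log-odds representation."""
    return np.log(p / (1.0 - p))

def fuse_ogms(grids, prior=0.5, eps=1e-6):
    """Fuse aligned per-vehicle occupancy grids of soft probabilities
    by summing their log-odds relative to a common prior, a standard
    Bayesian rule under an independence assumption between vehicles."""
    fused = np.zeros_like(np.asarray(grids[0], dtype=float))
    for g in grids:
        g = np.clip(np.asarray(g, dtype=float), eps, 1.0 - eps)  # avoid +/-inf
        fused += log_odds(g) - log_odds(prior)
    # Convert the accumulated log-odds back to a probability grid.
    return 1.0 / (1.0 + np.exp(-fused))
```

With this rule, cells where vehicles agree are reinforced (two independent 0.8 readings fuse to about 0.94), while a cell reported at the 0.5 prior by every vehicle stays at 0.5, so uncertain observations neither inflate nor suppress the fused map.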