Driving through dense urban environments is difficult for autonomous vehicles: they must reason about the unknown intentions of many road users while also coping with uncertain information such as sensor noise and inaccurate predictions. The partially observable Markov decision process (POMDP) provides a principled framework for planning optimal policies in stochastic environments, but solving a POMDP in scenarios with many road users requires significant computational effort. In this paper, we propose a scalable online POMDP behavior planner for autonomous driving in dense urban environments. We enable intention-aware POMDP planning under uncertainty by using Multi-step Occupancy Grid Maps (MOGM) to represent the current and predicted states of surrounding road users, together with their uncertain intentions. Furthermore, MOGM yields a more computationally efficient POMDP model by condensing the state space and reducing the number of calculations needed for collision checks. Numerical experiments show that our approach is computationally more efficient, and we demonstrate that it can navigate a dense urban scenario involving a large number of road users.
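To illustrate why a grid-based representation reduces the cost of collision checking, consider the following minimal sketch. It is not the paper's implementation: the array layout, the per-step independence assumption, and the function `collision_probability` are all illustrative. The key point is that once predicted road users are rasterized into a multi-step occupancy grid, checking an ego trajectory costs one grid lookup per time step, independent of the number of road users.

```python
import numpy as np

def collision_probability(mogm, trajectory, resolution=1.0):
    """Hypothetical MOGM-style collision check.

    mogm: array of shape (T, H, W); mogm[t, i, j] is the probability
          that cell (i, j) is occupied by some road user at step t.
    trajectory: list of (x, y) ego positions, one per time step.
    Returns the probability of at least one collision, assuming
    independence across time steps (a simplifying assumption).
    """
    p_free = 1.0
    for t, (x, y) in enumerate(trajectory):
        i, j = int(y / resolution), int(x / resolution)
        p_free *= 1.0 - mogm[t, i, j]  # one lookup per step
    return 1.0 - p_free

# Example: a 3-step, 4x4 grid with one cell likely occupied at t=1.
mogm = np.zeros((3, 4, 4))
mogm[1, 2, 2] = 0.5  # some road user predicted near cell (2, 2) at t=1
traj = [(1.0, 1.0), (2.0, 2.0), (3.0, 3.0)]
print(collision_probability(mogm, traj))  # 0.5 for this trajectory
```

Because the grid aggregates all road users into a single occupancy field per time step, the cost of evaluating a candidate trajectory no longer scales with the number of agents, which is the property the abstract attributes to the MOGM-based model.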