Department of Computer Science
Integrating value-directed compression and belief space analysis for POMDP decomposition
The partially observable Markov decision process (POMDP) is a commonly adopted framework for modeling planning problems in which agents act in a stochastic environment. Obtaining the optimal policy of a POMDP for large-scale problems is known to be intractable, and the high dimensionality of its belief state space is one of the major causes. Compression approaches have recently been shown to be promising in tackling this curse of dimensionality. In this paper, a novel value-directed belief compression technique is proposed, together with a clustering of belief states that further reduces the underlying computational complexity. We first cluster a set of sampled belief states into disjoint partitions and then apply a non-negative matrix factorization (NMF) based projection to each belief state cluster for dimension reduction. The optimal policy is then computed using a point-based value iteration algorithm defined in the low-dimensional projected belief state space. The proposed algorithm has been evaluated on a synthesized navigation problem. Solutions with quality comparable to those of the original POMDP were obtained at a much lower computational cost.
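The compression pipeline described in the abstract (partition sampled beliefs, then reduce each partition with NMF) can be illustrated with a minimal sketch. This is not the authors' implementation: the cluster count, latent rank, and the plain k-means and multiplicative-update NMF routines below are illustrative assumptions, written with NumPy only.

```python
import numpy as np

rng = np.random.default_rng(0)

# Sampled belief states: each row is a probability distribution over |S| = 12 states.
beliefs = rng.dirichlet(np.ones(12), size=60)

def kmeans(X, k, iters=50):
    """Plain k-means; returns a cluster label for each belief (illustrative stand-in)."""
    centers = X[rng.choice(len(X), k, replace=False)].copy()
    for _ in range(iters):
        labels = np.argmin(((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1), axis=1)
        for j in range(k):
            if (labels == j).any():
                centers[j] = X[labels == j].mean(axis=0)
    return labels

def nmf(B, r, iters=200, eps=1e-9):
    """Rank-r NMF, B ~ W @ H, via Lee-Seung multiplicative updates.
    Rows of W are the low-dimensional projected beliefs; H is the basis."""
    W = rng.random((B.shape[0], r))
    H = rng.random((r, B.shape[1]))
    for _ in range(iters):
        H *= (W.T @ B) / (W.T @ W @ H + eps)
        W *= (B @ H.T) / (W @ H @ H.T + eps)
    return W, H

# Step 1: partition the sampled beliefs into disjoint clusters (k = 3 assumed).
labels = kmeans(beliefs, k=3)

# Step 2: compress each cluster separately to r = 4 latent dimensions.
errors = []
for j in range(3):
    cluster = beliefs[labels == j]
    if len(cluster) == 0:
        continue
    W, H = nmf(cluster, r=4)
    errors.append(np.linalg.norm(cluster - W @ H) / np.linalg.norm(cluster))

print("per-cluster relative reconstruction errors:", [round(e, 3) for e in errors])
```

Because NMF keeps both factors non-negative, each row of `W` can still be interpreted as (unnormalized) coordinates of a belief in the compressed space, which is what allows value iteration to be carried out there; the point-based value iteration step itself is omitted here.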
Clustering algorithms, Stochastic processes, Large-scale systems, Partitioning algorithms, State-space methods, Computational efficiency, History, Probability distribution, Intelligent agent, Computer science
Source Publication Title
Proceedings of the IEEE/WIC/ACM International Conference on Intelligent Agent Technology (IAT'06)
Hong Kong, China
Copyright © 2006 by The Institute of Electrical and Electronics Engineers, Inc.
Link to Publisher's Edition
Li, X., Cheung, W., & Liu, J. (2006). Integrating value-directed compression and belief space analysis for POMDP decomposition. Proceedings of the IEEE/WIC/ACM International Conference on Intelligent Agent Technology (IAT'06), 45-51. https://doi.org/10.1109/IAT.2006.81