Document Type
Journal Article
Department/Unit
Department of Mathematics
Title
On the convergence of primal-dual hybrid gradient algorithm
Language
English
Abstract
The primal-dual hybrid gradient algorithm (PDHG) has been widely used, especially for some basic image processing models. In the literature, PDHG’s convergence was established only under some restrictive conditions on its step sizes. In this paper, we revisit PDHG’s convergence in the context of a saddle-point problem and try to better understand how to choose its step sizes. More specifically, we show by an extremely simple example that PDHG is not necessarily convergent even when the step sizes are fixed as tiny constants. We then show that PDHG with constant step sizes is indeed convergent if one of the functions of the saddle-point problem is strongly convex, a condition that does hold for some variational models in imaging. With this additional condition, we also establish a worst-case convergence rate measured by the iteration complexity for PDHG with constant step sizes.
Keywords
Convergence rate, Convex optimization, Image restoration, Primal-dual hybrid gradient algorithm, Saddle-point problem, Total variation
Publication Date
2014
Source Publication Title
SIAM Journal on Imaging Sciences
Volume
7
Issue
4
Start Page
2526
End Page
2537
Publisher
Society for Industrial and Applied Mathematics
DOI
10.1137/140963467
Link to Publisher's Edition
http://dx.doi.org/10.1137/140963467
ISSN (electronic)
1936-4954
APA Citation
He, B., You, Y., & Yuan, X. (2014). On the convergence of primal-dual hybrid gradient algorithm. SIAM Journal on Imaging Sciences, 7(4), 2526–2537. https://doi.org/10.1137/140963467