Author

Shangzhi Zeng

Year of Award

10-30-2017

Degree Type

Thesis

Degree Name

Master of Philosophy (MPhil)

Department

Department of Mathematics

Principal Supervisor

Yuan, Xiaoming

Keywords

Convex programming;Mathematical optimization

Language

English

Abstract

Error bound conditions have been widely used in the literature to study the linear convergence rates of various first-order algorithms, and most of this work focuses on sufficient conditions that guarantee these error bounds, usually by imposing additional assumptions on the model under discussion. This thesis focuses on the alternating direction method of multipliers (ADMM) and shows that the known error bound conditions for establishing ADMM's linear convergence can be further weakened if the error bound is studied over the specific iterative sequence generated by ADMM. A so-called partial error bound condition, tailored to ADMM's iterative scheme and weaker than the known error bound conditions in the literature, is thus proposed to derive the linear convergence of ADMM. We further show that this partial error bound condition theoretically explains the difference observed when the two primal variables are updated in different orders in implementing ADMM, a phenomenon that had been reported empirically in the literature but for which no theoretical explanation was previously known.
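To make the update-order question concrete, here is a minimal sketch of ADMM on a toy separable problem, min (x-a)^2 + (z-b)^2 subject to x = z, written so the two primal variables can be updated in either order. The toy problem, the function name `admm`, and all parameter choices are illustrative assumptions, not taken from the thesis.

```python
# Minimal ADMM sketch for: min (x-a)^2 + (z-b)^2  s.t.  x = z.
# Both subproblems have closed-form solutions, so each update is one line.
# The x_first flag swaps the order in which the two primal variables
# are updated, the scenario whose convergence behavior the thesis analyzes.

def admm(a, b, rho=1.0, iters=200, x_first=True):
    x = z = u = 0.0  # primal variables and scaled dual variable
    for _ in range(iters):
        if x_first:
            # x-update: argmin (x-a)^2 + (rho/2)(x - z + u)^2
            x = (2 * a + rho * (z - u)) / (2 + rho)
            # z-update: argmin (z-b)^2 + (rho/2)(x - z + u)^2
            z = (2 * b + rho * (x + u)) / (2 + rho)
        else:
            # same two updates, performed in the opposite order
            z = (2 * b + rho * (x + u)) / (2 + rho)
            x = (2 * a + rho * (z - u)) / (2 + rho)
        # scaled dual update on the residual x - z
        u = u + x - z
    return x, z
```

On this strongly convex toy problem both orders converge linearly to the same solution x = z = (a + b)/2; the thesis's point is that, in less regular settings, the order of the two primal updates can matter, and the partial error bound condition accounts for that difference.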

Comments

Thesis submitted to the Department of Mathematics. Principal supervisor: Professor Yuan Xiaoming. Thesis (M.Phil.)--Hong Kong Baptist University, 2017.

Bibliography

Includes bibliographical references (pages 51-55).

Available for download on Monday, April 06, 2020


