A Smoothing Proximal Gradient Algorithm For Matrix Rank Minimization Problem

Abstract

In this paper, we study the low-rank matrix minimization problem, where the loss function is convex but nonsmooth and the penalty term is defined by the cardinality function. We first introduce an exact continuous relaxation, meaning that both problems have the same minimizers and the same optimal value. In particular, we introduce a class of lifted stationary points of the relaxed problem and show that any local minimizer of the relaxed problem must be a lifted stationary point. In addition, we derive a lower bound property for the nonzero singular values of lifted stationary points, and hence also of local minimizers of the relaxed problem. We then propose the smoothing proximal gradient (SPG) algorithm to find a lifted stationary point of the continuous relaxation model and show that any accumulation point of the sequence generated by the SPG algorithm is a lifted stationary point. Finally, numerical examples demonstrate the efficiency of the SPG algorithm.
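The abstract does not spell out the paper's relaxation or the exact SPG update, so the following is only a minimal illustrative sketch of a generic smoothing proximal gradient loop for a nonsmooth loss with a low-rank penalty. It assumes a Huber smoothing of an elementwise l1 loss and uses the nuclear norm (whose proximal operator is singular value soft-thresholding) as a stand-in for the paper's relaxed rank penalty; the function and parameter names (`spg_sketch`, `lam`, `mu0`, `sigma`) are hypothetical and not from the paper.

```python
import numpy as np

def huber_grad(R, mu):
    """Gradient of the Huber smoothing of the elementwise l1 loss ||R||_1,
    with smoothing parameter mu (a standard smoothing choice, assumed here)."""
    return np.clip(R / mu, -1.0, 1.0)

def svt(X, tau):
    """Singular value soft-thresholding: proximal operator of tau * ||X||_*."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return U @ (np.maximum(s - tau, 0.0)[:, None] * Vt)

def spg_sketch(M, Omega, lam=0.1, mu0=1.0, sigma=0.5, mu_min=1e-4, iters=500):
    """Illustrative smoothing proximal gradient loop for
        min_X ||P_Omega(X - M)||_1 + lam * penalty(X),
    with the nuclear norm standing in for the paper's relaxed rank penalty."""
    X = np.zeros_like(M)
    mu = mu0
    for _ in range(iters):
        L = 1.0 / mu                        # Lipschitz constant of the smoothed loss gradient
        G = Omega * huber_grad(X - M, mu)   # gradient of the Huber-smoothed l1 loss
        X = svt(X - G / L, lam / L)         # proximal gradient step via SVT
        mu = max(sigma * mu, mu_min)        # gradually tighten the smoothing
    return X
```

Gradually driving the smoothing parameter toward zero is what allows accumulation points of such schemes to satisfy stationarity conditions of the original nonsmooth relaxation; the paper's convergence result for lifted stationary points follows this general pattern, though its precise update and relaxation differ from this sketch.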

Publication
Computational Optimization and Applications
Quan Yu
PhD student

My research interests include low rank tensor optimization, image processing and machine learning.
