TY - JOUR
T1 - Proximal extrapolated gradient methods for variational inequalities
AU - Malitsky, Yu
PY - 2018/1/2
Y1 - 2018/1/2
N2 - The paper is concerned with novel first-order methods for monotone variational inequalities. They use a very simple linesearch procedure that takes into account local information about the operator. The methods do not require Lipschitz continuity of the operator, and the linesearch procedure uses only values of the operator. Moreover, when the operator is affine, our linesearch becomes very simple, namely, it needs only vector–vector operations. For all our methods, we establish an ergodic convergence rate. In addition, we modify one of the proposed methods for the case of composite minimization. Preliminary results from numerical experiments are quite promising.
AB - The paper is concerned with novel first-order methods for monotone variational inequalities. They use a very simple linesearch procedure that takes into account local information about the operator. The methods do not require Lipschitz continuity of the operator, and the linesearch procedure uses only values of the operator. Moreover, when the operator is affine, our linesearch becomes very simple, namely, it needs only vector–vector operations. For all our methods, we establish an ergodic convergence rate. In addition, we modify one of the proposed methods for the case of composite minimization. Preliminary results from numerical experiments are quite promising.
KW - convex optimization
KW - ergodic convergence
KW - linesearch
KW - monotone operator
KW - nonmonotone stepsizes
KW - proximal methods
KW - variational inequality
UR - http://www.scopus.com/inward/record.url?scp=85015809540&partnerID=8YFLogxK
U2 - 10.1080/10556788.2017.1300899
DO - 10.1080/10556788.2017.1300899
M3 - Article
AN - SCOPUS:85015809540
SN - 1055-6788
VL - 33
SP - 140
EP - 164
JO - Optimization Methods and Software
JF - Optimization Methods and Software
IS - 1
ER -