Proximal extrapolated gradient methods for variational inequalities

Yu Malitsky

Research output: Contribution to journal › Article

Abstract

The paper is concerned with novel first-order methods for monotone variational inequalities. They use a very simple linesearch procedure that takes into account local information about the operator. The methods do not require Lipschitz continuity of the operator, and the linesearch procedure uses only values of the operator. Moreover, when the operator is affine our linesearch becomes very simple, namely, it needs only simple vector–vector operations. For all our methods, we establish an ergodic convergence rate. In addition, we modify one of the proposed methods for the case of composite minimization. Preliminary results from numerical experiments are quite promising.
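
For orientation: the problem is to find a point x* in a closed convex set C such that ⟨F(x*), x − x*⟩ ≥ 0 for all x in C, where F is a monotone operator. The sketch below shows a classical extragradient step with a backtracking linesearch that, like the methods described above, uses only evaluations of F and assumes no Lipschitz constant. It is an illustrative baseline under assumed parameters (the ball projection, delta, beta are placeholders), not the paper's proximal extrapolated scheme.

```python
import numpy as np

def project_ball(x, radius=10.0):
    # Euclidean projection onto {x : ||x|| <= radius}; a stand-in for P_C.
    nrm = np.linalg.norm(x)
    return x if nrm <= radius else (radius / nrm) * x

def extragradient_linesearch(F, proj, x, lam=1.0, delta=0.9, beta=0.5, iters=500):
    # Korpelevich-style extragradient with a backtracking linesearch:
    # shrink lam until lam * ||F(x) - F(y)|| <= delta * ||x - y||.
    # Only values of F are used; no Lipschitz constant is assumed.
    for _ in range(iters):
        Fx = F(x)
        while True:
            y = proj(x - lam * Fx)
            Fy = F(y)
            if lam * np.linalg.norm(Fx - Fy) <= delta * np.linalg.norm(x - y):
                break
            lam *= beta
        x = proj(x - lam * Fy)
    return x

# Toy affine monotone operator F(x) = Ax + b with A skew-symmetric,
# so <Ax, x> = 0: F is monotone but not the gradient of any function.
rng = np.random.default_rng(0)
M = rng.standard_normal((5, 5))
A, b = M - M.T, rng.standard_normal(5)
x_star = extragradient_linesearch(lambda x: A @ x + b, project_ball, np.zeros(5))
```

On the affine case: if F(x) = Ax + b and the linesearch trial points are affine combinations of points where F is already known (as with an extrapolation-based search, unlike the projected trial point above), each trial needs no new matrix–vector product, only vector–vector operations; this is presumably the mechanism behind the abstract's remark.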

Language: English
Pages: 140-164
Number of pages: 25
Journal: Optimization Methods and Software
Volume: 33
Issue number: 1
DOIs: 10.1080/10556788.2017.1300899
Status: Published - 2 Jan 2018

Keywords

  • convex optimization
  • ergodic convergence
  • linesearch
  • monotone operator
  • nonmonotone stepsizes
  • proximal methods
  • variational inequality

ASJC Scopus subject areas

  • Software
  • Control and Optimization
  • Applied Mathematics

Cite this

Proximal extrapolated gradient methods for variational inequalities. / Malitsky, Yu.

In: Optimization Methods and Software, Vol. 33, No. 1, 02.01.2018, p. 140-164.

Research output: Contribution to journal › Article

@article{d923516392f94118ba22f52ffb857e47,
title = "Proximal extrapolated gradient methods for variational inequalities",
abstract = "The paper is concerned with novel first-order methods for monotone variational inequalities. They use a very simple linesearch procedure that takes into account local information about the operator. The methods do not require Lipschitz continuity of the operator, and the linesearch procedure uses only values of the operator. Moreover, when the operator is affine our linesearch becomes very simple, namely, it needs only simple vector–vector operations. For all our methods, we establish an ergodic convergence rate. In addition, we modify one of the proposed methods for the case of composite minimization. Preliminary results from numerical experiments are quite promising.",
keywords = "convex optimization, ergodic convergence, linesearch, monotone operator, nonmonotone stepsizes, proximal methods, variational inequality",
author = "Yu Malitsky",
year = "2018",
month = "1",
day = "2",
doi = "10.1080/10556788.2017.1300899",
language = "English",
volume = "33",
pages = "140--164",
journal = "Optimization Methods and Software",
issn = "1055-6788",
publisher = "Taylor and Francis Ltd.",
number = "1",

}

TY - JOUR

T1 - Proximal extrapolated gradient methods for variational inequalities

AU - Malitsky, Yu

PY - 2018/1/2

Y1 - 2018/1/2

N2 - The paper is concerned with novel first-order methods for monotone variational inequalities. They use a very simple linesearch procedure that takes into account local information about the operator. The methods do not require Lipschitz continuity of the operator, and the linesearch procedure uses only values of the operator. Moreover, when the operator is affine our linesearch becomes very simple, namely, it needs only simple vector–vector operations. For all our methods, we establish an ergodic convergence rate. In addition, we modify one of the proposed methods for the case of composite minimization. Preliminary results from numerical experiments are quite promising.

AB - The paper is concerned with novel first-order methods for monotone variational inequalities. They use a very simple linesearch procedure that takes into account local information about the operator. The methods do not require Lipschitz continuity of the operator, and the linesearch procedure uses only values of the operator. Moreover, when the operator is affine our linesearch becomes very simple, namely, it needs only simple vector–vector operations. For all our methods, we establish an ergodic convergence rate. In addition, we modify one of the proposed methods for the case of composite minimization. Preliminary results from numerical experiments are quite promising.

KW - convex optimization

KW - ergodic convergence

KW - linesearch

KW - monotone operator

KW - nonmonotone stepsizes

KW - proximal methods

KW - variational inequality

UR - http://www.scopus.com/inward/record.url?scp=85015809540&partnerID=8YFLogxK

U2 - 10.1080/10556788.2017.1300899

DO - 10.1080/10556788.2017.1300899

M3 - Article

VL - 33

SP - 140

EP - 164

JO - Optimization Methods and Software

T2 - Optimization Methods and Software

JF - Optimization Methods and Software

SN - 1055-6788

IS - 1

ER -