Adaptive FISTA for nonconvex optimization

Peter Ochs, Thomas Pock

Publication: Contribution to journal › Article › peer review

Abstract

In this paper we propose an adaptively extrapolated proximal gradient method, which is based on the accelerated proximal gradient method (also known as FISTA); however, we locally optimize the extrapolation parameter by carrying out an exact (or inexact) line search. It turns out that in some situations, the proposed algorithm is equivalent to a class of SR1 (identity minus rank 1) proximal quasi-Newton methods. Convergence is proved in a general nonconvex setting, and hence, as a byproduct, we also obtain new convergence guarantees for proximal quasi-Newton methods. The efficiency of the new method is shown in numerical experiments on a sparsity regularized nonlinear inverse problem.
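The abstract describes the core algorithmic idea: a FISTA-style extrapolation step whose extrapolation parameter is chosen by an exact or inexact line search instead of the usual fixed schedule. The sketch below is only an illustrative reading of that idea, not the authors' algorithm: it performs an inexact line search over a grid of candidate extrapolation parameters and keeps the candidate with the smallest objective value after one proximal gradient step. The ℓ1-regularized least-squares toy problem (convex, unlike the nonlinear inverse problem in the paper), the grid of candidates, and all function names are assumptions for illustration.

```python
# Minimal sketch (assumptions, not the authors' implementation) of an
# adaptively extrapolated proximal gradient step: the extrapolation
# parameter beta is picked by a grid line search on the objective value.
import numpy as np


def adaptive_extrapolated_prox_grad(x0, grad_f, prox_g, F, step, max_iter=200,
                                    betas=np.linspace(0.0, 1.0, 21)):
    """grad_f : gradient of the smooth part f
    prox_g : (v, tau) -> prox_{tau g}(v), prox of the nonsmooth part g
    F      : full objective f + g, used to score line-search candidates
    step   : proximal step size (e.g. 1/L for an L-Lipschitz gradient)
    betas  : candidate extrapolation parameters (inexact grid line search)
    """
    x_prev, x = x0.copy(), x0.copy()
    for _ in range(max_iter):
        best_x, best_val = x, F(x)              # keep current point as fallback
        for beta in betas:
            y = x + beta * (x - x_prev)         # FISTA-style extrapolation
            cand = prox_g(y - step * grad_f(y), step)  # proximal gradient step
            val = F(cand)
            if val < best_val:
                best_x, best_val = cand, val
        x_prev, x = x, best_x
    return x


# Illustrative sparsity-regularized problem: min_x 0.5*||A x - b||^2 + lam*||x||_1
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A, b, lam = rng.standard_normal((40, 100)), rng.standard_normal(40), 0.1
    grad_f = lambda x: A.T @ (A @ x - b)
    prox_g = lambda v, tau: np.sign(v) * np.maximum(np.abs(v) - tau * lam, 0.0)
    F = lambda x: 0.5 * np.linalg.norm(A @ x - b) ** 2 + lam * np.abs(x).sum()
    L = np.linalg.norm(A, 2) ** 2               # Lipschitz constant of grad f
    x = adaptive_extrapolated_prox_grad(np.zeros(100), grad_f, prox_g, F, 1.0 / L)
    print("objective:", F(x), "nonzeros:", int((np.abs(x) > 1e-8).sum()))
```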

Original language: English
Pages (from - to): 2482-2503
Number of pages: 22
Journal: SIAM Journal on Optimization
Volume: 29
Issue number: 4
DOIs
Publication status: Published - 1 Jan 2019

ASJC Scopus subject areas

  • Software
  • Theoretical Computer Science
