Variational Networks: Connecting Variational Methods and Deep Learning

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

In this paper, we introduce variational networks (VNs) for image reconstruction. VNs are fully learned models based on the framework of incremental proximal gradient methods. They provide a natural transition between classical variational methods and state-of-the-art residual neural networks. Due to their incremental nature, VNs are very efficient, but only approximately minimize the underlying variational model. Surprisingly, in our numerical experiments on image reconstruction problems it turns out that giving up exact minimization leads to a consistent performance increase, in particular in the case of convex models.
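
For readers skimming this record, the abstract's key idea can be made concrete. In this line of work, a variational network unrolls a fixed number of incremental proximal gradient steps on a variational model with a fields-of-experts-style regularizer; a sketch of the general stage update (notation ours, not quoted verbatim from the paper):

\[ u^{t+1} = \operatorname{prox}_{\lambda_t D}\Big( u^t - \sum_{i=1}^{N} (K_i^t)^\top \phi_i^t\big(K_i^t u^t\big) \Big) \]

where D is the data term, the K_i^t are learned convolution kernels, and the phi_i^t are learned influence functions of stage t. Below is a toy, self-contained NumPy sketch of one forward pass, with fixed kernels, phi = tanh, and a quadratic data term standing in for the learned components (all names hypothetical, not from the paper):

    import numpy as np
    from scipy.signal import convolve2d

    def vn_denoise(f, kernels, T=5, lam=1.0):
        """Toy variational-network forward pass for denoising.

        Simplifying assumptions: fixed kernels and a fixed influence
        function tanh replace the learned per-stage K_i^t and phi_i^t;
        the data term is lam/2 * ||u - f||^2, whose proximal map has a
        closed form.
        """
        u = f.copy()
        for t in range(T):
            # gradient of the fields-of-experts regularizer sum_i rho_i(K_i u),
            # i.e. sum_i K_i^T phi_i(K_i u); the (approximate) adjoint of a
            # 'same' convolution is convolution with the flipped kernel
            grad_r = np.zeros_like(u)
            for k in kernels:
                resp = convolve2d(u, k, mode="same", boundary="symm")
                grad_r += convolve2d(np.tanh(resp), k[::-1, ::-1],
                                     mode="same", boundary="symm")
            # proximal step on the quadratic data term:
            # argmin_v 1/2 ||v - (u - grad_r)||^2 + lam/2 ||v - f||^2
            u = (u - grad_r + lam * f) / (1.0 + lam)
        return u

    # toy usage: denoise a piecewise-constant image with two derivative kernels
    rng = np.random.default_rng(0)
    clean = np.zeros((64, 64)); clean[16:48, 16:48] = 1.0
    noisy = clean + 0.1 * rng.standard_normal(clean.shape)
    u_hat = vn_denoise(noisy, [np.array([[1.0, -1.0]]),
                               np.array([[1.0], [-1.0]])], T=8, lam=2.0)

Running one such truncated loop is exactly the "approximate minimization" the abstract refers to: the network stops after T stages instead of iterating the variational model to convergence.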
Language: English
Title of host publication: Pattern Recognition
Subtitle of host publication: German Conference, GCPR 2017, Proceedings
Publisher: Springer
Pages: 281-293
ISBN (Print): 978-3-319-66708-9
DOIs: 10.1007/978-3-319-66709-6_23
Status: Published - 2017

Publication series

Name: Lecture Notes in Computer Science
Volume: 10496

Keywords

  • variational methods
  • machine learning

Cite this

Kobler, E., Klatzer, T., Hammernik, K., & Pock, T. (2017). Variational Networks: Connecting Variational Methods and Deep Learning. In Pattern Recognition: German Conference, GCPR 2017, Proceedings (pp. 281-293). (Lecture Notes in Computer Science; Vol. 10496). Springer. https://doi.org/10.1007/978-3-319-66709-6_23

@inproceedings{7856ea5897614162b6be555431dd9782,
title = "Variational Networks: Connecting Variational Methods and Deep Learning",
abstract = "In this paper, we introduce variational networks (VNs) for image reconstruction. VNs are fully learned models based on the framework of incremental proximal gradient methods. They provide a natural transition between classical variational methods and state-of-the-art residual neural networks. Due to their incremental nature, VNs are very efficient, but only approximately minimize the underlying variational model. Surprisingly, in our numerical experiments on image reconstruction problems it turns out that giving up exact minimization leads to a consistent performance increase, in particular in the case of convex models.",
keywords = "variational methods, machine learning",
author = "Erich Kobler and Teresa Klatzer and Kerstin Hammernik and Thomas Pock",
year = "2017",
doi = "10.1007/978-3-319-66709-6_23",
language = "English",
isbn = "978-3-319-66708-9",
series = "Lecture Notes in Computer Science",
publisher = "Springer",
pages = "281--293",
booktitle = "Pattern Recognition",

}

TY - GEN
T1 - Variational Networks: Connecting Variational Methods and Deep Learning
AU - Kobler, Erich
AU - Klatzer, Teresa
AU - Hammernik, Kerstin
AU - Pock, Thomas
PY - 2017
Y1 - 2017
N2 - In this paper, we introduce variational networks (VNs) for image reconstruction. VNs are fully learned models based on the framework of incremental proximal gradient methods. They provide a natural transition between classical variational methods and state-of-the-art residual neural networks. Due to their incremental nature, VNs are very efficient, but only approximately minimize the underlying variational model. Surprisingly, in our numerical experiments on image reconstruction problems it turns out that giving up exact minimization leads to a consistent performance increase, in particular in the case of convex models.
AB - In this paper, we introduce variational networks (VNs) for image reconstruction. VNs are fully learned models based on the framework of incremental proximal gradient methods. They provide a natural transition between classical variational methods and state-of-the-art residual neural networks. Due to their incremental nature, VNs are very efficient, but only approximately minimize the underlying variational model. Surprisingly, in our numerical experiments on image reconstruction problems it turns out that giving up exact minimization leads to a consistent performance increase, in particular in the case of convex models.
KW - variational methods
KW - machine learning
U2 - 10.1007/978-3-319-66709-6_23
DO - 10.1007/978-3-319-66709-6_23
M3 - Conference contribution
SN - 978-3-319-66708-9
T3 - Lecture Notes in Computer Science
SP - 281
EP - 293
BT - Pattern Recognition
PB - Springer
ER -