On the estimation of the Wasserstein distance in generative models

Thomas Pinetz, Daniel Soukup, Thomas Pock

Publication: Contribution to book/report/conference proceedings › Conference contribution › Research › Peer-reviewed

Abstract

Generative Adversarial Networks (GANs) have been used to model the underlying probability distribution of sample-based datasets. GANs are notorious for training difficulties and their dependence on arbitrary hyperparameters. One recent improvement in the GAN literature is to use the Wasserstein distance as the loss function, leading to Wasserstein Generative Adversarial Networks (WGANs). Using this as a basis, we show various ways in which the Wasserstein distance is estimated for the task of generative modelling. Additionally, the secrets to training such models are shown and summarized at the end of this work. Where applicable, we extend current works to different algorithms, different cost functions, and different regularization schemes to improve generative models.
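To illustrate the quantity the abstract refers to (this sketch is not taken from the paper itself): for equal-size 1-D samples, the Wasserstein-1 distance has an exact closed form via sorted samples, while WGANs estimate it through the Kantorovich-Rubinstein dual, E[f(real)] − E[f(fake)] maximized over 1-Lipschitz critics f. A minimal NumPy sketch of both, with hypothetical function names:

```python
import numpy as np

def wasserstein_1(x, y):
    # For two equal-size 1-D samples, the Wasserstein-1 distance
    # reduces to the average absolute difference of the sorted
    # samples (i.e., of the empirical quantile functions).
    x, y = np.sort(x), np.sort(y)
    return np.mean(np.abs(x - y))

def wgan_critic_objective(critic, real, fake):
    # Kantorovich-Rubinstein dual form used as the WGAN loss:
    #   max_{||f||_L <= 1}  E[f(real)] - E[f(fake)]
    # `critic` is assumed to be 1-Lipschitz; in practice this is
    # enforced separately, e.g. by weight clipping or a gradient
    # penalty, which is one of the estimation choices the paper
    # compares.
    return np.mean(critic(real)) - np.mean(critic(fake))

# Shifting a sample by a constant c moves it exactly c in
# Wasserstein-1 distance; the identity map is a valid 1-Lipschitz
# critic and attains that value here.
x = np.array([0.0, 1.0, 2.0])
y = x + 3.0
print(wasserstein_1(x, y))                          # exact distance
print(wgan_critic_objective(lambda t: t, y, x))     # dual estimate
```

The point of the dual form is that it generalizes to high-dimensional data, where the sorting trick does not apply and the critic must be approximated by a neural network.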
Original language: English
Title: German Conference on Pattern Recognition
Pages: 156-170
Publication status: Published - 2019


Cite this

Pinetz, T., Soukup, D., & Pock, T. (2019). On the estimation of the Wasserstein distance in generative models. In German Conference on Pattern Recognition (pp. 156-170).

