On the estimation of the Wasserstein distance in generative models

Thomas Pinetz, Daniel Soukup, Thomas Pock

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Research › peer-review

Abstract

Generative Adversarial Networks (GANs) have been used to model the underlying probability distribution of sample-based datasets. GANs are notorious for training difficulties and their dependence on arbitrary hyperparameters. One recent improvement in the GAN literature is to use the Wasserstein distance as the loss function, leading to Wasserstein Generative Adversarial Networks (WGANs). Using this as a basis, we show various ways in which the Wasserstein distance is estimated for the task of generative modelling. Additionally, practical insights for training such models are presented and summarized at the end of this work. Where applicable, we extend current works to different algorithms, different cost functions, and different regularization schemes to improve generative models.
Original language: English
Title of host publication: German Conference on Pattern Recognition
Pages: 156-170
Publication status: Published - 2019
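As a concrete illustration of the quantity the abstract refers to (not taken from the paper itself): for one-dimensional samples of equal size, the empirical 1-Wasserstein distance has a closed form, namely the mean absolute difference between the sorted samples. WGANs estimate the same quantity in high dimensions via the Kantorovich-Rubinstein dual, maximizing E[f(real)] - E[f(fake)] over 1-Lipschitz critics f.

```python
import numpy as np

def wasserstein_1d(x, y):
    """Empirical 1-Wasserstein distance between two equal-size 1-D samples.

    In 1-D the optimal transport plan matches the i-th smallest point of x
    to the i-th smallest point of y, so the distance is the mean absolute
    difference of the sorted samples.
    """
    x_sorted = np.sort(np.asarray(x, dtype=float))
    y_sorted = np.sort(np.asarray(y, dtype=float))
    if x_sorted.shape != y_sorted.shape:
        raise ValueError("this closed form assumes equal sample sizes")
    return float(np.mean(np.abs(x_sorted - y_sorted)))

# Identical samples have distance 0; shifting one sample by a constant c
# shifts the distance by exactly c.
print(wasserstein_1d([0.0, 1.0, 2.0], [2.0, 3.0, 4.0]))  # -> 2.0
```

This closed form only exists in one dimension; the paper's setting (images) requires the dual or regularized estimators discussed in the abstract.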


Cite this

Pinetz, T., Soukup, D., & Pock, T. (2019). On the estimation of the Wasserstein distance in generative models. In German Conference on Pattern Recognition (pp. 156-170)

