Hybrid generative-discriminative training of Gaussian mixture models

Wolfgang Roth, Robert Peharz, Sebastian Tschiatschek, Franz Pernkopf

Research output: Contribution to journal › Article › Research › peer-review

Abstract

Recent work has shown substantial performance improvements of discriminative probabilistic models over their generative counterparts. However, since discriminative models do not capture the input distribution of the data, their use in missing data scenarios is limited. To utilize the advantages of both paradigms, we present an approach to train Gaussian mixture models (GMMs) in a hybrid generative-discriminative way. This is accomplished by optimizing an objective that trades off between a generative likelihood term and either a discriminative conditional likelihood term or a large margin term using stochastic optimization. Our model substantially improves the performance of classical maximum likelihood optimized GMMs while at the same time allowing for both a consistent treatment of missing features by marginalization, and the use of additional unlabeled data in a semi-supervised setting. For the covariance matrices, we employ a diagonal plus low-rank matrix structure to model important correlations while keeping the number of parameters small. We show that a non-diagonal matrix structure is crucial to achieve good performance and that the proposed structure can be utilized to considerably reduce classification time in case of missing features. The capabilities of our model are demonstrated in extensive experiments on real-world data.
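The hybrid objective described in the abstract can be illustrated with a minimal NumPy sketch. This is not the authors' implementation: it simplifies each class-conditional GMM to a single Gaussian, and all variable names and the toy setup are assumptions made for this example. It shows the two ingredients the abstract names: a covariance of the form diag(d) + U Uᵀ evaluated efficiently via the Woodbury identity and the matrix determinant lemma, and an objective that interpolates between a generative log-likelihood term and a discriminative conditional log-likelihood term.

```python
import numpy as np

def log_gauss_diag_lowrank(x, mu, d, U):
    """log N(x; mu, diag(d) + U @ U.T), evaluated with the Woodbury identity
    and the matrix determinant lemma so the cost stays O(D * k^2) instead of
    O(D^3) for a D-dimensional x and a rank-k factor U."""
    k = U.shape[1]
    diff = x - mu
    w = diff / d                                   # D^{-1} (x - mu)
    C = np.eye(k) + U.T @ (U / d[:, None])         # capacitance: I + U^T D^{-1} U
    L = np.linalg.cholesky(C)
    v = np.linalg.solve(L, U.T @ w)
    quad = diff @ w - v @ v                        # (x - mu)^T Sigma^{-1} (x - mu)
    logdet = np.log(d).sum() + 2.0 * np.log(np.diag(L)).sum()
    return -0.5 * (x.size * np.log(2.0 * np.pi) + logdet + quad)

def hybrid_objective(X, y, mus, ds, Us, log_priors, lam):
    """lam * mean log p(x, y) + (1 - lam) * mean log p(y | x).
    lam = 1 recovers plain maximum likelihood; lam = 0 is purely
    discriminative conditional-likelihood training."""
    n, n_classes = X.shape[0], len(mus)
    joint = np.empty((n, n_classes))               # joint[i, c] = log p(x_i, c)
    for c in range(n_classes):
        joint[:, c] = [log_priors[c] + log_gauss_diag_lowrank(x, mus[c], ds[c], Us[c])
                       for x in X]
    m = joint.max(axis=1)                          # stable log-sum-exp for log p(x_i)
    log_marg = m + np.log(np.exp(joint - m[:, None]).sum(axis=1))
    gen = joint[np.arange(n), y].mean()            # generative term
    cond = (joint[np.arange(n), y] - log_marg).mean()  # conditional term
    return lam * gen + (1.0 - lam) * cond
```

In this sketch, maximizing `hybrid_objective` with a stochastic optimizer over `mus`, `ds`, `Us`, and the trade-off weight `lam` in [0, 1] mirrors the trade-off the paper describes; the large-margin variant of the discriminative term is not shown here.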

Language: English
Pages: 131-137
Number of pages: 7
Journal: Pattern Recognition Letters
Volume: 112
DOI: 10.1016/j.patrec.2018.06.014
Status: Published - 1 Sep 2018

Keywords

  • Gaussian mixture model
  • Hybrid generative-discriminative learning
  • Large margin learning
  • Missing features
  • Semi-supervised learning

ASJC Scopus subject areas

  • Software
  • Signal Processing
  • Computer Vision and Pattern Recognition
  • Artificial Intelligence

Cite this

Hybrid generative-discriminative training of Gaussian mixture models. / Roth, Wolfgang; Peharz, Robert; Tschiatschek, Sebastian; Pernkopf, Franz.

In: Pattern Recognition Letters, Vol. 112, 01.09.2018, p. 131-137.

@article{74814ec54e414f2f8a857baf9a1c8ee5,
title = "Hybrid generative-discriminative training of Gaussian mixture models",
abstract = "Recent work has shown substantial performance improvements of discriminative probabilistic models over their generative counterparts. However, since discriminative models do not capture the input distribution of the data, their use in missing data scenarios is limited. To utilize the advantages of both paradigms, we present an approach to train Gaussian mixture models (GMMs) in a hybrid generative-discriminative way. This is accomplished by optimizing an objective that trades off between a generative likelihood term and either a discriminative conditional likelihood term or a large margin term using stochastic optimization. Our model substantially improves the performance of classical maximum likelihood optimized GMMs while at the same time allowing for both a consistent treatment of missing features by marginalization, and the use of additional unlabeled data in a semi-supervised setting. For the covariance matrices, we employ a diagonal plus low-rank matrix structure to model important correlations while keeping the number of parameters small. We show that a non-diagonal matrix structure is crucial to achieve good performance and that the proposed structure can be utilized to considerably reduce classification time in case of missing features. The capabilities of our model are demonstrated in extensive experiments on real-world data.",
keywords = "Gaussian mixture model, Hybrid generative-discriminative learning, Large margin learning, Missing features, Semi-supervised learning",
author = "Wolfgang Roth and Robert Peharz and Sebastian Tschiatschek and Franz Pernkopf",
year = "2018",
month = sep,
day = "1",
doi = "10.1016/j.patrec.2018.06.014",
language = "English",
volume = "112",
pages = "131--137",
journal = "Pattern Recognition Letters",
issn = "0167-8655",
publisher = "Elsevier B.V.",
}

TY  - JOUR
T1  - Hybrid generative-discriminative training of Gaussian mixture models
AU  - Roth, Wolfgang
AU  - Peharz, Robert
AU  - Tschiatschek, Sebastian
AU  - Pernkopf, Franz
PY  - 2018/9/1
Y1  - 2018/9/1
N2  - Recent work has shown substantial performance improvements of discriminative probabilistic models over their generative counterparts. However, since discriminative models do not capture the input distribution of the data, their use in missing data scenarios is limited. To utilize the advantages of both paradigms, we present an approach to train Gaussian mixture models (GMMs) in a hybrid generative-discriminative way. This is accomplished by optimizing an objective that trades off between a generative likelihood term and either a discriminative conditional likelihood term or a large margin term using stochastic optimization. Our model substantially improves the performance of classical maximum likelihood optimized GMMs while at the same time allowing for both a consistent treatment of missing features by marginalization, and the use of additional unlabeled data in a semi-supervised setting. For the covariance matrices, we employ a diagonal plus low-rank matrix structure to model important correlations while keeping the number of parameters small. We show that a non-diagonal matrix structure is crucial to achieve good performance and that the proposed structure can be utilized to considerably reduce classification time in case of missing features. The capabilities of our model are demonstrated in extensive experiments on real-world data.
AB  - Recent work has shown substantial performance improvements of discriminative probabilistic models over their generative counterparts. However, since discriminative models do not capture the input distribution of the data, their use in missing data scenarios is limited. To utilize the advantages of both paradigms, we present an approach to train Gaussian mixture models (GMMs) in a hybrid generative-discriminative way. This is accomplished by optimizing an objective that trades off between a generative likelihood term and either a discriminative conditional likelihood term or a large margin term using stochastic optimization. Our model substantially improves the performance of classical maximum likelihood optimized GMMs while at the same time allowing for both a consistent treatment of missing features by marginalization, and the use of additional unlabeled data in a semi-supervised setting. For the covariance matrices, we employ a diagonal plus low-rank matrix structure to model important correlations while keeping the number of parameters small. We show that a non-diagonal matrix structure is crucial to achieve good performance and that the proposed structure can be utilized to considerably reduce classification time in case of missing features. The capabilities of our model are demonstrated in extensive experiments on real-world data.
KW  - Gaussian mixture model
KW  - Hybrid generative-discriminative learning
KW  - Large margin learning
KW  - Missing features
KW  - Semi-supervised learning
UR  - http://www.scopus.com/inward/record.url?scp=85049300938&partnerID=8YFLogxK
U2  - 10.1016/j.patrec.2018.06.014
DO  - 10.1016/j.patrec.2018.06.014
M3  - Article
VL  - 112
SP  - 131
EP  - 137
JO  - Pattern Recognition Letters
JF  - Pattern Recognition Letters
SN  - 0167-8655
ER  -