On the Information Dimension of Stochastic Processes

Bernhard C. Geiger, Tobias Koch

Research output: Contribution to journal › Article

Abstract

In 1959, Rényi proposed the information dimension and the d-dimensional entropy to measure the information content of general random variables. This paper proposes a generalization of information dimension to stochastic processes by defining the information dimension rate as the entropy rate of the uniformly quantized stochastic process divided by minus the logarithm of the quantizer step size 1/m in the limit as m → ∞. It is demonstrated that the information dimension rate coincides with the rate-distortion dimension, defined as twice the rate-distortion function R(D) of the stochastic process divided by −log(D) in the limit as D ↓ 0. It is further shown that, among all multivariate stationary processes with a given (matrix-valued) spectral distribution function (SDF), the Gaussian process has the largest information dimension rate, and that the information dimension rate of multivariate stationary Gaussian processes is given by the average rank of the derivative of the SDF. The presented results reveal that the fundamental limits of almost zero-distortion recovery via compressible signal pursuit and almost lossless analog compression are, in general, different.
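
For concreteness, the two limiting quantities described above can be sketched in LaTeX as follows; the symbols d({X_t}), H'(·), and dim_R(·) are shorthand chosen here for the abstract's verbal definitions and need not match the notation used in the paper.

% Information dimension rate: entropy rate of the process quantized with
% step size 1/m, normalized by -log(1/m) = log m, in the limit m -> infinity.
% Rate-distortion dimension: twice the rate-distortion function R(D),
% normalized by -log D, in the small-distortion limit.
\[
  d\bigl(\{X_t\}\bigr) \;=\; \lim_{m \to \infty} \frac{H'\!\bigl(\lfloor m X_t \rfloor / m \bigr)}{\log m},
  \qquad
  \dim_{R}\bigl(\{X_t\}\bigr) \;=\; \lim_{D \downarrow 0} \frac{2\,R(D)}{-\log D}.
\]

The coincidence result stated in the abstract is that these two limits agree for stochastic processes, i.e. d({X_t}) = dim_R({X_t}).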
Original language: English
Pages (from-to): 6496-6518
Journal: IEEE Transactions on Information Theory
Volume: 65
Issue number: 10
DOIs
Publication status: Published - Oct 2019
