A unifying variational perspective on some fundamental information theoretic inequalities

Research output: Contribution to journal › Article

2 Citations (Scopus)

Abstract

This paper proposes a unifying variational approach for proving and extending some fundamental information theoretic inequalities. Fundamental information theory results such as maximization of differential entropy, minimization of Fisher information (Cramér-Rao inequality), worst additive noise lemma, entropy power inequality, and extremal entropy inequality are interpreted as functional problems and proved within the framework of calculus of variations. Several applications and possible extensions of the proposed results are briefly mentioned.
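For readers unfamiliar with the variational framing, the following LaTeX sketch gives standard background only (it is not drawn from the paper's own derivations): it shows how maximization of differential entropy under a second-moment constraint can be posed as a calculus-of-variations problem whose stationary point is Gaussian, and it states the entropy power inequality, one of the results the paper treats in this framework.

% Minimal compilable sketch; standard textbook background, not the paper's proof.
\documentclass{article}
\usepackage{amsmath}
\begin{document}
Consider the constrained functional (normalization and fixed second moment, zero mean assumed)
\[
  J[f] \;=\; -\int_{\mathbb{R}} f(x)\log f(x)\,dx
  \;+\;\lambda_0\!\left(\int_{\mathbb{R}} f(x)\,dx - 1\right)
  \;+\;\lambda_1\!\left(\int_{\mathbb{R}} x^2 f(x)\,dx - \sigma^2\right).
\]
Setting the first variation to zero, $\delta J/\delta f = 0$, yields
\[
  -\log f(x) - 1 + \lambda_0 + \lambda_1 x^2 = 0
  \quad\Longrightarrow\quad
  f(x) = \exp\!\big(\lambda_0 - 1 + \lambda_1 x^2\big),
\]
which is a Gaussian density once $\lambda_0$ and $\lambda_1 < 0$ are chosen to satisfy the
constraints. This recovers the familiar bound
$h(X) \le \tfrac{1}{2}\log\!\big(2\pi e\,\sigma^2\big)$ with equality iff $X$ is Gaussian.
The entropy power inequality, stated here for independent scalar random variables $X$ and $Y$
with entropy power $N(X) = \tfrac{1}{2\pi e}\,e^{2h(X)}$, reads
\[
  N(X+Y) \;\ge\; N(X) + N(Y).
\]
\end{document}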

Original language: English
Article number: 6566200
Pages (from-to): 7132-7148
Number of pages: 17
Journal: IEEE Transactions on Information Theory
Volume: 59
Issue number: 11
DOIs: 10.1109/TIT.2013.2274514
Publication status: Published - 2013

Keywords

  • calculus of variations
  • entropy power inequality
  • extremal entropy inequality
  • Maximizing entropy
  • minimizing Fisher information
  • worst additive noise

ASJC Scopus subject areas

  • Information Systems
  • Computer Science Applications
  • Library and Information Sciences

Cite this

A unifying variational perspective on some fundamental information theoretic inequalities. / Park, Sangwoo; Serpedin, Erchin; Qaraqe, Khalid.

In: IEEE Transactions on Information Theory, Vol. 59, No. 11, 6566200, 2013, p. 7132-7148.

Research output: Contribution to journal › Article

@article{bf59814be83d4927aec7e5246e460390,
title = "A unifying variational perspective on some fundamental information theoretic inequalities",
abstract = "This paper proposes a unifying variational approach for proving and extending some fundamental information theoretic inequalities. Fundamental information theory results such as maximization of differential entropy, minimization of Fisher information (Cram{\'e}r-Rao inequality), worst additive noise lemma, entropy power inequality, and extremal entropy inequality are interpreted as functional problems and proved within the framework of calculus of variations. Several applications and possible extensions of the proposed results are briefly mentioned.",
keywords = "calculus of variations, entropy power inequality, extremal entropy inequality, Maximizing entropy, minimizing Fisher information, worst additive noise",
author = "Sangwoo Park and Erchin Serpedin and Khalid Qaraqe",
year = "2013",
doi = "10.1109/TIT.2013.2274514",
language = "English",
volume = "59",
pages = "7132--7148",
journal = "IEEE Transactions on Information Theory",
issn = "0018-9448",
publisher = "Institute of Electrical and Electronics Engineers Inc.",
number = "11",

}

TY - JOUR

T1 - A unifying variational perspective on some fundamental information theoretic inequalities

AU - Park, Sangwoo

AU - Serpedin, Erchin

AU - Qaraqe, Khalid

PY - 2013

Y1 - 2013

N2 - This paper proposes a unifying variational approach for proving and extending some fundamental information theoretic inequalities. Fundamental information theory results such as maximization of differential entropy, minimization of Fisher information (Cramér-Rao inequality), worst additive noise lemma, entropy power inequality, and extremal entropy inequality are interpreted as functional problems and proved within the framework of calculus of variations. Several applications and possible extensions of the proposed results are briefly mentioned.

AB - This paper proposes a unifying variational approach for proving and extending some fundamental information theoretic inequalities. Fundamental information theory results such as maximization of differential entropy, minimization of Fisher information (Cramér-Rao inequality), worst additive noise lemma, entropy power inequality, and extremal entropy inequality are interpreted as functional problems and proved within the framework of calculus of variations. Several applications and possible extensions of the proposed results are briefly mentioned.

KW - calculus of variations

KW - entropy power inequality

KW - extremal entropy inequality

KW - Maximizing entropy

KW - minimizing Fisher information

KW - worst additive noise

UR - http://www.scopus.com/inward/record.url?scp=84886665708&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=84886665708&partnerID=8YFLogxK

U2 - 10.1109/TIT.2013.2274514

DO - 10.1109/TIT.2013.2274514

M3 - Article

AN - SCOPUS:84886665708

VL - 59

SP - 7132

EP - 7148

JO - IEEE Transactions on Information Theory

JF - IEEE Transactions on Information Theory

SN - 0018-9448

IS - 11

M1 - 6566200

ER -