Master semester project: 2007-2008

New Information Theoretical Inequalities


Description & Benefit:

Understand a method proposed by R. Yeung for proving inequalities in information theory, and try to find so-called non-Shannon-type inequalities.
The project is theoretical. You will understand R. Yeung's method and become familiar with important concepts in information theory.

Objective:

In [1], R. Yeung describes a systematic method for proving information-theoretic inequalities. As an example, imagine that you want to prove that I(X;Y) >= I(X;Z) holds for every discrete joint distribution of the three random variables X, Y, and Z, and that the proof should only involve the basic Shannon inequalities. R. Yeung's method solves this type of problem in a systematic manner and can therefore be implemented as a computer program.
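
To make the method concrete, here is a minimal sketch in Python (using numpy and scipy) of how such a check can be automated. It is not Yeung's ITIP program itself, and all function and variable names are illustrative: a candidate inequality is written as a linear functional f of the vector h of joint entropies, and a linear program decides whether f . h >= 0 is implied by the elemental Shannon inequalities G h >= 0 for three random variables.

    # Minimal sketch of the linear-programming idea behind the framework in [1].
    # This is NOT the ITIP program itself; all names here are illustrative.
    import itertools
    import numpy as np
    from scipy.optimize import linprog

    N = 3                                              # random variables X1, X2, X3
    SUBSETS = [s for r in range(1, N + 1)
               for s in itertools.combinations(range(N), r)]   # nonempty subsets
    INDEX = {s: i for i, s in enumerate(SUBSETS)}      # position of H(X_S) inside h

    def joint(*vars_):
        """Coefficient vector of the joint entropy H(X_vars)."""
        v = np.zeros(len(SUBSETS))
        v[INDEX[tuple(sorted(set(vars_)))]] += 1.0
        return v

    def cond_mi(i, j, given=()):
        """Coefficients of I(X_i; X_j | X_given)."""
        v = joint(i, *given) + joint(j, *given) - joint(i, j, *given)
        if given:
            v -= joint(*given)
        return v

    def cond_entropy(i, given):
        """Coefficients of H(X_i | X_given) = H(X_i, X_given) - H(X_given)."""
        return joint(i, *given) - joint(*given)

    # Elemental Shannon inequalities G h >= 0 for N = 3 (9 rows in total).
    rows = []
    for i in range(N):
        rest = tuple(k for k in range(N) if k != i)
        rows.append(cond_entropy(i, rest))             # H(X_i | all others) >= 0
    for i, j in itertools.combinations(range(N), 2):
        others = [k for k in range(N) if k not in (i, j)]
        for r in range(len(others) + 1):
            for given in itertools.combinations(others, r):
                rows.append(cond_mi(i, j, given))      # I(X_i; X_j | X_K) >= 0
    G = np.array(rows)

    def is_shannon_type(f):
        """Minimise f.h over the cone {h : G h >= 0}; the optimum is 0 or -infinity."""
        res = linprog(c=f, A_ub=-G, b_ub=np.zeros(len(G)),
                      bounds=[(None, None)] * len(f), method="highs")
        return res.status == 0 and abs(res.fun) < 1e-9

    # The example from the text: can I(X1;X2) >= I(X1;X3) be derived this way?
    f = cond_mi(0, 1) - cond_mi(0, 2)
    print("I(X;Y) >= I(X;Z) follows from the Shannon inequalities:", is_shannon_type(f))

    # An elemental inequality for comparison: I(X1;X2,X3) >= I(X1;X2).
    g = joint(0) + joint(1, 2) - joint(0, 1, 2) - cond_mi(0, 1)
    print("I(X;Y,Z) >= I(X;Y) follows from the Shannon inequalities:", is_shannon_type(g))

The minimisation runs over the cone of vectors satisfying the Shannon inequalities, so the optimum is either 0 (the inequality is Shannon-type) or unbounded below (it cannot be derived from the Shannon inequalities alone).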

It turns out that the basic Shannon inequalities (non-negativity of the conditional entropy and the mutual information) are not sufficient to prove all valid inequalities. In other words, there exist inequalities that always hold but cannot be proved using the Shannon inequalities alone.
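
For illustration, the first inequality of this kind was found by Z. Zhang and R. W. Yeung in 1998. It involves four random variables A, B, C, D and can be written as

    2 I(C;D) <= I(A;B) + I(A;C,D) + 3 I(C;D|A) + I(C;D|B),

which holds for every joint distribution but cannot be derived from the basic Shannon inequalities.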

In this project, we consider these non-Shannon-type inequalities (only a few of them are known). The aim is to understand what additional ingredient is used in their proofs. A further aim would be to find new non-Shannon-type inequalities.

Prerequisites:

  • Solid background in probability and analysis
  • Information theory

References:

[1] Raymond W. Yeung, “A Framework for Linear Information Inequalities”, IEEE Transactions on Information Theory, vol. 43, no. 6, November 1997. http://user-www.ie.cuhk.edu.hk/~ITIP/

Contact:

Etienne Perron (LICOS), Email: etienne.perron@epfl.ch, Office: INR-033, Tel: 36457

Supervisor: Prof. Suhas Diggavi



Last modified: 2007/02/22