The last fifteen years have seen dramatic changes and great improvements in coding theory. Rather than relying on algebraic code constructions, it is now recognized that suitably constrained "random" structures, combined with low-complexity iterative decoding algorithms, can achieve astonishing performance at unprecedentedly low complexity. This new approach to coding is usually referred to as "iterative decoding." The paradigm shift poses many challenging new problems but also represents an enormous opportunity. We have decided to take full advantage of this opportunity and to make the analysis and design of such iterative coding systems the present focus of LTHC. In particular, we aim at a sound understanding of the basic principles underlying these coding systems. The pool of applications that derive significant benefit from any improvement in this area is large and contains many of today's most prevalent communication systems, e.g., wireless systems, optical links operating at rates as high as 40 Gbit/s, and magnetic or holographic storage systems.
Information theory provides the theoretical basis for many communication systems. We aim to obtain a genuine understanding of the fundamental trade-offs in point-to-point communication as well as in self-organizing networks. Below we list some of the projects in which we are currently involved.
Recent information-theoretic results point to the inefficiency of current communication methodologies in extreme-wideband or low-power applications over fading channels. Since finding the capacity of the real cases of interest is a daunting task, we focus on constructing useful bounds and on identifying good signaling schemes. In particular, we analyze channel effects such as delay and Doppler spread, multipath, estimation of the channel coefficients, and timing, and their impact on capacity. We further investigate the issues associated with multiple access over fading channels. Our constructions try to incorporate the so-far-neglected component of communication delay into the information-theoretic framework. Lack of user synchronization is modeled, in certain situations, by imposing higher-moment constraints on the input.
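As a small illustration of the kind of quantity studied here, the following sketch estimates the ergodic capacity of a Rayleigh flat-fading channel by Monte Carlo and compares it with the AWGN capacity at the same average SNR. This is a hypothetical toy computation, not code from the project; the function name and parameters are our own.

```python
# Hypothetical illustration: Monte Carlo estimate of the ergodic capacity
# E[log2(1 + SNR*|h|^2)] of a Rayleigh flat-fading channel, compared with
# the AWGN capacity log2(1 + SNR) at the same average SNR.
import math
import random

def ergodic_capacity_rayleigh(snr, n_samples=200_000, seed=0):
    """Estimate E[log2(1 + snr*|h|^2)] where |h|^2 ~ Exp(1) (Rayleigh fading)."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_samples):
        gain = rng.expovariate(1.0)          # |h|^2 is exponential with mean 1
        total += math.log2(1.0 + snr * gain)
    return total / n_samples

snr = 10.0                                   # average SNR of 10 dB (linear value 10)
c_fading = ergodic_capacity_rayleigh(snr)
c_awgn = math.log2(1.0 + snr)
# By Jensen's inequality, c_fading < c_awgn: fading costs capacity at fixed average SNR.
```

The gap between the two numbers quantifies the penalty of channel fluctuations when only the average SNR is fixed.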
The study of the fundamental limits of various modern communication systems relies on the study of large random matrices. Our focus is on multi-antenna channels and ad hoc networks.
- Multi-antenna channels: We have established invariance principles giving the optimal input covariance required for these channels.
- Ad hoc networks: We have established precise scaling laws for the capacity of arbitrary one-dimensional networks and of regular two-dimensional networks. These scaling laws rely on refined estimates of the eigenvalues of large random matrices. The extension to arbitrary two-dimensional networks is in progress.
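To give a flavor of the random-matrix estimates involved, the sketch below (a toy example, not our research code) approximates the largest eigenvalue of a random symmetric sign matrix by power iteration and compares it with 2*sqrt(n), the edge of Wigner's semicircle law.

```python
# Hypothetical sketch: for an n x n Wigner matrix (symmetric, i.i.d. +-1
# entries), the largest eigenvalue concentrates near 2*sqrt(n), the edge of
# the semicircle law. We approximate it by power iteration.
import math
import random

def wigner_sign_matrix(n, seed=1):
    """Symmetric matrix with i.i.d. +-1 entries on and above the diagonal."""
    rng = random.Random(seed)
    a = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i, n):
            a[i][j] = a[j][i] = rng.choice((-1.0, 1.0))
    return a

def top_eigenvalue(a, iters=100, seed=2):
    """Approximate |lambda|_max of the symmetric matrix a by power iteration."""
    n = len(a)
    rng = random.Random(seed)
    x = [rng.random() for _ in range(n)]
    for _ in range(iters):
        y = [sum(a[i][j] * x[j] for j in range(n)) for i in range(n)]
        norm = math.sqrt(sum(v * v for v in y))
        x = [v / norm for v in y]
    # For unit x, |A x| = sqrt(x^T A^2 x), which approximates |lambda|_max.
    y = [sum(a[i][j] * x[j] for j in range(n)) for i in range(n)]
    return math.sqrt(sum(v * v for v in y))

n = 150
lam = top_eigenvalue(wigner_sign_matrix(n))
edge = 2.0 * math.sqrt(n)     # edge of the semicircle law
```

Even at this modest size, the empirical top eigenvalue is already within a few percent of the asymptotic edge.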
Network information theory is the theory of transmitting or storing information when more than two devices are involved. Within the last decade, networks have evolved from static and centrally controlled to dynamic and self-organized. How do we deal with the transmission and storage of information when a large number of unreliable devices are involved? We are currently examining three aspects of this question:
- Ad hoc detection: How does one detect an event (such as an earthquake or a burglar) using a large number of imprecise devices with very limited communication capabilities?
- Multiterminal source coding: Suppose several devices can each provide some information about an event. How many bits should every device send in order to provide enough information to achieve a target criterion?
- Spectral properties of random graphs: A network can be modeled by an undirected graph. How does the spectrum of the adjacency matrix relate to structural properties of the network, such as connectivity or the speed of information diffusion?
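One of the simplest links between spectrum and structure is that the number of connected components of a graph equals the multiplicity of the eigenvalue 0 of its Laplacian L = D - A. The sketch below (an illustrative toy, not project code) checks this fact combinatorially by counting components with breadth-first search on two small graphs.

```python
# Illustration: the number of connected components of an undirected graph
# equals the multiplicity of eigenvalue 0 of the graph Laplacian L = D - A.
# Here we count the components directly with a breadth-first search.
from collections import deque

def connected_components(adj):
    """Count connected components; adj maps each vertex to a set of neighbours."""
    seen, components = set(), 0
    for start in adj:
        if start in seen:
            continue
        components += 1
        queue = deque([start])
        seen.add(start)
        while queue:
            u = queue.popleft()
            for v in adj[u]:
                if v not in seen:
                    seen.add(v)
                    queue.append(v)
    return components

# A 6-cycle: one component, so 0 is a simple Laplacian eigenvalue.
cycle = {i: {(i - 1) % 6, (i + 1) % 6} for i in range(6)}
# Two disjoint triangles: two components, so eigenvalue 0 has multiplicity 2.
triangles = {0: {1, 2}, 1: {0, 2}, 2: {0, 1}, 3: {4, 5}, 4: {3, 5}, 5: {3, 4}}
```

Finer spectral quantities, such as the second-smallest Laplacian eigenvalue, then govern the speed of information diffusion on the connected graph.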
In a multiuser communication system, the transmitters are generally driven by independent sources generating bursty streams of packets. Information theory views multiuser systems in terms of noise and interference; the bursty arrival of packets is dealt with by appropriate source coding, and no consideration is given to end-to-end delay. Network theory, on the other hand, directly treats the bursty arrival of packets via resource allocation and distributed scheduling, but ignores the issues of noise and interference. Is there a unified view? In this project we address this question by exploring approaches in which both aspects of multiuser communication are treated jointly.
A well-known (and early) result of information theory establishes that feedback does not improve the capacity of memoryless channels. Even though this result applies neither to networks nor to channels with memory, it has had a dampening effect on research on the use of feedback. Moreover, even if feedback does not improve capacity, it may still simplify the design of communication systems. To that end, we investigate the use of feedback for communication over unknown channels. Since in any network the communication channels can change dramatically over time, this is a natural question. Recent results show that in some cases, even if the channel is revealed neither to the transmitter nor to the receiver, there exist coding strategies that perform as well as the best coding strategies tuned to the channel in use. In these cases feedback enlarges the cooperation capabilities of the transmitter and the receiver, and ultimately frees them from the need to know the channel.
Signal processing for communication
This page will soon be updated
Input device (keyboard) for the future
Computer keyboards as we know them today were designed about 130 years ago. It is an interesting fact that earlier models permitted faster typing, but would also cause the "hammers" to jam if the typist was too fast. A clever engineer solved the problem by spreading out commonly used keys, thereby slowing down every typist. For 130 years we have been stuck with that slow keyboard, even though jamming is no longer an issue. But the real problem with current keyboards is that they are useless for devices smaller than a laptop, that they occupy both hands, and that they need to be laid on a reasonably flat support.
We already have a working implementation of the keyboard. The goal is now to develop applications that make efficient use of it. Examples range from a simple (fast) text editor to a "typing tutorial" program. There should be no limit to your creativity. Alternatively, a student interested in hardware design could develop an integrated-circuit version.
LDPC: Graph generation from an ensemble
Generate random instances of a given LDPC ensemble and visualize the corresponding Tanner graph
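A common way to sample a code from a regular LDPC ensemble is the "socket permutation" construction: each variable node gets dv edge sockets, each check node dc sockets, and a uniform random matching connects them. The sketch below follows that standard construction (it is illustrative, not the applet's source code).

```python
# Hedged sketch of sampling from a (dv, dc)-regular LDPC ensemble via a
# random permutation of edge sockets (standard construction, not the
# applet's exact code).
import random

def sample_regular_ldpc(n, dv, dc, seed=0):
    """Parity-check matrix (list of 0/1 rows) of a random (dv, dc)-regular
    LDPC code with n variable nodes and m = n*dv/dc check nodes.
    Repeated edges are collapsed modulo 2, as in the usual ensemble."""
    assert (n * dv) % dc == 0
    m = n * dv // dc
    var_sockets = [i for i in range(n) for _ in range(dv)]
    chk_sockets = [j for j in range(m) for _ in range(dc)]
    random.Random(seed).shuffle(chk_sockets)
    h = [[0] * n for _ in range(m)]
    for v, c in zip(var_sockets, chk_sockets):
        h[c][v] ^= 1          # multi-edges cancel modulo 2
    return h

h = sample_regular_ldpc(n=12, dv=3, dc=6)
# The Tanner graph has the 6 rows as check nodes and the 12 columns as
# variable nodes; an entry 1 is an edge.
```

The parity-check matrix is exactly the biadjacency matrix of the Tanner graph, so visualization amounts to drawing one node per row and column and one edge per nonzero entry.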
Perform a greedy "lower triangulation" of the adjacency matrix of a random code instance from a given LDPC ensemble. The encoding complexity is proportional to the square of the "gap." In the depicted figure, the gap is the height at which the "diagonal" hits the vertical axis of the matrix.
Visualization of the peeling decoder for transmission over the BEC
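The peeling decoder itself is only a few lines: repeatedly find a check equation containing exactly one erased bit and recover that bit as the modulo-2 sum of the known bits in the check. Below is a minimal sketch (not the demo's source) on a toy example.

```python
# Minimal peeling decoder for the BEC: a check with exactly one erased bit
# determines that bit, since the check must sum to 0 modulo 2.
def peel(h, y):
    """h: parity-check matrix (list of 0/1 rows); y: received word with
    erasures marked as None. Returns the word after peeling; Nones remain
    if the decoder gets stuck in a stopping set."""
    y = list(y)
    progress = True
    while progress:
        progress = False
        for row in h:
            erased = [j for j, bit in enumerate(row) if bit and y[j] is None]
            if len(erased) == 1:
                j = erased[0]
                y[j] = sum(y[k] for k, bit in enumerate(row)
                           if bit and k != j) % 2
                progress = True
    return y

# Toy example: the [7,4] Hamming code, all-zero codeword, three erasures.
h = [[1, 1, 1, 0, 1, 0, 0],
     [1, 1, 0, 1, 0, 1, 0],
     [1, 0, 1, 1, 0, 0, 1]]
received = [0, None, 0, 0, None, 0, None]
decoded = peel(h, received)
```

On this example the three erasures are resolved one by one; when no check with a single erasure exists, the remaining erased positions form a stopping set and the decoder halts.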
LDPC: Finite-length scaling and error floor for BEC
Compute a finite-length approximation of the error probability curve based on the scaling approach.
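The basic form of the scaling approximation expresses the block error probability through the Q-function of a shifted, rescaled distance to threshold. The sketch below uses placeholder constants (eps_star, alpha, beta are ensemble-dependent and must be computed or fitted per ensemble; the values here are illustrative, not the applet's parameters).

```python
# Scaling-law sketch for the block error probability over the BEC:
#   P_B ~ Q( sqrt(n) * (eps* - beta * n**(-2/3) - eps) / alpha )
# where eps* is the threshold and alpha, beta are ensemble-dependent.
import math

def q_function(x):
    """Tail of the standard normal distribution."""
    return 0.5 * math.erfc(x / math.sqrt(2.0))

def scaling_block_error(eps, n, eps_star, alpha, beta):
    shift = eps_star - beta * n ** (-2.0 / 3.0)   # finite-length threshold shift
    return q_function(math.sqrt(n) * (shift - eps) / alpha)

# Illustrative placeholder constants for a hypothetical ensemble:
eps_star, alpha, beta = 0.4294, 0.56, 0.6166
p_short = scaling_block_error(0.40, n=1024, eps_star=eps_star, alpha=alpha, beta=beta)
p_long = scaling_block_error(0.40, n=8192, eps_star=eps_star, alpha=alpha, beta=beta)
```

At a fixed channel parameter below threshold, the approximation predicts a rapid decay of the error probability with the blocklength, which is exactly what the demo visualizes.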
LDPC: Another scaling demo for BEC
Visualization of the scaling phenomenon of the error probability curve.
Maxwell decoder for the BEC: allows you to compute the MAP and BP thresholds of a given degree distribution.
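For a regular ensemble, the BP threshold can be computed by density evolution: over the BEC the erased-message fraction evolves as x -> eps * (1 - (1 - x)^(dc-1))^(dv-1), and the threshold is the largest eps for which this recursion converges to 0. The following is a simplified stand-in for what the applet computes.

```python
# Density evolution for a (dv, dc)-regular LDPC ensemble over the BEC, and
# the BP threshold found by bisection on the channel erasure probability.
def converges(eps, dv, dc, iters=10_000, tol=1e-9):
    """Does the density-evolution recursion drive the erasure fraction to 0?"""
    x = eps
    for _ in range(iters):
        x = eps * (1.0 - (1.0 - x) ** (dc - 1)) ** (dv - 1)
        if x < tol:
            return True
    return False

def bp_threshold(dv, dc, precision=1e-4):
    lo, hi = 0.0, 1.0
    while hi - lo > precision:
        mid = 0.5 * (lo + hi)
        if converges(mid, dv, dc):
            lo = mid          # below threshold: decoding succeeds
        else:
            hi = mid          # above threshold: stuck at a nonzero fixed point
    return 0.5 * (lo + hi)

eps_bp = bp_threshold(3, 6)   # known value for the (3,6) ensemble: about 0.4294
```

The MAP threshold requires more than this recursion; the Maxwell construction obtains it from the same density-evolution curve via an area (EXIT) argument, which is what the demo illustrates.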
LDPC: Gaussian multiple-access channels: iterative decoding
Visualization of iterative decoding for transmission over a multi-access channel.
PolarCodes: Web Applet for Polar Codes
Polar coding is a novel capacity-achieving channel coding scheme. This applet shows how the code works and explains the successive cancellation decoding scheme.
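The core of polar coding is channel polarization, which is especially transparent over the BEC: one polarization step turns a channel with erasure probability z into a worse channel (2z - z^2) and a better one (z^2). The sketch below shows the recursion only (the applet additionally implements successive cancellation decoding).

```python
# Channel polarization over the BEC: after `levels` recursive steps, the
# 2**levels synthetic channels have erasure probabilities obtained by
# repeatedly splitting z into (2z - z^2, z^2).
def polarize(eps, levels):
    """Return the list of synthetic-channel erasure probabilities."""
    z = [eps]
    for _ in range(levels):
        z = [v for w in z for v in (2 * w - w * w, w * w)]
    return z

eps = 0.5
z = polarize(eps, levels=3)       # 8 synthetic channels
avg = sum(z) / len(z)             # the average erasure probability is preserved
```

Already after three levels the probabilities spread toward 0 and 1: information bits are sent on the nearly perfect channels, and the nearly useless ones are frozen.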
PolarCodes: Overview and parameter trade-offs
This tutorial gives an overview of Polar Codes and presents trade-offs between parameters.
Xitip: Information theoretic inequalities prover
Xitip is a software tool for proving information-theoretic inequalities. The tool checks the correctness of any valid information inequality, i.e., a linear inequality involving measures such as (single, conditional, or joint) entropies and mutual informations. Optional information-theoretic constraints (such as Markov chains, independence, or deterministic functions) can be provided.
More information, along with software downloads, can be found on the Xitip website http://xitip.epfl.ch. Linux, Mac, and Windows (Cygwin as well as a standalone executable) versions are available.
LDPC Simulator