CRM: Centro De Giorgi

Probability in Information Science


The group consists of Fabio Fagnani (Politecnico di Torino), Franco Flandoli (Università di Pisa), and Marco Isopi (Università di Roma La Sapienza).

The group will focus its attention on the following three lines of research:

1. Coding theory: performance analysis and decoding algorithms

Coding theory is the branch of mathematics concerned with transmitting data across noisy channels and recovering the message. Concatenated codes, and turbo codes in particular, achieve a performance close to the theoretical limit (Shannon's bound), and they can be decoded quickly by iterative decoding. While turbo codes are widely used because of these features, almost no rigorous results about them are known.
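
As a point of reference, Shannon's bound is easy to compute for the simplest channel. The following minimal Python sketch evaluates the capacity C = 1 - H(p) of a binary symmetric channel with crossover probability p; the function name and the numerical values are illustrative choices, not part of the programme above.

```python
import math

def bsc_capacity(p: float) -> float:
    """Capacity, in bits per channel use, of a binary symmetric
    channel with crossover probability p: C = 1 - H(p)."""
    if p in (0.0, 1.0):
        return 1.0
    entropy = -p * math.log2(p) - (1 - p) * math.log2(1 - p)
    return 1.0 - entropy

# No code of rate above capacity can drive the error probability to zero;
# turbo codes operate remarkably close to this limit.
print(bsc_capacity(0.1))  # ~0.531 bits per channel use
```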

A key step in the construction is the choice of a "generic" permutation to concatenate two convolutional codes. This amounts to introducing an extra source of randomness besides the noisy channel, and it raises interesting probabilistic questions both in the analysis of transmission performance and in decoding. Decoding a turbo-coded message is an NP-hard problem. The so-called "belief propagation" algorithm is an approximate decoding scheme which performs surprisingly well; the question of why it does so is still far from being answered.
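
A minimal sketch of the parallel concatenation just described, with a running-parity accumulator standing in for a genuine recursive convolutional encoder (the encoder, names, and parameters here are simplifying assumptions, not a production turbo code):

```python
import random

def accumulator(bits):
    """Toy rate-1 convolutional encoder: emits the running parity
    (XOR) of the input stream."""
    out, state = [], 0
    for b in bits:
        state ^= b
        out.append(state)
    return out

def turbo_encode(msg, seed=0):
    """Parallel concatenation in the style of a turbo code: the
    systematic bits, a parity stream from the message, and a parity
    stream from a randomly permuted ("interleaved") copy of it."""
    perm = list(range(len(msg)))
    random.Random(seed).shuffle(perm)   # the "generic" random permutation
    interleaved = [msg[i] for i in perm]
    return msg + accumulator(msg) + accumulator(interleaved)

msg = [1, 0, 1, 1, 0, 0, 1, 0]
print(turbo_encode(msg))                # 24 coded bits from 8 message bits
```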

2. Statistical theory of turbulence

A fluid configuration can, under certain circumstances, be unstable to infinitesimal perturbations. In an unstable fluid system, we expect that the growth of perturbations may lead to a loss of our predictive capability: one encounters situations in which fluid velocities seem to vary randomly in space and time. Since it is not possible to develop a deterministic theory of turbulence, one wishes to develop a statistical theory. There is experimental evidence that the vorticity is concentrated in filament-like structures, so phenomenological models based on vorticity curves arise naturally. The attempt to develop a rigorous statistical theory of such one-dimensional structures, both in and out of equilibrium, raises a host of challenging mathematical problems in stochastic processes and related areas.
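
As a toy illustration of the random-curve viewpoint, the sketch below samples a three-dimensional random walk; an actual filament model would add smoothness, self-avoidance, and an energy functional, so this is only a caricature of the objects under study (all names and parameters are illustrative assumptions):

```python
import math
import random

def random_filament(n_steps=1000, step=0.05, seed=0):
    """Toy vortex filament: a random walk in R^3 with Gaussian
    increments, whose scaling limit is Brownian motion. Genuine
    filament models add smoothness, self-avoidance, and a Gibbs-type
    energy weighting; this only illustrates the random-curve idea."""
    rng = random.Random(seed)
    points = [(0.0, 0.0, 0.0)]
    for _ in range(n_steps):
        x, y, z = points[-1]
        points.append((x + rng.gauss(0, step),
                       y + rng.gauss(0, step),
                       z + rng.gauss(0, step)))
    return points

curve = random_filament()
# End-to-end distance grows like step * sqrt(n_steps): diffusive scaling.
print(math.dist(curve[0], curve[-1]))
```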

3. Non-classical random graphs and the modelling of complex networks

Complex networks describe a wide range of systems in nature and society. The Internet, collaboration in science, power grids, and gene expression regulatory networks can all, at some level, be described as random graphs. These are not the classical random graphs of the Erdős-Rényi theory, nor the random subsets of the d-dimensional integer lattice which appear in percolation. Rather, these graphs have a small diameter (the so-called small-world phenomenon), yet differ from mean-field models in that they exhibit clustering properties. The degree distribution is not the same for all the nodes, and the networks are well described as being scale-free. This results in a high degree of error tolerance together with an extreme vulnerability to targeted attacks. Moreover, scale-free networks are prone to the spreading of infections.

In the existing (non-rigorous) literature these networks are built one node at a time, with a procedure that is not well defined in the limit, yet all properties are then studied assuming that one is dealing with an infinite object. As the number of heuristic studies grows, there is a need for a rigorous theory, which will likely involve percolation, mathematical statistical mechanics, the theory of random graphs, as well as classical probability theory and combinatorics.
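
The node-by-node growth procedure alluded to above is, in its simplest heuristic form, preferential attachment. The following sketch (the parameters, seed graph, and degree threshold are illustrative choices) grows such a graph and exhibits the heavy-tailed degree distribution characteristic of scale-free networks:

```python
import random
from collections import Counter

def preferential_attachment(n_nodes, m=2, seed=0):
    """Grow a graph one node at a time; each new node links to m
    existing nodes chosen with probability proportional to degree.
    Returns the edge list."""
    rng = random.Random(seed)
    edges = [(0, 1)]          # seed graph: a single edge
    stubs = [0, 1]            # node i appears deg(i) times in this list
    for new in range(2, n_nodes):
        targets = set()
        while len(targets) < min(m, new):
            targets.add(rng.choice(stubs))   # degree-biased choice
        for t in targets:
            edges.append((new, t))
            stubs.extend((new, t))
    return edges

degrees = Counter(v for e in preferential_attachment(10_000) for v in e)
hubs = sum(1 for d in degrees.values() if d >= 50)
# A heavy tail of high-degree hubs, absent in an Erdős-Rényi graph
# of the same size and density.
print(hubs)
```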