
Florent's course.

Information Theory

Coding theory

When we store or transmit data, no system is perfect: some bits of information are incorrectly stored, retrieved, or transmitted.

The purpose of this field is to come up with coding and decoding methods that allow us to detect and correct errors with high probability.

We shall provide an introduction with simple codes.

Topics covered on the board

  • Binary symmetric channel
  • Coding and decoding one bit to obtain arbitrary error
  • Example: for an error probability of 1/6, repeating each bit 3 times, then 5 times.

This is essentially a practical version of Shannon's noisy-channel coding theorem.

Details here

So in a nutshell, with this repetition trick we can transform a binary symmetric channel into a binary symmetric channel with an arbitrarily low error rate.
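As a sketch (not part of the course notes themselves), the error rate of the repetition trick with majority-vote decoding can be computed exactly; the crossover probability p = 1/6 matches the example above:

```python
from math import comb

def repetition_error_prob(p, n):
    """Exact probability that majority-vote decoding of an n-fold
    repetition code fails on a binary symmetric channel with
    crossover probability p (n odd)."""
    # Decoding fails exactly when more than half of the n copies are flipped.
    return sum(comb(n, k) * p**k * (1 - p)**(n - k)
               for k in range(n // 2 + 1, n + 1))

p = 1 / 6
print(repetition_error_prob(p, 3))  # 2/27   ≈ 0.0741
print(repetition_error_prob(p, 5))  # 23/648 ≈ 0.0355
```

Both values are well below the raw error rate 1/6 ≈ 0.167, and increasing n drives the error rate as low as we like, at the cost of sending n bits per information bit.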

It is not very practical, because this is achieved at a very high cost in transmitted bits compared with the actual information we wish to send. We shall therefore look for cheaper alternatives.

Detection

If we are ready to forget about correction and concentrate on detection there is a very simple trick.

We transmit some bits of information b_1\ldots b_n with one additional bit c that is computed from these bits by a very simple method, that is, a certain function f of n arguments such that f(b_1\ldots b_n)=c.

At reception of some word b'_1\ldots b'_nc', we check whether f(b'_1\ldots b'_n)=c'. If it does, we assume that there is no error (we might be wrong here); if it does not, we assume that there is an error and ask for retransmission of this message (we are correct here).
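The standard choice for f is the parity (XOR) of the bits; a minimal sketch of the scheme just described, assuming that choice:

```python
from functools import reduce

def encode(bits):
    """Append the parity bit c = b_1 XOR ... XOR b_n (even parity)."""
    c = reduce(lambda x, y: x ^ y, bits, 0)
    return bits + [c]

def check(word):
    """Return True if the received word b'_1 ... b'_n c' passes the
    check f(b'_1 ... b'_n) = c', i.e. its total parity is even."""
    return reduce(lambda x, y: x ^ y, word, 0) == 0

word = encode([1, 0, 1, 1, 0, 1, 1])  # 7 data bits, as for ASCII
print(check(word))                    # True: no error detected
word[2] ^= 1                          # flip one bit in transit
print(check(word))                    # False: single-bit error detected
```

Note the limits of the scheme: any odd number of flipped bits is detected, but two flipped bits cancel out and pass the check unnoticed, which is the "we might be wrong here" case above.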

This is used for low-level transmission of information, in particular for ASCII characters: since we tend to use powers of 2 when transmitting and storing information, one bit of the byte is left available when storing the 7 bits of the ASCII encoding.

Details here