Information sources. Source entropy. Source coding and Shannon's theorem on reversible source coding. Elements of rate-distortion theory. Channels for the transmission of information. Channel capacity. Shannon's theorem on channel coding. Block codes. Cyclic codes. Convolutional codes.
T. M. Cover, J. A. Thomas: Elements of Information Theory. John Wiley & Sons, New York, 2nd ed., 2006.
N. Abramson: Information Theory and Coding. McGraw-Hill, New York, 1963.
S. Benedetto, E. Biglieri, V. Castellani: Digital Transmission Theory. Prentice-Hall, 1988.
S. Lin, D. J. Costello Jr.: Error Control Coding: Fundamentals and Applications. Prentice-Hall, 1983.
J. G. Proakis: Digital Communications. McGraw-Hill, 4th ed., 2001.
A. Papoulis, S. U. Pillai: Probability, Random Variables, and Stochastic Processes. McGraw-Hill, 4th ed., 2002.
Learning Objectives
The course aims to provide students with a basic knowledge of the processing of random processes, their compact representation, and their transmission over noisy channels.
At the end of the course, the student is expected to be able to: classify the different methods and criteria used in estimation theory; apply the most suitable estimation methods to specific applications and extract the parameters of interest in the presence of noise; understand how current data-compression standards work; and understand the most popular techniques for channel coding and the reliable transmission of data over noisy channels.
Prerequisites
The student is expected to have a basic knowledge of: signals and systems; probability, random variables, and random processes, and their characterization in the time and frequency domains; vector and matrix representations.
Teaching Methods
Lectures
Type of Assessment
Oral exam
Course program
Information sources. Memoryless sources. Measures of information. Source entropy. Information sources with memory.
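As an illustration of the entropy measure listed above, a minimal Python sketch for a memoryless source (illustrative only, not part of the official course material; the function name is our own):

```python
import math

def entropy(probs):
    """H(X) = -sum_i p_i * log2(p_i), in bits per symbol, for a
    memoryless source with symbol probabilities probs."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(entropy([0.5, 0.5]))  # fair binary source: 1.0 bit/symbol
print(entropy([0.9, 0.1]))  # biased source: about 0.469 bits/symbol
```

The entropy is maximal for equiprobable symbols and drops as the source becomes more predictable.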
Introduction to source coding: classification of codes. Kraft's and McMillan's inequalities. Average length of a code. Shannon's first theorem on reversible coding. Huffman coding. Arithmetic coding. Lempel-Ziv coding. Quantization. Distortion and its measure. Rate-distortion theory. Rate-distortion curve of a source. Rate-distortion function for Gaussian variables.
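The Huffman procedure covered in this part can be sketched as follows (a minimal illustrative implementation, not the course's reference code); for a dyadic source it also shows the average code length meeting the entropy and the Kraft inequality holding with equality:

```python
import heapq

def huffman_lengths(probs):
    """Codeword lengths of a binary Huffman code for the given symbol
    probabilities: repeatedly merge the two least probable nodes."""
    heap = [(p, [i]) for i, p in enumerate(probs)]
    lengths = [0] * len(probs)
    heapq.heapify(heap)
    while len(heap) > 1:
        p1, s1 = heapq.heappop(heap)
        p2, s2 = heapq.heappop(heap)
        for i in s1 + s2:
            lengths[i] += 1  # each merged symbol gains one code bit
        heapq.heappush(heap, (p1 + p2, s1 + s2))
    return lengths

probs = [0.5, 0.25, 0.125, 0.125]
L = huffman_lengths(probs)
print(L)                                  # [1, 2, 3, 3]
print(sum(p * l for p, l in zip(probs, L)))  # 1.75 = source entropy
print(sum(2.0 ** -l for l in L))          # 1.0: Kraft inequality with equality
```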
Introduction to channel coding. Models for the transmission of information. Channel equivocation. Channel capacity. Decision rules. Error probability. Repetition codes. Hamming distance. Shannon's second theorem on reliable transmission over noisy channels. Capacity of the Gaussian channel. Relationship between power spectral density and bit rate or SNR. Shannon limit and region of reliable communication. Error control codes. Detection and correction of errors. Block codes. Linear codes. Hard decoding of linear codes. Cyclic codes. BCH codes. Reed-Solomon codes. Concatenated codes. Interleaving techniques. Convolutional codes. Decoding of convolutional codes: the Viterbi algorithm. Soft decoding. Coding gain.
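As a pointer to the Hamming-distance and decision-rule material above, a minimal sketch of minimum-distance (majority) decoding for the repetition code (illustrative only; function names are our own):

```python
def hamming_distance(a, b):
    """Number of positions in which two equal-length bit strings differ."""
    return sum(x != y for x, y in zip(a, b))

def repetition_decode(word):
    """Minimum-distance decoding for the rate-1/n binary repetition code:
    pick the codeword (all zeros or all ones) closest to the received word."""
    zeros = hamming_distance(word, '0' * len(word))
    return '0' if zeros <= len(word) - zeros else '1'

print(hamming_distance('10110', '10011'))  # 2
print(repetition_decode('101'))            # '1': one channel error corrected
```

A length-n repetition code has minimum distance n and therefore corrects up to (n-1)//2 errors, at the cost of a rate of only 1/n.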