Course taught as: B010706 - TEORIA DELL'INFORMAZIONE E DELLA STIMA (Information and Estimation Theory), Second Cycle Degree in TELECOMMUNICATION ENGINEERING
Teaching Language
Italian
Course Content
Minimum variance unbiased estimation. Cramer-Rao lower bound. Maximum likelihood estimation. Least squares estimation. Bayesian approach to estimation. Sources of information. Source entropy. Source coding and Shannon's theorem on reversible source coding. Rate-distortion theory. Channels for the transmission of information. Channel capacity. Shannon's theorem on channel coding. Block codes. Cyclic codes. Convolutional codes.
Bibliography
S.M. Kay: Fundamentals of Statistical Signal Processing, Volume I: Estimation Theory. Prentice Hall, 1998.
M.H. Hayes: Statistical Digital Signal Processing and Modeling. John Wiley & Sons, 1996.
N. Abramson: Information Theory and Coding. McGraw-Hill, New York, 1963.
T.M. Cover, J.A. Thomas: Elements of Information Theory. John Wiley & Sons, New York, 1991.
K. Sayood: Introduction to Data Compression, Third Edition. Morgan Kaufmann Series in Multimedia Information and Systems, 2005.
S. Lin, D.J. Costello Jr.: Error Control Coding: Fundamentals and Applications. Prentice-Hall, 1983.
J.G. Proakis, M. Salehi: Communication Systems Engineering. Prentice-Hall, 1994.
J.G. Proakis: Digital Communications, 4th Ed. McGraw-Hill, 2001.
S. Benedetto, E. Biglieri: Principles of Digital Transmission: With Wireless Applications. Kluwer Academic, 1999.
A. Papoulis, S.U. Pillai: Probability, Random Variables, and Stochastic Processes, 4th Ed. McGraw-Hill, 2002.
D. Manolakis, V.K. Ingle, S.M. Kogon: Statistical and Adaptive Signal Processing: Spectral Estimation, Signal Modeling, Adaptive Filtering and Array Processing. Artech House, 2005.
Learning Objectives
The course aims to provide students with a basic knowledge of the processing of random signals, their compact representation, and their transmission over noisy channels.
At the end of the course, the student is expected to be able to: classify the different methods and criteria used in estimation theory; apply the most suitable estimation method to a specific application and extract the parameters of interest in the presence of noise; understand how current data compression standards work; understand the most popular techniques for channel coding and the reliable transmission of data over noisy channels.
Prerequisites
The student is expected to have a basic knowledge of: signals and systems; probability, random variables and random processes, and their characterization in the time and frequency domains; vector and matrix representations. These topics will be recalled at the beginning of the course.
Teaching Methods
Lectures.
Type of Assessment
The final exam consists of two tests. The first, at the student's choice, is either a written test in which the student answers questions on the topics of the course, or the development of a computer program. The second is an oral exam.
Course program
Review of random variables and random processes. Introduction to the estimation problem. Observed data and signal models. PDF of the data. Bias and unbiased estimators. Minimum variance unbiased (MVU) estimators. Cramer-Rao lower bound (CRLB). Fisher information. CRLB and transformation of parameters. CRLB for signals in AWGN. Sufficient statistics. Neyman-Fisher factorization. Rao-Blackwell-Lehmann-Scheffé theorem. Linear signal model. Estimator for the linear signal model and its covariance. Generalized linear signal model. Best linear unbiased estimator (BLUE). Maximum likelihood estimator (MLE). Asymptotic properties of the MLE. Numerical computation of the MLE. MLE for a vector of parameters. Least squares (LS) estimator. LS estimator with a linear model. Weighted LS. Geometrical interpretation of the LS estimator. Bayesian approach to estimation. Prior and posterior PDF. MMSE Bayesian estimator. Generalized linear Bayesian model. Bayesian risk. Maximum a posteriori (MAP) estimation, for both scalar and vector parameters. Examples of MAP estimation. Linear MMSE (LMMSE) estimation. Geometric interpretation of LMMSE estimation. Wiener filtering.
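To make the estimation part concrete, here is a minimal Python sketch (illustrative only, not official course material) of the classical introductory example: estimating a DC level A in additive white Gaussian noise, x[n] = A + w[n]. The sample mean is the MVU/ML estimator and attains the CRLB sigma^2/N, which a Monte Carlo run can verify empirically:

```python
import random
import statistics

# Sketch: estimating a DC level A in white Gaussian noise,
# x[n] = A + w[n], w[n] ~ N(0, sigma^2). The sample mean is the
# MVU/ML estimator and attains the Cramer-Rao lower bound sigma^2 / N.
random.seed(0)
A, sigma, N = 3.0, 1.0, 1000

def estimate_dc(samples):
    """Sample mean: the MVU estimator of a DC level in AWGN."""
    return sum(samples) / len(samples)

# Monte Carlo check: the empirical variance of the estimator over many
# independent data records should be close to the CRLB sigma^2 / N.
trials = 500
estimates = [estimate_dc([A + random.gauss(0.0, sigma) for _ in range(N)])
             for _ in range(trials)]
crlb = sigma**2 / N
emp_var = statistics.variance(estimates)
print(f"CRLB = {crlb:.5f}, empirical variance = {emp_var:.5f}")
```

The values A, sigma, N and the number of trials are arbitrary choices for the illustration; with 500 trials the empirical variance typically lands within a few percent of the bound.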
Sources of information. Memoryless sources. Measures of information. Source entropy. Joint and conditional entropy. Extended sources and their entropy. Sources of information with memory (Markov sources). Entropy of Markov sources. Adjoint source. Extension of sources with memory and their entropy. Introduction to source coding: nonsingular codes, uniquely decodable codes. Instantaneous codes. Kraft's inequality. McMillan's inequality. Average length of a code. Shannon's first theorem on reversible coding (for sources without and with memory). Coding of extended sources. Compact codes. Huffman coding. Arithmetic coding. Lempel-Ziv coding. Relative entropy. Mutual information. Entropy of continuous sources: differential entropy. Differential entropy of a vector of jointly Gaussian variables. Quantization. Uniform and nonuniform quantization. Lloyd-Max algorithm. Waveform coding. Distortion and its measure. Rate-distortion theory. Rate-distortion curve of a source. Rate-distortion function for Gaussian, uniform, and Laplacian variables.
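As a small illustration of the source-coding topics (a hedged sketch, not course material), the following Python fragment computes the entropy of a memoryless source and the average codeword length of a Huffman code built for it; Shannon's first theorem guarantees H(S) <= L_avg < H(S) + 1 for a compact code. The four-symbol source with dyadic probabilities is a hypothetical example:

```python
import heapq
from math import log2

# Hypothetical memoryless source with dyadic probabilities, for which
# the Huffman code exactly attains the entropy.
probs = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}

def entropy(p):
    """Entropy of a memoryless source, in bits per symbol."""
    return -sum(pi * log2(pi) for pi in p.values())

def huffman_lengths(p):
    """Return codeword lengths via the classic Huffman merge procedure."""
    # Heap of (probability, tiebreak, symbols); repeatedly merge the two
    # least probable nodes, lengthening every symbol they contain.
    heap = [(pi, i, [s]) for i, (s, pi) in enumerate(sorted(p.items()))]
    heapq.heapify(heap)
    lengths = {s: 0 for s in p}
    counter = len(heap)
    while len(heap) > 1:
        p1, _, syms1 = heapq.heappop(heap)
        p2, _, syms2 = heapq.heappop(heap)
        for s in syms1 + syms2:
            lengths[s] += 1
        heapq.heappush(heap, (p1 + p2, counter, syms1 + syms2))
        counter += 1
    return lengths

H = entropy(probs)
L = sum(probs[s] * l for s, l in huffman_lengths(probs).items())
print(f"H(S) = {H:.3f} bits/symbol, Huffman average length = {L:.3f}")
```

For this source H(S) = 1.75 bits/symbol and the Huffman code attains it exactly; with non-dyadic probabilities the average length falls strictly between H(S) and H(S) + 1.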
Introduction to channel coding. Models for the transmission of information: binary symmetric channel (BSC), discrete memoryless channel, waveform channel, Gaussian channel. Channel matrix. A priori and a posteriori entropy. Channel equivocation. Channel capacity. Noiseless channels (deterministic channels). Cascade of channels. Capacity of the BSC. Decision rules. Error probability. Repetition codes. Hamming distance. Shannon's second theorem on reliable transmission over noisy channels. Gaussian channel capacity. Relationship between power spectral density and bit rate or SNR. Shannon limit and region of reliable communication. Error control codes. Detection and correction of errors. Block codes. Linear codes. Code generator matrix. Parity-check matrix. Hard decoding of linear codes: syndrome computation. Standard array. Cyclic codes. Generator polynomial. Parity polynomial. Computation of the code generator matrix in systematic form. Decoding of cyclic codes. BCH codes. Reed-Solomon codes. Convolutional codes. Decoding of convolutional codes: the Viterbi algorithm. Soft decoding. Coding gain of a channel code. Concatenated codes. Interleaving techniques.
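The block-coding chain above (generator matrix, parity-check matrix, syndrome decoding) can be sketched in a few lines of Python; the systematic (7,4) Hamming code below is an illustrative choice, not necessarily the construction treated in class. All arithmetic is modulo 2:

```python
# Sketch: systematic (7,4) Hamming code with generator matrix G = [I | P]
# and parity-check matrix H = [P^T | I3]; hard decoding corrects a single
# bit error by matching the syndrome to a column of H.
P = [[1, 1, 0],
     [1, 0, 1],
     [0, 1, 1],
     [1, 1, 1]]  # parity part of G; all 7 columns of H are distinct and nonzero

def encode(msg):
    """c = [msg | msg @ P] (mod 2)."""
    parity = [sum(msg[i] * P[i][j] for i in range(4)) % 2 for j in range(3)]
    return msg + parity

def syndrome(word):
    """s = H @ word^T (mod 2), with H = [P^T | I3]; all-zero for a codeword."""
    return [(sum(word[i] * P[i][j] for i in range(4)) + word[4 + j]) % 2
            for j in range(3)]

def decode(word):
    """Correct at most one bit error, then strip the parity bits."""
    s = syndrome(word)
    if any(s):
        # Columns of H: rows of P for the data bits, identity for parity bits.
        cols = [P[i] for i in range(4)] + [[int(j == k) for j in range(3)]
                                           for k in range(3)]
        word = word[:]
        word[cols.index(s)] ^= 1  # flip the bit flagged by the syndrome
    return word[:4]  # systematic code: the first 4 bits are the message

msg = [1, 0, 1, 1]
corrupted = encode(msg)
corrupted[2] ^= 1  # single error introduced by the channel
print("decoded:", decode(corrupted))
```

Any single-bit error yields a nonzero syndrome equal to the corresponding column of H, which is why this code corrects one error per 7-bit block.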