E1 244 — Detection and Estimation Theory (3:0), Jan-Apr 2019

Instructor: Aditya Gopalan, ECE 2.09, Dept. of ECE, E-mail: first-name AT iisc.ac.in

Class time: TTh 11:30—13:00

Place: ECE 1.08

Course Description: The course presents an introductory treatment of the problems of detection and estimation in the framework of statistical inference. Broadly speaking, detection asks whether a property holds, while estimation seeks the value of a quantity, in both cases on the basis of observations (data). The course is theoretical in flavour and is suitable for beginning graduate students who wish to gain a basic understanding of the tools of mathematical statistics.
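
As a rough illustration of this distinction (a minimal sketch, not part of the course material; the Gaussian model, sample size, and threshold below are arbitrary choices made for concreteness), both problems are posed for the same data set of n noisy observations of an unknown mean:

    import numpy as np

    rng = np.random.default_rng(0)

    # Illustrative model (an assumption, not course material): n iid Gaussian
    # observations with unknown mean theta and known unit variance.
    theta_true = 1.0
    n = 50
    x = theta_true + rng.standard_normal(n)

    # Detection: decide between H0: theta = 0 and H1: theta = 1. For this model
    # the likelihood ratio test reduces to comparing the sample mean with a
    # threshold; 0.5 (the midpoint) corresponds to equal priors and equal costs.
    print("declare H1" if x.mean() > 0.5 else "declare H0")

    # Estimation: report a value for theta itself. Under this model the maximum
    # likelihood estimate is the sample mean.
    print("theta_hat =", round(x.mean(), 3))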

Contents: Hypothesis testing: Neyman-Pearson theorem, likelihood ratio test and generalized likelihood ratio test, uniformly most powerful tests, multiple-decision problems, detection of deterministic and random signals in Gaussian noise, detection in non-Gaussian noise, sequential detection, introduction to nonparametric testing. Parameter estimation: unbiasedness, consistency, Cramér-Rao bound, sufficient statistics, Rao-Blackwell theorem, best linear unbiased estimation, maximum likelihood estimation. Bayesian estimation: MMSE and MAP estimators, Wiener filter, Kalman filter, Levinson-Durbin and innovation algorithms.
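
As an informal preview of two formulas that recur throughout the course (standard textbook statements, not tied to any particular lecture): the likelihood ratio test compares the likelihood ratio of the observation against a threshold, and the Cramér-Rao bound lower-bounds the variance of any unbiased estimator by the reciprocal of the Fisher information (scalar parameter case):

    \[
      \Lambda(y) \;=\; \frac{p_1(y)}{p_0(y)} \;\underset{H_0}{\overset{H_1}{\gtrless}}\; \tau,
      \qquad
      \operatorname{Var}_\theta\big(\hat{\theta}(Y)\big) \;\ge\; \frac{1}{I(\theta)},
      \quad
      I(\theta) \;=\; \mathbb{E}_\theta\!\left[\Big(\tfrac{\partial}{\partial\theta}\log p_\theta(Y)\Big)^{2}\right].
    \]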

Prerequisites: Probability/stochastic processes

Text/References:

(1) H. Vincent Poor. An Introduction to Signal Detection and Estimation (2nd ed.). Springer-Verlag, New York, NY, USA, 1994.

(2) George Casella and Roger L. Berger. Statistical Inference (2nd ed.). Duxbury Press, Pacific Grove, CA, USA, 2002.

Grading Policy: Homework assignments (including programming exercises): 25%; Midterm exam: 25%; Final exam: 50%

Homework assignments:

Exams:

Lecture record:

  • 1) [3/1/19] Introduction, Hypothesis testing

  • 2) [8/1/19] Bayesian hypothesis testing

  • 3) [15/1/19] Minimax hypothesis testing

  • 4) [17/1/19] Minimax hypothesis testing

  • 5) [19/1/19] Neyman-Pearson hypothesis testing

  • 6) [22/1/19] Neyman-Pearson hypothesis testing

  • 7) [24/1/19] Composite hypothesis testing — Bayesian criterion

  • 8) [29/1/19] Composite hypothesis testing — Neyman-Pearson criterion

  • 9) [31/1/19] Signal detection — known signals in iid noise

  • 10) [5/2/19] Signal detection — locally optimum coherent detection, coherent detection in general Gaussian noise

  • 11) [7/2/19] Signal detection — signals with random parameters, envelope detector

  • 12) [12/2/19] Signal detection — performance analysis of the envelope detector

  • 13) [14/2/19] Signal detection — completely random signals, energy detector

  • 14) [19/2/19] Sequential detection

  • 15) [26/2/19] Sequential detection — SPRTs and Wald’s approximations

  • 16) [7/3/19] Performance analysis of detectors — Chernoff bounds

  • 17) [9/3/19] Introduction to estimation, Method of moments estimators

  • 18) [12/3/19] Maximum likelihood estimators, Bayes estimators

  • 19) [14/3/19] Mean-square error for estimators, Best unbiased estimators

  • 20) [26/3/19] Cramér-Rao lower bound for unbiased estimation

  • 21) [28/3/19] Sufficient statistics

  • 22) [30/3/19] Minimal sufficient statistics

  • 23) [2/4/19] Rao-Blackwell theorem, Complete statistics, Lehmann-Scheffé theorem

  • 24) [4/4/19] Asymptotic performance — consistency and asymptotic efficiency of MLEs

  • 25) [6/4/19] General decision theory framework for estimators, conditional expectation as the optimal MMSE estimator

  • 26) [9/4/19] State estimation in linear dynamical systems, Kalman-Bucy filter

  • 27) [11/4/19] Kalman-Bucy filter, Linear MMSE estimation theory

  • 28) [13/4/19] Yule-Walker equations, Levinson-Durbin algorithm


Last updated: 12-Feb-2024, 11:46:36 IST