E1 244 : Detection and Estimation Theory
January 2015
Instructor
Lecture Hours
Lectures: Tuesdays and Thursdays : 09:00 - 10:30
Make-up lectures and problem set discussions: Fridays : 09:00 - 10:30
Location
Course syllabus
Hypothesis testing: Neyman-Pearson lemma, likelihood ratio test and generalised likelihood ratio test, uniformly most powerful test, multiple-decision problem, detection of deterministic and random signals in Gaussian noise, detection in non-Gaussian noise, sequential detection.
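As a minimal illustration of the first topic block (not taken from the course notes), the sketch below runs a Neyman-Pearson likelihood ratio test for a known deterministic signal in white Gaussian noise; the signal, noise level, and false-alarm target are arbitrary choices for the example.

```python
import numpy as np
from statistics import NormalDist

rng = np.random.default_rng(0)

# Illustrative model:  H0: y = w,   H1: y = s + w,   w ~ N(0, sigma^2 I).
# The log-likelihood ratio test reduces to comparing the correlation s.y
# with a threshold set for a target false-alarm probability alpha.
n, sigma, alpha = 100, 1.0, 0.05
s = 0.5 * np.ones(n)                  # assumed known signal (arbitrary choice)

# Under H0, T = s.w ~ N(0, sigma^2 ||s||^2), which fixes the threshold.
tau = sigma * np.linalg.norm(s) * NormalDist().inv_cdf(1 - alpha)

# Monte Carlo estimates of the false-alarm and detection probabilities.
trials = 20000
w = rng.normal(0.0, sigma, size=(trials, n))
pfa = np.mean(w @ s > tau)            # close to alpha by construction
pd = np.mean((w + s) @ s > tau)       # much larger than alpha for this SNR
```

The reduction of the likelihood ratio to a correlator (matched filter) for this model is covered in Sections II-D and III-B of Poor.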
Parameter estimation: unbiasedness, consistency, Cramér-Rao bound, sufficient statistics, Rao-Blackwell theorem, best linear unbiased estimation, maximum likelihood estimation, method of moments.
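A small sketch of the second topic block (illustrative, with arbitrarily chosen parameters): for i.i.d. N(theta, sigma^2) samples, the ML estimate of the mean is the sample average, which is unbiased and attains the Cramér-Rao bound sigma^2/n. A Monte Carlo check:

```python
import numpy as np

rng = np.random.default_rng(1)

# X_1..X_n i.i.d. N(theta, sigma^2); the MLE of theta is the sample mean.
theta, sigma, n = 2.0, 1.5, 50        # illustrative values
trials = 20000
x = rng.normal(theta, sigma, size=(trials, n))
mle = x.mean(axis=1)

crb = sigma**2 / n                    # Cramér-Rao lower bound on the variance
emp_bias = mle.mean() - theta         # should be near zero (unbiased)
emp_var = mle.var()                   # should sit at the bound (efficient)
```

Here the MLE is efficient for every n; the large-sample asymptotics of Lectures 21-23 show that, more generally, the MLE attains the bound only asymptotically.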
Bayesian estimation: MMSE and MAP estimators, Wiener filter, Kalman filter, Levinson-Durbin and innovation algorithms.
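To make the third topic block concrete, here is a minimal scalar Kalman filter on an assumed first-order state-space model (all parameters are arbitrary choices for the example); it computes the linear MMSE estimate of the state from past observations and should beat the raw measurements in mean-square error.

```python
import numpy as np

rng = np.random.default_rng(2)

# Illustrative scalar model:
#   x_{k+1} = a x_k + u_k,  u_k ~ N(0, q)
#   y_k     = x_k + v_k,    v_k ~ N(0, r)
a, q, r = 0.9, 0.1, 1.0
T = 500

# Simulate a state trajectory and its noisy observations.
x = np.zeros(T); y = np.zeros(T)
x[0] = rng.normal()                   # x_0 ~ N(0, 1)
for k in range(T):
    y[k] = x[k] + rng.normal(0.0, np.sqrt(r))
    if k + 1 < T:
        x[k + 1] = a * x[k] + rng.normal(0.0, np.sqrt(q))

# Kalman filter: alternate measurement updates and time updates.
xhat = np.zeros(T)
est, p = 0.0, 1.0                     # prior mean and error variance for x_0
for k in range(T):
    K = p / (p + r)                   # Kalman gain
    est = est + K * (y[k] - est)      # measurement update
    p = (1 - K) * p
    xhat[k] = est
    est = a * est                     # time update (prediction)
    p = a * a * p + q

mse_filter = np.mean((xhat - x) ** 2)
mse_raw = np.mean((y - x) ** 2)       # filtering reduces the error variance
```

The gain and variance recursions here are the scalar case of the Kalman-Bucy filter treated in Lecture 24; the orthogonality principle of Lecture 25 is what makes this the linear MMSE estimator.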
Course Grade
- 20/100 : Homeworks - Fortnightly
- 30/100 : Mid-term - Thursday 12 February 2015 (09:00 - 10:30, 1.5 hours)
- 50/100 : Final - Friday 24 April 2015 (09:00 - 12:00, 3 hours)
Homeworks
Reference Texts
- H. V. Poor, An Introduction to Signal Detection and Estimation, Springer-Verlag, 2nd edition, 1994.
- G. Casella and R. L. Berger, Statistical Inference, Cengage Learning, 2nd edition, 2002.
Lecture topics
- 06/01: Lecture 1: Motivating examples
- 08/01: Lecture 2: Bayesian hypothesis testing (Section II-B of Poor)
- 13/01: Lecture 3: Minimax hypothesis testing (Section II-C of Poor)
- 22/01: Lecture 4: Minimax hypothesis testing continued
- 23/01: Lecture 5: Neyman-Pearson lemma (Section II-D of Poor)
- 27/01: Lecture 6: Examples
- 29/01: Lecture 7: Composite hypotheses
- 12/02: Mid-term
- 19/02: Lecture 8: Detector structures (Section III-B of Poor)
- 20/02: Lecture 9: Detector structures - dependent Gaussian noise
- 24/02: Lecture 10: Detector structures - signals with random parameters, stochastic signals, estimator-correlator
- 03/03: Lecture 11: Performance bounds (Section III-C-2 of Poor)
- 05/03: Lecture 12: Sequential detection: Formulation and SPRT (Section III-D of Poor)
- 06/03: Lecture 13: Sequential detection: Wald-Wolfowitz theorem
- 10/03: Lecture 14: Sequential detection: Wald's approximations
- 12/03: Lecture 15: Bayesian estimation
- 13/03: Lecture 16: Bayesian estimation
- 17/03: Lecture 17: Sufficiency, minimal sufficiency, examples
- 19/03: Lecture 18: Factorisation theorem, Rao-Blackwell theorem
- 20/03: Lecture 19: Complete families, examples, complete sufficient statistic
- 24/03: Lecture 20: Exponential families, examples, Cramér-Rao lower bound
- 26/03: Lecture 21: MLE, large sample asymptotics, consistency, efficiency
- 27/03: Lecture 22: Examples, bias-variance tradeoff
- 31/03: Lecture 23: Consistency, asymptotic normality of MLE, recursive algorithms
- 07/04: Lecture 24: Signal estimation and tracking, Kalman-Bucy filter
- 09/04: Lecture 25: Linear MMSE estimation, Orthogonality principle
- 10/04: Lecture 26: Wiener-Hopf equation, Yule-Walker equation and the Levinson algorithm
- 16/04: Lecture 27: Noncausal and causal Wiener-Kolmogorov filtering
- 24/04: Final exam: 09:00 - 12:00