EE278: Course Plan

Ayfer Ozgur, Stanford University, Autumn 2023

The course plan is approximate.

  • Lecture 1: Course Overview

  • Lecture 2: Review of Probability Inequalities and Limit Theorems (References: EE178 notes or Sections 1.6.1-1.6.2 and 1.7.1-1.7.3 of Gallager)

  • Lectures 3-4: Concentration Inequalities, Moment Generating Function, Sub-Gaussian Random Variables (References: Chapter 2 of Vershynin and Appendix B of Shalev-Shwartz & Ben-David)

  • Lectures 5-6: Machine Learning, Empirical Risk Minimization, Learning via Uniform Convergence (Reference: Chapters 2-4 of Shalev-Shwartz & Ben-David)

  • Lecture 7: Random Vectors, Mean and Covariance Matrix (Reference: Sections 3.1 to 3.4 of Gallager)

  • Lecture 8: Properties of a Covariance Matrix, Spectral Decomposition, Karhunen-Loève Expansion (Reference: Sections 3.1 to 3.4 of Gallager)

  • Lecture 9: Principal Component Analysis, Gaussian Random Vectors (Reference: Sections 3.1 to 3.4 of Gallager)

  • Lecture 10: Gaussian Random Vectors (Reference: Sections 3.1 to 3.4 of Gallager)

  • Lecture 11: Detection/Hypothesis Testing (Reference: Sections 8.1 to 8.2 of Gallager)

  • Lecture 12: Detection/Hypothesis Testing: Examples (Reference: Sections 8.1 to 8.2 of Gallager)

  • Lecture 13: No class. Democracy Day!

  • Lecture 14: Midterm

  • Lecture 15: Detection/Hypothesis Testing for Vector Gaussian Channel, Estimation (Reference: Sections 8.1 to 8.2 and 10.1 to 10.2 of Gallager)

  • Lecture 16: MMSE Estimation, Sufficient Statistics (Reference: Sections 10.1 to 10.2 of Gallager)

  • Lecture 17: Recursive Estimation and Kalman Filtering (Reference: Sections 10.1 to 10.2 of Gallager)

  • Lecture 18: Random Processes, Stationarity (Reference: Section 3.6 of Gallager)

  • Lecture 19: Gaussian Random Processes, Auto-Correlation Function (Reference: Section 3.6 of Gallager)

  • Lecture 20: Power Spectral Density