EE 376A: Course Outline

Stanford University, Tsachy Weissman, Winter Quarter 2018-19

The course outline, along with slides, notes, and references (if any), will be posted on this page; see the introductory lecture slides for the tentative course outline.

  • Lecture 1, Jan 8: Introductory lecture [slides]

  • Lecture 2, Jan 10: Information Measures (illustrative sketch after the schedule)

  • Lecture 3, Jan 15: Asymptotic Equipartition Property (AEP) and near-lossless compression (illustrative sketch after the schedule)

  • Lecture 4, Jan 17: Variable-length compression: Huffman code, Kraft-McMillan inequality (illustrative sketch after the schedule)

  • Lecture 5, Jan 22: Dmitri Pavlichin: Entropy lower bound, genome + tabular compression, general tips [slides] [Bioinformatics] [IEEE Spectrum]

  • Lecture 6, Jan 24: Shubham Chandak: Stationary processes and entropy rate (Ref: Cover & Thomas 4.1, 4.2), universal compressors: LZ77 (Ref: Cover & Thomas 13.4, 13.5), application to genomic data compression [slides] [paper] (illustrative sketch after the schedule). Additional resources on the convergence of LZ: [EE376C notes].

  • Lecture 7, Jan 29: Reliable communication I: channel capacity, examples (illustrative sketch after the schedule)

  • Lecture 8, Jan 31: Bianka Hofmann: From STEM to STEAM [slides]

  • Lecture 9, Feb 5: Mert Pilanci: Polar Codes [slides] [SC decoding example]

  • Lecture 10, Feb 7: Reliable communication II: Fano's inequality, channel coding converse

  • Lecture 11, Feb 12: Reliable communication III: Joint AEP, Achievability using random coding; Introduction to lossy compression

  • Lecture 12, Feb 14: Lossy compression I: rate-distortion function, examples (illustrative sketch after the schedule)

  • Lecture 13, Feb 19: Lossy compression II: direct and converse parts of main result

  • Lecture 14, Feb 21: Joint source-channel coding and the separation theorem

  • Lecture 15, Feb 26: Rosa Cao: The Allure of Informational Explanations [slides]

  • Lecture 16, Feb 28: Kedar Tatwawadi: Information Theory meets Machine Learning [slides] [SCW’19 panel]

  • Lecture 17, Mar 5: Irena Fischer-Hwang: Image Compression: From theory to practice [slides], Additional resources: [GIF], [PNG 1, PNG 2], [JPEG], [Human compression]

  • Lecture 18, Mar 7: David Tse: Deconstructing the Blockchain to Approach Physical Limits [slides] [paper] [short YouTube video]

  • Lecture 19, Mar 12: Yihui Quek: Intro to quantum information theory (illustrative sketch after the schedule). References: Nielsen and Chuang: Sections 1.1, 1.2, 1.3 (background), 1.3.7 (teleportation), 2.3 (super-dense coding); Wilde: Sections 6.1, 6.2.3 (super-dense coding), 6.2.4 (teleportation).

  • Lecture 20, Mar 14: Jonathan Dotan: Science of information in entertainment and Web 3.0 (related talk on April 11; related course next quarter: GSBGEN 578)
    Concluding remarks by instructor [slides]
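
As a toy illustration of the information measures in Lecture 2, the following Python sketch computes entropy and mutual information for a made-up joint pmf (the numbers are arbitrary, not an example from the course):

    import numpy as np

    def entropy(p):
        """Shannon entropy in bits of a pmf given as an array."""
        p = np.asarray(p, dtype=float).ravel()
        p = p[p > 0]                          # convention: 0 log 0 = 0
        return float(-np.sum(p * np.log2(p)))

    def mutual_information(pxy):
        """I(X;Y) = H(X) + H(Y) - H(X,Y) for a joint pmf matrix."""
        return (entropy(pxy.sum(axis=1)) + entropy(pxy.sum(axis=0))
                - entropy(pxy))

    pxy = np.array([[0.4, 0.1],
                    [0.1, 0.4]])              # made-up joint pmf of (X, Y)
    print(entropy(pxy.sum(axis=1)))           # H(X) = 1.0 bit
    print(mutual_information(pxy))            # I(X;Y) ≈ 0.278 bits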
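
A quick empirical check of the AEP from Lecture 3, for an arbitrarily chosen i.i.d. Bernoulli(0.2) source: the normalized log-probability of a long random sequence concentrates around the entropy H(0.2) ≈ 0.722 bits:

    import numpy as np

    rng = np.random.default_rng(0)
    p, n = 0.2, 100_000
    x = rng.random(n) < p                          # i.i.d. Bernoulli(p) sequence
    k = int(x.sum())                               # number of ones
    # -(1/n) log2 p(x^n) for an i.i.d. source
    neg_log_prob = -(k * np.log2(p) + (n - k) * np.log2(1 - p)) / n
    H = -(p * np.log2(p) + (1 - p) * np.log2(1 - p))
    print(neg_log_prob, H)                         # both ≈ 0.722 bits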
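
A minimal Huffman construction for Lecture 4, on a made-up source pmf; it verifies the Kraft inequality and compares the average codeword length against the entropy lower bound:

    import heapq

    def huffman_lengths(probs):
        """Codeword lengths of an optimal binary prefix (Huffman) code."""
        lengths = [0] * len(probs)
        heap = [(p, i, [i]) for i, p in enumerate(probs)]  # (prob, tiebreak, symbols)
        heapq.heapify(heap)
        nxt = len(probs)
        while len(heap) > 1:
            p1, _, s1 = heapq.heappop(heap)
            p2, _, s2 = heapq.heappop(heap)
            for s in s1 + s2:
                lengths[s] += 1                    # merged symbols go one level deeper
            heapq.heappush(heap, (p1 + p2, nxt, s1 + s2))
            nxt += 1
        return lengths

    probs = [0.4, 0.3, 0.2, 0.1]                   # made-up source pmf
    L = huffman_lengths(probs)
    print(L)                                       # [1, 2, 3, 3]
    print(sum(2.0 ** -l for l in L))               # Kraft sum <= 1 (here exactly 1)
    print(sum(p * l for p, l in zip(probs, L)))    # avg length 1.9 >= entropy ≈ 1.846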
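
A bare-bones greedy LZ77 parse for Lecture 6, emitting (offset, match length, next symbol) triples with an unbounded window; this is only a sketch for intuition, as practical implementations bound the window and entropy-code the triples:

    def lz77_parse(s):
        """Greedy LZ77 parse into (offset, match length, next symbol) triples."""
        i, out = 0, []
        while i < len(s):
            best_off = best_len = 0
            for j in range(i):                     # try every earlier start
                l = 0
                # matches may overlap the current position, as in real LZ77
                while i + l < len(s) - 1 and s[j + l] == s[i + l]:
                    l += 1
                if l > best_len:
                    best_off, best_len = i - j, l
            out.append((best_off, best_len, s[i + best_len]))
            i += best_len + 1
        return out

    print(lz77_parse("abababab"))
    # [(0, 0, 'a'), (0, 0, 'b'), (2, 5, 'b')]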
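
For Lecture 7, the closed-form capacities of the two standard example channels, the binary symmetric channel (C = 1 - h2(p)) and the binary erasure channel (C = 1 - e), evaluated at arbitrary illustrative parameters:

    from math import log2

    def h2(p):
        """Binary entropy function, in bits."""
        return 0.0 if p in (0.0, 1.0) else -p * log2(p) - (1 - p) * log2(1 - p)

    def bsc_capacity(p):
        """Binary symmetric channel with crossover p: C = 1 - h2(p)."""
        return 1 - h2(p)

    def bec_capacity(e):
        """Binary erasure channel with erasure probability e: C = 1 - e."""
        return 1 - e

    print(bsc_capacity(0.11))   # ≈ 0.500 bits per channel use
    print(bec_capacity(0.5))    # 0.5 bits per channel use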
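
For Lecture 12, the rate-distortion function of a Bernoulli(p) source under Hamming distortion, R(D) = h2(p) - h2(D) for D up to min(p, 1-p), evaluated at made-up parameters:

    from math import log2

    def h2(p):
        """Binary entropy function, in bits."""
        return 0.0 if p in (0.0, 1.0) else -p * log2(p) - (1 - p) * log2(1 - p)

    def rd_bernoulli(p, D):
        """R(D) = h2(p) - h2(D) for 0 <= D < min(p, 1-p); zero beyond that."""
        return 0.0 if D >= min(p, 1 - p) else h2(p) - h2(D)

    print(rd_bernoulli(0.5, 0.1))   # 1 - h2(0.1) ≈ 0.531 bits per symbol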
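
For Lecture 19, a small NumPy sketch of super-dense coding: Alice applies one of four Pauli operations to her half of a shared Bell pair, and Bob's Bell-basis measurement recovers both classical bits. States are represented as plain length-4 vectors; this is a toy linear-algebra demo, not code from the lecture:

    import numpy as np

    X = np.array([[0, 1], [1, 0]])                 # Pauli X
    Z = np.array([[1, 0], [0, -1]])                # Pauli Z
    I2 = np.eye(2)

    bell = np.array([1, 0, 0, 1]) / np.sqrt(2)     # shared |Phi+> = (|00> + |11>)/sqrt(2)

    # Alice encodes two classical bits with I, X, Z, or XZ on her qubit
    encodings = {(0, 0): I2, (0, 1): X, (1, 0): Z, (1, 1): X @ Z}

    # Columns: Bell basis Phi+, Psi+, Phi-, Psi- (Bob's joint measurement)
    B = np.column_stack([
        [1, 0, 0, 1], [0, 1, 1, 0], [1, 0, 0, -1], [0, 1, -1, 0],
    ]) / np.sqrt(2)

    for bits, U in encodings.items():
        state = np.kron(U, I2) @ bell              # Alice's qubit is the first factor
        probs = np.abs(B.T @ state) ** 2           # Bell-measurement outcome pmf
        print(bits, "->", int(np.argmax(probs)))   # each bit pair gives a distinct outcome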

Last year's (Winter 2018) course material

The lecture notes from last year are provided below, and the lecture videos recorded by SCPD are available on Canvas. The timestamps connecting the topics to the lecture videos are available here. The textbook used last year was Elements of Information Theory by Cover and Thomas.

  • Jan 9: Introduction to Information Theory I

  • Jan 11: Introduction to Information Theory II

  • Jan 16: Information Measures

  • Jan 18: Asymptotic Equipartition Property (AEP)

  • Jan 23: Variable-length Lossless Compression

  • Jan 25: Kraft-McMillan Inequality and Huffman Coding

  • Jan 30: Optimality of Huffman Codes, Communication and Channel Capacity

  • Feb 1: Channel Capacity, Information Measures for Continuous RVs

  • Feb 6: AWGN channel, Joint AEP

  • Feb 8: Channel Coding Theorem: Direct Part

  • Feb 13: Channel Coding Theorem: Converse Part

  • Feb 15: Lossy Compression and Rate Distortion Theory

  • Feb 20: Method of Types

  • Feb 22: Sanov's Theorem (illustrative sketch after this list)

  • Feb 27: Strong, Conditional and Joint Typicality

  • Mar 1: Strongly Typical Sequences and Rate Distortion

  • Mar 6: Strongly Typical Sequences and Rate Distortion 2

  • Mar 8: Joint Source-Channel Coding

  • Mar 13: Joint Source-Channel Coding 2, Slides

  • Mar 15: Information Theory in Machine Learning
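
For the Feb 20/22 notes on the method of types and Sanov's theorem, a small numerical check with arbitrary parameters: the exact binomial tail probability of seeing at least 60% heads in n fair coin flips has an exponent that approaches the divergence D(0.6||0.5) ≈ 0.0201 nats as n grows:

    from math import comb, log

    def kl(p, q):
        """Binary KL divergence D(p||q) in nats."""
        return p * log(p / q) + (1 - p) * log((1 - p) / (1 - q))

    q, a = 0.5, 0.6                                # fair coin, deviation threshold
    for n in (100, 400, 1600):
        # exact tail P(at least a*n heads) via the binomial pmf
        tail = sum(comb(n, k) for k in range(int(a * n), n + 1)) / 2 ** n
        print(n, -log(tail) / n)                   # exponent approaches D(0.6||0.5)
    print(kl(a, q))                                # ≈ 0.0201 nats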

  • Midterm

  • Final