Teaching Assistants: Aditya Vikram Singh and Shubham Kumar Jha

- Midterm 1 will be held on Saturday, December 5, 2:00pm-5:00pm on MS Teams.
- Homework 2 has been posted. Quiz based on this HW will be held on November 24, 5:00pm.
- Homework 1 has been posted. Quiz based on this HW will be held on November 3, 5:00pm.
- The first (online) class, to discuss class logistics, will be held on Thursday, October 1, 5:00pm. [MS Teams link]

- Online Classes: Tu-Th 5:00-6:30pm
- You can join the weekly discussion meetings on Thursdays at 5:00pm here: [MS Teams link]
- You can join the class Team by clicking here: [MS Teams link]

- Unit 1: What is information, probabilistic modeling of uncertainty (randomness), review of basic probability, Markov and Chebyshev inequality, Limit theorems [pdf] [video1] [video2] [video3] [video4] [video5]
- Unit 2: The source model, examples, a basic compression problem, Shannon entropy [pdf] [video1] [video2] [video3] [video4] [video5]
- Unit 3: Randomness and uncertainty, total variation distance, generating uniform random variables, generating random variables from uniform random variables, typical sets and entropy [pdf] [video1] [video2] [video3] [video4] [video5]
- Unit 4: Basic formulations of statistical inference, examples, Neyman-Pearson formulation and likelihood ratio tests (LRT), Stein's lemma and KL divergence, properties of KL divergence [pdf] [video1] [video2] [video3] [video4] [video5]
- Unit 5: How many coin tosses are needed to test the bias of a coin?, ML and MAP tests, conditional entropy and mutual information, Fano's inequality (without proof) [pdf] [video1] [video2] [video3] [video4] [video5]
- Unit 6: Chain rules; nonnegativity, boundedness, and convexity/concavity of measures of information; and data processing inequality. [pdf] [video1] [video2] [video3] [video4]
- Unit 7: Fano's inequality, variational formula for KL divergence, Pinsker's inequality, transportation inequality and continuity of entropy. [pdf] [video1] [video2] [video3] [video4] [video5]
- Unit 8: Lower bound for compression and Shannon's source coding theorem, lower bound for hypothesis testing and Stein's lemma, lower bound for randomness generation, a strong converse for source coding, lower bounds for minimax risk. [pdf] [video1] [video2] [video3] [video4] [video5]
- Unit 9: Compression using variable length codes, prefix-free codes and Kraft's inequality, Shannon codes, Huffman codes. [pdf] [video1] [video2] [video3] [video4]
- Unit 10: Minimax redundancy, compression using word frequency and method of types, arithmetic codes, online probability assignment. [pdf] [video1] [video2] [video3] [video4]
- Unit 11: Hash and universal hash functions, fast database using hash tables, information theoretic lower bound. [pdf] [video1] [video2]
- Unit 12: Channel coding and channel capacity, duality bounds and examples. [pdf] [video1] [video2]
- Unit 13: Sphere packing bound for BSC, an optimal scheme for BSC, converse proof, general achievability. [pdf] [video1] [video2] [video3] [video4]
- Unit 14: Gaussian channel, mutual information for general distributions, differential entropy, capacity of Gaussian channel, power allocation and water-filling argument. [pdf] [video1] [video2] [video3] [video4]
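As a quick illustration of the quantities that recur throughout Units 2-7, here is a minimal sketch (not part of the course materials) computing Shannon entropy and KL divergence for finite distributions:

```python
import math

def entropy(p):
    """Shannon entropy H(p) in bits (Unit 2)."""
    return -sum(x * math.log2(x) for x in p if x > 0)

def kl_divergence(p, q):
    """KL divergence D(p||q) in bits (Unit 4).

    Assumes q[i] > 0 wherever p[i] > 0 (absolute continuity).
    """
    return sum(x * math.log2(x / y) for x, y in zip(p, q) if x > 0)

# A fair coin carries 1 bit of uncertainty; a biased coin carries less.
print(entropy([0.5, 0.5]))                     # 1.0
print(entropy([0.9, 0.1]))                     # ~0.469
# D(p||q) >= 0, with equality iff p = q (Unit 4).
print(kl_divergence([0.5, 0.5], [0.9, 0.1]))   # positive
```

These two quantities drive the compression and hypothesis-testing bounds in Units 8-10.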

| Component | Weight | Notes |
|---|---|---|
| Homework | 30% | Roughly once every 2 weeks |
| Mid-term | 30% | Take home (based on the first 7 units) |
| Final exam | 40% | Take home |

- T. Cover and J. Thomas, *Elements of Information Theory*, Second edition, Wiley, 2006.
- I. Csiszár and J. Körner, *Information Theory: Coding Theorems for Discrete Memoryless Systems*, Second edition, Cambridge, 2011.
- J. Wolfowitz, *Coding Theorems of Information Theory*, Probability Theory and Stochastic Processes series, Springer, 1978.
- A. Khinchin, *Mathematical Foundations of Information Theory*, Dover, 2001 edition.
- A. Feinstein, *Foundations of Information Theory*, McGraw-Hill, 1958.
- T. S. Han, *Information Spectrum Methods in Information Theory*, Stochastic Modeling and Applied Probability series, Springer, 2003.
- R. G. Gallager, *Information Theory and Reliable Communication*, Wiley, 1969.

- Compression of random data: fixed and variable length source coding theorems and entropy
- Hypothesis testing: Stein's lemma and Kullback-Leibler divergence
- Measures of randomness: properties of Shannon entropy, Kullback-Leibler divergence, and mutual information
- Transmission over a noisy channel: channel coding theorems, joint source-channel coding theorem, the Gaussian channel
- Lower bounds on minimax cost in parameter estimation using Fano's inequality
- Quantisation and lossy compression: rate-distortion theorem
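The channel-coding topic above has a particularly clean closed form for the binary symmetric channel: the capacity is C = 1 - h(p) bits per channel use, where h is the binary entropy function. A minimal sketch (not part of the course materials):

```python
import math

def binary_entropy(p):
    """Binary entropy h(p) in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p):
    """Capacity of a binary symmetric channel with crossover
    probability p: C = 1 - h(p) bits per channel use."""
    return 1.0 - binary_entropy(p)

print(bsc_capacity(0.0))   # 1.0: noiseless channel
print(bsc_capacity(0.5))   # 0.0: output independent of input
print(bsc_capacity(0.11))  # ~0.5
```

Shannon's channel coding theorem says rates below C are achievable with vanishing error probability, and the converse rules out anything above C.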

- Communication engineers
- Computer scientists interested in data compression
- Computer scientists interested in complexity theory
- Cryptographers interested in notions of security and quantum cryptography
- Data scientists interested in information theoretic methods and measures of information
- Statisticians interested in information theoretic lower bounds
- Physicists interested in Quantum Information Theory