E2 201 Information Theory (Aug-Dec 2020)

Instructor: Himanshu Tyagi
Teaching Assistant: Aditya Vikram Singh and Shubham Kumar Jha

Latest Announcements

  • Midterm 1 will be held on Saturday, December 5, 2:00pm-5:00pm on MS Teams.
  • Homework 2 has been posted. Quiz based on this HW will be held on November 24, 5:00pm.
  • Homework 1 has been posted. Quiz based on this HW will be held on November 3, 5:00pm.
  • The first (online) class, to discuss course logistics, will be held on Thursday, October 1, 5:00pm. [MS Teams link]

Time & Venue

  • Online Classes: Tu-Th 5:00-6:30pm
  • You can join the weekly discussion meetings on Thursdays at 5:00pm here: [MS Teams link]
  • You can join the class Team by clicking here: [MS Teams link]

Homework

  • Homework 1 (Quiz on November 3) [pdf]
  • Homework 2 (Quiz on November 24) [pdf]

Agenda

Recorded video lectures and the corresponding notes will be shared below.

Prerequisites

Undergraduate-level probability and an appetite for formal mathematical reasoning.

Grading

Homework:     30%   Roughly once every two weeks
Mid-term:     30%   Take-home (based on the first 7 units)
Final exam:   40%   Take-home

References

While many very good textbooks are available for Information Theory, there is no required textbook for this course. Below is a subset from which I will draw my lecture notes:
  1. T. Cover and J. Thomas, Elements of Information Theory, Second edition, Wiley, 2006.
  2. I. Csiszár and J. Körner, Information Theory: Coding Theorems for Discrete Memoryless Systems, Second edition, Cambridge, 2011.
  3. J. Wolfowitz, Coding Theorems of Information Theory, Probability Theory and Stochastic Processes series, Springer, 1978.
  4. A. Khinchin, Mathematical Foundations of Information Theory, Dover, 2001 edition.
  5. A. Feinstein, Foundations of Information Theory, McGraw-Hill, 1958.
  6. T. S. Han, Information-Spectrum Methods in Information Theory, Stochastic Modeling and Applied Probability series, Springer, 2003.
  7. R. G. Gallager, Information Theory and Reliable Communication, Wiley, 1968.

Course description

Information Theory is the science of measuring, preserving, transmitting, and estimating information in random data. It was proposed by Shannon as a mathematical theory of communication more than seven decades ago. It provides the fundamental limits of performance for transmission of messages generated by a random source over a noisy communication channel. On the one hand, Information Theory has been the driving force behind the revolution in digital communication and has led to practical data compression and error-correcting codes that meet the fundamental theoretical limits of performance. On the other hand, over the years, techniques and concepts from Information Theory have found applications well beyond communication theory. In this course, we will introduce the basic notions and results of Information Theory, keeping in mind both its fundamental role in communication theory and its varied applications beyond it. This course, and the follow-up advanced courses to be offered in the future, will be of interest to students from various backgrounds.

Broad list of topics

  • Compression of random data: fixed and variable length source coding theorems and entropy
  • Hypothesis testing: Stein's lemma and Kullback-Leibler divergence
  • Measures of randomness: Properties of Shannon entropy, Kullback-Leibler divergence, and Mutual Information
  • Transmission over a noisy channel: channel coding theorems, joint source-channel coding theorem, the Gaussian channel
  • Lower bounds on minimax cost in parameter estimation using Fano's inequality
  • Quantisation and lossy compression: rate-distortion theorem
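To give a concrete feel for the first few quantities above, here is a small illustrative sketch (plain Python, not part of the course materials; the function names and example distributions are my own) computing Shannon entropy, Kullback-Leibler divergence, and mutual information for finite probability distributions:

```python
import math

def entropy(p):
    """Shannon entropy H(p) = -sum_i p_i log2 p_i, in bits."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def kl_divergence(p, q):
    """KL divergence D(p||q) in bits; assumes supp(p) is contained in supp(q)."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def mutual_information(pxy):
    """I(X;Y) = D(P_XY || P_X x P_Y) for a joint pmf given as a 2D list."""
    px = [sum(row) for row in pxy]                 # marginal of X
    py = [sum(col) for col in zip(*pxy)]           # marginal of Y
    return sum(
        pxy[i][j] * math.log2(pxy[i][j] / (px[i] * py[j]))
        for i in range(len(px))
        for j in range(len(py))
        if pxy[i][j] > 0
    )

# A fair coin has entropy 1 bit.
print(entropy([0.5, 0.5]))                                # 1.0
# D(p||q) > 0 whenever p != q on the support of p.
print(kl_divergence([0.5, 0.5], [0.9, 0.1]))
# Independent X and Y give I(X;Y) = 0.
print(mutual_information([[0.25, 0.25], [0.25, 0.25]]))   # 0.0
```

These identities (e.g. mutual information as a KL divergence between the joint and the product of marginals) are exactly the relationships developed in the "Measures of randomness" unit.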

Who should take this course

  • Communication engineers
  • Computer scientists interested in data compression
  • Computer scientists interested in complexity theory
  • Cryptographers interested in notions of security and quantum cryptography
  • Data scientists interested in information theoretic methods and measures of information
  • Statisticians interested in information theoretic lower bounds
  • Physicists interested in Quantum Information Theory