E2 207 Concentration Inequalities (Aug-Dec 2020)

Instructors: Aditya Gopalan and Himanshu Tyagi

Latest Announcements

  • The first (online) class, to discuss class logistics, will be held on Thursday, October 1, at 2:00pm. [MS Teams link]

Time & Venue

  • Online classes: Tuesdays and Thursdays 2pm-3:30pm
  • Link to join the weekly discussion on Thursday at 2pm: [MS Teams link]
  • Link to send a request to join the class Team: [MS Teams link]

Homework

Agenda

Recorded video lectures and the corresponding notes will be shared below.
  • Lecture 1: Introduction to the course: Law of large numbers, central limit theorem (with Berry-Esseen correction), large deviations; Markov and Chebyshev inequalities; course overview; numerical evidence of concentration (see the short numerical sketch below).
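A minimal, self-contained sketch in Python/NumPy, not taken from the course materials: it estimates the probability that the sample mean of n i.i.d. Bernoulli(1/2) variables deviates from 1/2 by at least t via simulation, and compares this with the Chebyshev and Hoeffding bounds mentioned above. The parameter choices (n = 1000, t = 0.05, 200,000 trials) are illustrative only.

    # Numerical evidence of concentration: empirical deviation probability of a
    # sample mean versus the Chebyshev and Hoeffding bounds (illustrative sketch).
    import numpy as np

    rng = np.random.default_rng(0)
    n, trials, t = 1000, 200_000, 0.05   # sample size, Monte Carlo repetitions, deviation

    # Sample mean of n i.i.d. Bernoulli(1/2) variables, repeated `trials` times.
    means = rng.binomial(n, 0.5, size=trials) / n

    empirical = np.mean(np.abs(means - 0.5) >= t)   # estimate of P(|mean - 1/2| >= t)
    chebyshev = min(1.0, 0.25 / (n * t**2))         # Var(mean) = 1/(4n); bound Var/t^2
    hoeffding = min(1.0, 2 * np.exp(-2 * n * t**2)) # two-sided Hoeffding bound

    print(f"empirical deviation probability: {empirical:.4f}")
    print(f"Chebyshev bound:                 {chebyshev:.4f}")
    print(f"Hoeffding bound:                 {hoeffding:.4f}")

Running this shows the empirical deviation probability falling well below the Chebyshev bound and close to the much sharper exponential (Hoeffding) bound, which is the gap the course sets out to explain.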

Course description

It is well known that functions of large numbers of random quantities tend to behave rather predictably and 'less randomly' than their constituents. For instance, the laws of large numbers tell us that the average of many independent random variables is asymptotically the expected value; higher-order refinements such as the central limit theorem and large deviations techniques uncover the asymptotic rate at which this reduction in randomness takes place. However, if one is interested in sharper estimates of the probability of deviation from the typical value, for a fixed number of observations, for functions other than the average, or for functions of dependent random variables, one must turn to more specific measure concentration bounds. Perhaps the most basic, nontrivial examples in this regard are the Markov and Chebyshev inequalities, which are encountered in a first course on probability. This graduate-level course on concentration inequalities will cover the basic material on this classic topic as well as introduce several advanced topics and techniques. The utility of the inequalities derived will be illustrated by drawing on applications from electrical engineering, computer science and statistics.
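For reference, the standard statements of the two inequalities mentioned above, for a real random variable X and any t > 0, are:

    P(X \ge t) \le \frac{\mathbb{E}[X]}{t} \quad (X \ge 0), \qquad
    P\bigl(|X - \mathbb{E}[X]| \ge t\bigr) \le \frac{\mathrm{Var}(X)}{t^2}.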

  1. Introduction & motivation: Limit results and concentration bounds
  2. Chernoff bounds: Hoeffding’s inequality, Bennett’s inequality, Bernstein’s inequality
  3. Variance bounds: Efron-Stein inequality, Poincaré inequality
  4. The entropy method and bounded difference inequalities
  5. Log-Sobolev inequalities and hypercontractivity
  6. Isoperimetric inequalities (Talagrand's convex distance inequality)
  7. The transportation method
  8. Influences and threshold phenomena

Textbook

Stéphane Boucheron, Gábor Lugosi, and Pascal Massart, "Concentration Inequalities," Oxford University Press, 2013.

Prerequisites

A prior course on probability, random processes, or measure theory. Basic mathematical maturity and working familiarity with probability calculations.

Grading

  • Homework (2-4 assignments): 60%
  • Literature review (in groups of two): 30%
  • Class participation: 10%