E2 206 Information and Communication Complexity (Jan-May 2017)

Instructor: Himanshu Tyagi

Latest Announcements

  • HW2 posted. Due on April 30.
  • List of course projects posted here.
  • HW1 posted. Due on Wednesday, February 15.
  • First class on Wednesday, January 4.

Time & Venue

  • Class: W 10:00 am - 12:30 pm in ECE 1.07

Homework

Agenda

Starred lectures are 2.5 hours long.
  1. Basics and motivation
  • Lecture 1: Tree protocols; deterministic, randomized, and distributional communication complexity; relation between randomized and distributional communication complexity
  • Lecture 2: Random hashing for equality (see the sketch after the agenda); 2-universal hash family; resolving a list of guesses; Slepian-Wolf theorem
  • Lecture 3*: Leftover hash lemma (several versions); function graph coloring and protocols for function computation
  • Lecture 4: Simulating a zero-error 1-round protocol using Slepian-Wolf compression; improved simulation of 1-round protocols; amortized distributional complexity and information complexity
  • Lecture 5*: Proof of lower bound for amortized distributional complexity; properties of interactive protocols; Application 1: Circuit complexity, circuit depth and the P vs NP question
  • Lecture 6*: Karchmer-Wigderson duality for circuit depth and communication complexity of relations; lower bound for the depth of a circuit for finding a linear-size matching
  • Lecture 7*: Lower bound for matching (contd.); reduction of matching to set-disjointness. Application 2: Streaming algorithms; Misra-Gries sketch (see the sketch after the agenda); reduction of majority to index and the lower bound on the storage required for 1-pass computation of majority.
  2. Lower bounds
  • Lecture 8*: Lower bounds for deterministic communication complexity: Fooling set (a small worked example appears after the agenda), rectangle size, rank lower bound; bounds among various measures of communication complexity: Randomized vs. deterministic, simple vs. interactive, private vs. public coins, boosting the probability of correctness
  • Lecture 9*: Rectangle lower bound for distributional communication complexity; Razborov's lower bound for randomized communication complexity of set-disjointness
  • Lecture 10*: Partition bound and its relation to other bounds; information complexity approach; information-complexity-based lower bound for set-disjointness
  • Lecture 11*: Holenstein's consistent sampling; Braverman-Rao sampling with communication
  • Lecture 12*: BBCR direct-sum theorem using information complexity; introduction to the parallel repetition lemma
  3. Special Topics
  • Coming up ...
    Holenstein's proof of the parallel repetition lemma; lifting query complexity bounds to communication complexity (the Göös-Pitassi-Watson approach); communication complexity of distributed inference.
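
To give a concrete flavour of Lecture 2, here is a minimal Python sketch of the public-coin random-hashing protocol for equality. It is illustrative only: the inner-product hash family below is one standard 2-universal choice (not necessarily the one used in class), and all names are made up for this example.

    import random

    def hash_bit(x_bits, r_bits):
        # Inner product of the input with a shared random string, mod 2.
        return sum(xi & ri for xi, ri in zip(x_bits, r_bits)) % 2

    def equality_protocol(x_bits, y_bits, repetitions=20):
        # Alice sends one hash bit of x per round; Bob compares it with the
        # corresponding hash bit of y. If x != y, each round catches the
        # difference with probability 1/2, so the protocol errs with
        # probability at most 2**(-repetitions) while communicating only
        # `repetitions` bits, independent of the input length.
        n = len(x_bits)
        for _ in range(repetitions):
            r = [random.randint(0, 1) for _ in range(n)]  # public coins
            if hash_bit(x_bits, r) != hash_bit(y_bits, r):
                return "not equal"
        return "equal"

    # Example: equality_protocol([1, 0, 1, 1], [1, 0, 1, 0]) returns
    # "not equal" with probability at least 1 - 2**-20.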
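
Similarly, the Misra-Gries sketch from Lecture 7 fits in a few lines. The code below is a generic textbook rendering under the usual guarantee (one pass, k - 1 counters, every item with frequency above m/k survives), not necessarily the exact formulation from the lecture.

    def misra_gries(stream, k):
        # One pass with at most k - 1 counters. Every element occurring more
        # than m/k times in a stream of length m ends up among the surviving
        # counters (possibly along with false positives, which a second pass
        # over the stream can remove).
        counters = {}
        for item in stream:
            if item in counters:
                counters[item] += 1
            elif len(counters) < k - 1:
                counters[item] = 1
            else:
                # Decrement every counter; drop those that hit zero.
                for key in list(counters):
                    counters[key] -= 1
                    if counters[key] == 0:
                        del counters[key]
        return counters

    # With k = 2 this keeps a single counter and, in one pass, returns the
    # only possible majority candidate, e.g. "a" for the stream
    # ["a", "b", "a", "c", "a", "a", "b", "a"].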
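
As a small worked example for the fooling-set bound of Lecture 8: for the equality function EQ on n-bit inputs, the 2^n input pairs (x, x) form a fooling set, since EQ(x, x) = 1 for every x while EQ(x, y) = 0 whenever x differs from y. No monochromatic rectangle can therefore contain two of these pairs, so any deterministic protocol for EQ induces at least 2^n rectangles and must communicate at least n bits.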

Course description

How many bits must two (or more) parties communicate to compute a function of their collective data? This fundamental problem of communication complexity, besides being of independent interest, has emerged as a standard tool for proving lower bounds in many areas of theoretical computer science, including computational complexity, circuit complexity, and the memory requirements of streaming algorithms. E2 206 is a graduate-level course aimed at introducing communication complexity to information theorists. In addition to covering the basic concepts, special emphasis will be placed on the use of information-theoretic techniques. In particular, we shall cover the recent developments based on relating information complexity to communication complexity.

Topics to be covered:
  1. Deterministic communication complexity; Randomized communication complexity
  2. Distributional communication complexity; Yao’s minimax theorem
  3. Interactive data compression and function computation
  4. Variable partition model: computation over networks and complexity of implementing VLSI circuits
  5. Set disjointness: Space complexity of streaming algorithms
  6. Communication complexity of relations: Circuit complexity
  7. Lower bounds for deterministic communication complexity
  8. Reduction based lower bounds
  9. Partition lower bound
  10. Information complexity based lower bounds

References

  1. E. Kushilevitz and N. Nisan, Communication Complexity, Cambridge University Press, 1997.
  2. We will draw most of the material from research papers and online lecture notes.

Prerequisites

A course on Information Theory or Computational Complexity Theory. Basic mathematical maturity and working familiarity with probability calculations.

Grading

Homework (2-4 assignments)   60%
Literature review (to be done in groups of two)   30%
Class participation   10%