E2 237, Fall 2024

Statistical Learning Theory


Lectures


Homework


Tests and grading policy


Mid Term Hours: 08:30-10:00
Mid Term Venue: EC 1.08, ECE main building

Final Hours: 14:00-17:00
Final Venue: EC 1.07, ECE main building

  • 13 Sep 2024: Mid Term 1 (25)
  • 18 Oct 2024: Mid Term 2 (25)
  • 22 Nov 2024: Final Exam (50)

Course Syllabus


  • Binary classification: SVM, kernel methods
  • Complexity bounds: bias complexity trade-off, Rademacher complexity, VC-dimension
  • Multiclass classification: decision trees, nearest neighbours
  • Estimation: parameter estimation, nonparametric regression
  • Optimization: stochastic gradient descent, minimax
  • Decision theory: statistical decision theory, large-sample asymptotics
  • Information theoretic bounds: mutual information method, lower bound via hypothesis testing, entropic bounds for statistical estimation, strong data processing inequality

Prerequisite


Instructor’s approval is required to credit this course. The course requires background equivalent to a first graduate course in probability theory and random processes.

Description


The aim of this course is to provide performance guarantees for data-driven algorithms for classification, estimation, and decision problems under uncertainty. These guarantees take the form of upper and lower bounds on algorithm accuracy as a function of the number of samples. The upper bounds are derived from classical complexity results, and the lower bounds follow from information-theoretic techniques.
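As an illustration of the kind of upper bound developed in the course, here is one standard form of a VC-dimension generalization bound (as in Mohri et al., Foundations of Machine Learning); the notation below, with $R$ the true risk and $\widehat{R}$ the empirical risk, is a common convention and is used here only as a sketch:

```latex
% Let H be a hypothesis class of binary classifiers with VC-dimension d,
% and let h be any hypothesis in H. Then, for any \delta > 0, with
% probability at least 1 - \delta over an i.i.d. sample of size m > d,
R(h) \;\le\; \widehat{R}(h)
  + \sqrt{\frac{2d \log\left(\frac{em}{d}\right)}{m}}
  + \sqrt{\frac{\log\left(\frac{1}{\delta}\right)}{2m}}.
```

The first correction term decays roughly as $\sqrt{d \log m / m}$, which is the sense in which accuracy is a function of the number of samples $m$; matching lower bounds of the same order are obtained later in the course via information-theoretic arguments.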

Teams/GitHub/Overleaf Information


Teams

We will use Microsoft Teams for all course-related communication.
Please do not send any email regarding the course.
You can sign up for the course team Statistical-Learning-2024 using the following code: 3m0ywvq.

Instructor


Parimal Parag
Office: EC 2.17
Hours: By appointment.

Time and Location


Classroom: EC 1.07, ECE main building
Hours: Tue/Thu 08:30-10:00.

Teaching Assistants


Varshini Mylabathula
Email: varshinim@iisc
Hours: By appointment.

References


Information Theory: From Coding to Learning, Yury Polyanskiy and Yihong Wu, Cambridge University Press, 2023.

Information-theoretic Methods for High-dimensional Statistics, Yihong Wu, Lecture notes.

High-Dimensional Statistics: A Non-asymptotic Viewpoint, Martin Wainwright, Cambridge University Press, 2019.

Foundations of Machine Learning, Mehryar Mohri, Afshin Rostamizadeh, and Ameet Talwalkar, 2nd edition, MIT Press, 2018.

Introduction to Statistical Learning Theory, Olivier Bousquet, Stephane Boucheron, and Gabor Lugosi, Lecture notes.