General Info:

- Lecturer: Madhu Sudan; MD 339; email: first name at cs dot harvard dot edu; Office Hours: Thursday 2:30-4pm

- TF: TBD

- Time and Location: TuTh 1-2:30 in MD 119.

- Course announcement
- Grading Policy
- Scribe availability (Sample files for scribes: sample.tex; Uses: preamble.tex and fullpage.sty)
- Piazza site. (Please make sure you've joined this site to ensure you're getting all announcements.)

**Lecture Notes:**

- Lecture 1: Introduction. Shearer's Lemma. Plan for course. (My notes. Scribe notes (tex, pdf).)

- References for today's lecture: [CFGShearer], [Radhakrishnan]. Advanced reading: [Friedgut], [EllisFKY]

- Some cool 2D projections of 3D objects [Cover of Winkler's book], [The Demaines at work; scroll to the bottom!] (Thanks to Erik Demaine for the links!)
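The projection links above connect to the Loomis-Whitney inequality, the three-dimensional special case of Shearer's Lemma from Lecture 1: for a finite set S in Z^3, |S|^2 <= |P_xy| * |P_yz| * |P_xz|, where each P is a coordinate projection. A quick numeric sketch (my own illustration, not part of the course materials):

```python
from itertools import product

def projections_bound_holds(S):
    """Check Loomis-Whitney: |S|^2 <= |P_xy| * |P_yz| * |P_xz|."""
    pxy = {(x, y) for x, y, z in S}
    pyz = {(y, z) for x, y, z in S}
    pxz = {(x, z) for x, y, z in S}
    return len(S) ** 2 <= len(pxy) * len(pyz) * len(pxz)

# Two extremes: a full cube (bound is tight) and a flat slab (lots of slack
# in two of the three projections).
cube = set(product(range(3), repeat=3))                  # 27^2 = 9 * 9 * 9
slab = {(x, y, 0) for x in range(5) for y in range(5)}   # 25^2 = 25 * 5 * 5
print(projections_bound_holds(cube), projections_bound_holds(slab))
```

The cube achieves equality, which matches the tightness of Shearer's Lemma on product sets.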
- Lecture 2: Entropy. Compression: Asymptotic and Single-shot. (My notes. Scribe notes (tex, pdf). )
- Sources for today's lecture: [Notes from Li and Tulsiani's course]
- Lecture 3: Basics of Entropy, Information, Relative Entropy, etc. (My notes. Scribe notes (tex, pdf).)
- Sources for today's lecture: [Original source is the text by Cover and Thomas, Chapter 2. Scribed notes from Lectures 2 and 3 of this course might also be helpful.]
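As a concrete companion to the definitions covered in Lectures 2-3, here is a minimal sketch (my own, not from the course notes) of Shannon entropy H(P) and relative entropy D(P||Q), both in bits:

```python
from math import log2

def entropy(p):
    """Shannon entropy H(P) in bits; terms with p_i = 0 contribute 0."""
    return -sum(pi * log2(pi) for pi in p if pi > 0)

def kl(p, q):
    """Relative entropy D(P||Q) in bits; assumes supp(P) is inside supp(Q)."""
    return sum(pi * log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

print(entropy([0.5, 0.5]))           # a fair coin carries 1 bit
print(kl([0.5, 0.5], [0.25, 0.75]))  # positive unless P = Q
```

Note the convention 0 * log 0 = 0, handled here by skipping zero-probability outcomes.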

- Lecture 4: CANCELLED (Madhu Travelling)
- Lecture 5: Entropy and Counting - 1: Sums of binomial coefficients. Coin-weighing problem. Presented by Vasileios and Manolis. (Scribe notes (tex, pdf).)
- Some reading materials (for lectures 5 and 6) [Radhakrishnan], [Galvin], [Babu&Radhakrishnan], [GavinskyLSS]
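A standard fact in this part of the course is the entropy bound on binomial sums: for k <= n/2, the tail sum_{i=0}^{k} C(n, i) is at most 2^{n * H(k/n)}, where H is the binary entropy function. A quick numeric check (my own sketch, not from the readings):

```python
from math import comb, log2

def binary_entropy(p):
    """Binary entropy H(p) in bits, with H(0) = H(1) = 0."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * log2(p) - (1 - p) * log2(1 - p)

def bound_holds(n, k):
    """Check sum_{i<=k} C(n, i) <= 2^{n * H(k/n)} for k <= n/2."""
    tail = sum(comb(n, i) for i in range(k + 1))
    return tail <= 2 ** (n * binary_entropy(k / n))

print(all(bound_holds(n, k) for n in range(1, 40) for k in range(n // 2 + 1)))
```

The proof in lecture is the entropy argument: condition a uniform sample from the tail set on its coordinates and compare to n independent Bernoulli(k/n) bits.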
- Lecture 6: Entropy and Counting - 2: Bregman's lemma. Moore bound. Presented by Vasileios and Manolis. (Scribe notes (tex, pdf).)

- Lecture 7: Hypothesis testing, total variation distance, and Pinsker's lemma. Presented by Angela. (Angela's Notes. Scribe notes (tex, pdf).)

- Reading material: Lectures 5 and 6 of Li and Tulsiani's course. Lectures 3 and 4 of Guruswami's course.
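Pinsker's lemma from Lecture 7 bounds total variation by relative entropy: with TV(P, Q) = (1/2) * sum_i |p_i - q_i| and KL measured in nats, TV(P, Q) <= sqrt(KL(P||Q) / 2). A randomized numeric check of the inequality (my own sketch, not course code):

```python
import random
from math import log, sqrt

def tv(p, q):
    """Total variation distance: half the L1 distance."""
    return 0.5 * sum(abs(pi - qi) for pi, qi in zip(p, q))

def kl_nats(p, q):
    """Relative entropy D(P||Q) in nats; all q_i here are positive."""
    return sum(pi * log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def rand_dist(k, rng):
    """A random distribution on k outcomes with full support."""
    w = [rng.random() + 1e-9 for _ in range(k)]
    s = sum(w)
    return [x / s for x in w]

rng = random.Random(0)
# Pinsker: TV(P, Q) <= sqrt(KL(P||Q) / 2); small slack for rounding.
ok = all(
    tv(p, q) <= sqrt(kl_nats(p, q) / 2) + 1e-12
    for _ in range(1000)
    for p, q in [(rand_dist(4, rng), rand_dist(4, rng))]
)
print(ok)
```

The constant depends on the conventions: with KL in bits, the bound becomes sqrt((ln 2 / 2) * KL).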

- Lecture 8: Stability in Shearer's Lemma. Presented by Mustazee. (Scribe notes (tex, pdf).)

- Reading material [EllisFKY].
- Lecture 9: Introduction to Communication Complexity. (My notes. Scribe notes (tex, pdf).)
- Some References: [ChattopadhyayPitassi], [Radhakrishnan's lecture notes].
- Lecture 10: Set Disjointness-I: Information Complexity. Presented by Alex W. (My notes, Alex's notes, Scribed notes (tex, pdf)).
- Set disjointness. Reading material: [Bar-YossefJKS], Lecture 12 and Lecture 13 of Harsha et al's course.
- Lecture 11: Set Disjointness-II: Hellinger Distance. Presented by Minjae. (My chaotic notes (Part 1, Part 2). Scribed notes (tex, pdf))
- Lectures 12-14: Direct Sum in Communication Complexity and Internal Information Complexity.
- Lecture 12: Introduce Internal Information Complexity. Direct Sum for IC. (My notes. Scribe notes (tex, pdf).)
- Lecture 13: Compression of low-IC protocols. (My notes. Scribe notes (tex, pdf).)
- Lecture 14: Information Equals (!) Amortized Communication. Presented by Guannan. (Scribe notes (tex, pdf).)

- Reading material for the above lectures:
- Compressing Interactive Communication: [BarakBCR]; Lecture 16 from Harsha et al.'s course.
- Amortized communication complexity: [BravermanRao]

- Lecture 15: Data Structure Lower Bounds via Communication Complexity. (Scribe notes (tex, pdf).)

- Reading materials: [BroMiltersenNSW], [Patrascu].
- Lecture 16: Streaming lower bounds via Communication Complexity. (Scribe notes (tex, pdf).)
- Reading materials: [AlonMatiasSzegedy], [Woodruff], [Bar-YossefJKST], Harsha et al. lecture notes for Lecture 22.
- Lecture 17: Algorithmic Lovasz Local Lemma. Presented by Themis and Ali. (Scribe notes (tex, pdf).)
- Reading material: [MoserTardos].

- Lectures 18-19: Parallel Repetition Theorem. Presented by Akshay, Pritish, and Tanay. (Notes for Lecture 18 (tex, pdf). Notes for Lecture 19 (tex, pdf).)

- Lectures 20-21: Cancelled.
- Lecture 22: Extension complexity. Presented by Sitan. Reading material: [BravermanMoitra]. Scribe notes (tex, pdf).

- Lecture 23: Graph Entropy and Sorting. Presented by Jack M. Reading material: Lectures 4 and 5 from Anup Rao's course. Scribe notes (tex, pdf).

- Lecture 24: Guest lecture by Thomas Steinke on Adaptive Data Analysis. Thomas's notes.

- Lecture 25: What we didn't cover: topics chosen by students. See Piazza for some posts.

**Paper List (still under construction):**

- An Information Statistics Approach to Data Stream and Communication Complexity.
- A Direct Sum Theorem in Communication Complexity via Message Compression.
- How to Compress Interactive Communication.
- Parallel Repetition: Simplifications and the No-Signaling Case.
- The Homomorphism Domination Exponent.
- A Constructive Proof of the General Lovasz Local Lemma.
- Reed-Muller Codes Achieve Capacity on the Binary Erasure Channel under MAP Decoding.
- Reed-Muller Codes Achieve Capacity on Erasure Channels.
- Entropy-based Bounds on Dimension Reduction in L1.
- Survey: Entropy and Counting.
- Survey: Graph Entropy.

**"Related" Courses:**

- Information Theory and its Applications in Theory of Computation (Guruswami and Cheraghchi at CMU).
- Information Theory in Computer Science (Rao at the University of Washington).
- Information and Coding Theory (Tulsiani and Li at the University of Chicago).
- Information Theory in Computer Science (Braverman at Princeton).
- Communication Complexity (Harsha, Mahajan, and Radhakrishnan at TIFR/IMSc).

- Information, Communication and Complexity Theory (Chakrabarti at Dartmouth).
- Notes from "Communication Complexity (for Algorithm Designers)" (Tim Roughgarden at Stanford).