\documentclass[11pt]{article}
\usepackage{amsfonts}
\usepackage{fullpage}
\usepackage{amsmath}
%\usepackage{epsfig}
\begin{document}
\input{preamble.tex}
\newcommand{\on}{\operatorname}
\lecture{13}{March 08, 2016}{Madhu Sudan}{Wyatt Mackey}
%%%% body goes in here %%%%
\section{Today's Class}
\begin{itemize}
\item We will complete the proof of the direct sum theorem: $CC(f^{\otimes n}) \ge \sqrt{n} \cdot \Omega(CC(f)-1)$
\item Compression of low-information protocols
\begin{itemize}
\item Protocols, Trees, Priors
\item Correlated sampling
\item Path sampling
\item Analysis
\end{itemize}
\end{itemize}
\section{Review}
We showed last time that if $f^{\otimes n}$ has a protocol with $\on{Inf} \le I$ and $\on{Comm} \le C$, then $f$ has a protocol with $\on{Inf} \le I/n$ and $\on{Comm} \le C$.
Today, we will show that if $f$ has a protocol with $\on{Inf} = I$ and $\on{Comm} = C$, then it has a protocol with $\on{Comm} = O(\sqrt{IC} \log C)$. We will do this by compressing interactive communication.
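As a sanity check (a sketch, and note it carries an extra logarithmic factor), these two statements combine to give the direct sum bound. Any protocol reveals at most as much information as it communicates, so a protocol for $f^{\otimes n}$ with $\on{Comm} = C_n$ has $\on{Inf} \le C_n$. By last lecture, $f$ then has a protocol with $\on{Inf} \le C_n/n$ and $\on{Comm} \le C_n$, and compressing it gives
\begin{align*}
CC(f) \le O\!\left(\sqrt{\tfrac{C_n}{n} \cdot C_n}\, \log C_n\right) = O\!\left(\frac{C_n \log C_n}{\sqrt{n}}\right),
\end{align*}
so $C_n = CC(f^{\otimes n}) \ge \Omega\!\left(\sqrt{n}\, CC(f) / \log C_n\right)$, i.e.\ the direct sum theorem up to the $\log$ factor.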
\section{Protocols as Trees with Priors}
We can view protocols as trees: say Alice speaks at the root, and the bits 0 and 1 take us down to the left or right child; eventually we reach a leaf. Call this protocol $\pi$. Our simulation will then do the following: Alice and Bob engage in some conversation following a protocol $\pi'$. At the end, Alice outputs a transcript $\pi_A$ and Bob outputs a transcript $\pi_B$, and we want to guarantee that $\pi_A = \pi_B$ with high probability. Moreover, we want the distribution of $\pi_A$ (and $\pi_B$) to be close to the distribution of transcripts of $\pi$.
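As a concrete (hypothetical) illustration of the tree view, here is a minimal Python sketch of a binary protocol tree in which each internal node records its owner and the probability of sending 1, and a transcript is sampled by walking from the root to a leaf. The class and function names are ours, not from the lecture.

```python
import random

# Sketch of a protocol tree (illustrative, not from the lecture): each
# internal node is owned by Alice ("A") or Bob ("B") and stores the
# probability that its owner sends 1 (i.e., we "go right") at that node.
# Leaves carry the protocol's output.
class Node:
    def __init__(self, owner=None, p_right=None, left=None, right=None, leaf=None):
        self.owner = owner        # "A" or "B" for internal nodes
        self.p_right = p_right    # probability the owner sends 1 here
        self.left, self.right = left, right
        self.leaf = leaf          # output value at a leaf, else None

def sample_transcript(root, rng=random):
    """Walk root-to-leaf, sampling each bit from the current node's
    probability; return the transcript (list of bits) and the leaf value."""
    bits, v = [], root
    while v.leaf is None:
        b = 1 if rng.random() < v.p_right else 0
        bits.append(b)
        v = v.right if b else v.left
    return bits, v.leaf

# Example: a depth-2 tree where Alice speaks first, then Bob.
leaf = lambda x: Node(leaf=x)
tree = Node("A", 0.5,
            left=Node("B", 0.9, leaf("00"), leaf("01")),
            right=Node("B", 0.1, leaf("10"), leaf("11")))
transcript, out = sample_transcript(tree)
assert len(transcript) == 2
assert out == "".join(map(str, transcript))
```

Here the leaf labels encode the full transcript, so the final assertion checks that the sampled path and the leaf agree.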
We have $(X, Y) \sim \mu$. Alice knows $X$ and Bob knows $Y$; from Bob's point of view, $X \sim \mu_{X|Y}$. Given a node $V$ owned by Bob, say that the true probability of going right from this node is $p_V$. Let $p_V^A$ be the probability that Alice thinks we'll go right, and similarly let $p_V^B$ be the probability that Bob thinks we'll go right. Writing $\pi = \pi_1\pi_2\cdots\pi_k$, where $k = C$ is the communication of the protocol, these quantities correspond to the conditional distributions
\[
p_V \leftrightarrow \pi_i \mid \pi_{<i}, X, Y, \qquad
p_V^A \leftrightarrow \pi_i \mid \pi_{<i}, X, \qquad
p_V^B \leftrightarrow \pi_i \mid \pi_{<i}, Y.
\]
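To make the distinction between $p_V$ and $p_V^A$ concrete, here is a toy numeric example (our own, not from the lecture): at a Bob-owned node the next bit depends on $Y$, so Bob knows $p_V$ exactly, while Alice can only average $p_V$ over her posterior on $Y$ given $X$. The prior and the node's behavior below are made-up illustrations.

```python
from fractions import Fraction as F

# Toy correlated prior mu on (X, Y) with X, Y in {0, 1}:
mu = {(0, 0): F(3, 8), (0, 1): F(1, 8), (1, 0): F(1, 8), (1, 1): F(3, 8)}

# Hypothetical behavior at a Bob-owned node: Bob sends 1 with probability
# 9/10 if Y = 1 and 1/10 if Y = 0.  Bob knows p_V exactly since he knows Y.
def p_V(y):
    return F(9, 10) if y == 1 else F(1, 10)

def p_V_A(x):
    """Alice's estimate: E[p_V(Y) | X = x] under mu."""
    total = sum(pr for (xx, _), pr in mu.items() if xx == x)
    return sum(pr * p_V(y) for (xx, y), pr in mu.items() if xx == x) / total

# Since X and Y are positively correlated, Alice's estimate leans toward
# the value of p_V that matches her own input:
assert p_V_A(0) == F(3, 10)   # (3/8 * 1/10 + 1/8 * 9/10) / (1/2)
assert p_V_A(1) == F(7, 10)
```

The same averaging, conditioned additionally on the transcript so far $\pi_{<i}$, gives the general definition of $p_V^A$.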