\documentclass[10pt]{article}
\usepackage{amsfonts}
\newtheorem{define}{Definition}
%\newcommand{\Z}{{\bf Z}}
%\usepackage{psfig}
\usepackage{graphicx}
\oddsidemargin=0.15in
\evensidemargin=0.15in
\topmargin=-.5in
\textheight=9in
\textwidth=6.25in
\begin{document}
\input{preamble.tex}
\lecture{4}{September 20, 2004}{Madhu Sudan}{Victor Chen}
\section{Overview}
In this lecture, we shall examine the asymptotic performance of codes, prove some simple negative results concerning the parameters of codes, and analyze random and greedy codes.
\section{Parameters of a Code}
Let us recall the main parameters of a code from Lecture 3. For a code $C=(n,k,d)_q$, $n$ denotes the blocklength, $k$ denotes the message or information length, $d$ denotes the minimum distance of $C$, and $q$ denotes the alphabet size.
When $C$ is linear, we use a matrix-like notation with square brackets and write $C=[n,k,d]_q$.
In practice, the message may be processed by a streaming encoding scheme; in this course, however, we shall always assume that the message length and blocklength are fixed. \\
To study the asymptotics of a code, we normalize the parameters.
\begin{define}
For an infinite family of codes $C=\{(n_i,k_i,d_i)_q\}^{\infty}_{i=1}$, the rate of $C$ is defined to be
$R= \liminf_{i\rightarrow \infty} \frac{k_i}{n_i}$ and the relative distance is defined to be
$\delta = \liminf_{i\rightarrow \infty} \frac{d_i}{n_i}$.
\end{define}
We use $\liminf$ to guarantee that these two limits exist. Also, note that the alphabet size $q$ is fixed in this definition and does not depend on $n_i$. However, in certain cases it is advantageous to construct an intermediate code whose alphabet size does depend on $n_i$.
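As an illustration (not from the lecture), consider the family of binary Hamming codes $[2^r-1,\, 2^r-1-r,\, 3]_2$: its rate tends to $1$ while its relative distance tends to $0$. A short numerical sketch:

```python
# Parameters of the binary Hamming code family [2^r - 1, 2^r - 1 - r, 3]_2
# (an illustrative example; the family itself is standard).
def hamming_params(r):
    n = 2**r - 1      # blocklength
    k = n - r         # message length
    d = 3             # minimum distance
    return n, k, d

for r in [3, 5, 10, 20]:
    n, k, d = hamming_params(r)
    print(r, k / n, d / n)   # rate k/n -> 1, relative distance d/n -> 0
```

This shows how a family can have excellent rate yet vanishing relative distance, which is why both normalized parameters are tracked together.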
\section{Some Negative Results}
We now examine some negative results. The first we consider is the Singleton bound, due to R.~C.~Singleton. It is perhaps more appropriate to call it the projection bound, as the following simple proof illustrates.
\begin{theorem}[Singleton Bound]
For a code $C=(n,k,d)_q$, $d \leq n - k + 1$. Asymptotically, $R + \delta \leq 1$.
\end{theorem}
\begin{proof}
$C$ has $q^k$ codewords. Project all codewords of $C$ onto the first $k-1$ coordinates.
Since the total number of possible projections is $q^{k-1} < q^k$, by the Pigeonhole Principle
two distinct codewords must have the same projection. These two codewords agree on the first
$k-1$ coordinates, and hence they differ in at most $n-(k-1) = n-k+1$ coordinates, so
$d \leq n-k+1$. Dividing by $n$ and taking limits yields $R + \delta \leq 1$.
\end{proof}
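A minimal numerical sanity check of the bound, using the $[7,4,3]_2$ Hamming code (the particular systematic generator matrix below is an assumption of this sketch):

```python
from itertools import product

# Systematic generator matrix G = [I_4 | P] for the [7,4,3]_2 Hamming code.
G = [
    [1, 0, 0, 0, 1, 1, 0],
    [0, 1, 0, 0, 1, 0, 1],
    [0, 0, 1, 0, 0, 1, 1],
    [0, 0, 0, 1, 1, 1, 1],
]
n, k = 7, 4

def encode(msg):
    # Codeword = msg * G over F_2.
    return tuple(sum(m * g for m, g in zip(msg, col)) % 2
                 for col in zip(*G))

codewords = [encode(m) for m in product([0, 1], repeat=k)]

# Minimum distance: smallest Hamming distance between distinct codewords.
d = min(sum(a != b for a, b in zip(u, v))
        for i, u in enumerate(codewords) for v in codewords[i + 1:])

print(d, n - k + 1)      # Singleton bound: d <= n - k + 1
```

Here the bound is not tight ($3 < 4$); codes meeting it with equality are discussed later.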
One may think that a more careful analysis could yield a tighter bound, but we shall see codes that meet the Singleton Bound in a later lecture. Note, however, that $q$ plays no role in this bound. To bring $q$ into the picture, we re-examine the Hamming bound studied in a previous lecture. Recall that a code $C=(n,k,d)_q$ can correct $\lfloor (d-1)/2 \rfloor$ errors, so the balls of radius $(d-1)/2$ around the codewords in the space $\Sigma^n$ do not overlap. Define $\mathrm{Vol}_q(r,n)$ to be the number of points in a ball of radius $r$ in $\Sigma^n$. Then clearly $q^k \cdot \mathrm{Vol}_q(\frac{d-1}{2}, n) \leq q^n$.
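The volume function and the packing inequality can be checked directly on a small example; for the $[7,4,3]_2$ Hamming code the inequality in fact holds with equality, since that code is perfect. A sketch:

```python
from math import comb

# Vol_q(r, n): number of points in a Hamming ball of radius r in Sigma^n,
# where |Sigma| = q.
def vol(q, r, n):
    return sum(comb(n, i) * (q - 1)**i for i in range(r + 1))

# Hamming bound q^k * Vol_q((d-1)/2, n) <= q^n, checked for the
# [7,4,3]_2 Hamming code.
q, n, k, d = 2, 7, 4, 3
lhs = q**k * vol(q, (d - 1) // 2, n)
print(lhs, q**n)   # 128 128: equality, so the code is perfect
```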
Consider the binary case when $q=2$:
\begin{eqnarray*}
2^k \cdot \mathrm{Vol}_2\left(\frac{d-1}{2},n\right) & \leq & 2^n, \\
2^k \cdot 2^{H_2\left(\frac{d-1}{2n}\right)n - o(n)} & \leq & 2^n, \\
k + H_2\left(\frac{d-1}{2n}\right)n & \leq & n + o(n), \\
R + H_2(\delta/2) & \leq & 1,
\end{eqnarray*}
where the second line uses the estimate $\mathrm{Vol}_2(pn,n) \geq 2^{H_2(p)n - o(n)}$ for $p \leq 1/2$, which follows from Stirling's approximation to the factorial. Hence we have the following binary Hamming bound (also known as the volume bound or packing bound):
\begin{theorem}[Hamming Bound]
For an infinite family of binary codes with rate $R$ and relative distance $\delta$, $R+H_2(\delta/2) \leq 1$.
\end{theorem}
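The entropy estimate behind this derivation, $\mathrm{Vol}_2(pn,n) \approx 2^{H_2(p)n}$ for $p \leq 1/2$, can be illustrated numerically (the sample values of $n$ and $p$ below are arbitrary choices for this sketch):

```python
from math import comb, log2

# Binary entropy function H_2(p).
def H2(p):
    return -p * log2(p) - (1 - p) * log2(1 - p)

# Vol_2(r, n): number of points in a Hamming ball of radius r in {0,1}^n.
def vol2(r, n):
    return sum(comb(n, i) for i in range(r + 1))

# For p <= 1/2, (1/n) * log2 Vol_2(pn, n) converges to H_2(p).
n, p = 2000, 0.2
approx = log2(vol2(int(p * n), n)) / n
print(approx, H2(p))   # the two values are close for large n
```

The normalized log-volume sits just below $H_2(p)$, with the gap shrinking like $O(\log n / n)$, which is exactly the $o(n)$ slack absorbed in the derivation above.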
Now we re-examine the $q$-ary Hamming bound $q^k \cdot \mathrm{Vol}_q(\frac{d-1}{2}, n) \leq q^n$. First observe that $\mathrm{Vol}_q(pn,n)=\sum_{i=0}^{pn} {n \choose i}(q-1)^i$, which is dominated by the term ${n\choose pn}(q-1)^{pn}$ for $0