\documentclass[10pt]{article}
\usepackage{amsfonts}
\newtheorem{define}{Definition}
%\newcommand{\Z}{{\bf Z}}
\usepackage{psfig}
\oddsidemargin=0.15in
\evensidemargin=0.15in
\topmargin=-.5in
\textheight=9in
\textwidth=6.25in
\begin{document}
\input{preamble.tex}
\lecture{1}{September 7, 2005}{Madhu Sudan}{Swastik Kopparty}
\section{Motivation}
\subsection{Historical}
Historically, the first algorithms were for manipulating numbers.
Take a simple example: multiplying two $n$-bit numbers.
We know three algorithms:
\begin{itemize}
\item Repeated addition - $\exp(n)$ time
\item Long multiplication - $n^2$ time
\item FFT based - $n\,\mathrm{polylog}(n)$ time
\end{itemize}
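The second algorithm in the list above admits a direct implementation. The following is a minimal Python sketch of schoolbook (long) multiplication on binary representations; the function name and bit-level framing are illustrative, not from the lecture.

```python
def long_mult(a: int, b: int) -> int:
    """Schoolbook (long) multiplication in base 2: for each set bit
    of b, add a correspondingly shifted copy of a.  With n-bit inputs
    this is n additions of n-bit numbers, i.e. O(n^2) bit operations."""
    result = 0
    shift = 0
    while b:
        if b & 1:                  # current bit of b is set
            result += a << shift   # add a shifted copy of a
        b >>= 1
        shift += 1
    return result

print(long_mult(37, 41))  # 1517, matching 37 * 41
```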
Increasing algebraic sophistication got us better running times.
The roots of both words ``algebra'' and ``algorithm'' are related:
``algorithm'' comes from Al-Khwarizmi, the author of ``Al Jabr'', from which
``algebra'' comes.
\subsection{Problems}
Algebra is a source of many interesting problems, closely related to important computational problems.
They all fall under the theme ``find a solution subject to many constraints''.
Consider the following problems:
\begin{itemize}
\item Sorting
\item SAT: find $x_1, \ldots, x_n$ s.t. for every clause $C_i \in \{C_1, \ldots, C_m\}$, $C_i(x) = \mathrm{true}$.
\item Linear equations: find $x_1, \ldots, x_n$ s.t. for every linear function $l_i \in \{l_1, \ldots, l_m\}$, $l_i(x) = b_i$.
Notice the similarity.
\item Linear programming is similar.
\item Root finding: given $a_0, \ldots, a_n$, find $x$ s.t. $\sum_i a_i x^i = 0$.
\item Primality testing: test if $N$ is a prime number.
\end{itemize}
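Of the problems above, linear equations stand out as one with a classical polynomial-time algorithm. Here is a hedged Python sketch of Gaussian elimination over the rationals; this is standard textbook material, not something the lecture develops, and it assumes a square nonsingular system.

```python
from fractions import Fraction

def solve_linear(A, b):
    """Gaussian elimination over the rationals: solve A x = b exactly.
    A sketch only: assumes A is square and nonsingular."""
    n = len(A)
    # augmented matrix [A | b] with exact rational entries
    M = [[Fraction(A[i][j]) for j in range(n)] + [Fraction(b[i])]
         for i in range(n)]
    for col in range(n):
        # pick a row with a nonzero pivot and swap it into place
        piv = next(r for r in range(col, n) if M[r][col] != 0)
        M[col], M[piv] = M[piv], M[col]
        # eliminate the entries below the pivot
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            M[r] = [M[r][j] - f * M[col][j] for j in range(n + 1)]
    # back-substitution
    x = [Fraction(0)] * n
    for i in range(n - 1, -1, -1):
        s = sum(M[i][j] * x[j] for j in range(i + 1, n))
        x[i] = (M[i][n] - s) / M[i][i]
    return x

# 2x + y = 5, x + 3y = 10  =>  x = 1, y = 3
print(solve_linear([[2, 1], [1, 3]], [5, 10]))
```

Exact rational arithmetic sidesteps the coefficient-growth and rounding issues that a floating-point sketch would hide.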
\subsection{Many intriguing \emph{hard} problems}
Consider the following natural problem on finding solutions to a system of polynomial equations:
\begin{itemize}
\item Given polynomials $P_1, \ldots, P_m$ in variables $x_1, \ldots, x_n$ with $0,1$ coefficients,
\item find real numbers $x_1, \ldots, x_n$ satisfying $P_j(x_1, \ldots, x_n) = 0$ for all $j$.
\end{itemize}
This problem is NP-hard and is solvable in PSPACE.
Assuming the Extended Riemann Hypothesis, it is in the Polynomial Hierarchy! This is representative
of the very intriguing algorithmic nature of algebraic problems.
Factoring integers, of course, is another such question.
The permanent of a matrix is an important function. Recall that
$$\det(A) = \sum_{\pi \in S_n} \mathrm{sgn}(\pi) \prod_{i = 1}^n A_{i\pi(i)}.$$
Analogously, define the permanent
$$\mathrm{per}(A) = \sum_{\pi \in S_n} \prod_{i = 1}^n A_{i\pi(i)}.$$
We don't know a polynomial-time algorithm to compute the permanent of a matrix.
The fact that the permanent is computationally difficult and also a low degree
polynomial has many ramifications for complexity theory.
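Both defining sums can be evaluated directly by brute force over all $n!$ permutations, which is feasible only for tiny matrices -- and that is the point: for the determinant, Gaussian elimination gives a polynomial-time algorithm, while no analogous trick is known for the permanent. A minimal Python sketch of the two sums (illustrative only):

```python
from itertools import permutations
from math import prod

def sgn(pi):
    """Sign of a permutation: (-1) raised to the number of inversions."""
    inv = sum(pi[i] > pi[j]
              for i in range(len(pi)) for j in range(i + 1, len(pi)))
    return -1 if inv % 2 else 1

def det(A):
    """Determinant via its defining sum over S_n (O(n! * n) time)."""
    n = len(A)
    return sum(sgn(pi) * prod(A[i][pi[i]] for i in range(n))
               for pi in permutations(range(n)))

def per(A):
    """Permanent: the same sum with the signs dropped."""
    n = len(A)
    return sum(prod(A[i][pi[i]] for i in range(n))
               for pi in permutations(range(n)))

A = [[1, 2], [3, 4]]
print(det(A))  # -2  (= 1*4 - 2*3)
print(per(A))  # 10  (= 1*4 + 2*3)
```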
\section{Contents of the course}
\subsubsection{Review}
Groups, Rings, Fields. Notable is Gauss' lemma, a sufficient condition for factorization to be
well defined.
\subsubsection{Sample of Algorithms}
FFT, Primality Testing, Matrix Multiplication
\subsubsection{Factorization of polynomials}
Over finite fields; square roots over finite fields. Over the rationals, the Lenstra-Lenstra-Lov\'asz algorithm.
Polynomials in several variables.
\subsubsection{Ideals, Varieties, Algorithms}
After the book of the same title by Cox, Little, and O'Shea. Complexity of solving a system of
polynomial equations in several variables.
\subsubsection{Complexity Results}
This will be of a survey nature.
\subsection{Assignment}
Email madhu@mit.edu with
\begin{itemize}
\item Who you are
\item Your background
\item Why you are in the course
\end{itemize}
\end{document}