## Quantification and Representation of Information: Introduction

### Recapitulation

#### The Structure of Communication Processes

The Shannon Diagram

### Quantification of Information: "Information" is a measure of the decrease of uncertainty

The Information Theory Primer is a beautiful and "gentle" introduction to information theory by Thomas D. Schneider of the NIH Laboratory of Molecular Biology (local pdf copy). See also Information Is Not Entropy, Information Is Not Uncertainty!

Another excellent introductory tutorial is "Claude Shannon and Information Theory" (URL is no longer available), a 1999 Sophomore College presentation by Marina Kassianidou, Vivek Srinivasan and Brent Villalobo of Stanford University (local copy).

Definitions of self-information
Simple examples of self-information ("entropy") calculations: page 1 and page 2
Probabilistic notions in language:
Letter models of English and artificial languages
Word models of English
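The calculations above can be sketched in a few lines of Python. The letter probabilities below are illustrative placeholders (roughly in line with published English letter frequencies), not the values from the linked pages; the point is that rare outcomes carry more self-information, and that entropy is the expected self-information of a whole distribution.

```python
import math

# Illustrative (approximate) probabilities for a few English letters
probs = {"e": 0.127, "t": 0.091, "a": 0.082, "z": 0.001}

def self_information(p):
    """Self-information in bits: the 'surprise' of an outcome with probability p."""
    return -math.log2(p)

for letter, p in probs.items():
    print(f"{letter}: {self_information(p):.2f} bits")

def entropy(distribution):
    """Entropy of a distribution = expected self-information, in bits."""
    return sum(-p * math.log2(p) for p in distribution.values() if p > 0)

# A fair coin is the canonical 1-bit source
fair_coin = {"H": 0.5, "T": 0.5}
print(entropy(fair_coin))  # -> 1.0
```

A common letter like "e" yields about 3 bits of surprise, while a rare letter like "z" yields about 10: the less likely the outcome, the greater the decrease of uncertainty when it occurs.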

### Simple Discrete or Digital Codes - Digital Information

Morse (Vail) Code
ASCII Codes (local copy)
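The contrast between these two codes can be sketched as follows. ASCII is a fixed-length code (7 bits per character in the standard set), while Morse (Vail) code is variable-length: frequent letters get short codewords, an early nod to compression. The Morse table below is a small subset given only for illustration.

```python
# A small subset of International Morse code, for illustration only
morse = {"e": ".", "t": "-", "a": ".-", "q": "--.-", "z": "--.."}

def ascii_bits(text):
    """Standard ASCII is a 7-bit fixed-length code."""
    return 7 * len(text)

def morse_symbols(text):
    """Count dots and dashes in the Morse encoding of `text`."""
    return sum(len(morse[c]) for c in text)

word = "eat"
print(ascii_bits(word))     # -> 21 (7 bits per character, always)
print(morse_symbols(word))  # -> 4  (. + .- + - : short codes for common letters)
```

The fixed-length code spends the same number of bits on every character regardless of how likely it is; the variable-length code exploits letter frequencies, which is exactly the connection to the probabilistic letter models above.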

### UPC (Universal Product Code)
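A UPC-A symbol encodes 12 decimal digits, the last of which is a check digit computed from the first 11: the digits in odd positions are summed and tripled, the digits in even positions are added, and the check digit brings the total up to a multiple of 10. A minimal sketch:

```python
def upc_check_digit(first11):
    """Compute the UPC-A check digit from the first 11 digits (as a string)."""
    digits = [int(d) for d in first11]
    odd_sum = sum(digits[0::2])   # positions 1, 3, 5, ... (1-indexed)
    even_sum = sum(digits[1::2])  # positions 2, 4, 6, ...
    return (10 - (3 * odd_sum + even_sum) % 10) % 10

# The 11 data digits of a UPC-A code; the scanner verifies the 12th
print(upc_check_digit("03600029145"))  # -> 2
```

The check digit adds no new product information; it is redundancy spent to let the scanner detect misreads, a first concrete example of trading bits for reliability.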

#### Definitions of "analog signal":

• "A signal that has a continuous nature rather than a pulsed or discrete nature. Note: Electrical or physical analogies, such as continuously varying voltages, frequencies, or phases, may be used as analog signals."

• "A nominally continuous electrical signal that varies in some direct correlation with another signal impressed on a transducer. Note: For example, an analog signal may vary in frequency, phase, or amplitude in response to changes in physical phenomena, such as sound, light, heat, position, or pressure."

• "The representation of information with a continuously variable physical quantity, such as voltage. Because of this constant changing of the wave shape with regard to its passing a given point in time or space, an analog signal might have a virtually indefinite number of states or values. This contrasts with a digital signal that… has a very limited number of discrete states." (source)
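The contrast drawn in these definitions can be made concrete with a sketch, under assumed toy parameters (a 1 Hz sine wave, 8 samples per second, 4 quantization levels): a continuous "analog" function of time is reduced to a digital signal by sampling it at discrete instants and rounding each sample to one of a small, fixed set of levels.

```python
import math

def analog(t):
    """A continuous 1 Hz sine wave standing in for an analog voltage."""
    return math.sin(2 * math.pi * t)

sample_rate = 8   # samples per second (assumed for illustration)
levels = 4        # a digital signal has a small, fixed number of states

# Sample the continuous signal at discrete times...
samples = [analog(n / sample_rate) for n in range(sample_rate)]
# ...then quantize each sample onto the levels 0 .. levels-1
quantized = [round((s + 1) / 2 * (levels - 1)) for s in samples]

print(quantized)  # every value is one of only `levels` discrete states
```

The analog signal takes a "virtually indefinite number of states"; its digital counterpart here takes exactly four, which is what makes it countable, and hence quantifiable, in bits.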

#### A prime example of analog information is the auditory signal carried by sound:

Sound wave visualization

Other CSCI E129 resources for sound visualization (sigsum, xpsound, and sndwave)

Complex sound signals: Bat signal

Complex sound signals: Human speech signal

### Spectral Analysis

With such complex information signals, how can one even expect to gain any sort of quantitative understanding of the information content of an auditory message? Spectral analysis (beginning with Joseph Fourier's 1822 treatise) gives us the tools to achieve just such a quantitative understanding. The basic idea is that any time-varying signal, no matter how complex, can be represented as a sum of sinusoidal components. Does this help? Yes: we shall see that a particular measure of the complexity of this spectral representation, viz., the "bandwidth", directly relates to the information content of the message. However, before we can appreciate the notions of spectral (Fourier) analysis, or decomposition, we first explore the subject of spectral (Fourier) synthesis.

Spectral Synthesis:

Fourier Recipes
Fourier Synthesis: a nice spectral applet from Professor Fu-Kwun Hwang of the Department of Physics, National Taiwan Normal University: one of many collected at interactive tutorials.
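The synthesis idea behind applets like this one can be sketched directly: a square wave is built up as the sum of its odd sinusoidal harmonics, sin(2πkt)/k for k = 1, 3, 5, …, scaled by 4/π. The more harmonics we include, the closer the sum hugs the ideal square wave.

```python
import math

def square_wave_partial(t, n_harmonics):
    """Partial Fourier sum for a unit square wave of frequency 1 Hz."""
    total = 0.0
    for k in range(1, 2 * n_harmonics, 2):  # odd harmonics k = 1, 3, 5, ...
        total += math.sin(2 * math.pi * k * t) / k
    return 4 / math.pi * total

# At t = 0.25 the ideal square wave equals 1; watch the partial sums close in
for n in (1, 5, 50):
    print(n, square_wave_partial(0.25, n))
```

With one harmonic the approximation is just a sine wave; with fifty it is already very nearly flat-topped. The number of harmonics a channel can carry is, in essence, its bandwidth, which is why bandwidth bounds the information content of the message.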