A Brief Tutorial on:

Information Theory, Excess Entropy and Statistical Complexity:

Discovering and Quantifying Statistical Structure



Course Materials

I produced these lecture notes in July 1997 for use with a series of three lectures I gave at the Santa Fe Institute. I lightly edited them in April 1998 and again in October 2002. The notes are still a little rough in places and aren't as thorough as they should be. Nevertheless, they are (I believe) mostly error-free and may be of some use to others.

If you're looking for an introduction to computational mechanics, I also recommend a recent paper of mine, Discovering Non-Critical Organization: Statistical Mechanical, Information Theoretic, and Computational Views of Patterns in One-Dimensional Spin Systems. This paper includes a fairly lengthy review of information theory and computational mechanics. It's written at a slightly higher level than these lecture notes, but should nevertheless be quite readable.

For more on how the Shannon entropy converges and how this convergence leads to a variety of measures of structure and memory, you might want to see Regularities Unseen, Randomness Observed: Levels of Entropy Convergence, another recent paper.
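As a rough illustration of the quantities discussed in these notes and papers, here is a small Python sketch (not from the notes themselves) that estimates block entropies H(L), the entropy rate h_mu, and the excess entropy E from a symbol sequence. The function names and the block-length cutoff Lmax are my own choices; finite-data estimates like these converge slowly and are meant only to convey the definitions, not to be a careful estimator.

    import math
    from collections import Counter

    def block_entropy(seq, L):
        """Shannon entropy (in bits) of length-L blocks in seq."""
        blocks = [tuple(seq[i:i + L]) for i in range(len(seq) - L + 1)]
        counts = Counter(blocks)
        n = len(blocks)
        return -sum((c / n) * math.log2(c / n) for c in counts.values())

    def entropy_rate_and_excess(seq, Lmax=10):
        """Estimate the entropy rate h_mu and excess entropy E.

        Uses h_mu(L) = H(L) - H(L-1) as the length-L entropy-rate estimate
        and E ~ H(L) - L * h_mu for large L.
        """
        H = [0.0] + [block_entropy(seq, L) for L in range(1, Lmax + 1)]
        h = [H[L] - H[L - 1] for L in range(1, Lmax + 1)]
        h_mu = h[-1]                    # estimate of the asymptotic entropy rate
        E = H[Lmax] - Lmax * h_mu       # excess entropy estimate at Lmax
        return h_mu, E

For example, a period-2 sequence 0101... gives h_mu of approximately 0 and E of approximately 1 bit: the sequence is perfectly predictable once you know its phase, and that phase is exactly one bit of structural information.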

The Lecture Notes


Other tutorials and pedagogical pieces can be found at http://www.santafe.edu/projects/CompMech/tutorials/CompMechTutorials.html




Links to Helpful Places



(Last Updated 10 October 2002.)



Page maintained by David Feldman
College of the Atlantic
105 Eden St.
Bar Harbor, Maine 04609
dave@hornacek.coa.edu