Practical way of explaining "Information Theory"

I was going to recommend Feynman for pop-sci purposes, but on reflection I think it might be a good choice for easing into a serious study as well. You can't really know this stuff without getting the math, but Feynman is so evocative that he sneaks the math in without scaring the horses.

Feynman Lectures on Computation

Covers rather more ground than just information theory, but good stuff and pleasant to read. (Besides, I am obligated to pull for Team Physics. Rah! Rah! Rhee!)


Shannon's original paper, "A Mathematical Theory of Communication", is one very, very important resource for studying this theory. Nobody, NOBODY, should miss it.

By reading it you will understand how Shannon arrived at the theory, which should clear up most of the doubts.
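For reference, the quantity at the heart of that paper is the entropy of a source: the average number of bits per symbol that any lossless code needs. A minimal statement of it, with a worked coin example:

```latex
% Shannon entropy of a source emitting symbols with probabilities p_i:
H(X) = -\sum_{i} p_i \log_2 p_i
% Example: a fair coin (p = 1/2, 1/2) has H = 1 bit per toss;
% a heavily biased coin (p = 0.99, 0.01) has H \approx 0.08 bits.
```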

Also, studying the workings of the Huffman compression algorithm will be very helpful.
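To make that concrete, here is a minimal Python sketch of Huffman coding; the function name and the sample string are my own, chosen just to show how frequent symbols end up with shorter bit strings:

```python
import heapq
from collections import Counter

def huffman_codes(text):
    # Count symbol frequencies, then repeatedly merge the two
    # least-frequent subtrees; each merge prefixes one subtree's
    # codes with '0' and the other's with '1'.
    heap = [(freq, i, {ch: ""})
            for i, (ch, freq) in enumerate(Counter(text).items())]
    heapq.heapify(heap)
    tie = len(heap)  # unique tie-breaker so tuples never compare dicts
    while len(heap) > 1:
        f1, _, left = heapq.heappop(heap)
        f2, _, right = heapq.heappop(heap)
        merged = {ch: "0" + code for ch, code in left.items()}
        merged.update({ch: "1" + code for ch, code in right.items()})
        heapq.heappush(heap, (f1 + f2, tie, merged))
        tie += 1
    return heap[0][2]

text = "abracadabra"
codes = huffman_codes(text)
bits = "".join(codes[ch] for ch in text)
print(codes)
print(f"{len(bits)} bits vs {8 * len(text)} bits as plain ASCII")
```

On "abracadabra" the frequent 'a' gets a 1-bit code and the rare 'c' and 'd' get 3-bit codes, so the whole string packs into 23 bits instead of 88.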

EDIT:

An Introduction to Information Theory

John R. Pierce

seems good according to the Amazon reviews (I haven't tried it).

[found by Googling "information theory layman"]


My own view on "Information Theory" is that it's essentially just applied math/statistics, but because it's applied to communications/signals it's been called "Information Theory".

The best way to start understanding the concepts is to set yourself a real task. Say, for example, you take a few pages of your favourite blog, save them as a text file, and then attempt to reduce the size of the file whilst ensuring you can still reconstruct it completely (i.e. lossless compression). You'll start, for example, by replacing all the instances of "and" with a "1"....
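A toy sketch of that exercise in Python; the helper names, word list, and sample sentence are mine, just to illustrate the dictionary-substitution idea, not a real compressor:

```python
def compress(text, wordlist):
    # Swap each common word for a short escaped token; the word
    # list is the shared "dictionary" both sides must agree on.
    # Assumes the text itself never contains the NUL escape byte.
    for i, word in enumerate(wordlist):
        text = text.replace(word, f"\x00{i}\x00")
    return text

def decompress(text, wordlist):
    # Reverse the substitution to recover the original exactly.
    for i, word in enumerate(wordlist):
        text = text.replace(f"\x00{i}\x00", word)
    return text

wordlist = [" and ", " the ", " you "]  # frequent words in your sample
sample = "the cat and the dog and you ran for the hills"
packed = compress(sample, wordlist)
assert decompress(packed, wordlist) == sample  # lossless round trip
print(len(sample), "->", len(packed), "characters")
```

Once you've done this by hand, the natural questions (which words to pick, how short the tokens can be, how far you can go) lead you straight to entropy and Huffman coding.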

I'm always of the opinion that learning by doing is the best approach.