Found 4 comments on HN
I was introduced to Shannon's theory through Pierce's text, which is also surprisingly good and cheap [1].

[1] https://www.amazon.com/Introduction-Information-Theory-Symbo...

The information theory (mathematical) definition of randomness is explained nicely in Chapter V of An Introduction to Information Theory:

http://www.amazon.com/Introduction-Information-Theory-Symbol...

In the particular case of a series of n equally probable, independent outcomes, the entropy is H = -log2(1/n) = log2(n), measured in bits. For example, the entropy of a fair die is log2(6) ≈ 2.58 bits per throw.

In this case, the random event is choosing words from a word list: four words are drawn from a list of fifty thousand, each word with equal probability. So the entropy (the measure of randomness) is log2(50,000) ≈ 15.6 bits per word, or about 62.4 bits for the four-word combination. (The script also adds random numbers or symbols to bring the total up to 90 bits.)
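
The arithmetic is easy to sanity-check. Here is a minimal Python sketch (illustrative only; the helper name is mine, and the 50,000-word list and four-word count simply mirror the numbers above):

    import math

    def entropy_equiprobable(n: int) -> float:
        """Entropy in bits of one choice among n equally probable outcomes: log2(n)."""
        return math.log2(n)

    # A fair die: log2(6) ~= 2.58 bits per throw.
    print(entropy_equiprobable(6))        # ~2.585

    # Four words drawn independently from a 50,000-word list:
    # ~15.6 bits per word, ~62.4 bits for the four-word passphrase.
    per_word = entropy_equiprobable(50_000)
    print(per_word, 4 * per_word)         # ~15.61  ~62.44

Because the draws are independent, the per-word entropies simply add, which is why four words give roughly 4 × 15.6 ≈ 62.4 bits.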

bqe · 2014-04-22 · Original thread
If you want an actual introduction to Information Theory, I'd recommend "An Introduction to Information Theory: Symbols, Signals and Noise".

It covers advanced information-theory topics, including ideas close to Kolmogorov complexity (albeit under a different name), in an approachable way. Highly recommended.

[1]: http://www.amazon.com/An-Introduction-Information-Theory-Mat...

pizza · 2013-05-14 · Original thread
Information theory is a really cool, really practical form of math / computer science. I highly suggest reading up on it (I'm using this book to get through the basic ideas: http://www.amazon.com/Introduction-Information-Theory-Symbol...).

Basically it has to do with measuring entropy / uncertainty in a message, and I'm finding it has applications almost everywhere, from file compression to image editing to highway design to just the way I talk and communicate information. Fun stuff!
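
To make the "entropy / uncertainty in a message" idea concrete, here is a small Python sketch (an illustration only, not taken from the book) that estimates the entropy of a string from its observed character frequencies:

    from collections import Counter
    import math

    def shannon_entropy(message: str) -> float:
        """Estimate bits per symbol from observed character frequencies:
        H = -sum(p * log2(p)) over each distinct character."""
        counts = Counter(message)
        total = len(message)
        return -sum((c / total) * math.log2(c / total) for c in counts.values())

    print(shannon_entropy("aaaa"))           # 0.0 -- a repeated symbol carries no uncertainty
    print(shannon_entropy("abcd"))           # 2.0 -- four equally likely symbols
    print(shannon_entropy("hello, world!"))  # ~3.18 bits per character

Lower entropy means a more predictable message, which is exactly the regularity a file compressor exploits.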
