- The Conquest of Happiness by Bertrand Russell was a pretty good one. He has a lot of ideas that were ahead of their time (positive psychology, etc.). You can see a lot of parallels between his ideas and modern Stoicism (although Russell criticized Stoicism elsewhere, I think he came to some of the same conclusions).
- Introduction To Mathematical Philosophy by Bertrand Russell. Another Russell one. I think this is probably the clearest and easiest-to-understand explanation I've ever read of the underpinnings of mathematical foundations. It's written in a style that should be accessible to almost anyone with a high school education. He wrote it while he was in prison (for his anti-war activism) during WW1. Apparently he left a copy of it with the prison warden.
- An Enquiry Concerning Human Understanding by David Hume. This is worth reading because it is the motivation for basically all of modern philosophy of science (at least in the West). It's also pretty easy to read, and once you've read it you'll find it much easier to understand the other books and papers that are responses to it.
- Turing's Cathedral by George Dyson. This book should be required reading for every programmer or aspiring programmer IMO. I learned so much about the history of computing that I didn't know before reading this. You will not regret buying this one.
- I Am A Strange Loop by Douglas Hofstadter. Obviously everyone knows about GEB, but he also wrote a shorter follow-up that in my opinion expresses his ideas much more clearly. I think that even if you disagree with him, it's worth reading because there are so many things you can take away from this book. For example, he talks about his wife's death, ties that into his theory of mind, and explains the unstated purposes of funerals and wakes.
- An Introduction to Information Theory by John R. Pierce. For someone like me who doesn't really have a very strong math background, this was a very clear intro to the ideas behind information theory, and why they're important historically. I would recommend this to anyone who feels like they need a gentle intro to the ideas and motivation for them. Dover mathematics books in general are great.
- Borrow: The American Way of Debt by Louis Hyman. This is a fantastic historical overview of personal credit in the US that covers the past 120 years or so. I learned a ton from reading this that I had no clue about. Recommended to anyone who wants to understand the origins of credit cards / loans, and how society came to embrace being in debt.
https://archive.org/details/in.ernet.dli.2015.222834/page/n7
https://people.umass.edu/klement/imp/imp-ebk.pdf
https://archive.org/details/humeenquiry00humerich/page/n7
https://www.amazon.com/Turings-Cathedral-Origins-Digital-Uni...
https://www.amazon.com/Am-Strange-Loop-Douglas-Hofstadter/dp...
https://www.amazon.com/Introduction-Information-Theory-Symbo...
https://www.amazon.com/Borrow-American-Debt-Louis-Hyman/dp/0...
In the particular case of a series of n equally probable, independent events, the entropy is given as H = -log2(1/n) = log2(n), measured in bits per event. For example, the entropy of a fair die is log2(6) ≈ 2.58 bits per throw.
In this case, the random event is a word chosen from a word list. Four words are chosen from a list of fifty thousand, with each word having equal probability of being chosen. So the entropy (measure of randomness) is log2(50,000) ≈ 15.6 bits per word, or about 62.4 bits per four-word combination. (The script also adds random numbers or symbols to bring the total up to 90 bits.)
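Here's a minimal Python sketch of the same arithmetic (the 50,000-word list and the four-word count are just the numbers quoted above; the extra digits/symbols step is left out):

```python
import math

def uniform_entropy_bits(n):
    """Entropy of one event with n equally likely outcomes: -log2(1/n) = log2(n)."""
    return math.log2(n)

print(uniform_entropy_bits(6))           # fair die: ~2.58 bits per throw
per_word = uniform_entropy_bits(50_000)  # ~15.6 bits per word
print(per_word, 4 * per_word)            # ~15.6 and ~62.4 bits for four words
```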
It deals with complicated information theory topics similar to Kolmogorov Complexity (albeit by a different name) in an easy-to-approach way. Highly recommended.
[1]: http://www.amazon.com/An-Introduction-Information-Theory-Mat...
Basically it has to do with measuring entropy / uncertainty in a message, and I'm finding it has applications almost everywhere, from file compression to image editing to highway design to just the way I talk and communicate information. Fun stuff!
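As a toy illustration of that "measuring entropy / uncertainty in a message" idea (my own example, not from the book): the Shannon entropy of a message's character frequencies gives the average number of bits per character you'd need to encode it, which is why low-entropy text compresses so well.

```python
from collections import Counter
import math

def shannon_entropy_per_char(message: str) -> float:
    """H = -sum(p * log2(p)) over the observed character frequencies."""
    counts = Counter(message)
    total = len(message)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

print(shannon_entropy_per_char("aaaaaaaa"))             # ~0 bits/char: no uncertainty
print(shannon_entropy_per_char("the quick brown fox"))  # a few bits per character
```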
The systems I've worked on immediately abstract strings, shapes in images, etc., into mathematical shapes and the gaps between edges.
If you dig into an arbitrary array in a variety of places, the fields contain coordinates, not “Hi Mom, kids are ok, blah blah”.
It’s measuring the white space in a thing, where everything but the feature you’re currently interested in is white space; what’s between the features I want?
Then it compares that to the results from other data structures that had the same white-space measurement applied.
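A rough sketch of that "white space" idea (hypothetical data and names; the real systems are far more involved): keep only the coordinates of the detected features, measure the gaps between them, and compare that gap profile against another structure measured the same way.

```python
def gaps(coords):
    """Distances between consecutive feature coordinates (the 'white space')."""
    coords = sorted(coords)
    return [b - a for a, b in zip(coords, coords[1:])]

edges_a = [3, 10, 14, 27]  # e.g. edge positions detected along one image row
edges_b = [5, 12, 16, 29]  # the same shape, shifted by 2

print(gaps(edges_a))  # [7, 4, 13]
print(gaps(edges_b))  # [7, 4, 13] -> identical gap profile, so a likely match
```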
Does it not do what you said, or do you not want to believe it?
I think the issue is the companies being incredibly disingenuous about how this all works.
At the root is elementary information theory: https://www.amazon.com/dp/0486240614
Formal language is 5,000 years old. Human intuition for quantitative assessment of hunger, warmth, supply stocks, tool building, etc. is much older. IMO human language is noise obscuring obviousness. It’s the desktop metaphor of cognition. “Please internalize my language versus observe for yourself.”