• Gödel, Escher, Bach: https://www.amazon.com/Gödel-Escher-Bach-Eternal-Golden/dp/0...
• Crafting Interpreters: https://www.amazon.com/Crafting-Interpreters-Robert-Nystrom/...
• SICP: https://www.amazon.com/Structure-Interpretation-Computer-Pro...
[0] https://www.amazon.com/G%C3%B6dels-Proof-Ernest-Nagel/dp/081...
[1] https://www.amazon.com/G%C3%B6del-Escher-Bach-Eternal-Golden...
1. https://www.amazon.com/Computational-Complexity-Approach-San...
2. https://www.amazon.com/Quantum-Computing-since-Democritus-Aa...
3. https://www.amazon.com/G%C3%B6del-Escher-Bach-Eternal-Golden...
4. https://www.amazon.com/Introduction-Theory-Computation-Micha...
[1] https://www.amazon.com/G%C3%B6del-Escher-Bach-Eternal-Golden...
Gödel, Escher, Bach: An Eternal Golden Braid, by Douglas Hofstadter. Any time I start reading a story that contains recursion, my mind feels warped and, at times, stuck in a loop.
http://amzn.to/1KijebX (affiliate link)
http://www.amazon.com/gp/product/0465026567 (non affiliate link)
http://www.amazon.com/Phi-A-Voyage-Brain-Soul-ebook/dp/B0078...
Or for other views, you might check out V.S Ramachandran (neuroscience): http://www.amazon.com/Brief-Tour-Human-Consciousness-Imposto...
Jeff Hawkins (computer science): http://www.amazon.com/On-Intelligence-Jeff-Hawkins/dp/080507...
Hofstadter (mathematics, cognitive science): http://www.amazon.com/G%C3%B6del-Escher-Bach-Eternal-Golden/...
Those are some of my favorite popular-press books on the subject.
http://en.wikipedia.org/wiki/G%C3%B6del's_incompleteness_the... http://www.amazon.com/G%C3%B6del-Escher-Bach-Eternal-Golden/...
http://www.amazon.com/Code-Language-Computer-Hardware-Softwa...
http://www.amazon.com/Elements-Computing-Systems-Building-Pr...
http://www.amazon.com/Structure-Interpretation-Computer-Prog...
http://www.amazon.com/G%C3%B6del-Escher-Bach-Eternal-Golden/...
http://www.amazon.com/Pragmatic-Programmer-Journeyman-Master...
Good luck!
http://www.amazon.com/G%C3%B6del-Escher-Bach-Eternal-Golden/...
http://www.amazon.com/G%C3%B6del-Escher-Bach-Eternal-G...
Having read this book would, in a fair world, be worth more on your CV/résumé than a large proportion of comp-sci degrees.
Even better, go out of your front door to a real bookshop and get them to order you one. Who knows, you might speak to someone! BONUS!
You can actually ask your question there as well, in case this question goes unnoticed on HN; Quora people are very smart and pretty responsive.
see http://en.wikipedia.org/wiki/Stochastic_process and http://en.wikipedia.org/wiki/Random_walk, and do a search for Random Processes or Stochastic Processes on Amazon
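A random walk is also quick to play with yourself; here's a minimal sketch in Python (the step rule and seed are just for illustration):

```python
import random

def random_walk(steps, seed=0):
    """Simulate a simple 1-D random walk: at each step move +1 or -1."""
    rng = random.Random(seed)
    position = 0
    path = [position]
    for _ in range(steps):
        position += rng.choice((-1, 1))
        path.append(position)
    return path

print(random_walk(10))
```

Running it a few times with different seeds gives a feel for how erratic individual paths are even though the expected position stays at zero.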
Read about Entropy: http://en.wikipedia.org/wiki/Entropy A good book on Information theory can help you put it in context: http://www.amazon.com/s/ref=nb_sb_ss_c_1_18?url=search-alias...
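To make entropy concrete before diving into a book, here's a small Python sketch computing Shannon entropy from symbol frequencies (the example strings are made up):

```python
import math
from collections import Counter

def shannon_entropy(data):
    """Shannon entropy in bits: H = -sum(p * log2(p)) over symbol frequencies."""
    counts = Counter(data)
    total = len(data)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

print(shannon_entropy("aabb"))  # two equally likely symbols -> 1.0 bit
```

A string of all one symbol has zero entropy; the more uniform the symbol distribution, the higher the entropy.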
Check out GMP http://gmplib.org/
If you're philosophically inclined read some existentialists, they deal a lot with irrationality and chaos: http://en.wikipedia.org/wiki/Existentialism
If you're financially inclined, read A Random Walk Down Wall Street: http://www.amazon.com/Random-Walk-Down-Wall-Street/dp/039331 and The Black Swan: http://www.amazon.com/Black-Swan-Impact-Highly-Improbable/dp... You may want to check out his other book as well; it is rather non-technical: http://www.amazon.com/Fooled-Randomness-Hidden-Chance-Market...
To learn more about how Wall Street deals with stock market randomness, read some books on time series analysis and forecasting, e.g. the classic http://www.amazon.com/Time-Analysis-Forecasting-Probability-...
If you are a data scientist at heart, read this great Q&A thread: http://www.quora.com/How-do-I-become-a-data-scientist
I wish I could help you with a link to a clear, non-technical introductory article, but this is all I've got. As random as it gets :)
Some good introductory book on science will probably fit the bill; science, after all, deals primarily with randomness. You may want to check out http://www.amazon.com/G%C3%B6del-Escher-Bach-Eternal-Golden/...
I found it in a used bookstore for $5 once, totally by accident. It's probably the best nonfiction book I've ever started (I unfortunately never bothered to finish, which isn't so good. It's still in my backpack in case I ever get stuck anywhere...).
I would recommend, at 14, getting him utterly hooked on the mindset of CS and related subjects: Gödel, Escher, Bach, etc. If you can get him fascinated with the field, he'll find all the information he needs on his own, better than any list of required reading would provide.
http://video.google.com/videoplay?docid=7654043762021156507
The book is only $26, last I looked. The course software is all Open Source.
You'll see how digital logic is used to construct components like logic units and memory, which is then used to construct a computer, for which you create a computer language, which you then use to write an operating system, and finally, you program games on it.
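The bottom of that stack is easy to get a feel for in code. Here's a hedged sketch (in Python rather than the course's hardware description language) of the course's starting point: every logic gate can be composed from NAND alone:

```python
def NAND(a, b):
    """The one primitive gate: output 0 only when both inputs are 1."""
    return 0 if (a and b) else 1

# Every other gate can be composed from NAND alone:
def NOT(a):    return NAND(a, a)
def AND(a, b): return NOT(NAND(a, b))
def OR(a, b):  return NAND(NOT(a), NOT(b))
def XOR(a, b): return AND(OR(a, b), NAND(a, b))

# A half adder -- the first small step toward an arithmetic logic unit:
def half_adder(a, b):
    return XOR(a, b), AND(a, b)   # (sum bit, carry bit)

print(half_adder(1, 1))  # 1 + 1 = binary 10, i.e. sum 0, carry 1
```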
This will then give you the wherewithal to really understand Gödel, Escher, Bach:
http://www.amazon.com/Godel-Escher-Bach-Eternal-Golden/dp/04...
If you want, you can read the book first, but then go through the course and read it again. The 1st time you read it, much of it will be lost on you, but the 2nd time, you'll have many Ah-HA! moments.
One key is Automata Theory. Understand that, and you can understand what you are trying to ask about.
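To make that concrete, here's a minimal sketch of a deterministic finite automaton, about the simplest object in automata theory (the "even number of 1s" language is just an illustrative choice):

```python
def run_dfa(string, transitions, start, accepting):
    """Run a deterministic finite automaton over an input string."""
    state = start
    for symbol in string:
        state = transitions[(state, symbol)]
    return state in accepting

# DFA accepting binary strings containing an even number of 1s:
even_ones = {
    ("even", "0"): "even", ("even", "1"): "odd",
    ("odd",  "0"): "odd",  ("odd",  "1"): "even",
}
print(run_dfa("1101", even_ones, "even", {"even"}))  # three 1s -> False
```

From here, automata theory climbs the same ladder the course does: DFAs, pushdown automata, and finally Turing machines.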
"zero shot" = ask an LLM to do the task with a prompt alone, no examples
"few shot" = show a model (maybe an LLM) a few examples; LLMs do well with "in-context learning", which means giving a prompt AND including some examples in it
"many shot" = train a model on many (typically thousands of) examples.
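As a rough illustration, here is what zero-shot and few-shot prompts look like side by side (the sentiment task and the example reviews are made up):

```python
# A toy sentiment-labeling task, shown in the two prompting regimes above.
task = "Label the sentiment of the review as positive or negative."

# Zero shot: instructions only, no examples.
zero_shot = f"{task}\nReview: The battery died in a day.\nSentiment:"

# Few shot: the same instructions plus a handful of worked examples
# in the prompt itself -- this is in-context learning.
few_shot_examples = [
    ("Loved it, works perfectly.", "positive"),
    ("Broke after one week.", "negative"),
]
demos = "\n".join(f"Review: {r}\nSentiment: {s}" for r, s in few_shot_examples)
few_shot = f"{task}\n{demos}\nReview: The battery died in a day.\nSentiment:"

print(zero_shot)
print("---")
print(few_shot)
```

Many shot is different in kind: the examples go into a training run (fine-tuning) rather than into the prompt.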
The more training examples you have, the better results you get. A lot of people are seduced by ChatGPT because it promises fast results without a lot of hard work, rigorous thinking, and such, but you get back what you put in.
My RSS reader and agent YOShInOn uses
https://sbert.net/
to transform documents into vectors, and then I apply classical ML techniques such as support vector machines, logistic regression, k-means clustering, and such. I used to do the same things with a bag-of-words model; BERT-like models give a significant boost to the accuracy, are simple to implement, and run quickly. I can write a script that tests thousands of alternative models a day.
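The overall shape of that pipeline looks roughly like this. Note this is a stand-in sketch, not YOShInOn's actual code: `embed()` here is a stdlib hashing-trick placeholder for a real `SentenceTransformer.encode()` call, and the nearest-centroid classifier stands in for the scikit-learn SVM / logistic-regression step:

```python
import hashlib
import math

def embed(text, dim=256):
    """Stand-in for a sentence-transformers encode() call: a normalized
    hashed bag-of-words vector. A real pipeline would use something like
    SentenceTransformer("all-MiniLM-L6-v2").encode(text) instead."""
    vec = [0.0] * dim
    for word in text.lower().split():
        bucket = int(hashlib.md5(word.encode()).hexdigest(), 16) % dim
        vec[bucket] += 1.0
    norm = math.sqrt(sum(x * x for x in vec)) or 1.0
    return [x / norm for x in vec]

def centroid(vectors):
    """Average a list of vectors component-wise."""
    return [sum(col) / len(vectors) for col in zip(*vectors)]

def classify(text, centroids):
    """Nearest-centroid classification in embedding space -- a stand-in
    for the SVM / logistic-regression step."""
    v = embed(text)
    def dist(c):
        return sum((a - b) ** 2 for a, b in zip(v, c))
    return min(centroids, key=lambda label: dist(centroids[label]))

# Hypothetical training texts, just for illustration:
liked = ["great deep dive on compilers", "fascinating math exposition"]
disliked = ["celebrity gossip roundup", "clickbait listicle"]
centroids = {
    "like": centroid([embed(t) for t in liked]),
    "skip": centroid([embed(t) for t in disliked]),
}
print(classify("a deep dive on math", centroids))
```

The appeal is that once documents are vectors, swapping classifiers (SVM, logistic regression, k-means) is a one-line change, which is what makes testing thousands of model variants cheap.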
The main classification YOShInOn does is "will I like this content?", which is a rather fuzzy problem that won't retest perfectly. I tried applying a fine-tuned model to this problem, and after a few days of trying different things I developed a procedure that took 30 minutes to make a model about as good as my classical ML model, which took more like 30 seconds to train. If my problem weren't so fuzzy I'd benefit more from the fine-tuning, and someday I might apply YOShInOn to make a training set for a better-defined problem, but I am delighted with the system I have now because it does things that I've dreamed of for 20 years.
The whole "prompting" model is dangerously seductive for various reasons, but the low-down is that language is treacherous. This classic book
https://www.amazon.com/G%C3%B6del-Escher-Bach-Eternal-Golden...
is not such an easy read, but it contains some parables that explain why making a chatbot do what people would like it to do will be like endlessly pushing a bubble under the rug, and why these problems are not about the technology behind the chatbot but about the problem it is trying to solve.