Found in 17 comments on Hacker News
melling · 2020-12-31 · Original thread
I’m up to Chapter 6 in ISLR

https://github.com/melling/ISLR

Would Elements of Statistical Learning be my next book?

I’ve seen the Bishop book highly recommended too, and it has been mentioned in this post.

https://www.amazon.com/Pattern-Recognition-Learning-Informat...

selmat · 2019-02-22 · Original thread
From my experience, these resources are worth reading:

[1] Pattern Recognition and Machine Learning (Information Science and Statistics) by Christopher M. Bishop

Andreas Brandmaier's permutation distribution clustering is a method rooted in the dissimilarities between time series, formalized as the divergence between their permutation distributions. Personally, I think this is your "best" option http://cran.r-project.org/web/packages/pdc/index.html
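
Roughly, the idea in code. This is only a sketch, not the pdc package itself: each series is summarized by the frequency distribution of its ordinal patterns, and a divergence between those distributions feeds hierarchical clustering. The squared Hellinger distance and fixed embedding dimension below are illustrative stand-ins; pdc has its own divergence and picks the embedding automatically.

```python
import itertools
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

def permutation_distribution(x, m=3):
    """Frequency distribution of ordinal patterns of embedding dimension m."""
    patterns = list(itertools.permutations(range(m)))
    index = {p: i for i, p in enumerate(patterns)}
    counts = np.zeros(len(patterns))
    for i in range(len(x) - m + 1):
        counts[index[tuple(np.argsort(x[i:i + m]))]] += 1
    return counts / counts.sum()

def pdc_dissimilarity(series, m=3):
    """Pairwise squared Hellinger distance between permutation distributions."""
    dists = [permutation_distribution(s, m) for s in series]
    n = len(dists)
    d = np.zeros((n, n))
    for i in range(n):
        for j in range(i + 1, n):
            d[i, j] = d[j, i] = 0.5 * np.sum((np.sqrt(dists[i]) - np.sqrt(dists[j])) ** 2)
    return d

# toy usage: two noisy sine waves and two random walks
rng = np.random.default_rng(0)
series = [np.sin(np.linspace(0, 20, 300)) + 0.1 * rng.normal(size=300) for _ in range(2)]
series += [np.cumsum(rng.normal(size=300)) for _ in range(2)]
Z = linkage(squareform(pdc_dissimilarity(series)), method="average")
print(fcluster(Z, t=2, criterion="maxclust"))  # cluster labels for the four series
```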

Eamonn Keogh's SAX (Symbolic Aggregate Approximation) and iSAX routines develop "shape clustering" for time series

http://www.cs.ucr.edu/~eamonn/SAX.htm
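
The core SAX transform is simple enough to sketch. This is only the symbolization step, under assumed defaults; Keogh's routines add sliding windows, numerosity reduction, and iSAX indexing on top:

```python
import numpy as np
from scipy.stats import norm

def sax(x, n_segments=8, alphabet_size=4):
    """SAX core: z-normalize, PAA-reduce to n_segments, discretize with Gaussian breakpoints."""
    x = (x - x.mean()) / (x.std() + 1e-12)                                  # z-normalize
    paa = np.array([seg.mean() for seg in np.array_split(x, n_segments)])   # piecewise aggregate approximation
    breakpoints = norm.ppf(np.linspace(0, 1, alphabet_size + 1)[1:-1])      # roughly equiprobable bins
    symbols = np.searchsorted(breakpoints, paa)
    return "".join(chr(ord("a") + s) for s in symbols)

print(sax(np.sin(np.linspace(0, 6.28, 128))))  # an 8-symbol word over {a, b, c, d}
```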

There are approaches based on text compression algorithms that remove the redundancy in a sequence of characters (or numbers), creating a kind of distance or density metric that can be used as input to clustering; see, e.g.:

http://link.springer.com/chapter/10.1007/978-0-387-84816-7_4
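
The usual concrete form of this is the normalized compression distance (NCD). A rough sketch with zlib, where the byte quantization is an arbitrary choice of mine rather than anything from the chapter:

```python
import zlib
import numpy as np

def quantize(x, levels=16):
    """Coarse-quantize a series to one byte per point so a text compressor can find structure."""
    x = np.asarray(x, dtype=float)
    bins = np.quantile(x, np.linspace(0, 1, levels + 1)[1:-1])
    return np.digitize(x, bins).astype(np.uint8).tobytes()

def ncd(x, y):
    """Normalized compression distance: (C(xy) - min(C(x), C(y))) / max(C(x), C(y))."""
    bx, by = quantize(x), quantize(y)
    cx, cy, cxy = len(zlib.compress(bx)), len(zlib.compress(by)), len(zlib.compress(bx + by))
    return (cxy - min(cx, cy)) / max(cx, cy)

# smaller values mean the compressor found more shared structure
rng = np.random.default_rng(1)
a = np.sin(np.linspace(0, 20, 500))
b = np.sin(np.linspace(0, 20, 500)) + 0.05 * rng.normal(size=500)
c = rng.normal(size=500)
print(ncd(a, b), ncd(a, c))
```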

This paper by Rob Hyndman, "Dimension Reduction for Clustering Time Series Using Global Characteristics", discusses compressing a time series down to a small set of global moments or metrics and clustering on those:

http://www.robjhyndman.com/papers/wang2.pdf
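
A sketch of that feature-based route: compute a few global characteristics per series and cluster in feature space. The features below (mean, spread, lag-1 autocorrelation, trend slope, roughness) are simple stand-ins for the measures in the paper:

```python
import numpy as np
from sklearn.cluster import KMeans

def global_features(x):
    """A few simple global characteristics of a series (stand-ins for the paper's measures)."""
    x = np.asarray(x, dtype=float)
    lag1 = np.corrcoef(x[:-1], x[1:])[0, 1]          # serial correlation
    trend = np.polyfit(np.arange(len(x)), x, 1)[0]   # linear trend slope
    return np.array([x.mean(), x.std(), lag1, trend, np.diff(x).std()])

rng = np.random.default_rng(0)
series = [np.cumsum(rng.normal(size=200)) for _ in range(5)] + \
         [rng.normal(size=200) for _ in range(5)]
X = np.array([global_features(s) for s in series])
X = (X - X.mean(axis=0)) / X.std(axis=0)             # scale features before clustering
print(KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X))  # labels for the ten series
```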

Chapter 15 in Aggarwal and Reddy's excellent book, Data Clustering, is devoted to a wide range (a laundry list, really) of time-series clustering methods (pp. 357-380). The discussion provides excellent background to many of the issues specific to clustering a time series.

http://users.eecs.northwestern.edu/~goce/SomePubs/Similarity...

...and a lot more.

-- URL --

[1] https://www.amazon.com/Pattern-Recognition-Learning-Informat...

> study textbooks. Do exercises. Treat it like academic studying

This. Highly recommend Russell & Norvig [1] for high-level intuition and motivation. Then Bishop's "Pattern Recognition and Machine Learning" [2] and Koller's PGM book [3] for the fundamentals.

Avoid MOOCs, but there are useful lecture videos, e.g. Hugo Larochelle on belief propagation [4].

FWIW, this is coming from a mechanical engineer by training, but a self-taught programmer and AI researcher. I've been working in industry as an AI research engineer for ~6 years.

[1] https://www.amazon.com/Artificial-Intelligence-Modern-Approa...

[2] https://www.amazon.com/Pattern-Recognition-Learning-Informat...

[3] https://www.amazon.com/Probabilistic-Graphical-Models-Princi...

[4] https://youtu.be/-z5lKPHcumo

Yadi · 2018-10-01 · Original thread
In machine learning, these are hands down some of the best textbooks:

- [0] Pattern Recognition and Machine Learning (Information Science and Statistics)

and also:

- [1] The Elements of Statistical Learning

- [2] Reinforcement Learning: An Introduction by Barto and Sutton

- [3] Deep Learning by Ian Goodfellow, Yoshua Bengio, and Aaron Courville

- [4] Neural Network Methods for Natural Language Processing (Synthesis Lectures on Human Language Technologies) by Yoav Goldberg

Then some math tidbits:

[5] Introduction to Linear Algebra by Strang

----------- links:

- [0] [PDF](http://users.isr.ist.utl.pt/~wurmd/Livros/school/Bishop%20-%...)

- [0] [AMZ](https://www.amazon.com/Pattern-Recognition-Learning-Informat...)

- [2] [amz](https://www.amazon.com/Reinforcement-Learning-Introduction-A...)

- [2] [pdf](http://incompleteideas.net/book/bookdraft2017nov5.pdf)

- [3] [amz](https://www.amazon.com/Deep-Learning-Adaptive-Computation-Ma...)

- [3] [site](https://www.deeplearningbook.org/)

- [4] [amz](https://www.amazon.com/Language-Processing-Synthesis-Lecture...)

- [5] [amz](https://www.amazon.com/Introduction-Linear-Algebra-Gilbert-S...)

I made the same transition earlier in my career. One book on deep learning that meets your requirements is [0]. It’s readable, covers a broad set of modern topics, and has pragmatic tips for real use cases.

For general machine learning, there are many, many books. A good intro is [1] and a more comprehensive reference is [2]. Frankly, by this point, even the scikit-learn documentation and user guide give a fairly good mathematical presentation of many algorithms. Another good reference is [3].

Finally, I would also recommend supplementing some of that stuff with Bayesian analysis, which can address many of the same problems, or be intermixed with machine learning algorithms, but which is important for a lot of other reasons too (MCMC sampling, hierarchical regression, small data problems). For that I would recommend [4] and [5].
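
To make the MCMC bit concrete, here is a toy random-walk Metropolis sampler for a one-parameter posterior. It is only a sketch of the sampling idea (the prior, likelihood, and step size are made up for illustration), nothing like the machinery those books develop:

```python
import numpy as np

def metropolis(logpost, x0=0.0, n=5000, step=0.5, seed=0):
    """Random-walk Metropolis: propose x' ~ N(x, step), accept with prob min(1, p(x')/p(x))."""
    rng = np.random.default_rng(seed)
    samples, x, lp = np.empty(n), x0, logpost(x0)
    for i in range(n):
        prop = x + step * rng.normal()
        lp_prop = logpost(prop)
        if np.log(rng.random()) < lp_prop - lp:
            x, lp = prop, lp_prop
        samples[i] = x
    return samples

# toy "small data" problem: posterior over a mean with a N(0, 1) prior, unit-variance likelihood, 5 observations
data = np.array([1.2, 0.8, 1.5, 0.9, 1.1])
logpost = lambda mu: -0.5 * mu**2 - 0.5 * np.sum((data - mu) ** 2)
draws = metropolis(logpost)[1000:]   # drop burn-in
print(draws.mean(), draws.std())
```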

Stay away from bootcamps or books or lectures that seem overly branded with “data science.” This usually means more focus on data pipeline tooling, data cleaning, shallow details about a specific software package, and side tasks like wrapping something in a webservice.

That stuff is extremely easy to learn on the job and usually needs to be tailored differently for every different project or employer, so it’s a relative waste of time unless it is the only way you can get a job.

[0]: < https://www.amazon.com/Deep-Learning-Adaptive-Computation-Ma... >

[1]: < https://www.amazon.com/Pattern-Classification-Pt-1-Richard-D... >

[2]: < https://www.amazon.com/Pattern-Recognition-Learning-Informat... >

[3]: < http://www.web.stanford.edu/~hastie/ElemStatLearn/ >

[4]: < http://www.stat.columbia.edu/~gelman/book/ >

[5]: < http://www.stat.columbia.edu/~gelman/arm/ >

neel8986 · 2018-01-17 · Original thread
This is kind of a master's degree course I created for myself to learn machine learning from the bottom up.

First, you need a strong mathematical base. Otherwise, you can copy-paste an algorithm or use an API, but you will have no idea of what is happening inside. The following concepts are essential:

1) Linear Algebra (MIT: https://ocw.mit.edu/courses/mathematics/18-06-linear-algebra... )

2) Probability (Harvard: https://www.youtube.com/watch?v=KbB0FjPg0mw )

Next, get a basic grasp of machine learning and a good intuition for the core concepts:

1) Andrew Ng coursera course (https://www.coursera.org/learn/machine-learning)

2) Tom Mitchell book (https://www.amazon.com/Machine-Learning-Tom-M-Mitchell/dp/00...)

Both the course and the book above are easy to follow. You will get a good grasp of the basic concepts, but they lack depth. Next, move on to more intense books and courses.

You can get more in-depth knowledge of machine learning from the following sources:

1) Nando de Freitas's machine learning course ( https://www.youtube.com/watch?v=w2OtwL5T1ow )

2) Bishop's book (https://www.amazon.in/Pattern-Recognition-Learning-Informati...)

Bishop's book in particular is really deep and covers almost all the basic concepts.

Now for recent advances in deep learning, I suggest two brilliant courses from Stanford:

1) Vision ( https://www.youtube.com/watch?v=NfnWJUyUJYU )

2) NLP ( https://www.youtube.com/watch?v=OQQ-W_63UgQ)

The vision course by Karpathy can be a very good introduction to deep learning. Also, the mother book for deep learning ( http://www.deeplearningbook.org/ ) is good.

partycoder · 2017-11-14 · Original thread
The AI for Humans series is a reasonable, high-level approach. http://www.heatonresearch.com/aifh/

After you've got a grasp of what these things are doing, you can move on to the how. For that you will need some math background, with an emphasis on calculus and probability.

After that, you can take a look at PRML. https://www.amazon.com/Pattern-Recognition-Learning-Informat...

Some people might prefer seeing things from another approach. http://pgm.stanford.edu/

Good luck.

rm999 · 2016-12-04 · Original thread
The introduction clarifies what the authors mean. In this context "model" isn't about implementing a supervised model, it's about "modeling" your problem to build a bespoke algorithm that closely matches the problem. Unsupervised methods like clustering would probably fit in here too.

I haven't read much of this early access book yet, but I'd give the authors a lot of benefit of the doubt. Christopher Bishop wrote one of my favorite machine learning books (I read it after my graduate study in machine learning and it filled in a lot of the gaps): https://www.amazon.com/Pattern-Recognition-Learning-Informat...

todd8 · 2016-10-09 · Original thread
Depending on your level of programming ability, one algorithm a day, IMHO, is completely doable. A number of comments and suggestions say that one per day is an unrealistic goal (yes, maybe it is) but the idea of setting a goal and working through a list of algorithms is very reasonable.

If you are just learning programming, plan on taking your time with the algorithms but practice coding every day. Find a fun project to attempt that is within your level of skill.

If you are a strong programmer in one language, find a book of algorithms using that language (some of the suggestions here in these comments are excellent). I list some of the books I like at the end of this comment.

If you are an experienced programmer, one algorithm per day is roughly doable. Especially so, because you are trying to learn one algorithm per day, not produce working, production level code for each algorithm each day.

Some algorithms are really families of algorithms and can take more than a day of study; hash-based lookup tables come to mind. First there are the hash functions themselves: that would be day one. Next there are several alternatives for storing entries in the hash table, e.g. open addressing vs. chaining: days two and three. Then there are methods for handling collisions, such as linear probing and secondary hashing: that's day four. Finally there are important variations like perfect hashing, cuckoo hashing, and robin hood hashing: maybe another five days. Some languages are less suited to this kind of playing around and can make working with algorithms more difficult; instead of a couple of weeks this could easily take twice as long. After learning other methods of implementing fast lookups, it's time to come back to hashing and understand when it's appropriate, when alternatives are better, and how to combine methods into more sophisticated lookup schemes.
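
To make "days two and three" concrete, here is a minimal sketch of a separate-chaining hash table; the resize threshold is an arbitrary choice, and the probing, cuckoo, and robin hood variants would each swap the bucket lists for a different collision-handling scheme:

```python
class ChainedHashTable:
    """Minimal separate-chaining hash table: one bucket list per slot, resize at load factor 0.75."""

    def __init__(self, capacity=8):
        self.buckets = [[] for _ in range(capacity)]
        self.size = 0

    def _bucket(self, key):
        return self.buckets[hash(key) % len(self.buckets)]

    def put(self, key, value):
        bucket = self._bucket(key)
        for i, (k, _) in enumerate(bucket):
            if k == key:                              # key already present: overwrite
                bucket[i] = (key, value)
                return
        bucket.append((key, value))
        self.size += 1
        if self.size > 0.75 * len(self.buckets):      # grow and rehash everything
            items = [kv for b in self.buckets for kv in b]
            self.buckets = [[] for _ in range(2 * len(self.buckets))]
            self.size = 0
            for k, v in items:
                self.put(k, v)

    def get(self, key, default=None):
        for k, v in self._bucket(key):
            if k == key:
                return v
        return default

t = ChainedHashTable()
for word in "one algorithm a day keeps the brain in shape".split():
    t.put(word, len(word))
print(t.get("algorithm"), t.get("missing"))
```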

I think you will be best served by modifying your goal a bit and saying that you will work on learning about algorithms every day and cover all of the material in a typical undergraduate course on the subject. It really is a fun branch of Computer Science.

A great starting point is Sedgewick's book/course, Algorithms [1]. For more depth and theory, try [2], Cormen and Leiserson's excellent Introduction to Algorithms. Alternatively, the theory is also covered by another book by Sedgewick, An Introduction to the Analysis of Algorithms [3]. A classic reference that goes far beyond these other books is of course Knuth [4], suitable for serious students of Computer Science, less so as a book of recipes.

After these basics, there are books useful for special circumstances. If your goal is to be broadly and deeply familiar with Algorithms you will need to cover quite a bit of additional material.

Numerical methods -- Numerical Recipes 3rd Edition: The Art of Scientific Computing by Teukolsky and Vetterling. I love this book. [5]

Randomized algorithms -- Randomized Algorithms by Motwani and Raghavan. [6], Probability and Computing: Randomized Algorithms and Probabilistic Analysis by Michael Mitzenmacher, [7]

Hard problems (like NP) -- Approximation Algorithms by Vazirani [8]. How to Solve It: Modern Heuristics by Michalewicz and Fogel. [9]

Data structures -- Advanced Data Structures by Brass. [10]

Functional programming -- Pearls of Functional Algorithm Design by Bird [11] and Purely Functional Data Structures by Okasaki [12].

Bit twiddling -- Hacker's Delight by Warren [13].

Distributed and parallel programming -- this material gets very hard so perhaps Distributed Algorithms by Lynch [14].

Machine learning and AI related algorithms -- Bishop's Pattern Recognition and Machine Learning [15] and Norvig's Artificial Intelligence: A Modern Approach [16]

These books will cover most of what a Ph.D. in CS might be expected to understand about algorithms. It will take years of study to work though all of them. After that, you will be reading about algorithms in journal publications (ACM and IEEE memberships are useful). For example, a recent, practical, and important development in hashing methods is called cuckoo hashing, and I don't believe that it appears in any of the books I've listed.

[1] Sedgewick, Algorithms, 2015. https://www.amazon.com/Algorithms-Fourth-Deluxe-24-Part-Lect...

[2] Cormen, et al., Introduction to Algorithms, 2009. https://www.amazon.com/s/ref=nb_sb_ss_i_1_15?url=search-alia...

[3] Sedgewick, An Introduction to the Analysis of Algorithms, 2013. https://www.amazon.com/Introduction-Analysis-Algorithms-2nd/...

[4] Knuth, The Art of Computer Programming, 2011. https://www.amazon.com/Computer-Programming-Volumes-1-4A-Box...

[5] Teukolsky and Vetterling, Numerical Recipes 3rd Edition: The Art of Scientific Computing, 2007. https://www.amazon.com/Numerical-Recipes-3rd-Scientific-Comp...

[6] https://www.amazon.com/Randomized-Algorithms-Rajeev-Motwani/...

[7]https://www.amazon.com/gp/product/0521835402/ref=pd_sim_14_2...

[8] Vazirani, https://www.amazon.com/Approximation-Algorithms-Vijay-V-Vazi...

[9] Michalewicz and Fogel, https://www.amazon.com/How-Solve-Heuristics-Zbigniew-Michale...

[10] Brass, https://www.amazon.com/Advanced-Data-Structures-Peter-Brass/...

[11] Bird, https://www.amazon.com/Pearls-Functional-Algorithm-Design-Ri...

[12] Okasaki, https://www.amazon.com/Purely-Functional-Structures-Chris-Ok...

[13] Warren, https://www.amazon.com/Hackers-Delight-2nd-Henry-Warren/dp/0...

[14] Lynch, https://www.amazon.com/Distributed-Algorithms-Kaufmann-Manag...

[15] Bishop, https://www.amazon.com/Pattern-Recognition-Learning-Informat...

[16] Norvig, https://www.amazon.com/Artificial-Intelligence-Modern-Approa...

torustic · 2016-08-21 · Original thread
In retrospect, my other comment was stupidly obtuse. Both too technical (in the sense of specificity) and too unstructured (in the sense of presentation order). A more appropriate path from CS might be analogous (well, inverse if anything) to the path Robert Goldblatt has taken. It dips into nonstandard analysis, but not totally without reason. Some subset of the following, with nLab and Wikipedia supplementing as necessary:

0. Milewski's "Category Theory for Programmers"[0]

1. Goldblatt's "Topoi"[1]

2. McLarty's "The Uses and Abuses of the History of Topos Theory"[2] (this does not require [1], it just undoes some historical assumptions made in [1] and, like everything else by McLarty, is extraordinarily well-written)

3. Goldblatt's "Lectures on the Hyperreals"[3]

4. Nelson's "Radically Elementary Probability Theory"[4]

5. Tao's "Ultraproducts as a Bridge Between Discrete and Continuous Analysis"[5]

6. Some canonical machine learning text, like Murphy[6] or Bishop[7]

7. Koller/Friedman's "Probabilistic Graphical Models"[8]

8. Lawvere's "Taking Categories Seriously"[9]

From there you should see a variety of paths for mapping (things:Uncertainty) <-> (things:Structure). The Giry monad is just one of them, and would probably be understandable after reading Barr/Wells' "Toposes, Triples and Theories"[10].

The above list also assumes some comfort with integration. Particularly good books in line with this pedagogical path might be:

9. Any and all canonical intros to real analysis

10. Malliavin's "Integration and Probability"[11]

11. Segal/Kunze's "Integrals and Operators"[12]

Similarly, some normative focus on probability would be useful:

12. Jaynes' "Probability Theory"[13]

13. Pearl's "Causality"[14]

---

[0] https://bartoszmilewski.com/2014/10/28/category-theory-for-p...

[1] https://www.amazon.com/Topoi-Categorial-Analysis-Logic-Mathe...

[2] http://www.cwru.edu/artsci/phil/UsesandAbuses%20HistoryTopos...

[3] https://www.amazon.com/Lectures-Hyperreals-Introduction-Nons...

[4] https://web.math.princeton.edu/%7Enelson/books/rept.pdf

[5] https://www.youtube.com/watch?v=IS9fsr3yGLE

[6] https://www.amazon.com/Machine-Learning-Probabilistic-Perspe...

[7] https://www.amazon.com/Pattern-Recognition-Learning-Informat...

[8] https://www.amazon.com/Probabilistic-Graphical-Models-Princi...

[9] http://www.emis.de/journals/TAC/reprints/articles/8/tr8.pdf

[10] http://www.tac.mta.ca/tac/reprints/articles/12/tr12.pdf

[11] https://www.springer.com/us/book/9780387944098

[12] https://www.amazon.com/Integrals-Operators-Grundlehren-mathe...

[13] http://www.med.mcgill.ca/epidemiology/hanley/bios601/Gaussia...

[14] https://www.amazon.com/Causality-Reasoning-Inference-Judea-P...

shogunmike · 2015-05-12 · Original thread
Some good books on Machine Learning:

Machine Learning: The Art and Science of Algorithms that Make Sense of Data (Flach): http://www.amazon.com/Machine-Learning-Science-Algorithms-Se...

Machine Learning: A Probabilistic Perspective (Murphy): http://www.amazon.com/Machine-Learning-Probabilistic-Perspec...

Pattern Recognition and Machine Learning (Bishop): http://www.amazon.com/Pattern-Recognition-Learning-Informati...

There are some great resources/books for Bayesian statistics and graphical models. I've listed them in (approximate) order of increasing difficulty/mathematical complexity:

Think Bayes (Downey): http://www.amazon.com/Think-Bayes-Allen-B-Downey/dp/14493707...

Bayesian Methods for Hackers (Davidson-Pilon et al): https://github.com/CamDavidsonPilon/Probabilistic-Programmin...

Doing Bayesian Data Analysis (Kruschke), aka "the puppy book": http://www.amazon.com/Doing-Bayesian-Data-Analysis-Second/dp...

Bayesian Data Analysis (Gelman): http://www.amazon.com/Bayesian-Analysis-Chapman-Statistical-...

Bayesian Reasoning and Machine Learning (Barber): http://www.amazon.com/Bayesian-Reasoning-Machine-Learning-Ba...

Probabilistic Graphical Models (Koller et al): https://www.coursera.org/course/pgm http://www.amazon.com/Probabilistic-Graphical-Models-Princip...

If you want a more mathematical/statistical take on Machine Learning, then the two books by Hastie/Tibshirani et al are definitely worth a read (plus, they're free to download from the authors' websites!):

Introduction to Statistical Learning: http://www-bcf.usc.edu/~gareth/ISL/

The Elements of Statistical Learning: http://statweb.stanford.edu/~tibs/ElemStatLearn/

Obviously there is the whole field of "deep learning" as well! A good place to start is with: http://deeplearning.net/

microDude · 2013-03-08 · Original thread
I found Christopher Bishop's book to contain a good deal of knowledge. He also complements many ideas with graphical figures, which is a big plus. http://www.amazon.com/Pattern-Recognition-Learning-Informati...

Also, if you needed more information about optimization methods all of Stephen Boyd's books are really good, just check out his entire website for information. http://www.stanford.edu/~boyd/

JamieEi · 2011-02-03 · Original thread
Pattern Recognition and Machine Learning, Bishop http://www.amazon.com/Pattern-Recognition-Learning-Informati...
tfh · 2010-05-30 · Original thread
The author is Chris Bishop... who wrote one of the "essential" machine learning books:

http://www.amazon.com/Pattern-Recognition-Learning-Informati...

cschmidt · 2007-11-13 · Original thread
If you want a fairly easy read without too many equations, try:

Data Mining: Practical Machine Learning Tools and Techniques (Second Edition) http://www.cs.waikato.ac.nz/~ml/weka/book.html

Which goes nicely with the Weka open source ML toolkit http://www.cs.waikato.ac.nz/ml/weka/

(although it is a good read without the toolkit)

If you want a bit more math, I really like the recent (Oct 2007) book:

Pattern Recognition and Machine Learning by Christopher M. Bishop http://www.amazon.com/Pattern-Recognition-Learning-Informati...

It is nicely self-contained, going through all the stats you'll need.
