Found in 3 comments on Hacker News
bordercases · 2018-01-23 · Original thread
It gains a bit more unity when you try to interpret it through certain lenses.

In particular, one problem with discrete mathematics as it's taught is that it doesn't separate the methods of counting from the sets of objects you need to count.

There are a couple of ways around this. One good way is to look at all combinatorial identities as counting the number of ways you can map some set to some other set. Sometimes these are called "choices", "mappings", or functions. You can talk about the functions and sets separately from the numbers, and the numbers drop out of properties of the sets. Doing this removes a layer of interpretation and guesswork, even if it raises the level of abstraction a bit.
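As a small sketch of that viewpoint (set names and sizes here are mine, chosen for illustration): the familiar identities n^k and the falling factorial n(n-1)...(n-k+1) are just counts of all mappings, and of injective mappings, from a k-element set to an n-element set. Enumerating the mappings directly makes the numbers "drop out" of the sets:

```python
from itertools import product, permutations

def count_functions(domain, codomain):
    """Count all functions domain -> codomain by enumerating them:
    each function is one choice of image per domain element."""
    return sum(1 for _ in product(codomain, repeat=len(domain)))

def count_injections(domain, codomain):
    """Count one-to-one functions: ordered selections of |domain|
    distinct targets from the codomain."""
    return sum(1 for _ in permutations(codomain, len(domain)))

A = {1, 2, 3}                # |A| = 3
B = {"a", "b", "c", "d"}     # |B| = 4

# The counting formulas fall out of the sets themselves:
assert count_functions(A, B) == len(B) ** len(A)   # 4^3 = 64
assert count_injections(A, B) == 4 * 3 * 2         # falling factorial = 24
```

The point is that the enumeration (the set of mappings) exists independently of the formula; the formula is just a property of that set.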

Additionally, looking at discrete math simply as the math of algorithms also gets you far. Sedgewick's Analysis of Algorithms book is actually a discrete math book in disguise: it gives a system of notation that can describe basically any combinatorial object separately from the counting method, and then maps it to the counting method.

todd8 · 2016-10-09 · Original thread
Depending on your level of programming ability, one algorithm a day, IMHO, is completely doable. A number of comments and suggestions say that one per day is an unrealistic goal (yes, maybe it is), but the idea of setting a goal and working through a list of algorithms is very reasonable.

If you are just learning programming, plan on taking your time with the algorithms but practice coding every day. Find a fun project to attempt that is within your level of skill.

If you are a strong programmer in one language, find a book of algorithms using that language (some of the suggestions here in these comments are excellent). I list some of the books I like at the end of this comment.

If you are an experienced programmer, one algorithm per day is roughly doable. Especially so, because you are trying to learn one algorithm per day, not produce working, production-level code for each algorithm each day.

Some algorithms are really families of algorithms and can take more than a day of study; hash-based lookup tables come to mind. First there are the hash functions themselves. That would be day one. Next there are several alternatives for storing entries in the hash table, e.g. open addressing vs. chaining: days two and three. Then there are methods for handling collisions, linear probing, secondary hashing, etc.; that's day four. Finally there are important variations, perfect hashing, cuckoo hashing, Robin Hood hashing, and so forth; maybe another five days. Some languages are less appropriate for playing around and can make working with algorithms more difficult; instead of a couple of weeks this could easily take twice as long. After learning other methods of implementing fast lookups, it's time to come back to hashing and understand when it's appropriate, when alternatives are better, and how to combine methods into more sophisticated lookup schemes.
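To give a feel for the first few "days" of that plan, here is a toy sketch of my own (not from any of the books listed): a hash function mapping keys to slots, plus linear probing for collision resolution. It deliberately omits deletion and resizing, which would be later days:

```python
class LinearProbingMap:
    """Toy open-addressing hash table with linear probing.
    No deletion, no resizing -- just enough to play with collisions."""

    def __init__(self, capacity=16):
        self.capacity = capacity
        self.keys = [None] * capacity
        self.vals = [None] * capacity

    def _probe(self, key):
        # Day one: the hash function picks a starting slot.
        i = hash(key) % self.capacity
        # Day four: on a collision, linear probing scans forward
        # (wrapping around) until it finds the key or an empty slot.
        while self.keys[i] is not None and self.keys[i] != key:
            i = (i + 1) % self.capacity
        return i

    def put(self, key, val):
        i = self._probe(key)
        self.keys[i], self.vals[i] = key, val

    def get(self, key):
        i = self._probe(key)
        return self.vals[i] if self.keys[i] == key else None

m = LinearProbingMap()
m.put("tree", 1); m.put("heap", 2); m.put("trie", 3)
assert m.get("heap") == 2
assert m.get("stack") is None   # probe stops at an empty slot
```

Swapping `_probe` for a chained-bucket scheme, or the whole table for cuckoo hashing, is exactly the kind of variation the later days would explore.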

I think you will be best served by modifying your goal a bit and saying that you will work on learning about algorithms every day and cover all of the material in a typical undergraduate course on the subject. It really is a fun branch of Computer Science.

A great starting point is Sedgewick's book/course, Algorithms [1]. For more depth and theory try Cormen and Leiserson's excellent Introduction to Algorithms [2]. Alternatively, the theory is also covered by another book by Sedgewick, An Introduction to the Analysis of Algorithms [3]. A classic reference that goes far beyond these other books is of course Knuth [4], suitable for serious students of Computer Science, less so as a book of recipes.

After these basics, there are books useful for special circumstances. If your goal is to be broadly and deeply familiar with Algorithms you will need to cover quite a bit of additional material.

Numerical methods -- Numerical Recipes 3rd Edition: The Art of Scientific Computing by Teukolsky and Vetterling. I love this book. [5]

Randomized algorithms -- Randomized Algorithms by Motwani and Raghavan [6]; Probability and Computing: Randomized Algorithms and Probabilistic Analysis by Mitzenmacher and Upfal [7].

Hard problems (like NP) -- Approximation Algorithms by Vazirani [8]. How to Solve It: Modern Heuristics by Michalewicz and Fogel. [9]

Data structures -- Advanced Data Structures by Brass. [10]

Functional programming -- Pearls of Functional Algorithm Design by Bird [11] and Purely Functional Data Structures by Okasaki [12].

Bit twiddling -- Hacker's Delight by Warren [13].

Distributed and parallel programming -- this material gets very hard so perhaps Distributed Algorithms by Lynch [14].

Machine learning and AI-related algorithms -- Bishop's Pattern Recognition and Machine Learning [15] and Russell and Norvig's Artificial Intelligence: A Modern Approach [16].

These books will cover most of what a Ph.D. in CS might be expected to understand about algorithms. It will take years of study to work through all of them. After that, you will be reading about algorithms in journal publications (ACM and IEEE memberships are useful). For example, a recent, practical, and important development in hashing methods is called cuckoo hashing, and I don't believe that it appears in any of the books I've listed.

[1] Sedgewick, Algorithms, 2015.

[2] Cormen, et al., Introduction to Algorithms, 2009.

[3] Sedgewick, An Introduction to the Analysis of Algorithms, 2013.

[4] Knuth, The Art of Computer Programming, 2011.

[5] Teukolsky and Vetterling, Numerical Recipes 3rd Edition: The Art of Scientific Computing, 2007.

[6] Motwani and Raghavan, Randomized Algorithms, 1995.

[7] Mitzenmacher and Upfal, Probability and Computing: Randomized Algorithms and Probabilistic Analysis, 2005.

[8] Vazirani, Approximation Algorithms, 2001.

[9] Michalewicz and Fogel, How to Solve It: Modern Heuristics, 2004.

[10] Brass, Advanced Data Structures, 2008.

[11] Bird, Pearls of Functional Algorithm Design, 2010.

[12] Okasaki, Purely Functional Data Structures, 1998.

[13] Warren, Hacker's Delight, 2012.

[14] Lynch, Distributed Algorithms, 1996.

[15] Bishop, Pattern Recognition and Machine Learning, 2006.

[16] Russell and Norvig, Artificial Intelligence: A Modern Approach, 2009.

emmanueloga_ · 2015-08-11 · Original thread
Sedgewick's Algorithms is my favorite. The implementations of the algorithms are really nice (quite a feat, considering the lang is Java :-p), and each section builds on top of a previous one when needed (e.g. many algorithms require a symbol table, which is built in one of the chapters of the first volume). The resulting implementations are incredibly terse but at the same time clean and easy to understand!

Cormen's book is a bit more tedious to read; I don't know it that well, but I think it is a little bit more rigorous with the math. Sedgewick's book doesn't mention asymptotic limits or Big-O notation; rather, it talks about the run time / memory requirements in more informal terms, like "run time being proportional to the square of the number of input items", etc. But there's another book from Sedgewick (and Flajolet!) if you want to go deeper into analysis: (1).
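That informal style can be made concrete with a doubling experiment, in the spirit of Sedgewick's approach (this sketch and its function names are mine, not from the book): count the basic operations of a brute-force pair scan and check that doubling the input roughly quadruples the work, i.e. "run time proportional to the square of the number of input items":

```python
def count_equal_pairs(a):
    """Brute-force count of equal pairs in a list, returning both the
    answer and the number of inner-loop comparisons performed."""
    hits, ops = 0, 0
    n = len(a)
    for i in range(n):
        for j in range(i + 1, n):
            ops += 1            # one basic operation per pair examined
            if a[i] == a[j]:
                hits += 1
    return hits, ops            # ops is exactly n*(n-1)/2

# Doubling hypothesis: for a quadratic algorithm, doubling n
# multiplies the work by (roughly) four.
_, ops_n = count_equal_pairs(list(range(100)))
_, ops_2n = count_equal_pairs(list(range(200)))
assert ops_n == 100 * 99 // 2
assert ops_2n / ops_n > 3.9     # ratio approaches 4 as n grows
```

Measuring the ratio of wall-clock times at sizes n and 2n gives the same information without any asymptotic machinery, which is exactly the informal framing the book uses.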

Skiena's book is great for a wider outline of all sorts of algorithms, including some less widely known and (I think) not covered in either Sedgewick's or Cormen's books. It is, if I'm allowed the pun, more of a breadth-first than a depth-first approach to the subject.

