In my experience with FP, you start with more of a top-down approach, in contrast to the bottom-up approach discussed in the blog post.
There are many good resources on this. I've referenced a few below. Of course there are many more.
[1] https://www.amazon.com/Pearls-Functional-Algorithm-Design-Ri...
[2] https://www.amazon.com/Purely-Functional-Data-Structures-Oka...
[3] https://www.amazon.com/Algorithm-Design-Haskell-Richard-Bird...
[4] https://www.amazon.com/Structure-Interpretation-Computer-Pro...
For example, if you have a vector of 100 items, and you "mutate" that by adding an item (actually creating a new vector), the language doesn't allocate a new 101-length vector. Instead, we can take advantage of the assumption of immutability to "share structure" between both vectors, and just allocate a small new node holding the new item and a link to the old vector. The same kind of idea can be used to share structure in associative data structures like hash-maps.
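To make the sharing concrete, here's a toy sketch in Haskell (my own example, not from the book): prepending to an immutable list allocates exactly one new cell and reuses every existing node, so the old and new versions coexist cheaply.

    -- Toy illustration of structural sharing: "new" shares all 100
    -- elements of "old"; only a single extra cons cell is allocated.
    old :: [Int]
    old = [1 .. 100]

    new :: [Int]
    new = 0 : old   -- the tail of "new" IS "old", not a copy of it

    main :: IO ()
    main = do
      print (length old)  -- 100, the old version is untouched
      print (length new)  -- 101

Clojure-style persistent vectors do something fancier with wide trees so that appends and indexed lookups stay fast, but the underlying trick is the same sharing.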
I'm no expert on this, so my explanation is pretty anemic and probably somewhat wrong. If you're curious, the book "Purely Functional Data Structures" [0] covers these concepts in concrete detail.
[0]: https://www.amazon.com/Purely-Functional-Data-Structures-Oka...
Also available as a free PDF: https://www.cs.cmu.edu/~rwh/theses/okasaki.pdf
I think the author is using the term "purely functional data structure" to mean a data structure that lends itself to an implementation in a purely functional language [1,2].
[1] https://en.wikipedia.org/wiki/Purely_functional_data_structu...
[2] https://www.amazon.com/Purely-Functional-Structures-Chris-Ok...
IMHO required reading if you're doing any heavy FP work.
It's a reasonably complicated topic, but the basic idea is that since the data structures are immutable, much of their structure can be shared between versions with few changes. Most of these data structures end up using a Tree of some sort. Performance characteristics can be influenced to an extent by using bucketing to control the width/height of the tree.
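To make the path-copying idea concrete, here's a rough sketch (my own toy code, an unbalanced binary search tree rather than the bucketed wide trees that production libraries use): an insert allocates new nodes only along the root-to-leaf path, and every untouched subtree is shared with the previous version.

    -- Toy persistent binary search tree. insert copies only the nodes
    -- on the search path; all other subtrees are shared between versions.
    data Tree a = Leaf | Node (Tree a) a (Tree a)

    insert :: Ord a => a -> Tree a -> Tree a
    insert x Leaf = Node Leaf x Leaf
    insert x t@(Node l y r)
      | x < y     = Node (insert x l) y r   -- one new node; r is shared
      | x > y     = Node l y (insert x r)   -- one new node; l is shared
      | otherwise = t                       -- already present: share it all

    -- let t1 = foldr insert Leaf [5, 3, 8, 1]
    --     t2 = insert 4 t1   -- t1 is still valid; t2 shares most of t1

Widening the nodes (the bucketing mentioned above, e.g. 32-way branching) shortens the paths that have to be copied, which is how real persistent vectors and hash maps keep operations fast.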
[1] Purely Functional Data Structures (Amazon: https://www.amazon.com/Purely-Functional-Structures-Chris-Ok...)
[2] http://hypirion.com/musings/understanding-persistent-vector-...
[3] https://www.youtube.com/watch?v=7BFF50BHPPo
Digital Electronics using [1]
Operating Systems using [2]
Functional Data Structures using [3]
Graphics Algorithms [4]
Any recommendations for these subjects sincerely appreciated. Thanks.
[1] https://www.amazon.com/Digital-Design-Computer-Architecture-...
[2] https://www.amazon.com/Modern-Operating-Systems-Andrew-Tanen...
[3] https://www.amazon.com/Purely-Functional-Structures-Chris-Ok...
[4] https://www.amazon.com/Graphics-Visualization-Principles-Alg...
“The more you practice, the more you can, the more you want to, the more you enjoy it, the less it tires you.” ― Robert A. Heinlein, The Cat Who Walks Through Walls
If you are just learning programming, plan on taking your time with the algorithms but practice coding every day. Find a fun project to attempt that is within your level of skill.
If you are a strong programmer in one language, find a book of algorithms using that language (some of the suggestions here in these comments are excellent). I list some of the books I like at the end of this comment.
If you are an experienced programmer, one algorithm per day is roughly doable, especially because you are trying to learn one algorithm per day, not produce working, production-level code for each algorithm each day.
Some algorithms are really families of algorithms and can take more than a day of study; hash-based lookup tables come to mind. First there are the hash functions themselves. That would be day one. Next there are several alternatives for storing entries in the hash table, e.g. open addressing vs. chaining: days two and three. Then there are methods for handling collisions, such as linear probing, secondary hashing, etc.; that's day four. Finally there are important variations, perfect hashing, cuckoo hashing, robin hood hashing, and so forth; maybe another 5 days. Some languages are less appropriate for playing around and can make working with algorithms more difficult, so instead of a couple of weeks this could easily take twice as long. After learning other methods of implementing fast lookups, it's time to come back to hashing and understand when it's appropriate, when alternatives are better, and how to combine methods for more sophisticated lookup schemes.
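For a taste of the "day two or three" material, here's a minimal separate-chaining sketch in Haskell (my own toy code, not from any of the books listed below: a fixed bucket count, a naive string hash, no resizing or deletion, and, fittingly for this thread, the table itself is immutable):

    import qualified Data.IntMap.Strict as IM
    import Data.Char (ord)

    -- Each bucket is an association list; collisions chain within a bucket.
    newtype Table v = Table (IM.IntMap [(String, v)])

    numBuckets :: Int
    numBuckets = 64

    -- Day-one material: a deliberately simple polynomial string hash.
    hashStr :: String -> Int
    hashStr = foldl (\h c -> (h * 31 + ord c) `mod` numBuckets) 17

    empty :: Table v
    empty = Table IM.empty

    -- Drop any existing entry for the key, then prepend the new one.
    insert :: String -> v -> Table v -> Table v
    insert k v (Table m) =
      Table (IM.insertWith (\new old -> new ++ filter ((/= k) . fst) old)
                           (hashStr k) [(k, v)] m)

    lookupKey :: String -> Table v -> Maybe v
    lookupKey k (Table m) = lookup k (IM.findWithDefault [] (hashStr k) m)

Swapping the bucket representation or the probing strategy (open addressing, robin hood, cuckoo) is exactly the kind of variation that fills out the rest of the week.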
I think you will be best served by modifying your goal a bit and saying that you will work on learning about algorithms every day and cover all of the material in a typical undergraduate course on the subject. It really is a fun branch of Computer Science.
A great starting point is Sedgewick's book/course, Algorithms [1]. For more depth and theory try [2], the excellent Introduction to Algorithms by Cormen, Leiserson, Rivest, and Stein (CLRS). Alternatively, the theory is also covered by another book by Sedgewick, An Introduction to the Analysis of Algorithms [3]. A classic reference that goes far beyond these other books is of course Knuth [4], suitable for serious students of Computer Science, less so as a book of recipes.
After these basics, there are books useful for special circumstances. If your goal is to be broadly and deeply familiar with Algorithms you will need to cover quite a bit of additional material.
Numerical methods -- Numerical Recipes 3rd Edition: The Art of Scientific Computing by Press, Teukolsky, Vetterling, and Flannery. I love this book. [5]
Randomized algorithms -- Randomized Algorithms by Motwani and Raghavan [6]; Probability and Computing: Randomized Algorithms and Probabilistic Analysis by Mitzenmacher and Upfal [7].
Hard problems (like NP) -- Approximation Algorithms by Vazirani [8]; How to Solve It: Modern Heuristics by Michalewicz and Fogel [9].
Data structures -- Advanced Data Structures by Brass. [10]
Functional programming -- Pearls of Functional Algorithm Design by Bird [11] and Purely Functional Data Structures by Okasaki [12].
Bit twiddling -- Hacker's Delight by Warren [13].
Distributed and parallel programming -- this material gets very hard so perhaps Distributed Algorithms by Lynch [14].
Machine learning and AI-related algorithms -- Bishop's Pattern Recognition and Machine Learning [15] and Russell and Norvig's Artificial Intelligence: A Modern Approach [16].
These books will cover most of what a Ph.D. in CS might be expected to understand about algorithms. It will take years of study to work though all of them. After that, you will be reading about algorithms in journal publications (ACM and IEEE memberships are useful). For example, a recent, practical, and important development in hashing methods is called cuckoo hashing, and I don't believe that it appears in any of the books I've listed.
[1] Sedgewick, Algorithms, 2015. https://www.amazon.com/Algorithms-Fourth-Deluxe-24-Part-Lect...
[2] Cormen, et al., Introduction to Algorithms, 2009. https://www.amazon.com/s/ref=nb_sb_ss_i_1_15?url=search-alia...
[3] Sedgewick, An Introduction to the Analysis of Algorithms, 2013. https://www.amazon.com/Introduction-Analysis-Algorithms-2nd/...
[4] Knuth, The Art of Computer Programming, 2011. https://www.amazon.com/Computer-Programming-Volumes-1-4A-Box...
[5] Press, Teukolsky, Vetterling, and Flannery, Numerical Recipes 3rd Edition: The Art of Scientific Computing, 2007. https://www.amazon.com/Numerical-Recipes-3rd-Scientific-Comp...
[6] Motwani and Raghavan, https://www.amazon.com/Randomized-Algorithms-Rajeev-Motwani/...
[7] Mitzenmacher and Upfal, https://www.amazon.com/gp/product/0521835402/ref=pd_sim_14_2...
[8] Vazirani, https://www.amazon.com/Approximation-Algorithms-Vijay-V-Vazi...
[9] Michalewicz and Fogel, https://www.amazon.com/How-Solve-Heuristics-Zbigniew-Michale...
[10] Brass, https://www.amazon.com/Advanced-Data-Structures-Peter-Brass/...
[11] Bird, https://www.amazon.com/Pearls-Functional-Algorithm-Design-Ri...
[12] Okasaki, https://www.amazon.com/Purely-Functional-Structures-Chris-Ok...
[13] Warren, https://www.amazon.com/Hackers-Delight-2nd-Henry-Warren/dp/0...
[14] Lynch, https://www.amazon.com/Distributed-Algorithms-Kaufmann-Manag...
[15] Bishop, https://www.amazon.com/Pattern-Recognition-Learning-Informat...
[16] Norvig, https://www.amazon.com/Artificial-Intelligence-Modern-Approa...
They also have serious disadvantages: they can't be memory managed in the traditional way (since they tend to reuse other instances' memory in complex ways), and thus require a GC (refcounting can work, but ...). They are VERY allocation intensive, and their constant factors are worse than those of most non-persistent data structures. Assuming an O(1) allocator they can match non-persistent data structures in O-ness (i.e., when making an invalid assumption that is quite popular in academia; in practice memory allocation is O(1) for small values, then O(n^2) once you get close to the system's memory capacity (scanning for holes in a long list) but don't go over it, and then O(oh fuck it, let's just reboot this bloody BOAT ANCHOR) when crossing that line).
Clojure is famous for having good persistent data structures. Rich Hickey went touring academia touting the benefits of immutable/persistent/functional data structures: https://www.youtube.com/watch?v=dGVqrGmwOAw&feature=youtu.be...
There's also a famous book: https://www.amazon.com/Purely-Functional-Structures-Chris-Ok...
http://www.amazon.com/Purely-Functional-Structures-Chris-Oka...
0. http://www.amazon.com/Purely-Functional-Structures-Chris-Oka...
Edit: PDF version: http://www.cs.cmu.edu/~rwh/theses/okasaki.pdf
I can wholeheartedly recommend Pearls of Functional Algorithm Design http://www.amazon.com/Pearls-Functional-Algorithm-Design-Ric...
It's a good cross between two other excellent books:
- Jon Bentley's Programming Pearls http://www.amazon.com/Programming-Pearls-2nd-Jon-Bentley/dp/...
and
- Chris Okasaki's Purely Functional Data Structures http://www.amazon.com/Purely-Functional-Structures-Chris-Oka....
If you haven't read all three, it's well worth your while to do so!
And of course if you are going down the rabbit hole of reading Pearls of Functional Algorithm Design then you need to read the "How to Read Pearls of Functional Algorithm Design" post as well.
http://www.atamo.com/blog/how-to-read-pearls-by-richard-bird...
http://www.amazon.ca/Purely-Functional-Structures-Chris-Okas...
It's one of the few CS books that I refer to regularly when coding.
http://www.amazon.com/Purely-Functional-Structures-Chris-Oka...
Also check out this link for additional data structures worked on since then or simply not included in the book:
http://cstheory.stackexchange.com/questions/1539/whats-new-i...
http://www.amazon.ca/Purely-Functional-Structures-Chris-Okas...
I just can't say enough good things about it. The code is in ML, but there is a Haskell port of the data structures.
It's a small book, so it's easy to read and digest in a couple of weeks.
Right. That's what I'm saying. Clojure does not care to be backwards compatible with Lisp.
> Your taxonomy of immutability and persistence is interesting
That's not mine.
Clojure took its base data structures from Haskell and modern ML.
Not Lisp.
See:
http://www.cs.cmu.edu/~rwh/theses/okasaki.pdf
http://en.wikipedia.org/wiki/Persistent_data_structure
The book comes with examples in ML and Haskell.
http://www.amazon.com/Purely-Functional-Structures-Chris-Oka...
Lots of algorithms don't work with this kind of copying semantics, but you have all of Purely Functional Data Structures [1] to help.
[1] http://www.amazon.com/Purely-Functional-Structures-Chris-Oka...
You should definitely check it out. If you already know Haskell and you love algorithms, that book will probably be more interesting than the Intro to Haskell book.
Also check out Okasaki's book of functional data structures: http://www.amazon.com/Purely-Functional-Structures-Chris-Oka...
http://www.amazon.com/Purely-Functional-Structures-Chris-Oka...
It's hard to find a dissertation that reads like a novel.
That's why he turned it into a book: http://www.amazon.com/Purely-Functional-Structures-Chris-Oka...
edit: for those who are too lazy to google, here's the book on amazon http://www.amazon.com/Purely-Functional-Structures-Chris-Oka...