I haven't read much of this early access book yet, but I'd give the authors plenty of benefit of the doubt. Christopher Bishop wrote one of my favorite machine learning books (I read it after my graduate study in machine learning and it filled in a lot of the gaps): https://www.amazon.com/Pattern-Recognition-Learning-Informat...
If you are just learning programming, plan on taking your time with the algorithms but practice coding every day. Find a fun project to attempt that is within your level of skill.
If you are a strong programmer in one language, find a book of algorithms using that language (some of the suggestions here in these comments are excellent). I list some of the books I like at the end of this comment.
If you are an experienced programmer, one algorithm per day is roughly doable. Especially so, because you are trying to learn one algorithm per day, not produce working, production level code for each algorithm each day.
Some algorithms are really families of algorithms and can take more than a day of study; hash-based lookup tables come to mind. First there are the hash functions themselves: that would be day one. Next there are several alternatives for storing entries in the hash table, e.g. open addressing vs. chaining: days two and three. Then there are methods for handling collisions, such as linear probing and secondary hashing: that's day four. Finally there are important variations such as perfect hashing, cuckoo hashing, Robin Hood hashing, and so forth: maybe another five days. Some languages are less appropriate for playing around and can make working with algorithms more difficult, so instead of a couple of weeks this could easily take twice as long. After learning other methods of implementing fast lookups, it's time to come back to hashing and understand when it's appropriate, when alternatives are better, and how to combine methods into more sophisticated lookup schemes.
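To make the open-addressing idea concrete, here is a minimal sketch in Python (my own illustration, not taken from any particular book): a fixed-size table that resolves collisions by linear probing. It deliberately skips deletion and resizing, so it only shows the probing mechanics.

    # Minimal open-addressing hash table with linear probing (illustration only:
    # no deletion, no resizing, so don't fill it up).
    class LinearProbingTable:
        def __init__(self, capacity=8):
            self.capacity = capacity
            self.keys = [None] * capacity
            self.values = [None] * capacity

        def _probe(self, key):
            # Start at the hashed slot and walk forward until we find the key
            # or an empty slot.
            i = hash(key) % self.capacity
            while self.keys[i] is not None and self.keys[i] != key:
                i = (i + 1) % self.capacity
            return i

        def put(self, key, value):
            i = self._probe(key)
            self.keys[i] = key
            self.values[i] = value

        def get(self, key):
            i = self._probe(key)
            return self.values[i] if self.keys[i] == key else None

    t = LinearProbingTable()
    t.put("apple", 1)
    t.put("banana", 2)
    print(t.get("apple"), t.get("cherry"))  # 1 None

Swapping out the probe sequence or the eviction policy is exactly where the variants mentioned above (secondary hashing, Robin Hood hashing) come in.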
I think you will be best served by modifying your goal a bit and saying that you will work on learning about algorithms every day and cover all of the material in a typical undergraduate course on the subject. It really is a fun branch of Computer Science.
A great starting point is Sedgewick's book/course, Algorithms [1]. For more depth and theory, try Cormen et al.'s excellent Introduction to Algorithms [2]. Alternatively, the theory is also covered by another book by Sedgewick, An Introduction to the Analysis of Algorithms [3]. A classic reference that goes far beyond these other books is of course Knuth [4], suitable for serious students of Computer Science, less so as a book of recipes.
After these basics, there are books useful for special circumstances. If your goal is to be broadly and deeply familiar with Algorithms you will need to cover quite a bit of additional material.
Numerical methods -- Numerical Recipes 3rd Edition: The Art of Scientific Computing by Press, Teukolsky, Vetterling, and Flannery [5]. I love this book.
Randomized algorithms -- Randomized Algorithms by Motwani and Raghavan [6], and Probability and Computing: Randomized Algorithms and Probabilistic Analysis by Mitzenmacher and Upfal [7].
Hard problems (like NP) -- Approximation Algorithms by Vazirani [8], and How to Solve It: Modern Heuristics by Michalewicz and Fogel [9].
Data structures -- Advanced Data Structures by Brass [10].
Functional programming -- Pearls of Functional Algorithm Design by Bird [11] and Purely Functional Data Structures by Okasaki [12].
Bit twiddling -- Hacker's Delight by Warren [13]; a small taste of this style follows the list.
Distributed and parallel programming -- this material gets very hard, so perhaps start with Distributed Algorithms by Lynch [14].
Machine learning and AI related algorithms -- Bishop's Pattern Recognition and Machine Learning [15] and Russell and Norvig's Artificial Intelligence: A Modern Approach [16].
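To give that small taste of the bit-twiddling style (my own Python illustration; these particular tricks are folklore of the genre rather than anything quoted from Warren):

    def clear_lowest_set_bit(x: int) -> int:
        # x & (x - 1) clears the lowest set bit: the subtraction flips that bit
        # and the zeros below it, and the AND removes exactly that bit.
        return x & (x - 1)

    def popcount(x: int) -> int:
        # Count set bits by repeatedly clearing the lowest one.
        count = 0
        while x:
            x = clear_lowest_set_bit(x)
            count += 1
        return count

    assert clear_lowest_set_bit(0b10110) == 0b10100
    assert popcount(0b10110) == 3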
These books will cover most of what a Ph.D. in CS might be expected to understand about algorithms. It will take years of study to work through all of them. After that, you will be reading about algorithms in journal publications (ACM and IEEE memberships are useful). For example, a recent, practical, and important development in hashing methods is cuckoo hashing, and I don't believe it appears in any of the books I've listed; a rough sketch of the idea follows.
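Here is a minimal Python sketch of the standard cuckoo hashing scheme (my own toy version, not production code): every key has exactly two candidate slots, so a lookup probes at most two positions, and an insert evicts whatever is in the way into its alternate slot.

    import random

    class CuckooTable:
        def __init__(self, capacity=16):
            self.capacity = capacity
            self.slots = [None] * capacity            # entries are (key, value)
            self.seeds = (random.random(), random.random())

        def _positions(self, key):
            # Two independent hash positions for every key.
            return [hash((seed, key)) % self.capacity for seed in self.seeds]

        def get(self, key):
            for i in self._positions(key):
                if self.slots[i] is not None and self.slots[i][0] == key:
                    return self.slots[i][1]
            return None                               # at most two probes, ever

        def put(self, key, value, max_kicks=32):
            entry = (key, value)
            for _ in range(max_kicks):
                for i in self._positions(entry[0]):
                    if self.slots[i] is None or self.slots[i][0] == entry[0]:
                        self.slots[i] = entry
                        return
                # Both candidate slots are occupied: evict one occupant and
                # try to re-place it in its alternate slot (the "cuckoo" step).
                i = random.choice(self._positions(entry[0]))
                self.slots[i], entry = entry, self.slots[i]
            # A real implementation would pick new hash seeds and rehash here.
            raise RuntimeError("insertion failed; table needs a rehash")

    t = CuckooTable()
    t.put("a", 1); t.put("b", 2)
    print(t.get("a"), t.get("b"), t.get("c"))  # 1 2 None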
[1] Sedgewick, Algorithms, 2015. https://www.amazon.com/Algorithms-Fourth-Deluxe-24-Part-Lect...
[2] Cormen, et al., Introduction to Algorithms, 2009. https://www.amazon.com/s/ref=nb_sb_ss_i_1_15?url=search-alia...
[3] Sedgewick, An Introduction to the Analysis of Algorithms, 2013. https://www.amazon.com/Introduction-Analysis-Algorithms-2nd/...
[4] Knuth, The Art of Computer Programming, 2011. https://www.amazon.com/Computer-Programming-Volumes-1-4A-Box...
[5] Press, Teukolsky, Vetterling, and Flannery, Numerical Recipes 3rd Edition: The Art of Scientific Computing, 2007. https://www.amazon.com/Numerical-Recipes-3rd-Scientific-Comp...
[6] Motwani and Raghavan, Randomized Algorithms. https://www.amazon.com/Randomized-Algorithms-Rajeev-Motwani/...
[7] Mitzenmacher and Upfal, Probability and Computing: Randomized Algorithms and Probabilistic Analysis. https://www.amazon.com/gp/product/0521835402/ref=pd_sim_14_2...
[8] Vazirani, Approximation Algorithms. https://www.amazon.com/Approximation-Algorithms-Vijay-V-Vazi...
[9] Michalewicz and Fogel, How to Solve It: Modern Heuristics. https://www.amazon.com/How-Solve-Heuristics-Zbigniew-Michale...
[10] Brass, Advanced Data Structures. https://www.amazon.com/Advanced-Data-Structures-Peter-Brass/...
[11] Bird, Pearls of Functional Algorithm Design. https://www.amazon.com/Pearls-Functional-Algorithm-Design-Ri...
[12] Okasaki, Purely Functional Data Structures. https://www.amazon.com/Purely-Functional-Structures-Chris-Ok...
[13] Warren, Hacker's Delight. https://www.amazon.com/Hackers-Delight-2nd-Henry-Warren/dp/0...
[14] Lynch, Distributed Algorithms. https://www.amazon.com/Distributed-Algorithms-Kaufmann-Manag...
[15] Bishop, Pattern Recognition and Machine Learning. https://www.amazon.com/Pattern-Recognition-Learning-Informat...
[16] Russell and Norvig, Artificial Intelligence: A Modern Approach. https://www.amazon.com/Artificial-Intelligence-Modern-Approa...
0. Milewski's "Category Theory for Programmers"[0]
1. Goldblatt's "Topoi"[1]
2. McLarty's "The Uses and Abuses of the History of Topos Theory"[2] (this does not require [1], it just undoes some historical assumptions made in [1] and, like everything else by McLarty, is extraordinarily well-written)
3. Goldblatt's "Lectures on the Hyperreals"[3]
4. Nelson's "Radically Elementary Probability Theory"[4]
5. Tao's "Ultraproducts as a Bridge Between Discrete and Continuous Analysis"[5]
6. Some canonical machine learning text, like Murphy[6] or Bishop[7]
7. Koller/Friedman's "Probabilistic Graphical Models"[8]
8. Lawvere's "Taking Categories Seriously"[9]
From there you should see a variety of paths for mapping (things:Uncertainty) <-> (things:Structure). The Giry monad is just one of them, and would probably be understandable after reading Barr/Wells' "Toposes, Triples and Theories"[10].
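For a concrete, heavily simplified picture of what the Giry monad does, here is the finite-distribution analogue sketched in Python (my own toy illustration, not drawn from any of the references): unit is a point mass, and bind pushes a distribution through a kernel and marginalizes.

    from collections import defaultdict

    # Toy finite-distribution "monad": a distribution is a dict {outcome: prob}.
    def unit(x):
        return {x: 1.0}

    def bind(dist, kernel):
        # kernel maps an outcome to a distribution; bind marginalizes it out.
        out = defaultdict(float)
        for x, px in dist.items():
            for y, py in kernel(x).items():
                out[y] += px * py
        return dict(out)

    # Flip a fair coin, then roll a d4 or a d6 depending on the flip.
    coin = {"heads": 0.5, "tails": 0.5}

    def roll(side):
        n = 4 if side == "heads" else 6
        return {k: 1.0 / n for k in range(1, n + 1)}

    print(bind(coin, roll))  # the marginal distribution over die outcomes

The Giry monad is the measure-theoretic version of this construction, with probability measures in place of finite dicts and integration in place of the nested sum.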
The above list also assumes some comfort with integration. Particularly good books in line with this pedagogical path might be:
9. Any and all canonical intros to real analysis
10. Malliavin's "Integration and Probability"[11]
11. Segal/Kunze's "Integrals and Operators"[12]
Similarly, some normative focus on probability would be useful:
12. Jaynes' "Probability Theory"[13]
13. Pearl's "Causality"[14]
---
[0] https://bartoszmilewski.com/2014/10/28/category-theory-for-p...
[1] https://www.amazon.com/Topoi-Categorial-Analysis-Logic-Mathe...
[2] http://www.cwru.edu/artsci/phil/UsesandAbuses%20HistoryTopos...
[3] https://www.amazon.com/Lectures-Hyperreals-Introduction-Nons...
[4] https://web.math.princeton.edu/%7Enelson/books/rept.pdf
[5] https://www.youtube.com/watch?v=IS9fsr3yGLE
[6] https://www.amazon.com/Machine-Learning-Probabilistic-Perspe...
[7] https://www.amazon.com/Pattern-Recognition-Learning-Informat...
[8] https://www.amazon.com/Probabilistic-Graphical-Models-Princi...
[9] http://www.emis.de/journals/TAC/reprints/articles/8/tr8.pdf
[10] http://www.tac.mta.ca/tac/reprints/articles/12/tr12.pdf
[11] https://www.springer.com/us/book/9780387944098
[12] https://www.amazon.com/Integrals-Operators-Grundlehren-mathe...
[13] http://www.med.mcgill.ca/epidemiology/hanley/bios601/Gaussia...
[14] https://www.amazon.com/Causality-Reasoning-Inference-Judea-P...
Machine Learning: The Art and Science of Algorithms that Make Sense of Data (Flach): http://www.amazon.com/Machine-Learning-Science-Algorithms-Se...
Machine Learning: A Probabilistic Perspective (Murphy): http://www.amazon.com/Machine-Learning-Probabilistic-Perspec...
Pattern Recognition and Machine Learning (Bishop): http://www.amazon.com/Pattern-Recognition-Learning-Informati...
There are some great resources/books for Bayesian statistics and graphical models. I've listed them in (approximate) order of increasing difficulty/mathematical complexity:
Think Bayes (Downey): http://www.amazon.com/Think-Bayes-Allen-B-Downey/dp/14493707...
Bayesian Methods for Hackers (Davidson-Pilon et al): https://github.com/CamDavidsonPilon/Probabilistic-Programmin...
Doing Bayesian Data Analysis (Kruschke), aka "the puppy book": http://www.amazon.com/Doing-Bayesian-Data-Analysis-Second/dp...
Bayesian Data Analysis (Gelman): http://www.amazon.com/Bayesian-Analysis-Chapman-Statistical-...
Bayesian Reasoning and Machine Learning (Barber): http://www.amazon.com/Bayesian-Reasoning-Machine-Learning-Ba...
Probabilistic Graphical Models (Koller et al): https://www.coursera.org/course/pgm http://www.amazon.com/Probabilistic-Graphical-Models-Princip...
If you want a more mathematical/statistical take on Machine Learning, then the two books by Hastie/Tibshirani et al are definitely worth a read (plus, they're free to download from the authors' websites!):
Introduction to Statistical Learning: http://www-bcf.usc.edu/~gareth/ISL/
The Elements of Statistical Learning: http://statweb.stanford.edu/~tibs/ElemStatLearn/
Obviously there is the whole field of "deep learning" as well! A good place to start is with: http://deeplearning.net/
Also, if you need more information about optimization methods, all of Stephen Boyd's books are really good; just check out his website. http://www.stanford.edu/~boyd/
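If it helps to see the flavor of the first-order methods those optimization texts analyze, here is plain gradient descent on a least-squares objective (my own minimal example; the step size and iteration count are arbitrary choices, not recommendations from Boyd's books):

    import numpy as np

    # Gradient descent on f(w) = ||Xw - y||^2 / (2n).
    rng = np.random.default_rng(0)
    X = rng.normal(size=(100, 3))
    true_w = np.array([1.0, -2.0, 0.5])
    y = X @ true_w + 0.1 * rng.normal(size=100)

    w = np.zeros(3)
    step = 0.1
    for _ in range(500):
        grad = X.T @ (X @ w - y) / len(y)   # gradient of the objective
        w -= step * grad

    print(w)  # should land close to [1.0, -2.0, 0.5]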
After you've got a grasp of what these things are doing, you can move on to the how. For that you will need some math background, with emphasis on calculus and probability.
After that, you can take a look at PRML. https://www.amazon.com/Pattern-Recognition-Learning-Informat...
Some people might prefer seeing things from the graphical-models angle instead. http://pgm.stanford.edu/
Good luck.