In retrospect, my other comment was stupidly obtuse. It was both too technical (in the sense of specificity) and too unstructured (in the sense of presentation order). A more appropriate path from CS might be analogous (well, inverse if anything) to the path Robert Goldblatt has taken. It dips into nonstandard analysis, but not totally without reason. Some subset of the following, with nLab and Wikipedia supplementing as necessary:
0. Milewski's "Category Theory for Programmers"[0]
1. Goldblatt's "Topoi"[1]
2. McLarty's "The Uses and Abuses of the History of Topos Theory"[2] (this does not require [1], it just undoes some historical assumptions made in [1] and, like everything else by McLarty, is extraordinarily well-written)
3. Goldblatt's "Lectures on the Hyperreals"[3]
4. Nelson's "Radically Elementary Probability Theory"[4]
5. Tao's "Ultraproducts as a Bridge Between Discrete and Continuous Analysis"[5]
6. Some canonical machine learning text, like Murphy[6] or Bishop[7]
7. Koller/Friedman's "Probabilistic Graphical Models"[8]
8. Lawvere's "Taking Categories Seriously"[9]
From there you should see a variety of paths for mapping (things:Uncertainty) <-> (things:Structure). The Giry monad is just one of them, and would probably be understandable after reading Barr/Wells' "Toposes, Triples and Theories"[10].
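To make the Giry monad less abstract: over finite outcome spaces it collapses to a simple "distribution monad", which is easy to sketch in code. Below is a minimal illustration (names and representation are my own, not from any of the texts above): a distribution is a list of (outcome, probability) pairs, `unit` is the Dirac point-mass, and `bind` pushes a distribution through a probability kernel and marginalizes.

```python
# Finite-support sketch of the Giry monad's discrete analogue.
# A distribution is a list of (outcome, probability) pairs.
from collections import defaultdict

def unit(x):
    """Dirac distribution concentrated at x (monadic return)."""
    return [(x, 1.0)]

def bind(dist, kernel):
    """Sequence `dist` through `kernel` (a map from outcomes to
    distributions), summing probabilities of identical outcomes --
    i.e. integrate the kernel against the measure, then marginalize."""
    acc = defaultdict(float)
    for x, p in dist:
        for y, q in kernel(x):
            acc[y] += p * q
    return sorted(acc.items())

# Example: flip a fair coin, then re-flip only if it came up heads.
coin = [("H", 0.5), ("T", 0.5)]
reflip = lambda x: coin if x == "H" else unit("T")
print(bind(coin, reflip))  # [('H', 0.25), ('T', 0.75)]
```

The full Giry monad does the same thing with arbitrary measurable spaces in place of finite lists and integration in place of the weighted sum, which is where the measure-theoretic background below earns its keep.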
The above list also assumes some comfort with integration. Particularly good books in line with this pedagogical path might be:
9. Any and all canonical intros to real analysis
10. Malliavin's "Integration and Probability"[11]
11. Segal/Kunze's "Integrals and Operators"[12]
Similarly, some normative focus on probability would be useful:
12. Jaynes' "Probability Theory"[13]
13. Pearl's "Causality"[14]
---
[0] https://bartoszmilewski.com/2014/10/28/category-theory-for-p...
[1] https://www.amazon.com/Topoi-Categorial-Analysis-Logic-Mathe...
[2] http://www.cwru.edu/artsci/phil/UsesandAbuses%20HistoryTopos...
[3] https://www.amazon.com/Lectures-Hyperreals-Introduction-Nons...
[4] https://web.math.princeton.edu/%7Enelson/books/rept.pdf
[5] https://www.youtube.com/watch?v=IS9fsr3yGLE
[6] https://www.amazon.com/Machine-Learning-Probabilistic-Perspe...
[7] https://www.amazon.com/Pattern-Recognition-Learning-Informat...
[8] https://www.amazon.com/Probabilistic-Graphical-Models-Princi...
[9] http://www.emis.de/journals/TAC/reprints/articles/8/tr8.pdf
[10] http://www.tac.mta.ca/tac/reprints/articles/12/tr12.pdf
[11] https://www.springer.com/us/book/9780387944098
[12] https://www.amazon.com/Integrals-Operators-Grundlehren-mathe...
[13] http://www.med.mcgill.ca/epidemiology/hanley/bios601/Gaussia...
[14] https://www.amazon.com/Causality-Reasoning-Inference-Judea-P...