The relevant book for this is Measuring and Managing Information Risk: A FAIR Approach by Freund and Jones[0].

Both it and Hubbard's book are worth reading; Hubbard's influence on FAIR is noticeable and positive. FAIR has the advantage that it comes with a fairly built-out ontology for assembling data or estimates. The OP touches on the top level (Loss Event Magnitude and Loss Event Frequency), but the ontology goes quite deep and can be used at multiple levels of detail.
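
For a sense of the depth, the next couple of levels down decompose roughly like this (a from-memory sketch of the Freund & Jones ontology, not the book's exact diagram):

    Risk
    ├── Loss Event Frequency (LEF)
    │   ├── Threat Event Frequency (TEF)
    │   │   ├── Contact Frequency
    │   │   └── Probability of Action
    │   └── Vulnerability
    │       ├── Threat Capability
    │       └── Resistance Strength
    └── Loss Magnitude
        ├── Primary Loss
        └── Secondary Risk (secondary loss event frequency × magnitude)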

The calculations are not difficult; I've implemented them twice in proofs of concept, including one that produces pretty charts.
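
As a minimal sketch of the top-level arithmetic, here is the kind of Monte Carlo loop I mean (the Poisson/lognormal choices and all the parameter values are illustrative assumptions of mine, not the book's; FAIR practice usually works from calibrated range estimates):

    import numpy as np

    rng = np.random.default_rng(42)

    def annual_losses(lef, magnitude_median, magnitude_sigma, trials=100_000):
        """Simulate total loss per year: an event count, then a magnitude per event."""
        mu = np.log(magnitude_median)              # lognormal located by its median
        events = rng.poisson(lef, size=trials)     # loss events in each simulated year
        totals = np.zeros(trials)
        for i, n in enumerate(events):
            if n:
                totals[i] = rng.lognormal(mu, magnitude_sigma, size=n).sum()
        return totals

    # Hypothetical inputs: one loss event every two years, median loss of 250k.
    losses = annual_losses(lef=0.5, magnitude_median=250_000, magnitude_sigma=1.0)
    print(f"mean annual loss: {losses.mean():,.0f}")
    print(f"95th percentile:  {np.percentile(losses, 95):,.0f}")

Sorting the simulated totals gives you a loss exceedance curve, which is the kind of pretty chart I meant.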

The hard part, to be honest, is that developing good estimates is difficult and frequently uncomfortable, and the gains are not easily internalised.

Additionally, serious tool support is lacking in the places where it would make a difference -- issue trackers, for example.

[0] https://www.amazon.com/Measuring-Managing-Information-Risk-A...

edit -- Another good book in this area is Waltzing with Bears by DeMarco & Lister. A short, funny, insightful read, as you'd expect from the authors of Peopleware: https://www.amazon.com/Waltzing-Bears-Managing-Software-Proj...
