Found in 4 comments on Hacker News
sriku · 2021-02-15 · Original thread
There is some danger in publicly adopting a "science has become religion" position. Just as "trust the science" can be used as a rhetorical weapon, "science has become another religion" can be too .. and usually to more dangerous effect, because a lot of what is essentially known bullshit gets accommodated by the latter compared to the former.

There are reasonable empirical approaches in accepted use today. Any team that follows these methods and reports honestly can be trusted more than a team that doesn't.

Also "trust" is a loaded word .. but mostly we just mean "predictability" - ex: to "trust someone" means to be sure that if you know what they've said, you can predict what they'll do. Much of empirical science is about offering up data and models that aid such predictability, so declaring "trust the science" extinct is an outright rejection of those empirical methods (ex: randomised controlled trials) that have taken a long time to mature and take root as standard practice.

What is needed, though, is the ability to separate the science from policy making. As Dietrich Dörner has shown in "The Logic of Failure" [1], folks in the hard sciences don't fare very well when tasked with policy making in systems of complex, causally interconnected parts, because their learnt heuristics don't transfer well to that world.

Let science do its thing - which is inform and educate. Let policies be made by those with a fuller understanding of the system into which changes need to be effected.


bretthopper · 2011-04-29 · Original thread
I've been noticing a trend recently when reading about large scale failures of any system: it's never just one thing.

AWS EBS outage, Fukushima, Chernobyl, even the great Chicago Fire (forgive me for comparing AWS to those events).

Sure, there's always a "root" cause, but more importantly, it's the related events that keep adding up to make the failure even worse. I can only imagine how many minor failures happen worldwide on a daily basis where there's only a root cause and no further chain of events.

Once a system is sufficiently complex, I'm not sure it's possible to make it completely fault-tolerant. I'm starting to believe that there's always some chain of events which would lead to a massive failure. And the more complex a system is, the more "chains of failure" exist. It would also become increasingly difficult to plan around failures.

edit: The Logic of Failure is recommended to anyone wanting to know more about this subject:

I believe the scope of that answer is greater than an HN thread, but I might just be wussing out. Hopefully others will engage you. If not, happy to take it offline.

I will say this: be careful of selection bias! Looking back, sure, if I show you a thousand examples that ended poorly, your response will be something like "But they weren't really smart. Look how poorly it all turned out!" This is, at best, circular reasoning. The important thing is that, at the time, these folks were the best and brightest, and were put in charge for that very reason.

Good starting point:
