There are reasonable empirical approaches in accepted use today. Any team that follows these methods and reports honestly can be trusted more than a team that doesn't.
Also "trust" is a loaded word .. but mostly we just mean "predictability" - ex: to "trust someone" means to be sure that if you know what they've said, you can predict what they'll do. Much of empirical science is about offering up data and models that aid such predictability, so declaring "trust the science is extinct" is an outright rejection of these empirical methods (ex: randomised control trials) that have taken a long time to mature and take root as standard practice.
What is needed, though, is to separate the science from the policy making. As Dietrich Dörner shows in "The Logic of Failure" [1], people from the hard sciences don't fare well when tasked with making policy for systems of complex, causally interconnected parts, because the heuristics they've learnt don't transfer to that world.
Let science do its thing - which is to inform and educate. Let policies be made by those with a fuller understanding of the system into which the changes need to be effected.
[1] https://www.amazon.com/Logic-Failure-Recognizing-Avoiding-Si...
The AWS EBS outage, Fukushima, Chernobyl, even the Great Chicago Fire (forgive me for comparing AWS to those events).
Sure, there's always a "root" cause, but more importantly, it's the related events that keep adding up and make the failure even worse. I can only imagine how many minor failures happen worldwide on a daily basis where there's only a root cause and no further chain of events.
Once a system is sufficiently complex, I'm not sure it's possible to make it completely fault-tolerant. I'm starting to believe that there's always some chain of events which would lead to a massive failure. And the more complex a system is, the more "chains of failure" exist. It would also become increasingly difficult to plan around failures.
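To put a rough number on that intuition, here's a toy back-of-envelope sketch (my own, not from any of the books mentioned here): if you treat a "chain of failure" as any ordered sequence of two or more distinct components, the count of candidate chains grows combinatorially with the number of interacting components, which is one way to see why planning around every failure path stops being feasible.

    # Toy illustration (my own assumption of what counts as a "chain"):
    # with n interacting components, count the ordered sequences of
    # 2..n distinct components, i.e. the candidate chains of failure.
    from math import perm

    def failure_chains(n: int) -> int:
        return sum(perm(n, k) for k in range(2, n + 1))

    for n in (3, 5, 8, 12):
        print(n, failure_chains(n))
    # 3 -> 12, 5 -> 320, 8 -> 109,592, 12 -> ~1.3 billion

Even if only a tiny fraction of those sequences are physically possible, the growth rate is the point: adding components adds failure paths much faster than it adds anything else.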
edit: The Logic of Failure is recommended to anyone wanting to know more about this subject: http://www.amazon.com/Logic-Failure-Recognizing-Avoiding-Sit...
I will say this: be careful of selection bias! Looking back, sure, if I show you a thousand examples that ended poorly, your response will be something like "But they weren't really smart. Look how poorly it all turned out!" This is, at best, circular reasoning. The important thing is that, at the time, these folks were the best and brightest, and were put in charge for that very reason.
Good starting point: http://www.amazon.com/Logic-Failure-Recognizing-Avoiding-Sit...
- "The Collapse of Complex Societies" by Joseph Tainter (Amazon link: https://www.amazon.com/Collapse-Complex-Societies-Studies-Ar...)
- "Normal Accidents" by Chuck Perrow (Amazon link: https://www.amazon.com/Normal-Accidents-Living-High-Risk-Tec...)
Tainter deals directly with the Roman Empire, but in a nutshell: the cost of complexity begins to outweigh its returns, requiring more and more resources just to maintain the status quo, until the entire thing becomes weak and susceptible to failures large and small.
- "Panarchy: Understanding Human and Natural Systems" (Amazon link: https://www.amazon.com/Panarchy-Understanding-Transformation...) is also a fantastic book drawing from ecosystem science and proposes a general model for this. It's pretty well accepted in ecological circles but has been criticised for a lack of empirical data. The general model is the same as Tainter's though.
- "The Logic of Failure" by Dietrich Dorner is also a classic! (https://www.amazon.com/Logic-Failure-Recognizing-Avoiding-Si...)
Well worth reading all of the above.