Sadly, this often happens because the machine works 99% of the time. So when it starts failing in the remaining 1% of cases, people dismiss the failures because it worked the other 99% of the time. They have trained themselves to trust the results, so questioning them does not even enter their minds.
It does remind me, though, of the book Systemantics: https://www.amazon.com/Systemantics-Systems-Work-Especially-...
One of the core theses of the book is:
The system itself does not actually do what it says it is doing (The Operational Fallacy).
People, and engineers especially, often forget this. Just because the system says it does X does not mean it actually does X. People would do well to cultivate a healthy skepticism toward complex IT systems. They may not fail most of the time, but they inevitably will.