Found in 13 comments on Hacker News
Buttons840 · 2023-01-20 · Original thread
Reminds me of what we humorously learn from The Systems Bible[0]: "When a fail-safe system fails, it fails by failing to fail safely."

The book mentions Three Mile Island, where a problem in a secondary system (an added safety system) spread and caused the system as a whole to fail. It's a tongue-in-cheek way of illustrating a serious issue in system design, though I wonder if that interpretation of what happened at Three Mile Island is a bit of a stretch. (And I may misremember the book.)

"The accident to unit 2 happened at 4 am on 28 March 1979 when the reactor was operating at 97% power. It involved a relatively minor malfunction in the secondary cooling circuit which caused the temperature in the primary coolant to rise..."[1]

[0]: [1]:

bradgessler · 2022-03-07 · Original thread
I just redefined technology as “something made by humans that barely works”

If you look at the stacks we build everything on top of, they're insanely brittle and barely work. We simply lie to ourselves that it's otherwise so we can get out of bed in the morning and actually ship something that's useful to another human being.

If you don’t believe me, look at the organization or source code behind the black box you’ve built everything on. By some miracle it works, but barely.

I’ve found the best book that describes this is “The Systems Bible” —

You’ll chuckle as you read it and learn to embrace the miracle that anything works.

bradgessler · 2022-02-18 · Original thread
I’ve developed a heuristic where anytime I hear phrases like “affordable housing”, “make college affordable”, “affordable healthcare” I simply translate it to “unaffordable”.

Inevitably, when these become government programs, they simply throw money at the problem without addressing fundamental supply and demand issues and accomplish very little except driving up prices. Then, when investors see prices going up and to the right, they rationally throw money at it in a way that makes even more money, which drives up prices even more.

There are so many charts that clearly show the trend. It's insane.

There's a really good book I recommend called "The Systems Bible" that beautifully articulates this phenomenon in a way that applies to government policy, software systems, management, org structures, or any complex system of people or machines.

smacktoward · 2019-03-21 · Original thread
Any HN reader who has not read Gall's book Systemantics (whose latest printing is under the title The Systems Bible) really ought to rectify that. It's not perfect, but it will give you lots of things to think about, and Gall's witty writing style makes it a fun, easy read.

(I pulled my copy out just a few weeks ago so I could quote a different Gallism that was relevant to an HN discussion.)

smacktoward · 2019-01-23 · Original thread
I've read it, and own a copy. It's very good. There are bits you can argue with, and some of the examples used (like the one cited here) are a little too just-so, but it is absolutely a book that will make you think. And it's funny too, which is nice.

(If you want to pick it up, be aware that in its most recent printing the title changed from Systemantics to The Systems Bible.)

smacktoward · 2019-01-08 · Original thread
The classic (and hilarious) book The Systems Bible has some deliciously tart words on this subject:

We present the Fundamental Law of Administrative Workings (F.L.A.W.): THINGS ARE WHAT THEY ARE REPORTED TO BE...

The net effect of this Law is to ensure that people in Systems are never dealing with the real world that the rest of us have to live in, but instead with a filtered, distorted, and censored version which is all that can get past the sensory organisms of the System itself...

This effect has been studied in detail by a small group of dedicated General Systemanticists. In an effort to introduce quantitative methodology into this important area of research they have paid particular attention to the amount of information that reaches, or fails to reach, the attention of the relevant administrative officer or corresponding Control Unit.

The crucial variable, they have found, is the fraction Ro/Rs, where Ro represents the amount of Reality which fails to reach the Control Unit, and Rs equals the total amount of Reality presented to the System. The fraction Ro/Rs varies from zero (full awareness of outside reality) to unity (no reality getting through). It is known, naturally enough, as the COEFFICIENT OF FICTION.
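In the spirit of the book's mock-quantitative style, the ratio it defines is trivial to compute; here is a playful sketch (my own, not from the book), using the quote's symbols Ro and Rs:

```python
def coefficient_of_fiction(reality_blocked: float, reality_total: float) -> float:
    """Ro/Rs: the fraction of Reality that fails to reach the Control Unit.

    0.0 means full awareness of outside reality;
    1.0 means no reality getting through at all.
    """
    if reality_total <= 0:
        raise ValueError("total Reality presented to the System must be positive")
    if not 0 <= reality_blocked <= reality_total:
        raise ValueError("blocked Reality must be between 0 and the total")
    return reality_blocked / reality_total

# A System whose filters stop 9 of every 10 facts on the way up:
print(coefficient_of_fiction(9, 10))  # 0.9 -- mostly fiction
```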

ekke · 2018-10-21 · Original thread
This is fascinating. For anyone interested in a slightly odd but unique and in-depth view of systems design and failure, I would like to recommend "The Systems Bible: The Beginner's Guide to Systems Large and Small" by John Gall.

chubot · 2017-09-03 · Original thread
Yeah that quote definitely crossed my mind. However I think the OP is confusing ideas vs. systems:

The next big idea invariably seems to grow out of the next small idea; ideas that are big from the beginning almost never work.

That doesn't really make sense with respect to ideas. The real quote is about SYSTEMS, from this book:

The system is the realization of the idea. You can have a big idea, but you can't implement it all at once. TBL had a big idea, which is necessarily a big system. So he grew it from a very small piece of code (HTTP 1.0 was ridiculously simple.) There was an unbroken chain from small system to big system.
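To give a sense of how small that starting point was, here is roughly what a complete HTTP/1.0 exchange looks like (a sketch of mine; `example.com` is just a placeholder host):

```python
# A complete HTTP/1.0 request is one request line, optional headers,
# and a blank line. The whole protocol fit in a short RFC (1945).
request = (
    "GET /index.html HTTP/1.0\r\n"
    "Host: example.com\r\n"
    "\r\n"
)

# Sending it over a plain TCP socket is all an early client needed:
# import socket
# s = socket.create_connection(("example.com", 80))
# s.sendall(request.encode("ascii"))
# print(s.recv(4096).decode("ascii", "replace"))
```

That tiny core is the "simple system that worked" from which the modern web's far larger system evolved.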

The misleading thing about Linux is that it IS IN FACT a big idea -- it's just not a technological idea. We already knew how to write monolithic kernels. But the real innovation is the software development process. The fact that thousands of programmers can ship a working kernel with little coordination is amazing. That Linus wrote git is not an accident; he's an expert in software collaboration and evolution.

Linux is a universal hardware abstraction layer, which is an easy idea in theory, but extremely difficult in practice until Linus figured out how to make it work.

So Linux is a big idea too, as well as a small system that grew into a big system.


This reminds me of Paul Graham's advice:

Let me conclude with some tactical advice. If you want to take on a problem as big as the ones I've discussed, don't make a direct frontal attack on it. Don't say, for example, that you're going to replace email. If you do that you raise too many expectations. Your employees and investors will constantly be asking "are we there yet?" and you'll have an army of haters waiting to see you fail. Just say you're building todo-list software. That sounds harmless.

Empirically, the way to do really big things seems to be to start with deceptively small things. Want to dominate microcomputer software? Start by writing a Basic interpreter for a machine with a few thousand users. Want to make the universal web site? Start by building a site for Harvard undergrads to stalk one another.

I think that's pretty much in line with what's said here. You can have a big idea, a big 10-year goal, but you have to break it into steps. Gates had an explicit goal of "a PC on every desk" and Zuckerberg had an explicit goal of "connecting the world" (at some point, not at the very beginning). But they necessarily started small.

chadaustin · 2015-07-06 · Original thread
Some of my personal favorite resources on this topic:

I really enjoyed these books, but I am not super well-read in this area, so there may be better ones out there. You could try searching for "systems theory" and see what other resources are out there.

Systems theory is a very broad topic, so you'll find it attached to many specific disciplines, but the general idea is that you can take a bunch of simple things, hook them together, and produce a "being" that has totally weird behavior in aggregate.
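A classic toy illustration of that idea (my sketch, not from the books above) is an elementary cellular automaton: every cell follows one trivial local rule, yet Wolfram's Rule 30 produces famously chaotic behavior in aggregate:

```python
def rule30_step(cells):
    """Advance one generation of Rule 30 on a ring of 0/1 cells.

    Each cell looks only at itself and its two neighbors
    (new = left XOR (center OR right)), yet the row as a whole
    behaves unpredictably -- simple parts, weird aggregate "being".
    """
    n = len(cells)
    return [cells[(i - 1) % n] ^ (cells[i] | cells[(i + 1) % n]) for i in range(n)]

row = [0] * 31
row[15] = 1  # start from a single live cell in the middle
for _ in range(15):
    print("".join("#" if c else " " for c in row))
    row = rule30_step(row)
```

Running it prints the characteristic irregular triangle that no inspection of the one-line rule would lead you to expect.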

bcbrown · 2013-09-17 · Original thread
>The lesson, applicable to games, startups, governance, and indeed systems engineering in general, is that systems react.

A great humorous look at how systems react is:

I found it very insightful, and very relevant to software engineering, too.

gruseom · 2011-04-29 · Original thread
read Systemantics

Oh yes. It's a classic that deserves to be much better known. Anybody engaged with complex systems - such as software or software projects - will find all kinds of suggestive things in there. As for "dry"... come now, it's hilarious and has cartoons.

Basically, just get it. Here, I'll help:

(They ruined the title but it's the same book.)

gruseom · 2008-09-14 · Original thread
The book that this is taken from is one of my all-time favorites, a quirky classic that deserves to be better known. Underneath its irreverence are quite a few profound insights. (I'd quote a couple but I lent out my copy.)

More here:

Edit: oh yeah, the book is

gruseom · 2008-05-06 · Original thread
Just popped into my mind: do you know the book "Systemantics" (also known as "The Systems Bible") by John Gall? It's not on education but is very much in the space we're discussing. It's a brilliant (and hilarious) underground classic. I think you would like it. A lot of people here would. It's irreverent and subversive and incredibly smart and not rigid.

One of its more famous aphorisms (relevant to the software startup crowd) is "A complex system that works is invariably found to have evolved from a simple system that worked".
