One of our biases is towards conservation of mental energy. If we can, we choose the easiest path. Which means that we actively prefer experts who are smart, well-informed, and certain of their beliefs. Because we can then be comfortable handing our thinking over to them. Why bother questioning the expert when we'd surely agree if we just knew enough? And then confirmation bias keeps us from ever questioning their results.
The result, as https://www.amazon.com/Expert-Political-Judgment-Good-Know/d... documents, is that our most popular and highest paid pundits have those characteristics. But those experts are ALSO the ones who are worst at making predictions about the world. As I pointed out in https://www.lesswrong.com/posts/TMgfapzbt5qp4Hszf/doubt-cert..., we should really be concluding "likely has cognitive bias" instead of "the expert says it, so case closed".
The documented result is that we wind up trusting the experts that we should trust least!
There is a very simple litmus test to tell which are likely to fade, versus which aren't. How certain are they?
https://www.amazon.com/Expert-Political-Judgment-Good-Know/d... reports on a long-term study about pundits. Basically they can be divided into two groups:
1. Hedgehogs. Have one overarching theory that they are certain of.
2. Foxes. Can pick and choose from a variety of sources.
Both groups are smart, informed, and interesting. But when you follow them for a long time, a clear pattern emerges.
- Foxes do much better at making future predictions that come true.
- Hedgehogs become far more popular pundits, and generally wind up getting paid far better. Most pundits with popular shows are hedgehogs.
Why? My theory is that we listen to pundits because it is comfortable to outsource our thinking to them. We find that comfortable if they are smart, well-informed, and certain. It is easy for us to think, "Well if this smart and well-informed guy is so certain, I'd surely agree if I did the work. So now I don't have to bother."
There is a problem here. We become certain when it is easy for us to think a thing true, and hard for us to think it might be false. We feel that the evidence is truly overwhelming. It may be overwhelming. But it is more likely that we're simply being intellectually dishonest. So we actively choose intellectually dishonest pundits who agree with our presumptions, and then become sure that they are right. We enjoy listening to them. But, being intellectually dishonest, they are probably wrong. And now we're emotionally committed to their brand of insanity!
Try this rule of thumb out. Assume that a person who is certain is probably wrong. And when you find yourself feeling certain, nurse that little doubt about how you REALLY know. It takes time, but consistently making this choice can change your life. For a start, you'll start actually thinking about things that you currently only think you're thinking about.
Statistics is ultimately counting, and is therefore incredibly vulnerable to discretion in choosing "what counts". Take the unemployment rate as one example. When people realize that it's not equivalent to "people who do not have a job", can you really complain that they trust statistics less?
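The point about "what counts" can be made concrete with a toy calculation. All the population numbers below are hypothetical, and the two definitions are simplified stand-ins for the narrower and broader official measures; the takeaway is only that the same population yields different "unemployment rates" depending on who you decide counts:

```python
# Illustrative only: hypothetical population counts, not real labor data.
# The point: the "unemployment rate" depends entirely on who counts as
# "unemployed" and who counts as part of the "labor force".

employed = 150_000
actively_seeking = 8_000      # jobless and searched for work recently
discouraged = 4_000           # jobless, want work, but stopped searching
not_in_labor_force = 90_000   # retirees, students, caregivers, etc.

# Headline-style rate: only active seekers count as unemployed,
# and discouraged workers are excluded from the labor force entirely.
headline = actively_seeking / (employed + actively_seeking)

# Broader rate: count discouraged workers as both unemployed
# and part of the labor force.
broad = (actively_seeking + discouraged) / (
    employed + actively_seeking + discouraged
)

print(f"headline rate: {headline:.1%}")  # ~5.1%
print(f"broad rate:    {broad:.1%}")     # ~7.4%
```

Same people, same jobs, and the rate moves by nearly half depending on a definitional choice made before any counting starts.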
Expert authority is in decline because it should be: there is an increasing body of evidence that experts, from politics to medicine, have almost no advantage in forecasting power over the average person.
Why should "experts" (often just pundits) have any authority when they have consistently demonstrated they deserve very little?
Finally, the political slant of this article, going along with the decried "fake news", blaming the election results on these declines in authority, is pathetic. It's basically an extension of "the other side is filled with stupids" and has no credibility, no matter how you dress it in professional journalistic veneer.
EDIT: The intent here was to expose overconfidence and vague predictions, not to pay fan service to ethereum or suggest an actual bet. If anybody is interested in how to make proper predictions, I recommend the books by Philip Tetlock, especially the latest, Superforecasting.
EDIT2: I wasn't aware my views were so controversial, so here is some more background. If somebody was convinced something couldn't happen, they'd assign a probability of 0 to that event. If they wanted to act according to their beliefs, taking on bets against the event, no matter the odds, would have positive expected utility. Since almost nobody takes such bets, it suggests that we generally exaggerate when we say things like "impossible" or "sorry for your loss", hence we are being overconfident.
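The expected-utility argument above can be sketched in a few lines. The stakes and probabilities here are made up for illustration; the only claim is the arithmetic: a genuine probability-0 belief makes any bet against the event positive in expectation.

```python
# Sketch of the argument: if you truly believe an event has probability 0,
# then betting against it at ANY odds has positive expected value for you.
# All numbers are hypothetical.

def expected_value(p_event, stake, payout_if_no_event):
    """Expected value of a bet that pays out if the event does NOT happen,
    and loses the stake if it does."""
    return (1 - p_event) * payout_if_no_event - p_event * stake

# Someone who says "impossible" is claiming p = 0, so even risking 100
# to win 1 is positive in expectation:
print(expected_value(p_event=0.0, stake=100, payout_if_no_event=1))   # 1.0

# Someone who really means "very unlikely" (say p = 0.05) should
# decline that same lopsided bet:
print(expected_value(p_event=0.05, stake=100, payout_if_no_event=1))  # -4.05
```

That almost nobody accepts such bets is the tell: "impossible" usually means "unlikely", stated with more confidence than the speaker actually holds.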
The other problem is vagueness. By not being clear about what exactly we are predicting, we leave the door open to back out of it later. In fact, Tetlock found that, by making vague predictions, experts could later convince themselves (and others) that they were "close", skewing their sense of accuracy. Unfortunately, when subjected to a prediction tournament with strict rules, they would score no better than random.
Ehrlich gets credit in my book for making concrete falsifiable predictions and acknowledging some are wrong. He follows a classic pattern identified by Tetlock in claiming his major error was a matter of timing.
1. To what extent did his efforts help forestall the catastrophe he predicted? Any at all? In the article, his efforts are cited as having a significant impact on Indian family planning policies. Any way to measure the impact on the population numbers?
2. Was anybody citing Borlaug and the Green Revolution at the time he made his prediction? Was the Green Revolution complete at that time, or were its impacts a foregone conclusion? If not, how much more probable might his predictions have been absent this revolution? (I guess this question hints at another excuse-making Tetlock pattern: the historical counterfactual, or "I just got unlucky!")