Nassim Taleb on Intellectual Idiots, the Lindy effect, and Ensemble Probabilities

The intellectual yet idiot

In Skin in the Game, Nassim Taleb has a colorful name for the highbrow establishment figure who makes routine thinking errors: “intellectual yet idiot”. Such individuals, he says, “mistake absence of evidence for evidence of absence”; “get first-order logic right, but not second-order (or higher effects)”; and “think in statics not dynamics”.

“Cass Sunstein and Richard Thaler types — those who want to “nudge” us into some behavior — much of what they would classify as “rational” or “irrational”… comes from their misunderstanding of probability theory and cosmetic use of first-order models. They are also prone to mistake the ensemble for the linear aggregation of its components — that is, they think that our understanding of single individuals allows us to understand crowds and markets, or that our understanding of ants allows us to understand ant colonies.”

Nassim Taleb. (2017). Skin in the Game.

The absence of evidence

For starters, they do not see the asymmetry between history and perception, that “history is largely peace punctuated by wars, not wars punctuated by peace”. Yet the very opposite seems to be the common impression. Taleb attributes this in part to the availability heuristic, which describes our tendency to overweight information that comes immediately to mind.1 

Indeed, journalists and historians look for events, not non-events. We talk about the airplane that crashes, not the thousands of flights that land safely every day. And while this outsized reaction “helps us to be prudent and careful in daily life, … it does not help with scholarship”. Do not ignore the “absence of data points” or mistake “intensity for frequency”, Taleb reminds us.

1 Hans Rosling makes a similar argument in Factfulness: Ten Reasons We’re Wrong About the World. He describes how our propensity for negativity, straight lines, generalizations, fear and urgency can distort our worldview and decision-making quality.

Over-narrate and overload

The intellectual-yet-idiot manifests in many more ways. The news media, for example, has an incessant need to relate daily price changes in financial markets to national and global events. While some stories are meaningful signals, a lot of it is noise, randomness, and pure hogwash. They know it. We know it. And yet the practice persists. 

This speaks to a broader problem of overfitting and over-narrating. Life is more complex and nonlinear than a game of baseball. We cannot explain most real world events in one headline. Just as well, “it is harder for us to reverse-engineer than [to] engineer”. We might understand the forces of evolution, “but cannot replicate them owing to their causal opacity”. 

Relatedly, Taleb shares a bugbear I’ve long held about my peers in consulting and finance: the tendency to inundate our clients with models and charts and call it rigor. While sometimes warranted, it is often used as an overloading tactic. The intellectual-yet-idiot finds a false sense of security in numbers and whizzbang decks that he or she does not fully understand.

“People mistake empiricism for a flood of data. Just a little bit of significant data is needed when one is right, particularly when it is disconfirmatory empiricism, or counterexamples… Traders, when they make profits, have short communications; when they lose they drown you in details, theories, and charts. … So I’ve discovered, with experience, that when you buy a thick book with tons of graphs and tables used to prove a point, you should be suspicious. It means something didn’t distill right!”

Nassim Taleb. (2017). Skin in the Game. 

The Lindy effect

So, “who will judge the experts?” “Who will guard the guard?” The answer, Taleb says, is survival. Ideas, he suggests, have a life expectancy of sorts. Aging, accidents, and competition, as a function of time, will eliminate the old and usher in the new. That’s not to say that only good ideas will persist over time. Dangerous ideas can endure just as well. Religious cults, pyramid schemes, and intellectual idiots thrive to this day for this exact reason.

Related to time and survival is the Lindy proof or Lindy effect. “That which is Lindy”, Taleb writes, “is what ages in reverse” — “its life expectancy lengthens with time, conditional on survival”. 

Individual organisms, for example, are not Lindy proof. Senescence is inescapable. Death awaits every animal. Might Homo sapiens as a species, or insects as a taxonomic class, be Lindy-like? I suspect so. We are a rather adaptive lot. But our descendants will be the ones to verify this someday.

Many social constructs, like ideas and institutions, appear Lindy proof as well. They include scientific theories that survive expert scrutiny and religions that achieve critical mass. In each case, their position appears to improve with the benefit of time. As Taleb explains:

“The reason science works isn’t because there is a proper “scientific method” derived by some nerds in isolation, … rather it is because scientific ideas are Lindy-prone… An idea will fail if it is not useful, and can be therefore vulnerable to the falsification of time… The longer an idea has been around without being falsified, the longer its future life expectancy… For things to survive, they necessarily need to fare well in the risk dimension, that is, be good at not dying… Survival comes first, truth, understanding, and science later.”

Nassim Taleb. (2017). Skin in the Game.
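The “ages in reverse” property can be made concrete with a toy model. The sketch below is my own illustration, not Taleb’s: it assumes lifetimes follow a fat-tailed Pareto distribution and shows that, conditional on having survived to a given age, the expected remaining lifetime grows with age.

```python
import random

random.seed(0)

# Illustrative assumption: lifetimes drawn from a fat-tailed Pareto
# distribution with shape alpha = 3 (minimum value 1). Taleb's book
# contains no code; this is only a toy model of the Lindy effect.
ALPHA = 3
lifetimes = [random.paretovariate(ALPHA) for _ in range(1_000_000)]

def expected_remaining(age):
    """Mean remaining lifetime among items that survived to `age`."""
    remaining = [t - age for t in lifetimes if t > age]
    return sum(remaining) / len(remaining)

# For Pareto(alpha=3), theory gives E[remaining | survived to a] = a / 2:
# the longer something has already lasted, the longer it should last.
for age in (1, 2, 4, 8):
    print(f"survived to {age} -> roughly {expected_remaining(age):.1f} more to go")
```

A perishable item (say, a fixed human lifespan) behaves the opposite way: remaining lifetime shrinks with age. The fat tail is what makes the expectation lengthen instead.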

Rationality and survival

Rationality, Taleb argues, is poorly defined in our everyday use of the word. Many books in behavioral psychology and economics, for example, pronounce their subjects irrational on the basis of controlled experiments that bear little resemblance to the situations people encounter in real life.

What’s more, such papers rarely account for the subject’s circumstances. For example, can you truly say, for a given gamble in some behavioral experiment, whether someone is too risk averse or too risk seeking without knowing their personal history and portfolio?

“The flaw in psychology papers is to believe that the subject doesn’t take any other tail risks anywhere outside the experiment and, crucially, will never again take any risk at all. … You cannot possibly ignore all the other financial risks he [or she] is taking… All these risks add up, and the attitude of the subject reflects them all.”

Nassim Taleb. (2017). Skin in the Game. 

As Mervyn King and John Kay suggest in Radical Uncertainty, it helps to distinguish between economic rationality and evolutionary rationality. Taleb shares a similar view. Because of “causal opacity” and the filters of time, the only practical definition of rationality he accepts “is that which allows for survival”.2

As Taleb summarizes: 

“Rationality does not depend on explicit verbalistic explanatory factors; it is only what aids survival, what avoids ruin. Why?… Not everything that happens happens for a reason, but everything that survives survives for a reason… Rationality is risk management, period.”

Nassim Taleb. (2017). Skin in the Game. 

Rationality and risk management

In this way, cultural and social norms that persist to this day, no matter how outlandish they may seem, are rational in the sense that they have survived somehow. Superstition persists, and “religion exists to enforce tail risk management across generations”, Taleb writes.

The question we must ask, of course, is risk management for whom? In some settings, institutions provide a framework for coordination, cooperation, and survival. But in others, they degenerate into extractive regimes, designed for a ruling elite at the expense of everyone else.

2 Taleb highlights three thinkers who have influenced his views on rationality: political scientist Herbert Simon (bounded rationality), mathematician Ken Binmore (revealed preferences), and psychologist Gerd Gigerenzer (ecological rationality).

Time and ensemble probabilities

When it comes to risk management, we must also distinguish between ensemble and time in probability. Ensemble probability “is concerned with a collection of people”, while time probability looks at “a single person through time”.

Indeed, if ten thousand people each smoke a pack of cigarettes, their lives will probably carry on. But if you smoke ten thousand packs yourself, you might need an ambulance. From finance to healthcare, the distinction between ensemble and time, and between risk and ruin, is important.
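The distinction can be sketched numerically. The toy simulation below is my own construction, not the book’s: an assumed multiplicative bet whose ensemble average grows even though any single player’s wealth decays over time.

```python
import random

random.seed(42)

def play(rounds):
    """Multiply wealth by 1.5 on heads, 0.6 on tails. The one-round
    expectation is 0.5 * 1.5 + 0.5 * 0.6 = 1.05, a seemingly good bet."""
    wealth = 1.0
    for _ in range(rounds):
        wealth *= 1.5 if random.random() < 0.5 else 0.6
    return wealth

# Ensemble view: many players, one round each -> average wealth grows (~1.05).
ensemble_avg = sum(play(1) for _ in range(100_000)) / 100_000

# Time view: one player, many rounds -> wealth almost surely collapses,
# because the per-round geometric growth rate is sqrt(1.5 * 0.6) ≈ 0.95 < 1.
one_player = play(10_000)

print(f"ensemble average: {ensemble_avg:.3f}")
print(f"one player after 10,000 rounds: {one_player:.3e}")
```

The “good on average” bet is ruinous to anyone who keeps playing it; the ensemble expectation is simply the wrong quantity for an individual exposed through time.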

People are sometimes derided for being more afraid of terrorists than diabetes. The latter, after all, is among the biggest killers in advanced economies. But while this is statistical fact, it underplays the multiplicative risk of terrorism, which can proliferate like an epidemic. 

Don’t misconstrue this, however, as commentary about the reasonableness of national fears and policy priorities. It is simply a reminder about the comparability of different risks when accumulation and multiplication come into focus. As Taleb stresses, the “interactions matter”.

“Never cross a river if it is on average four feet deep… [And] never compare a multiplicative, systemic, and fat-tailed risk to a non-multiplicative, idiosyncratic, and thin-tailed one… Risk and ruin are different things”.

Nassim Taleb. (2017). Skin in the Game.
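Taleb’s multiplicative point can be illustrated with a toy branching process (again my own sketch, with assumed parameters, not anything from the book): most outbreaks fizzle, but the rare one that multiplies dwarfs the typical case, which is why comparing such a risk to a thin-tailed one by its average is misleading.

```python
import random

random.seed(1)

def outbreak_size(cap=2_000):
    """Total cases in a toy epidemic: each case exposes 3 contacts and
    infects each with probability 0.35 (about 1.05 offspring per case,
    just above the critical threshold of 1). Stop counting at `cap`."""
    total, active = 1, 1
    while active and total < cap:
        new = sum(1 for _ in range(active * 3) if random.random() < 0.35)
        total += new
        active = new
    return total

sizes = sorted(outbreak_size() for _ in range(500))
print(f"median outbreak: {sizes[250]} cases, largest: {sizes[-1]}+ cases")
# Most runs die out after a handful of cases; a minority multiply past
# any fixed threshold. The tail, not the typical run, carries the risk.
```

A thin-tailed hazard with the same average toll has no comparable tail: its worst year looks much like its typical year. That asymmetry is the substance of “risk and ruin are different things”.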

One idiot’s hope

With that, we return to Taleb’s lament on the intellectual-yet-idiot. Is there any hope for us error-prone mortals? To be frank, it doesn’t look so good.

True, we can train ourselves to think critically. But the human mind isn’t equipped to handle complexity and probability effectively in an everyday sense. (And our education system does us no favors by training us in textbook problems that contain only neat solutions.)

With that said, however, the average person can still enjoy one advantage over the intellectual-yet-idiot. It’s better, after all, to be a cat that knows it is a cat than a cat that believes it is a tiger. Intellectual humility and honesty can go a long way.

Sources and further reading