Trying to perceive things that aren't true

David Brooks in today's NYT: "My sense is that this financial crisis is going to amount to a coming-out party for behavioral economists and others who are bringing sophisticated psychology to the realm of public policy. At least these folks have plausible explanations for why so many people could have been so gigantically wrong about the risks they were taking.


Nassim Nicholas Taleb has been deeply influenced by this stream of research. Taleb not only has an explanation for what’s happening, he saw it coming. His popular books “Fooled by Randomness” and “The Black Swan” were broadsides at the risk-management models used in the financial world and beyond.


In “The Black Swan,” Taleb wrote, “The government-sponsored institution Fannie Mae, when I look at its risks, seems to be sitting on a barrel of dynamite, vulnerable to the slightest hiccup.” Globalization, he noted, “creates interlocking fragility.” He warned that while the growth of giant banks gives the appearance of stability, in reality, it raises the risk of a systemic collapse — “when one fails, they all fail.”


Taleb believes that our brains evolved to suit a world much simpler than the one we now face. His writing is idiosyncratic, but he does touch on many of the perceptual biases that distort our thinking: our tendency to see data that confirm our prejudices more vividly than data that contradict them; our tendency to overvalue recent events when anticipating future possibilities; our tendency to spin concurring facts into a single causal narrative; our tendency to applaud our own supposed skill in circumstances when we’ve actually benefited from dumb luck.


And looking at the financial crisis, it is easy to see dozens of errors of perception. Traders misperceived the possibility of rare events. They got caught in social contagions and reinforced each other’s risk assessments. They failed to perceive how tightly linked global networks can transform small events into big disasters.


Taleb is characteristically vituperative about the quantitative risk models, which try to model something that defies modelization. He subscribes to what he calls the tragic vision of humankind, which “believes in the existence of inherent limitations and flaws in the way we think and act and requires an acknowledgement of this fact as a basis for any individual and collective action.” If recent events don’t underline this worldview, nothing will.


If you start thinking about our faulty perceptions, the first thing you realize is that markets are not perfectly efficient, people are not always good guardians of their own self-interest and there might be limited circumstances when government could usefully slant the decision-making architecture (see “Nudge” by Thaler and Cass Sunstein for proposals). But the second thing you realize is that government officials are probably going to be even worse perceivers of reality than private business types. Their information feedback mechanism is more limited, and, being deeply politicized, they’re even more likely to filter inconvenient facts.


This meltdown is not just a financial event, but also a cultural one. It’s a big, whopping reminder that the human mind is continually trying to perceive things that aren’t true, and not perceiving them takes enormous effort."


I've finally realized why I'm so drawn to Brooks. We share so much in the way we think about humanity, yet the conclusions we draw from that thinking are so often diametrically opposed.
