Introduction

In the fall of 2004, a European media mogul invited me to Munich to partake in what was described as an “informal exchange of intellectuals.” I had never considered myself an “intellectual”—I had studied business, which made me quite the opposite, really—but I had also written two literary novels and that, I guessed, must have qualified me for such an invitation.

Nassim Nicholas Taleb was sitting at the table. At that time, he was an obscure Wall Street trader with a penchant for philosophy. I was introduced to him as an authority on the English and Scottish Enlightenment, particularly the philosophy of David Hume. Obviously I had been mixed up with someone else. Stunned, I nevertheless flashed a hesitant smile around the room and let the resulting silence act as proof of my philosophical prowess. Right away, Taleb pulled over a free chair and patted the seat. I sat down. After a cursory exchange about Hume, the conversation mercifully shifted to Wall Street. We marveled at the systematic errors in decision making that CEOs and business leaders make—ourselves included. We chatted about the fact that unexpected events seem much more likely in retrospect. We chuckled about why it is that investors cannot part with their shares when they drop below the acquisition price.

Following the event, Taleb sent me pages from his manuscript, a gem of a book, which I commented on and partly criticized. These went on to form part of his international best seller, The Black Swan. The book catapulted Taleb into the intellectual all-star league. Meanwhile, my appetite whetted, I began to devour books and articles written by cognitive and social scientists on topics such as “heuristics and biases,” and I also increased my e-mail conversations with a large number of researchers and started to visit their labs. By 2009, I realized that, alongside my job as a novelist, I had become a student of social and cognitive psychology.

The failure to think clearly, or what experts call a “cognitive error,” is a systematic deviation from logic—from optimal, rational, reasonable thought and behavior. By “systematic,” I mean that these are not just occasional errors in judgment but rather routine mistakes, barriers to logic we stumble over time and again, repeating patterns through generations and through the centuries. For example, it is far more common for us to overestimate our knowledge than to underestimate it. Similarly, the danger of losing something stimulates us much more than the prospect of making a similar gain. In the presence of other people we tend to adjust our behavior to theirs, not the opposite. Anecdotes make us overlook the statistical distribution (base rate) behind them, not the other way round. The errors we make follow the same pattern over and over again, piling up in one specific, predictable corner like dirty laundry, while the other corner remains relatively clean (i.e., they pile up in the “overconfidence corner,” not the “underconfidence corner”).

To avoid frivolous gambles with the wealth I had accumulated over the course of my literary career, I began to put together a list of these systematic cognitive errors, complete with notes and personal anecdotes—with no intention of ever publishing them. The list was originally designed to be used by me alone. Some of these thinking errors have been known for centuries; others have been discovered in the last few years. Some come with two or three names attached to them. I chose the terms most widely used. Soon I realized that such a compilation of pitfalls was not only useful for making investing decisions but also for business and personal matters. Once I had prepared the list, I felt calmer and more levelheaded. I began to recognize my own errors sooner and was able to change course before any lasting damage was done. And, for the first time in my life, I was able to recognize when others might be in the thrall of these very same systematic errors. Armed with my list, I could now resist their pull—and perhaps even gain an upper hand in my dealings. I now had categories, terms, and explanations with which to ward off the specter of irrationality. Since Benjamin Franklin’s kite-flying days, thunder and lightning have not grown less frequent, powerful, or loud—but they have become less worrisome. This is exactly how I feel about my own irrationality now.

Friends soon learned of my compendium and showed interest. This led to a weekly newspaper column in Germany, Holland, and Switzerland, countless presentations (mostly to medical doctors, investors, board members, CEOs, and government officials), and eventually to this book.

Please keep in mind three things as you peruse these pages: First, the list of fallacies in this book is not complete. Undoubtedly new ones will be discovered. Second, the majority of these errors are related to one another. This should come as no surprise. After all, all brain regions are linked. Neural projections travel from region to region in the brain; no area functions independently. Third, I am primarily a novelist and an entrepreneur, not a social scientist; I don’t have my own lab where I can conduct experiments on cognitive errors, nor do I have a staff of researchers I can dispatch to scout for behavioral errors. In writing this book, I think of myself as a translator whose job is to interpret and synthesize what I’ve read and learned—to put it in terms others can understand. My great respect goes to the researchers who, in recent decades, have uncovered these behavioral and cognitive errors. The success of this book is fundamentally a tribute to their research. I am enormously indebted to them.

This is not a how-to book. You won’t find “seven steps to an error-free life” here. Cognitive errors are far too ingrained for us to rid ourselves of them completely. Silencing them would require superhuman willpower, but that isn’t even a worthy goal. Not all cognitive errors are toxic, and some are even necessary for leading a good life. Although this book may not hold the key to happiness, at the very least it acts as insurance against too much self-induced unhappiness.

Indeed, my wish is quite simple: If we could learn to recognize and evade the biggest errors in thinking—in our private lives, at work, or in government—we might experience a leap in prosperity. We need no extra cunning, no new ideas, no unnecessary gadgets, no frantic hyperactivity—all we need is less irrationality.