
The Undoing Project

A Friendship That Changed Our Minds

by Michael Lewis

★★★★
4.09 avg rating — 75,055 ratings

Book Edition Details

ISBN: 0393354776
Publisher: W. W. Norton & Company
Publication Date: 2017
Reading Time: 10 minutes
Language: English
ASIN: 0393354776

Summary

In a world where the human mind is both a labyrinth and a marvel, Daniel Kahneman and Amos Tversky emerged as the audacious trailblazers who dared to map its intricacies. "The Undoing Project" by Michael Lewis captures the electrifying partnership of these two pioneering psychologists, whose groundbreaking insights into human decision-making have redefined our understanding of choice and reason. From the bustling halls of academia to the practical realms of economics, medicine, and governance, their Nobel-winning theories illuminate the invisible biases that guide our every move. With a narrative as captivating as it is enlightening, this book chronicles not just their scientific achievements but also the profound friendship that fueled a seismic shift in how we perceive reality. Perfect for readers eager to unravel the enigmas of human behavior, Lewis crafts a tale of intellectual courage and camaraderie that resonates far beyond the pages.

Introduction

Picture this: You're a basketball scout watching a promising young player. He's got all the right moves, the perfect build, and stellar college stats. Every instinct screams "future NBA star." Yet somehow, your gut feeling proves spectacularly wrong. Or imagine you're a doctor seeing a patient with classic symptoms that perfectly match a rare condition you just read about. The diagnosis seems obvious, but you're missing something far more common and dangerous. These aren't isolated mistakes—they're windows into the systematic errors that shape how we all think, judge, and decide. This fascinating journey into the human mind reveals why our brains, despite their remarkable capabilities, consistently fool us in predictable ways. Through the groundbreaking work of two Israeli psychologists who revolutionized our understanding of judgment and decision-making, we discover that the very mental shortcuts that help us navigate daily life also lead us astray in crucial moments. You'll learn to recognize the hidden biases that influence everything from medical diagnoses to investment choices, understand why experts often perform worse than simple algorithms, and gain practical tools to make better decisions when the stakes matter most.

When Basketball Scouts and Algorithms Clash

Daryl Morey stood in the Houston Rockets' interview room, watching a seven-foot-two-inch giant from India struggle to answer basic questions about basketball. Satnam Singh had been discovered barefoot in a Punjab village at age fourteen, brought to America to learn the game, and now sat before NBA executives who had to decide whether to invest millions in his potential. The young man's hands were the biggest anyone had measured, his feet size 22, and he claimed he was still growing. When asked about his strengths, Singh mentioned his post-up game and mid-range shooting. When asked who he resembled in the NBA, he confidently replied "Jowman and Shkinoonee"—his pronunciation of Yao Ming and Shaquille O'Neal. Morey, the Rockets' general manager, had revolutionized basketball by replacing traditional scouting with statistical analysis. He'd built algorithms that could predict player success better than veteran scouts who'd watched basketball for decades. Yet here he was, faced with a player who defied all data—no game footage, no college statistics, no measurable track record. Singh was like a puzzle with missing pieces. The Rockets would pass on him, and Dallas would take him in the second round, leaving everyone to wonder what they'd missed or avoided. This scene captures a fundamental truth about human judgment: we're constantly making decisions with incomplete information, yet we feel compelled to act with confidence. Morey had learned that even his sophisticated models couldn't eliminate the need for human judgment—they could only make it more reliable. The key insight wasn't that data was perfect, but that human intuition, left unchecked, was systematically flawed. Basketball scouts consistently overvalued players who looked the part while missing hidden gems who didn't fit their mental prototype of success. Whether you're hiring employees, choosing investments, or making any significant decision, your brain is running pattern-matching software that worked well for our ancestors but often misfires in modern contexts. The solution isn't to abandon intuition entirely, but to understand its limitations and build systems that compensate for our predictable blind spots.

Two Minds That Changed How We Think

Danny Kahneman arrived at his Hebrew University seminar in spring 1969 expecting another routine academic discussion. Instead, his guest speaker Amos Tversky presented research suggesting that people were "conservative Bayesians"—that when faced with uncertainty, humans behaved roughly like intuitive statisticians, making mostly rational judgments with only minor errors. Danny listened with growing incredulity. The experiment Amos described involved people drawing colored chips from bags and updating their probability estimates with each draw. The researchers concluded that people were pretty good at this statistical reasoning, just a bit too cautious in updating their beliefs. Danny couldn't contain himself. The whole premise struck him as absurd. He'd taught statistics at Hebrew University and knew firsthand that people—even smart people—were terrible intuitive statisticians. They drew sweeping conclusions from tiny samples, ignored base rates, and consistently misjudged probabilities. After the seminar, Danny "pushed Amos into the wall," as he later put it, challenging every assumption behind the research. Most remarkably, Amos—known for never losing an argument—didn't fight back. Instead, something shifted in his thinking. That confrontation sparked one of the most productive intellectual partnerships in modern psychology. Danny, the perpetually doubtful Holocaust survivor, brought deep skepticism about human judgment. Amos, the confident Israeli war hero, contributed mathematical rigor and fearless theorizing. Together, they began systematically documenting the ways human judgment goes wrong. Their collaboration was so complete that they flipped coins to determine whose name appeared first on their papers, and colleagues often couldn't tell who had contributed which ideas. What made their partnership revolutionary wasn't just their individual brilliance, but their complementary weaknesses. Danny's insecurity made him constantly question his own thinking, while Amos's confidence allowed him to pursue bold theories. Their friendship teaches us that the best thinking often emerges from productive disagreement between people who trust each other enough to challenge fundamental assumptions. Innovation requires both the courage to question established wisdom and the humility to admit when you're wrong.

The Mental Shortcuts That Fool Us All

In their makeshift laboratory, Danny and Amos crafted deceptively simple questions that revealed profound truths about human thinking. They asked Israeli high school students to estimate birth orders in families: Which was more likely in a family with six children—GBGBBG or BBBGGG? Nearly everyone chose the first sequence, even though both are equally probable. The reason? The mixed sequence looked more "random" than the clustered one, matching our mental stereotype of how chance should appear. Another experiment involved "Tom W.," a fictional graduate student described as highly intelligent but lacking creativity, with a need for order and little sympathy for others. When asked to predict Tom's field of study, people overwhelmingly guessed computer science, completely ignoring the base rate—that only seven percent of graduate students were in computer science at the time. They were so swayed by Tom's resemblance to their stereotype of a computer scientist that they disregarded statistical reality. These weren't isolated errors but symptoms of systematic mental shortcuts—heuristics—that our brains use to navigate uncertainty. The "representativeness heuristic" leads us to judge probability by similarity to mental prototypes. The "availability heuristic" makes us overweight easily recalled examples. After seeing news coverage of airplane crashes, we overestimate flying risks. After watching a medical drama, we worry about rare diseases. Our judgments are hijacked by whatever comes most readily to mind. Danny and Amos discovered that these mental shortcuts aren't random glitches—they're predictable features of human cognition. We consistently mistake small samples for representative ones, ignore base rates when given vivid details, and let recent experiences distort our probability assessments. Before making important decisions, pause and ask: Am I being swayed by a compelling story rather than statistical reality? Am I letting vivid examples override base rates? These simple questions can dramatically improve your decision-making.
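The arithmetic behind both examples is worth making explicit. Here is a minimal Python sketch, not anything from the book itself: any specific birth-order sequence of six children has probability (1/2)^6, and a Bayes' rule calculation shows how strongly the seven percent base rate should pull against the computer-scientist stereotype. Only that base rate comes from the text; the likelihood figures below are hypothetical and chosen purely for illustration.

```python
# Minimal sketch of the two probability points above.
# The Bayes numbers are hypothetical illustrations of base-rate neglect,
# not figures from Kahneman and Tversky's experiments.

# 1. Any *specific* birth-order sequence of six children is equally likely.
p_sequence = 0.5 ** 6
print(f"P(GBGBBG) = P(BBBGGG) = {p_sequence:.4f}")   # 0.0156 for both

# 2. Tom W.: how much the 7% base rate should weigh against the stereotype.
base_rate_cs = 0.07            # share of graduate students in computer science
p_profile_given_cs = 0.80      # hypothetical: the profile fits a CS student well
p_profile_given_other = 0.30   # hypothetical: the profile still fits many others

# Bayes' rule: P(CS | profile)
numerator = p_profile_given_cs * base_rate_cs
denominator = numerator + p_profile_given_other * (1 - base_rate_cs)
posterior = numerator / denominator
print(f"P(CS | profile) = {posterior:.2f}")  # about 0.17, far from a sure thing
```

Even granting that Tom's profile fits the computer-science stereotype far better than it fits other fields, the low base rate keeps the posterior well under one in five, which is exactly the statistical reality the respondents disregarded.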

From Hospital Errors to Better Decisions

Dr. Don Redelmeier rushed to the Sunnybrook Hospital operating room where a young car crash victim lay with multiple fractures and a dangerously irregular heartbeat. The medical team had already diagnosed the problem: the woman had a history of thyroid issues, and hyperthyroidism commonly causes heart rhythm disturbances. They were ready to treat her thyroid condition when Redelmeier asked everyone to pause. Something bothered him about their quick diagnosis. "Hyperthyroidism is a classic cause of irregular heart rhythm," he told the team, "but it's an infrequent cause of irregular heart rhythm." Despite the compelling narrative—thyroid history plus heart problems equals thyroid-caused heart problems—Redelmeier insisted they search for more statistically likely causes. That's when they discovered her collapsed lung, missed on the initial X-ray. They treated the lung, her heartbeat normalized, and subsequent tests showed her thyroid was perfectly normal. Their initial diagnosis, though logically coherent, was dead wrong. This case exemplifies how Kahneman and Tversky's insights transformed medical practice. Redelmeier, inspired by their work since reading their research as a teenager, became a specialist in medical decision-making. He recognized that doctors, like everyone else, fall prey to the representativeness heuristic—seeing a pattern that fits their expectations and stopping their search for alternatives. The availability heuristic also wreaks havoc in hospitals, where recent dramatic cases skew physicians' probability assessments. The revolution extended far beyond medicine. Wall Street traders learned to recognize how recent market moves distorted their risk perceptions. Government officials discovered why expert predictions so often failed. Sports teams used these insights to build better evaluation systems. The common thread was recognizing that expertise doesn't eliminate cognitive biases—it can actually amplify them by increasing confidence in flawed judgments. The key insight for any professional is that systematic errors in thinking aren't character flaws but features of human cognition. The solution isn't to try harder or think more—it's to build systems that account for predictable biases. Create checklists that force consideration of base rates. Seek out disconfirming evidence. Most importantly, cultivate intellectual humility about the limits of human judgment.
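Redelmeier's distinction between a "classic" cause and a frequent cause is itself a base-rate argument, and a rough sketch can show why it matters. The prevalence and likelihood numbers below are entirely hypothetical, chosen only to illustrate the shape of the reasoning rather than to model the actual case.

```python
# Hypothetical numbers only: a sketch of Redelmeier's point that a "classic"
# cause of a symptom can still be an unlikely one once base rates are counted.

causes = {
    # cause: (assumed prevalence in this kind of trauma patient,
    #         assumed probability the cause produces an irregular heartbeat)
    "hyperthyroidism": (0.01, 0.90),  # classic cause, but rare in this setting
    "collapsed lung":  (0.10, 0.60),  # less "classic", far more common after a crash
    "other":           (0.89, 0.05),
}

# Unnormalized posterior weight for each cause: prevalence * likelihood.
weights = {cause: prev * lik for cause, (prev, lik) in causes.items()}
total = sum(weights.values())

for cause, w in sorted(weights.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{cause:>16}: {w / total:.2f}")
# The collapsed lung dominates despite hyperthyroidism's textbook association.
```

Under these made-up numbers the less dramatic explanation carries most of the posterior weight, which is the pattern Redelmeier's insistence on pausing was designed to catch.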

Summary

The human mind is not a rational calculator but a pattern-seeking machine that relies on mental shortcuts, leading to systematic and predictable errors in judgment and decision-making. Start questioning your first instincts, especially when they feel obviously right—that confidence often signals the presence of bias rather than accuracy. Build decision-making processes that force you to consider base rates, seek alternative explanations, and actively look for evidence that contradicts your initial impressions. Whether you're diagnosing patients, evaluating job candidates, or making investment choices, remember that the most compelling stories are often the most misleading ones. The goal isn't to eliminate human judgment but to understand its limitations and create systems that help you think more clearly when it matters most.
