
How Minds Change
The Surprising Science of Belief, Opinion, and Persuasion
Summary
"How Minds Change (2022) is a deep dive into the science and experience of why we believe, why we persist in our beliefs, and why, sometimes, we stop believing. More than that, it’s a guide to changing minds – not through manipulation or coercion, but through empathy, deep canvassing, and an understanding of the power of groupthink and the limits of reasoning."
Introduction
Human minds possess an extraordinary capacity for transformation, yet this ability remains one of our most misunderstood psychological phenomena. While we witness dramatic shifts in public opinion on issues ranging from same-sex marriage to climate change, we simultaneously observe stubborn resistance to facts and evidence in other domains. This paradox reveals a fundamental gap in our understanding of how beliefs actually form, persist, and evolve.

The exploration ahead challenges the prevailing assumption that minds change primarily through rational argument and factual presentation. Instead, it reveals a complex interplay of neuroscience, social psychology, and evolutionary biology that governs our capacity for intellectual transformation. Through examining cases of dramatic belief revision—from conspiracy theorists abandoning their theories to hate group members embracing tolerance—we uncover the hidden mechanisms that either facilitate or obstruct mental change.

The journey traverses multiple disciplines, weaving together insights from brain imaging studies, field experiments in political persuasion, and intimate portraits of individuals who have undergone profound worldview shifts. This interdisciplinary approach illuminates not merely what causes minds to change, but why certain conditions prove essential for transformation while others create impenetrable resistance to new information.
The Neuroscience of Belief Formation and Cognitive Resistance
The human brain constructs reality through a sophisticated process of prediction and pattern recognition, creating what neuroscientists term our "umwelt"—a subjective perceptual world unique to each individual. This construction process begins with raw sensory data but quickly moves beyond mere perception to create complex models of how the world operates. These models, built through repeated experiences and reinforced by neural plasticity, become the foundation for all subsequent learning and belief formation.

When confronted with ambiguous information, the brain automatically resolves uncertainty by drawing upon these established patterns, a process that occurs entirely below the threshold of consciousness. Research into phenomena like "The Dress"—the viral image that appeared either blue and black or white and gold to different viewers—reveals how prior experiences with lighting conditions unconsciously shape perception. This demonstrates that even basic sensory experiences involve interpretation based on accumulated knowledge.

The brain's predictive processing creates what researchers call "SURFPAD": Substantial Uncertainty combined with Ramified or Forked Priors and Assumptions leads to Disagreement. Different life experiences generate different unconscious assumptions, leading people to perceive identical information in fundamentally different ways. This process explains why individuals can examine the same evidence yet reach opposing conclusions with equal confidence.

Cognitive resistance emerges when new information threatens these deeply embedded predictive models. The anterior cingulate cortex, which processes cognitive dissonance, triggers an alarm when expectations clash with reality. However, this system evolved to balance stability with adaptability, generally favoring the preservation of existing models unless overwhelming evidence demands revision. This neurological conservatism serves an important function, preventing the chaos that would result from constant belief revision, but it also creates barriers to necessary updates when circumstances genuinely change.
Deep Canvassing and the Psychology of Perspective-Taking
Traditional approaches to persuasion, particularly those relying on factual arguments and logical reasoning, consistently fail to produce lasting attitude change on controversial topics. Research into "deep canvassing"—a technique developed by LGBTQ activists in Los Angeles—reveals why facts alone prove insufficient and what actually enables rapid belief transformation. This method achieves remarkable success rates, with approximately one in ten people changing their minds on divisive issues after a single twenty-minute conversation.

The deep canvassing approach deliberately avoids factual arguments, instead focusing on eliciting personal stories and emotional experiences from the individuals being engaged. Canvassers ask people to recall times when they felt judged, excluded, or discriminated against, then guide them to consider how these experiences might relate to the lives of marginalized groups. This process activates what psychologists call "analogic perspective-taking"—the ability to understand another person's experience by drawing parallels to one's own emotional memories.

The technique succeeds because it bypasses the intellectual defenses that typically activate when people encounter challenging information. Rather than triggering the brain's threat-detection systems, personal storytelling creates a state of "elaboration"—active learning in which individuals process new ideas by connecting them to existing knowledge and experience. When people talk themselves through their own reasoning, they often discover contradictions and inconsistencies that would remain invisible if they were simply presented with external arguments.

Neuroscientific studies of deep canvassing reveal that successful attitude change involves a shift from defensive processing to open exploration. Brain scans show that when people feel their core beliefs are under attack, blood flow increases to regions associated with physical threat detection. However, when the same challenging ideas are presented through personal narrative and emotional connection, these defensive responses diminish, allowing for genuine consideration of alternative viewpoints. This neurological evidence confirms that the method works by changing not just what people think, but how they think about contentious issues.
Tribal Identity and the Social Nature of Truth
Human reasoning operates within a fundamentally social context, shaped by evolutionary pressures that prioritized group cohesion over individual accuracy. The brain's threat-detection systems respond to challenges to group-defining beliefs as if they were physical dangers, activating fight-or-flight responses when core ideological commitments face contradiction. This reaction occurs because beliefs serve not merely as tools for understanding reality, but as signals of tribal membership and trustworthiness within social groups.

Research demonstrates that any arbitrary distinction—even random assignment to groups based on estimating dots or preferring certain painters—immediately triggers in-group favoritism and out-group bias. This "minimal group paradigm" reveals that humans possess an innate tendency to form tribal identities around virtually any shared characteristic. Once these identities form, they become self-reinforcing through social feedback loops that reward conformity and punish deviation from group norms.

The social nature of truth becomes particularly evident in how people evaluate expert credibility. Studies show that individuals readily accept or reject identical scientific evidence based solely on whether it aligns with their group's ideological commitments. A climate scientist's credentials become irrelevant if their conclusions threaten tribal identity; expertise itself becomes subordinate to group loyalty. This pattern explains why factual corrections often backfire, strengthening rather than weakening false beliefs when those beliefs serve important social functions.

Breaking free from tribal constraints requires either finding alternative communities that support different values or experiencing such overwhelming disconfirmation that group membership becomes untenable. Examples from former cult members and extremist group defectors reveal that successful belief change typically involves not just intellectual conversion, but social transition from one community to another. The most effective interventions therefore focus not on changing minds directly, but on creating conditions where people feel safe to question group orthodoxy without losing essential social connections.
Argumentation as Evolved Mechanism for Collective Reasoning
Human reasoning evolved not as a tool for individual truth-seeking, but as a mechanism for collective problem-solving through argumentation. This "interactionist model" explains why people excel at finding flaws in others' arguments while remaining blind to weaknesses in their own reasoning. The apparent irrationality of confirmation bias becomes adaptive when understood as a division of cognitive labor—each individual contributes a strongly biased perspective to a group process that collectively arrives at better solutions than any member could achieve alone.

The argumentative theory resolves the paradox of human reasoning: why we are simultaneously capable of remarkable intellectual achievements and prone to obvious logical errors. When reasoning alone, individuals typically confirm their existing beliefs and generate self-serving justifications. However, when multiple perspectives interact through structured disagreement, the group's collective intelligence emerges. Each person's biased contribution becomes a valuable piece of a larger puzzle, with weaknesses in one argument exposed by strengths in another.

This system requires specific conditions to function effectively. Participants must share basic goals and maintain sufficient trust to engage in good-faith disagreement. They need exposure to genuinely different perspectives, not merely variations on shared assumptions. Most importantly, the social costs of changing one's mind must remain low enough that people feel free to update their beliefs when presented with superior arguments.

Modern communication technologies often undermine these essential conditions by creating echo chambers where like-minded individuals reinforce each other's biases without encountering meaningful opposition. Social media platforms amplify the worst aspects of human reasoning—the tendency toward confirmation bias and motivated reasoning—while eliminating the corrective mechanisms that make argumentation productive. The result is polarization rather than convergence, with groups becoming more extreme over time rather than more accurate.
Summary
The transformation of human minds emerges not from the simple presentation of facts, but from a complex interplay of neurological processes, social dynamics, and evolutionary adaptations that prioritize group cohesion alongside individual learning. Understanding this process reveals that effective persuasion requires creating conditions where people feel safe to question their existing beliefs without threatening their essential social connections and identity needs. The most profound insight may be that changing minds is ultimately less about winning arguments and more about building bridges between different communities of understanding, allowing individuals to maintain their core values while updating their specific beliefs about how the world works.

By David McRaney