Smart People Can Believe Dangerous Things. Here's Why.
And here is how to test whether you have slipped into the same trap.
How can otherwise intelligent friends, family, or coworkers fall for absurd claims or cling to conspiracies that collapse under even basic scrutiny?
It is not ignorance, and in some ways, that makes it worse.
Once someone ties a belief to their identity, intelligence becomes a tool for defending the error rather than correcting it.
The Greeks had a word for this refusal to learn, even when the truth is right in front of you: amathia. It doesn't simply mean ignorance. It is a mix of intellectual stubbornness, moral blindness, and a kind of cultivated unteachability.
Understanding that idea helps explain why smart people can believe dangerous things and why none of us is as immune as we think.
When Intelligence Starts Working Against the Truth
Amathia is a state in which someone knows just enough to feel certain while remaining closed to anything that might threaten their identity. What we mistake for confusion, ignorance or misinformation is usually identity-protective cognition at work. When a claim collapses, the bar for what counts as proof is raised or moved. When an uncomfortable fact appears, the subject changes or the messenger is attacked. Data is dismissed as biased. Supportive information is accepted without scrutiny, contradictory evidence is picked apart, and new rationalizations are spun so the core belief never has to move.
None of this is about truth. It is about protecting identity, status, belonging, and ego. That is why facts bounce off and debate goes nowhere. Amathia is dangerous precisely because it feels like knowledge. The person believes they are reasoning when they are, in fact, defending a narrative about who they are. Once belief fuses with identity, intelligence stops asking questions and starts building defenses.
What Happens When This Mindset Gains Power
We are watching the same pattern spread through public institutions.
The Centers for Disease Control and Prevention is being hollowed out by ideologues who have sidelined its scientific staff and replaced evidence with politics. Scientists are leaving rather than rubber-stamp decisions that have nothing to do with research. Expertise is being reframed as bias. Ideological loyalty is treated as the only real qualification.
Why Calling This “Stupidity” Misses the Point
Most people caught in this pattern are not incapable of understanding the evidence. They are avoiding the discomfort of being wrong. When that avoidance is reinforced by social circles, political identity, or moral certainty, even very bright people can talk themselves into believing things that are indefensible.
In fact, there is plenty of evidence that arguing until you are blue in the face rarely moves the needle. Better to start by reducing the sense of identity threat rather than leading with a barrage of facts. That means beginning from shared values such as “we both want our kids safe,” separating the person’s worth from the specific belief, and acknowledging the emotions like betrayal, fear, and anger that are actually driving their resistance. Rather than arguing point by point, ask open questions that invite them to explain what worries them and what evidence they would trust, then tailor your responses to those specific concerns instead of delivering generic talking points.
Use trusted in-group messengers and moral language that fits their world, such as faith, care for elders, and freedom from avoidable harm, so new information feels like an extension of their identity, not an attack on it.
Frame the conversation as a joint search for accuracy, offer face-saving ways to update, for example, “a lot of thoughtful people changed their minds when they saw this data,” and recognize that sometimes the most you can do with someone deeply entrenched is model calm, evidence-guided reasoning for the moveable middle watching from the sidelines.
The Harder Truth: This Applies to Us Too
It is easy to diagnose rigidity in others. It is much harder to catch it in ourselves.
Not only can smart people believe dangerous nonsense, research shows that people with higher education and greater science literacy are more polarized along political and religious lines than those with less education.
The analysis by the National Academy of Sciences concluded that education and science literacy “may increase rather than decrease polarization” because more knowledgeable individuals are better able to interpret evidence in ways that support their group’s position and feel more confident doing so.
The issue is not whether we have blind spots. We all do. The issue is whether we have built habits that expose them before they harden.
Answer these questions honestly.
Do you feel an immediate urge to defend a belief the moment it is challenged?
Do you dismiss a source because of who said it rather than what was said?
Do you look for evidence that only confirms what you already believe?
Do you feel irritation or contempt when someone presents a counterpoint?
Can you name the last time you changed your mind about something important?
What Actually Protects Us
Nothing completely immunizes us against amathia, but certain traits and habits make us less vulnerable and can even pull us back when we drift.
The evidence points to qualities such as intellectual humility and curiosity. Studies of political and religious conflict find that people high in intellectual humility are less polarized and less hostile, more willing to admit they might be wrong, and more open to disconfirming evidence, which directly opposes the rigid certainty at the heart of amathia.
Research on scientific curiosity shows that when people are genuinely curious, they are more likely to engage with facts that challenge their group’s narrative rather than reflexively reject them.
When people feel valued, feel they share basic goals and values, and trust that their ideas will be judged on whether they are true instead of whether they prove loyalty, the desperate need to never be wrong starts to ease.
What actually protects us is not having the highest IQ, but being the kind of person, in the kind of community, where changing your mind when you see better evidence is clearly recognized as a sign of honesty and courage, not as a betrayal.