The Psychology of Belief | Why We Believe Things Without Evidence

Video Transcript

Welcome to The Psychology of Us. I'm Professor RJ Starr, and in this episode, we're diving into a question that touches every aspect of our lives, whether we realize it or not: Why do we believe the things we do?

Let's start with a few questions. Have you ever found yourself absolutely convinced of something, only to later realize you had no real evidence to support it? Why do we sometimes believe things so deeply that no amount of logic, facts, or reasoning can shake our certainty? And what is it about belief—whether in ideas, institutions, or even illusions—that makes it feel so undeniably real?

These are questions that, at some point, every human being must confront. Belief isn't just about religion or spirituality. It's about how we navigate the world. It's why some people trust political leaders without question, why conspiracy theories thrive despite overwhelming evidence to the contrary, and why entire industries are built around ideas that, on closer inspection, don't always hold up to scrutiny.

But belief isn't about intelligence. It's not about education level, logic, or even access to information. In fact, some of the most brilliant minds in history have held onto ideas that were later proven completely false. Belief is about something deeper—our emotions, our identities, our need for certainty in an uncertain world.

Now, before we go any further, I want to be clear about something. This episode is not about discrediting any specific belief system, nor is it about convincing anyone of what they should or shouldn't believe. My goal isn't to tell you what's true or false. Instead, we're looking at the psychology of belief—how and why we come to believe the things we do, even when there's no direct evidence, even when we don't fully understand them, and even when they shape entire communities and cultures.

In this episode, we'll break down the psychology of belief into five parts.

Part One will explore the illusion of knowledge—why we often believe we know more than we do, and how cognitive biases shape our thinking.

Part Two will examine the role of social influence and group identity, looking at why belief is often tied to our sense of belonging, and why challenging a belief can feel like challenging an entire community.

Part Three will dive into emotional needs and existential security, uncovering how belief provides comfort, stability, and meaning, especially in times of uncertainty.

Part Four will focus on charismatic leaders and the power of persuasion—breaking down how authority figures shape belief, and why confidence is often mistaken for credibility.

Finally, Part Five will explore the evolutionary roots of belief, examining how our brains are wired to find patterns and meaning even when none exist.

I think back to a moment early in my career, when I was having a conversation with someone who was absolutely convinced that a major historical event—one universally accepted by experts and supported by overwhelming documentation—had never actually happened. They weren’t just skeptical. They knew with absolute certainty that the entire event had been fabricated. And no matter how much evidence I presented, no matter how carefully I walked through the facts, nothing changed their mind.

At first, I assumed they just didn’t have the right information. But the more we talked, the more I realized this wasn’t about evidence. It wasn’t about facts. It was about identity.

Belief isn’t just something we hold. It’s something that holds us. It defines who we are, who we trust, and how we see the world. When a belief is challenged, it can feel like we are being challenged. And that’s where things get complicated.

Psychologists have studied this phenomenon for decades. One of the most well-known theories—cognitive dissonance, a term coined by Leon Festinger in the 1950s—explains how people struggle with conflicting information. When we’re faced with evidence that contradicts something we believe, we don’t always reassess and adjust our thinking. More often than not, we double down. We rationalize. We twist facts to fit our existing worldview rather than changing the worldview itself.

And it’s not just about personal belief. Entire institutions have been built around ideas that had no factual basis but became unquestionable over time. Myths become traditions. Traditions become structures. And before we know it, something that began as a story or an assumption solidifies into an unshakable truth.

This isn’t just a modern phenomenon. It’s deeply ingrained in human history. From the spread of medieval superstitions to the rise of pseudoscientific movements, we have a long record of embracing ideas that defy logic.

But why? What makes belief so powerful? And why do we sometimes cling to ideas even in the face of overwhelming evidence to the contrary?

The answer isn’t simple, but it starts with understanding how belief forms in the first place. It starts with the illusion of knowledge—with the way our minds fill in gaps when we lack information, convincing us that we understand far more than we actually do.

Part One: The Illusion of Knowledge and Cognitive Biases

We like to think that our beliefs are based on knowledge—that we come to conclusions because we've examined the facts, weighed the evidence, and made rational decisions. But what if I told you that much of what we think we know is actually an illusion?

There's a psychological phenomenon known as the Dunning-Kruger effect, named after David Dunning and Justin Kruger, two researchers who discovered something fascinating about human confidence. Their studies, first published in 1999, found that people with the least knowledge or expertise in a subject tend to overestimate their own understanding, while those who are genuinely knowledgeable are more likely to recognize the limits of what they know. In other words, the less we know, the harder it is to see how much we don't.

I remember experiencing this firsthand as a young professor in one of my early lectures. A student confidently challenged a well-established psychological theory, dismissing decades of research with a few buzzwords they had picked up from a social media post. They weren’t trying to be difficult. They truly believed they had uncovered something that entire fields of study had somehow missed. And in their mind, my years of research and study meant nothing compared to the information they had absorbed in a matter of minutes.

This is the illusion of knowledge—the feeling that we understand something deeply when in reality our grasp of it is superficial at best. It happens because our brains are wired for efficiency. When we encounter complex ideas, we don't always process the full depth of information. Instead, we rely on mental shortcuts—heuristics—to make quick judgments. And sometimes those shortcuts lead us to believe we know more than we do.

But it's not just about personal confidence. There's another cognitive bias at play here: confirmation bias. This is the tendency to seek out and believe information that supports what we already think, while ignoring or dismissing anything that contradicts it.

Imagine two people searching online for answers about a controversial topic. One types in “why X is true.” The other types “why X is false.” Even though they're looking at the same topic, they'll likely end up in entirely different corners of the internet—each reinforcing their existing belief. That's because our brains crave consistency. We want to be right. And so we gravitate toward information that confirms what we already believe.

In a famous 1979 experiment, psychologists Charles Lord, Lee Ross, and Mark Lepper took two groups of people—one that supported capital punishment and one that opposed it. Both groups were given the same set of research studies: some that suggested capital punishment reduced crime, and others that suggested it had no effect.

You'd think that exposing both groups to the same information would bring them closer together in their conclusions. But that's not what happened. Instead, each side became even more entrenched in their original belief. They dismissed the studies that contradicted their views and embraced the ones that aligned with them.

That’s attitude polarization, a close cousin of what’s often called the backfire effect—a phenomenon where being confronted with contradictory evidence doesn’t lead us to question our beliefs but actually strengthens them. It’s one of the reasons why arguing with someone who holds a deeply entrenched belief can feel so frustrating. No matter how much logic or evidence you present, it often has the opposite of the intended effect.

I once found myself in a conversation with a relative about a well-known conspiracy theory. I assumed that if I simply walked them through the facts—showed them the holes in the story, explained why the evidence didn’t add up—they would naturally reconsider their position. But instead, the more evidence I provided, the more defensive they became. The conversation didn’t lead them to doubt their belief. It made them more certain.

That’s because belief isn’t just intellectual—it’s emotional. When we hold a belief, especially one that ties into our identity, challenging that belief feels like an attack on us. And when people feel attacked, they don’t reconsider. They defend.

This is why entire movements, organizations, and even industries have been built around ideas that lack evidence but persist nonetheless. Because belief isn’t about truth. It’s about certainty. And certainty—whether justified or not—gives us a sense of stability in a world that often feels unpredictable.

But belief doesn’t just exist in isolation. It thrives in social environments. The more people around us reinforce what we believe, the more unshakable it becomes. And that brings us to the next piece of the puzzle.

Part Two: The Role of Social Influence and Group Identity

Belief doesn't form in a vacuum. It grows, spreads, and solidifies within social environments. And if there’s one thing human beings crave as much as certainty, it’s belonging.

We are social creatures, wired for connection from the moment we’re born. Our survival depends on our ability to form relationships and integrate into the groups around us. And one of the most powerful ways we do that is through shared belief.

Psychologists Henri Tajfel and John Turner explored this idea in what’s known as social identity theory. They found that people derive a huge part of their self-worth and identity from the groups they belong to—whether it's a political party, a religion, a profession, or even something as simple as a favorite sports team. We naturally categorize ourselves and others into “us” and “them.” And once we identify with a group, we have a strong psychological incentive to defend its beliefs—even when they don’t make sense.

I saw this play out firsthand during my time in the military. There was a particular ritual, a tradition passed down for generations, that everyone participated in. It had no real functional purpose. But questioning it was unthinkable. If you even hinted at skepticism, you were met with resistance—not because people had deeply examined its meaning, but because questioning it felt like betraying the group.

That’s the power of belief within a social structure. Once an idea becomes a marker of group identity, challenging it isn’t just about questioning a fact—it’s about risking belonging.

This is why cults, extremist movements, and even more mainstream belief systems are so effective at keeping people inside. They don’t just offer an idea. They offer community. And that community provides something deeper than logic: security, validation, and a sense of purpose.

Take mass delusions, for example. Throughout history, we’ve seen entire societies buy into collective beliefs that, from an outside perspective, seem absurd. One of the most famous cases was the Salem witch trials in 1692. A wave of paranoia swept through a small Puritan community, leading to the execution of twenty people accused of witchcraft. Looking back, we can see how social reinforcement, combined with fear, allowed a completely unfounded belief to take hold. But in the moment, questioning that belief meant questioning the entire social order. It meant going against neighbors, friends, and religious authorities. And that was too great a risk for most people.

This isn’t just a historical phenomenon. We see it today—in political movements, misinformation campaigns, and social media echo chambers. When an idea is repeated enough, when enough people around us believe it, it starts to feel true—whether or not it actually is.

One of the most well-known psychological studies on this was conducted by Solomon Asch in the 1950s. He designed an experiment where participants were shown a simple visual test: three lines of different lengths. They were asked to say which line matched a reference line. The task was easy—the correct answer was obvious. But there was a catch. The participants were placed in a group where the majority, who were secretly part of the experiment, intentionally gave the wrong answer.

And what happened? Roughly three-quarters of the real participants, who could plainly see the correct answer, conformed to the group’s incorrect response at least once—not because they were unsure, but because going against the group felt wrong.

That’s how powerful social influence is. It can make us doubt our own eyes.

Now apply that to belief systems. If everyone around you accepts an idea as truth—whether it’s a political ideology, a religious doctrine, or even a conspiracy theory—it becomes incredibly difficult to question it. Because questioning it means standing alone. And standing alone is one of the hardest things for a human being to do.

I remember a conversation I had with someone who had spent years in an insular, high-control religious group. They told me that when they finally began to have doubts, they weren’t just afraid of being wrong. They were afraid of losing their entire social world—their friends, their family, their sense of belonging. That fear kept them inside the belief system long after they had stopped believing in it.

That’s why people don’t just believe things—they defend them. Even when the facts don’t align. Even when doubt creeps in. The social cost of leaving a belief system can be too high to pay.

And there’s another layer to this—the emotional comfort that belief provides. Because at its core, belief isn’t just about knowledge. It’s about security. About feeling safe in an uncertain world.

Part Three: Emotional Needs and Existential Security

Belief isn’t just about facts, logic, or social belonging. It’s about something deeper—something woven into the very fabric of our psychology. At its core, belief provides security. It gives us a sense of stability in a world that often feels chaotic and unpredictable. And for many people, belief isn’t just about what they think is true. It’s about what needs to be true in order for life to feel meaningful.

There’s a psychological theory called terror management theory, developed by Jeff Greenberg, Sheldon Solomon, and Tom Pyszczynski, which explains how human beings cope with the knowledge of their own mortality. Unlike other animals, we’re aware that one day we are going to die. And that awareness creates a kind of existential anxiety that, if left unchecked, could be overwhelming.

So what do we do? We create meaning. We construct belief systems that provide a sense of order, purpose, and permanence. We align ourselves with religions, ideologies, and philosophies that offer a framework for understanding life and death.

I remember a conversation I had years ago with someone who had lost a loved one unexpectedly. They told me that in the aftermath of that loss, they found themselves drawn to a particular spiritual belief—one they hadn’t really considered before. They weren’t sure if it was true, but they needed it to be true. Because without it, the grief felt unbearable.

This is something that many people experience. Belief isn’t always about intellectual certainty. It’s often about emotional survival.

This is also why we see so many people drawn to conspiracy theories during times of crisis. When the world feels uncertain, when institutions fail, when tragedies happen—our minds search for explanations that make the chaos feel more controllable. A random, unpredictable world is terrifying. But a world where someone—even a shadowy organization or hidden elite—is pulling the strings? That feels more stable. It gives people a clear enemy, a narrative to hold on to, and a sense of power in an otherwise powerless situation.

And sometimes, belief doesn’t just provide emotional security—it provides identity. There’s a reason why so many people define themselves by their beliefs. It’s not just what they think. It’s who they are. And that’s why challenging a belief can feel so deeply personal. It’s not just an intellectual disagreement. It’s an attack on the self.

I’ve seen this play out countless times in discussions about politics, science, even history. When someone’s belief is questioned, their immediate response is often emotional—not rational. They feel defensive, even angry, because whether or not they realize it, that belief is tied to their sense of self.

One study on this phenomenon, conducted by Jonas Kaplan and his colleagues at USC, found that when people’s deeply held political beliefs were challenged, brain regions associated with threat and emotional pain—including the amygdala and the insular cortex—showed heightened activity. That means that being confronted with evidence that contradicts a core belief isn’t just uncomfortable—it hurts. It’s as if the brain is responding to a threat.

I experienced this myself years ago. There was a particular historical event I had always been taught to see in a specific way. I had grown up with a certain narrative—one that was reinforced by my education and my environment. But then I came across a piece of research that directly contradicted that narrative. At first, I dismissed it. I told myself it had to be biased, that the sources weren’t credible. But the more I read, the more I realized the evidence was solid—and that the version of history I had believed for so long wasn’t quite as I had thought.

I’d like to say that I immediately accepted this new perspective. But the truth is, I resisted it. I felt that resistance in a way I didn’t expect. Because admitting I was wrong didn’t just mean adjusting a fact in my mind. It meant confronting the reality that I had built part of my worldview on something incomplete.

That’s the challenge of belief. It’s not just about what’s true. It’s about what’s comfortable. And comfort is something that belief provides in ways logic never can. Because even when the evidence is weak, even when the foundation is shaky, a belief that provides security and meaning will always feel more compelling than a fact that leaves us in uncertainty.

But belief doesn’t just arise in the mind of the individual. It’s often shaped and reinforced by external forces—by leaders, movements, and systems that capitalize on our psychological vulnerabilities.

And that’s what we need to look at next: the role of authority and persuasion in shaping what people believe and why charismatic leaders have always had the power to make the unbelievable seem undeniable.

Part Four: Charismatic Leaders and the Power of Persuasion

Belief doesn’t spread on its own. It’s shaped, reinforced, and often manipulated by those who understand how to wield its power. Throughout history, charismatic leaders—whether in politics, religion, or even business—have played a critical role in shaping what people accept as truth.

Why? Because human beings are wired to trust authority.

One of the most famous studies in psychology, the Milgram obedience experiment, demonstrated just how deeply ingrained this tendency is. In the early 1960s, psychologist Stanley Milgram set up a study where participants were instructed to administer increasingly severe electric shocks to a stranger—who was actually an actor and wasn’t being shocked at all. Despite hearing screams of pain and pleas to stop, roughly two-thirds of participants continued all the way to the maximum voltage, simply because they were told to do so by an authority figure in a lab coat.

What this study revealed wasn’t just about obedience. It was about the psychological weight of authority. When someone appears confident, knowledgeable, and in control, we are predisposed to believe them—even when what they say contradicts logic, morality, or our own instincts.

I once attended a seminar given by a so-called expert in a particular field. He spoke with absolute certainty, throwing out statistics and referencing studies—none of which were cited, none of which I could verify. But the audience was mesmerized. It wasn’t the content that convinced them. It was the delivery. The confidence. The certainty. The way he framed every idea as undeniable fact.

And I realized in that moment just how easy it is for people to mistake confidence for credibility.

This is something we see time and time again with cult leaders, political demagogues, and even self-help gurus. They don’t just sell ideas. They sell certainty. And certainty is intoxicating.

Take Jim Jones, for example—the leader of the Peoples Temple, the movement that ended in the infamous Jonestown massacre of 1978. Jones didn’t start out by telling people to drink cyanide-laced Flavor Aid (the drink popularly misremembered as Kool-Aid). He built his following by creating a sense of community, by offering people a vision of a better world, by slowly but surely shifting their reality until the unthinkable seemed rational.

This is what persuasive leaders do. They don’t demand blind obedience from day one. They guide people, step by step, deeper into a belief system until questioning it no longer feels like an option.

Psychologist Robert Cialdini has studied the mechanisms of persuasion extensively, identifying several principles that explain why people are so susceptible to influence. One of the most powerful is social proof—the idea that if enough people believe something, it must be true. We see this in marketing, in politics, in religious movements. If a belief system is widely accepted, people assume there must be a reason for it.

There’s also commitment and consistency—the psychological tendency to double down on something once we’ve publicly committed to it. This is why cults and extremist groups often ask for small acts of devotion before escalating to more extreme demands. Once someone has invested their time, money, or reputation into a belief system, turning back becomes incredibly difficult.

I once spoke with someone who had been deeply involved in a multi-level marketing scheme. They had spent years recruiting others, investing their savings, promoting the company’s ideology—even after realizing that the entire structure was flawed. They couldn’t walk away. Admitting they had been deceived would mean acknowledging that they had spent years of their life and thousands of dollars on something that wasn’t real. And for many people, that’s harder than staying in the illusion.

That’s why belief systems—especially those reinforced by powerful, persuasive leaders—can be so difficult to escape. They don’t just shape what people think. They shape who people are.

And this isn’t limited to extreme cases. We see it in everyday life. Political leaders using emotional rhetoric to bypass logic. Influencers selling lifestyle philosophies as absolute truth. Even corporations building brand loyalty not just around products, but around identities.

Belief, once reinforced by authority and repetition, becomes incredibly difficult to question. And once an idea is deeply embedded in someone’s worldview, even evidence to the contrary often isn’t enough to dislodge it.

But why? Why do we cling so tightly to beliefs—even when they’re demonstrably false?

The answer lies not just in social influence or persuasion, but in something even deeper—something evolutionary. Because belief isn’t just a cultural phenomenon. It’s something that, on a fundamental level, our brains are designed for.

Part Five: The Evolutionary Roots of Belief

Belief isn’t just something we absorb from families, communities, or charismatic leaders. It’s something that has been wired into us over thousands of years of evolution. Human beings are pattern-seeking creatures. Our ancestors didn’t have access to modern science, so they relied on observation, intuition, and experience to make sense of the world.

If they saw dark clouds and then a storm followed, they associated the two. If one plant made them sick and another didn’t, they developed beliefs about which plants were safe and which were dangerous. This ability to see patterns—even when none existed—helped our species survive.

But this same instinct that kept our ancestors alive also made them vulnerable to false beliefs. In evolutionary psychology, this is known as hyperactive agency detection—the tendency to assume intentionality behind random events. If a rustling in the bushes might be a predator, it’s safer to assume something is there than to ignore it and be wrong. Those who erred on the side of caution—who assumed unseen forces were at work—were more likely to survive and pass on their genes.

And so, over time, our brains became hardwired to look for meaning in the meaningless, to attribute cause-and-effect relationships where none exist, to see agency where there may be none at all.

This explains why we’re so naturally drawn to belief systems. Why we see faces in clouds. Why we instinctively look for who is behind an event rather than accepting that something simply happened. Our minds are wired for stories, for explanations, for narratives that help us feel like the world makes sense.

I remember a moment from my own life when I felt this instinct in action. Years ago, I was thinking about an old friend I hadn’t spoken to in years—and suddenly my phone rang. It was them. The coincidence was so eerie that for a split second, I felt something beyond logic pulling at me. Rationally, I knew it was just randomness. Out of all the thoughts we have in a day, sometimes coincidences happen. But in that moment, it felt like something more.

That’s the power of our pattern-seeking brains. Even when we know better—even when we understand the randomness of probability—we feel meaning. And for many people, that feeling is more powerful than facts.

This is also why myths and religious narratives have existed in every human culture throughout history. They provide order in an unpredictable world. They give us moral frameworks, explanations for suffering, and a sense of connection to something larger than ourselves.

Belief, in this sense, isn’t just about individuals. It’s about culture. It’s how societies organize themselves. How traditions are passed down. How values are preserved across generations.

Even in the modern world—where we have access to more information than ever before—we still see this deep psychological need for belief playing out. People believe in political ideologies with religious fervor. They invest themselves in lifestyle philosophies as if they were sacred truths. They put their trust in brands, influencers, and movements that promise meaning, purpose, and certainty.

And when someone feels that their belief is being threatened, they don’t just resist intellectually. They resist emotionally—even physiologically. Neuroscientists have found that when people are confronted with facts that contradict deeply held beliefs, the brain’s amygdala—a region central to fear and emotional processing—becomes highly active. It’s the same kind of response that occurs when we feel physically threatened.

That’s why arguments over belief systems can become so heated. Why people defend their ideas as if they’re defending themselves. Because in a way, they are.

So when we ask why people hold on to beliefs with such certainty—why they sometimes build entire institutions around ideas with no evidence—the answer isn’t simple. It’s not just about ignorance. It’s not just about social pressure. It’s about human nature itself.

We are wired to believe. We always have been. And unless we actively work to recognize our cognitive biases—unless we develop the self-awareness to question our own assumptions—we always will be.

But that doesn’t mean we’re powerless against false beliefs. It doesn’t mean we’re doomed to be trapped by the illusions our minds create. There is a way to approach belief with greater awareness—to use our understanding of psychology to help us navigate truth, uncertainty, and the narratives that shape our lives.

Final Thoughts on the Psychology of Belief

So where does that leave us?

If belief is shaped by cognitive biases, reinforced by social identity, fueled by emotional needs, manipulated by authority figures, and even embedded in our evolutionary wiring—what can we do about it?

The answer isn’t to reject belief altogether. That would be impossible. Belief is fundamental to who we are. It’s how we make sense of the world, how we build connections, how we define meaning in our lives.

The problem isn’t that we believe. It’s that we believe without question.

If there’s one takeaway from this conversation, it’s this: self-awareness is our most powerful tool. Understanding how belief works doesn’t mean we stop believing. It means we become more conscious of why we believe the things we do. It means we can recognize when we’re falling into the trap of confirmation bias. When we’re mistaking confidence for credibility. When we’re allowing social pressure to override our own reasoning.

It also means we can have more empathy for others. If you’ve ever found yourself frustrated by someone who refuses to change their mind, no matter how much evidence you present—remember, belief isn’t just intellectual. It’s emotional. It’s psychological. And for many people, questioning their beliefs feels like questioning who they are.

So instead of debating to win, we can engage to understand. Instead of ridiculing, we can explore. Instead of treating belief as a dividing line between “us” and “them,” we can recognize that it is something universal—something we all experience in different ways, for different reasons.

There’s a quote I often think about when I reflect on this topic. It’s from the philosopher Karl Popper, who once said:

"True ignorance is not the absence of knowledge, but the refusal to acquire it."

The real challenge isn’t whether we believe or don’t believe. It’s whether we’re willing to examine our beliefs. To ask ourselves:

  • Where does this come from?

  • What purpose does it serve?

  • And most importantly—am I holding on to it because it’s true, or because it’s comfortable?

If we can ask those questions honestly—without fear—then we’re not just believing blindly. We’re thinking. We’re growing. And ultimately, we’re learning how to navigate this world with a little more awareness, a little more curiosity, and a little more understanding.

Thank you for joining me for this episode of The Psychology of Us.
I’m Professor RJ Starr, and I’ll see you next time.
