The Psychology of Ethics, Dogma and Morality: How We Decide What’s Right and Wrong
“We like to think our moral code is a conscious choice—but most of it was absorbed, shaped, and reinforced long before we started asking questions. This episode unpacks the psychology of ethics, the appeal of dogma, and what it takes to grow a flexible moral mind.”
Transcript
Welcome to The Psychology of Us, where we explore the ways our minds shape our lives and the world around us. I’m Professor RJ Starr, and today, we’re diving into a topic that sits at the very heart of human thought and behavior: the intersection of psychology, ethics, and dogma.
You ever wonder why people hold on so tightly to certain beliefs, even when presented with evidence to the contrary? Or why we sometimes feel so certain that our ethical code is the right one while others seem misguided, or even dangerous? It’s a fascinating area of psychology, one that reveals a lot about how we think, how we justify our actions, and how we navigate the complex world of morality.
This isn’t just about philosophy or abstract ideas. The way we construct our moral compass, how we decide what’s right and wrong, is deeply psychological. Our upbringing, our social circles, even the structure of our brains influence the ethical decisions we make every single day. And when dogma enters the picture, that rigid, unwavering certainty in a belief system, it can become both a source of security and a psychological trap.
So today, we’re going to explore what ethics actually are from a psychological standpoint, why we’re drawn to rigid belief systems, and how we can cultivate ethical thinking without falling into dogmatic patterns. Let’s get into it.
Part 1: What is Ethics? The Psychological Foundations of Moral Thinking.
Ethics. We hear the word all the time. People talk about ethical business practices, ethics in science, personal ethics. But what does that actually mean?
At its core, ethics is the framework we use to decide what’s right and wrong. It’s how we justify our actions to ourselves and others. And while some ethical principles are almost universal, things like fairness, harm avoidance, and reciprocity, many of our moral stances are shaped by experience, culture, and even cognitive biases.
Psychologists have studied moral development for decades, and one of the most well-known theories comes from Lawrence Kohlberg. He proposed that we progress through different stages of moral reasoning. At the most basic level, we make decisions based on rewards and punishments, what gets us praise or what gets us in trouble. But as we grow, our sense of morality becomes more abstract. We start considering social norms, legal principles, and eventually, personal ethical codes that transcend what society tells us.
What’s interesting is that not everyone reaches those higher levels of moral reasoning. Many people get stuck in the stage where right and wrong are simply what’s approved or punished by authority. And that’s where psychology and ethics get complicated, because if you’ve been conditioned to believe something is “good” or “bad” based purely on external rules, you may never critically examine those beliefs for yourself.
Take, for example, the work of Jonathan Haidt, who proposed Moral Foundations Theory. Haidt found that different cultures, and even different political ideologies, prioritize moral values differently. Some people are more focused on principles like fairness and harm avoidance, while others place a higher value on loyalty, authority, or purity. And this is where we start to see ethical disagreements not as logical debates, but as deeply ingrained psychological responses.
Let me tell you about a time I personally had to confront my own moral reasoning. When I was younger, I was put in a situation where I had to choose between loyalty and honesty. I had a close friend who had made a serious mistake, something that, if revealed, would have had major consequences. But keeping it secret felt like a violation of my own integrity. I remember agonizing over it. Do I protect my friend, or do I do what I know is right? And here’s what struck me: my decision wasn’t just about ethics in an abstract sense. It was about my identity, who I saw myself as. And that’s something we often forget.
Ethics isn’t just about rules, it’s about how we construct our very sense of self.
And that’s why ethical conflicts can feel so deeply personal. When someone challenges our moral stance, it’s not just an intellectual debate, it feels like they’re questioning who we are. This is one of the reasons ethical disagreements, especially in today’s world of social media and political tribalism, can get so heated.
But before we move on, I want you to think about this: How did you come to believe what you believe about right and wrong? Did you actively question and develop those beliefs, or did they come from somewhere else, your upbringing, your religion, your political environment? And if you had to change one of your moral positions, how would that feel?
Because next, we’re going to talk about why some beliefs become so rigid, why we cling to dogma, even when faced with evidence that suggests we might be wrong.
Stay with me. This is where it gets really interesting.
Part 2: The Comfort of Dogma, Why We Cling to Beliefs.
So, if our sense of ethics is shaped by our upbringing, culture, and psychological development, then why do some people cling so fiercely to their moral beliefs, even in the face of overwhelming evidence that challenges them?
That’s where dogma comes in.
Dogma is a belief or set of beliefs that people accept as absolutely true, without question. It’s often associated with religion, but it can be found anywhere, politics, science, even personal philosophies. It gives people a framework, a clear guide for how to navigate the world. But psychologically, dogma isn’t just about having strong convictions. It’s about certainty, and certainty is a powerful drug.
When we know something, when we feel completely sure that our beliefs are correct, we don’t have to wrestle with doubt. We don’t have to navigate the anxiety of ambiguity. And that, in itself, is a form of psychological relief.
One of the best explanations for this comes from Terror Management Theory, which was developed by social psychologists Sheldon Solomon, Jeff Greenberg, and Tom Pyszczynski. Their research suggests that much of human behavior, including our moral certainty, stems from a deep, often unconscious fear of death.
Think about it: if you accept that everything is uncertain, that right and wrong are fluid, that the world is complex and unpredictable, that’s unsettling. But if you have an unshakable belief system, a religious faith, a political ideology, even a rigid personal code, it gives you something solid to hold onto. It gives life meaning and structure.
This is why challenging someone’s deeply held beliefs can feel like a personal attack. It’s not just an intellectual disagreement; it’s a threat, a threat to their worldview, their identity, and sometimes, their very sense of existential security.
And here’s where psychology gives us an uncomfortable truth: the more we try to convince someone that they’re wrong, the more entrenched they become.
This is called the Backfire Effect, a phenomenon studied by political scientists Brendan Nyhan and Jason Reifler. They found that when people are presented with evidence that contradicts their deeply held beliefs, they often don’t just ignore it, they double down. Their existing beliefs can actually become stronger.
Let’s take a simple example. Imagine you’re having a conversation with someone who believes in a conspiracy theory, maybe they’re convinced the moon landing was faked. You come in with facts, historical footage, interviews with astronauts, scientific explanations. But instead of changing their mind, they become even more convinced that there’s a cover-up.
Why? Because their belief isn’t based on logic alone, it’s based on identity and trust. They’ve invested emotionally in their version of reality, and admitting they’re wrong would mean not only changing a belief, but also questioning their own ability to discern truth.
I’ll give you a personal example. Years ago, I had a strong opinion about a certain ethical issue. I won’t say what it was, because honestly, it’s not important. What is important is that I was convinced I was right. I had all my justifications lined up, my reasoning airtight. And then one day, I was forced to confront new information, evidence that contradicted what I had believed for years.
And I’ll be honest with you: my first instinct was not to accept it. My first instinct was to reject it. To poke holes in it. To discredit the source. It took time, months, actually, before I could admit to myself that I had been wrong.
And that experience stuck with me, because it made me realize how deeply dogma can take hold, even in people who consider themselves rational and open-minded.
Now, let’s be clear: having strong beliefs isn’t the problem. Conviction is important. But when belief turns into certainty, when we become so sure of something that we refuse to question it, that’s when we stop thinking critically. That’s when dogma replaces reason.
And this is where we have to ask ourselves a tough question: Am I willing to hold my beliefs up to scrutiny?
Because if we aren’t, then we aren’t really thinking, we’re just defending.
In the next part of this episode, we’ll look at what happens when ethics and dogma collide, when rigid belief systems override our ability to act with nuance, empathy, and genuine moral reasoning.
Stay with me. This is where we start to see the real consequences of psychological certainty.
Part 3: When Ethics and Dogma Collide, The Danger of Rigid Thinking.
So far, we’ve talked about how ethics develop and why people cling to certain moral beliefs, sometimes with unshakable certainty. But what happens when dogma overrides ethical reasoning? When a rigid belief system becomes so deeply entrenched that it blinds people to the harm they might be causing?
This is where psychology shows us something unsettling: even people who see themselves as morally good can justify deeply unethical actions if their belief system demands it.
Let’s talk about one of the most famous experiments in psychology, Stanley Milgram’s obedience study.
In the 1960s, Milgram wanted to understand why ordinary people could commit horrific acts under authoritarian rule, particularly during the Holocaust. So he designed an experiment in which participants were instructed to administer electric shocks to a “learner,” actually an actor who was not really being shocked, every time that person answered a question incorrectly.
Now, the shocks started small but increased to dangerous levels, and as the voltage went up, the person being “shocked” would cry out in pain, beg for the experiment to stop, and eventually go silent.
The participants, the people administering the shocks, were clearly uncomfortable. They hesitated. Some questioned the authority figure in the room, who was a researcher in a lab coat instructing them to continue. But despite their discomfort, a shocking 65% of participants obeyed the orders all the way to the highest voltage level, even when they believed they might be causing serious harm.
Why? Because an authoritative figure told them it was the right thing to do.
This experiment demonstrated that people are capable of overriding their own moral instincts when they believe they’re following a legitimate authority. And that’s where dogma becomes dangerous. It allows people to disengage from their own ethical reasoning, to surrender their personal responsibility to a larger system, a leader, or an ideology.
Now, Milgram’s study is just one example. Another disturbing case is the Stanford Prison Experiment, conducted by Philip Zimbardo.
In this study, college students were assigned to play the role of either prisoners or guards in a simulated prison environment. Within days, the “guards,” ordinary young men, began abusing the “prisoners,” humiliating them, stripping them of their dignity, even subjecting them to psychological torture.
And here’s the key takeaway: these weren’t sadistic individuals. They weren’t bad people. But the structure of the experiment, the power dynamic, the rigid roles, allowed them to justify behavior that, outside of that environment, they likely never would have considered acceptable.
So what do these experiments tell us?
They tell us that when people operate within a rigid system, whether it’s a political ideology, a religious doctrine, or a cultural norm, they can rationalize behavior that would normally violate their moral code. Because when something is framed as necessary or justified within a belief system, people stop questioning whether it’s actually right or wrong.
History is full of examples of this. Wars fought in the name of righteousness. Policies that dehumanize entire groups of people. Harm inflicted because “it’s the way things are done.”
But it happens in smaller, everyday ways too.
Have you ever seen someone justify cruelty because they believe the person “deserved it”? Maybe someone dismissing an opposing viewpoint, not because they engaged with the argument, but because they already labeled the person as wrong, immoral, or beneath them?
I once had a conversation with someone who believed that people who made bad decisions, financially, socially, whatever it was, deserved to suffer the consequences, no matter how severe. They saw their belief as ethical because it was based on “personal responsibility.” But when I asked, “What if it were your best friend? Your sibling? What if it were you?”, they hesitated. They realized their ethical stance wasn’t about morality at all. It was about reinforcing a worldview that made them feel safe, that made the world seem predictable.
And that’s how dogma disguises itself as ethics. It creates the illusion of moral clarity, but really, it’s just a refusal to engage with complexity.
This is why psychological flexibility is so important, why real ethics require more than just following a script.
In the next section, we’re going to talk about how we can move beyond dogmatic thinking, how to develop an ethical mindset that isn’t rigid, but thoughtful, adaptable, and genuinely moral.
Because real ethics aren’t about certainty. They’re about understanding.
Part 4: The Psychological Flexibility of Ethics, How We Grow Beyond Dogma.
So if dogma traps us in rigid thinking, how do we break free? How do we develop an ethical framework that isn’t just about following rules, but about actually understanding morality?
The answer lies in psychological flexibility, the ability to adapt our thinking in response to new information, perspectives, and experiences. And this isn’t just about being open-minded. It’s about cultivating a deeper level of self-awareness, one that allows us to challenge our own biases and assumptions.
One of the most powerful tools for this is something called intellectual humility.
Intellectual humility is the willingness to admit when we might be wrong. It doesn’t mean having no convictions, and it doesn’t mean being indecisive. It means recognizing that our knowledge is always incomplete, that no single perspective, our own included, has a monopoly on truth.
Studies have shown that people who score high in intellectual humility tend to be better at critical thinking, more open to diverse perspectives, and less likely to fall into the psychological traps of dogmatic thinking.
But let’s be honest, this is easier said than done. Because admitting we’re wrong? That’s uncomfortable. It feels like a threat to our identity.
And yet, if we never allow ourselves to question our beliefs, we stop growing.
A few years ago, I had a student who was absolutely convinced that people who held a certain political view were, without exception, unethical. They had plenty of arguments to back this up, plenty of examples of wrongdoing from that political group. And to be fair, they weren’t entirely wrong, there were valid criticisms to be made.
But what struck me was their certainty. They had completely closed themselves off to the possibility that the people on the other side might also have moral reasoning, that their beliefs might be shaped by experiences just as real and complex as their own.
So I asked them: What would it take to change your mind?
And they hesitated. Because they hadn’t considered that before. They hadn’t asked themselves what kind of evidence, what kind of experience, would actually shift their perspective.
That’s the first step in breaking free from dogma, asking ourselves: What would it take for me to reconsider this belief?
The second step is something called moral imagination, the ability to put ourselves in the shoes of people who see the world differently.
Moral imagination is what allows us to recognize that ethical dilemmas are rarely black and white. That morality isn’t just about rules, but about people, their pain, their circumstances, their experiences.
Take a look at any major social or political divide, and you’ll see a lack of moral imagination at work. People aren’t talking to each other. They’re talking at each other, reinforcing their own beliefs rather than genuinely trying to understand.
But here’s the fascinating thing, research shows that people are more likely to shift their ethical views when they feel heard and understood, not when they’re attacked or ridiculed.
A study on moral reframing found that when people were presented with arguments for a moral position that acknowledged their existing values rather than contradicting them, they were much more open to changing their perspective.
For example, if you want to persuade someone who values tradition to support a progressive cause, don’t argue from a framework of change, argue from a framework of protecting what’s sacred. If you want someone who prioritizes personal responsibility to care about social justice, frame it as empowering people to help themselves.
In other words, when we meet people where they are, rather than dismissing them outright, we create space for real ethical growth.
And the truth is, this applies to us too. We like to think we’re rational beings, but the reality is, we all have blind spots. We all have moral inconsistencies. And if we want to be truly ethical thinkers, we have to be willing to hold up a mirror to ourselves.
So here’s a challenge: Pick one of your strongest moral beliefs. Something you feel absolutely sure about. Now ask yourself:
Where did this belief come from?
Have I truly examined it from all sides?
Is there any piece of evidence, any argument, that would make me reconsider?
Because ethics, real ethics, requires more than certainty. It requires curiosity. It requires a willingness to engage with complexity, to acknowledge when we don’t have all the answers, and to choose growth over comfort.
In the final section of this episode, we’re going to bring it all together. We’ll talk about how to live ethically without dogma, how to stand firm in our values without becoming rigid in our thinking.
Part 5: Living Ethically Without Dogma, Bringing It All Together.
So, where does that leave us?
We’ve explored how our moral beliefs are shaped by psychology, why we cling to certainty, how dogma can override ethical reasoning, and how we can develop greater moral flexibility. But at the end of the day, the question remains: How do we live ethically without falling into rigid thinking?
The answer isn’t about abandoning our values. It’s not about being morally indifferent or endlessly questioning everything to the point of inaction. Living ethically without dogma means recognizing that certainty is not a prerequisite for integrity. That being open to new perspectives doesn’t weaken our moral compass, it refines it.
And that’s a difficult balance to strike.
We like clear answers. We like to believe that we stand on firm moral ground. But history has shown us that some of the greatest ethical failures happened because people were so convinced they were right that they stopped questioning themselves.
So, if we want to be truly ethical thinkers, we have to commit to a mindset of growth.
That means practicing intellectual humility, acknowledging that we don’t have all the answers and being open to learning from those we disagree with.
That means cultivating moral imagination, seeing beyond our own experiences and recognizing the complexity of other people’s lives.
And that means embracing psychological flexibility, being willing to update our beliefs when confronted with new evidence or deeper understanding.
Now, I’m not saying this is easy. It takes work. It takes self-awareness. It takes a willingness to sit with discomfort, to entertain ideas that challenge us. But if we can do that, if we can step outside of our own certainty long enough to truly engage with the world, we don’t just become better thinkers.
We become better people.
So, here’s something to reflect on as we wrap up:
What’s one belief, one moral certainty, you’ve never truly questioned? And what would it take for you to look at it from a different angle?
Because real ethics isn’t about standing still. It’s about growing.
That’s it for today’s episode of The Psychology of Us. If you found this discussion thought-provoking, share it with someone who might appreciate the challenge. And if you have thoughts, questions, or topics you’d like me to cover in a future episode, send me an email at ProfRJStarr@outlook.com, I’d love to hear from you.
Thanks for listening, and until next time, keep thinking, keep questioning, and stay curious.