Glossary of Cognitive Biases

This glossary explores the hidden mental shortcuts that shape our decisions, beliefs, and judgments—often without us realizing it. These biases aren't signs of irrationality; they’re built-in features of the brain’s attempt to make sense of a complex world. But left unexamined, they can distort how we see ourselves, others, and the truth.

Each entry is designed to make these patterns understandable—so you can spot them not just in theory, but in everyday life.

Clear. Grounded. And fully human.

_________________________________________________________

Actor-Observer Bias

The tendency to attribute our own actions to external circumstances while attributing others’ actions to their character. If we snap at someone, it’s because we had a hard day. If they snap, it’s because they’re rude. This bias reflects a gap in perspective-taking and reinforces distorted social narratives.
See also: Attribution Theory, Perspective Bias

Anchoring Bias

A cognitive shortcut where we rely too heavily on the first piece of information we encounter—called the “anchor”—when making decisions. Whether it's the opening price in a negotiation or the first impression in a conversation, that initial detail skews our judgment more than we realize.
See also: Heuristics, Decision-Making Biases

Availability Heuristic

A mental shortcut where we judge how likely something is based on how easily examples come to mind. For example, we might believe plane crashes are more common than they are if we’ve seen recent news coverage, even though statistically they’re rare.
See also: Memory Bias, Probability Distortion

Bandwagon Effect

The tendency to adopt beliefs or behaviors because others are doing the same. Often unconscious, it can create the illusion of consensus or truth simply because an idea feels socially validated. It’s a powerful force in trends, politics, and misinformation.
See also: Groupthink, Conformity Bias

Belief Bias

The tendency to accept arguments or conclusions based on how believable they seem rather than whether they’re logically valid. If the conclusion aligns with our worldview, we’re more likely to overlook flaws in reasoning.
See also: Reasoning Errors, Logic vs. Belief

Ben Franklin Effect

A psychological phenomenon where we grow to like someone more after doing them a favor—even if we didn’t initially feel positively toward them. The brain reconciles the effort we made by shifting our attitude to justify the behavior.
See also: Cognitive Dissonance, Reciprocity Bias

Bystander Effect

A social bias where individuals are less likely to help someone in need when others are present. Each person assumes someone else will intervene, diluting personal responsibility. Ironically, the more people there are, the less likely anyone is to act.
See also: Social Influence, Diffusion of Responsibility

Choice-Supportive Bias

Once we’ve made a decision, we tend to remember it as better than it was. We emphasize the positives and downplay the negatives of our chosen option—whether it’s a relationship, a purchase, or a life path. This bias helps preserve self-consistency, but can cloud critical reflection.
See also: Post-Decision Bias, Memory Distortion

Confirmation Bias

The tendency to seek, interpret, and remember information in a way that confirms what we already believe. It’s one of the most well-known and widespread biases—and plays a key role in political polarization, echo chambers, and resistance to new evidence.
See also: Selective Attention, Motivated Reasoning

Conjunction Fallacy

A logical error where we assume that a specific combination of conditions is more probable than a single, more general one. For example, thinking a woman described as kind and quiet is more likely to be a “librarian and feminist” than just a “librarian,” even though the conjunction of two conditions can never be more probable than either condition alone.
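The underlying rule is simple: for any two conditions A and B, P(A and B) ≤ P(A). A minimal sketch makes that concrete—all probabilities below are invented purely for illustration:

```python
# Why the conjunction fallacy is a fallacy: P(A and B) can never
# exceed P(A). All probabilities here are invented for illustration.

p_librarian = 0.02                   # P(librarian) — hypothetical
p_feminist_given_librarian = 0.5     # P(feminist | librarian) — hypothetical
p_both = p_librarian * p_feminist_given_librarian

assert p_both <= p_librarian         # holds for any values in [0, 1]
print(f"P(librarian) = {p_librarian}, P(librarian and feminist) = {p_both}")
```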
See also: Probability Bias, Stereotype Traps

Curse of Knowledge

Once we know something, we have trouble imagining what it’s like not to know it. This makes it hard to communicate clearly, especially in teaching or leadership, because we assume others have the same background information or understanding that we do.
See also: Empathy Gaps, Communication Bias

Dunning-Kruger Effect

A cognitive bias in which people with low ability or knowledge in a domain overestimate their competence. Ironically, the skills needed to assess one’s own performance are often the same skills required to perform well—leading to inflated self-confidence among the least competent.
See also: Metacognition, Self-Awareness Gaps

Empathy Gap

The difficulty people have in understanding or predicting emotional states that are different from their current one. For example, someone calm may underestimate how much anxiety will influence their decision in a future stressful situation.
See also: Emotional Forecasting, State-Dependent Thinking

Endowment Effect

We tend to overvalue things simply because we own them. Whether it’s a coffee mug or a personal belief, ownership makes us more attached—and less likely to part with it, even if we wouldn’t pay that price to acquire it.
See also: Loss Aversion, Ownership Bias

False Consensus Effect

We overestimate how much others share our beliefs, values, or behaviors. This gives us a sense of being “normal” and reinforces our existing views—making us less likely to question ourselves or explore alternative perspectives.
See also: Social Norming, Belief Projection

Focusing Effect

When making judgments, we give too much weight to one detail and ignore others. For example, people might assume moving to a sunny state will make them happier, ignoring factors like cost of living, relationships, or job satisfaction.
See also: Salience Bias, Misplaced Priorities

Forer Effect

Also known as the Barnum Effect, this bias leads people to accept vague, general statements as highly personal and accurate—especially when they think the information is tailored to them. It’s why horoscopes and personality quizzes often feel “spot on.”
See also: Personal Validation, Suggestibility

Framing Effect

How information is presented—positively or negatively—affects decision-making. Saying a treatment has a 90% success rate feels more reassuring than saying it has a 10% failure rate, even though both statements describe exactly the same outcome.
See also: Message Framing, Persuasion Bias

Fundamental Attribution Error

The tendency to explain others’ behavior by their personality rather than their situation. If someone cuts us off in traffic, we assume they’re inconsiderate, not that they’re late for an emergency. This bias often fuels conflict and misjudgment.
See also: Actor-Observer Bias, Judgment Errors

Gambler’s Fallacy

The mistaken belief that if something happens more frequently than normal during a given period, it will happen less frequently in the future—or vice versa. For example, thinking a coin is “due” for tails after a streak of heads, despite each flip being independent.
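Because flips are independent, a streak of heads tells us nothing about the next flip. A quick simulation shows this (the streak length of three, the seed, and the sample size are arbitrary choices):

```python
import random

# Simulate a fair coin and check the outcome after three heads in a row.
random.seed(0)
flips = [random.random() < 0.5 for _ in range(100_000)]  # True = heads

tails_after_streak = streaks = 0
for i in range(3, len(flips)):
    if flips[i - 3] and flips[i - 2] and flips[i - 1]:   # three heads in a row
        streaks += 1
        tails_after_streak += not flips[i]               # count a following tail

print(tails_after_streak / streaks)  # ≈ 0.5 — the coin is never "due"
```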
See also: Probability Misjudgment, Illusion of Patterns

Halo Effect

A cognitive bias where our impression of someone in one domain (e.g., attractiveness or confidence) positively influences how we feel about them in unrelated areas (e.g., intelligence or competence). It explains why charismatic people are often overestimated.
See also: First Impressions, Social Judgments

Hindsight Bias

After an event occurs, we tend to see it as having been predictable all along. This “I knew it” effect erases the uncertainty of the past and makes us overconfident in our ability to forecast future events.
See also: Memory Distortion, Outcome Bias

Hot-Hand Fallacy

The belief that a person who has experienced success with a random event has a greater chance of continued success. Common in sports, where a player on a “streak” is seen as having the hot hand—even when their performance may just be chance.
See also: Pattern Illusion, Streak Belief

Hyperbolic Discounting

The tendency to prefer smaller, immediate rewards over larger, delayed ones. It’s a major factor in procrastination, impulsive spending, and health decisions—where the brain values short-term gratification more than long-term gain.
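The standard formalization (not given in this entry, so treat the numbers as illustrative) discounts a reward of amount A delayed by D as V = A / (1 + kD), where k is a fitted discount rate. The hallmark of the hyperbolic form is preference reversal: a larger-later reward wins when both options are far away, but loses once the smaller reward becomes imminent:

```python
# Hyperbolic discounting sketch: V = A / (1 + k * D).
# k = 0.1 per day is an illustrative value, not from the glossary.

def present_value(amount, delay_days, k=0.1):
    """Subjective value of a reward received after delay_days."""
    return amount / (1 + k * delay_days)

# Far away, the larger-later reward looks better...
print(present_value(100, 30), present_value(150, 40))   # 25.0 vs 30.0
# ...but near at hand, the smaller-sooner reward overtakes it.
print(present_value(100, 1), present_value(150, 11))    # ~90.9 vs ~71.4
```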
See also: Delay Discounting, Self-Control Bias

Illusion of Control

We overestimate our influence over events, especially those governed by chance. Whether it’s rolling dice harder for a higher number or believing positive thinking will change external outcomes, this bias inflates our sense of agency.
See also: Magical Thinking, Agency Distortion

Illusory Superiority

The belief that we are above average in skills, intelligence, or ethics—even though, by definition, most people cannot all be above average. This bias helps preserve self-esteem but can blind us to opportunities for personal growth.
See also: Self-Enhancement, Unrealistic Optimism

In-Group Bias

The tendency to favor those who belong to our social group—whether defined by race, religion, profession, or ideology. This bias can show up in hiring, friendships, or even everyday trust, and often leads to unfair or exclusionary outcomes.
See also: Tribalism, Social Favoritism

Just-World Hypothesis

The belief that people get what they deserve and deserve what they get. While comforting, this bias can lead to victim-blaming and a lack of empathy, particularly in response to injustice or suffering.
See also: Moral Rationalization, Empathy Barriers

Mere Exposure Effect

We tend to develop a preference for things simply because we’ve been exposed to them repeatedly. Whether it’s a song, a face, or a product, familiarity can breed comfort—even if we didn’t like it at first.
See also: Repetition Bias, Familiarity Principle

Naïve Realism

The belief that we see the world objectively—and that those who disagree are uninformed, irrational, or biased. This mindset fuels polarization and makes it difficult to engage in open, respectful dialogue.
See also: Intellectual Humility, Perspective Bias

Negativity Bias

We tend to give more psychological weight to negative experiences than positive ones. One insult sticks longer than five compliments. This bias evolved for survival—recognizing danger was more urgent than appreciating beauty—but it often distorts modern perception.
See also: Emotional Weighting, Threat Sensitivity

Normalcy Bias

The assumption that because something hasn’t happened before, it won’t happen in the future. It makes people underestimate the possibility of disaster or crisis—even when warning signs are clear. This bias can delay preparedness and decision-making in emergencies.
See also: Denial, Risk Underestimation

Not Invented Here (NIH) Bias

A tendency to reject ideas, products, or solutions simply because they come from outside one’s own group or organization. It reflects territorial thinking and stifles innovation by prioritizing origin over merit.
See also: Group Loyalty, Intellectual Arrogance

Observer Expectancy Effect

When a researcher's expectations unconsciously influence participants' behavior or interpretation of results. This bias can affect outcomes in experiments, interviews, or even classroom settings—and is a core reason for blind and double-blind research design.
See also: Experimenter Bias, Self-Fulfilling Prophecy

Optimism Bias

We believe we’re less likely than others to experience negative events and more likely to experience positive ones. It’s a hopeful lens that can build confidence—but may also lead to poor risk assessment or lack of preparedness.
See also: Unrealistic Optimism, Risk Perception

Ostrich Effect

The tendency to avoid negative information by metaphorically "burying our heads in the sand." People may ignore bank statements, health results, or troubling news to protect their emotional state—even when awareness would lead to better outcomes.
See also: Avoidance, Information Aversion

Outgroup Homogeneity Bias

The perception that members of a group we don't belong to are all alike, while those in our group are diverse individuals. This bias reinforces stereotypes and undermines cross-group empathy.
See also: Stereotyping, Social Categorization

Overconfidence Effect

A common bias where people’s subjective confidence in their knowledge or judgments is greater than their actual accuracy. It affects everything from financial forecasting to eyewitness testimony, and it’s especially dangerous in high-stakes decisions.
See also: Confidence vs. Competence, Prediction Error

Pessimism Bias

The opposite of optimism bias—this is the tendency to overestimate the likelihood of negative outcomes. It’s often linked to anxiety or past trauma and can lead to avoidant behavior or unnecessary caution.
See also: Catastrophizing, Risk Amplification

Planning Fallacy

We consistently underestimate how long tasks will take, even when we’ve done similar tasks before. This bias reflects our failure to account for complications, distractions, or how long things usually take “in real life.”
See also: Time Estimation, Optimism Bias

Projection Bias

We assume that others share our beliefs, preferences, or emotional states. This makes it harder to understand disagreement or accept differences—and often leads to poor communication and false consensus.
See also: Ego-Centric Bias, Empathy Gaps

Reactance

A defensive reaction to perceived restrictions on freedom or autonomy. When people feel pressured to change their behavior, they often double down on their current stance—even if the change would benefit them.
See also: Autonomy Resistance, Motivated Rebellion

Recency Bias

We give greater importance to the most recent information we've encountered—even if earlier information was more relevant or accurate. This bias shows up in evaluations, memory, and decision-making.
See also: Temporal Bias, Serial Position Effect

Representativeness Heuristic

A mental shortcut where we judge how likely something is based on how much it resembles a typical case—even if the statistics say otherwise. This can lead to errors in logic, such as assuming someone who likes poetry must be a librarian.
See also: Base Rate Neglect, Stereotype Thinking

Salience Bias

We focus on things that stand out, not necessarily things that are important. Bright colors, loud voices, or emotionally charged stories grab attention—and often overshadow subtler, more meaningful information.
See also: Attention Bias, Distraction by Novelty

Scarcity Effect

When something becomes rare or limited, we value it more—even if its actual worth hasn't changed. Marketers use this bias to create urgency with phrases like “only 3 left!” or “limited-time offer.”
See also: Perceived Value, Loss Aversion

Self-Fulfilling Prophecy

A belief or expectation that causes itself to become true through its influence on behavior. If you expect someone to dislike you, you may act distant—leading them to actually withdraw.
See also: Expectancy Effects, Behavioral Confirmation

Self-Serving Bias

We attribute our successes to our abilities and our failures to external factors. This bias protects self-esteem but can prevent honest self-reflection and learning from mistakes.
See also: Attribution Errors, Ego Protection

Semmelweis Reflex

The tendency to reject new evidence or knowledge because it contradicts established norms or beliefs. Named after Ignaz Semmelweis, a physician whose discovery about handwashing was dismissed by his peers.
See also: Status Quo Bias, Resistance to Change

Serial Position Effect

We’re more likely to remember the first and last items in a series than the middle ones. This bias influences everything from job interviews to how we recall parts of a conversation or lecture.
See also: Memory Bias, Primacy and Recency

Shared Information Bias

Groups tend to focus discussion on information everyone already knows, rather than introducing new or unique ideas. This reduces creativity and reinforces echo chambers.
See also: Groupthink, Redundancy Bias

Social Comparison Bias

We tend to evaluate ourselves in relation to others, often feeling threatened by peers with similar skills or status. This can lead to jealousy, devaluation, or sabotaging behavior in competitive settings.
See also: Envy, Peer Benchmarking

Spotlight Effect

We overestimate how much others notice us, our mistakes, or our appearance. The truth is: most people are too focused on themselves to pay as much attention as we think they do.
See also: Social Anxiety, Self-Consciousness

Status Quo Bias

We prefer things to stay the same, even when change would be beneficial. This conservatism in decision-making makes people resist upgrades, new habits, or alternative paths—even when the current situation isn't ideal.
See also: Change Aversion, Inertia Bias

Sunk Cost Fallacy

We continue investing time, money, or energy into something because of what we've already invested—even when it’s no longer serving us. It's the psychological version of “throwing good money after bad.”
See also: Escalation of Commitment, Exit Difficulty

Survivorship Bias

We focus on the people or things that made it through a selection process—and forget those that didn’t. This creates distorted conclusions, like overvaluing the traits of successful businesses without looking at the failures.
See also: Outcome Bias, Missing Data Fallacy

System Justification

A psychological tendency to defend and rationalize the status quo, even when it may be unjust or disadvantageous. This bias helps maintain social order but can also reinforce inequality and reduce motivation to seek change.
See also: Ideological Bias, Resistance to Reform

Third-Person Effect

We believe others are more susceptible to media influence or propaganda than we are. This creates a false sense of immunity and can undermine media literacy or public awareness.
See also: Media Bias, Cognitive Dissonance

Time-Saving Bias

The false belief that small increases in speed lead to large time savings. For example, believing that driving 10 mph faster will drastically cut commute time—when, at highway speeds, it shaves off only a few minutes.
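The arithmetic shows why: over a fixed distance d, the time saved is d/v1 − d/v2, which shrinks rapidly as the starting speed rises. The distances and speeds below are illustrative:

```python
# Time saved over a fixed distance when raising speed from v1 to v2.
# The 20-mile trip and the speeds are illustrative examples.

def minutes_saved(distance_miles, v1_mph, v2_mph):
    return 60 * (distance_miles / v1_mph - distance_miles / v2_mph)

# The same +10 mph saves far more time at low speeds than at high ones:
print(minutes_saved(20, 20, 30))  # 20.0 minutes saved
print(minutes_saved(20, 60, 70))  # ~2.9 minutes saved
```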
See also: Perception of Time, Efficiency Illusion

Zero-Risk Bias

We prefer eliminating a small risk entirely over significantly reducing a larger one. This leads to decisions that feel safer emotionally but may not be the most rational or impactful in practice.
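The trade-off is just arithmetic. A toy comparison (all risk figures invented for illustration) shows how the option that feels safest can deliver the smaller expected benefit:

```python
# Zero-risk bias sketch: comparing two mitigation options.
# All risk figures are invented for illustration.

option_a = (0.05, 0.00)   # small risk eliminated entirely ("zero risk")
option_b = (0.50, 0.30)   # large risk reduced but not removed

reduction_a = option_a[0] - option_a[1]   # 0.05
reduction_b = option_b[0] - option_b[1]   # 0.20 — four times the benefit

print(reduction_a, reduction_b)  # yet option A often *feels* safer
```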
See also: Risk Perception, Emotional Reasoning
