By J Felix

Cognitive Distortions, Fallacies, Heuristics & Biases

Updated: Apr 25

We co-create reality. Moment by moment, the mind secretes thought after thought. Some are positive, others negative; some are beneficial, others benign; some are constructive, others destructive; some are encouraging, others discouraging. Most are borrowed and unoriginal. Richard Dawkins calls these discrete units of thought memes. Memes spread from person to person within a culture. Songs, one-liners, idioms, hand gestures, and sound bites are some examples of memes. Many become building blocks for the thoughts we take to be ours.

Researchers have identified a region of the brain called the "gestalt cortex" which appears to play a role in this co-construction (Lieberman, 2022). Parts of the brain responsible for processing vision, sound, and touch interface with a structure called the temporoparietal junction, which is part of the gestalt cortex. The temporoparietal junction helps people integrate and create meaning from the world they see.

Perceptions, memes, biases, memories, and assumptions influence our thoughts. Few of us investigate them, however. Most assume them to be true and an accurate representation of reality. Researchers call this phenomenon "naive realism." For meditators, radical responsibility, a term Fleet Maull coined for a mindful way of being, insists that we do our own work.

In computer science, the acronym GIGO stands for "garbage in, garbage out": the quality of the output is only as good as the quality of the input. Human irrationality often boils down to similar computational constraints, and those constraints yield suboptimal decisions.
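A toy sketch makes GIGO concrete (the function and data here are invented for illustration): a perfectly correct computation still produces a misleading result when fed bad data.

```python
def average_rating(ratings):
    """Compute a mean. The arithmetic is flawless; the output is only
    as trustworthy as the input."""
    return sum(ratings) / len(ratings)

# Clean input -> meaningful output
print(average_rating([4, 5, 3, 4]))   # 4.0

# Garbage input (a mistyped 50 instead of 5) -> garbage output
print(average_rating([4, 50, 3, 4]))  # 15.25
```

The function never errs, yet the second answer is nonsense. The mind works the same way: flawless inference from corrupted premises still yields a corrupted conclusion.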

Teachers, parents, clerics, professors, scholars, journalists, pundits, influencers, experts, and others influence what and how we think. A child receives these inputs on faith. As we mature, grow in experience, and learn, we have the opportunity to investigate, deconstruct, and reflect on what we've learned. In some cases, we must unlearn what we've been taught. This is our work.

A drop of rain, in and of itself, is insignificant, but when drops fall from clouds in the billions, they form streams and rivers that carve the landscape, water crops, fill aquifers, and sustain life. They can also drown livestock, flood towns, and tear homes from their foundations, washing them away. Similarly, thought streams form channels that sculpt one's perceptions and "reality." They can sustain life or threaten well-being.

Thoughts arise on and off the cushion. During meditation, you're simply more aware of the busyness of mind, the disjointed thoughts, the meandering stream. Purifying the thought stream is itself a practice, and each technique has its own instructions for handling intrusive thoughts and mental elaborations. In focused meditation, the instruction is to acknowledge the thought, cut it, and reorient attention. In open monitoring, we simply watch the flow of thoughts, like an observer standing on the bank of a river. If we are using a labeling technique, we can label thoughts simply as "thoughts," or we can be more specific: we may be storytelling, analyzing, imagining, or predicting, for example.

We can go further by recognizing cognitive distortions: habitual ways of thinking characterized by negativity and bias, often illogical and exaggerated. These thought patterns perpetuate psychopathological states such as depression and anxiety. The emotions seem involuntary, beyond our control, but it is at the level of thought that we can exercise choice. What we do is an extension of what and how we think. We cannot absolve ourselves of our work by ceding autonomy to feelings or instincts.

Recognizing disabling patterns of mind is a first step in freeing ourselves from them.

It is difficult to deconstruct and unlearn the biases, fears, or lessons we were taught.

There are many analytical techniques we can use to examine and deconstruct conditioned thought patterns; what follows is not exhaustive. We can label and identify distortions (see the list of the most common distortions and fallacies below). From there, we have choices. We can simply cut or ignore them. We can question them, demanding specificity or evidence. Byron Katie asks these questions when cognitive distortions arise: "Is it true? How do I know it's true? How do I react, what happens, when I believe the thought? Where would I be, and how would I feel, without this thought?"

Another strategy is to identify the need behind the distortion. If, for example, I fail at a project and overgeneralize ("I'm a failure") and catastrophize ("I'll never succeed"), I can go deeper into the feeling (disappointment) and the need (competence, to contribute). I may find other ways to satisfy this need (e.g. through volunteer service).

Reattribution is another strategy. If a close friend or someone dear to you berated themselves, what would you say to them? Befriend yourself and direct that empathy inward.

Semantics can also be used to dissect a cognitive distortion. "I'm a failure" and "I haven't succeeded yet" are two very different ways of seeing the same setback. The former is low-performance self-talk; the latter suggests persistence. "I'll never succeed" and "This approach didn't work; let me try another" yield different results, both in affect and attitude and in outcome (quitting vs. persisting). The language we use to frame our perceptions matters. Bertrand Russell highlighted this in the oft-cited example: "I am firm, you are obstinate, he is a pig-headed fool. I am righteously indignant, you are annoyed, he is making a fuss over nothing. I have reconsidered the matter, you have changed your mind, he has gone back on his word."

Below are lists of the most common distortions and fallacies (source: the Positive Psychology website):

1. All-or-Nothing Thinking / Polarized Thinking

Also known as “Black-and-White Thinking,” this distortion manifests as an inability or unwillingness to see shades of gray. In other words, you see things in terms of extremes – something is either fantastic or awful, you believe you are either perfect or a total failure. "You're either with us or against us," for example. One could be for a particular end, but object to the means to that end. Polarized thinking, however, cuts us off from exploring possibilities.

2. Overgeneralization

This sneaky distortion takes one instance or example and generalizes it to an overall pattern. For example, a student may receive a C on one test and conclude that she is stupid and a failure. Overgeneralizing can lead to overly negative thoughts about yourself and your environment based on only one or two experiences.

3. Mental Filter

Similar to overgeneralization, the mental filter distortion focuses on a single negative piece of information and excludes all the positive ones. An example of this distortion is one partner in a romantic relationship dwelling on a single negative comment made by the other partner and viewing the relationship as hopelessly lost, while ignoring the years of positive comments and experiences. The mental filter can foster a decidedly pessimistic view of everything around you by focusing only on the negative.

4. Disqualifying the Positive

On the flip side, the “Disqualifying the Positive” distortion acknowledges positive experiences but rejects them instead of embracing them. For example, a person who receives a positive review at work might reject the idea that they are a competent employee and attribute the positive review to political correctness, or to their boss simply not wanting to talk about their employee’s performance problems. This is an especially malignant distortion since it can facilitate the continuation of negative thought patterns even in the face of strong evidence to the contrary.

5. Jumping to Conclusions – Mind Reading

This “Jumping to Conclusions” distortion manifests as the inaccurate belief that we know what another person is thinking. Of course, it is possible to have an idea of what other people are thinking, but this distortion refers to the negative interpretations that we jump to. Seeing a stranger with an unpleasant expression and jumping to the conclusion that they are thinking something negative about you is an example of this distortion.

6. Jumping to Conclusions – Fortune Telling

A sister distortion to mind reading, fortune telling refers to the tendency to make conclusions and predictions based on little to no evidence and holding them as gospel truth. One example of fortune-telling is a young, single woman predicting that she will never find love or have a committed and happy relationship based only on the fact that she has not found it yet. There is simply no way for her to know how her life will turn out, but she sees this prediction as fact rather than one of several possible outcomes.

7. Magnification (Catastrophizing) or Minimization

Also known as the “Binocular Trick” for its stealthy skewing of your perspective, this distortion involves exaggerating or minimizing the meaning, importance, or likelihood of things. An athlete who is generally a good player but makes a mistake may magnify the importance of that mistake and believe that he is a terrible teammate, while an athlete who wins a coveted award in her sport may minimize the importance of the award and continue believing that she is only a mediocre player.

8. Emotional Reasoning

This may be one of the most surprising distortions to many readers, and it is also one of the most important to identify and address. The logic behind this distortion is not surprising to most people; rather, it is the realization that virtually all of us have bought into this distortion at one time or another. Emotional reasoning refers to the acceptance of one’s emotions as fact. It can be described as “I feel it, therefore it must be true.” Just because we feel something doesn’t mean it is true; for example, we may become jealous and think our partner has feelings for someone else, but that doesn’t make it true. Of course, we know it isn’t reasonable to take our feelings as fact, but it is a common distortion nonetheless.

9. Should Statements

Another particularly damaging distortion is the tendency to make “should” statements. Should statements are statements that you make to yourself about what you “should” do, what you “ought” to do, or what you “must” do. They can also be applied to others, imposing a set of expectations that will likely not be met. When we hang on too tightly to our “should” statements about ourselves, the result is often guilt that we cannot live up to them. When we cling to our “should” statements about others, we are generally disappointed by their failure to meet our expectations, leading to anger and resentment.

10. Labeling and Mislabeling

These tendencies are basically extreme forms of overgeneralization, in which we assign judgments of value to ourselves or to others based on one instance or experience.

For example, a student who labels herself as “an utter fool” for failing an assignment is engaging in this distortion, as is the waiter who labels a customer “a grumpy old miser” if he fails to thank the waiter for bringing his food. Mislabeling refers to the application of highly emotional, loaded, and inaccurate or unreasonable language when labeling.

11. Personalization

As the name implies, this distortion involves taking everything personally or assigning blame to yourself without any logical reason to believe you are to blame.

This distortion covers a wide range of situations, from assuming you are the reason a friend did not enjoy the girls’ night out, to the more severe examples of believing that you are the cause for every instance of moodiness or irritation in those around you.

In addition to these basic cognitive distortions, Beck and Burns have mentioned a few others (Beck, 1976; Burns, 1980):

12. Control Fallacies

A control fallacy manifests as one of two beliefs: (1) that we have no control over our lives and are helpless victims of fate, or (2) that we are in complete control of ourselves and our surroundings, giving us responsibility for the feelings of those around us. Both beliefs are damaging, and both are equally inaccurate. No one is in complete control of what happens to them, and no one has absolutely no control over their situation. Even in extreme situations where an individual seemingly has no choice in what they do or where they go, they still have a certain amount of control over how they approach their situation mentally.

13. Fallacy of Fairness

While we would all probably prefer to operate in a world that is fair, the assumption of an inherently fair world is not based on reality and can foster negative feelings when we are faced with proof of life’s unfairness. A person who judges every experience by its perceived fairness has fallen for this fallacy, and will likely feel anger, resentment, and hopelessness when they inevitably encounter a situation that is not fair.

14. Fallacy of Change

Another ‘fallacy’ distortion involves expecting others to change if we pressure or encourage them enough. This distortion is usually accompanied by a belief that our happiness and success rest on other people, leading us to believe that forcing those around us to change is the only way to get what we want. A man who thinks “If I just encourage my wife to stop doing the things that irritate me, I can be a better husband and a happier person” is exhibiting the fallacy of change.

15. Always Being Right

Perfectionists and those struggling with Imposter Syndrome will recognize this distortion – it is the belief that we must always be right. For those struggling with this distortion, the idea that we could be wrong is absolutely unacceptable, and we will fight to the metaphorical death to prove that we are right. For example, the internet commenters who spend hours arguing with each other over an opinion or political issue far beyond the point where reasonable individuals would conclude that they should “agree to disagree” are engaging in the “Always Being Right” distortion. To them, it is not simply a matter of a difference of opinion, it is an intellectual battle that must be won at all costs.

16. Heaven’s Reward Fallacy

This distortion is a popular one, and it’s easy to see myriad examples of this fallacy playing out on big and small screens across the world. The “Heaven’s Reward Fallacy” manifests as a belief that one’s struggles, one’s suffering, and one’s hard work will result in a just reward.

It is obvious why this type of thinking is a distortion – how many examples can you think of, just within the realm of your personal acquaintances, where hard work and sacrifice did not pay off? Sometimes no matter how hard we work or how much we sacrifice, we will not achieve what we hope to achieve. To think otherwise is a potentially damaging pattern of thought that can result in disappointment, frustration, anger, and even depression when the awaited reward does not materialize.

Logical fallacies also distort clear thinking. Logical fallacies corrupt thought and, in extreme cases, weaponize language. A woman tormented by cognitive distortions may suffer depression or anxiety. A man whose reasoning has been infected with logical fallacies may contribute to the suffering of others. When illogical thought patterns infect populations, we can get xenophobia, mass hysteria, or racism.

“When others see the world differently than we do, it can serve as an existential threat to our own contact with reality and often leads to anger and suspicion about the others,” UCLA professor Matthew Lieberman asserted.


  1. Ad hominem fallacies replace logical argumentation with attack language. "The ad hominem is a fallacy of relevance where someone rejects or criticizes another person's view on the basis of personal characteristics, background, physical appearance, or other features irrelevant to the argument at issue. It is more than an insult. It is an insult used as if it were an argument or evidence in support of a conclusion" ("15 Logical Fallacies You Should Know"). In politics, it's called mudslinging, e.g. "Crooked Hillary" or "tRUMP."

  2. Strawman Argument. Strawman arguments mischaracterize the opponent's arguments to make them easier to refute. Stricter gun control legislation can be mischaracterized as a plot to confiscate all guns. A lowering of corporate tax can be mischaracterized as a plot to destroy the middle class.

  3. Appeal to Ignorance argues that because something cannot be proven false, it must be true (or vice versa). Conspiracy theories are often built on this. For example: we have no evidence that X committed fraud, but only because he is so devious he must have destroyed it all. The war in Iraq was premised on an appeal to ignorance. The Bush administration argued that Saddam Hussein had weapons of mass destruction. Independent international inspectors, however, found no evidence. The absence of evidence became proof that the devious Hussein did, in fact, have them; we just hadn't found them yet.

  4. False Dilemma/False Dichotomy. This line of reasoning limits options or possibilities to two, where there may be a range of options. Example: You either support this legislation or you're against us. One can have objections to the legislation and still support the platform, position, or intent.

  5. Slippery Slope Fallacy "The slippery slope fallacy works by moving from a seemingly benign premise or starting point and working through a number of small steps to an improbable extreme." If, for example, an opposing candidate is elected, the economy will implode, the country will become a sanctuary for criminals and thugs, civil strife will erupt, and the nation will fall.

  6. Circular Reasoning. When an argument's conclusion is assumed in its premises. "The President says it's true, so it must be true because he said it was true," or "The Bible is the Word of God because it says so in the Bible."

  7. Hasty Generalizations are general statements without sufficient evidence to support them. Example: Democrats and Republicans never agree. The truth is they sometimes do. Indeed, they often agree on end goals but disagree on strategy.

  8. Red Herring Fallacies are distractions that are introduced to divert the topic. If a politician is asked a specific question about one policy, they might launch into something tangentially related without ever answering the question.

  9. Tu quoque or the "You, too" fallacy dismisses an argument by pointing out the opponent's hypocrisy. If the President is presented with evidence of lying, he can divert blame by pointing out his opponent's lies.

  10. Causal fallacies come in different varieties. There is the non causa pro causa ("non-cause for cause") fallacy, e.g. "If her last name is O'Malley, she must be Irish." Another is post hoc ergo propter hoc ("after this, therefore because of this"). This fallacy happens when you mistake something for the cause just because it came first. Many superstitions are based on post hoc fallacies. For example: I walked under a ladder; I got sick the following day; therefore, walking under the ladder caused the illness. The third type of causal fallacy is cum hoc ergo propter hoc ("with this, therefore because of this"). This fallacy happens when we mistakenly interpret two things found together as being causally related. Two things may correlate without a causal relation, they may share a third factor that causes both, or they may simply have occurred together by coincidence. Correlation doesn't prove causation.
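The cum hoc fallacy is easy to demonstrate with a small simulation (the variables and numbers below are invented for illustration): two quantities driven by a shared third factor will correlate strongly even though neither causes the other.

```python
import random

random.seed(0)

# A hidden third factor (a hot day) drives both ice-cream sales and
# drowning incidents. Neither causes the other, yet they correlate.
temps = [random.uniform(15, 35) for _ in range(200)]          # daily temperature
ice_cream = [t * 2.0 + random.gauss(0, 3) for t in temps]     # sales rise with heat
drownings = [t * 0.5 + random.gauss(0, 1) for t in temps]     # swimming rises with heat

def correlation(xs, ys):
    """Pearson correlation coefficient, computed from scratch."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

print(round(correlation(ice_cream, drownings), 2))  # strong positive correlation
```

The coefficient comes out close to 1, yet banning ice cream would prevent no drownings; the heat is doing all the work. This is the "third factor" case the fallacy warns about.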

Heuristics are the mental shortcuts we use to solve problems and make decisions. Overall, they are helpful. They may have a neural correlate. The brain makes neural calculations to make sense of the continuous stream of inputs and data coming through the sense doors: sights, sounds, smells, touch, tastes, etc (Groen et al., 2022). What we call heuristics may be, at a subtler level, neural computations.

Sometimes, though, these shortcuts can bias our judgment. In selecting one interpretation, the gestalt cortex inhibits other solutions, possibilities, perspectives, or ways of seeing. Four common heuristics are listed below (source: Verywell Mind):


The availability heuristic involves making decisions based on how easy it is to bring something to mind. When you are trying to make a decision, you might quickly remember a number of relevant examples. Since these are more readily available in your memory, you will likely judge these outcomes as being more common or frequently occurring.

For example, if you are thinking of flying and suddenly think of a number of recent airline accidents, you might feel like air travel is too dangerous and decide to travel by car instead. Because those examples of air disasters came to mind so easily, the availability heuristic leads you to think that plane crashes are more common than they really are.


The representativeness heuristic involves making a decision by comparing the present situation to the most representative mental prototype. When you are trying to decide if someone is trustworthy, you might compare aspects of the individual to other mental examples you hold. A sweet older woman might remind you of your grandmother, so you might immediately assume that she is kind, gentle, and trustworthy.


The affect heuristic involves making choices that are influenced by the emotions that an individual is experiencing at that moment. For example, research has shown that people are more likely to see decisions as having benefits and lower risks when they are in a positive mood. Negative emotions, on the other hand, lead people to focus on the potential downsides of a decision rather than the possible benefits.


The anchoring bias involves the tendency to be overly influenced by the first bit of information we hear or learn. This can make it more difficult to consider other factors and lead to poor choices. For example, anchoring bias can influence how much you are willing to pay for something, causing you to jump at the first offer without shopping around for a better deal.

Psychologists have identified hundreds of biases that distort our perceptions. They go by many names: confirmation bias, the spotlight effect, the endowment effect, the misinformation effect, etc. In the Cognitive Biases Codex, they are grouped by tendencies of mind. For example: to get things done, we tend to complete things we've invested time and energy in (e.g. the sunk cost fallacy); to avoid mistakes, we tend to preserve autonomy and group status and avoid irreversible decisions (e.g. reactance, status quo bias); we favor simple explanations over complex, ambiguous ones (e.g. belief bias, information bias); we edit and reinforce memories after the fact (e.g. misattribution, suggestibility, source confusion); we discard specifics in favor of generalities (e.g. negativity bias, implicit associations); we notice things primed in memory or repeated often (e.g. availability heuristic, attentional bias); the bizarre and funny are recalled more than the ordinary and quotidian (e.g. the von Restorff effect, the bizarreness effect); we are drawn to details that confirm our beliefs (e.g. confirmation bias, selective perception, subjective validation); we notice flaws in others (the mote in a neighbor's eye) more readily than our own (the log in our own) (e.g. bias blind spot, naive cynicism); we tend to create stories even out of sparse data (e.g. confabulation, the clustering illusion, the anecdotal fallacy); we fill in characteristics from stereotypes, generalities, and prior experiences (e.g. group attribution error, essentialism, authority bias, the bandwagon effect); we think we know what others are thinking (e.g. the spotlight effect, the illusion of transparency); we project our current mindset onto the past or future (e.g. self-consistency bias, rosy retrospection); we simplify probabilities to make them easier to digest (e.g. mental accounting, zero-sum bias); we rate people or things we're familiar with as better (e.g. the halo effect, out-group homogeneity bias); we reduce events and lists to their key elements (e.g. the primacy effect, the part-list cueing effect); we notice when something has changed (e.g. anchoring, conservatism, the Weber-Fechner law); and we favor the immediate and relatable (e.g. appeal to novelty, hyperbolic discounting).

The mind indulges in games of self-deception. For example, the Barnum effect, also called the Forer effect, occurs when people believe that personality descriptions, like those found in astrology, apply specifically to them (more so than to other people), even though the description is actually filled with statements that apply to everyone.

Our thoughts spring from the assumptions, perceptions, biases, and experiences stored in consciousness. Many form maladaptive schemas that are based on cognitive distortions or fallacies. Maladaptive schemas refer to negative thought patterns that bias our perceptions and beliefs. Maladaptive schemas increase the probability of developing mental health problems, or mood disorders like depression or anxiety (Bishop et al., 2022).

Our unexamined fallacies, biases, prejudices, assumptions, and heuristics color our perceptions. In a previous post, I shared information on the science of seeing. Photons of light are converted to electrical signals that are sent to the visual cortex: information flows from the retina through the thalamus to the visual cortex, where it gets processed in multiple stages. Ninety percent of the connections coming into the visual cortex, however, carry predictions from neurons in other parts of the cortex (Feldman Barrett, 2017). Only a fraction of what we 'see' is raw visual input. The mind is filtering the content and deciding how to respond.

What we see is filtered with predictions, evaluations, memory, emotion, and identification that are pulled from other parts of the mind. In other words, we co-create what we see. It is a representation, augmented by biases, heuristics, memes, and thoughts that go unquestioned, unchallenged, and unexamined.

A single neuron is very small—only 10 to 100 micrometers—and when it fires, its action potential (the spike in electrical activity) only lasts about two milliseconds. The brain generates electrical activity to represent 'reality.' During that process of neural representation, the brain encodes sensory information and thoughts into models.

Our moods are influenced by neural representation: changes in neural representations and brain states drive mood fluctuations over time. These representations can be deconstructed, and we can build more accurate models.

The brain relies on feedback between neurons in different parts of the visual system. Some neurons may reverse course and send information back to earlier stages for reprocessing, such that one can see things that aren't there. Examples abound: a young anorexic woman peers at her image in a mirror and sees herself as grossly obese; a policeman testifies that he saw a gun that turned out to be a wallet; we project virtues onto those we like and see those we dislike as irredeemably despicable; a psychopathological dictator is seen as a benevolent and dear leader.

By identifying common fallacies, cognitive distortions, biases, and heuristics, we can look more deeply into our conditioned ways of seeing. With awareness, we can challenge and change these perceptions. By changing our perceptions, we can reframe our experiences and live more responsibly.

Discipline your thoughts; discipline your speech. Set a guard over your thoughts and keep watch over the door of the lips.

First published 12/07/2020. Edited 4/4/2022
