These are the biases and fallacies that define and distort how we understand the world around us.
While cognitive biases and fallacies craft a false image of reality, most educated people don’t realize how useful they are for making life easier or more meaningful:
- Filtering information out, or converting it into a story, takes much less mental work.
- Quick-and-easy solutions let us sift through much more information, even if not as thoroughly.
- By trusting things beyond ourselves, we get along in society more easily.
- By honoring our fear impulses, we can successfully avoid most bad things that may happen, even if modern society often requires the opposite approach.
- We can discover meaning and purpose from observations to give us something to live for, even when there’s nothing there.
- Irrational behavior is often the most compassionate and loving, which is where the ultimate meaning of life resides.
While avoiding all of these isn’t necessarily required to live a good and meaningful life, we must be aware of them to keep our imaginations and predictions tethered closely to reality.
It’s worth noting that awareness of these biases doesn’t necessarily change anything. We tend to veer into an opposite bias while trying to be accurate, and giving people incentives to be unbiased doesn’t help at all.
This is not a complete list, and probably can never be. It’s all based on a “rational actor model”, which implies that everyone is reasonable until distorted by a bias. The rational actor model is likely wrong.
Detecting Information
Availability heuristic/availability bias – we believe something is likely if we can easily recall an instance of it
Expectations bias – expectations we’ve predetermined define what we perceive
List-length effect – as lists grow, we remember more items in that list, but a smaller percentage of it
Priming – we’re constantly influenced by subconscious triggers
Tachypsychia – under stress, our perception of time stretches, so events seem to slow down
Weber-Fechner law – we notice changes in proportion to how much there was to begin with, so the same increase feels smaller against a larger baseline
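A rough sketch of the relationship the Weber-Fechner law describes, with illustrative symbols that aren’t part of the original list: the just-noticeable change ΔI stays roughly a constant fraction k of the current intensity I, so perceived intensity p grows only logarithmically above the detection threshold I₀.

```latex
% Weber's law (constant fraction) and Fechner's logarithmic form; k and I_0 vary by sense
\frac{\Delta I}{I} \approx k
\qquad\Longrightarrow\qquad
p = k \,\ln\!\frac{I}{I_0}
```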
Ignoring Information
Banner blindness/edge blindness – we learn to ignore what we repeatedly see
Cross-race effect/cross-race bias/other-race bias/own-race bias/other-race effect – we more easily recognize faces that match our racial group
Google effect/digital amnesia – we remember where to find information rather than the information itself
Memory inhibition/part-list cueing effect/repetition blindness – we don’t remember some items in a list or sequence because of other items
Next-in-line effect – we ignore information that came right after what we just said
No true Scotsman/purity appeal/perfect solution fallacy – we reject an example or a solution because it doesn’t meet an idealized standard of purity or perfection
Normalcy bias/normality bias – we tend to ignore threat warnings
Plant bias – we overlook plants and undervalue their importance and usefulness
Selective perception/Ostrich effect/Semmelweis reflex – we ignore details we don’t like
Survivorship bias/survival bias/immortal time bias/swimmer’s body illusion – we neglect to notice things that don’t make it past a selection process
Mixing/Merging Information
Ad hominem – we judge an argument by who said it and why, rather than by what was said
Anthropomorphism/anthropocentrism – we assume human-like traits and experiences for non-human things
Appeal to probability/appeal to possibility – we assume a likely thing is a guaranteed thing
Argument from ignorance – we accept strange explanations when we’re uncertain
Bulverism – we assume a claim is false and explain why the person believes it (e.g., their motives) instead of showing why it’s wrong
Circular reasoning/begging the question/petitio principii – we support a conclusion with premises that already assume that conclusion, so our “why” questions loop back on themselves
Cognitive dissonance – we can believe opposite things at the same time
Common source bias – we mix up information when it’s from the same source
Composition fallacy/frozen abstraction – we assume the qualities of a part are the qualities of the whole
Confabulation – we mix up information
Division fallacy – we assume the qualities of the whole are the qualities of the part
Illicit transference – we confuse statements about a category as a whole with statements about each member of that category (and vice versa)
Interoceptive bias/irrational hungry judge effect – we are influenced by unrelated internal states such as hunger or fatigue
Masked-man fallacy – we conclude two things can’t be the same thing because we know something about one that we don’t know about the other
Misinformation effect – we remember the wrong information because future information mixed with it
Memory misattribution/source misattribution – we misremember where information came from, and fail in 3 ways:
- Cryptomnesia – we think something we heard was our original thought
- False memory – we believe something that didn’t happen actually happened
- Source confusion – we misplace where we perceived something
Mental accounting – we group money and expected likelihoods into exclusive, unrelated categories and treat each category differently, even though they’re interchangeable
Nietzsche-Sapir-Whorf-Korzybski hypothesis/linguistic relativity hypothesis – the language we form shapes how we organize and understand reality, rather than simply clarifying or communicating it
Placebo effect – we positively change our physiology and attitude because of things that didn’t do anything
Stolen concept fallacy – we use A to disprove B, but A logically depends on B to exist
Surrogation – we mix up the thing and the measurement of that thing
Telescoping effect – we misjudge when past events happened, recalling recent events as more distant and distant events as more recent
Tip of the tongue phenomenon – we can remember a feeling tied to a word, but not the word itself
Two negative premises – we draw a conclusion from two negative statements (A isn’t B, and B isn’t C, so A must be C), even though nothing actually follows
Undistributed middle – we believe that because two things share a quality, they must be the same thing
Prioritizing Memory
Attentional bias – we remember things better if we think about them more frequently
Attribute substitution – we use simple details when things get too complicated
Belief bias – we agree with conclusions based on our beliefs, not on logic
Bizarreness effect/humor effect – we remember odd or funny things more than mundane things
Boundary extension – we remember a scene as extending beyond the edges of what we actually saw
Childhood amnesia – we forget most things before age 4
Continued influence effect/misinformation effect – we still remember wrong information even after we were corrected
Cue-dependent forgetting/context effect/mood-congruent memory bias – we recall memories based on our present thoughts
Fading affect bias – we forget negative memories faster than positive ones
Generation effect – we remember things we generated ourselves better than things we merely observed
Implicit association – we remember words based on how closely connected we’ve made them
Isolation effect/Von Restorff effect – when we perceive multiple things, we remember the most different thing of all of them
Lag effect/spacing effect – we learn more easily when the information is spaced out over time
Mandela effect – we remember things that never happened
Memory inhibition effect – we recall things only partially because related memories suppress them
Perky effect – we mix up imagined and real images
Persistence – we re-experience painful memories
Picture superiority effect – we remember pictures more than words
Reminiscence bump – we remember our adolescence and early adulthood more than most other periods of our lives
Serial position effect/list length effect – we remember the first and last things in a series the easiest
- Anchoring bias/contrast effect/focusing effect/primacy effect/arbitrary coherence – we rely heavily on the first observed piece of information
- Availability heuristic/recency bias/suffix effect/Travis syndrome – we prioritize recent and available information over older information
- Serial position-negativity effect – when items are otherwise identical, we judge the ones later in a sequence more negatively
Suggestibility – we mistake someone’s question as our thought
Unit bias – we feel one unit of something is the ideal amount
Crafting Stories
Anecdotal fallacy/spotlight fallacy/Volvo effect/hasty generalization/proof by example/no limits fallacy – we use personal experiences or isolated examples instead of sound arguments or compelling evidence
Apophenia/bucket errors/clustering illusion/illusory correlation – we perceive meaningful connections between unrelated pieces of information
Association fallacy – we conclude that because A is a B and A is also a C, every B must also be a C
Converse error/affirming the consequent – we assume that because A leads to B, seeing B means A must have happened
Denying the antecedent – we assume that because A leads to B, the absence of A means B can’t happen
False cause – we assume A coming before B means A caused B
Framing effect – we change our decisions based on whether the options were positively or negatively expressed
Frequency illusion/Baader-Meinhof effect – we start seeing something everywhere once we’ve noticed it
Pareidolia/Texas sharpshooter fallacy – we create meaning out of things that have no meaning
Peak-end rule/duration neglect – we measure an experience by its most intense point and how it ends, not by all its parts
Eaton-Rosen phenomenon/rhyme as reason effect – we find statements more believable when they rhyme
Levels-of-processing effect/testing effect/processing difficulty effect – we remember and understand things better the more deeply we mentally process them
Leveling and sharpening – we omit information from stories (leveling) and add details to our stories (sharpening)
Magical thinking/Tinker Bell effect – we make and believe stories that explain coincidences
Modality effect – we remember things based on how they’re presented
Red herring – we get diverted by concepts that are only barely related to the issue at hand
Salience bias – we focus on things that emotionally affect us more than things that don’t
Sample size insensitivity – we judge likelihoods while ignoring how many events they’re based on, even though small samples swing more wildly
Storytelling effect – we remember stories more than facts
Verbatim effect – we remember the “gist” of something more than exact wording
Faster Selecting
Additive bias – we tend to solve problems by adding when we should be subtracting
Ambiguity effect – we prioritize decisions where things are more certain
Attentional bias – we filter thoughts based on what we’re paying attention to
Base rate fallacy – we prioritize specific information about a case over general base rates (see the worked example after this list)
Cashless effect – we spend more when we don’t actually see our money
Center-stage effect – we choose the middle option in a set of items
Decoy effect/attraction effect/asymmetric dominance effect – our preference between 2 options changes when a 3rd option is added that is clearly worse than one of them but has mixed pros and cons against the other
Default effect – we tend to choose the most standard option
Denomination effect – we tend to spend smaller amounts of money (e.g., coins) more quickly than larger amounts (e.g., banknotes)
Distinction bias – we find more differences between two things when we examine them together versus separately
Fallacy argument/fallacy appeal/Mister Spock fallacy – we assume a faulty premise or flawed argument automatically means the conclusion is false
False dichotomy/false dilemma fallacy – we assume an either/or extreme and ignore other possibilities
Genetic fallacy – we assume something’s origin defines whether it’s true
Golden Mean fallacy – we assume the “middle ground” between two points is the best option
Group attractiveness effect – we find individual items more attractive when presented in a group
Ignorance appeal/ignorance argument/lack of imagination argument/personal incredulity argument – we assume something is true or false only because it hasn’t been proven otherwise
Law of the instrument/law of the hammer/Maslow’s gavel/golden hammer – the thing we use dictates how we perceive everything else
Less-is-better effect – when we evaluate options separately, we sometimes prefer the objectively lesser one, then reverse that preference when we compare them side by side
Money illusion/price illusion – we measure money based on its relative number instead of what it can do for us
Prosecutor’s fallacy – we reject the straightforward explanation of an event in favor of one we prefer, neglecting that our preferred explanation is even more unlikely
Reactive devaluation – we assume information from an enemy has less value
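A minimal worked example of the base rate fallacy mentioned above, using made-up numbers chosen purely for illustration: suppose a condition affects 1 in 1,000 people and a test for it is 99% accurate in both directions. The specific information (a positive test) feels decisive, but the general base rate means a positive result still indicates only about a 9% chance of having the condition.

```latex
% Hypothetical numbers: prevalence 0.1%, test 99% sensitive and 99% specific
P(D \mid +) = \frac{P(+ \mid D)\,P(D)}{P(+ \mid D)\,P(D) + P(+ \mid \neg D)\,P(\neg D)}
            = \frac{0.99 \times 0.001}{0.99 \times 0.001 + 0.01 \times 0.999}
            \approx 0.09
```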
Fear of Missing Out
Action bias – we tend to act even when we shouldn’t
Fresh start effect – we take action if we feel there’s something new to it
Novelty appeal – we prefer things we think are new
Proudly found elsewhere (PFE) syndrome – we prefer using things from outside groups
Recency illusion – we believe a word is newer because we notice it, even when it’s been around for a long time
Zeigarnik effect – we remember uncompleted or interrupted tasks more than completed ones
Fear of Novelty
Conservatism/regression bias – we tend to overvalue prior experience and undervalue new information
Default bias – we don’t change established behaviors
Extinction burst – when a habit stops being rewarded, we briefly intensify it rather than change it, even when we know better
Familiarity bias/mere-exposure effect/status quo bias – we prefer familiar experiences
Functional fixedness – we limit an object’s use to how it’s been traditionally used
Mere exposure effect/familiarity principle – we like things more simply because we’re familiar with them
Negativity bias/negativity effect – we remember negative things more than neutral or positive things
Omission bias – we believe doing nothing is better than doing the wrong thing
Semmelweis reflex/Semmelweis effect – we reject new things because they contradict something already established
Status quo bias/tradition appeal – we prefer things the way they have been
Bad Predictions
Anticipatory rationalization – we treat our feelings about the future as if they were reality
Arrival fallacy – we believe that once we’ve succeeded, we’ll experience enduring happiness
Conjunction fallacy – we judge a combination of specific conditions as more likely than any one of those conditions alone
Disposition effect – we tend to keep bad investments and sell good investments
Dunning-Kruger effect – we overestimate our skills when we don’t know much and underestimate skills when we’re highly qualified
End-of-history illusion – we expect we’ll change less in the future than we did up through now
Form function attribution bias – we assume the way something looks is how it’s used
Gambler’s fallacy/Monte Carlo fallacy/fallacy of the maturity of chances – we believe if something is happening more or less frequently than normal, it’ll pivot to the opposite in the future
Hard-easy effect – we’re overconfident about succeeding at hard tasks and underconfident about easy ones
Impact bias/durability bias/projection bias/self-consistency bias/hedonic adaptation – we overestimate the intensity of future emotional states and underestimate the intensity of past emotional states
Information bias – we keep seeking information even when it doesn’t affect our decisions at all
Nature appeal – we believe natural things are good and unnatural things are bad
Oven logic – we assume adjusting one condition can compensate for changes in another (e.g., baking at double the temperature for half the time)
Prevention bias – we prefer to spend more resources on prevention than on detection and response, even when both are equally effective
Process of elimination fallacy/Sherlock Holmes fallacy/arcane explanation – we consider an explanation correct when we remove other explanations
Pseudocertainty effect – we assume uncertain things are certain
Selection bias – we make statistics with non-random sampling
Self-serving bias – we take credit for positive events and blame others for negative ones
Slippery slope fallacy – we believe that A always leads to B, so seeing A means B will always happen
Subadditivity effect – we judge the probability of a whole to be smaller than the sum of the probabilities of its parts
Time-saving bias – we underestimate the time saved by speeding up from a slow speed, and overestimate the time saved by speeding up from an already high speed (see the arithmetic after this list)
Well traveled road effect – we estimate longer times to travel unfamiliar routes than familiar ones
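To make the time-saving bias above concrete, here is the arithmetic for a hypothetical 10-mile trip (the speeds and distance are illustrative only): speeding up from 40 to 50 mph saves about 3 minutes, while speeding up from 80 to 90 mph saves less than a minute.

```latex
% Hypothetical 10-mile trip; time = distance / speed, converted to minutes
t_{40} = \tfrac{10}{40} \times 60 = 15,\quad t_{50} = \tfrac{10}{50} \times 60 = 12 \;\Rightarrow\; \text{3 minutes saved}
t_{80} = \tfrac{10}{80} \times 60 = 7.5,\quad t_{90} = \tfrac{10}{90} \times 60 \approx 6.7 \;\Rightarrow\; \text{under 1 minute saved}
```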
Bad Optimism
Control illusion – we assume we can control events more than we can
Escalation of commitment/irrational escalation – we keep doing things that cause no perceptible positive results
Hot hand fallacy/hot hand phenomenon – we believe a successful result improves the chances of a future successful result
Just-world hypothesis/just-world fallacy – we believe people will get what they morally deserve
Optimism bias/unrealistic optimism/comparative optimism/wishful thinking/valence effect/positive outcome bias – we believe we’re more likely to experience positive results than others
Peltzman effect/risk compensation – we increase risky behavior when we see safety measures
Plan continuation bias – we don’t adapt our plans to changes in reality
Planning fallacy – we underestimate how much time a task will take
Positivity effect/socioemotional selectivity theory – we prefer to hear positive information
Probability neglect – we ignore likely consequences when we’re uncertain
Pro-innovation bias – we believe a new thing should be adopted universally by society without any adaptations
Restraint bias – we overestimate our ability to control our impulses
Scope neglect – we ignore the size of a problem when considering it
Self-licensing/moral self-licensing/moral licensing/licensing effect – we permit ourselves immoral choices when we feel morally confident about ourselves
Validity illusion – we overestimate our ability to accurately analyze data to interpret and predict results
Bad Cynicism
AI effect – we discredit the behavior of an artificial intelligence computer program because we claim it’s not real intelligence
External agency illusion – we assume our satisfaction comes from external factors and not our decisions
Hyperbolic discounting/current moment bias/present bias/dynamic inconsistency – we prioritize immediate benefits over larger long-term benefits
Loss aversion/dread aversion – we spend more effort avoiding a loss than earning an equivalent gain
Not invented here (NIH) syndrome/toothbrush theory – we avoid using things from outside groups
Pessimism bias – we assume negative things are more likely than they really are
Reactance – we push back when we feel our freedom to behave a certain way is being restricted
Risk compensation – we change our behavior to avoid perceived risks
Zero-risk bias – we prefer eliminating one small risk entirely over reducing overall risk by a larger amount
Zero-sum bias – we assume there’s a winner and a loser, even when everyone can win
Bad Hindsight
Choice-supportive bias/post-purchase rationalization – we justify the decisions we made and discredit the options we rejected
Confirmation bias/congruence bias/cherry-picking – we seek evidence that confirms what we already think
Endowment effect/divestiture aversion/mere ownership effect – we value something more if we feel it’s ours
Euphoric recall – we remember experiences more positively than they were
Hindsight bias/knew-it-all-along phenomenon/creeping determinism – we assume we predicted past events more than we really did
Investment loops – we’re more likely to keep using something once we’ve invested in it
Effort justification/IKEA effect/labor illusion/processing difficulty effect – we value things much more when we’ve put our own labor into creating or assembling them
Moral luck – we give moral blame or praise to people not responsible for it
Non-adaptive choice switching/once bitten, twice shy effect/hot stove effect – we avoid previous choices that were ideal because they caused us pain
Outcome bias/defensive attribution hypothesis/consequences appeal/force appeal/fear appeal – we judge decisions by their outcome instead of the quality of the decision itself
Proportionality bias – we assume big events are caused by big things
Retrospective determinism – we assume that because something happened, it was bound to happen
Rosy retrospection/declinism – we judge the past more positively than the present
Subjective validation/personal validation effect/self-relevance effect – we consider something to be correct if it has personal meaning to us
Sunk cost effect/sunk cost fallacy/commitment escalation/cab driver’s fallacy – we’re slow to pull out of something we’ve already invested in
Socially Under-thinking
Curse of knowledge/curse of expertise – we’re unaware other people don’t have the same knowledge as us
Enemy mine – we assume 2 positions that share a dislike of a 3rd must be compatible with each other
Gender bias – we assume genders automatically can or can’t do certain things
Group attribution error – we assume either one group member represents the entire group, or the group’s decision reflects each individual’s beliefs
Identifiable victim effect/compassion fade – we’re moved to help when we see 1 person’s suffering, but far less when the suffering is spread across many people
Obscurity appeal/Chewbacca defense – we justify things from potentially irrelevant (non-sequitur) or incorrect information, then believe it’s valid because others don’t know that information
Out-group homogeneity effect/out-group homogeneity bias – we assume others outside our group are collectively more similar than the people inside our group
Reverse psychology – people who oppose us can influence us to do what they want by asking the opposite of their desires
Social cryptomnesia – we adopt changes but forget which people or groups brought them about
Stereotype/essentialism/implicit bias – we assume specific things about others define the entirety of who they are
Straw man fallacy/ridicule appeal – we use silly over-simplified versions of others’ beliefs when we disagree with them
Women are wonderful effect – we give more positive attributes to women than men
Bad Social Predictions
Ad hoc – we use speculation as a basis for potential facts
Agent detection/hard work fallacy – we presume someone is always responsible, even for random events
Armchair fallacy – we assume we understand the effort required for others’ creations when we don’t know
Fundamental attribution error/actor-observer bias – we attribute others’ behavior to their choices and character rather than to the circumstances affecting them
Hawthorne effect/evaluation apprehension – we change our behavior when we know we’re being observed
Intentionality bias – we assume people do things intentionally
Placement bias – we misjudge ourselves compared to others
- Overconfidence effect/illusory superiority/Lake Wobegon effect/better-than-average effect – we think we’re better at things than others
- Worse-than-average effect – we assume we’re worse than others at doing things
Pluralistic ignorance – we assume we’re the only person thinking something, even when everyone else is
Transparency illusion – we believe other people know what we’re thinking when they don’t
Bad Social Optimism
Assumed similarity bias – we assume others have more in common with us than they actually do
Bystander effect – we expect other people around us to act when something must be done
False consensus effect – we overestimate how much other people agree with us
Halo effect/noble edge effect – we assume our positive impression of someone or a group means they’re also good deep-down
In-group favoritism/in-group-out-group bias/in-group bias/intergroup bias/in-group preference – we favor members of our group more than outsiders
Original position fallacy – we prefer a change because we assume we’ll be on the good side of that change
Public goods game – we assume an economic system could exist where we didn’t have to regulate lazy people, cheaters, liars, and thieves
Sexual overperception bias – we assume others are sexually attracted to us when they aren’t
Truth bias – we assume others are speaking the truth
Bad Social Cynicism
Extrinsic incentives bias – we assume others care more about outer incentives than inner incentives (e.g., money over learning skills) than we do
Hostile attribution bias – we assume others have hostile intent even when we don’t know
Human nature fallacy – we assume human nature is inherently bad
Puritanical bias – we assume people do bad things from moral failure without any context
Sexual underperception bias – we assume others aren’t sexually attracted to us when they are
Shared information bias – in groups, we spend more time discussing what everyone already knows than sharing what only some of us know
Third-person effect – we believe other people are more influenced by large-scale media than we are
Trait ascription bias – we assume our personalities are flexible and others’ aren’t
Ultimate attribution error – we assume motivations for our groups are good and motivations for groups we’re not in are bad
Bad Social Submission
Abilene effect – we collectively agree to something that’s bad for most or all of us, because each of us assumes the others want it
Authority bias/authority argument/authority appeal – we give undue weight to an authority figure’s opinion
Availability cascade – we follow what others believe and do, who follow what others believe and do
Bandwagon effect/groupthink – we do and believe things because others are doing it
Ben Franklin effect – we’re more likely to do favors for people we’ve already done favors for
Cheerleader effect/group attractiveness effect – we think individuals are more attractive when they’re in a group
Group shift – in a group, our decisions drift toward more extreme risk-taking or caution than we’d choose individually
HIPPO problem – we trust the Highest Individually Paid Person’s Opinion
Illusory truth effect/validity effect/reiteration effect – we’ll think a lie is true if we’re repeatedly exposed to it
Nausea argument (ad nauseam) – we treat an argument as settled because it’s been repeated or rebutted so many times that it no longer feels worth discussing
Observer-expectancy effect/experimenter-expectancy effect/expectancy bias/observer effect/experimenter effect/subject-expectancy effect – the participants of experiments are affected toward researchers’ biases
Pity appeal – we give more grace than we should because we feel sorry for people
Publication bias – we’re more likely to publish scientific papers that show positive results or confirm existing expectations
Pygmalion effect – we do what others expect of us
Saying is believing effect – when we tailor a message to suit an audience, we come to believe our own tailored version of it
Survey bias/social desirability bias/courtesy bias/flattery appeal/popularity appeal – we tend to tweak our answers toward whatever is socially acceptable
System justification – we believe any existing system is good to maintain
Wealth appeal/Colbert bump – we give weight to a wealthy or famous person’s opinion
Social Conceit
Asymmetric insight illusion – we believe we know others better than they know us
Barnum-Forer effect – we believe generic descriptions of a person apply to us
Bias blind spot/naïve cynicism/naïve realism/objectivity illusion – we see others’ bias, but think we see things objectively
Egocentric bias – we remember past events as more favorable to ourselves, and our role in them as larger, than they really were
Explanatory depth illusion – we believe we understand how things work in far more depth than we actually do
False uniqueness bias – we believe we and our creations are more unique than they really are
Moral credential effect – we feel we can do immoral things if we’ve done moral things before it
Social comparison bias – we dislike and compete with others we imagine are better than us
Special pleading – we believe general rules for human behavior don’t apply to us
Spotlight effect – we believe others notice us more than they really do
Skewing Antisocially
Automation bias – we assume automated decision-making systems are more unbiased than people
Backfire effect – we intensify our convictions when they’re challenged
Empathy gap/empathy bias – we fail to feel empathy for others when our current state differs from theirs, even when we ought to
Inherent nature appeal – we assume something unacceptable is acceptable because the person’s nature is to do it