
Heuristics, Biases & Fallacies

  • personal995
  • Jul 1, 2024
  • 12 min read

Updated: Dec 6, 2024




A reference to help prevent cognitive errors in judgment.




Introduction


Heuristics, biases, and fallacies are cognitive shortcuts and systematic errors in thinking that influence decision-making and judgment, often leading to deviations from rationality and logical reasoning. These concepts describe common mental pitfalls people encounter when processing information or making choices. By understanding them, individuals can better recognize and mitigate these errors, promoting critical thinking and improving the quality of their decisions.




Index




1. Availability Heuristic


Brief: People tend to overestimate the likelihood of events based on their availability in memory. Events that are more memorable or recent are perceived as more common or likely.


Summary: The availability heuristic is a mental shortcut that relies on immediate examples that come to a person's mind when evaluating a specific topic, concept, decision, or event. People tend to overestimate the likelihood of events based on their availability in memory. Events that are more memorable or recent are perceived as more common or likely.


When useful:

  • In situations where quick decision-making is needed and there is not enough time to analyze all available data.

  • When recalling information about frequently occurring events or experiences.

  • In everyday problem-solving when drawing from personal experiences or observations.


Example: Imagine a person watches a lot of news about airplane accidents. When asked about the safety of air travel, they might overestimate the risk of flying because these dramatic events are readily available in their memory, even though statistically, flying is much safer than driving. This overestimation happens because the person can easily recall recent or memorable airplane accidents, influencing their perception of how common such events are.




2. Confirmation Bias


Brief: The tendency to search for, interpret, favor, and recall information in a way that confirms one's preexisting beliefs or hypotheses.


Summary: Confirmation bias is the tendency for individuals to seek out, interpret, favor, and remember information in a way that confirms their preexisting beliefs or hypotheses. This cognitive bias leads people to give more weight to evidence that supports their existing views and dismiss or undervalue evidence that contradicts them.


When useful:

  • In forming quick judgments based on prior knowledge or experience.

  • When trying to reinforce confidence in a decision that needs to be implemented immediately.

  • In situations where maintaining consistency in beliefs is important for social or psychological reasons.


Example: Consider someone who believes that a particular diet is the best for weight loss. They will likely pay more attention to articles, studies, and testimonials that support the effectiveness of this diet and ignore or downplay any evidence that suggests other diets might be more effective or that the favored diet might have negative health impacts. For instance, if they read ten articles about diet, they might remember and cite the ones that align with their beliefs and forget the ones that do not.



3. Anchoring Bias


Brief: The common human tendency to rely heavily on the first piece of information (the "anchor") when making decisions.


Summary: Anchoring bias is the cognitive bias where people heavily rely on the first piece of information (the "anchor") they encounter when making decisions. This initial information serves as a reference point and can significantly influence subsequent judgments and estimates, even if the anchor is arbitrary or irrelevant.


When useful:

  • In negotiations, where setting an initial offer can influence the final outcome.

  • When estimating values or making decisions quickly, using the anchor as a starting point.

  • In situations where a lack of information means that any initial data point can provide a useful reference.


Example: Imagine you are buying a used car. The seller initially asks for $15,000. Even if you negotiate and get the price down to $13,000, the initial $15,000 serves as an anchor, making $13,000 seem like a better deal even if the car's actual market value is closer to $10,000. The initial high price has set a reference point that skews your perception of what constitutes a fair price.




4. Dunning-Kruger Effect


Brief: A cognitive bias wherein people with low ability at a task overestimate their ability, while those with high ability underestimate their competence.


Summary: The Dunning-Kruger effect is a cognitive bias in which people with low ability or knowledge in a particular area overestimate their competence, while those with high ability or expertise tend to underestimate their competence. This phenomenon occurs because the lack of skill or knowledge prevents individuals from recognizing their own deficiencies, while experts, aware of the complexity of the subject, may undervalue their own expertise.


When useful:

  • In understanding and improving learning and self-assessment processes.

  • In identifying and addressing overconfidence in workplace settings or education.

  • In leadership and team management to ensure balanced evaluation of skills and abilities.


Example: Consider a novice chess player who has learned the basic moves and strategies but has little experience playing against strong opponents. They might believe they are very skilled because they haven't encountered enough challenging situations to reveal their weaknesses. On the other hand, a highly experienced chess master might underestimate their ability because they are acutely aware of the vast array of strategies and the limitations of their knowledge compared to other grandmasters. This discrepancy in self-assessment is a clear manifestation of the Dunning-Kruger effect.




5. Post Hoc Fallacy


Brief: A logical fallacy that occurs when it is assumed that because one event followed another, the first event must have caused the second.


Summary: The post hoc fallacy, short for post hoc ergo propter hoc ("after this, therefore because of this"), is a logical fallacy that occurs when it is assumed that because one event followed another, the first event must have caused the second. This fallacy arises from the mistaken belief that temporal succession implies a causal relationship.


When useful:

  • In identifying flawed arguments in discussions, debates, and reasoning.

  • In critically analyzing cause-and-effect claims in scientific research, media reports, or everyday situations.

  • In improving logical thinking and avoiding incorrect assumptions in problem-solving.


Example: Imagine someone who wears a particular shirt and then experiences a successful day at work. They might believe that wearing that shirt caused their success, attributing the positive outcome to the shirt rather than considering other factors like their preparation, skills, or external circumstances. This is an example of the post hoc fallacy, where the mere sequence of events (wearing the shirt followed by a successful day) is mistakenly interpreted as a causal link.




6. Hindsight Bias


Brief: The tendency to see events as having been predictable after they have already occurred.


Summary: Hindsight bias is the tendency to see events as having been predictable after they have already occurred. People often believe that they "knew it all along" and overestimate their ability to have predicted an outcome before it happened. This bias can lead to an overestimation of one's foresight and an underestimation of the complexity and uncertainty involved in decision-making.


When useful:

  • In reflecting on past decisions to understand the role of hindsight in evaluating outcomes.

  • In learning from past experiences while avoiding the trap of believing events were more predictable than they actually were.

  • In improving future decision-making by recognizing the limitations of foresight and the impact of unforeseen variables.


Example: Consider a sports fan who, after watching their team win a game, claims they knew all along that the team would win. Before the game, however, they might have been uncertain or even skeptical about the outcome. The fan's belief that the victory was predictable is influenced by hindsight bias, as they are now interpreting the result as having been more certain than it truly was at the time.




7. Bandwagon Effect


Brief: The tendency to adopt beliefs, ideas, or trends because they are popular or widely accepted by others.


Summary: The bandwagon effect is the tendency for individuals to adopt beliefs, ideas, or trends simply because they are popular or widely accepted by others. This social phenomenon occurs as people are influenced by the opinions and behaviors of their peers, often leading to a herd mentality where conformity is favored over individual analysis or dissent.


When useful:

  • In marketing and advertising to leverage social proof and increase product adoption.

  • In understanding social dynamics and the spread of ideas or behaviors within groups.

  • In analyzing the influence of peer pressure and social conformity in various contexts.


Example: During an election, if a particular candidate gains significant momentum and widespread support, individuals who were previously undecided or supported other candidates might switch their allegiance to the leading candidate simply because "everyone else is doing it." This shift in support is driven by the bandwagon effect, where the popularity of the candidate influences people to join the majority, regardless of their initial stance or individual analysis of the candidate's policies.




8. Self-Serving Bias


Brief: The habit of attributing positive events to one’s own character while attributing negative events to external factors.


Summary: Self-serving bias is the tendency for individuals to attribute their successes to internal factors, such as their own abilities or efforts, while attributing their failures to external factors beyond their control. This bias helps to maintain and enhance self-esteem and can influence how people perceive their own actions and outcomes.


When useful:

  • In maintaining self-esteem and confidence after experiencing setbacks or failures.

  • In understanding personal and social dynamics where self-perception and attribution of success and failure are involved.

  • In psychological counseling to address cognitive distortions and promote a balanced self-view.


Example: Imagine a student who receives a high grade on an exam. They are likely to attribute this success to their intelligence and hard work. However, if the same student receives a low grade on another exam, they might blame the teacher's unfair questions, the difficulty of the material, or other external circumstances. This pattern of attribution demonstrates self-serving bias, where positive outcomes are credited to internal factors and negative outcomes to external factors.




9. Gambler’s Fallacy


Brief: The mistaken belief that past random events can influence the outcome of future random events, such as thinking that a coin flip is "due" to land on heads after a streak of tails.


Summary: The gambler's fallacy is the mistaken belief that past random events can influence the outcome of future random events. This fallacy occurs when individuals think that a certain outcome is "due" after a streak of opposite outcomes, despite each event being independent and having an equal probability.


When useful:

  • In understanding and correcting misconceptions about probability and random events.

  • In financial decision-making, gambling, and games of chance to avoid flawed reasoning.

  • In statistical education to illustrate the principles of independent events and probability.


Example: Consider a person flipping a fair coin. If the coin lands on tails five times in a row, they might believe that it is "due" to land on heads next. However, each flip is an independent event with a 50% chance of landing on heads or tails, regardless of previous outcomes. The belief that heads is more likely after a streak of tails demonstrates the gambler's fallacy, as past flips do not influence future ones.




10. Sunk Cost Fallacy


Brief: The inclination to continue an endeavor once an investment in money, effort, or time has been made, even when it is no longer the best option.


Summary: The sunk cost fallacy is the inclination to continue an endeavor once an investment in money, effort, or time has been made, even when continuing is no longer the best option. This fallacy arises from the irrational tendency to consider unrecoverable past costs when making decisions about the future, instead of focusing on the prospective costs and benefits.


When useful:

  • In decision-making to avoid persisting in unproductive or inefficient activities.

  • In personal finance to assess investments and expenditures objectively.

  • In project management and business strategy to prioritize future outcomes over past investments.


Example: Imagine someone buys a non-refundable ticket to a concert but feels unwell and tired on the day of the event. Despite knowing they might not enjoy the show, they decide to go anyway because they've already paid for the ticket. The decision to attend is influenced by the sunk cost fallacy — the person is focusing on the money already spent (a sunk cost) rather than objectively evaluating whether attending the concert would provide enjoyment or benefit given their current circumstances.




11. Framing Effect


Brief: The way information is presented can significantly affect decisions and judgments, even when the underlying information is the same.


Summary: The framing effect refers to the phenomenon where the way information is presented (the "frame") can significantly influence decisions and judgments, even when the underlying information is the same. This cognitive bias demonstrates that people react differently to a particular choice depending on how it is presented, emphasizing either potential gains or losses.


When useful:

  • In marketing and advertising to influence consumer choices by framing products in positive terms.

  • In policy-making to present information in ways that encourage desired behaviors or outcomes.

  • In negotiations and persuasion to leverage framing techniques that enhance the perceived value or benefits of a proposal.


Example: Imagine a medical treatment that is described in two different ways: Option A is described as having a 70% success rate, while Option B is described as having a 30% failure rate. Despite conveying the same statistical information, people might perceive Option A more positively because it is framed in terms of success rate, whereas Option B might be seen more negatively due to the focus on failure rate. This illustrates how the framing effect can influence decisions based on how information is presented.




12. Halo Effect


Brief: The tendency for an impression created in one area to influence opinion in another area, such as assuming someone who is attractive is also intelligent.


Summary: The halo effect is the cognitive bias where a person's overall impression of someone (or something) influences their judgments about that person's specific traits or characteristics. This bias occurs when a positive or negative impression in one area leads to a biased perception of qualities in unrelated areas.


When useful:

  • In understanding how first impressions can shape subsequent judgments and interactions.

  • In marketing and branding to create positive associations that enhance overall perceptions.

  • In interpersonal relationships to recognize and mitigate biases that may affect assessments of others.


Example: Consider a job interviewer who meets a candidate who is well-dressed and articulate during the interview. The interviewer might then assume that the candidate is also competent, trustworthy, and capable in areas not directly observed, such as problem-solving or teamwork. This assumption is influenced by the halo effect, where positive qualities perceived in one aspect (appearance and communication skills) bias judgments about unrelated aspects (competence and reliability).




13. Survivorship Bias


Brief: The logical error of concentrating on the people or things that "survived" some process and inadvertently overlooking those that did not due to their lack of visibility.


Summary: Survivorship bias is the logical error of focusing on the people or things that "survived" some process or event while overlooking those that did not survive, often due to their lack of visibility or availability for study. This bias can lead to distorted conclusions because the sample that is studied or observed is not representative of the entire population.


When useful:

  • In historical analysis and research to ensure comprehensive understanding and accurate conclusions.

  • In business and investment strategies to avoid basing decisions solely on successful outcomes without considering failures.

  • In personal development and learning from experiences by acknowledging both successes and failures.


Example: In the tech startup industry, analysts often focus on studying successful companies that have achieved significant growth and profitability. However, survivorship bias occurs when this analysis overlooks the numerous startups that failed due to reasons such as market timing, competition, mismanagement, or lack of funding. By only studying successful startups, analysts may draw conclusions that are overly optimistic or skewed, ignoring valuable insights that could be learned from understanding why other startups failed and the broader risks involved in the industry.




14. Base Rate Fallacy


Brief: Ignoring or undervaluing statistical base rates in favor of anecdotal evidence or specific information.


Summary: The base rate fallacy is the cognitive bias where individuals ignore or undervalue statistical base rates (general probabilities) in favor of specific information or anecdotal evidence. This bias leads people to focus more on individual cases or specific details, often disregarding broader statistical data that should logically have a greater influence on their decision-making.


When useful:

  • In making decisions that require consideration of both specific information and general probabilities.

  • In analyzing risks and uncertainties by balancing anecdotal evidence with statistical data.

  • In critical thinking and problem-solving to avoid drawing flawed conclusions based on isolated cases.


Example: Imagine a pharmaceutical company develops a new diagnostic test for a rare disease that affects only 1 in 10,000 people. The test is 99% accurate (1% false positive rate). Despite the test showing positive results for an individual, there is a high likelihood of a false positive due to the rarity of the disease in the population. Ignoring the base rate (low prevalence of the disease) and focusing solely on the test result could lead to incorrect assumptions about the person's actual health status, illustrating the base rate fallacy.
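
A worked Bayes calculation shows how strongly the base rate dominates here. The short Python sketch below takes the 1-in-10,000 prevalence and 1% false positive rate from the example and assumes a 99% detection rate (the example does not state sensitivity separately, so that figure is illustrative):

```python
# Bayes' theorem applied to the rare-disease testing example.
prevalence = 1 / 10_000       # base rate: 1 in 10,000 people have the disease
sensitivity = 0.99            # assumed: the test catches 99% of true cases
false_positive_rate = 0.01    # healthy people are wrongly flagged 1% of the time

# Total probability of a positive result, then the posterior via Bayes' theorem.
p_positive = sensitivity * prevalence + false_positive_rate * (1 - prevalence)
p_disease_given_positive = (sensitivity * prevalence) / p_positive

print(f"P(disease | positive test) = {p_disease_given_positive:.2%}")  # about 0.98%
```

In other words, of the roughly 101 positive results expected per 10,000 people tested, only about one belongs to someone who actually has the disease, so a positive result alone is still far more likely to be a false alarm.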




15. False Consensus Effect


Brief: The tendency to overestimate how much other people share one’s beliefs, attitudes, and behaviors.


Summary: The false consensus effect is the cognitive bias where individuals overestimate how much other people share their beliefs, attitudes, and behaviors. This bias leads people to believe that their own opinions and preferences are more common or widely accepted than they actually are, often assuming that others think and act in similar ways.


When useful:

  • In understanding social dynamics and the influence of personal perspectives on perceptions of others.

  • In communication and persuasion to avoid assuming universal agreement or understanding.

  • In diversity and inclusion efforts to promote empathy and awareness of differing viewpoints.


Example: Consider someone who strongly believes in a particular political ideology and assumes that a majority of their peers and the general population also hold similar views. They might be surprised or even defensive when confronted with dissenting opinions or alternative viewpoints, as they falsely assume that their beliefs are widely shared. This illustrates the false consensus effect, where personal beliefs are extrapolated to be more universally accepted than they actually are.





