SCARS Manual of Cognitive Biases – 2024

The Range Of Psychological Biases That Affect Scam Victims and Every Human

Biases Play a Major Role in Scam Victim Vulnerability & Susceptibility to Scams, as Well as How Well They Will Recover From Them

SCARS Journal of Scam Psychology - Manual of Cognitive Biases - on SCARS ScamPsychology.org
Authors:
• Vianey Gonzalez B.Sc(Psych) – Psychologist, Certified Deception Professional, Psychology Advisory Panel & Director of the Society of Citizens Against Relationship Scams Inc.
• Tim McGuinness, Ph.D. – Anthropologist, Scientist, Director of the Society of Citizens Against Relationship Scams Inc.
• Portions from Other Sources
• Based on research by the United States National Institute of Mental Health and other sources

This page will be constantly expanding – visit often!

The links will take the visitor to the website where the item was published.

For more visit: ScamsNOW.com and RomanceScamsNOW.com

Introduction

Understanding cognitive biases is essential for everyone, particularly for scam victims, as it provides critical insights into how our minds can be manipulated and deceived. Cognitive biases are inherent tendencies in human thinking that can lead to errors in judgment and decision-making.

Scammers exploit these biases to manipulate their victims into believing false information or making poor choices, ultimately leading to financial loss or emotional harm. By recognizing and understanding these biases, individuals can enhance their ability to detect and resist manipulation, making them less susceptible to scams.

Moreover, knowledge of cognitive biases can aid in the recovery process for scam victims by helping them understand why they fell prey to the scam in the first place, thereby reducing feelings of self-blame and shame. Ultimately, awareness of cognitive biases empowers individuals to make more informed decisions, stay vigilant against manipulation, and safeguard themselves from potential exploitation.

A Manual of Cognitive Biases

This catalog of cognitive biases is part of SCARS’ continuing commitment to helping the victims of scams (financial fraud) better understand the psychology of scams. In other words, why are victims vulnerable?

[THE ARTICLES LINKED BELOW APPEAR MOSTLY ON ROMANCESCAMSNOW.COM – WE ARE RELOCATING THEM TO THIS WEBSITE AND WILL UPDATE THIS CATALOG AS NEW BIASES ARE ENCOUNTERED]

How Do Cognitive Biases Make People Vulnerable To Scams, Fraud, and Deception

How do cognitive biases play a role in making people vulnerable and susceptible to scams, fraud, and deception?

Cognitive biases are mental shortcuts that allow people to make quick decisions and judgments based on their past experiences and memories. These biases can be helpful in many situations, as they allow people to process large amounts of information quickly and efficiently. However, they can also make people vulnerable to scams, fraud, and deception.

One reason why cognitive biases make people vulnerable to scams is that they can lead people to make judgments that are not based on evidence or logical reasoning. For example, Confirmation Bias (a major bias that makes people vulnerable) is the tendency to seek out and interpret information that supports one’s preexisting beliefs, while ignoring or dismissing information that contradicts them. This can make people more susceptible to scams that appeal to their beliefs or biases, as they are more likely to believe the scammer’s claims without critically evaluating the evidence.

There are several ways that people can protect themselves from scams, fraud, and deception. One way is to be aware of common cognitive biases and how they can affect decision-making. This can help people to be more mindful of their thought processes and to question their own judgments.

Another way to protect oneself is to be skeptical of claims and offers that seem too good to be true. It is important to carefully evaluate the evidence and to ask questions before making a decision. This can help people to avoid falling for scams that rely on emotional appeals or incomplete information.

It can also be helpful to seek out additional sources of information and to consult with trusted friends, family members, or professionals before making a decision. This can provide a more balanced perspective and help to identify any potential red flags.

Overall, cognitive biases can make people vulnerable to scams, fraud, and deception by leading them to make judgments that are not based on evidence or logical reasoning, and by causing them to make irrational or risky decisions. However, by being aware of these biases and taking steps to protect oneself, people can reduce their risk of falling victim to these types of scams.

Why it is Important to Learn About Cognitive Biases

Scam victims must understand cognitive biases because these biases significantly influence their susceptibility to manipulation and deception.

By recognizing and understanding cognitive biases, such as authority bias, confirmation bias, and availability heuristic, scam victims can better comprehend how their minds may be predisposed to certain patterns of thought and behavior. This awareness empowers individuals to critically evaluate information, recognize warning signs of potential scams, and make more informed decisions. Understanding cognitive biases also helps scam victims realize that falling prey to scams is not solely a result of their own gullibility or naivety, but rather a consequence of inherent cognitive processes that can affect anyone. By educating themselves about cognitive biases, scam victims can develop strategies to protect themselves against future scams, bolstering their resilience and reducing their vulnerability to exploitation.

Catalog of Cognitive Biases

The List Of Cognitive Biases That Affect Scam Victims That We Have Cataloged

  • Above-Average Effect: This is the tendency to overestimate our own abilities and accomplishments. Also known as “Illusory Superiority,” this bias causes individuals to overestimate their own abilities and qualities relative to others. For example, we might think that we are better drivers than most people, even though we are not.
  • Actor-Observer Bias: Attributing one’s own behavior to external factors, while attributing others’ behavior to internal factors.
  • AI (Artificial Intelligence) Fallacy Bias: The tendency to believe that everything you see was created by ChatGPT or another generative AI, even when there is no actual evidence that it was.
  • Ambiguity Aversion: This is the tendency to prefer clear-cut choices over ambiguous ones. For example, we might be more likely to choose a sure thing, even if it is a smaller reward than to choose a riskier option with the potential for a larger reward.
  • Anchoring (Anchor) Bias: This is the tendency to rely too heavily on the first piece of information we receive when making a decision. For example, if a seller first quotes a very high price, we tend to judge every later offer against that number, even if the original figure was arbitrary.
  • Anchoring and Adjustment: The tendency to rely too heavily on an initial piece of information (the anchor) when making judgments or estimates, and insufficiently adjusting from that anchor as new information becomes available.
  • Attribution bias: This is the tendency to attribute our own successes to our own abilities and our failures to external factors. For example, we might think that we won a game of tennis because we are a good player, but we might think that we lost a game of tennis because the other player was lucky.
  • Attentional Bias: The tendency to pay more attention to certain stimuli while ignoring others, often influenced by personal interests or emotions.
  • Authority Bias: One of the most influential cognitive biases for scam victims. The tendency to attribute greater credibility or accuracy to perceived authorities or experts.
  • Availability Cascade: The self-reinforcing process where a belief or idea becomes more widely accepted and influential simply because it receives more attention and exposure.
  • Availability Heuristic Bias: This is the tendency to judge the likelihood of something based on how easily examples come to mind. For example, we might think that plane crashes are more common than they actually are because we are more likely to remember news stories about plane crashes than news stories about other types of accidents.
  • The Baader-Meinhof Phenomenon: Also known as frequency illusion, this bias occurs when individuals notice something for the first time, and then suddenly, they seem to encounter it everywhere. It is the result of heightened awareness rather than an actual increase in frequency.
  • Bandwagon Effect: This is the tendency to do something because other people are doing it. For example, we might buy a new product because everyone else is buying it, even though we do not really need it.
  • Base Rate Fallacy: Ignoring statistical or general information (base rates) in favor of specific information or anecdotes when making judgments or decisions (see the worked example after this list).
  • Belief Perseverance: This is the tendency to cling to our beliefs even in the face of contradictory evidence. For example, we might continue to believe that a particular investment is a good idea even after the investment has lost money.
  • Bias Blind Spot: This is the tendency to believe that we are less biased than other people. For example, we might think that we are good at judging people fairly, even though we are actually biased in our own way.
  • Choice-supportive Bias: This is the tendency to remember our choices more favorably than they actually were. For example, we might remember choosing a particular product because we thought it was the best option, even though we were actually influenced by other factors, such as the salesperson’s recommendation.
  • Clustering Illusion: Seeing patterns or clustering in random data or events even when no real pattern exists.
  • Confirmation Bias: This is the tendency to seek out information that confirms our existing beliefs and ignore information that contradicts them. For example, if we believe that we are good drivers, we might pay more attention to news stories about drivers injured while texting (which confirm that accidents happen to careless drivers), and less attention to stories about careful drivers who were injured anyway.
  • Conservatism bias: This is the tendency to resist change, even when there is evidence that change is necessary. For example, we might continue to use the same software program even though there is a newer, better program available.
  • Contrast Effect: This is the tendency to judge something based on how it compares to other things. For example, we might think that a new car is a good deal if it is less expensive than other cars on the market, even if it is actually overpriced.
  • Crime Victim’s Bias – see Victim’s Bias
  • Curse of knowledge: This is the tendency for people who know something to overestimate how easy it is for others to know the same thing. For example, a software engineer might think that it is easy to understand how to use a new piece of software, even though it might be difficult for someone who is not a software engineer to understand.
  • Danger of the Single Story: This is the tendency to focus on one story or perspective and ignore other stories or perspectives. For example, we might read a news story about a terrorist attack and come away with the impression that terrorism is a major problem, even though it is actually a relatively rare event.
  • Decision Fatigue: This is the tendency for our decision-making abilities to decline after we have made a number of decisions. For example, we might be more likely to make impulsive or irrational decisions if we have been making a lot of decisions in a short period of time.
  • Decoy Effect: This is the tendency for our choice between two options to shift when a third, clearly inferior option (the decoy) is added. For example, we might hesitate between a $400 phone and a $500 phone, but adding a $480 phone with worse features than the $500 model suddenly makes the $500 phone look like the better deal.
  • Diversity illusion: This is the tendency to overestimate the amount of diversity in a group. For example, we might think that a group of people is diverse because they come from different backgrounds, even if they all share the same political views.
  • Dunning-Kruger Effect: This is the tendency for people with low ability to overestimate their own ability, while people with high ability underestimate theirs. For example, a person with little knowledge of a particular subject might think that they are an expert, while a person with deep knowledge of the subject might think that they are not as knowledgeable as they actually are.
  • Empathy Gap: This is the tendency to underestimate the impact of negative events and emotions on others (and on ourselves). For example, we might assume that we would handle a difficult situation better than someone else would, even though, in the moment, we would struggle just as much.
  • Endowment Effect: This is the tendency to value something more highly simply because we own it. For example, we might refuse to sell a piece of jewelry that we own for a price higher than what we would be willing to pay to buy a similar piece.
  • Escalation Of Commitment: This is the tendency to increase our investment in a failing course of action, even though it is clear that the course of action is not going to be successful. For example, we might continue to invest in a failing business even though it is clear that the business is not going to be profitable.
  • False Consensus Effect Bias: This is the tendency to believe that our own beliefs and opinions are more common than they actually are. For example, we might think that most people agree with our political views, even though they actually do not.
  • False Memory: The creation or recall of inaccurate or false memories that feel real and convincing.
  • Fear of Success: This bias leads people to believe that if someone else succeeds, it means that they will have to lose out. This can lead people to sabotage the success of others in order to protect themselves.
  • Focusing illusion: This is the tendency to focus on a particular detail or aspect of a situation, while ignoring the broader context. For example, we might focus on the fact that a particular stock has gone up in value, while ignoring the fact that the overall market has also gone up in value.
  • Forer effect: This is the tendency for people to believe that a personality description is accurate about them, even if the description is actually vague and general. For example, we might read a personality description that says “you are a deep thinker who is often misunderstood,” and we might believe that the description is accurate about us, even though the description could apply to many different people.
  • Framing Effect: The way information is presented can influence judgments and decisions. Also see Retrospective Framing.
  • Frequency Illusion: This is the tendency to notice something more often after we have become aware of it. For example, we might start to notice the number 11 everywhere after we have just seen it for the first time.
  • Fundamental Attribution Error: This is the tendency to attribute other people’s behavior to their personality traits, while attributing our own behavior to situational factors. For example, we might think that someone who is rude to us is just a bad person, while we might think that we were rude to someone because we were having a bad day.
  • The Gambler’s Fallacy: This is the mistaken belief that the outcome of a random event is influenced by previous outcomes. For example, we might think that we are more likely to win at roulette if we have lost a few times in a row.
  • Group Attribution Error: This is the tendency to attribute the negative behavior of individuals to their group membership, while attributing the positive behavior of individuals to their individual characteristics. For example, we might think that all members of a particular group are lazy, even though there are some members of the group who are not lazy.
  • Group Polarization: This is the tendency for groups to make decisions that are more extreme than the initial views of the individual members of the group. For example, a group of people who are initially opposed to a new policy might become even more opposed to the policy after discussing it together.
  • Groupthink: This is the tendency for members of a group to make decisions that are not in the best interests of the group because they are afraid to disagree with the majority. For example, a group of executives might decide to launch a new product that is not well-researched because they are afraid to disagree with the CEO.
  • Halo Effect: This is the tendency to make judgments about a person based on one positive or negative trait. For example, we might think that a person is intelligent because they are good-looking, even though there is no connection between the two traits.
  • Hindsight Bias: This is the tendency to believe that we could have predicted an outcome after it has already happened. For example, we might think that we knew that a particular stock was going to go up in value, even though we actually had no way of knowing that at the time.
  • The IKEA Effect: This bias describes the tendency of people to place a higher value on objects they have assembled or created themselves. When individuals invest their time and effort in building something, they develop a sense of ownership and attachment, leading them to overvalue the end product.
  • Illusion of Control Bias: This is the tendency to overestimate our ability to control events. For example, we might think that we can control the weather or the outcome of a sporting event, even though we actually have no control over these things.
  • Illusory Correlation: This is the tendency to see a connection between two events that are not actually related. For example, we might think that we are more likely to be in a car accident on Friday the 13th, even though there is no evidence to support this belief.
  • Illusory Superiority: This is the tendency to overestimate our own abilities and accomplishments. Also known as the “above-average effect,” this bias causes individuals to overestimate their own abilities and qualities relative to others. For example, we might think that we are better drivers than most people, even though we are not.
  • Impression Management: This is the tendency to present ourselves in a way that makes us look good to others. For example, we might exaggerate our accomplishments or downplay our weaknesses in order to make a good impression on someone.
  • Inattentional Blindness: Also known as perceptual blindness, this is a cognitive bias where you fail to notice something completely visible within your visual field because your attention is focused elsewhere. It’s not due to any visual impairment, but rather a limitation in our brain’s processing abilities.
  • In-group Bias: This is the tendency to favor members of our own group over members of other groups. For example, we might be more likely to help a friend than a stranger, even if the stranger is in more need of help.
  • Just-World Hypothesis/Just World Bias: The Just-World Hypothesis or Bias is a psychological theory that suggests that people have a strong need to believe that the world is a fair and just place. This belief can lead people to blame victims of misfortune for their own suffering and to hold positive views of themselves and others.
  • Loss aversion: This is the tendency to prefer avoiding losses to acquiring equivalent gains. For example, we might walk away from a negotiation while we are slightly ahead because we do not want to risk losing any of the ground we have gained.
  • Magical Thinking: This is a cognitive bias that involves attributing causal relationships between unrelated events or believing that one’s thoughts, actions, or wishes can influence the outcome of unrelated events. It is characterized by a belief in supernatural or mystical powers, often defying rational or scientific explanations. People engaging in magical thinking may believe in luck, superstitions, or charms, convinced that certain rituals or behaviors will bring about desired outcomes or protection from harm. While magical thinking can serve as a source of comfort or a coping mechanism, it can also lead to unrealistic beliefs and decisions based on unfounded connections between events, making individuals susceptible to exploitation and deception.
  • Matching hypothesis: This is the tendency to choose partners who are similar to us in terms of personality, values, and interests. For example, we might be more likely to be attracted to someone who is similar to us in terms of our political views, our religious beliefs, or our hobbies.
  • Mental Accounting: This is the tendency to group our financial transactions into different categories, and to treat each category as if it were separate from the others. For example, we might be more likely to spend money on a vacation if we think of it as a “fun” expense, rather than as a “necessary” expense.
  • Mere Exposure Effect: The tendency to develop a preference for things or people simply because they are familiar or have been encountered repeatedly.
  • Negativity bias: This is the tendency to pay more attention to negative information than positive information. For example, we might be more likely to remember news stories about bad things that happen than news stories about good things that happen.
  • Normalcy Bias: Normalcy bias, also known as the ‘disaster myopia’ or ‘normalcy illusion,’ is a psychological phenomenon that causes people to underestimate the likelihood of a disaster and overestimate their ability to cope with it. This bias can lead to complacency and inaction in the face of potential risks.
  • Omission Bias: The tendency to perceive harmful actions as worse than harmful inactions, leading to a preference for doing nothing even when it is not the best choice.
  • Optimism Bias: This is the tendency to overestimate our own abilities and underestimate the risks involved in a situation. For example, we might think that we are less likely to be injured in a car accident than other people, even though we know that car accidents are a leading cause of death.
  • Ostrich Effect: This is the tendency to ignore or deny negative information. For example, we might avoid reading news stories about a particular topic because we do not want to be exposed to negative information about it.
  • Outcome bias: This is the tendency to judge the success of a decision based on the outcome, rather than on the quality of the decision-making process. For example, we might think that a decision was good if it led to a positive outcome, even if the decision-making process was flawed.
  • Overconfidence bias: This is the tendency to overestimate our own abilities and knowledge. For example, we might think that we are better drivers than we actually are, or that we are better at predicting the future than we actually are.
  • Peak-end rule: This is the tendency to judge an experience based on its peak and its end, rather than on the overall experience. For example, we might remember a vacation as being more enjoyable than it actually was because we remember the peak experiences, such as the first time we saw the ocean, and the end of the vacation, when we were relaxed and happy.
  • The Peltzman Effect: Named after economist Sam Peltzman, this bias describes the tendency of people to take more risks when safety measures are introduced. For instance, drivers might drive more recklessly if they believe their vehicles are equipped with advanced safety features.
  • Perceptual Blindness: Also known as Inattentional Blindness, this is a cognitive bias where you fail to notice something completely visible within your visual field because your attention is focused elsewhere. It’s not due to any visual impairment, but rather a limitation in our brain’s processing abilities.
  • Placebo Effect: Experiencing positive effects or improvements in symptoms due to believing in the efficacy of a treatment, even if it is inert or lacks active ingredients.
  • Planning Fallacy: This is the tendency to underestimate the amount of time and effort required to complete a task. For example, we might think that we can complete a project in a week, even though it will actually take us two weeks.
  • Post Hoc Ergo Propter Hoc Fallacy: Post Hoc Ergo Propter Hoc is a Latin phrase that translates to “after this, therefore because of this.” It refers to a logical fallacy that occurs when someone assumes that because one event happened after another, the first event must have caused the second event. In other words, it’s the mistaken belief that correlation implies causation.
  • Primacy Effect: This is the tendency to be more influenced by the first piece of information we receive than by subsequent information. For example, we might be more likely to form an opinion about a person based on their first impression, even if we later learn that the first impression was wrong.
  • Procrastination bias: This is the tendency to delay tasks or decisions, even if we know that we should do them. For example, we might put off studying for an exam, even though we know that we need to study in order to do well on the exam.
  • Rational Actor Fallacy: The Rational Actor Fallacy assumes people always make rational decisions based on self-interest, overlooking the role of emotions and biases.
  • Rationalization: This is the tendency to justify our decisions after the fact, even if they were not well-thought-out. For example, we might buy a new car and then convince ourselves that it was a good decision, even if we were not really sure about it at the time.
  • Reactance: This is the tendency to react negatively to attempts to persuade us. For example, if someone tries to convince us to do something that we do not want to do, we might be more likely to do the opposite.
  • Recency Effect: This is the tendency to be more influenced by the most recent piece of information we receive than by earlier information. For example, we might be more likely to vote for a candidate based on their recent campaign promises, even if we previously had a negative opinion of the candidate.
  • Reciprocity Bias: People have a natural tendency to reciprocate favors, gifts, or concessions received from others. Scammers often initiate interactions with seemingly generous gestures or offers to trigger reciprocity bias in their victims, fostering a sense of indebtedness and obligation that makes them more susceptible to further manipulation.
  • Relative Deprivation: Relative deprivation is a psychological concept that arises when individuals perceive themselves as unfairly disadvantaged compared to others.
  • Representativeness heuristic: This is the tendency to judge the probability of something based on how similar it is to other things we know. For example, we might think that a particular stock is a good investment because it has gone up in value recently, even if we do not know anything else about the stock.
  • Retrospective Framing: Retrospective framing is a cognitive bias that occurs when we interpret past events in a way that is consistent with our current beliefs or attitudes. This can lead to us misremembering or reinterpreting past events in a way that supports our current worldview. Also see the Framing Effect.
  • Salience bias: This is the tendency to pay more attention to information that is salient, or noticeable. For example, we might be more likely to remember a news story about a terrorist attack than a news story about a natural disaster, even though the natural disaster killed more people.
  • Scarcity Heuristic: People tend to place greater value on items, opportunities, or information that is perceived as scarce or limited in availability. Scammers exploit the scarcity heuristic by creating a sense of urgency or scarcity around their offers, such as limited-time deals or exclusive opportunities, to compel victims to act quickly and without thorough consideration.
  • Scarcity Mindset: This bias leads people to believe that there is only a limited amount of success to go around. This can lead people to compete with each other for success, even if it means pulling each other down.
  • Selection Bias: Drawing conclusions from a non-representative sample or biased data selection.
  • Selective Perception: This is the tendency to see what we want to see and ignore what we do not want to see. For example, we might be more likely to notice news stories that confirm our existing beliefs, and less likely to notice news stories that contradict our beliefs.
  • Self-fulfilling Prophecy: This is the tendency for our expectations to influence our behavior in a way that makes those expectations come true. For example, if we expect to fail at something, we are more likely to behave in a way that makes it more likely that we will fail.
  • Self-Serving Bias: This is the tendency to take credit for our successes and deny responsibility for our failures. For example, we might think that we got a promotion because we are a good employee, but we might think that we got fired because our boss was unfair.
  • Semmelweis Reflex: The tendency to reject or dismiss new evidence or information that contradicts established beliefs or practices.
  • Serial Position Effect: Remembering and recalling information better when it is presented at the beginning (primacy effect) or end (recency effect) of a list or sequence.
  • Shifting Baseline Syndrome: Shifting baseline syndrome (SBS) is a cognitive bias that leads us to underestimate the magnitude of change in our environment (including our personal environment) over time. We do this because we compare the current state of the environment to our own baseline, which is often the environment we grew up in, or sometimes only the most recently remembered state. As each generation grows up, the baseline shifts, and we come to accept a lower level of environmental quality as normal.
  • Social Comparison Bias: Social comparison bias refers to the tendency for individuals to evaluate themselves and their circumstances by comparing them to those of others. This comparison can lead to either upward or downward (superiority or inferiority) social comparisons, where individuals assess themselves as either better off or worse off than others in various aspects of life. In the context of scam victims comparing their suffering or victimization to others, social comparison bias can play a significant role in stalling their recovery (getting stuck in the comparisons) or causing other harmful effects.
  • Social Desirability Bias: The inclination to provide responses that are socially acceptable or desirable rather than truthful.
  • Social Proof: Individuals are more likely to conform to the actions or beliefs of others, particularly when they perceive those others as similar or credible. Scammers may create the illusion of social proof by fabricating testimonials, reviews, or endorsements to convince victims that their fraudulent offers or claims are legitimate and endorsed by others.
  • Social Trust: The expectation that people will behave with goodwill and avoid harming others. It is a concept that has long mystified researchers and the general public alike.
  • Spotlight Effect: Overestimating how much attention or importance others place on our appearance, behavior, or performance.
  • Status Quo Bias: This is the tendency to stick with the familiar and avoid change, even if the change would be beneficial. For example, we might continue to walk home from work at night, even though we know it is dangerous, because we are used to doing it and it is the easiest option.
  • Stereotype Threat: This is the tendency for people to perform worse on a task if they are aware of a stereotype that suggests that they will not perform well on the task. For example, a black student might perform worse on a math test if they are aware of the stereotype that black people are not good at math.
  • Stereotyping: Making generalizations or assumptions about a group of people based on limited information or characteristics associated with that group.
  • Stranger Trust: Stranger trust is the tendency to trust people whom we don’t know very well. This is yet another of our cognitive biases, as it can lead us to make incorrect judgments about people’s trustworthiness without any real reason to trust them.
  • The Subadditivity Effect: This bias involves the underestimation of the total risk of multiple, simultaneous events compared to the sum of the risks of each event considered separately. For example, people may perceive the combined risk of two potential hazards as less than the sum of the risks of each hazard individually.
  • Sunk Cost Fallacy Bias: Also called ‘Chasing the Money.’ This is the tendency to continue investing in a failing project because we have already invested a lot of time and money into it. For example, we might continue to work on a project that is not going well, even though it is clear that it is not going to be successful.
  • Survivorship Bias: This is the tendency to focus on the successes of others while ignoring the failures. For example, we might think that starting our own business is a great way to get rich because we only hear about successful entrepreneurs. However, we forget about all of the entrepreneurs who failed.
  • System Justification Bias: This is the tendency to believe that the existing social order is fair and just, even if it is not. For example, we might believe that the rich deserve to be rich and the poor deserve to be poor, even if there is no evidence to support this belief.
  • The Texas sharpshooter fallacy: This is the tendency to see patterns in random data. For example, we might see a pattern in the stock market, even though the pattern is actually random.
  • This Then That: see Post Hoc Ergo Propter Hoc
  • Trade-off Bias: This is the tendency to focus on the losses associated with a particular option while ignoring the gains. For example, we might be more likely to choose a sure thing, even if it is a smaller reward than to choose a riskier option with the potential for a larger reward.
  • Trauma Bias: Trauma bias is a type of cognitive bias that can occur when people have preconceived notions or beliefs about trauma and its effects. This bias can influence how people perceive, interpret, and respond to traumatic events and the people who have experienced them.
  • Victim’s Bias: This is the belief that only other victims who have experienced the same type of crime can possibly understand what a victim has experienced, and that other victims somehow become experts able to provide advice about victimization, criminology, and recovery psychology, even though they are only experts in their own experience, which is often not comparable to another victim’s. It is often coupled with other biases, such as Confirmation Bias, leading victims to believe that another victim experienced exactly the same thing even though the details may be quite different. This can lead to serious errors of judgment after a crime by seeking out nonprofessionals who do not understand the need for grief processing, trauma counseling, or the recovery process.
  • Wishful Thinking: This is the tendency to believe that something we want to be true is actually true. For example, we might think that we are going to win the lottery, even though the odds of winning are very low.
  • Zeigarnik Effect: Better memory recall for incomplete tasks or interrupted activities compared to completed tasks.
  • Zero-risk Bias: Preferring options that carry no risk, even when the benefits of a slightly riskier option outweigh the potential harm.
  • Zero-sum Bias: This is the tendency to believe that there is a fixed amount of success or resources in the world and that if one person gains, another person must lose. For example, we might believe that if one company makes a profit, another company must be losing money.
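
To make one of these biases concrete, here is a small worked example of the Base Rate Fallacy mentioned above. It is only an illustrative sketch: the numbers (a 1-in-1,000 base rate for legitimate unsolicited offers and the two "looks convincing" probabilities) are hypothetical assumptions, not statistics from SCARS or any study. The short Python snippet simply applies Bayes' theorem to those assumed numbers.

    # A hypothetical worked example of the Base Rate Fallacy (all numbers are
    # illustrative assumptions, not real statistics).
    #
    # Suppose only 1 in 1,000 unsolicited "investment opportunities" is legitimate,
    # a legitimate offer looks convincing 90% of the time, and a scam still manages
    # to look convincing 10% of the time. Bayes' theorem then gives the chance that
    # a convincing-looking offer is actually legitimate.

    base_rate_legit = 0.001        # P(legitimate offer)
    p_convincing_if_legit = 0.90   # P(looks convincing | legitimate)
    p_convincing_if_scam = 0.10    # P(looks convincing | scam)

    # Total probability that any given offer looks convincing
    p_convincing = (p_convincing_if_legit * base_rate_legit
                    + p_convincing_if_scam * (1 - base_rate_legit))

    # Bayes' theorem: probability the offer is legitimate given that it looks convincing
    p_legit_given_convincing = (p_convincing_if_legit * base_rate_legit) / p_convincing

    print(f"P(legitimate | looks convincing) = {p_legit_given_convincing:.2%}")
    # Prints about 0.89% -- the convincing presentation barely changes the odds,
    # because the base rate of legitimate unsolicited offers is so low.

Even though the offer looks convincing, the chance that it is legitimate stays below one percent, because the assumed base rate is so low. Judging the offer only by its convincing details, and ignoring that base rate, is exactly how this fallacy misleads people.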

Cognitive Biases Summary

Cognitive biases do make people more vulnerable to scams, fraud, and deception by causing them to ignore warning signs, pay more attention to information that supports their preexisting beliefs, rely on incomplete information, and anchor their decisions to easy and often incorrect information. They also play a role in keeping victims blind to the crime while it is happening, though this is not the victim’s fault.

Of even greater concern is what happens after the scam. These same biases can play a major role in preventing victims from recovering or even seeking the help they need.

By being aware of these biases and making an effort to overcome them, people can be better equipped to avoid falling victim to scams and other forms of deception, and better recover from these crimes after they end.

The Cognitive Bias Codex

Wikipedia’s complete (as of 2016) list of cognitive biases, beautifully arranged, and designed by John Manoogian III (jm3). Categories and descriptions originally by Buster Benson. Used with permission.

Cognitive Biases – Resources, Research & Further Reading

Statement About Victim Blaming

Some of our articles discuss various aspects of victims. This is about better understanding victims (the science of victimology), their behaviors, and their psychology. It helps us to educate victims/survivors about why these crimes happened so that they do not blame themselves, to better develop recovery programs, and to help victims avoid scams in the future. At times this may sound like blaming the victim, but it is not victim blaming; we are simply explaining the hows and whys of the experience victims have.

These articles, about the Psychology of Scams or Victim Psychology – meaning that all humans have psychological or cognitive characteristics in common that can either be exploited or work against us – help us all to understand the unique challenges victims face before, during, and after scams, fraud, or cybercrimes. They sometimes discuss the vulnerabilities that scammers exploit. Victims rarely have control over these vulnerabilities, or are even aware of them, until something like a scam happens; afterward, they can learn how their minds work and how to overcome these mechanisms.

Articles like these help victims and others understand these processes, how to avoid being exploited again, and how to recover more easily by understanding their post-scam behaviors. Learn more about the Psychology of Scams at www.ScamPsychology.org (this website).

Psychology Disclaimer:

All articles about psychology and the human brain on this website are for information & education only.

The information provided in these articles is intended for educational and self-help purposes only and should not be construed as a substitute for professional therapy or counseling.

While any self-help techniques outlined herein may be beneficial for scam victims seeking to move forward in their recovery, it is important to consult with a qualified mental health professional before initiating any course of action. Each individual’s experience and needs are unique, and what works for one person may not be suitable for another.

Additionally, any approach may not be appropriate for individuals with certain pre-existing mental health conditions or trauma histories. It is advisable to seek guidance from a licensed therapist or counselor who can provide personalized support, guidance, and treatment tailored to your specific needs.

If you are experiencing significant distress or emotional difficulties related to a scam or other traumatic event, please consult your doctor or mental health provider for appropriate care and support.

If you are in crisis, feeling desperate, or in despair please call 988 or your local crisis hotline.

-/ 30 /-

What do you think about this?

Please share your thoughts in a comment below!

Opinions

The opinions of the author are not necessarily those of the Society of Citizens Against Relationship Scams Inc. The author is solely responsible for the content of their work. SCARS is protected under the Communications Decency Act (CDA) section 230 from liability.

Disclaimer:

SCARS IS A DIGITAL PUBLISHER AND DOES NOT OFFER HEALTH OR MEDICAL ADVICE, LEGAL ADVICE, FINANCIAL ADVICE, OR SERVICES THAT SCARS IS NOT LICENSED OR REGISTERED TO PERFORM.

IF YOU’RE FACING A MEDICAL EMERGENCY, CALL YOUR LOCAL EMERGENCY SERVICES IMMEDIATELY, OR VISIT THE NEAREST EMERGENCY ROOM OR URGENT CARE CENTER. YOU SHOULD CONSULT YOUR HEALTHCARE PROVIDER BEFORE FOLLOWING ANY MEDICALLY RELATED INFORMATION PRESENTED ON OUR PAGES.

ALWAYS CONSULT A LICENSED ATTORNEY FOR ANY ADVICE REGARDING LEGAL MATTERS.

A LICENSED FINANCIAL OR TAX PROFESSIONAL SHOULD BE CONSULTED BEFORE ACTING ON ANY INFORMATION RELATING TO YOUR PERSONAL FINANCES OR TAX RELATED ISSUES AND INFORMATION.

SCARS IS NOT A PRIVATE INVESTIGATOR – WE DO NOT PROVIDE INVESTIGATIVE SERVICES FOR INDIVIDUALS OR BUSINESSES. ANY INVESTIGATIONS THAT SCARS MAY PERFORM ARE NOT A SERVICE PROVIDED TO THIRD PARTIES. INFORMATION REPORTED TO SCARS MAY BE FORWARDED TO LAW ENFORCEMENT AS SCARS SEES FIT AND APPROPRIATE.

This content and other material contained on the website, apps, newsletter, and products (“Content”), is general in nature and for informational purposes only and does not constitute medical, legal, or financial advice; the Content is not intended to be a substitute for licensed or regulated professional advice. Always consult your doctor or other qualified healthcare provider, lawyer, financial, or tax professional with any questions you may have regarding the educational information contained herein. SCARS makes no guarantees about the efficacy of information described on or in SCARS’ Content. The information contained is subject to change and is not intended to cover all possible situations or effects. SCARS does not recommend or endorse any specific professional or care provider, product, service, or other information that may be mentioned in SCARS’ websites, apps, and Content unless explicitly identified as such.

The disclaimers herein are provided on this page for ease of reference. These disclaimers supplement and are a part of SCARS’ website’s Terms of Use.

Legal Notices: 

All original content is Copyright © 1991 – 2024 Society of Citizens Against Relationship Scams Inc. (Registered D.B.A SCARS) All Rights Reserved Worldwide & Webwide. Third-party copyrights acknowledged.

U.S. State of Florida Registration Nonprofit (Not for Profit) #N20000011978 [SCARS DBA Registered #G20000137918] – Learn more at www.AgainstScams.org

SCARS, SCARS|INTERNATIONAL, SCARS, SCARS|SUPPORT, SCARS, RSN, Romance Scams Now, SCARS|INTERNATION, SCARS|WORLDWIDE, SCARS|GLOBAL, SCARS, Society of Citizens Against Relationship Scams, Society of Citizens Against Romance Scams, SCARS|ANYSCAM, Project Anyscam, Anyscam, SCARS|GOFCH, GOFCH, SCARS|CHINA, SCARS|CDN, SCARS|UK, SCARS|LATINOAMERICA, SCARS|MEMBER, SCARS|VOLUNTEER, SCARS Cybercriminal Data Network, Cobalt Alert, Scam Victims Support Group, SCARS ANGELS, SCARS RANGERS, SCARS MARSHALLS, SCARS PARTNERS, are all trademarks of Society of Citizens Against Relationship Scams Inc., All Rights Reserved Worldwide

Contact the legal department for the Society of Citizens Against Relationship Scams Incorporated by email at legal@AgainstScams.org