Cognitive bias

The notion of cognitive biases was introduced by Amos Tversky and Daniel Kahneman in 1972 and grew out of their experience of people’s innumeracy, or inability to reason intuitively with the greater orders of magnitude. They and their colleagues demonstrated several replicable ways in which human judgments and decisions differ from rational choice theory. They explained these differences in terms of heuristics: rules that are simple for the brain to compute but introduce systematic errors. For instance, the availability heuristic, in which the ease with which something comes to mind is used to indicate how often (or how recently) it has been encountered.

Cognitive bias is any of a wide range of observer effects identified in cognitive science, including very basic statistical and memory errors that are common to all human beings (many of which have been discussed by Amos Tversky and Daniel Kahneman) and that drastically skew the reliability of anecdotal and legal evidence. Such biases also significantly affect the scientific method, which is deliberately designed to minimize bias from any one observer.

A cognitive bias is a pattern of deviation in judgment that occurs in particular situations, which may sometimes lead to perceptual distortion, inaccurate judgment, illogical interpretation, or what is broadly called irrationality.[1][2][3] Implicit in the concept of a “pattern of deviation” is a standard of comparison with what is normatively expected; this may be the judgment of people outside those particular situations, or may be a set of independently verifiable facts. A continually evolving list of cognitive biases has been identified over the last six decades of research on human judgment and decision-making in cognitive science, social psychology, and behavioral economics.

Some cognitive biases are presumably adaptive, for example, because they lead to more effective actions in a given context or enable faster decisions when timeliness is more valuable than accuracy (heuristics). Others presumably result from a lack of appropriate mental mechanisms (bounded rationality), or simply from a limited capacity for information processing.

Biases can be distinguished on a number of dimensions. For example, there are biases specific to groups (such as the risky shift) as well as biases at the individual level.

Some biases affect decision-making, where the desirability of options has to be considered (e.g., the sunk cost fallacy). Others, such as illusory correlation, affect judgments of how likely something is, or of whether one thing is the cause of another. A distinctive class of biases affects memory,[12] such as consistency bias (remembering one’s past attitudes and behavior as more similar to one’s present attitudes).

Some biases reflect a subject’s motivation,[13] for example, the desire for a positive self-image leading to egocentric bias[14] and the avoidance of unpleasant cognitive dissonance. Other biases are due to the particular way the brain perceives, forms memories, and makes judgments. This distinction is sometimes described as “hot cognition” versus “cold cognition”, as motivated reasoning can involve a state of arousal.

Among the “cold” biases, some are due to ignoring relevant information (e.g., neglect of probability), whereas others involve a decision or judgment being affected by irrelevant information (for example, the framing effect, where the same problem receives different responses depending on how it is described) or by giving excessive weight to an unimportant but salient feature of the problem (e.g., anchoring).

The fact that some biases reflect motivation, and in particular the motivation to have positive attitudes toward oneself,[14] accounts for the fact that many biases are self-serving or self-directed (e.g., illusion of asymmetric insight, self-serving bias, projection bias). There are also biases in how subjects evaluate in-groups or out-groups, evaluating in-groups as more diverse and “better” in many respects, even when those groups are arbitrarily defined (ingroup bias, outgroup homogeneity bias).

Some cognitive biases belong to the subgroup of attentional biases, which refer to paying increased attention to certain stimuli. It has been shown, for example, that people addicted to alcohol and other drugs pay more attention to drug-related stimuli. Common psychological tests to measure those biases are the Stroop Task[15][16] and the Dot Probe Task.
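For readers who like to see the arithmetic, here is a minimal sketch of how an attentional bias score is commonly computed from dot-probe data (the reaction times below are made up for illustration): responses tend to be faster when the probe replaces the stimulus a person was already attending to, so the difference between the two mean reaction times serves as the bias index.

```python
# Illustrative sketch with hypothetical data -- not from the original article.
# In the dot-probe task, a probe replaces either a salient cue (e.g., a
# drug-related image) or a neutral image; faster responses when the probe
# replaces the salient cue suggest attention was already directed there.
from statistics import mean

# Hypothetical reaction times (milliseconds) for one participant.
rt_probe_at_cue = [412, 398, 405, 420, 401]      # probe appeared where the drug cue was
rt_probe_at_neutral = [455, 447, 462, 450, 458]  # probe appeared where the neutral image was

# A common bias index: mean RT(neutral location) - mean RT(cue location).
# A positive value indicates attention was drawn toward the cue.
bias_index = mean(rt_probe_at_neutral) - mean(rt_probe_at_cue)
print(f"Attentional bias index: {bias_index:.1f} ms")  # positive => bias toward drug cues
```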

The following is a list of the more commonly studied cognitive biases:

For other noted biases, see List of cognitive biases.
  • Framing is the use of a too-narrow approach and description of the situation or issue.
  • Hindsight bias, sometimes called the “I-knew-it-all-along” effect, is the inclination to see past events as being predictable.
  • Fundamental attribution error is the tendency for people to over-emphasize personality-based explanations for behaviors observed in others while under-emphasizing the role and power of situational influences on the same behavior.
  • Confirmation bias is the tendency to search for or interpret information in a way that confirms one’s preconceptions; this is related to the concept of cognitive dissonance.
  • Self-serving bias is the tendency to claim more responsibility for successes than failures. It may also manifest itself as a tendency for people to evaluate ambiguous information in a way beneficial to their interests.
  • Belief bias occurs when one’s evaluation of the logical strength of an argument is biased by one’s belief in the truth or falsity of the conclusion.

A 2012 Psychological Bulletin article suggests that at least eight seemingly unrelated biases can be produced by the same information-theoretic generative mechanism.[17] It is shown that noisy deviations in the memory-based information processes that convert objective evidence (observations) into subjective estimates (decisions) can produce regressive conservatism, Bayesian conservatism (belief revision), illusory correlations, the better-than-average and worse-than-average effects, the subadditivity effect, exaggerated expectation, overconfidence, and the hard–easy effect.
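To make the mechanism concrete, here is a minimal sketch (my own illustration, not the article’s actual model) of how unbiased noise in a bounded estimate can produce the regressive, “conservative” pattern described above: because a probability estimate must stay between 0 and 1, noise near the extremes can only push it back toward the middle.

```python
# Minimal illustration: unbiased Gaussian noise plus a hard [0, 1] bound yields
# regressive estimates -- low probabilities are overestimated and high ones
# underestimated. The noise level and trial count are arbitrary assumptions.
import random

random.seed(0)

def subjective_estimate(true_p, noise_sd=0.15, n_trials=10000):
    """Average of noisy, bounded readouts of a true probability."""
    total = 0.0
    for _ in range(n_trials):
        noisy = true_p + random.gauss(0.0, noise_sd)
        total += min(1.0, max(0.0, noisy))  # estimates must stay in [0, 1]
    return total / n_trials

for true_p in (0.05, 0.25, 0.50, 0.75, 0.95):
    print(f"true = {true_p:.2f}  mean estimate = {subjective_estimate(true_p):.2f}")
# Extreme values drift toward 0.5 -- the regressive pattern behind conservatism
# and the hard-easy effect.
```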

However, as recent research has demonstrated, even scientists who adhere to the scientific method cannot guarantee they will draw the best possible conclusions. How could such highly educated and precisely trained professionals veer off the path of objectivity? The answer is simple: being human.

As the fields of psychology and behavioral economics have demonstrated, Homo sapiens is a seemingly irrational species that appears to, more often than not, think and behave in nonsensical rather than commonsensical ways. The reason is that we fall victim to a veritable laundry list of cognitive biases that cause us to engage in distorted, imprecise and incomplete thinking, which, not surprisingly, results in “perceptual distortion, inaccurate judgment or illogical interpretation” (thanks Wikipedia), and, by extension, poor and sometimes catastrophic decisions.

Well-known examples of the results of cognitive biases include the Internet, the housing and financial crises of the past decade, truly stupid use of social media by politicians, celebrities and professional athletes, the existence of the $2.5 billion self-help industry, and, well, believing that a change in the controlling party in Washington will somehow change its toxic political culture.

What is interesting is that many of these cognitive biases must have had, at some point in our evolution, adaptive value. These distortions helped us to process information more quickly (e.g., stalking prey in the jungle), meet our most basic needs (e.g., finding mates) and connect with others (e.g., being part of a “tribe”).

The biases that helped us survive in primitive times, when life was much simpler (e.g., life goal: live through the day) and the speed of a decision rightfully trumped its absolute accuracy, don’t appear to be quite as adaptive in today’s much more complex world. Due to the complicated nature of life these days, correctness of information, thoroughness of processing, precision of interpretation and soundness of judgment are, in most situations, far more important than the simplest and fastest route to a judgment.

Unfortunately, there is no magic pill that will inoculate us against these cognitive biases. But we can reduce their power over us by understanding these distortions, looking for them in our own thinking and making an effort to counter their influence over us as we draw conclusions, make choices and come to decisions. In other words, just knowing and considering these universal biases (in truth, what most people call common sense is actually common bias) will make us less likely to fall victim to them.

Here are some of the most widespread cognitive biases that contaminate our ability to use common sense:

  • The bandwagon effect (aka herd mentality) describes the tendency to think or act in certain ways because other people do. Examples include the popularity of Apple products, use of “in-group” slang and clothing styles and watching “The Real Housewives of … ” reality-TV franchise.
  • The confirmation bias involves the inclination to seek out information that supports our own preconceived notions. The reality is that most people don’t like to be wrong, so they surround themselves with people and information that confirm their beliefs. The most obvious example these days is the tendency to follow news outlets that reinforce our political beliefs.
  • Illusion of control is the propensity to believe that we have more control over a situation than we actually do. If we don’t actually have control, we fool ourselves into thinking we do. Examples include rally caps in sports and “lucky” items.
  • The Semmelweis reflex (just had to include this one because of its name) is the predisposition to deny new information that challenges our established views. Sort of the yang to the yin of the confirmation bias, it exemplifies the adage “if the facts don’t fit the theory, throw out the facts.” An example is the “Seinfeld” episode in which George Costanza’s girlfriend simply refuses to allow him to break up with her.
  • The causation bias describes the tendency to assume a cause-and-effect relationship in situations where none exists (or where there is merely a correlation or association). An example is believing someone is angry with you because they haven’t responded to your email when, more likely, they are busy and just haven’t gotten to it yet.
  • The overconfidence effect involves unwarranted confidence in one’s own knowledge. Examples include political and sports prognosticators.
  • The false consensus effect is the penchant to believe that others agree with you more than they actually do. Examples include guys who assume that all guys like sexist humor.
  • Finally, the granddaddy of all cognitive biases is the fundamental attribution error, which involves the tendency to attribute other people’s behavior to their personalities and our own behavior to the situation. An example: when someone treats you poorly, you probably assume they are a jerk, but when you’re not nice to someone, it’s because you are having a bad day.


Many of these biases are studied for how they affect belief formation, business decisions, and scientific research.

  • Bandwagon effect — the tendency to do (or believe) things because many other people do (or believe) the same. Related to groupthink, crowd psychology, herd behaviour, and manias.
  • Bias blind spot — the tendency not to compensate for one’s own cognitive biases.
  • Choice-supportive bias — the tendency to remember one’s choices as better than they actually were.
  • Confirmation bias — the tendency to search for or interpret information in a way that confirms one’s preconceptions.
  • Congruence bias — the tendency to test hypotheses exclusively through direct testing, in contrast to tests of possible alternative hypotheses.
  • Contrast effect — the enhancement or diminishment of a weight or other measurement when compared with a recently observed contrasting object.
  • Déformation professionnelle — the tendency to look at things according to the conventions of one’s own profession, forgetting any broader point of view.
  • Endowment effect — “the fact that people often demand much more to give up an object than they would be willing to pay to acquire it”.[2]
  • Exposure-suspicion bias — knowledge of a subject’s disease in a medical study may influence the search for causes.
  • Extreme aversion — most people will go to great lengths to avoid extremes. People are more likely to choose an option if it is the intermediate choice.
  • Focusing effect — prediction bias occurring when people place too much importance on one aspect of an event; causes error in accurately predicting the utility of a future outcome.
  • Framing – drawing different conclusions from the same information, depending on how that information is presented.
  • Hyperbolic discounting — the tendency for people to have a stronger preference for more immediate payoffs relative to later payoffs, the closer to the present both payoffs are (a small numerical illustration appears after this list).
  • Illusion of control — the tendency for human beings to believe they can control or at least influence outcomes that they clearly cannot.
  • Impact bias — the tendency for people to overestimate the length or the intensity of the impact of future feeling states.
  • Information bias — the tendency to seek information even when it cannot affect action.
  • Irrational escalation — the tendency to make irrational decisions based upon rational decisions in the past or to justify actions already taken.
  • Loss aversion — “the disutility of giving up an object is greater than the utility associated with acquiring it”.[3] (see also sunk cost effects and Endowment effect).
  • Neglect of probability — the tendency to completely disregard probability when making a decision under uncertainty.
  • Mere exposure effect — the tendency for people to express undue liking for things merely because they are familiar with them.
  • Obsequiousness bias — the tendency to systematically alter responses in the direction the respondent perceives to be desired by the investigator.
  • Omission bias — the tendency to judge harmful actions as worse, or less moral, than equally harmful omissions (inactions).
  • Outcome bias — the tendency to judge a decision by its eventual outcome instead of based on the quality of the decision at the time it was made.
  • Planning fallacy — the tendency to underestimate task-completion times. Also formulated as Hofstadter’s Law: “It always takes longer than you expect, even when you take into account Hofstadter’s Law.”
  • Post-purchase rationalization — the tendency to persuade oneself through rational argument that a purchase was a good value.
  • Pseudocertainty effect — the tendency to make risk-averse choices if the expected outcome is positive, but make risk-seeking choices to avoid negative outcomes.
  • Reactance – the urge to do the opposite of what someone wants you to do out of a need to resist a perceived attempt to constrain your freedom of choice.
  • Selective perception — the tendency for expectations to affect perception.
  • Status quo bias — the tendency for people to like things to stay relatively the same (see also Loss aversion and Endowment effect).[4]
  • Unacceptability bias – questions that may embarrass or invade privacy are refused or evaded.
  • Unit bias — the tendency to want to finish a given unit of a task or an item; it has strong effects on the consumption of food in particular.
  • Von Restorff effect — the tendency for an item that “stands out like a sore thumb” to be more likely to be remembered than other items.
  • Zero-risk bias — the preference for reducing a small risk to zero over a greater reduction in a larger risk. It is relevant e.g. to the allocation of public health resources and the debate about nuclear power.
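As promised under hyperbolic discounting above, here is a small numerical illustration using the standard hyperbolic form V = A / (1 + kD); the dollar amounts and the discount rate k are my own assumptions. Adding the same delay to both options can flip the preference, something a consistent exponential discounter would never do.

```python
# Illustrative sketch: hyperbolic discounting predicts preference reversals.
# The amounts and the discount rate k below are arbitrary assumptions.

def hyperbolic_value(amount, delay_years, k=2.0):
    """Subjective present value under hyperbolic discounting: V = A / (1 + k * D)."""
    return amount / (1.0 + k * delay_years)

# Choice 1: $50 now versus $100 in one year -- the smaller, sooner reward wins.
print(hyperbolic_value(50, 0), hyperbolic_value(100, 1))  # 50.0 vs ~33.3

# Choice 2: the same pair pushed five years into the future -- now the larger,
# later reward wins, even though only a common delay was added to both options.
print(hyperbolic_value(50, 5), hyperbolic_value(100, 6))  # ~4.5 vs ~7.7
```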

Many of these biases are studied for how they affect business and economic decisions and how they affect experimental research.

  • Ambiguity effect — the avoidance of options for which missing information makes the probability seem “unknown”.
  • Anchoring — the tendency to rely too heavily, or “anchor,” on a past reference or on one trait or piece of information when making decisions.
  • Anthropic bias — the tendency for one’s evidence to be biased by observation selection effects.
  • Attentional bias — neglect of relevant data when making judgments of a correlation or association.
  • Availability heuristic — a biased prediction, due to the tendency to focus on the most salient and emotionally-charged outcome.
  • Clustering illusion — the tendency to see patterns where actually none exist.
  • Conjunction fallacy — the tendency to assume that specific conditions are more probable than general ones.
  • Gambler’s fallacy — the tendency to assume that individual random events are influenced by previous random events. For example, “I’ve flipped heads with this coin five times consecutively, so the chance of tails coming out on the sixth flip is much greater than heads.” (A short simulation of this example appears after this list.)
  • Hindsight bias — sometimes called the “I-knew-it-all-along” effect: the inclination to see past events as being predictable, based on knowledge of later events.
  • Hostile media effect — the tendency to perceive news coverage as biased against your position on an issue.
  • Illusory correlation — beliefs that inaccurately suppose a relationship between a certain type of action and an effect.
  • Ludic fallacy — the analysis of chance-related problems within the narrow frame of games, ignoring the complexity of reality and the non-Gaussian distribution of many real-world quantities.
  • Neglect of prior base rates effect — the tendency to fail to incorporate prior known probabilities which are pertinent to the decision at hand.
  • Observer-expectancy effect — when a researcher expects a given result and therefore unconsciously manipulates an experiment or misinterprets data in order to find it (see also subject-expectancy effect).
  • Optimism bias — the systematic tendency to be over-optimistic about the outcome of planned actions. It has been found to be linked to the left inferior frontal gyrus, and disrupting this region of the brain removes the bias.
  • Overconfidence effect — the tendency to overestimate one’s own abilities.
  • Positive outcome bias — a tendency in prediction to overestimate the probability of good things happening to oneself (see also wishful thinking, optimism bias and valence effect).
  • Primacy effect — the tendency to weigh initial events more than subsequent events.
  • Recency effect — the tendency to weigh recent events more than earlier events (see also ‘peak-end rule’).
  • Reminiscence bump — the effect that people tend to recall more personal events from adolescence and early adulthood than from other lifetime periods.
  • Rosy retrospection — the tendency to rate past events more positively than one actually rated them when they occurred.
  • Subadditivity effect — the tendency to judge probability of the whole to be less than the probabilities of the parts.
  • Telescoping effect — the effect that recent events appear to have occurred more remotely and remote events appear to have occurred more recently.
  • Texas sharpshooter fallacy — the fallacy of selecting or adjusting a hypothesis after the data are collected, making it impossible to test the hypothesis fairly.
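As noted under the gambler’s fallacy above, here is a quick simulation of the coin-flip example (a sketch of my own, not from the source): even after five consecutive heads, the next flip still comes up heads about half the time, because independent flips carry no memory of earlier outcomes.

```python
# Quick check of the gambler's fallacy example: condition on five heads in a
# row and look at the sixth flip. The sample size is an arbitrary choice.
import random

random.seed(42)

sixth_flips = []
while len(sixth_flips) < 5000:
    flips = [random.random() < 0.5 for _ in range(6)]  # True = heads
    if all(flips[:5]):            # keep only sequences that start with 5 heads
        sixth_flips.append(flips[5])

print(sum(sixth_flips) / len(sixth_flips))  # prints roughly 0.5, not something lower
```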

Most of these biases are labeled as attributional biases.

  • Actor-observer bias — the tendency for explanations of other individuals’ behaviors to overemphasize the influence of their personality and underemphasize the influence of their situation. This is coupled with the opposite tendency for the self: explanations of one’s own behaviors overemphasize the influence of the situation and underemphasize the influence of one’s personality (see also fundamental attribution error).
  • Dunning-Kruger effect — “…when people are incompetent in the strategies they adopt to achieve success and satisfaction, they suffer a dual burden: Not only do they reach erroneous conclusions and make unfortunate choices, but their incompetence robs them of the ability to realize it. Instead, …they are left with the mistaken impression that they are doing just fine.”[5] (See also the Lake Wobegon effect, and overconfidence effect).
  • Egocentric bias — occurs when people claim more responsibility for themselves for the results of a joint action than an outside observer would.
  • Forer effect (aka Barnum effect) — the tendency to give high accuracy ratings to descriptions of one’s personality that supposedly are tailored specifically for oneself, but are in fact vague and general enough to apply to a wide range of people. For example, horoscopes.
  • False consensus effect — the tendency for people to overestimate the degree to which others agree with them.
  • Fundamental attribution error — the tendency for people to over-emphasize personality-based explanations for behaviors observed in others while under-emphasizing the role and power of situational influences on the same behavior (see also actor-observer bias, group attribution error, positivity effect, and negativity effect).
  • Halo effect — the tendency for a person’s positive or negative traits to “spill over” from one area of their personality to another in others’ perceptions of them (see also physical attractiveness stereotype).
  • Herd instinct – a common tendency to adopt the opinions and follow the behaviors of the majority to feel safer and to avoid conflict.
  • Illusion of asymmetric insight — people perceive their knowledge of their peers to surpass their peers’ knowledge of them.
  • Illusion of transparency — people overestimate others’ ability to know them, and they also overestimate their ability to know others.
  • Ingroup bias — the tendency for people to give preferential treatment to others they perceive to be members of their own groups.
  • Just-world phenomenon — the tendency for people to believe that the world is “just” and therefore people “get what they deserve.”
  • Lake Wobegon effect — the human tendency to report flattering beliefs about oneself and believe that one is above average (see also worse-than-average effect, and overconfidence effect).
  • Notational bias — a form of cultural bias in which a notation induces the appearance of a nonexistent natural law.
  • Outgroup homogeneity bias — individuals see members of their own group as being relatively more varied than members of other groups.
  • Projection bias — the tendency to unconsciously assume that others share the same or similar thoughts, beliefs, values, or positions.
  • Self-serving bias — the tendency to claim more responsibility for successes than failures. It may also manifest itself as a tendency for people to evaluate ambiguous information in a way beneficial to their interests (see also group-serving bias).
  • Self-fulfilling prophecy — the tendency to engage in behaviors that elicit results which will (consciously or subconsciously) confirm our beliefs.
  • System justification — the tendency to defend and bolster the status quo, i.e. existing social, economic, and political arrangements tend to be preferred, and alternatives disparaged sometimes even at the expense of individual and collective self-interest.
  • Trait ascription bias — the tendency for people to view themselves as relatively variable in terms of personality, behavior and mood while viewing others as much more predictable.
  • Beneffectance – perceiving oneself as responsible for desirable outcomes but not responsible for undesirable ones. (Term coined by Greenwald (1980))
  • Consistency bias – incorrectly remembering one’s past attitudes and behaviour as resembling present attitudes and behaviour.
  • Cryptomnesia – a form of misattribution where a memory is mistaken for imagination.
  • Egocentric bias – recalling the past in a self-serving manner, e.g. remembering one’s exam grades as being better than they were, or remembering a caught fish as being bigger than it was.
  • Confabulation or false memory – remembering something that never actually happened.
  • Hindsight bias – filtering memory of past events through present knowledge, so that those events look more predictable than they actually were; also known as the ‘I-knew-it-all-along effect’.
  • Selective memory and selective reporting
  • Suggestibility – a form of misattribution where ideas suggested by a questioner are mistaken for memory. Often a key aspect of hypnotherapy.
