Cognitive dissonance

Cognitive dissonance is a psychological term describing the uncomfortable tension that results from holding two conflicting thoughts at the same time, or from engaging in behavior that conflicts with one’s beliefs. The term also covers the behaviors people use to resolve or override that tension; the entire process is studied under the rubric of “cognitive dissonance”. Some examples of cognitive dissonance include:

  • After feeling “buyer’s remorse” and being unable to return something, convincing yourself that “no, it really was a good purchase”
  • Justifying the work you do as important to your community or yourself (effort justification)
  • Believing that failures (bad grades on a test, not getting a promotion, getting into an accident) must be someone else’s fault, not yours
  • Insisting that anyone in competition with you is “bad,” “evil,” or “out to get you”

The concept of cognitive dissonance was developed and tested by observing doomsday cults and how they reacted when their beliefs (in the end of the world) were shattered (by the world simply not ending), first and most famously in Leon Festinger, Henry Riecken, and Stanley Schachter’s When Prophecy Fails.[1] Because the sensation of dissonance is very unpleasant, most people tend to resolve it by adjusting their knowledge, beliefs, behaviors, and perceptions until they are mutually consistent. This sounds logical, but there is a catch: the resolution usually follows the path of least psychological resistance. For example, when the cult’s prophecies were proved wrong, the followers’ faith did not diminish; on the contrary, it strengthened, because it is much easier to dismiss pieces of evidence as “false”, make an excuse, and keep on believing than to change a belief that has grown to be an individual’s entire soul, fiber, and character.

Most people will (eventually) change their beliefs on a subject after enough contradictory evidence emerges: sometimes the evidence is so solid and undeniable that it is easier to give up a complex worldview than to constantly generate excuses for why the evidence is false. Other individuals, especially those with support networks reinforcing a delusion or worldview, will go to such great lengths to rationalize away dissenting ideas that, past a certain point, an admission of error would collapse an entire web of mutually supporting beliefs. This would leave the brain unable to do its work, as everything it thought it knew would now be useless, resulting in agony, an extreme fear of death, and the activation of emergency self-protection mechanisms. Those mechanisms push the individual into either an introverted reaction, with all-encompassing ignorance and a cutting-off of contact with the conflicting parts of the real world (see: Zen), or an extroverted reaction of trying to attack and destroy the sources of the conflicting information as heresy (typical of the Abrahamic religions).

By comparison, a person who actually went through such a collapse without that protection would end up completely unable to accept themselves or to choose the “right” action in even the simplest situations, making it impossible for them to continue living. So the protective reactions described above are still the best (because they mean survival) of three really bad choices.
