Conformity Bias

Rule:
When deciding what to do, look around and see what others most commonly do in this situation and imitate them.
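
As a toy illustration (not part of the original page; the function name and example behaviors are hypothetical), the rule amounts to observing others and imitating the modal behavior:

<code python>
# Toy sketch of the conformity rule: imitate the most common behavior
# observed among others in this situation.
from collections import Counter

def conform(observed_behaviors):
    """Return the behavior most frequently observed among others."""
    if not observed_behaviors:
        raise ValueError("no one to imitate")
    return Counter(observed_behaviors).most_common(1)[0][0]

print(conform(["queue", "queue", "push ahead", "queue"]))  # -> queue
</code>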

Aphorisms:
When in Rome, do as the Romans do.
The nail that sticks up will be hammered down.
These aphorisms don't capture the primary idea, which is that you gain valuable information about “what works” by copying those around you. Instead, they emphasize the social cost of nonconformity.

Conformity bias is a particular interpretation of social Conformity that comes from the Boyd and Richerson theory of Cultural Evolution. Almost everyone who has considered the issue acknowledges that people conform strongly to social behavioral norms. Interestingly, in most academic disciplines the primary emphasis has been on the harmful effects of conformity, both as a constraint on individual freedom and as a pathology of decision making, where Groupthink or Herd Mentality leads to decisions that (in hindsight) were “obviously wrong.” Recently, however, the idea that social decision making can yield superior outcomes has been getting increasing attention, as in Wisdom of the Crowd and Gut Feelings.

From an evolutionary perspective, we would expect such a pervasive decision bias to be strongly adaptive on average; a bias with fairly frequent harmful effects could only persist if its benefits usually outweighed those costs. Cultural Evolution researchers have shown in computer simulations that conformity bias is necessary for cumulative cultural evolution to take place. Conformity bias is the cultural analog of DNA Repair; see Evolutionary Conservation.
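
To make the simulation claim concrete, here is a minimal sketch (not from the cited research; parameter names and values are illustrative) of conformist-biased transmission in the style of Boyd and Richerson's models, where a naive learner adopts a variant at frequency p with probability p + D*p*(1-p)*(2p-1), so the majority variant is over-copied:

<code python>
# Minimal sketch of conformist (frequency-dependent) cultural transmission.
# Two variants, A and B; p is the frequency of A, D the conformity strength
# (D = 0 is unbiased copying, D = 1 is maximal conformity bias).
import random

def adopt_A(p, D):
    """A naive learner adopts variant A with a conformity-biased probability."""
    prob_A = p + D * p * (1 - p) * (2 * p - 1)
    return random.random() < prob_A

def simulate(p0=0.6, D=0.3, pop_size=1000, generations=50):
    """Track the frequency of variant A over generations of social learning."""
    p = p0
    history = [p]
    for _ in range(generations):
        adopters = sum(adopt_A(p, D) for _ in range(pop_size))
        p = adopters / pop_size
        history.append(p)
    return history

if __name__ == "__main__":
    traj = simulate()
    print(f"initial frequency of A: {traj[0]:.2f}, final: {traj[-1]:.2f}")
</code>

With D > 0 the majority variant is driven toward fixation, which is the property that lets a group hold onto an adaptive norm despite noisy individual copying.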

An interesting aspect of conformity bias is how surprising and counterintuitive we find demonstrations of the power of social Conformity, such as the Asch Conformity Experiments, where subjects frequently conformed to the majority view even when it was obviously wrong.

Solomon Asch expected that the majority of people would not conform to something obviously wrong, but the results showed otherwise: only 24% of the participants never conformed on any trial, 75% conformed at least once, and 5% conformed every time (a 37% conformity rate, averaged over subjects across the critical trials).

Other experiments involving authority figures, such as the Stanford Prison Experiment and the Milgram Experiment, have produced even more surprising results. Although these are not directly relevant to conformity bias, which doesn't consider authority, people's responses to the results show that our self-concept is highly inaccurate when it comes to our willingness to conform. A common reaction to seeing people do foolish or reprehensible things under social pressure is to think “I would never do that”, and yet everyone conforms to social influences to a high degree, and most people show astonishing levels of conformity in an experimental setting.

This failure of self-awareness is an example of Intentional Opacity.
