The implicit model of belief that most scientists and programmers hold is roughly Bayesian: when someone receives new evidence about the world, they update their beliefs toward whatever best fits that evidence. This model is (mostly) accurate in domains people aren’t emotionally invested in, but it fails in predictable ways for beliefs tied to group membership. Research in social psychology has established that beliefs about contested political and social issues function primarily as signals of group identity rather than as conclusions from evidence. Holding the wrong belief does not just mean being misinformed: it puts you outside the group, so updating the belief means leaving the group. The social cost of updating is therefore often higher than the mental cost of the cognitive dissonance incurred by staying wrong. The backfire effect is the most striking manifestation of this pattern: in a range of experimental settings, presenting people with accurate evidence against an identity-laden belief has strengthened that belief rather than weakened it.
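
To make the Bayesian baseline concrete, here is a minimal sketch of a single belief update via Bayes’ rule. It assumes one binary hypothesis and one piece of evidence; the function name and the numbers are purely illustrative and not drawn from any study discussed here.

```python
# A minimal sketch of the "roughly Bayesian" picture of belief: one binary
# hypothesis H, a prior degree of belief, and a single piece of evidence E.
# The probabilities below are illustrative placeholders.

def bayes_update(prior: float, p_e_given_h: float, p_e_given_not_h: float) -> float:
    """Return P(H | E) from P(H), P(E | H), and P(E | not H) via Bayes' rule."""
    p_e = p_e_given_h * prior + p_e_given_not_h * (1 - prior)
    return p_e_given_h * prior / p_e

# Start fairly sceptical of H (prior 0.2); observe evidence that is four times
# as likely if H is true as if it is false.
posterior = bayes_update(prior=0.2, p_e_given_h=0.8, p_e_given_not_h=0.2)
print(f"{posterior:.2f}")  # 0.50 -- the belief moves toward the evidence
```

On this picture the update depends only on the prior and the likelihoods; the point of the rest of this post is that, for identity-laden beliefs, the social cost term that Bayes’ rule ignores often dominates.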