Three Cognitive Biases That Allow Bad Ideas to Grow

In 1974, psychologists Daniel Kahneman and Amos Tversky published an academic paper titled “Judgment Under Uncertainty: Heuristics and Biases”. If you ever need a counterexample to the argument that good ideas won’t succeed without good branding, this is it. Despite the paper’s decidedly unsexy title, Kahneman and Tversky essentially launched a new field with its publication: the study of cognitive biases. Through a series of ingenious experiments, they uncovered a constellation of hidden weaknesses in human judgment that lead us away from rational decision-making.

Cognitive biases are distinct from miscalculations and other errors resulting from misinformation. Mistakes caused by misinformation can be corrected simply by providing more accurate information, but cognitive biases are “hardwired” into the brain, making them difficult to change and impervious to correction, because the mind’s misinterpretation of information is precisely the problem. Kahneman and Tversky’s historic collaboration has since been chronicled in several books, including Kahneman’s Thinking, Fast and Slow; Dan Ariely’s Predictably Irrational; and Michael Lewis’s The Undoing Project, and some of the cognitive biases they studied have even found their way into the cultural lexicon. One of them is called confirmation bias, and it helps explain why innocent but avoidable false positives occur so frequently.

In the most basic sense, confirmation bias prevents us from seeing possibilities that might challenge our assumptions, and it causes us to collect, interpret, and recall information in a way that is consistent with our existing beliefs. The reason we have this trap in our thinking is that when information is presented to us, our brains are already filled with vast amounts of previously acquired information, social context, and history that project meaning onto the new information. Because we have limited intellectual capacity to process all of this, we use mental shortcuts to make quick decisions, often at the gut level. One of those mental shortcuts is essentially filtering out or ignoring information that doesn’t match our expectations or assumptions. Indeed, science has taught us that reconciling conflicting new information requires more mental energy than processing new information consistent with what is already in our heads, and our brain prefers the easier route.

This tendency may seem contrary to our own interests, but in the context of the long Darwinian history of our species, confirmation bias makes perfect sense. Our brain has evolved to reduce uncertainty and streamline our responses. For our ancestors, a shadow could have meant a predator, so if they assumed it was one and started running, that assumption could save their lives. If they had stopped to gather more information and really think about it, they might have ended up as dinner.

While confirmation bias was useful to our species in the distant past and remains useful in certain scenarios, for endeavors that require careful analysis and slow deliberation, such as testing an innovative idea we hope to bring to scale, it can be a liability. It can hamper creativity and critical thinking, which are the pillars of innovation and quality work. It can cause doctors to make lazy diagnoses and pursue the wrong treatment. It can lead policy makers, business leaders, administrators, and investors to devote massive amounts of resources to the wrong initiative or company. And when it comes to interpreting information, whether in business or science, it can produce false positives.

British psychologist Peter Wason’s classic rule-discovery test from the 1960s illustrates confirmation bias in action. He gave subjects three numbers and asked them to figure out the rule that governed the selection of those numbers. Given the sequence 2, 4, 6, for example, they usually formed the hypothesis that the rule was even numbers. The participants then proposed other sequences of even numbers, and the researcher told them whether or not each sequence conformed to the rule. Through this process, the participants were to determine whether their hypothesis was correct. After several correct tries, the participants thought they had discovered the rule. But in fact they hadn’t, because the actual rule was much simpler: the numbers just had to increase.

The most interesting aspect of this study (and many others like it) is that almost all of the subjects tested only number sequences that conformed to their personal hypothesis, and very few tried a sequence that could refute it. Wason’s experiment demonstrated that most people, no matter how smart, fail to examine their assumptions critically. Instead, they only try to confirm them by “thinking fast,” relying on heuristics or mental shortcuts.
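To see why confirming tests alone can never expose a wrong hypothesis, here is a minimal sketch of the 2, 4, 6 task in Python. The function names `fits_rule` and `my_hypothesis` are illustrative stand-ins for Wason’s experimenter and a participant’s guess, not anything from the original study.

```python
# Wason's 2-4-6 task: the experimenter's hidden rule is simply
# "the three numbers are in increasing order".
def fits_rule(triple):
    a, b, c = triple
    return a < b < c

# A participant's typical hypothesis after seeing (2, 4, 6):
# "the rule is even numbers".
def my_hypothesis(triple):
    return all(n % 2 == 0 for n in triple)

# Confirmation-style testing: only propose triples the hypothesis predicts
# will pass. Every one of them also fits the real rule, so the feedback
# "yes, that conforms" never distinguishes the two rules.
confirming_tests = [(8, 10, 12), (20, 22, 24), (100, 102, 104)]
for t in confirming_tests:
    print(t, "conforms:", fits_rule(t))          # all True; the hypothesis looks right

# Disconfirmation-style testing: propose a triple the hypothesis predicts
# will fail. If it conforms anyway, the hypothesis is refuted in one step.
probe = (1, 3, 5)                                # odd numbers, but still increasing
print(probe, "conforms:", fits_rule(probe))      # True
print("hypothesis predicted:", my_hypothesis(probe))  # False -> the mismatch exposes the error
```

The asymmetry is the whole lesson: a confirming test can only ever say “so far, so good,” while a single well-chosen disconfirming test can reveal that the assumed rule is wrong.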

Another mental shortcut with a knack for producing false positives is bandwagon bias. Also known as “herding” or “cascades,” the bandwagon effect arises from social influences on our mental processes. Like confirmation bias, bandwagon bias interferes with our ability to recall and evaluate information accurately. But in this case, we are under the unconscious influence of the opinions and behaviors of others: the social side of decision-making. In 1951, pioneering social psychologist Solomon Asch devised a now famous lab experiment that helps us understand this type of groupthink. He recruited students to participate in what they thought was a vision experiment. Each student joined several other supposed participants, who were actually experimental confederates, people working with the researcher while posing as participants, in a classroom.

Everyone in the room was shown an image with three lines of varying lengths, in which one line was very obviously longer than the others. Each person was asked to say out loud which line was the longest. After the confederates, who answered first, all identified the wrong line, more than a third of the participants, on average, went along with the clearly incorrect answer; across twelve trials, a whopping 75% accepted the obviously wrong answer at least once. In contrast, when no confederates were present to entice them to jump on the bandwagon, virtually all subjects chose the correct answer, demonstrating how easily our independent judgment can be subsumed by our desire to “fit in” or be “one of the pack.” Not only does this deal a disturbing blow to our self-image as free-thinking individuals, but it also has troubling implications for the science of scaling.

If you look at the bandwagon effect from the perspective of marketers, whose mandate is to create demand for products at scale, this quirk of the human mind is a godsend: the desire for conformity that animates so many of our thoughts and actions can be converted into dollar signs. Indeed, plenty of research shows how the bandwagon effect shapes consumer choices, like the clothes we buy (ever wonder why different colors and styles are in fashion every year?), the toys kids beg their parents for (remember Tickle Me Elmo? For your sake, I hope not), and the sports teams we support and buy apparel from (the best-selling basketball jerseys in the United States have historically corresponded to the star players of the teams that reach the NBA Finals in a given year). The bandwagon effect, or social contagion, as it is sometimes called, can even influence our political orientations, and therefore electoral results. While all of this is well and good for marketers and strategists hired to nudge people toward certain choices over others, for those creating and launching innovations to benefit society, it can create false positives and lead to the scaling of bad ideas.
