Whenever intelligence is discussed, the IQ, or intelligence quotient, is bound to come up at some point. It was devised as an attempt to measure general intelligence. One of the main criticisms of early IQ tests was that their results were hard to reproduce and, more generally, that the specific abilities being tested are not necessarily the only relevant factors in determining general intelligence. A number of decades and psychological theories on intelligence later, the Cattell–Horn–Carroll theory was devised \cite{keith2010cattell}. The theory presents ten broad abilities, which are in turn subdivided into seventy narrower abilities. Most importantly, two of these broad abilities are fluid intelligence and crystallised intelligence. Originally identified by Raymond Cattell, fluid intelligence refers to the ability to reason about and solve novel problems, and it tends to decrease with age. Crystallised intelligence (also a term coined by Cattell), on the other hand, is based on one's acquired knowledge and generally increases with age. Together, they explain most aspects of intelligence rather well \cite{schneider2012cattell}.

Yet how is it, then, that intelligent people with university-level education and towering IQs still fall prey to cognitive biases and logical fallacies, and remain prone to errors in reasoning and judgement? \citeA{stanovich2016rationality} proposed that rationality is not measured at all by traditional intelligence tests and is something different entirely. This essay explores how our biological mechanisms tend to override our rational thinking, regardless of one's perceived intelligence. Is there a way to detect and overcome these flaws in thought and prevent people from suffering their consequences?

Findings

Let us take the conjunction fallacy, best illustrated by the so-called Linda problem devised by \citeA{tversky1983extensional}.

Linda is 31 years old, single, outspoken, and very bright. She majored in philosophy. As a student, she was deeply concerned with issues of discrimination and social justice, and also participated in anti-nuclear demonstrations.

  1. Which is more probable?

    (a) Linda is a bank teller.
    (b) Linda is a bank teller and is active in the feminist movement.

Logically speaking, the chance that A and B are both true can never exceed the chance that A is true. Unless Linda is certain to be a feminist given that she is a bank teller, the first statement is therefore strictly more probable. Yet \citeA{tversky1983extensional} found that the vast majority of participants did, in fact, pick the second statement. They argue that the representativeness heuristic, a term they coined, is responsible for this outcome. A heuristic is, in general, a fast way of thinking that saves cognitive resources. In most circumstances heuristics achieve a sufficient level of accuracy, but, as the Linda problem demonstrates, they also frequently cloud our rational thinking. In a sense, they are a biologically hard-wired feature that tends to overrule our logical thinking, as \citeA{stanovich2009decision} puts it. In a day and age where \emph{fake news} and misinformation increasingly influence global politics, the ability to set aside our intuitive heuristics and spot cognitive biases and logical fallacies in reasoning and decision making is more important than ever.
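To make the reasoning explicit (the symbols below are merely shorthand for the two statements in the problem, with $A$ standing for ``Linda is a bank teller'' and $B$ for ``Linda is active in the feminist movement''), the conjunction rule follows directly from the product rule of probability:
\[
P(A \land B) \;=\; P(A)\,P(B \mid A) \;\le\; P(A),
\]
with strict inequality whenever $P(B \mid A) < 1$, that is, whenever Linda is not certain to be a feminist given that she is a bank teller.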

This raises the question of whether rational, critical thinking can be learned in such a way that one no longer falls prey to heuristics so easily. \citeA{otuteye2015overcoming} suggest that one way of overcoming cognitive biases is to specify simple decision-making algorithms in advance and become familiar with them. They also state that people can, to some degree, be trained to replace a heuristic with such a pre-specified decision rule. \citeA{deutsch2011handbook} consider that the effects of cognitive biases in conflict resolution can be mitigated on the basis of three principles: feedback, analogical reasoning and behavioural skills. All of the principles they present rely on an individual's ability to recognise fallacies in their own reasoning, but their book does not provide any solid guidelines on how \emph{exactly} this can be turned into practical advice. Beyond these two works, little research has been done on developing cognitive bias mitigation strategies. It appears that no known ``easy fix'' exists aside from being aware of one's own biases and asking oneself whether one is being influenced by them.
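As a purely illustrative sketch, the following Python snippet shows what such a pre-committed decision rule might look like in the value-investing setting studied by \citeA{otuteye2015overcoming}; the specific criteria and thresholds are invented for this example and are not taken from their paper.

\begin{verbatim}
# Hypothetical illustration of a pre-specified decision rule: the criteria
# are fixed in advance so that the final judgement is mechanical rather
# than intuitive. The thresholds below are invented for this example.

def decide(stock):
    checklist = [
        stock["debt_to_equity"] < 0.5,       # conservative balance sheet
        stock["price_to_earnings"] < 15,     # not obviously overpriced
        stock["years_profitable"] >= 5,      # consistent earnings record
    ]
    # Commit to acting only when every criterion is met; abstain otherwise.
    return "buy" if all(checklist) else "do not buy"

print(decide({"debt_to_equity": 0.3,
              "price_to_earnings": 12,
              "years_profitable": 7}))       # prints: buy
\end{verbatim}

The point is not the particular criteria but the pre-commitment: because the rule is written down before the decision is made, the representativeness of an individual case cannot quietly override it.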

\citeA{croskerry2013mindless} stresses that the ability to think critically is essential for overcoming diagnostic errors in medicine. He distinguishes between intuitive and analytic reasoning, where intuitive reasoning corresponds to automatic, heuristic thinking. He agrees with \citeA{tversky1983extensional} that this intuitive reasoning is sufficiently accurate in a large number of cases, but warns that many patients nevertheless receive the wrong treatment because medical personnel follow their intuitions instead of careful analysis. Furthermore, he concurs with the conclusion that there is no ``easy fix'' aside from personal vigilance and self-reflection.

Discussion \& Conclusion

Overall, it appears that there is no clear-cut method that can help us overcome our inherent irrationality. It would be impractical to take a step back and go through the entire list of known biases and fallacies for every decision, and devising a decision-making algorithm for every conceivable scenario is equally unlikely to work in practice. Having said that, learning about the most common logical fallacies and cognitive biases will likely help in detecting them, as \citeA{deutsch2011handbook} and \citeA{croskerry2013mindless} point out. Studying websites such as \href{https://yourbias.is}{yourbias.is} and \href{https://yourfallacy.is}{yourfallacy.is} might help in achieving the sense of self-awareness that \citeA{croskerry2013mindless} believes will prevent heuristic reasoning from dominating analytical reasoning.

Rationality is not the same as intelligence and, as has become apparent in this essay, even the most intelligent among us cannot escape biologically hard-wired heuristics unless they take special care to consciously account for them.

References

Croskerry, P. (2013). From mindless to mindful practice—cognitive bias and clinical decision making. New England Journal of Medicine, 368(26), 2445–2448.

Deutsch, M., Coleman, P. T., & Marcus, E. C. (2011). The handbook of conflict resolution: Theory and practice. John Wiley & Sons.

Keith, T. Z., & Reynolds, M. R. (2010). Cattell–Horn–Carroll abilities and cognitive tests: What we’ve learned from 20 years of research. Psychology in the Schools, 47(7), 635–650.

Otuteye, E., & Siddiquee, M. (2015). Overcoming cognitive biases: A heuristic for making value investing decisions. Journal of Behavioral Finance, 16(2), 140–149.

Schneider, W. J., & McGrew, K. S. (2012). The Cattell–Horn–Carroll model of intelligence.

Stanovich, K. E. (2009). Decision making and rationality in the modern world (Fundamentals in cognition). New York, NY: Oxford University Press.

Stanovich, K. E., West, R. F., & Toplak, M. E. (2016). The rationality quotient: Toward a test of rational thinking. MIT Press.

Tversky, A., & Kahneman, D. (1983). Extensional versus intuitive reasoning: The conjunction fallacy in probability judgment. Psychological Review, 90(4), 293.