Is Islam the cause of extremist ideology, or is it simply a scapegoat that extremist thinking has hijacked for its own justifications? That seems to be the question of the last twenty years, and we are often very quick, and sometimes very passionate, in taking a side. You may even notice an immediate answer or opinion swell in your mind. Public debate so often presents seemingly binary options that must be sorted into correct and incorrect in order to relieve an apparently overwhelming problem. Once someone has taken a side in such an argument, it is often difficult for the mind to reassess the choice. This seems to occur even in the face of new information that makes the choice trickier than it first appeared, or in the presence of a charismatic speaker we ‘feel really good about’ without assessing what they are actually saying. But what is the apparent safety of such decided opinion? Why do our minds so readily choose a side? Why do we have an aversion to uncertainty? Cognitive neuroscience may offer some insight.
Decision making is an incredibly useful attribute to possess, especially in a professional realm, where quick judgements must sometimes be made. However, we may sometimes bias which information we pay attention to, and therefore make an uninformed decision or form a biased belief. According to Hsu et al. (2005), we are faced with roughly two types of uncertain event: risky and ambiguous. Risky decisions are those where the odds are known and the probability of an outcome can be assessed and estimated, e.g. I know I have 10 red cards and 10 blue cards in a deck, so I can roughly gauge my chances of picking a blue card. In ambiguous decisions, by contrast, the odds themselves are unknown, which makes analysis difficult, e.g. I have 20 cards but no idea what proportion is red and what proportion is blue, so picking a blue card has a much more ‘chance’ feel to it.
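The card example can be made concrete with a toy simulation. This is purely illustrative and not from Hsu et al.; the uniform prior over the hidden red/blue split is my own assumption for the sketch:

```python
import random

def draw_risky():
    """Risky deck: composition is known (10 red, 10 blue),
    so P(blue) = 0.5 can be computed before any draw."""
    deck = ["red"] * 10 + ["blue"] * 10
    return random.choice(deck)

def draw_ambiguous():
    """Ambiguous deck: 20 cards, but the red/blue split is hidden.
    Here the unknown split is modelled by sampling it uniformly at
    random on each draw (an assumption, not a fact about the study)."""
    n_blue = random.randint(0, 20)
    deck = ["red"] * (20 - n_blue) + ["blue"] * n_blue
    return random.choice(deck)

# Risky: the odds are computable up front.
p_blue_risky = 10 / 20

# Ambiguous: we can only estimate the odds by experience,
# i.e. by simulating many draws.
random.seed(0)
trials = 100_000
hits = sum(draw_ambiguous() == "blue" for _ in range(trials))
print(p_blue_risky, hits / trials)
```

Under a uniform prior the long-run frequency of blue also converges near 0.5, which makes the point neatly: the two decks can have the same expected odds, yet the ambiguous one still *feels* riskier because the odds are not knowable in advance.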
Hsu et al. (2005) found that in the face of uncertain events, activation of the amygdala (raw emotional responses) and prefrontal cortex (executive function) in both hemispheres increases as the ambiguity of a situation increases. Like a pressure cooker, the missing information that creates uncertainty in a decision effectively puts pressure on the mind to seek out more information to inform its choice. Brand et al. (2006) also suggest that, unlike risky decisions, ambiguous decisions often rely less on rational feedback from past choices and are more susceptible to emotional responses associated with comparable situations.
In an ambiguous situation, where events may simply remain uncertain without a clear immediate answer, the mind may seek a definitive answer to relieve this building pressure, often in the absence of the whole picture. This is the mind's attempt to soothe the discomfort experienced when no path is clearly right or wrong and the choice before us is not a binary one. This kind of ambiguity is rife in many opinions and beliefs, and may in fact be ignored in our internal narratives far more frequently than we perceive.
A great example and exercise around this is the current public and Twitter debate between Sam Harris and Reza Aslan. It is fascinating to read the comments that followers of this conversation post online, and to notice how many are strongly opinionated compared with those who assess the argument with an open mind. You may even notice a particular and immediate reaction rising in yourself. The point of the exercise is not to decide which individual is correct, but simply to be aware of the reaction and pressure the mind presents us with when we allow ourselves to sit in uncertainty. Simply being aware of and paying attention to this rising inner conflict has in fact been shown to strengthen the brain structures associated with executive functioning, and with them our ability to hold beliefs loosely and with more freedom (Wells, 2002; Ivanovski & Malhi, 2007).
In the face of ambiguity, the mind sometimes protests and desperately seeks a quick answer. This can be a useful evolutionary tool for navigating a day-to-day environment that is often highly ambiguous. However, the persistent need to find an answer can produce a fragile perception of reality and impede our navigation of complicated dilemmas. Attention bias and the lack of a whole picture can mean we are at the whim of our machinery instead of in the driver's seat. I haven't really touched on the role of the speaker in this article, but as Pennycook et al. (2015) might summarise, ‘pseudo-profound bullshit’ can throw a spanner in the works for those of a more impressionable variety.
“The most thought-provoking thing in our thought-provoking time is that we are still not thinking.” Martin Heidegger
Brand, M., Labudda, K., & Markowitsch, H. J. (2006). Neuropsychological correlates of decision-making in ambiguous and risky situations. Neural Networks, 19(8), 1266-1276.
Hsu, M., Bhatt, M., Adolphs, R., Tranel, D., & Camerer, C. F. (2005). Neural systems responding to degrees of uncertainty in human decision-making. Science, 310(5754), 1680-1683.
Ivanovski, B., & Malhi, G. S. (2007). The psychological and neurophysiological concomitants of mindfulness forms of meditation. Acta Neuropsychiatrica, 19(2), 76-91.
Pennycook, G., Cheyne, J. A., Barr, N., Koehler, D. J., & Fugelsang, J. A. (2015). On the reception and detection of pseudo-profound bullshit. Judgment and Decision Making, 10(6), 549-563.
Wells, A. (2002). GAD, Meta‐cognition, and Mindfulness: An Information Processing Analysis. Clinical Psychology: Science and Practice, 9(1), 95-100.