By Adam J. Pearson
Did you ever notice how holding a belief about the world seems to surprisingly narrow your perception? It’s as if the belief gives you tunnel vision; everything that agrees with it comes through the tunnel, and everything that challenges the belief doesn’t make it into your field of view. Or if it does, it doesn’t stay for long before it gets denied or explained away.
In the same way that putting blinders on a horse limits its peripheral vision and thus helps it focus on the path ahead, buying into a belief puts blinders on our perception.
Why does holding a belief tend to limit rather than expand our perception? One reason seems to be that once we hold a belief, confirmation bias gets activated and the mind begins to seek out information that confirms what it already believes and miss, deny, or explain away evidence that contradicts the belief.
In this way, holding one belief to be true can blind us to other truths that lie outside or clash with that belief and, by placing a limit on perception, can narrow rather than expand the range of what we perceive.
I call this phenomenon the Believer’s Paradox: the more we believe, the less we perceive.
I named this phenomenon after it emerged again and again in my own experience. I’d hold a belief and find confirming evidence for it everywhere I looked, but miss out on so much at the same time. The more I believed, the more evidence I saw for my belief, and the less I perceived data that lay beyond the scope of the belief’s narrow view of reality.
At this point, you might be wondering: should we believe in the Believer’s Paradox?
I like to hold this concept of the Believer’s Paradox in mind as what Alan Siegman calls a “thought model.” A thought model is a useful way of thinking that requires neither belief nor disbelief, a kind of conceptual tool. Thought models can be picked up and played with so long as they are useful and discarded when they no longer serve a practical purpose.
There’s another interesting thing about beliefs; we don’t seem to be able to consciously will ourselves to believe or disbelieve. If I disbelieve something, I can’t will myself to believe it without seeing through my ruse. If I believe something, I can’t will myself to disbelieve it; its apparent truth shines through my attempted disbelief. All I can will myself to do is consider the evidence and arguments in favour and against a proposition and then see if my mind believes or disbelieves it as a result.
The unconscious seems to play a role in the process of ‘adopting’ beliefs just as the conscious mind does. As social psychologist David Myers points out in his phenomenal book, Social Psychology, we are not only persuaded by rational arguments (what he calls the ‘central route to persuasion’), but also by peripheral cues (‘the peripheral route to persuasion,’ in Myers’ terminology). Without even realizing it, we can end up believing someone’s message if, for example, they seem credible, if they have social proof, if we like them, or if they appeal to our feelings.
Being a critical thinker helps us be less prone to subconscious persuasion. However, it may also be helpful to change the way we frame “beliefs” to see them, instead, as “thought models.” The less we are at the mercy of beliefs and the more free we are to play around with thought models, the more options we have for ways of seeing the world and the less susceptible we are to the limiting effects of the Believer’s Paradox.
For example, a manager who clings to and believes in one particular model for how to manage her subordinates limits herself to perceiving situations in the narrow way that model frames them.
If, instead, she cultivates an awareness of multiple management approaches and holds these loosely as ‘thought models’ or tools in her toolbox, she remains open to seeing a situation in many possible ways. When she looks at the situation through one model, she perceives some of its facets. When she looks at the same situation through the lens of another model, she perceives other facets that the first model did not help her to see.
Without clinging to one particular thought model, she remains open to using a variety of thought models in order to get the fullest possible view of a situation. In this way, she expands her perception of reality instead of shrinking it.
This is the true value of understanding the Believer’s Paradox: when we see how it works, we can think in a way that minimizes its effects and maximizes our options for perceiving the world as fully as possible. “Beliefs” may limit perceptual options, but “thought models” increase them. In other words, the more aware we are of alternate thought models (ways of thinking about and seeing a situation) and the more willing we are to flexibly play around with them, the more of reality we are able to perceive.