We’ve learned there are many things that leave us vulnerable to predatory manipulators.
But no discussion of what may make us vulnerable is complete without revealing one vulnerability we all have, but that remains hidden from us.
We see others and the world through it, as if we were wearing a pair of distorted lenses, yet we don’t even know we’ve got those glasses on. But guess what? Manipulators know all about it, and they use it to their advantage with great success. What is this hidden vulnerability?
To experience it firsthand, watch the one-minute video below. While you watch, count how many times you see the players in white pass the basketball:
Did you see it?
Many people who watch the video never see it. I didn’t, and couldn’t believe “it” was actually there the first time I watched it!
That’s just one example of the many automatic ways our brains work much of the time. These automatic “shortcuts” we take are called cognitive biases. And these biases leave us vulnerable, because they often lead us to make faulty assumptions and come to incorrect conclusions.
The video illustrates ‘selective attention,’ which keeps us from consciously seeing unexpected things. We selectively pick out one message from a mixture of messages occurring at the same time. Our brain edits out certain things because there’s just too much coming at us to consider it all consciously.
Cognitive biases are part of the reason we didn’t ‘see’ the psychopath, who was the big gorilla right in front of us.
According to the University at Albany,
“Although cognitive biases can sometimes be helpful in familiar situations or in dealing with predictable threats, they can lead to catastrophic failures in assessment of unfamiliar and unpredictable adversaries.”
Isn’t that the truth.
The human mind has evolved to have and use many different automatic shortcuts and to generate all kinds of assumptions. We all do these things unconsciously, by virtue of being human.
Look again — things aren’t always what they appear to be.
Here’s a list of some of the biases that may have made us vulnerable to that “unfamiliar and unpredictable adversary”:
- Projection Bias or Assumed Similarity Bias (not to be confused with psychological “projection”): This mental shortcut leads us to the unconscious assumption that others share the same or similar values, thoughts and beliefs. In other words, we believe that others are just like us — if we are an honest, loving and kind person with a conscience, we believe most everyone else is the same way. We don’t even consider that some people may have drastically different values and motivators.
- The Affect Heuristic: This is a mental shortcut that allows a strong, emotional first impression to affect decisions, even if subsequent evidence weighs against the original decision. First impressions often remain even after the evidence on which they are based has been totally discredited. This bias is described as “going with your gut instinct,” which may not always be the best thing to do. The psychopath creates a great first impression, and we get stuck on that.
- The just-world hypothesis or just-world fallacy: This is the cognitive bias (or assumption) that “a person’s actions always bring morally fair and fitting consequences to that person, so that all noble actions are eventually rewarded and all evil actions are eventually punished.” This means if we see ourselves as a good person, we believe only good things will happen to us. We just don’t expect that something bad may happen, so we’re not even looking for it. (Because this bias also includes the belief that people “get what they deserve,” it is at the root of victim blaming.)
- The Observer’s Illusion of Transparency: This one causes us to overestimate how well we understand another person’s mental states. We assume we know what someone else is thinking and feeling, when in fact we don’t.
- The confirmation bias: This is the tendency to search for, interpret, focus on and remember information in a way that confirms our preconceptions. This makes it hard to change your mind about something — or someone — once you’ve already developed a belief about who they are. If someone has already won you over and gained your trust, it is very hard to change that perception, even when things start to go very wrong.
About the confirmation bias, from “4 Reasons You Can’t Trust Yourself” (Psychology Today):
“Would you use the following words or phrases to describe yourself when you make a decision or take a position for or against something?
Good judge of the facts
Attentive to reason
Skilled at evaluating an argument
Sadly, this list is pretty much science fiction because of the confirmation bias, one of the many shortcuts the brain takes which leaves us thinking “fast,” and pretty much automatically, rather than carefully processing. Research shows that instead of judging and weighing all the facts, we listen to and give credence to those facts and arguments that align with or reflect beliefs we already hold. I’m sure you’re shaking your head and saying, “Not me!” as you read this. Sorry; there’s no point in your fooling yourself.”
- And the granddaddy of all biases, the Bias Blind Spot: People are largely blind to their own cognitive biases. We will accept that biases exist in others, but tend to deny that we ourselves have biases.
So what, if anything, can we do to think logically and objectively and override our brain’s automatic shortcuts and assumptions?
There is no easy answer or any easy way to do this, but plenty of people are working on it. You can start here:
- Peruse the videos created by the Center for Applied Rationality, whose goal is to help people overcome cognitive bias.
- Read Daniel Kahneman’s book, “Thinking, Fast and Slow.” He argues that if we consciously identify and attend to our biases in real time (a feat that requires great effort!) we can lessen their effect on our reasoning, to some degree.
- But Alex Lickerman, M.D., believes attending to our biases isn’t enough. He says the remedy is mindfulness, or taking the time and expending the energy “to examine our own thought processes consciously and continuously.” And he says that must include trying to question our assumptions, too, although that’s difficult because we are often unaware of them, just as we are often unaware of our cognitive biases. You can read his opinion in the article “How To Ensure You’re (Almost) Always Right.”
- Watch more “gorilla” videos by the makers of the original video, Chris Chabris and Daniel Simons.
- Play a video game designed to help you recognize and overcome cognitive biases (still in development, I believe).
♥ Thank you for reading.