Much of our modern dialogue about business and many other topics is based on the dichotomy between science and advocacy. We say evidence, consistency, and proof are important to us, yet we also hold strong beliefs whose truth ironically proves hard to explore objectively.
That's because we're not objective at all. We're built that way, and we hardly ever gather data on ourselves; even when we do, how unbiased is the gathering process? We tend to think we're proficient at something even when we aren't, and the “we aren't” is likely closer to the truth.
Our ego fights fiercely to defend its honor; this is one idea Freudian therapists and experimental psychologists agree on. “As a result, many of our most basic assumptions about ourselves, and society, are false,” says Leonard Mlodinow in Subliminal.
Mlodinow asks a series of provocative questions: Why doesn't the business executive question his or her own abilities when the group or division isn't meeting its numbers? Why don't the professionals who never seem to shed the modifier in front of their title wonder why they're not progressing? How do we convince ourselves we're the best drivers on the road? That we have talent when we don't?
Psychologist Jonathan Haidt observed that there are two ways to get at truth:
- the way of the scientist
- the way of the lawyer
Scientists gather evidence, look for regularities, form theories explaining their observations, and test them.
Attorneys begin with a conclusion they want to convince others of and then seek evidence that supports it, while also attempting to discredit evidence that doesn't.
The human mind is designed to be both a scientist and an attorney, both a conscious seeker of objective truth and an unconscious, impassioned advocate for what we want to believe.
Together, these approaches vie to create our worldview.
We don't know what we don't know, which makes it hard to frame questions well. And going around trying to prove we're right is a poor recipe for decision-making; it doesn't serve us most of the time, even when we factor in luck.
That's because, as it turns out, according to Mlodinow:
the brain is a decent scientist but an absolutely outstanding lawyer. The result is that in the struggle to fashion a coherent, convincing view of ourselves and the rest of the world, it is the impassioned advocate that usually wins over the truth seeker.
The unconscious mind is a master at reconstructing a view of the world from scant data. That's a good thing when we're called to infer information quickly, say to save our lives, or to deduce a pattern early on and conserve precious energy.
Deductive reasoning, along with imagination, is the reason why Big Data should not replace thinking, especially of the strategic kind. If it's true that everything around us that we call life was made up by people who were no smarter than us, as Steve Jobs once said in an interview, it's also true that our senses and processing power construct this reality from “a mix of raw, incomplete, and sometimes conflicting data.”
Self-awareness is very hard for this reason. Our unconscious mind uses the same outstanding lawyer skill set to craft our self-image, borrowing liberally from a collage of facts and illusions. We should also be mindful, just for good measure, that “fact” shares the same root as “manufactured.”
So we literally construct our self and our reality based on an interpretation. When empathy is low or absent and compassion is missing, we succumb to more narcissistic tendencies. But it gets worse: the rational agent in our mind, the scientist who lives in the conscious mind, then admires this self-portrait, somehow believing it to be true. This is how our ignorance sabotages us.
Curiosity can help us become better scientists and serve up better information to our conscious mind. Neil deGrasse Tyson says, “Science literacy is more, how is your brain wired for thought? Are you wired for curiosity?” Because science is all around us, all we need to do is get away from our own “motivated reasoning,” as psychologists call it, which makes us feel in control by casting a flattering image of who we are, but deludes us in the process.
And it prevents us from getting better, growth mindset or not. Rationally, we know it's impossible for 40 percent of engineers to squeeze into the top 5 percent, as we'd like to believe, or for 60 percent of students to fit into the top 10 percent, or for 94 percent of college professors to fit into the top half. But we often convince ourselves anyway, with ambiguity as our ally.
Ambiguity is our tool of the convincing trade: it creates uncertainty of meaning. It gives us wiggle room, allowing us to interpret things in more than one way. The ambi- in the word points to two meanings, but there can be more than two. Open to interpretation doesn't mean vague, however.
We do interpret experiences differently, even when the experience itself is the same: a football game, a concert, a sermon, a presentation, and so on. Even in science, research and evidence are subject to interpretation, more easily so in the social sciences, and likely in ways highly correlated with interests. Which is why advancement in so many fields happens only when the people who have a stranglehold on a theory or course of action die.
In other words, when we want to see something, we work really hard to see it. Motivation is why, when we deal with complex issues like what constitutes ethical behavior, or running a business, or our ability to get along with others, our unconscious mind takes its pick of what to feed the conscious self, says Mlodinow.
Study upon study demonstrates how professionals who think they're unbiased are in fact very much biased. The unconscious mind is quite sneaky: it puts one over on us by automatically indulging our wants, needs, and desires. That's when it's firing on all cylinders.
motivated reasoning involves a network of brain regions that are not associated with “cold” reasoning, including the orbitofrontal cortex and the anterior cingulate cortex — parts of the limbic system — and the posterior cingulate cortex and precuneus, which are also activated when one makes emotionally laden moral judgments.
We physically bypass reason! But our conscious minds are no fools, so the distortion cannot be extreme; it needs some semblance of objectivity to slip past us. And here's where the mind jumps in to help maintain the illusion.
For example, many studies of hiring based on resumes and interviews have now demonstrated how biased organizations are with respect to gender, retroactively justifying the skill sets they need to fit their worldview.
When the world tries to enter the gateway of our mind, if the information is favorable, we put up little fight before letting it in. If the data contradicts us, we ask it to spell an impossible word to get in, as in an old joke.
This is the same mental mechanism people use to find serious issues with trivial hearsay if it helps them condemn a situation or person they disagree with, and to embrace their choice, person or belief, despite strong evidence that it's at fault. Take global climate change: it has been proven beyond reasonable doubt, as detailed in Merchants of Doubt, and yet so many still find reason to doubt its veracity.
Scientists actually agree on that point now; the issue is very much settled. But it's not good news, so people create wiggle room by believing a different story. What can we learn from all this? Are there ways to use this knowledge to improve our ability to think, act, and decide by seeking evidence that also disproves our thesis?
To keep ourselves honest, and calmer in the face of disagreements with others or biased hiring decisions that may backfire, we should be mindful that:
- people who disagree with us may not necessarily be duplicitous or dishonest when they refuse to acknowledge errors in their thinking that seem obvious to us
- our own reasoning may be biased to lean in the direction of what we want and need to be true
- walking in someone else's shoes is useful for seeing the world through their prism: when called to analyze the data before taking sides in a dispute, people become markedly more willing to agree; in legal disputes, the cited figure moves from 28 to 6 percent
- holding higher standards for the supporting evidence we need to reach conclusions or make decisions, rather than adjusting those standards to our wants, is useful for uncovering new opportunities, including by seeking evidence that contradicts our thesis
- creating a realistic timeline for projects based on the reasonable time it takes to complete each step or task in the chain, rather than on our desire to finish the job earlier, helps us deliver quality work; experience can help us here, but we should also account for decisions that are outside our control
- believing in ourselves is good; it pushes us to accomplish greater feats with optimism, though how we get there is also important
In his 1974 commencement address at Caltech, physicist Richard Feynman said:
The first principle is that you must not fool yourself—and you are the easiest person to fool. So you have to be very careful about that. After you’ve not fooled yourself, it’s easy not to fool other scientists. You just have to be honest in a conventional way after that.
I would like to add something that’s not essential to the science, but something I kind of believe, which is that you should not fool the layman when you’re talking as a scientist. I’m not trying to tell you what to do about cheating on your wife, or fooling your girlfriend, or something like that, when you’re not trying to be a scientist, but just trying to be an ordinary human being. We’ll leave those problems up to you and your rabbi.
I’m talking about a specific, extra type of integrity that is not lying, but bending over backwards to show how you’re maybe wrong, that you ought to do when acting as a scientist. And this is our responsibility as scientists, certainly to other scientists, and I think to laymen.
He then provides an example that touches on science and business:
One example of the principle is this: If you’ve made up your mind to test a theory, or you want to explain some idea, you should always decide to publish it whichever way it comes out. If we only publish results of a certain kind, we can make the argument look good.
We must publish both kinds of result. For example—let’s take advertising again—suppose some particular cigarette has some particular property, like low nicotine. It’s published widely by the company that this means it is good for you—they don’t say, for instance, that the tars are a different proportion, or that something else is the matter with the cigarette. In other words, publication probability depends upon the answer. That should not be done.
Taking this approach also helps us, as scientists in the broadest sense of conscious beings, avoid being used by others for their ends. This is important: if we learn not to fool ourselves, which is the easiest thing to do, we also learn not to be fooled by others.
“We choose our facts based on what we believe. We also choose our friends, lovers, and spouses not just because of the way we perceive them,” says Mlodinow, “but also because of the way they perceive us.” It's a sure bet that we'll choose survival over anything else.
See also: how our unconscious mind rules our behavior, and finding a valid hypothesis to “what is the meaning of life?”