Mini book read.
Social media was supposed to change the world—it did, but likely not in the direction we imagined. Every day we're wading through fake news, lying leaders, and ghosting. Who to believe? What is real? How can we tell?
Shiv Singh and Rohini Luthra, Ph.D., have taken a deep look at why we don't trust anymore and why we fall for alternative facts. Human nature makes us desperate to belong and to be right. Social pressure, in turn, means we bow to authority.
Because technology has made us mighty, giving us the ability to spread information and seemingly become more persuasive overnight, we place blind trust in artificial intelligence (AI).
We face these challenges as customers, citizens, parents, and community members.
Savvy is a practical handbook for individuals who want to navigate this post-trust world. If you're looking for ways to get savvier about alternative facts, to become more judicious about whom and what to trust, this book is for you.
Three layers of value
When I read a book, I examine the ideas in three ways—as a learner, as an author, and as a practitioner. Here's my take on Savvy.
First as a learner—the examples and studies make information you might have read elsewhere useful in a specific context. The references are contemporary, literally pulled from the headlines, which is remarkable given the publishing life cycle.
Then as an author—structurally, I look at the symmetry of the information, the references used to make the case, and the use of stories, visuals, and science to prove it. The book has a simple structure that is easy to follow and to remember when you pick it up again. The length, at 172 pages, is appropriate to deliver substance with brevity.
Third as a practitioner—the examples help me see how I would put some of the information into practice; most importantly, they help me analyze what's going on in my environment. The most valuable part is the practical guidance on getting savvy. It answers the question of "what's within my control."
Overall, the book is interesting. It won't give you a magic wand; you'll have to work for that. But it will give you a set of tools to counter the psychological biases that lead us away from the truth.
My highlights
The rise and fall of Theranos is a story many of you might have come across. John Carreyrou revealed the deception in a Wall Street Journal article, and then in a book. It was surprising how many employees and investors denied the evidence in an attempt to remain consistent with their earlier decisions to work with the company.
Social psychologist Solomon Asch said, "we have found a tendency to conformity in our society so strong that reasonably intelligent and well-meaning people are willing to call white black." People in that position are unable to come to terms with the truth. The nature of the Theranos board membership contributed to groupthink.
Once a reputable person attests to the validity of something because they want to believe it, they're able to convince others to join in. It did not occur to any of the board members to conduct independent due diligence. It's also easy to fall for the "us vs. them" mindset.
Preserving the cohesiveness of the group is often more important than introducing doubts or questions that might jeopardize a goal. It's worth asking whether your group at work overestimates its ability to achieve its goal, or whether it has a sense of superiority about its commitment to that goal.
We have a similar mechanism online. Human biases blind us to errors in logic, lack of evidence, and the interests at play. Each of us is called to respond to fake information; it's a new survival skill we need to develop to stay sane.
On the definition of trust. Social psychologist Morton Deutsch based his definition of trust on reading intention:
He defined trust as the “confidence that [one] will find what is desired [from another] rather than is feared.”
Shared values and mutual incentives are critical to establishing trust. When we have the right conditions these two ingredients are sufficient. However, to tell truth from opinion, we need to pay attention to what we're ready to believe and why that may be so.
On why we go along with lies. Psychologists Gordon Pennycook and David Rand conducted studies demonstrating how repeated exposure to a message makes us vulnerable to believing it without vetting it first.
On why we love bad news. Ohio State University psychologist John Cacioppo found that the human brain "reacts more strongly to stimuli it perceives as negative." Negativity influences our attitudes. Psychologist Daniel Kahneman found that we react more strongly to the prospect of losing than to the prospect of winning.
On why we see what we want to see. Psychologists Albert Hastorf and Hadley Cantril conducted studies that demonstrated we have selective perception. Their finding was that “out of all the occurrences going on in an environment, a person selects those that have some significance for him from his own egocentric position in the total matrix.” Our motives color judgment.
On why we bow to authority. Psychologists Stanley Milgram and Philip Zimbardo conducted research that revealed how ordinary people can become evil, victimizing and torturing others, based on nothing more than peer pressure and the environment they’re placed in. Milgram said, “It is not so much the kind of person a man is as the kind of situation in which he finds himself that determines how he will act.”
How behavior spreads
It's surprising to discover how many people go along with reprehensible behavior. Often this happens in situations of power imbalance. But sometimes people have the courage to speak up, and when they do, others come forward as well. This was the case with Harvey Weinstein's actions.
The story of Duke Tran at Wells Fargo also created ripple effects at the company. Tran had been fired in retaliation for discovering that a customer in Lexington, North Carolina, owed the bank the entire amount of a balloon mortgage for which no paperwork existed. Tran received a settlement for damages, and the bank is now engaged in rebuilding trust.
But behavior also spreads through fake news, with negative consequences. Dr. Deb Roy and his team at the MIT Media Lab "studied true and false news stories distributed on Twitter over the eleven-year period between 2006 and 2017." The study "found that false news was 70 percent more likely to be retweeted than true news." False information spread faster than the truth. This should give us pause.
On the future
The last chapter on artificial intelligence introduces several concepts worth thinking about.
On the social credit score. China has created the fullest application of it. But in the Western world we already have many elements in place—credit scores, Yelp, TripAdvisor and Glassdoor reviews, ranking and rating systems, activity on Facebook and Google. Imagine if all this data were linked together.
On automated actions. From driverless cars to systems that fire people, our future may look pretty grim if we're not careful. Imagine a hospital system shutting down suddenly, or a power grid turning off without warning.
There is much more. The book is a call to understand our triggers and to be vigilant about our decisions and responses. We should seek first to understand and to analyze like a skeptic, because fakery, as Shiv and Rohini call it, has become "a pervasive fact of life in the post-trust era."
For those who are curious, Savvy is worth picking up: a practical handbook for anyone who wants to navigate this post-trust world.