Autonomy, mastery, and purpose have become central to satisfaction and general well-being. We want control over our lives and over how we spend our time. Mastery is an important aspect of motivation. So is purpose, which is the main force changing how organizations and people do business. In a world where like choices abound, values and value alignment become what is precious and cannot be copied.
Wired co-founder Kevin Kelly describes eight value-based needs that are not easy to copy in What Technology Wants. These needs remain an attractive type of value we can provide that people will pay for. He calls them generatives because they must be generated in context, which is what makes them valuable.
Here are some examples of how it might work:
- immediacy —in a short-term world where patience is tested at every turn, we pay for “right now.” In context, that may mean five new customers, a piece of information needed to close a deal, or a problem made to go away. Freshness is harder to sustain with news, but it can take the form of fresh analysis and point of view.
- personalization —we tend to think of this as something addressed to us based on online tracking of our behavior or past purchases. We might want to pay for personalized medicine; we already pay for self-tracking. But what if we go beyond what exists now to personal APIs, organization and reservation tools that talk to each other on our behalf? We're already seeing bots handle customer service on individuals' behalf. This flips the funnel and puts us, the customers, in the driver's seat.
- authenticity —people are willing to pay for the real deal in art, collecting, design, fashion, cars, and experiences. Authenticity pays. Businesses that produce high-value work pay attention to this need. It starts with knowing oneself: γνῶθι σεαυτόν, the inscription on the Temple of Apollo at Delphi.
- attention —it's becoming harder to get and to keep without paying for it. Technology now sweeps the market, listening for evidence of surprise and value: corporations listening to consumers who are listening to corporations, all in search of clarity of purpose. It's an odd conversation to be having.
- interpretation —people use and interpret words differently. We also say one thing and do another. Influence is open to interpretation, so it's up to customers to decide what “right now” means based on their own needs and wants. When that happens, the story becomes memorable in ways that turn customers into the next tier of storytellers, if we're open to it. Many organizations still aren't.
- accessibility —early in my career, I interviewed with a four-star general for a position as interpreter at the Italian Mission to the United Nations. The job required living in Manhattan to be on call. That type of work is probably not going to change, but many other jobs can now be done from different locations. The value of access has since skyrocketed: people increasingly pay for leases, rentals, and services rather than owning things.
- embodiment —this is about providing an experience worth paying for. What happens at the Ritz-Carlton would not happen in other hotels. Cirque du Soleil performances are built on universal themes; Kooza, for example, brings together and explores timeless themes like fear, identity, recognition, and power.
- findability —connecting people with what they are looking for is especially powerful when they are not yet sure what that is. Google built a highly profitable business on top of it. In The Long Tail, Chris Anderson talks about the value of filters and aggregators.
Kelly believes we overestimate the effects of technology in the short term and underestimate them in the long term. In his follow-up book, The Inevitable, he examines several of these long-term, accelerating forces. He describes them as deep trends—flowing, screening, accessing, sharing, filtering, remixing, tracking, and questioning—and shows how they overlap and depend on one another.
Artificial Intelligence (AI) is slowly seeping into our lives. According to Kelly, its role in the future is to help develop the third way of knowledge. He says:
> One of AI’s major roles in the next 20-30 years will be to be a probe to understand what our own brains do, not by opening them up, but by trying to model them and replicate them. This is what I call the third way of knowledge.
>
> There’s the ‘humanities’, which figure out how things are, by looking at human expression, going inside themselves, and doing what artists do. Then there’s the ‘scientists’, the second way, who run experiments and probe by trial and error. Then there’s the third, the ‘nervous’, the technologists. In the ‘nerve way’, you investigate something by trying to make something new. Which means, the way you investigate intelligence is not to probe it or not think about it, but to try to actually make artificial intelligence.
>
> The way to study AI is not to probe it, but to try to make artificial intelligence. You study reality by creating a virtual reality. In the same way, you study democracy by creating a virtual democracy. The way in which you try and probe the basis of this state of being is through making something so that you learn by making, and that is what AI is. This is actually the Third Culture, the third way of knowing, the way in which we render the human condition is by making things.
Nine years ago, I talked about how humans and human intelligence would still be part of things. Investigating by trying to make new things with technology is an interesting observation. In this view, AI is not the same as the algorithms that are limiting our circles of knowledge based on our behavior.
In The Society of Mind, published in 1986, Marvin Minsky says “that intelligence is not the product of any singular mechanism but comes from the managed interaction of a diverse variety of resourceful agents.” Minsky was a pioneer of artificial intelligence and made many contributions to the field of cognitive science. He appreciated the value of what he called negative knowledge.
We tend to shy away from all things negative, or potentially so: feedback, reviews, even conversations. Yet it is through these very vehicles that we learn the most. What if making new things, using the third way of knowledge, is the path to accepting that mistakes and misses are part of the process?
Legendary producer John Lloyd says, “It's what we do with our intelligence that matters.” Artificial or not.