I have had a couple of conversations about AI this week where my perspective was surprising to others, so I wanted to share my thoughts more broadly to seed more possibilities about the future impact of AI.
A few articles that prompted these conversations:
(Aside: In originally editing this post, I realized the articles below are all from white male perspectives, which was slightly embarrassing. After publishing, my friend Lina Srivastava shared her article on Building Community Governance for AI, which envisions a more collaborative collective approach to AI that would embrace goals like justice and equity. I’ll add other perspectives as I discover them.)
- Ben Thompson’s ode to how AI can support sovereign individuals: “How many apps or services are there that haven’t been built, not because one person can’t imagine them or create them in their mind, but because they haven’t had the resources or team or coordination capabilities to actually ship them?” He imagines that AI will close that gap between what somebody can imagine and it being shipped into the world.
- Noah Smith’s take on AI where he asserts that “it’s very possible that regular humans will have plentiful, high-paying jobs in the age of AI dominance”. His theory is that AI will always be limited in resources (by energy if nothing else), and therefore there will be an opportunity cost for AI to be used on less valuable tasks, and those less valuable tasks are where humans can continue to have a role.
- My friend Victor Saad posted his thoughts on why AI will not be able to facilitate a room or tell an authentic story that brings people together. He asserts that what AI is missing is the ability to “bridge the gap between the teller and the listener, to create a shared experience that goes beyond the transmission of information”.
That last point particularly resonates with me. People have asked if I’m worried about AI replacing me as a coach, and I’m not. If my coaching were about information transmission, then, yes, AI could replace me.
The challenge for most of my clients is not information; they know what they need to do, but somehow still don’t do those things despite being highly competent and motivated. They can’t bridge the gap between their current reality and where they want to be. They struggle with unconscious commitments that are keeping them stuck, and need help creating new stories for themselves. Perhaps an AI could learn to replicate that (especially if it read my book), but would the AI be able to replicate the shared experience of support and connection I offer to my clients? I doubt it. There’s something different (for now) between making a commitment to another human being whom you respect and making a commitment to a piece of software.
So I believe human connection is one thing that AI will struggle to replicate.
Another is true expertise. Generative AI works by doing probabilistic modeling of what already exists. In other words, it generates an average result. This is great! It means that if you have zero knowledge or skills in an area, you can immediately get to average or even above average results; it levels the playing field between those with privilege and those without. A few examples:
- A non-native English speaker who took on an executive role where she had to quickly up-level her written communication skills. ChatGPT instantly helped get her to a reasonable baseline.
- A solopreneur who was trying to build a financial model to calculate his ROI on customer acquisition. ChatGPT led him through the necessary calculations and created a spreadsheet model for him.
- An executive who wanted to quickly prototype a new product marketing campaign. Rather than wait weeks for the marketing team to fit it into their busy schedule, she had ChatGPT generate the press release, the copy for ads, a few potential brochures, etc. and had something to show her SVP in a couple hours.
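As a rough illustration of the second example, here is the kind of spreadsheet-style model ChatGPT might walk a solopreneur through. All the numbers, parameter names, and formula choices below are my own assumptions for illustration, not details from the original conversation:

```python
def customer_acquisition_roi(ad_spend, new_customers, avg_revenue_per_customer,
                             gross_margin, avg_customer_lifetime_months):
    """Return (CAC, LTV, ROI) for a simple customer-acquisition model."""
    # Customer acquisition cost: total spend divided by customers acquired.
    cac = ad_spend / new_customers
    # Lifetime value: monthly gross profit per customer times expected lifetime.
    monthly_profit = avg_revenue_per_customer * gross_margin
    ltv = monthly_profit * avg_customer_lifetime_months
    # ROI: profit per customer net of acquisition cost, per dollar spent.
    roi = (ltv - cac) / cac
    return cac, ltv, roi

# Hypothetical inputs: $5,000 of ads yielding 100 customers who each pay
# $30/month at 80% gross margin and stay for 12 months on average.
cac, ltv, roi = customer_acquisition_roi(
    ad_spend=5_000, new_customers=100,
    avg_revenue_per_customer=30, gross_margin=0.8,
    avg_customer_lifetime_months=12)
print(round(cac, 2), round(ltv, 2), round(roi, 2))  # → 50.0 288.0 4.76
```

The point isn’t the arithmetic, which is simple; it’s that someone with no finance background can have an AI lay out which quantities matter and how they fit together.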
None of these examples is about delivering new knowledge or insight, but generative AI based on Large Language Models (LLMs) isn’t designed to be innovative – it is designed to aggregate the conventional wisdom that already exists. It creates an easy way to access the skill of a common practitioner in the field.
As an aside, I didn’t initially understand the hype about AI last year because I wasn’t getting any value from what ChatGPT generated in response to my questions. Why not? Because I operate from a place of extreme privilege and knowledge: I am a native English speaker who has spent decades practicing my writing, I have studied at three of the top universities in the world, I have access to a powerful network who can educate me on any topic to world-class levels, and I have reached levels of expertise far beyond the average in my professional areas. But then I realized the world-changing possibility that AI was a tool for raising the floor by sharing such privilege and knowledge, allowing everybody to reach above-average competence through the support of AI. To go back to Ben Thompson’s provocation, what could people create if they had access to above-average capabilities in any area they needed, including engineering, marketing, sales, finance, writing and communications?
As AIs become more prevalent, what does that leave for humans to do? Filling in the gaps of what AIs don’t do well:
- Innovation: We will still be needed to generate ideas of what the AIs should work on.
- Human connection: We will still be needed to build the human connection between what the AI produces and those who might benefit, standing in the gap between reality as it is and the vision of what could be.
My belief is that art will be the future of humanity in a post-AI world. This continues my 2016 thinking about this potential future, where I posited that art would replace work as people’s source of meaning and purpose in a post-AI world. This is partially because I have long believed that art is about creating connection between the artist and the audience via an artifact. If art is about human connection, and that’s not what AI is currently designed to do, it seems likely humanity is going to do more art as AI takes on other functions.
I also think that AI will create unremarkable average art, almost by definition because of the way it is trained on a broad corpus. I recently saw somebody on Facebook joke that they had previously not believed in the soul, but seeing the soullessness of AI-generated images convinced them that humans add something unique to the creation process. For a piece of art to connect deeply with somebody requires a specific viewpoint, one that is not the average, but one that will connect deeply with some people while alienating others. While AI could be trained for specific target audiences, art as the combination of human connection and exceptional non-average perspective feels like an area where AI will continue to struggle.
I am defining “art” here very broadly (possibly inspired by Seth Godin):
- Creating content in any form is art, whether it’s writing, podcasting, YouTube-ing, drawing, playing music, etc. Some might argue that generative AI image tools such as Midjourney or DALL-E will replace visual artists, but they only provide the craft of art; they don’t provide the idea. In other words, an artist will no longer have to spend years developing the expertise to paint or draw what they see in their head to express their ideas; they will just have to be able to interact with an AI to create that vision for them.
- Building a solopreneur business is art, creating value for a small number of customers, using AI to handle all the routine aspects of operating the business, as Ben Thompson envisions and as many people I follow are already doing.
- Personal development is art, exploring new possibilities of what humans can be, connecting ourselves to our potential. Yes, I am saying that coaching is art.
- Community building is art, as communities are built on connection, often through the use of stories. Such stories create a shared sense of “us”, which separates “us” from “them”. Note that parenting would also be art in the sense that each family is a micro-community.
This conception of art as humanity’s destiny in a post-AI world may seem optimistic and even Pollyanna-ish, and I would agree that we are a long way from such a possible future. There will be significant changes and disruptions necessary to get to that end-state; in my previous piece, I noted that universal health care and universal basic income would likely be necessary to enable it.
We would also need to find ways to ensure that AI does not reinforce existing structural biases, where only people with certain backgrounds or privileges get access to these tools; while I see AI as a possible tool for equity in distributing knowledge, there are those who would use it to create more inequity.
But rather than look at all the ways in which things can go wrong (a hyper-capitalist, racist, patriarchal society reinforced by rigid AI), let’s envision the future we want and work out how to get there from here. That’s why I wanted to share this possibility and get other people thinking about these questions; how do we build towards this future of sharing and art and community?
I get that there’s a lot more to this post, but this line is some dystopian hellscape bullshit: “…and those less valuable tasks are where humans can continue to have a role.”
When energy cost is too expensive, burn the humans for fuel.
And we arrive at The Matrix.
(I also cringed at the conclusion that humans would be doing less valuable work, though I didn’t get to Seppo’s level of dystopia.)
I really like this post overall as a place to develop how humans and AI work together. I also live in a similar privileged network of educated people, and the value of AI generating a lot of average content hasn’t really appealed to me. This helps frame it a little better as a tool to get more people to average. Maybe.
Though I’m not looking forward to the first waves of apps generated by AI for people who think they know what the next killer app is. Then again, a hobby app written with AI by someone who is an expert in the hobby could actually be better than one written by someone who is half hobbyist, half coder. We shall see.
I guess I could have written that more clearly. I don’t agree with the conclusion that humans are less valuable than AIs; I was just summarizing the thought-provoking essay I linked to. People are talking that way, and my essay was intended to suggest a more positive vision of human/AI collaboration. But if my friends didn’t read it that way, I guess it wasn’t clear.
Your argument was perfectly clear to me. I absolutely loved your blog article and it gave me a more balanced and positive perspective on the subject. Thank you!
Cheers,
Nasia Christie
https://hbr.org/2024/06/genai-is-leveling-the-playing-field-for-smaller-businesses includes some nice examples of how AI is leveling the playing field for smaller businesses.