The arts were, for a very long time, the most influential source of widespread inspiration, cultural change, local and global innovation, and community building in human culture. This influence has waned considerably in recent decades, particularly over the past several years, as digital technology has ascended. The tech sector gained clout in part by conspicuously co-opting elements of the arts and culture while simultaneously diminishing aesthetic creativity to “content,” dehydrating our languages, upending intellectual property law, and devaluing artistic labor (in addition to other, much graver harms to the environment and democracy). This particular moment recalls the Industrial Revolution, when decades of rapid innovation also brought deep conflict and suffering, with dire implications for contemporary life and the environment.
We have arrived at this epoch with the advent of generative AI. Since the release of ChatGPT in 2022, followed in quick succession by Google Gemini, Claude, and a constant flow of competing offerings, an inescapable narrative has formed around AI: that our lives are wed to the technology, for better or worse, for richer or poorer. Much of the reportage and many of the conversations around AI make it seem that this is a foregone conclusion, and that things will evolve based on the priorities and whims of the experts and the venture capital firms that fund them. We, however, have far more agency, input, and opportunity to guide the development of these technologies than it may appear.
Most of us have already been working with AI daily for years, often without realizing it: during customer service outreach (maddeningly attempting to reach a real human on a phone call, or messaging with a chatbot), with autocorrect on our phones and in our word processing software, through our use of “smart” appliances, or while trying to outwit recruitment software during a job search. We know from these experiences that the technology doesn’t always get things right. AI’s imperfections reflect the nebulous nature of the technology and spotlight the vast opportunities that remain to improve it.
The nonprofit arts and culture sector may not be as broadly influential as it once was, but it is possible to rebuild the bonds between the arts and society to be even stronger than before. We live in a time when we increasingly acknowledge that “offline is the new luxury.” People are desperate for third spaces: places that aren’t work or home, where they can commune meaningfully, in person, with others, actively or passively. At this moment their needs are not being met by most institutions, including arts organizations, which ought to be best-in-class at offering what people seek. A public place operates in a relationship with everyone who passes through, and it is up to each institution to make that relationship a good one. This is one place where AI could be a tool for transformative change in our industry: by deepening their understanding both of specific audiences and of AI and machine learning, professionals within the arts and culture sector can inform and direct the evolution of AI in ways that benefit humanity.
While many are focused on the implications of AI in the areas of creativity and intellectual property, I spent the last several months exploring the implications and possibilities for AI in the administrative aspects of our sector with E.U.-based arts and culture researcher Frances Croxford, and in dialogue with just about anyone I encountered with expertise or interest in AI. Shared below are a few ideas about ways that cultural institutions might begin to engage with AI, drawn from conversations with consumer insights, customer experience, and technology experts, including G. Robert James, principal, the Solomon Group, and author of the recently published Music Not Noise: Pitch Perfect Leadership; Alan Berkson, founder, Intelligist Group; Michel Gelobter, executive director, Yale Center for Environmental Justice; Jeffrey L. Bowman, founder and CEO, Reframe AI Technologies; Patrice Poltzer, founder of Patrice Poltzer Creative; Stephanie LeBlanc-Godfrey, founder of Mother AI; and Tim Miner, CEO, By the People Technologies.
1. Take an inventory of information
Any engagement, or experiment, with AI should begin with a data audit. Awareness and understanding of all data sources should be a top priority: ticketing systems, social media, development databases, strategic plans, research reports, digital marketing campaign results, and the practices used to gather data about audience behaviors and preferences.
For all the talk of large language models (LLMs), they are only part of the recipe for gaining actionable, relevant information with AI. The data an organization keeps securely behind its own firewall, unlike the billions of data points an LLM pulls from sources across the internet, is particularly valuable: the patterns that can be identified across it hold an institution’s value proposition.
Data remains at the center of an organization’s business, and AI tools can be used to deepen understanding of operations, of what different departments are doing (and how successful their campaigns are), of the audience and customer base, and of how all of this information can work together to fuel institutional success. Retrieval-Augmented Generation (RAG) systems can surface information held in “unstructured data” (PDFs and other documents, for example) and feed it into prompts so that a model’s responses are grounded in an organization’s own material. The Model Context Protocol (MCP), an open standard for connecting models to data sources and tools, can be run on an institution’s own infrastructure, behind its firewall, to more securely systematize the ways LLMs engage with internal data sources.
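To make the retrieval idea concrete, here is a minimal sketch of the step a RAG workflow performs before a prompt ever reaches a model. It is illustrative only: the snippets are invented, and the simple word-overlap scoring stands in for the vector embeddings and vector database a real deployment would use.

```python
# Minimal, illustrative retrieval-augmented prompt builder.
# A real RAG system would embed documents and queries as vectors;
# this sketch substitutes word-overlap scoring so it runs with the
# Python standard library alone.

# Invented "internal data": snippets an arts organization might pull
# from PDFs, ticketing exports, or strategy documents.
INTERNAL_DOCS = [
    "Subscriber renewals rose after the pay-what-you-can pilot.",
    "Post-show survey: audiences under 35 discover programming via Instagram.",
    "The strategic plan prioritizes weeknight programming for local residents.",
]

def score(question: str, doc: str) -> int:
    """Count words shared between the question and a document snippet."""
    return len(set(question.lower().split()) & set(doc.lower().split()))

def retrieve(question: str, docs: list[str], k: int = 2) -> list[str]:
    """Return the k snippets most relevant to the question."""
    return sorted(docs, key=lambda d: score(question, d), reverse=True)[:k]

def build_prompt(question: str) -> str:
    """Assemble a prompt that grounds the model in internal records."""
    context = "\n".join(f"- {d}" for d in retrieve(question, INTERNAL_DOCS))
    return (
        "Answer using only the organization's records below.\n"
        f"Records:\n{context}\n\nQuestion: {question}"
    )

if __name__ == "__main__":
    # The assembled prompt would then go to whichever LLM the
    # organization uses, ideally kept behind its own firewall.
    print(build_prompt("How do younger audiences discover our programming?"))
```

The point of the sketch is the order of operations: internal data is selected first, then wrapped around the question, so the model answers from the institution’s own records rather than from the open internet.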
Beyond the breadth of an organization’s data, quality is crucial: the content of the information carries significant weight. As the Solomon Group’s G. Robert James warns, “Garbage in? Garbage out!” What we feed or teach the machine will dictate whether it learns and adapts in ways that help institutions to thrive, or not.
2. Audience experience as an entry point
With our sector at a crossroads, so much depends on how well, how effectively, and how thoughtfully we can reach our audiences (and potential audiences), and on our ability to empower often overwhelmed employees with tools that help them make the best use of limited resources. Intelligist’s Alan Berkson acknowledges that “most organizations don’t know what they know about people. AI is a tool that can amplify your good processes and help you to identify your best customers, using your existing data.” For Reframe AI Technologies’ Jeffrey L. Bowman, AI represents a great opportunity for our sector: “In the age of artificial intelligence and machine learning, as it relates to arts and culture, this is an opportunity to modernize the guest experience through personalization.”
We live in a world where personalization (the use of data about a customer’s past purchases and preferences to shape their future buying habits) is both expected and viewed with skepticism by consumers. Used judiciously, though, AI could serve as an audience development tool: not to influence programming, but to plumb the depths of audience preferences and behaviors so that arts organizations can accurately predict how audiences prefer to discover, and engage with, that programming. Institutions that find ways to use language models to pinpoint audience preferences might offer their marketing and communications teams “muses” that inspire evidence-based ideas for creating excitement around their programming, and for the fruitful activation of third spaces.
When a customer wants to engage with a brand, AI can empower that brand to address their needs more rapidly than traditional methods can. With the move toward personalization, of course, institutions of all sizes must establish strategic plans for security, ethics, and codes of conduct when considering how to invest meaningfully in AI. The challenge is to use this knowledge to figure out how to get folks off the internet and into theatres.
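As a thought experiment, the sketch below shows one way a marketing team might rank which upcoming events to feature for a given patron using nothing but that patron’s own ticketing history. Every name, genre tag, and event in it is invented for illustration; a real implementation would draw on the organization’s ticketing system and be governed by its own privacy and ethics policies.

```python
# Illustrative sketch: order upcoming events for one patron by how
# often they have attended each genre. All data here is invented.
from collections import Counter

# Hypothetical ticketing history for a single patron.
PAST_ORDERS = ["jazz", "jazz", "cabaret", "new-music", "jazz", "cabaret"]

# Hypothetical upcoming season, tagged by genre.
UPCOMING = [
    ("Late-Night Jazz Sessions", "jazz"),
    ("Downtown Cabaret Revue", "cabaret"),
    ("Chamber Opera Premiere", "opera"),
    ("Composer Portrait: New Music", "new-music"),
]

def rank_for_patron(history: list[str], season: list[tuple[str, str]]) -> list[str]:
    """Order upcoming events by how often the patron attended that genre."""
    counts = Counter(history)
    ranked = sorted(season, key=lambda event: counts.get(event[1], 0), reverse=True)
    return [title for title, _ in ranked]

if __name__ == "__main__":
    # The top results are candidates to feature first in an email or ad,
    # a small example of the evidence-based "muses" described above.
    for title in rank_for_patron(PAST_ORDERS, UPCOMING):
        print(title)
```

Even a rule this simple makes the trade-off visible: personalization only works when the underlying data is accurate, consented to, and handled according to the institution’s stated ethics.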
3. Prioritize human intervention
AI technology may be “machine learning,” but humans are at its heart: the data being processed by the machines is human data. Every aspect of the technology was built and/or overseen by humans, and human intervention throughout AI rubrics and processes, to review, edit, approve, and/or reject information, is essential to successful technology integration. Bob James advises that the organizations that will be successful in the future are those that don’t “fear the technology,” and those whose leaders understand that “the technology is a tool and not a replacement for something human.”
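One lightweight way to put that principle into practice is an explicit approval gate: nothing a model drafts goes out until a named person has reviewed it. The sketch below is a generic illustration with invented field names, not a description of any particular organization’s system.

```python
# Illustrative human-in-the-loop gate: AI-drafted copy sits in a queue
# and is only released once a staff member has explicitly approved it.
from dataclasses import dataclass, field

@dataclass
class Draft:
    text: str                      # copy produced by a model
    approved: bool = False         # flipped only by a human reviewer
    reviewer: str = ""             # who made the call
    notes: list[str] = field(default_factory=list)

def human_review(draft: Draft, reviewer: str, approve: bool, note: str = "") -> Draft:
    """Record a human decision; rejected drafts keep notes for revision."""
    draft.reviewer = reviewer
    draft.approved = approve
    if note:
        draft.notes.append(note)
    return draft

def publish(draft: Draft) -> None:
    """Refuse to send anything that has not been approved by a person."""
    if not draft.approved:
        raise RuntimeError(f"Blocked: no human approval yet (notes: {draft.notes})")
    print(f"Published after review by {draft.reviewer}: {draft.text}")

if __name__ == "__main__":
    d = Draft(text="Join us Thursday for an evening of new work.")
    human_review(d, reviewer="marketing director", approve=True)
    publish(d)
```

The machinery is trivial by design; the point is that review, editing, and approval remain human responsibilities no matter how capable the drafting tool becomes.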
Inspiration for a human-focused approach may be drawn from creative small businesses that are establishing standards. Patrice Poltzer of Patrice Poltzer Creative, for example, drew on her years as a Today Show producer to create an AI tool, StoryPro AI, that centers real life to solve problems for entrepreneurs struggling to create compelling brand narratives. Unlike traditional AI platforms that simply generate content, StoryPro AI is an ecosystem where technology enhances human connection rather than replacing it. Members can join a community of entrepreneurs facing similar challenges, access recorded and live AI and storytelling workshops, and receive coaching through “Pocket Patrice,” a chatbot trained to help entrepreneurs get the best stories out of themselves. AI serves as a tool for discovery, introspection, and “dot connection” rather than a replacement for original words and thoughts. The technology is part of the package, not the entirety of it. StoryPro AI is an explicit acknowledgement that a product designed to help people connect with others requires people to connect with others for it to work at all.
4. Turn to wonder
How we choose to use the technology—not only instrumentally but ethically, including in relation to the environmental impacts of these products—will determine our future. We have a voice in how web 3.0 continues to develop. For many audiences, the intrinsic value of the arts is no longer enough of a draw, especially as the cost of living continues to rise. We have no choice but to engage with multiple sources of innovation and creativity to move the sector forward.
“You don’t need to code to influence AI’s trajectory,” said Mother AI’s Stephanie LeBlanc-Godfrey. “Non-technical voices—especially from the arts—are essential in ensuring AI serves human needs rather than just driving efficiency. Organizations can shape development by clearly articulating their unique needs, participating in user testing, and creating use cases that demonstrate values beyond productivity.”
We have also arrived at a point where some architects of the current technocracy are encouraging young people to study the humanities instead of computer engineering and coding, because the skills they anticipate will be valuable in the future workforce are critical thinking and clear communication, aptitudes nurtured in those academic fields.
AI is pervasive and will only become more so. Even if it doesn’t show up in a particularly disruptive manner in arts and culture sector enterprises right now, it is essential that arts workers understand it on a basic level. Production teams may already be using AI for idea generation; some stage managers are using it to organize notes and streamline scheduling. And artists could create powerful, resonant stories inspired by the technology and its implications for society, whether they use it directly or not.
We’ve been here before. The introduction of photography, for example, represented a turning point for humanity, one that was similarly fraught with anxiety and fear about its effects on society. Now it is an established aspect of everyday life and an art form with its own rich canon. If we open our minds to exploration and play, and continue to place humans at the center of our artistry, our future with AI may be, rather than a dystopian nightmare, a world of exquisite beauty, of meaningful mimesis. Artists may make all the difference.
Sacha Phillip Wynne is the marketing and communications director at Joe’s Pub at the Public Theater. She is also the co-founder of WӔRK, a consultancy that employs artistic processes and practices and nature-led systems to solve business problems; an alumna of the New Museum’s NEW INC; and a visiting professor in Pratt Institute’s Arts and Culture Management program.
Recommended reading: The Worlds I See: Curiosity, Exploration, and Discovery at the Dawn of AI by Dr. Fei-Fei Li; The Coming Wave: Technology, Power, and the 21st Century’s Greatest Dilemma by Mustafa Suleyman; 12 Bytes: How We Got Here, Where We Might Go Next by Jeannette Winterson; Five Manifestos for the Beautiful World by Phoebe Boswell, Saidiya Hartman, Janaína Oliveira, Joseph M. Pierce, and Cristina Rivera Garza; Let’s Become Fungal!: Mycelium Teachings and the Arts, based on conversations with Indigenous wisdom keepers, artists, curators, feminists, and mycologists by Yasmine Ostendorf-Rodríguez.