Researcher Affirms Human Creativity's Value Amid AI

James C. Kaufman explores questions regarding AI's impact on education, equity, and creative work across two new publications


"What we found is that creativity and intelligence still matter," says James C. Kaufman, a professor of educational psychology in the Neag School of Education. "Participants who were more creative without AI also tended to perform better when collaborating with AI." (Peter Morenus/UConn Photo)

As generative artificial intelligence tools rapidly enter classrooms, workplaces, and creative industries, questions about what these systems mean for human creativity have become increasingly urgent. Can AI truly be creative? Does it level the playing field by expanding access to ideas and inspiration? Or does it risk weakening the very skills education is meant to develop?

For James C. Kaufman, professor of educational psychology at the University of Connecticut's Neag School of Education, the answers are complex - and cautionary.

Recent research co-authored by Kaufman, along with his new edited scholarly volume on generative AI and creativity, suggests that while artificial intelligence can support some aspects of creative work, it does not replace human creativity. Instead, it can amplify existing differences in skill, judgment, and expertise - raising important questions for education, equity, and the future of creative work.

Unlike many technology enthusiasts, Kaufman approaches AI with skepticism grounded in research rather than fear of the technology itself.

"Most creativity researchers tend to fall into two camps: those who are very excited about AI and those who are deeply concerned about it," Kaufman says. "I'm in the second camp."

His concern stems largely from how quickly generative AI systems were released and adopted, often without the safeguards, testing, or regulatory frameworks that typically accompany transformative technologies.

"AI was being actively used by people before we had time to study it carefully," Kaufman says. "That's especially problematic when we're talking about learning, creativity, and long-term skill development."

AI was being actively used by people before we had time to study it carefully. That's especially problematic when we're talking about learning, creativity, and long-term skill development. — James C. Kaufman

In a recent two-part study conducted with collaborators from other institutions, Kaufman and his colleagues examined how people engage in creative tasks both independently and with the assistance of large language models (LLMs). Participants completed storytelling tasks either on their own or with AI support. The researchers then assessed creativity, intelligence, and overall performance across both conditions. The study has not yet been peer-reviewed or accepted for publication.

"What we found is that creativity and intelligence still matter," Kaufman says. "Participants who were more creative without AI also tended to perform better when collaborating with AI."

Rather than flattening differences in creative ability, AI acted as an amplifier - benefiting those who already possessed stronger creative and cognitive skills.

"If you already have strengths in a domain, you should be able to use AI more effectively," Kaufman says. "AI doesn't suddenly make everyone equally creative."

The reason, he explains, lies in how creativity actually works. Generating ideas is only part of the process. Creativity also requires evaluating ideas, refining them, and deciding which are worth pursuing.

"AI is much better at generating ideas than it is at evaluating them," Kaufman says. "Deciding what makes sense, what is original, and what is worth pursuing still requires human judgment."

That evaluative stage relies heavily on experience, intelligence, and metacognition - an awareness of one's own strengths, limitations, and goals.

"Knowing what kind of help you actually need from AI is a skill in itself," Kaufman says, offering a simple example to illustrate the point. "If you think of AI as producing work at about a B or B-plus level, someone who is already working at an A level can use it selectively and still produce excellent work. But if someone is operating below that level, their ceiling may simply become the AI's output."

Implications for Learning and Equity

Nowhere are the implications of these findings more concerning than in education.

"The goal of an assignment isn't the final product," Kaufman says. "The goal is learning how to do the work."

When students rely heavily on AI to generate essays, stories, or solutions to problems, they may produce acceptable outcomes, but they risk bypassing the cognitive effort required for meaningful learning. Several recent studies suggest that when AI assistance is removed, gains in creativity and learning often disappear.

"That suggests students aren't necessarily developing lasting skills," Kaufman says. "They're outsourcing the work."

"The goal of an assignment isn't the final product. The goal is learning how to do the work." — James C. Kaufman

Kaufman also points to evidence from other studies showing that students frequently overestimate how much they collaborate with AI, reporting thoughtful engagement even when usage data show extensive copying and pasting.

One of the study's central findings challenges the popular notion that AI "democratizes" creativity.

"Creativity is already one of the most democratic human traits we have," he says. "Across gender, culture, and socioeconomic status, there are generally no meaningful differences in creative potential."

What AI may introduce instead, he warns, are new inequities.

"As paid versions improve and free versions decline in quality, access becomes increasingly important," Kaufman says. "The most powerful tools will be available to those who can afford them."

Shifting Creative Landscapes

The consequences extend beyond classrooms into creative industries themselves. Many entry-level creative jobs, such as caption writing, concept art, and freelance digital illustration, are already being displaced by AI systems, according to Kaufman.

"These are the kinds of jobs people rely on while they're trying to establish themselves in creative fields," Kaufman says. "When those disappear, entire pipelines of talent are disrupted."

He worries this could lead to a polarized creative landscape, with hobbyist creativity on one end and elite, well-funded creative production on the other. These concerns are explored more fully in his new edited volume, "Generative Artificial Intelligence and Creativity: Precautions, Perspectives, and Possibilities," co-edited with Matthew J. Worwood, assistant professor-in-residence in UConn's Department of Digital Media and Design. The book brings together scholars from psychology, education, computer science, philosophy, and related fields to examine AI's impact on creative thinking, teaching, assessment, and ethics.

Matthew J. Worwood, an assistant professor-in-residence in UConn's Department of Digital Media & Design, co-edited the new book "Generative Artificial Intelligence and Creativity: Precautions, Perspectives, and Possibilities." (Photo courtesy of the Department of Digital Media & Design)

Working across disciplines, Worwood says one of the most surprising aspects of the project was the range of perspectives contributors brought to discussions of generative AI and creativity.

"The variety in views of generative AI surprised me," Worwood says. "Rarely are we sharing perspectives within a single context, and that makes conversations fun and insightful, but also challenging."

That diversity, Worwood says, reinforces the book's central argument that AI should be treated as a tool rather than a replacement for human creativity, particularly in educational settings.

"Responsible, intentional use starts with the teacher and the learning expert," Worwood says. "It begins with learning objectives and taking time to consider how choices in AI use can either support or hinder students in meeting those objectives."

Transparency, he adds, is critical, and he cautions against allowing decisions about AI use in education to be driven primarily by technologists.

Responsible, intentional use starts with the teacher and the learning expert. It begins with learning objectives and taking time to consider how choices in AI use can either support or hinder students in meeting those objectives. — Matthew J. Worwood

"At the lower grade levels, it has to be the teacher, guided by an administrative team that has consulted with subject-matter experts or scholars in the learning sciences," he says. "Right now, I worry that we're too often defaulting to advice from technologists who may not fully understand how learning works."

At more advanced levels of education, Worwood envisions a gradual shift toward student autonomy.

"We hope to reach a point where students can determine how and when to use AI to support their learning," he says. "But during that transition, thoughtful guidance will be crucial."

Ultimately, Kaufman frames AI as neither inherently good nor inherently bad, but powerful.

"Creativity itself is neutral," Kaufman says. "If everyone were more creative, the world would not automatically be a better place. The same is true of intelligence and AI."

What matters, he adds, is who controls these tools, how they are used, and whether institutions invest in thoughtful oversight. For educators, policymakers, and creative professionals, the challenge is not whether to engage with AI, but how to do so without sacrificing learning, equity, and human judgment.

"We're living in interesting times," Kaufman says. "And we're still deciding what kind of future we want these tools to help create."
