Small-group discussions designed to help elementary students talk about what they read in ways that promote critical-analytic thinking, reasoning and deeper understanding increased critical thinking over time among fourth- and fifth-grade students, according to a new study by a team that includes researchers from the Penn State College of Education. It's the latest evidence in support of Quality Talk, the "deliberate approach to discussion that transforms student engagement" developed by P. Karen Murphy, associate dean for research and outreach in the Penn State College of Education.
The most recent study, co-authored by Murphy, is now available online ahead of publication in the December issue of Learning and Instruction. Using a new artificial intelligence (AI) approach to assess qualitative data - like conversations - the team analyzed data from nearly 400 small-group discussions that were collected as part of a larger research project. They specifically focused on identifying instances where fourth- and fifth-grade students' class conversations aligned with key discourse indicators - the ability to elaborate on an explanation and to explore discussed ideas with others - known to be associated with high-level reading and subject comprehension. The researchers spent years developing a discourse coding manual to guide the AI analysis process for this type of research. They dedicated approximately a month to preparing and finalizing the specifications for the study's AI model, which then processed the data in about 48 hours.
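The paper does not reproduce the team's analysis code, but the workflow described above can be illustrated with a minimal sketch: each discussion turn is paired with criteria drawn from a coding manual and sent to a language model, which returns a discourse code. Everything in the example below - the call_model stub, the label names and the prompt wording - is an assumption for illustration, not the researchers' actual pipeline or manual.

```python
# Minimal sketch (not the study's actual pipeline) of AI-assisted discourse coding:
# each small-group discussion turn is checked against coding-manual-style criteria
# by a language model. `call_model` is a hypothetical stand-in for a real model API.

from collections import Counter

CODES = ["individual_argumentation", "collaborative_argumentation", "none"]

def build_prompt(turn: str, context: str) -> str:
    # A real coding manual is far more detailed; these criteria are illustrative only.
    return (
        "You are coding a fourth/fifth-grade reading discussion.\n"
        "Label the TURN with exactly one code:\n"
        "- individual_argumentation: the student elaborates an explanation with reasons or evidence\n"
        "- collaborative_argumentation: students build on each other's ideas toward a shared understanding\n"
        "- none: a simple statement or answer without reasoning\n"
        f"CONTEXT: {context}\nTURN: {turn}\nCODE:"
    )

def call_model(prompt: str) -> str:
    # Placeholder: swap in an actual language-model call here.
    raise NotImplementedError

def code_discussion(turns: list[str]) -> Counter:
    # Tally how often each discourse code appears across a discussion transcript.
    counts = Counter()
    for i, turn in enumerate(turns):
        context = " ".join(turns[max(0, i - 2):i])  # a little preceding context
        label = call_model(build_prompt(turn, context)).strip()
        counts[label if label in CODES else "none"] += 1
    return counts
```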
"Essentially, we wanted to identify when students were able to express their thinking about the subject matter being taught in ways that moved beyond just simple statements and answers and instead included reasoning and evidence in support of what they were saying," said Murphy, who is also a distinguished professor of educational psychology and Social Science Research Institute co-funded faculty member. The team was most interested in two particular indicators of high-level comprehension: individual argumentation and collaborative argumentation, which refer to elaborating on an explanation and exploring a topic in exchanges with others, respectively. Carla Firetto, Associate Professor at Arizona State University and lead author, expanded on these indicators, "as students engage in the discussions, they can express individual argumentation on their own, as they express their own thinking as they talk, or they can co-construct in the discussion with each other, going back and forth to come to an understanding together."
The data from the almost 400 small-group conversations would have taken up to a semester of work by undergraduate and graduate research assistants to prepare for analysis, Firetto said.
"By using AI, we were able to gather insights about students' growth from data we collected in a prior project funded by the Institute of Education Sciences, but had not been able to investigate before," Murphy said, explaining that the traditional way of coding discussions - individual researchers listening to the conversations and marking for specific indicators - was too time consuming and cost prohibitive. "This paper shows how AI can be leveraged to advance the field in ways that were not previously possible, " explained Firetto.
Murphy noted that while the data can now be processed quickly, it took the team many years of conversations about whether particular examples of talk met the definitions of the codes. That work produced a very clear set of rules to help inform the prompts used to train the AI, yielding better and clearer results.
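Clear code definitions matter because AI-assigned labels are typically checked against human judgments before being trusted. The study's validation procedure is not described here, but a simple, hypothetical agreement check like the one below shows one common way such consistency can be estimated.

```python
# Hypothetical sanity check (not described in the study): compare AI-assigned codes
# with a small set of human-coded turns to estimate simple percent agreement.

def percent_agreement(human_codes: list[str], ai_codes: list[str]) -> float:
    assert len(human_codes) == len(ai_codes), "Code lists must be the same length"
    matches = sum(h == a for h, a in zip(human_codes, ai_codes))
    return matches / len(human_codes)

# Example with made-up labels:
human = ["individual_argumentation", "none", "collaborative_argumentation"]
ai = ["individual_argumentation", "none", "none"]
print(f"Agreement: {percent_agreement(human, ai):.0%}")  # Agreement: 67%
```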
"We are still just scratching the surface of learning what AI tools like this can do and how to best balance the trade-offs," Murphy said, noting that the approach could work for researchers in any field that deals with qualitative data, as long as they have a clear manual for defining the features of importance. "AI certainly offers some tangible advantages with regard to efficiency and scalability."
Firetto said the team was particularly concerned about data protection and took several steps to ensure the information was appropriately anonymized and secured before the AI-driven analysis.
Importantly, the researchers noted, this approach can be used not only in future research projects with new data, but also with previously collected data that may still hold valuable information waiting to be discovered.
"Researchers who are involved in large, multiyear projects that gather troves of qualitative data often find that the data go un- or under-analyzed," Murphy said. "There is often so much more that we could still learn from those data, but there are rarely sufficient resources to analyze it in a traditional manner. Overall, it is exciting to think about how we might extend this approach to other projects we have and to explore whether there are other research questions we can return to now that were not previously feasible."
Other contributors to the paper include Penn State doctoral students Emilee A. Herman and Yue Tang; Emily Starrett, math program manager at the Hamlin Robinson School; Jeffrey A. Greene, associate dean for research and faculty development and McMichael Professor at the University of North Carolina at Chapel Hill; and Lin Yan, doctoral student from Arizona State University.
This research was supported by the Institute of Education Sciences, U.S. Department of Education; Mary Lou Fulton College for Teaching and Innovation internal grant funding at Arizona State University; and the McMichael Professorship in the School of Education at the University of North Carolina at Chapel Hill.