WSU Weaves AI Into the Classroom, Aiding K-12 Teachers


This story is part of an AI series looking at how WSU is driving innovation in research and teaching through artificial intelligence. View the entire series as it becomes available.

Washington State University researchers are collaborating with K-12 teachers across the state to create AI tools for tomorrow's classrooms.

Scientists in the College of Education, Sport, and Human Sciences are developing an AI-assisted tool that will help guide scientific inquiry in middle-school classrooms - with an eye toward ensuring that no students are left behind. They're helping design a "rural roadmap" for AI to help small-town schools keep up with their urban peers. And they're planning to equip an RV with AI technologies to deliver training for small-town teachers right at their schools.

"It could pull up into the parking lot and do on-the-spot professional development for rural teachers across the state of Washington," said CESHS Dean Karen Thomas-Brown.

AI presents opportunities and challenges for educators at all levels. In addition to helping school teachers prepare to use AI, WSU is weaving it into its own classes in myriad ways. A team in the Department of Mechanical and Materials Engineering is developing a virtual teaching assistant with funding from the National Science Foundation. Colleges and departments are developing plans for incorporating AI into their curricula and hosting workshops to help students build their skills.

Karen Thomas-Brown

And it's showing up in many smaller, day-to-day ways as well, as professors draw on AI to help prepare class materials, develop chatbots for courses, guide classroom discussion and perform assessments.

CESHS's CAIRE Research Lab supports a spectrum of interdisciplinary research on AI in education. Thomas-Brown is also quantifying how much professors in her college are using AI to teach, with the goal of increasing that by 30% in four years. For a college with graduates in every school district in Washington, it's an imperative that can have a significant impact.

"We're trying to demystify the idea of using AI to enhance your writing as a professor, to enhance your teaching as a professor, to enhance how you communicate," she said. "We're like a financial investor. We're diversifying our portfolio to make sure we're dipping our toes in multiple areas of AI."

Professors face a two-track learning curve: figuring out how, or whether, to draw on AI technologies themselves while equipping students with AI skills for a workplace where most people will need them. That looks different from classroom to classroom, discipline to discipline, and professor to professor.

But one thing is sure: Students are living in a world influenced by AI, and they'll be graduating into one that's even more so.

There is an adoption curve reflected across the faculty, WSU leaders say. Some are enthusiastic early adopters. Some are dipping their toe in the water. And some resist it. However they use it, the key is to be transparent with students, noting where AI is required, allowed or prohibited, and explaining why.

Robert Crossler, chair of the Department of Management Information Systems and Entrepreneurship and holder of the Hubman Distinguished Professorship in Management Information Systems in the Carson College of Business, works with students at last fall's AI@Carson Workshop. Crossler is the founder of the workshop and of the recently launched Carson TechReady Collaborative (photo by Robert Hubner, WSU Photo Services).

Professors have a range of concerns about possible downsides. There is the fear of "metacognitive laziness," or diminished critical-thinking and problem-solving skills arising from using the tools as shortcuts. Academic dishonesty is another worry, as is concern about feeding information into big tech's large language models. WSU leaders said it's important to teach about the potential pitfalls and demonstrate how to use the technology responsibly.

"There's a real opportunity to use AI in our teaching in a way that models ethical AI use for students," said Katherine Watts, a scholarly professor of English who is also chair of the WSU Teaching Academy and director of University Common Requirements. "We have a responsibility to prepare students for that reality, and we have a good opportunity to model ethical and responsible AI use in our teaching."

'Humans in the loop'

An important principle is the need to keep human hands on the wheel at all points.

In the School of Electrical and Computer Engineering and Computer Science, Associate Professor Subu Kandaswamy aims to have students treat AI like an assistant early on and then, as they progress, use it to build applications. But students must still develop an understanding of the underlying algorithms. The faculty and teaching assistants prepare a pool of questions, and students are required to show which piece of code accomplishes which task.

"With the use of AI as an assistant, the strategy for assessment changes," Kandaswamy said. "I'm not going to assess how much they completed because we really don't know if they completed the work or if the AI completed it for them, so the assessment moves toward how much they understand of what they submitted. If they ask it to do some of their work and it spits out something which is 80% right, they have to identify and fix the remaining 20%."

In the humanities and arts, a public debate swirls around the impact of AI on key areas of human creativity. Some limited uses have been adopted in classes - such as using AI to generate ideas or suggest an outline, with the student completing the final work, or the creation of chatbots that may augment a professor's ability to engage with students in individualized ways.

Watts noted that technological changes have always presented challenges for educators.

"Reading and writing have faced a lot of revolutions over the years," Watts said. "Let's not forget that books were once a frightful technology that would ruin your daughter. Pen and paper are technology. I was in graduate school when Wikipedia was going to ruin writing, and my mentors had been through the Internet and word-processing movement. It's a challenge, but it doesn't mean it's all bad. We need to figure out what's going on, and we need to figure out how to respond."

That response needs to begin even earlier than college, which is why CESHS is working to develop tools for the K-12 system to improve education for rural areas and other underserved populations.

Thomas-Brown tells professors: "Please don't join the bandwagon that the sky is falling. The sky is not falling, so don't Chicken-Little it. It is not a threat. It is actually an enhancement. It is an educational skill."

Tina Hilding, the director of communications in the Voiland School of Engineering and Architecture, contributed to this report.

/Public Release. This material from the originating organization/author(s) may be of a point-in-time nature and has been edited for clarity, style and length. Mirage.News does not take institutional positions or sides, and all views, positions, and conclusions expressed herein are solely those of the author(s).