AI Enhances Instrumentalists' Ability to Create Melodic Music


Music is finding a place in the world of artificial intelligence as Kristen Yun, a performing cellist and clinical associate professor of music at Purdue, leads a multidisciplinary team researching and developing AI tools to help and improve individual and ensemble performances. (Purdue University photo/John Underwood)

Purdue project examines ways technology can help improve soloist, ensemble musicians

WEST LAFAYETTE, Ind. — Purdue University is putting artificial intelligence center stage with novel research examining the technology's potential to improve musical performance, both on an individual and group level.

The multidisciplinary project aims to lead the way in combining STEM (science, technology, engineering and math) and art through AI. The project is funded by a National Science Foundation grant.

Yeon-Ji "Kristen" Yun, a performing cellist and the project's principal investigator, said music and AI have already been used together in areas such as recording and composition.

"But it's not been used much in music performance and education," said Yun, a clinical associate professor of music at the Patti and Rusty Rueff School of Design, Art, and Performance in Purdue's College of Liberal Arts. "So as a musician myself, I wondered if AI technology could actually provide a benefit to musicians and possible advantages to performance while keeping their creative ability."

This project will develop and integrate techniques from computer vision, natural language processing and audio analysis to create two AI-enabled tools for string music performers. The first tool, the Evaluator, aims to improve individual practice and performance by analyzing audio and video of a musician, then comparing it to digitized music scores and a database of video performances. It will offer potential improvements by detecting deviations in the audio and posture adjustments in the video.
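The article does not describe how the Evaluator detects deviations; as a minimal illustrative sketch (not the project's method), one common approach is to measure how far each played pitch drifts from the score's reference pitch in cents, where 100 cents equals one equal-temperament semitone. The frequencies and tolerance below are hypothetical.

```python
import numpy as np

# Hypothetical reference pitches (Hz) from a digitized score and the
# pitches estimated from a recording of the same passage.
score_hz = np.array([220.0, 246.94, 261.63, 293.66])   # A3, B3, C4, D4
played_hz = np.array([221.5, 246.94, 259.0, 293.66])

def deviation_cents(played, reference):
    """Deviation of each played pitch from the score, in cents
    (100 cents = one equal-temperament semitone)."""
    return 1200.0 * np.log2(played / reference)

def flag_deviations(played, reference, tolerance=10.0):
    """Return indices of notes that drift more than `tolerance` cents."""
    cents = deviation_cents(played, reference)
    return np.flatnonzero(np.abs(cents) > tolerance)

print(flag_deviations(played_hz, score_hz))  # indices of notes worth practicing
```

Here the first note is about 12 cents sharp and the third about 17 cents flat, so both would be flagged; the 10-cent tolerance is an arbitrary placeholder a real system would tune per context.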

The second tool, the Companion, is intended to play the part of absent instruments in an ensemble by using audio analysis of performances to match the tempo and style of the musicians.

The initial focus of the research is on stringed instrument performances, but the intent is for the tools to work with any instrument.

"These tools can be for string players at any level," Yun said. "They can be for professionals or for amateur musicians, young or old."

Yun is joined in the research by Purdue co-principal investigators Yung-Hsiang Lu, professor of electrical and computer engineering; Yingjie "Victor" Chen, professor of computer graphics technology in Purdue Polytechnic Institute; Cheryl Zhenyu Qian, chair of art and design at the Rueff School; and Mohammad Rahman, the Daniels School Chair in Management at the Mitchell E. Daniels, Jr. School of Business.

Yun said the Companion tool is much more in-depth than simply reading the music and reproducing the song.

"Musicians have freedom to create their own interpretations," she said. "If they place a note slightly longer or slightly slower, we need to separate the duration from frequency while still following the ensemble's tempo."
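Yun's point about separating duration from frequency can be sketched in code. This is a hypothetical illustration, not the project's implementation: pitch identity is recovered by snapping a detected frequency to its nearest equal-temperament note, independently of how long the note is held, while tempo is estimated separately from the gaps between note onsets. All values below are made up.

```python
import numpy as np

# Hypothetical note onset times (seconds) and fundamental
# frequencies (Hz) detected from an ensemble partner's playing.
onsets = np.array([0.0, 0.52, 1.01, 1.55, 2.04])
freqs = np.array([440.0, 443.0, 392.0, 329.6])  # one per note

def to_midi(freq_hz):
    """Map a frequency to its nearest equal-temperament MIDI note,
    so a note held longer or played slightly sharp still resolves
    to the same pitch identity."""
    return np.rint(69 + 12 * np.log2(freq_hz / 440.0)).astype(int)

def local_tempo_bpm(onset_times, beats_per_note=1.0):
    """Estimate tempo from inter-onset intervals, independent of pitch."""
    ioi = np.diff(onset_times)  # seconds between successive notes
    return 60.0 * beats_per_note / ioi.mean()

print(to_midi(freqs))                      # pitch identities (MIDI numbers)
print(round(local_tempo_bpm(onsets), 1))   # followed tempo in BPM
```

In this toy data the slightly sharp 443 Hz note still maps to A4 (MIDI 69), and the uneven onsets average out to roughly 118 BPM, showing how the two dimensions can be tracked independently.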

The team is examining when AI technology can provide measurable benefits to musicians' practice and performance, as well as what factors affect future musicians' acceptance of AI technology in their work.

Yun will collaborate on user studies for the project with Ka-Wai Yu, associate professor in music at Utah Tech University.

/Public Release. This material from the originating organization/author(s) may be of a point-in-time nature and edited for clarity, style and length. Mirage.News does not take institutional positions or sides, and all views, positions, and conclusions expressed herein are solely those of the author(s).