Researchers Translate Biology Into Music

University of Kentucky

For centuries composers have attempted to mimic the natural world in music. Antonio Vivaldi, in the early 18th century, musically illustrated "The Four Seasons." Gustav Holst attempted to orchestrate the solar system in his orchestral suite "The Planets." And in a famous musical illustration of nature, Sergei Prokofiev assigned themes - and specific instruments - to animal characters in the story of "Peter and the Wolf."

But what about the natural world of the human body, inside the cells that make up the tissues of organs? Is there a music that can be applied to those workings?

Three University of Kentucky researchers believe there is, by using a data interpretation method called data sonification. Their research received funding from the Office of the Vice President of Research (OVPR) Igniting Research Collaborations program and the OVPR and UK International Center's UKinSpire program.

Data sonification is the process of taking exact data points - such as barometric pressure readings or gene sequences - and mapping them onto musical elements like melody, pitch, rhythm and timbre. It translates those numbers into music.
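As a concrete illustration, here is a minimal Python sketch of that kind of mapping. The barometric readings are invented, and the choice of a C-major scale of MIDI note numbers is an assumption for the example, not the researchers' method:

```python
# Sketch: map a series of data points onto musical pitches (data sonification).
# The scale and the mapping rule are illustrative choices.

def sonify(values, scale=(60, 62, 64, 65, 67, 69, 71)):  # C major, MIDI note numbers
    """Map each value onto a note of the scale by its relative magnitude."""
    lo, hi = min(values), max(values)
    span = (hi - lo) or 1  # avoid dividing by zero on flat data
    notes = []
    for v in values:
        index = round((v - lo) / span * (len(scale) - 1))
        notes.append(scale[index])
    return notes

pressure_readings = [1012.3, 1012.9, 1013.4, 1011.8, 1010.2, 1013.9]
print(sonify(pressure_readings))  # -> [65, 67, 69, 65, 60, 71]
```

Nearby data values land on nearby pitches, so a trend in the numbers becomes a melodic contour the ear can follow.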

The idea for sonifying cell information was born in the lab of Luke Bradley, Ph.D., acting chair of the College of Medicine's Department of Neuroscience.

"I was working on protein sequences and how they're folded, and there's patterns," Bradley said. "And I would just kind of tap my pencil to the patterns of the amino acids, the different chemistry groups. I was wondering, could that be set to sound?"
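Bradley's pencil-tapping idea can be sketched in code. The three chemistry groupings and the tap symbols below are simplified assumptions for illustration, not his actual classification:

```python
# Illustrative sketch: group amino acids by chemistry and emit one "tap"
# symbol per residue, turning a protein sequence into a rhythmic pattern.
# The groupings are a common textbook simplification.

GROUPS = {
    "hydrophobic": set("AVLIMFWP"),
    "polar":       set("STYNQCG"),
    "charged":     set("DEKRH"),
}

def tap_pattern(sequence):
    """Return one symbol per residue: x = hydrophobic, o = polar, ! = charged."""
    symbols = {"hydrophobic": "x", "polar": "o", "charged": "!"}
    out = []
    for aa in sequence.upper():
        for group, members in GROUPS.items():
            if aa in members:
                out.append(symbols[group])
                break
    return "".join(out)

print(tap_pattern("MKTAYIAKQR"))  # -> x!oxoxx!o!
```

Repeats and alternations in the chemistry of a sequence show up directly as repeats and alternations in the tap pattern.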

At a faculty mixer for the Chellgren Center and Lewis Honors College, Bradley posed that question to Michael Baker, Ph.D., a professor of music theory and composition in the College of Fine Arts' School of Music.

Baker said, "Yeah, probably," and then brought in his School of Music colleague Timothy Moyers, Ph.D., an associate professor of music theory and composition and an electronic music composer.

Cellular symphony

"You can think of the cell as a symphony," Bradley said. "And when everybody's playing in tune and properly, it sounds great. But if you have one group that might have something off in tuning, it doesn't sound good. And so that's what's happening by proxy, in the health of the cell."

That dissonance represents a data point indicating disease, he said.

Moyers said data sonification is a valuable tool because different senses can reveal different features of the same data.

"It could be argued that our ears are much more sensitive to subtle changes than our eyes, especially mine," he said. "I'm colorblind, for instance. If you imagine using color to represent data, if there's a small variation in the shade, I wouldn't be able to pick up on that. In a sample, like recorded audio, there are 44,100 samples per second. If one of those is out of place, we're going to hear that."
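Moyers' point about a single stray sample can be illustrated with a short Python sketch. The corruption and the detection method (a sample pushed past full scale) are invented for the example:

```python
# Sketch: at CD-quality rates of 44,100 samples per second, one corrupted
# sample produces an audible click - and is easy to find numerically.
import math

RATE = 44100  # samples per second
signal = [math.sin(2 * math.pi * 440 * n / RATE) for n in range(RATE)]  # 1 s of A440
signal[12345] += 0.5  # corrupt a single sample

# The stray sample sticks out: here it pushes the waveform past full scale.
clicks = [n for n, s in enumerate(signal) if abs(s) > 1.0]
print(clicks)  # -> [12345]
```

A listener would hear that sample as a click; a small variation in a color gradient, by contrast, can pass entirely unnoticed.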

The sound generated from cellular data points does not always have to be music, though, Moyers added. A seismograph, for example, stays quiet under normal circumstances, but when an earthquake happens the needle starts scratching frenetically against the paper in a soft but percussive manner.

"There are a lot of different elements, and there's a lot more we can represent, beyond melody and rhythm," Moyers said.

The trio of UK researchers has been collaborating with a team at Cambridge University that has built visual models of cancer genomics.

"They could immediately, looking at a visualization, tell not just the presence of cancer, but the specific nature of the cancer, whether it's cancer from smoking or cancer from UV or other factors," Baker said. "We have built a research engine that is parallel to that, in that the different specific kinds of cancers that we're mapping are sonified to make different specific sounds."

As an example, Baker said to imagine a group of trumpet players all playing the same note. Then imagine one of them adds a straight mute and one of them a cup mute - devices that change the tone of the trumpet's sound in different ways. Viewing those trumpets as a metaphor, with the muted trumpets representing mutations and the unmuted pure tone as normal cell activity, the UK research engine that Bradley, Baker and Moyers use can distinguish not only how many muted trumpets there are - the mutations - but what kind of mutes the individual players are using.

"We're quickly able to take an existing pure sine wave sound, hear the wrinkles that get added to this based on the distortions in the sound, correlated to the data points, and be able to tell exactly what kind of cancer is involved," Baker said.
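A hedged sketch of that idea: start from a pure sine wave, add one harmonic "wrinkle" per hypothetical mutation type, and recover which wrinkles are present by projecting the waveform onto each harmonic. The mutation-to-harmonic mapping below is invented for illustration and is not the team's actual encoding:

```python
# Sketch of the trumpet-mute metaphor: distortions added to a pure sine
# encode mutation types, and each can be identified from the waveform.
import math

RATE, F0, N = 8000, 100, 8000  # sample rate, fundamental (Hz), one second
MUTATIONS = {"smoking": 3, "UV": 5}  # mutation type -> harmonic number (assumed)

def tone(present):
    """Pure sine plus one overtone 'wrinkle' per mutation present."""
    wave = []
    for n in range(N):
        t = n / RATE
        s = math.sin(2 * math.pi * F0 * t)
        for m in present:
            s += 0.3 * math.sin(2 * math.pi * F0 * MUTATIONS[m] * t)
        wave.append(s)
    return wave

def detect(wave):
    """Report mutation types whose harmonic carries significant energy."""
    found = []
    for m, h in MUTATIONS.items():
        # correlate with the harmonic's sine (a one-bin Fourier projection)
        amp = 2 / N * sum(w * math.sin(2 * math.pi * F0 * h * n / RATE)
                          for n, w in enumerate(wave))
        if amp > 0.1:
            found.append(m)
    return found

print(detect(tone(["UV"])))  # -> ['UV']
```

Because each "mute" leaves energy at its own harmonic, the analysis can report not just that the tone is distorted but which distortions are present.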

In addition to being a detection tool for mutations and other cellular anomalies, Bradley said this application of data sonification can be used in the classroom.

"It actually started as an education tool," he said. "We were using this to illustrate to students - high school and middle school students - what a mutation is. How does it affect the symphony of the cell?"

Baker said data sonification's value as an educational tool goes beyond the textbook.

"I think good teaching is not just about - at the end of the semester - having people know things, having people be able to report things," he said. "Good teaching gives ways to ask much more interesting questions."

He said data sonification could work as a "what-if generator," allowing student researchers to alter data points into mutations and other irregularities and hear how the changes affect the cellular symphony.

Collaborating with numbers

Data sonification can have an artistic application as well.

"Music and art are all about patterns and how we react to those patterns," Moyers said. "But I'm very interested in seeing if I could angle this in a way that I could use this for my own performances, my own creative purposes. It's a new angle that I'm quite excited about."

Moyers said he is taking samples from data sets and using the resulting audio to explore diverse ways of presenting or performing it.

"I'm kind of viewing it as though I'm collaborating in a compositional way or performative way with the data," he said.

Data sonification also extends to other scientific disciplines. In astronomy, for example, researchers map data such as planetary orbital speeds onto sound. Baker said astronomers are comparing the harmonic relationships among orbital speeds in our solar system with those of exoplanets - planets outside our solar system, ranging from four to 21,000 light-years from Earth. Applying data sonification to that field has led to striking discoveries.

In 2017, astronomers discovered the Trappist 1 system, a red dwarf star orbited by seven planets.

"This one fellow that studies sonification of exoplanetary systems found that the planets in Trappist 1 fit into an exact Earth-like harmony in terms of their rotational speeds," Baker said. "The outermost planet at one speed. The next outermost planet goes twice as fast. The next one goes three times as fast as the outermost planet."

The tones created by the data sets associated with those orbital speeds form an overtone series, Baker said. In music, the overtone series is a collection of frequencies in whole-number ratios that blend into a single perceived pitch, in the same way all the colors of the rainbow combine to make white light. Trappist 1 - or at least the data sets associated with the star system - "vibrated" at a specific pitch, like the planets in our solar system.
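The whole-number relationship Baker describes can be sketched by scaling orbital-frequency ratios into the audible range, where they land on the overtone series of a shared fundamental. The 1-2-3-4 ratios below follow the article's simplified description, not Trappist 1's measured periods:

```python
# Sketch: whole-number orbital-frequency ratios, scaled onto a shared
# fundamental, become members of that fundamental's overtone series.

def to_audible(orbital_ratios, fundamental_hz=110.0):
    """Scale whole-number orbital-frequency ratios onto a fundamental pitch."""
    return [fundamental_hz * r for r in orbital_ratios]

# Outermost planet at 1x, each faster planet a whole-number multiple,
# per the article's simplified description of the resonances.
ratios = [1, 2, 3, 4]
print(to_audible(ratios))  # -> [110.0, 220.0, 330.0, 440.0]
```

With 110 Hz (the pitch A2) as the fundamental, the resulting frequencies are its first four harmonics, which the ear fuses into a single tone - the "vibration at a specific pitch" described above.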

What began as a hypothetical question at a faculty mixer is becoming a valuable diagnostic and teaching tool for researchers and students. The practice of data sonification is still in its relative infancy, and its practitioners eagerly anticipate new opportunities and applications.

"There's still a lot of work to be done and a lot of different pathways to pave in that way," Moyers said.
