INDIANAPOLIS, March 26, 2023 — In chemistry, we have He, Fe and Ca — but what about do, re and mi? Hauntingly beautiful melodies aren’t the first things that come to mind when looking at the periodic table of the elements. However, using a technique called data sonification, a recent college graduate has converted the visible light given off by the elements into audio, creating unique, complex sounds for each one. Today, the researcher reports the first step toward an interactive, musical periodic table.
The researcher will present his results at the spring meeting of the American Chemical Society (ACS). ACS Spring 2023 is a hybrid meeting being held virtually and in-person March 26–30, and features more than 10,000 presentations on a wide range of science topics.
A video on the research is available at www.acs.org/elementmusic.
Previously, W. Walker Smith, the project’s sole investigator, took his combined passions of music and chemistry and converted the natural vibrations of molecules into a musical composition. “Then I saw visual representations of the discrete wavelengths of light released by the elements, such as scandium,” says Smith. “They were gorgeous and complex, and I thought, ‘Wow, I really want to turn these into music, too.’”
Elements emit visible light when they are energized. This light is made up of multiple individual wavelengths, or particular colors, with brightness levels that are unique for each element. But on paper, the collections of wavelengths for different elements are hard to tell apart visually, especially for the transition metals, which can have thousands of individual colors, says Smith. Converting the light into sound frequencies could be another way for people to detect the differences between elements.
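To illustrate the light-to-frequency step described above (this is not Smith’s actual code), the snippet below converts hydrogen’s four visible emission lines, the well-known Balmer lines, from wavelength to light frequency using f = c / wavelength. Each element has its own characteristic set of such lines.

```python
C = 2.998e8  # speed of light, m/s

# Hydrogen's visible (Balmer) emission lines, in nanometers.
balmer_nm = [656.3, 486.1, 434.0, 410.2]

# Convert each wavelength to its light frequency: f = c / wavelength.
light_hz = [C / (wl * 1e-9) for wl in balmer_nm]

for wl, f in zip(balmer_nm, light_hz):
    print(f"{wl:.1f} nm -> {f:.3e} Hz")
```

The red 656.3 nm line comes out near 4.6 × 10¹⁴ Hz and the violet 410.2 nm line near 7.3 × 10¹⁴ Hz, so shorter wavelengths map to higher frequencies, roughly spanning the “octave” of visible light discussed later in the article.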
Creating sounds for the elements of the periodic table has been done before, however. For instance, other scientists have assigned the brightest wavelengths to single notes on a traditional piano. But this approach reduced the rich variety of wavelengths released by some elements to just a few sounds, explains Smith, who is currently a researcher at Indiana University.
To retain as much of the complexity and nuance of the element spectra as possible, Smith consulted faculty mentors at Indiana University, including David Clemmer, Ph.D., a professor in the chemistry department, and Chi Wang, D.M.A., a professor in the Jacobs School of Music. With their assistance, Smith wrote real-time audio code that converts each element’s light data into a mixture of notes: each discrete wavelength becomes a sine wave whose frequency corresponds to that of the light and whose amplitude matches the light’s brightness.
Early in the research process, Clemmer and Smith discussed the pattern similarities between light and sound vibrations. For instance, within the colors of visible light, violet has almost double the frequency of red, and in music, a doubling of frequency corresponds to an octave. Therefore, visible light can be thought of as an “octave of light.” But this octave of light sits at far higher frequencies than the audible range. So, Smith scaled the sine waves’ frequencies down by a factor of approximately 10^12, fitting the audio output into the range where human ears are most sensitive to differences in pitch.
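Smith’s real-time program is not reproduced here, but the mapping the article describes can be sketched offline: each spectral line becomes a sine wave, amplitude follows brightness, and every frequency is multiplied by the roughly 10⁻¹² scale factor mentioned above. The spectral lines and brightness values below are hypothetical stand-in data (hydrogen-like wavelengths), not measurements from the project.

```python
import math
import struct
import wave

C = 2.998e8        # speed of light, m/s
SCALE = 1e-12      # ~10^-12 scale factor, per the article
SAMPLE_RATE = 44100
DURATION = 2.0     # seconds

# Hypothetical spectral lines: (wavelength in nm, relative brightness 0-1).
lines = [(656.3, 1.0), (486.1, 0.5), (434.0, 0.25)]

# Each line becomes a sine partial: audio frequency = light frequency * SCALE.
partials = [((C / (wl * 1e-9)) * SCALE, amp) for wl, amp in lines]

# Mix the partials and normalize the sum into [-1, 1].
n = int(SAMPLE_RATE * DURATION)
total_amp = sum(a for _, a in partials)
samples = []
for i in range(n):
    t = i / SAMPLE_RATE
    s = sum(a * math.sin(2 * math.pi * f * t) for f, a in partials)
    samples.append(s / total_amp)

# Write the mixture out as a mono 16-bit WAV file.
with wave.open("element.wav", "wb") as w:
    w.setnchannels(1)
    w.setsampwidth(2)
    w.setframerate(SAMPLE_RATE)
    w.writeframes(b"".join(struct.pack("<h", int(s * 32767)) for s in samples))
```

With this scale factor, a 656.3 nm line (about 4.6 × 10¹⁴ Hz of light) lands near 457 Hz of audio, squarely in the range where human hearing is most sensitive to pitch differences, which matches the motivation given in the paragraph above.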