These musicians transform scientific data into elaborate melodies

If staring at rows of numbers and graphs seems humdrum, these musicians agree. They are on a mission to reveal new scientific information through sound by turning flat datasets into musical scores — creating the soundtrack for science:

Listen to Mark Ballora’s sonification of a singularity, with flutes and electronics:

Jenni Evans first met Mark Ballora at a Penn State social gathering. Both were professors at the university, but they worked in very different fields. Evans was a meteorology professor, while Ballora studied music technology. At the time, Evans didn’t know how music could impact her research. Like most scientists, she analyzed data with graphs, charts, and statistics. She had never heard of a “sonification.” But Ballora’s enthusiasm convinced her to give it a try.

“I’ve been a traditional scientist my whole life. I’ve always wanted to find a way to make my science relevant beyond the formal science setting, and this was a way to get out of myself,” she says. “My creativity came through working with (Ballora).”

Ballora passed away on July 18, 2019, leaving behind a legacy of audio representations of data, called sonifications. Sonifying data is an emerging field, which Ballora helped to pioneer. From climate trends to a squirrel’s heartbeat, Ballora wove elaborate melodies with the precision and accuracy of a scientist. Ballora said it wasn’t uncommon for scientists to “see” new information in his sonifications that they couldn’t see in their graphs. He called this the “cracks between the sounds.”

For Evans, this fortuitous partnership was long-awaited.

Evans studies hurricanes and tropical cyclones that stray from the typical route. When Ballora sonified her data, she noticed that the structure of the cyclone’s eye played an important role in determining the path, rainfall, and wind patterns.

Ballora wrote a computer program that assigned musical parameters to data points, like the storm’s longitude. Listening to the data in sonic form, they could hear when a hurricane was moving and gaining speed or intensity. Evans describes the whooping sounds as “techno music” and “something really cool.” For the first time, she could hear how the shape of a storm related to its wind strength, something her charts and graphs hadn’t revealed.
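
To give a sense of how such a mapping works in practice, here is a minimal Python sketch of parameter-mapping sonification. The wind-speed readings and the pitch range are invented for illustration; this is the general technique, not Ballora’s actual program.

```python
# A minimal parameter-mapping sketch (illustrative only, not Ballora's code).
# Each data point (here, a made-up hourly wind-speed reading in knots)
# becomes a short tone whose pitch rises with wind speed.
import numpy as np
from scipy.io import wavfile

SAMPLE_RATE = 44100
TONE_SECONDS = 0.25  # each reading becomes a quarter-second tone

# Hypothetical hourly wind speeds for one storm (knots)
wind_speeds = [35, 40, 55, 70, 90, 110, 95, 60, 40]

def speed_to_freq(knots, lo=220.0, hi=880.0, kmin=30.0, kmax=120.0):
    """Linearly map wind speed onto a pitch range (A3 up to A5)."""
    frac = (knots - kmin) / (kmax - kmin)
    return lo + frac * (hi - lo)

t = np.linspace(0, TONE_SECONDS, int(SAMPLE_RATE * TONE_SECONDS), endpoint=False)
tones = [np.sin(2 * np.pi * speed_to_freq(k) * t) for k in wind_speeds]
audio = np.concatenate(tones)

# Scale to 16-bit samples and write a listenable file
wavfile.write("storm.wav", SAMPLE_RATE, (audio * 0.8 * 32767).astype(np.int16))
```

Swap in any other column of a dataset, such as longitude or pressure, and the same recipe applies: pick a musical parameter, define the mapping, render, and listen.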

Ballora also sonified the 2005 hurricane season, the most active Atlantic season on record.

Listen to Mark Ballora’s sonification of the 2005 hurricane season:

“That just blew my mind. You could (understand) the evolution of all the storms at the same time and hear them in relation to each other. When there were no storms, it went from a noisy cacophony to dead silence,” Evans says.

Ballora’s work led to other discoveries. His sonifications revealed electromagnetic waves produced during high-energy stellar explosions, something others had missed when analyzing the data by sight alone. But while Ballora aimed for strict accuracy, his creative side wasn’t lost. He selected shimmering tones to represent solar wind — the charged particles that form the northern lights. He used a swirling sound to describe tropical storms. Ballora even collaborated with Mickey Hart, the former Grateful Dead drummer, and cosmologist George Smoot to create the sound of a rotating neutron star for the film Rhythms of the Universe.

Listen to the sound of a rotating neutron star, created by Ballora with Mickey Hart and George Smoot for the film Rhythms of the Universe:

Now, his colleagues across the field are feeling his loss.

Psychology professor Bruce Walker, who heads the Georgia Tech Sonification Lab, also studies ways to use sound to present information. He says Ballora’s multidisciplinary nature was essential to his leadership in the field. Walker, who has expertise in computing and psychology, says that sonifications are “for people who can’t look and can’t see.”

Imagine a firefighter in a smoky building or a scuba diver in a dark kelp forest: they aren’t blind, but, given the conditions, they can’t see. Or think of a driver focused on the road ahead, who can’t look behind them, or a surgeon concentrating on their patient, who can’t look at a monitor screen. These are the people, Walker says, who “run up against the limits of their vision.” Sonifications could fill in the gaps.

According to Walker, humans have an acute ability to discern audio patterns. Take speech, for instance. It is a complex, time-varying signal that encodes data — the data of our thoughts — with pitch, rhythm, tone, and volume. Yet, somehow, we can decipher a message and understand each other.

Walker has found evidence that we recognize some patterns better by ear than by eye. Recently, he developed a sonification to help doctors diagnose melanoma. Doctors visually inspect suspicious moles to determine whether they are cancerous, a process that is 40% accurate, at best. So Walker tried using sound to improve the odds: a doctor photographs the suspicious mole, then a computer program analyzes the image, assigning sounds, such as pitches and tones, to its colors and textures. The program then generates a sonification that represents the mole. Listening to Walker’s sonification, doctors can diagnose cancer with 90% accuracy.
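
The article doesn’t describe Walker’s algorithm in detail, but the overall shape of such a pipeline can be sketched. In the hypothetical Python below, two crude image features stand in for the real analysis, and the input file mole.jpg is assumed; none of this is the Sonification Lab’s actual method.

```python
# Illustrative sketch only, not Walker's actual melanoma algorithm.
# Map two simple image features to sound: mean darkness -> pitch,
# color variance (a crude proxy for irregularity) -> a rough overtone.
import numpy as np
from scipy.io import wavfile
from PIL import Image

SAMPLE_RATE = 44100

img = np.asarray(Image.open("mole.jpg").convert("RGB"), dtype=float) / 255.0
darkness = 1.0 - img.mean()                 # 0 (white) .. 1 (black)
irregularity = img.std(axis=(0, 1)).mean()  # uneven color across the image

freq = 200 + 600 * darkness                 # darker lesions sound higher
t = np.linspace(0, 2.0, SAMPLE_RATE * 2, endpoint=False)
tone = np.sin(2 * np.pi * freq * t)
# More color irregularity adds a stronger, dissonant overtone
tone += irregularity * np.sin(2 * np.pi * freq * 2.7 * t)
tone /= np.abs(tone).max()

wavfile.write("mole.wav", SAMPLE_RATE, (tone * 32767).astype(np.int16))
```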

“When we convert complex data into sound and listen to it, quite often what emerges is something we can understand through sound, even though we could never understand it visually,” Walker says.

Scientists have actually been doing this for a long time. Galileo, a trained musician, first determined that objects fall with uniform acceleration by listening to a ball roll down a plank strung with lute strings. In the 1960s, NASA scientists realized a space probe was traveling through the rings of Saturn when they converted its data to sound and heard wave-like bow shocks, explains Walker.

In 2015, scientists detected gravitational waves from colliding black holes using gravitational wave detectors, which act like supersensitive microphones, and converted the signal into sound.

Listen to the sound of two black holes colliding, recorded by LIGO in 2015:
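
Converting a detector’s time series into sound is often a direct “audification”: the samples are simply played back as a waveform. The self-contained Python sketch below writes an audible rising chirp as a stand-in for real strain data (the actual detector data is openly published); it illustrates only the playback step.

```python
# Audification sketch (illustrative): treat a time series as an audio
# waveform. A synthetic rising 'chirp', reminiscent of a black-hole
# merger signal, stands in for real detector strain data here.
import numpy as np
from scipy.io import wavfile

SAMPLE_RATE = 44100
t = np.linspace(0, 1.0, SAMPLE_RATE, endpoint=False)

# Instantaneous frequency sweeps from 60 Hz up to 660 Hz over one second
strain = np.sin(2 * np.pi * (60 * t + 300 * t ** 2))

strain /= np.abs(strain).max()  # normalize into the playable range
wavfile.write("chirp.wav", SAMPLE_RATE, (strain * 32767).astype(np.int16))
```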

For musician Kelly Snook, Professor of Media Arts Technology at the University of Brighton, sonifications made complex mathematical objects, like the Mandelbrot set, easier to understand.

“Hearing the sonification of the Mandelbrot set made me understand mathematics. How powerful is this? Our senses really do work together,” she says. “This is what I want to do — make the inaccessible accessible through music.”

Snook is now building a virtual instrument that plays datasets. The first will mark the 400th anniversary of Johannes Kepler’s 1619 book Harmonies of the World, which described planetary motion.

Snook thinks that people can understand science better if they can see and hear the datasets at the same time. Her instrument, called Concordia, will be accessible at museums and through an iPhone app, allowing everyone to play with sonifications.

Brandon Biggs is a self-taught programmer who makes software content more accessible to the blind. He says that “hands-on learning” experiences, like Snook’s Concordia, aren’t as available to educators as they should be.

“A lot of education today is completely visual, and they are ignoring audio and touch. That’s a problem,” he says. “We are neglecting two-thirds of our brain. People need to be exposed to multisensory learning.”

Biggs co-founded Sonja Biggs Educational Services to provide technology and support, making education more accessible to blind students. But that wasn’t always his goal. Once a vocal performance major, he originally wanted to go on to study music technology. At the time, the technology wasn’t ready for someone like Biggs, who has been legally blind since birth. Now, Biggs creates audio maps for strategy games — think World of Warcraft but without the computer screen. A gamer himself, Biggs designs sonifications to help the player guide their character through the game.

Recently, Biggs moved from mapping virtual landscapes to real landscapes. He built an audio and tactile map of the playground in his hometown of Palo Alto, CA. It is now in the testing phase.

“There are no maps that you can view via sound,” he points out. “Google Maps is about as close to sonification as maps get. But I know you look at the GPS at some point.” Biggs explains that the navigation experience isn’t built for blind people, who rely on it even more than sighted users do.

This field isn’t just useful for scientists or blind people: Margaret Schedel, a music professor at Stony Brook University, thinks music can also help people with Parkinson’s disease walk better.

Schedel partnered with Lisa Muratori, a professor of physical therapy, to create a program that uses sound to improve gait. They asked Parkinson’s patients to listen to their walking pattern in sonic form. They started with a simple sound — “bee-oo, bee-oo” — but it only took a week to realize that listening to it all day long is tedious, at best. Instead, they had the patients listen to distorted popular music and audiobooks, which played correctly only when their walk improved. Schedel saw improvements in the patients’ gaits, even after they stopped therapy.
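
The feedback loop they describe can be sketched in a few lines of Python. Here, a hypothetical read_step_interval() stands in for a real gait sensor, and the target cadence, tolerance, and rate formula are all invented for illustration; this is not the actual therapy software.

```python
# Illustrative sketch of gait-contingent audio feedback. Music plays
# undistorted only when the measured step interval is near a target;
# otherwise playback is audibly warped.
import random

TARGET_INTERVAL = 0.55  # desired seconds between steps (invented)
TOLERANCE = 0.05

def read_step_interval():
    """Hypothetical sensor: seconds between the patient's last two steps."""
    return random.uniform(0.4, 0.8)

def playback_rate(interval):
    """Return 1.0 (the 'reward') near the target, else an audible warp."""
    error = interval - TARGET_INTERVAL
    if abs(error) <= TOLERANCE:
        return 1.0
    # Walking too slowly (positive error) slows the music, and vice versa
    return max(0.5, min(1.5, 1.0 - error))

for _ in range(5):
    step = read_step_interval()
    print(f"step interval {step:.2f}s -> playback rate {playback_rate(step):.2f}x")
```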

Schedel says data sonification got underway when Big Data took off, and scientists “ran out of channels visually to convey information.” And, like tapping your feet to a musical beat, understanding sonifications can be learned.

“We aren’t born understanding pie charts,” Schedel says. “Scientists can learn how to listen for data patterns.”

But to Schedel, who is a classically trained musician first, scientific sonifications should not neglect aesthetic appeal.

“We’ve lost the Renaissance idea of music and art and science,” Schedel says.

Collaborations between musicians and scientists of all kinds happen at the annual International Conference on Auditory Display (ICAD). Ballora, who hosted the 2017 conference, once wrote, “Graphics may dazzle us, but they don’t get our feet tapping… In how many ways can we tap into the magic that is sound.”

During the 2020 conference, Mark Ballora’s absence won’t go unnoticed. 

“He was one of the pillars of the ICAD community,” says Snook, “a mentor to a lot of students.”
