The brain does not necessarily perceive the sounds in music at the exact moment they are played. New research sheds light on musicians’ implicit knowledge of sound and timing.
“It is very important for our overall impression of music that the details are right,” says musicologist Anne Danielsen at the RITMO Center for Interdisciplinary Studies in Rhythm, Time and Motion at the University of Oslo.
Together with her research colleague Guilherme Schmidt Câmara, she is looking for answers to what these details are. They know there are some basic rules relating to sound and timing which most creators of music comply with. Few, however, are aware of what they actually do in order to make it sound right.
“When we talk to musicians and producers, it becomes clear that they simply adjust sounds automatically in order to get the right timing—it’s a form of implicit knowledge,” says Câmara.
In order to make this knowledge more explicit, the researchers have studied the factors that influence when we perceive a sound as happening. They have found a pattern: our perception of timing is closely related to the quality of the sound, whether it is soft or sharp, short or long and wobbly.
When does a sound happen?
Timing the sounds of all the instruments so that the music sounds good is essential, but the notes are not necessarily played at the moment you hear them.
“Scientists have previously assumed that we perceive the timing at the beginning of a sound but have not reflected critically on what happens when the sounds have different shapes,” says Danielsen.
A sound has a rhythmic center. If you imagine a sound wave, this center is located near the top of the wave, and your perception of where in time the sound is located is actually up there, rather than where it begins.
“If the sound is sharp, the beginning coincides with this rhythmic center, while with a longer and more wobbly sound, we perceive the center as falling long after the sound has actually begun.”
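In the research literature this rhythmic center is often called the perceptual center, or P-center. As a rough illustration of the idea, and not the researchers’ own method, the sketch below estimates where a sound’s center lies by smoothing its amplitude envelope and finding the peak; the smoothing window and the two test signals are assumptions made purely for the example.

```python
import numpy as np

def rough_p_center(samples, sr, smooth_ms=10):
    # Smooth the absolute waveform into an amplitude envelope and return
    # the time of its peak, in seconds after the physical onset.
    window = max(1, int(sr * smooth_ms / 1000))
    envelope = np.convolve(np.abs(samples), np.ones(window) / window, mode="same")
    return np.argmax(envelope) / sr

sr = 44100
t = np.linspace(0, 0.5, int(sr * 0.5), endpoint=False)
sharp = np.exp(-t / 0.01) * np.sin(2 * np.pi * 220 * t)       # fast, percussive decay
soft = np.sin(np.pi * t / 0.5) * np.sin(2 * np.pi * 220 * t)  # slow swell and fade

print(rough_p_center(sharp, sr))  # close to 0 s: the center sits at the onset
print(rough_p_center(soft, sr))   # around 0.25 s: the center lags the onset
```

With a sharp attack the envelope peaks almost immediately, so the perceived center and the physical onset coincide; with a slow swell the peak, and with it the perceived timing, arrives much later.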
In order to hit a beat, or to play together in a band, musicians have to tune in to each other to get it right.
“If you have a soft sound and you want it to be heard exactly on the beat, then you need to place it a little early so that it can be experienced like that,” says Danielsen.
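In scheduling terms, that rule of thumb amounts to starting the sound earlier by roughly the time it takes to reach its perceptual center. A minimal sketch, with invented attack times:

```python
def scheduled_onset(beat_time, attack_time):
    # Start the sound early by its attack time so that its perceptual
    # center, rather than its physical onset, lands on the beat.
    return beat_time - attack_time

# Hypothetical attack times (seconds from physical onset to perceptual center)
snare_attack = 0.002   # sharp, percussive: barely any compensation needed
pad_attack = 0.080     # soft swell: must start noticeably early

beat = 2.000  # the beat falls at exactly 2.0 s
print(scheduled_onset(beat, snare_attack))  # 1.998 s
print(scheduled_onset(beat, pad_attack))    # 1.920 s
```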
Experiments reveal musicians’ strategies
In order to investigate this, Câmara has conducted controlled experiments with skilled guitarists, bassists and drummers.
“They were all given a rhythmic reference, a simple groove pattern that can be found in many genres. Then they were asked to play along with it in three different ways: either right on the beat, a little behind, or a little ahead,” he explains.
This way, he could test their perception of the timing of different sounds, and how they play in order to time the sounds to a beat. After the experiments, they asked the musicians what they had been trying to do.
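One way to quantify performances in a task like this is to measure each played onset against the nearest beat of the reference pattern, so that the sign of the difference shows whether the musician landed ahead of or behind the beat. The sketch below illustrates that kind of analysis with invented numbers; it is not the study’s actual measurement pipeline.

```python
def asynchronies(played_onsets, beat_times):
    # Signed distance (ms) from each played onset to the nearest reference
    # beat: negative = ahead of the beat, positive = behind it.
    result = []
    for onset in played_onsets:
        nearest = min(beat_times, key=lambda b: abs(onset - b))
        result.append((onset - nearest) * 1000)
    return result

beats = [0.0, 0.5, 1.0, 1.5]           # reference groove at 120 BPM
played = [0.012, 0.508, 1.015, 1.511]  # invented "behind the beat" take

offsets = asynchronies(played, beats)
print(offsets)                   # all positive: consistently late
print(sum(offsets) / len(offsets))  # mean asynchrony in ms
```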
“They describe it in their own words, saying they are playing slower or more heavily when they are aiming to land behind the beat. This accords well with the pattern we see: they shape the sound itself, not just where it is placed.”
Danielsen points out that timing one’s own playing to a beat is something that all musicians practice, so it is something that everyone thinks about.
“However, they are much less aware of how they use sound to communicate timing differences,” she says.
Musicians manipulate sound and time
The researchers believe our perception of sound in time is based on fundamental psychoacoustic rules, that is, rules for how the brain perceives sound signals. All musicians take these largely unstated rules into account, but how they do so depends on the genre they are playing.
“Each genre has a characteristic microrhythmic profile. Samba has its own, EDM has its own, hip-hop has another,” says Danielsen.
In music production, the producer sees the sound in front of her on the screen and can twist and turn the music by adjusting how the sounds are placed in relation to each other.
“Producers who create a groove on a computer know this. They move sounds back and forth on the beat and think: ‘if I put it there it works, and if I put it there it doesn’t.’ So, they learn through experience, and if something needs to sound precise, they need to juggle the sounds around a bit.”
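In practice, that kind of nudging means shifting a sound’s onset by a few milliseconds relative to the quantized grid. A toy illustration, with made-up offsets:

```python
# Hypothetical track of onsets quantized to a 120 BPM sixteenth-note grid
bpm = 120
sixteenth = 60 / bpm / 4               # 0.125 s per sixteenth note
grid = [i * sixteenth for i in range(8)]

def nudge(onsets, offsets_ms):
    # Move each onset slightly off the grid: negative offsets pull a hit
    # early ("pushed"), positive offsets drag it late ("laid back").
    return [t + off / 1000 for t, off in zip(onsets, offsets_ms)]

offsets = [0, -8, 0, -8, 12, -8, 0, -8]  # milliseconds, invented for the example
print(nudge(grid, offsets))
```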
AI strives to give music human qualities
The researchers believe that our knowledge about how different types of sound affect timing could be used to develop software that uses artificial intelligence to create music.
“We can already make a sequence groovier and more human so that it doesn’t sound completely mechanical. If we start with a programmed beat, then the algorithms can move the sounds slightly to affect the style. If the algorithm also takes the shape of the sound into account, we can obtain an even broader palette of rhythmic conditions that can shape the music in a more esthetically pleasing way,” says Câmara.
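A sketch of what such a step might look like if it also took the shape of the sound into account: combine a genre-dependent microtiming offset with compensation for the sound’s attack, as described earlier. The style profiles and attack times below are invented placeholders, not values measured in the study.

```python
import random

# Invented microtiming profiles: mean offset (ms) and jitter (ms) per style
STYLE_PROFILES = {
    "edm":     (0.0, 2.0),    # tight, nearly mechanical
    "hip-hop": (12.0, 6.0),   # laid back
    "samba":   (-8.0, 5.0),   # slightly pushed
}

def humanize(beat_times, style, attack_time_ms):
    # Offset each programmed onset by a style-dependent amount of
    # microtiming, then start the sound early by its attack time so its
    # perceptual center, not its physical onset, carries the timing.
    mean, jitter = STYLE_PROFILES[style]
    out = []
    for t in beat_times:
        micro = random.gauss(mean, jitter) / 1000  # style "feel" in seconds
        out.append(t + micro - attack_time_ms / 1000)
    return out

beats = [0.0, 0.5, 1.0, 1.5]  # programmed quarter notes at 120 BPM
print(humanize(beats, "hip-hop", attack_time_ms=40))  # soft sound, laid-back feel
```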
When you listen to music, it doesn’t take much before something sounds wrong. It’s about context, and about the type of music involved.
“When we play live, we want a margin of error; we’re not machines. There is always a certain amount of asynchrony,” says Câmara, who is a musician himself.
Although we are talking about tiny shifts, humans have a trained ear for placing something in time by using sound.
“In some contexts, 10 to 20 milliseconds may be enough to hear a difference. We don’t need to be fully aware of this, but we can feel it.”
Anne Danielsen points out that this does not just apply to people who work with music.
“Compared to what we perceive with our eyes, our sense of time and sound is extremely precise. This makes us very sensitive to spatial sound differences. But also, when listening to differences in voices, whether someone is angry, sad, happy or annoyed, we use fine-grained audio information to interpret what that voice is actually communicating,” she says.
“It may seem incredibly small and insignificant, but it’s actually very important information for us.”
Music challenges our sensory boundaries
Danielsen believes that the fact that music research has uncovered psychoacoustic rules for how the human brain perceives sound says something about the importance of conducting research on music.
“We do extreme things in music. By testing out the boundaries of what we may find esthetically pleasing, we are also testing our perception apparatus,” she says.