How understanding musical sounds could help build a better hearing aid
Hamilton, ON. Sept 30, 2016 – Learning how humans process natural and artificial sounds differently may one day lead to better hearing aids.
Michael Schutz, an Associate Professor of Music Cognition/Percussion at McMaster University and founding director of the MAPLE (Music, Acoustics, Perception and LEarning) Lab, has been awarded a Natural Sciences and Engineering Research Council of Canada (NSERC) Discovery grant of $140,000 over five years for research on how humans perceive natural and artificial sounds.
Schutz says that while a natural sound will trail off for some time as materials lose energy, many artificial sounds are designed to decay immediately – the auditory equivalent of a freight train stopping on a dime. Understanding how humans perceive natural and artificial sounds differently holds important implications.
People who use hearing aids know that listening to music is not always an enjoyable experience. The problem, says Schutz, is that companies often test hearing aids using sounds far less complex than those created by musicians.
“Artificial tones miss out on important properties of natural sounds. This is a problem both in understanding how our brains process sound in general, and how to design products related to hearing in particular,” says Schutz. “People who use hearing aids find their devices pass lab-based tests, only to be disappointed with their performance in the real world. Refining our understanding of the processing of natural versus artificial sounds will help improve our ability to design effective devices related to listening.”
“The 5-year window of funding is vital in setting long-term research priorities, which lets us tackle bigger, and more interesting issues,” he adds. . . .
This represents an excerpt of the original announcement. For the complete text see the official university press release.
Chopin, Bach used human speech ‘cues’ to express emotion in music
Hamilton, ON. Nov 2, 2015 – Music has long been described, anecdotally, as a universal language. This may not be entirely true, but we’re one step closer to understanding why humans are so deeply affected by certain melodies and modes.
A team of McMaster researchers has discovered that renowned European composers Frédéric Chopin and Johann Sebastian Bach used everyday speech “cues” to convey emotion in some of their most famous compositions. Their findings were recently published in Frontiers in Psychology: Cognition.

Their research stemmed from an interest in human speech perception — the notion that “happy speech” for humans tends to be higher in pitch and faster in timing, while “sad speech” is lower and slower. These same patterns are reflected in the delicate nuances of Chopin and Bach’s music, the McMaster team found.
To borrow from Canadian singer-songwriter Feist, we “feel it all” because the music features a very familiar cadence or rhythmic flow. It’s speaking to us in a language we understand. “If you ask people why they listen to music, more often than not, they’ll talk about a strong emotional connection,” says Michael Schutz, director of McMaster’s MAPLE (Music, Acoustics, Perception & LEarning) Lab, and an associate professor of music cognition and percussion.
“What we found was, I believe, new evidence that individual composers tend to use cues in their music paralleling the use of these cues in emotional speech.” For example, major key or “happy” pieces are higher and faster than minor key or “sad” pieces. The team also discovered that Bach and Chopin appear to “trade off” their use of cues within the examined music. Sets with larger pitch differences between major and minor key pieces had smaller timing differences, and vice versa. This may reflect efforts to balance the cues to avoid sounding trite, Schutz explains. . . .
This represents an excerpt of the original announcement. For the complete text see the official university press release.
Want a better understanding of your favourite song? Move to the beat
We do it in the office, in the car and in that trendy Cuban café when our friends aren’t looking. When the music is good, the urge to move is irresistible.
While we don’t fully understand why music compels movement, a pair of researchers working in McMaster’s MAPLE Lab (Music, Acoustics, Perception & LEarning) have found that it can improve our sense of timing and result in a better understanding of a song’s structure — even for those with little or no musical training.
“When you’re on the bus or walking to work or school, anyone wearing an iPod is usually nodding or tapping along,” said Michael Schutz, an assistant professor who specializes in music cognition and percussion. “That movement is not only enjoyable, it actually helps us grasp music’s structure. Amazingly, we don’t need to be taught to move to music — children do this implicitly from a young age.”
Schutz conducted the research alongside Fiona Manning, a graduate student in the McMaster Institute for Music & The Mind with an interest in exploring music, movement and perception.
As part of the study, 48 undergraduate participants donned a set of headphones and listened to a sequence of woodblock tones. Some had no musical experience, while others had up to 12 years of musical training.
Participants were asked to tap along with the beat during one round of trials, and sit perfectly still during another. Towards the end of the musical sequence, listeners heard a few seconds of silence before the final note. At the conclusion of the trial, listeners were asked to identify whether the final tone was early, late or right on time.
Moving to the beat helped listeners recognize when the final tone was off — especially when it was late. In subsequent testing, the team demonstrated that movement was most critical for maintaining the beat during the silence. This improvement did not depend on musical training.
Testing was done using a basic 4/4 time signature, a popular choice among pop songwriters due to the relative simplicity and repetitive nature of the pattern. The 4/4 pattern is easy for fans to move along to, which can help their understanding and enjoyment of the music.
“A solid beat is often a key element in hit songs, and it helps listeners move along. If you examine the Billboard ‘Hot 100’ list, many of those songs have a relatively simple beat,” said Schutz. “Music producers and songwriters recognize the pulse’s importance to listeners, who frequently ‘tap along’ or move when hearing their favorite music.”
Now we know that dancing doesn’t just help us enjoy music — it actually helps us hear it better.
The team’s findings will be published in the next issue of Psychonomic Bulletin & Review, a renowned experimental psychology journal.
Won’t get fooled again: Drummers use their hands to create musical illusions
John Bonham. Keith Moon. Buddy Rich. The best drummers in modern music history had it all – groundbreaking technique, lightning speed and an unmistakable panache when it came to performing.
A new paper from assistant professor Michael Schutz, however, outlines how certain percussionists use ancillary gestures (those not strictly required for sound creation) to “trick” the audience into seeing and hearing things that simply aren’t there.
In a research article appearing this month in Percussive Notes, Schutz and co-author Fiona Manning detail the impact of using ancillary gestures to create an auditory illusion during a performance. Using the marimba as a test case, researchers found that notes may sound “longer” when accompanied by an extended swing of the arm, or “shorter” when the movement is subtle – even if the note itself is exactly the same.
“Great performers understand the balance between useful gestures and distracting, unhelpful motions,” explains Schutz, who directs the MAPLE (Music, Acoustics, Perception & LEarning) Lab. “It feels like trickery, but raises an important question: ‘Where does music truly exist? Outside our ears, or between them?'”
According to Schutz, our perception of a musical note is affected by a performer’s “post-impact motion.” In other words, the distance, time and velocity of the movement after striking the instrument have a big effect on how the note sounds.
He uses world-famous guitarist and composer Pete Townshend as another example. Renowned for his “windmill” guitar stroke with The Who, Townshend may have actually tricked audiences into thinking his chords were louder and resonated longer than they actually did, thanks to his post-impact motion. Not that the band needed any help in putting on a great show.
“You could say it’s just a quirk of someone’s performance characteristics, but it’s all part of the listening experience,” says Schutz. “The gestures can play an important role in shaping our listening experience.”
During a live musical performance, it’s important to note the difference between “sound” and our “perception of sound,” he explains. Our internal perception of the external world is the final arbiter of the musical experience. So even if the entire audience receives the same auditory information, they will experience it in different ways.
“Ultimately, the literal acoustic information is less important than how it is perceived,” adds Schutz.
CFI awards infrastructure funding for 10 research projects
Hamilton, ON. Jan 24, 2012 – McMaster University came out on top when the latest round of funding from the Canada Foundation for Innovation (CFI) was recently announced by the Honourable Gary Goodyear, Minister of State for Science and Technology. The funding – awarded through a rigorous, objective and merit-based competition process – provides more than $3.25M for ten projects across campus to help establish two new research facilities, and to upgrade and enhance existing labs with new equipment and leading-edge tools. McMaster was the only university with a project count in the double digits, receiving the largest total award out of the 38 institutions awarded funding.
“The CFI’s investment in these ten projects further enhances our University’s vibrant research environment,” says Mo Elbestawi, vice-president, research & international affairs. “Many of the researchers awarded will now be able to upgrade and augment their current facilities with the latest equipment and data storage, while other researchers will be creating new laboratories and developing novel technologies on our campus. This will ultimately increase our research capacity, accelerate our research results and provide an enriched research-training environment for our students.” The research initiatives – representing more than $7.7 million in total project costs – reflect a broad cross section of McMaster’s research strengths in areas such as automotive, health, and infectious disease research.
The first research facility within McMaster’s department of music will be established with help from the CFI’s award of $424,933 in funding for the MAPLE (Music, Acoustics, Perception and LEarning) Lab, housed within the School of the Arts, and affiliated with the McMaster Institute for Music and the Mind. Michael Schutz, assistant professor of music, will be leading the project, Informing Musical Performance by Exploring the Production and Perception of Music for Percussion, to research questions at the intersection of music and psychology. The lab will use video recordings of percussion playing to explore audio-visual integration, as well as timing and amplitude information collected from an electronic drumset to explore issues of sensorimotor integration. . . .
This represents an excerpt of the original announcement. For the complete text see the official university press release about this year’s CFI awards.
Moving to the Beat Improves Musical Experience, Helps Listeners Understand Rhythm
Hamilton, ON. Oct 18, 2011 – Why do we move when we hear good music? Researchers at McMaster University have found that tapping to the beat measurably enriches the listening experience, broadening our capacity to understand timing and rhythm.
The research, recently presented at the Acoustics Week in Canada conference in Quebec City, probes the complex relationship between perception and action. “We set out to answer a simple question: Can moving to the beat actually help us understand the music?” says Michael Schutz, an Assistant Professor of Music in the School of the Arts at McMaster, who designed and conducted the study. “We found that tapping along while listening does more than help us feel and enjoy the music. It actually helps us hear it better.”
Participants in this study heard a series of regular beats and were asked whether the final beat was consistent with the preceding rhythm. They then rated their confidence in each response. On half of the trials subjects were asked to tap along on an electronic drum pad; on the other half they listened without tapping. When the experimenters played the final tone later than participants would have expected the beat, listeners performed 87% better at detecting the change when tapping than when listening passively. Tapping had little effect on performance when researchers played the tone early or on time.
These findings have implications for listeners, performers, and music educators alike. “From a young age, we teach students to move to the music while performing, and now we know at least one reason why this is beneficial,” says Schutz (who is also a professional percussionist and director of the university percussion ensemble). “This study sheds light on why moving while playing helps musicians keep time and improves their overall performance.”
Schutz and his team also found that participants who tapped to the beat felt more confident in their responses compared to those who did not tap.
Musical illusion fools audiences and performers, says researcher
Hamilton, Ont. January 4, 2010—Visual information can have a profound impact on how we experience live music, creating an illusion where percussive sounds seem longer or shorter than they really are.
In an article published in a recent edition of the journal Percussive Notes, Michael Schutz, assistant professor at McMaster University and core member of the McMaster Institute for Music and the Mind, describes how expert musicians take advantage of a previously undocumented musical illusion through visible physical gestures to change the way audiences hear their performances. Intriguingly, the performers themselves are generally unaware of what they are doing.
Using videos of world-renowned percussionist Michael Burritt, Schutz found that the length of the physical gesture — the up-down motion used to strike a percussion instrument — has no effect on the acoustic duration of musical notes. In other words, notes produced using long and short motions are acoustically indistinguishable. But when study participants were watching the gestures as well as listening, the notes sounded long or short due to their brains’ integration of auditory and visual information.
“Although physical gestures fail to change the sound of a note, they can change the way a note sounds,” explains Schutz. “It’s very much like the well-known ‘ventriloquist illusion’ in which we think the speech or sound is coming from the lips of a mute puppet.”
This raises some interesting questions about how music is best experienced, he says. In this context, performers can only realize their musical intentions through the use of visual information. Therefore, do CDs, mp3s and radio broadcasts capture the full musical experience, or do they instead rob performers and listeners of an important dimension of musical communication?
Not only do expert musicians ‘trick’ their audiences, they in fact trick themselves. Many professional musicians believe their gestures change the acoustic information they produce. Although this research demonstrates the gestures have no acoustic effect, they nonetheless shape how the notes are perceived.