On Monday afternoon, Senior Staff Writer Charlotte Slovin and Staff Writer Julia Tolda attended a Presidential Scholars in Society and Neuroscience event titled “How Music Moves Us: Exploring the Connection Between Music and Emotions.” The event focused on the ways music moves humans through the lenses of neuroscience, movie-scoring, and machine learning.
If it weren’t for the pandemic, Julia and Charlotte would have met by the Chastity Gates, walked together in the freezing cold while animatedly chatting about classes or their most recent dreams, and sat beside each other during this event. But instead of whispering and casting glances at each other, they texted. The two met because of music, crossing paths after an opera during their first semester, so nothing was more fitting than attending an event on its power.
Inspired by music’s undeniable ability to make us feel, the panel, moderated by Matthew Sachs, a Presidential Scholar in Society and Neuroscience at Columbia University, explored this connection through expert speakers in three fields: neuroscience, movie-scoring, and machine learning.
Dr. Elvira Brattico, a professor of neuroscience at the Center for Music in the Brain at Aarhus University in Denmark, spoke about her leading research on the neuroscience of music and aesthetics. Chanda Dancy, an accomplished classical composer for film and television, talked about her work on the 2019 sci-fi film After We Leave, directed by Aleem Hossain. Lastly, Dr. Anna Huang, currently an AI Resident on the Magenta project at Google Brain, spoke about her work on machine learning and music technology.
Dr. Brattico kicked off the discussion with a curious fact: music is as ancient as Homo sapiens, and evidence of it appears worldwide. But why has music accompanied the evolution of our species? According to Dr. Brattico, it is because of music’s adaptive functions in the realms of emotion, perception and cognition, self-knowledge, and society. Music regulates our physiological state, aids self-reflection and self-awareness, and helps us bond with others; enjoyment of and connection to music are powerful indicators of human fitness. Like sex, food, and gambling, music activates the dopaminergic pathway, making listening a pleasurable, even addictive, activity. Dr. Brattico also addressed the phenomenon of contagion: in music, an instantaneous connection between a musical feature and an emotional state. The effect is especially strong for acoustic features reminiscent of the human voice or animal calls, which can trigger particular memories and emotions.
As a film score composer, Chanda Dancy approached music and the brain from a deeply intuitive place. Dancy spoke about the techniques composers use to convey emotion, such as building recognizable patterns and motifs, and about a composer’s ability to write emotional music without knowing the neurological basis for why specific sounds trigger specific feelings. To illustrate, Dancy shared clips of her work on the 2019 sci-fi film After We Leave. The very first note of music in the film introduces what Dancy called “The Longing Theme,” a series of notes that, like the characters of the film, develops and evolves as the story progresses. Dancy played clips of the theme in moments of temptation, betrayal, and full realization, revealing how a film’s score reflects the psyche of its characters. This wholly artistic perspective helped flesh out the relationship between music and emotion, showing how much of it is intuitive, with neuroscience only reinforcing an intuition we already have.
Dr. Huang’s work on the Magenta project treats machine learning as a generative way to model, and ultimately create, music. During the event, she spoke about two aspects of this work: machine fluency and user control. Work on fluency, which dates back to the early 1990s, uses existing musical pieces as templates for generating new ones. The models take in a piece’s notes, expressivity, and performance (volume and timing) to inform their creative decisions. More recent developments in fluency have led to the Music Transformer, an AI that, much like Dancy, creates and builds upon musical motifs as a piece progresses.
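For the curious, here is a rough idea of what “notes, expressivity, and performance” look like to a model of this kind: a recorded performance is typically flattened into a stream of events (note-on, note-off, time shift, velocity) that a Transformer can learn to continue. The Python sketch below is a minimal, hypothetical illustration of that encoding; the Note fields and event names are our own stand-ins, not Magenta’s actual API.

```python
from dataclasses import dataclass

@dataclass
class Note:
    pitch: int      # MIDI pitch number
    velocity: int   # loudness, 0-127
    start: float    # onset time in seconds
    end: float      # release time in seconds

def encode_performance(notes):
    """Turn a list of Notes into a flat stream of event tokens
    (velocity / note-on / time-shift / note-off), the kind of
    sequence a Transformer-style model can be trained to continue."""
    happenings = []
    for n in notes:
        happenings.append((n.start, "on", n))
        happenings.append((n.end, "off", n))
    happenings.sort(key=lambda h: h[0])

    events = []
    clock = 0.0
    for time, kind, note in happenings:
        if time > clock:
            # Silence or sustain between events becomes a time shift.
            events.append(f"TIME_SHIFT<{round(time - clock, 3)}>")
            clock = time
        if kind == "on":
            events.append(f"VELOCITY<{note.velocity}>")
            events.append(f"NOTE_ON<{note.pitch}>")
        else:
            events.append(f"NOTE_OFF<{note.pitch}>")
    return events

# A two-note motif: C4 then E4, the second note played a little softer.
motif = [
    Note(pitch=60, velocity=90, start=0.0, end=0.5),
    Note(pitch=64, velocity=70, start=0.5, end=1.0),
]
print(encode_performance(motif))
```

Because timing and loudness are tokens in the same stream as the pitches, a model trained on such sequences can pick up expressive phrasing along with the notes themselves.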
With this ever-growing foundation of music-generating AI, Dr. Huang now focuses on user control and interaction, giving musicians another tool for producing and creating music. She spoke about her work on Coconet, the machine learning model behind the Bach Doodle, which generates harmonies from user-created melodies. The project posed a particular challenge, she said, because she was no longer working with a single line of music but with several parts (soprano, alto, and so on) under the constraints of aesthetically pleasing harmony. From a more mathematical perspective, she realized the problem was one of permutations and conditioning: filling in any subset of missing parts given whatever notes are already there. Ultimately, Coconet was able to harmonize any combination of notes.
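To make the “permutations and conditioning” idea concrete: harmonization can be framed as infilling, where the music is a grid of voices over time, some cells are fixed (the user’s melody), and the model fills in the rest conditioned on everything already present. The toy sketch below shows only that masking-and-filling loop under our own assumptions; the predict_pitch stand-in just picks a consonant interval, where Coconet would use a trained neural network.

```python
import random

VOICES = ["soprano", "alto", "tenor", "bass"]
STEPS = 8  # time steps in the fragment

def predict_pitch(grid, voice, step):
    """Hypothetical stand-in for a trained model: choose a pitch a
    consonant interval below the soprano note at this step."""
    reference = grid["soprano"][step]
    if reference is None:
        reference = 72  # fall back to C5 if the melody cell is empty
    interval = random.choice([0, 4, 7, 12])
    register_drop = 6 * VOICES.index(voice)  # lower voices sit lower
    return reference - interval - register_drop

def harmonize(melody):
    """Fill in alto, tenor, and bass for a given soprano line by
    repeatedly choosing an empty cell and asking the 'model' for a
    pitch conditioned on everything filled in so far."""
    grid = {v: [None] * STEPS for v in VOICES}
    grid["soprano"] = list(melody)  # the user-provided line stays fixed

    empty = [(v, t) for v in VOICES[1:] for t in range(STEPS)]
    random.shuffle(empty)  # orderless: cells can be filled in any order
    for voice, step in empty:
        grid[voice][step] = predict_pitch(grid, voice, step)
    return grid

melody = [72, 74, 76, 77, 76, 74, 72, 71]  # a simple C-major line
for voice, line in harmonize(melody).items():
    print(f"{voice:>8}: {line}")
```

Because any subset of cells can be hidden and then filled, the same setup covers harmonizing a melody, completing an inner voice, or patching a gap, which is what lets a model of this kind work with “any combination of notes.”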
How Music Moves Us brought three seemingly unrelated fields into conversation, adding dimension and nuance to the complex topic of music. With each speaker, our understanding of music’s impact on human development, art, and technology grew. Music is intrinsically connected to the human psyche, the thread that runs through our relationships with ourselves, others, and the unknown. By the end of the event, it was clear that appreciation for music should extend beyond its aesthetic value: it is our past, present, and future.
Brain Music via Bwog Staff