Music in new media

I’ve been thinking about music again, and staring into the pit of unknown unknowns that is my non-existent understanding of music, except as a casual listener. I know music affects me, and I’ve seen how important an emotional trigger it is in the games I’ve been playing for my studies, but I don’t know how or why, and right now I’m wishing I had a degree in Cognitive Psychology to help me understand. (The certificate would sit alongside the degrees in Computer Science, English and History that I don’t have.)

It’s such a huge subject, but I came across this paper, by Annabel Cohen, which though quite old (1998) I’ve found to be a useful primer. It also led me to the Gamessound website of Dr Karen Collins, Canada Research Chair in Interactive Audio at the Games Institute, the University of Waterloo, Ontario, who has written lots of juicy papers which start where Cohen left off, and are (the clue’s in the URL) a lot more games-specific.

Let’s start with Cohen though, a sort of new media music 101. She begins from the notion that “music activates independent brain functions that are separable from verbal and visual domains,” and goes on to define eight functions that music has in new media:

  1. Masking – Just as music was played in the first movie theaters, partly to mask the sound of the projector, so music in new media can be used to mask “distractions produced by the multimedia machinery (hum of disk drive, fan, motor etc) or sounds made by people, as multimedia often occurs in social or public environments.” Apparently lower tones mask higher ones, and listeners filter out incoherent sounds in preference for coherent (musical) sounds. Of course the downside is that music can mask speech too, even when that speech is part of the intended presentation.
  2. Provision of continuity – “Music is sound organised in time, and this organisation helps to connect disparate events in other domains. Thus a break in the music can signal a change in the narrative [I'm reminded of the songs in Red Dead Redemption here] or, conversely, continuous music signals the continuation of the current theme.”
  3. Direction of attention – Cohen has obviously done some experimental research on this function. Broadly speaking, patterns in the music can correlate to patterns in the visuals, directing the attention of the user.
  4. Mood induction – (quick aside here: check out this Mirex wiki page on mood tags for music). I’ve written about this before, and it’s the most obvious function to me, but Cohen is careful to make a distinction between this and the next function, which is:
  5. Communication of meaning – Cohen says “It is important to distinguish between mood induction and communication of meaning by music. Mood induction changes how one is feeling while communication of meaning simply conveys information.” Yet, when she discusses communication of meaning, she uses examples of emotional meaning: “sadness is conveyed by slow pace, falling contour, low pitch and the minor mode.” I take from this that her nice distinction is between music that makes the user sad, and music that tells the user “this is a sad event” without changing the user’s mood. Hmmm … I’ll have to think about that.
  6. A cue for memory – This is another one that I’ve written about before. Music can trigger a user’s memories of a past event that’s totally unrelated to the new media presentation, if they’ve coincidentally heard the particular piece before, but the effect is more controllable with music especially written for the presentation. The musical term for this (from opera, arguably the first multimedia presentation) is leitmotiv. The power of music to invoke memories or “prepare the mind for a type of cognitive activity” is well recognized in advertising and in sonic brands such as those created for Intel and Nokia.
  7. Arousal and focal attention – “it is a simple fact that when there is music, more of the brain is active,” Cohen says (without reference). She goes on to argue that with more of the brain active, the user is more able to filter out the peripheries of the apparatus running a new media presentation, and concentrate on the diegesis of the presentation, what Pinchbeck calls presence. On the other hand, she admits that some think excess stimulation pulls focus away from central vision and towards the periphery.
  8. Aesthetics – Here we come to what my colleagues report is the biggest issue with using music in interpretation. Cohen says “music is an art form and its presence enhances every situation in much the same way that a beautiful environment enhances the experience of activities within it.” But she admits that aesthetics is subjective, and that “music that is not appealing can disturb the user.” Not only that, but some individuals may find all background music difficult to cope with.

So that’s my new media music 101. Next time I’ll look at what Collins has to add.