Unit 12: Music and the Body

Synopsis

Theoretical and empirical research has demonstrated that music is not merely an aural experience but one that involves the whole body. Technological developments, including motion capture and physiological response systems, permit the study of the manifold bodily processes involved. These include the performance movements of musicians, the visual impression these movements make on an audience, movement responses of audiences, and bodily responses when listening or dancing to music. Furthermore, technology facilitates the development of new musical instruments and possibly new ways of interacting with each other in music-related contexts. Future research may increasingly address the interrelatedness of these processes in real-world settings outside traditional laboratories as well as in artistic environments.

Introduction

This section is about:

  • The importance of the body in making, perceiving, and understanding music,
  • The theoretical and empirical underpinnings of music and the body,
  • How musicians’ movements shape the perception of an audience,
  • Which technology can be used to investigate music and the body,
  • How these technologies can also be used for artistic purposes.

Embodied music cognition

Cognition in the embodied view is seen as interactive, corporeal, and occurring in real time (e.g., Varela et al., 1991). Embodied cognition claims that cognitive processes require not only the brain but also the body, with continuous and goal-directed interactions between them, the sensorimotor system (e.g., the ears and eyes), and the environment. Action and perception mutually influence each other: perception prompts an action, which in turn provokes a new perception, enabling us to adapt to the changing environment around us. Such a continuous loop and the bodily experiences involved enable us to understand, learn, and accomplish cognitive tasks.

Following this notion, embodied music cognition (Leman, 2007) assumes that similar action-perception couplings occur in musical activities as well. Thus, music-related movement can be used to reflect, imitate, predict, and understand the structure and content of music. Leman suggests different co-existing degrees of musical involvement that relate to synchronisation, attuning, and empathy: Synchronisation forms the foundation of physical engagement with music (synchronising to a beat being easy and spontaneous). The next component, Embodied Attuning, links body movement to higher-level musical features, such as rhythm, melody, harmony, timbre, or tonality, to help analyse and physically understand musical structure. Finally, Empathy establishes the connection to expressivity and emotions, identifying and reflecting the emotional content of the music through body movements.

Musicians’ body movements

In essence, it is human body movements that cause musical sounds on acoustic instruments. While some aleatoric or generative computer music does not require performers, a great deal of electronic music is also performed through body gestures. Musicians’ body movements are occasionally so prominent in a performance that audiences are strongly affected by them. The Hamburg music theorist Johann Mattheson observed in his book “The Perfect Chapelmaster” (1739), for instance, that violinists may even distract audiences by occasionally contorting their bodies as if they suffered from serious illnesses. In fact, the perception of music is typically enhanced by watching the musicians’ bodies.

In the following, a selection of studies is summarized, some of which were among the first in the field or have been particularly influential on subsequent research. While some findings may generalize across instruments, others are more specific to the type of bodily and movement processes involved.

Pianists

Pianists were among the first musicians to be studied in music-psychological experiments. In a highly original study published in 1990, Behne asked musicians and nonmusicians to watch performances of Chopin and Brahms pieces (Behne, 1990). The videos were filmed and cut by a professional camera team and showed the four pianists’ bodies from different angles. Nearly all observers perceived performance differences in dimensions such as expressiveness or accuracy. Unbeknownst to them, however, they had always been presented with the same audio: three of the performers had merely mimed to the actual pianist’s playing. These findings have been replicated and indicate that what is perceived musically is to a large extent shaped by the eye, independently of musical experience (Behne & Wöllner, 2011).

Bodily, or more specifically, motor resonance plays a role in a seminal study of pianists playing “with themselves” (Keller et al., 2007). Nine pianists each played one part of 19th-century piano duets on a MIDI piano. Some months later, they listened to the recorded first part while playing the second part. They were not only able to correctly say whether the first part had been performed by themselves or someone else, but they also played in higher synchrony with their own interpretations. Furthermore, those who synchronized better with their own previous recordings also recognized their performances better. This study emphasizes how bodily actions performed with a certain timing can resonate in one’s own body and lead to highly synchronized performances.

Violinists

As shown in the study by Behne, seeing a performer shapes how the music is perceived and evaluated. To what extent does this hold true for the kinematics of body movements alone, as opposed to the face, clothes, or any other information about a person’s outer appearance? A technique developed in the 1970s by Gunnar Johansson has been widely used in music research (see example). Originally using reflective tape at important body locations, and nowadays motion capture technology (see Section 5 below for more detail), a number of relevant markers are shown in so-called point-light displays. The points in these displays may appear to be random, but as soon as they move, observers can tell what activity is carried out by the person recorded, and even whether it is a friend or a stranger walking.

Example: A historic introductory film about the motion perception research by Johansson and colleagues (1971)
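
To convey the idea, the following is a small, hypothetical sketch of a point-light display in Python: marker positions rendered as white dots on a black background and animated over time. The synthetic random-walk data merely stands in for real marker recordings, and all parameters are made up for the demonstration.

```python
import numpy as np
import matplotlib.pyplot as plt
from matplotlib.animation import FuncAnimation

# synthetic stand-in for recorded marker data: 200 frames x 12 markers x (x, y)
rng = np.random.default_rng(0)
data = 0.02 * np.cumsum(rng.normal(size=(200, 12, 2)), axis=0)

fig, ax = plt.subplots(facecolor="black")
ax.set_facecolor("black")
ax.set_xlim(-1, 1)
ax.set_ylim(-1, 1)
ax.axis("off")
dots = ax.scatter(data[0, :, 0], data[0, :, 1], color="white", s=25)

def update(frame):
    dots.set_offsets(data[frame])  # move the point lights to this frame's positions
    return (dots,)

anim = FuncAnimation(fig, update, frames=len(data), interval=1000 / 60, blit=True)
plt.show()
```

With real data, the two-dimensional coordinates would of course come from video or motion capture recordings of a moving person rather than from a random walk.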

For her PhD, Jane Davidson was the first researcher to use the point-light technique to study the bodily communication of expressiveness in music (Davidson, 1993). Four violinists were asked to play either normally, in an exaggerated way, or in a “deadpan” way without expressiveness. Videos of these performances were created showing only the points of the body, in versions with or without sound. A total of 21 musically experienced individuals rated the expressive intentions correctly even for visual-only videos, which yielded the strongest perceived differences between deadpan and exaggerated versions. Further studies by Davidson revealed that non-musicians relied particularly on the visual information.

Clarinettists

The body movements of clarinettists have been studied repeatedly. An obvious reason is that playing the clarinet simply allows for a high number of expressive body movements without hindering the performance. A further, more pragmatic reason for empirical research lies in the fact that optical motion capture systems can be easily used with clarinettists, because spatial masking of the bodily markers by the instrument is limited, and there are fewer optical reflections than for instruments made of brass such as saxophones or trumpets.

In a widely cited study, Vines, Krumhansl, Wanderley and Levitin (2006) showed full videos of two clarinettists performing a Stravinsky piece. Using a multimodal paradigm that presented audio-only versions, video-only versions, or both combined, they asked musically trained individuals to continuously judge perceived tension and phrasal boundaries by moving a slider. Interestingly, the audio and video presentations yielded a considerable level of agreement for phrasal structures, but less so for perceived tension as a measure of perceived emotion. The authors conclude that being able to see the performers’ bodies provided important information that augmented the auditory-only experience.

Twelve years later, the body motion of 22 clarinettists was recorded with an optical motion capture system by Weiss, Nusseck and Spahn (2018). By means of a cluster analysis of movement data from various body parts (more specifically, standard deviations of angle data), they found that some clarinettists often moved up and down at the knees, while others showed strong arm motion or no specific motion pattern. A further group performed only very small movements. This group of performers received the lowest ratings on items such as expressiveness, fluency, and professionalism in a perceptual experiment that kept the audio track constant. The “arms group” and particularly the “knees group” of clarinettists were perceived to be more expressive, indicating again how vision may shape what audiences of musical performances perceive. These results were not influenced by the performers’ gender.

Percussionists

Seeing the body movements of a musician may also shape the perception of tone duration. Schutz and Lipscomb (2007) showed videos of the upper body of a professional marimba player in various manipulations. Visual information from long and short strokes was paired with the auditory information of long and short strokes in all combinations. Strikingly, when participants watched the audiovisual videos and were asked to judge the duration of a tone by focussing on the auditory information alone, they perceived duration differences according to the visual information. In other words, shorter tones on the marimba sounded longer to them if the visual movements of the strokes were longer. It should be noted that participants had even been made aware of the manipulations beforehand, in contrast to Behne’s experiment cited above.

Expressiveness is also communicated by percussionists via body movements. Comparable to Davidson’s (1993) research with violinists, a further marimba study by Broughton and Stevens (2009) found that seeing the body movements helped the audience in judging deadpan versus expressive performances.

Conductors

As a further example in this selection of research into musicians’ bodies, the movements of conductors are described in the following. In perhaps no other musical profession, apart from what is described in Section 3.5, does the body itself visualize music to such an extent. The gestures of conductors are not limited to arm movements but largely involve the head, including facial expressions, as well as the general body posture. The main functions of conducting are coordinating the music in terms of timing and synchronisation, dynamics, the acoustical balancing of different instrumental voices, and expressiveness.

Luck et al. (2010) recorded two conductors with eight motion capture markers on the hands, arms, and shoulders. Using a continuous response paradigm, observers were asked to judge expressiveness, valence, activity, and perceived power by moving a slider on a horizontal line while watching video presentations of the two conductors. Using regression analysis, this study was among the first to show which specific movement features influenced observers’ judgments. For instance, expression was judged to be higher with higher acceleration of both hands, lower jerk of the left hand, and when the left hand was closer to the body. Among other findings, these results suggest differences between the functions of the two hands that have similarly been described in conducting manuals.
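
As an illustration of this style of analysis (not the study’s actual data or model), the following sketch regresses continuous expressiveness ratings onto a few kinematic features with ordinary least squares; the predictors, coefficients, and noise level are synthetic and merely mimic the reported direction of effects.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 500                                    # number of analysis windows
# hypothetical standardized predictors: hand acceleration, jerk, hand-body distance
X = rng.normal(size=(n, 3))
# synthetic ratings constructed to mimic the reported direction of effects
y = 0.8 * X[:, 0] - 0.3 * X[:, 1] - 0.4 * X[:, 2] + rng.normal(0, 0.5, n)

X1 = np.column_stack([np.ones(n), X])      # add an intercept column
beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
print(beta)   # [intercept, acceleration (+), jerk (-), distance (-)]
```

The signs of the fitted coefficients indicate the direction of each movement feature’s influence on the judgments.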

Another study asked how musicians are able to play with different conductors, given that their bodily gestures always differ slightly (Wöllner et al., 2012). Based on the movements of twelve conductors, a morph was constructed (see animation). For this, the kinematic information of the upper-body movements was averaged in the horizontal and vertical dimensions. Participants in an experiment watched the twelve individual conductors as well as the averaged one and tapped the perceived beat along with the movements. For the averaged, morphed versions, jerk in the movements was reduced, and participants synchronised more consistently and accurately with these conducting patterns. Expressiveness, on the other hand, was rated higher for individual conductors. These results suggest that musicians mentally construct a prototypical representation of a conducting pattern; if visual conducting movements are close to this prototype (as the averaged movements are), synchronization with them is enhanced.

Video animation: Point-light display of a morphed averaged conductor (center) and twelve individual conductors (Wöllner et al., 2012).
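
The averaging step at the heart of such a morph is straightforward. The sketch below uses twelve synthetic one-dimensional “conducting” trajectories, a shared periodic beat pattern plus individual jitter, to show that the average is smoother (lower jerk) than any single gesture; the signal and the jerk proxy are illustrative assumptions, not the study’s actual processing.

```python
import numpy as np

rng = np.random.default_rng(2)
t = np.linspace(0, 4, 400)                 # four seconds of gesture at 100 Hz
# twelve hypothetical trajectories: a common beat pattern plus individual jitter
gestures = np.array([np.sin(2 * np.pi * t) + rng.normal(0, 0.15, t.size)
                     for _ in range(12)])
morph = gestures.mean(axis=0)              # the averaged, "prototypical" gesture

mean_abs_jerk = lambda x: np.abs(np.diff(x, n=3)).mean()   # crude jerk proxy
print(mean_abs_jerk(gestures[0]), mean_abs_jerk(morph))    # morph is much smoother
```

In the actual study, averaging was of course applied to full two-dimensional upper-body marker trajectories rather than to a single sine-like curve.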

Making music via bodily gestures: Interactive performance systems

Music and media technology permits the shaping of sounds by bodily gestures, without physically plucking a string or touching a key or membrane. From the 1920s onwards, the Theremin has fascinated audiences in the Soviet Union and abroad. Motion capture technology and widely available systems such as Microsoft’s Kinect (a relatively low-cost markerless infrared video system) offer multiple ways of mapping bodily movements to sound. In other words, bodily movement parameters such as velocity or spatial position over time are translated into changes in pitch, instrumental timbre, intensity, and so forth.

Example: Live electronic music using Kinect

Example: A performance of “Motion Grip” (Stockholm)
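
Conceptually, such a mapping is just a function from movement parameters to sound parameters. Here is a minimal Python sketch, with input and output ranges chosen arbitrarily for illustration (none of these values come from the systems mentioned above):

```python
def scale(value, in_lo, in_hi, out_lo, out_hi):
    """Linearly map value from [in_lo, in_hi] to [out_lo, out_hi], clamped."""
    t = (value - in_lo) / (in_hi - in_lo)
    return out_lo + max(0.0, min(1.0, t)) * (out_hi - out_lo)

def map_gesture(hand_height_m, hand_speed_ms):
    """Hypothetical mapping: hand height controls pitch, hand speed loudness."""
    pitch = scale(hand_height_m, 0.0, 2.0, 36, 84)        # MIDI-style note number
    loudness = scale(hand_speed_ms, 0.0, 3.0, 0.0, 1.0)   # normalized amplitude
    return pitch, loudness

print(map_gesture(1.2, 0.8))  # a raised, slowly moving hand -> mid-high, quiet tone
```

In practice, mappings are often many-to-many and nonlinear, and latency matters, as discussed next.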

Since the mapping happens in real time, audiences typically perceive the cause of the sound to originate in the performer’s bodily movement. In some cases, however, if there are high latencies or large spatial distances between performer and loudspeaker (that is, the actual acoustical sound “source”), performers and audiences may experience some form of “disembodiment” (Miranda & Wanderley, 2006; Hajdu, 2017; Wöllner, 2019).

A further technical means of shaping music by gestures are gloves, often containing accelerometers, a gyroscope, and sometimes strain or bend sensors. The information from these sensors over time is transferred to audio systems and mapped to various sound parameters. These gloves became known to wider audiences through pop musicians like Imogen Heap.

Example: Imogen Heap explains the Mi.Mu gloves

Perceiving and responding to music with the body

Physiological responses besides the auditory system

Listening to music involves our whole body, even if we do not move. There is now an abundance of research into the physiological sensations that are often elicited by music, ranging from shivers, rises in pulse, and sweating to laughter or tears. These bodily responses mirror the emotional arousal involved in listening. Technology for reliably investigating individuals’ skin conductance, respiration rate, pupil diameter, or other markers of physiological arousal has become affordable even for smaller research laboratories. Familiar and preferred music typically leads to stronger bodily responses, even in infants.

Moreover, we perceive music not only with the ears but also with other senses. Studies like those referred to above show the impact of visual perception. Using the same clarinet stimuli as Vines et al. (2006), an experiment found that seeing and hearing the music together resulted in a higher mean skin conductance amplitude than the sum of the audio-only and video-only presentations (Chapados & Levitin, 2008). This finding suggests that the combination of these two senses leads to new perceptual qualities that in turn affect the body.

In addition, hearing-impaired individuals may sense acoustical vibrations in a tactile way. A prominent musician is the percussionist and composer Evelyn Glennie, who “hears” music’s good vibrations mainly through her feet. Particularly at high intensities, sound waves are also conveyed through the body and reach the middle or inner ear this way, or the “vibes” can even be felt directly in the stomach. In 2000, Todd and Cody published research on the “rock and roll threshold”, investigating responses in a neck muscle (musculus sternocleidomastoideus) caused by loud music (Todd & Cody, 2000). Techno-like bass-drum rhythms were presented to student volunteers at levels increasing stepwise from 90 dB to 120 dB, a volume that can also be found in dance clubs (and may cause hearing damage over longer time spans). When the rhythms were played at 105 dB, half of all participants showed quick muscle potentials that were involuntarily caused by the vestibular system (which is anatomically linked to the inner ear but serves bodily balance in humans); hence these responses were not conveyed via the auditory system and cortical brain areas. At 120 dB, 90% of participants showed such muscle potentials – the threshold was reached.

The vestibular system may even play a role in perceiving rhythms. Laurel Trainor and her team (Trainor et al., 2009) stimulated the vestibular systems of participants (students, of course, again) with short pulses at 25.1 ms intervals. Peak stimulations were given at duple, triple, or in-between interval ratios. Subsequent judgments of ambiguous rhythms were influenced by these stimulations, shifting toward duple or triple interpretations according to the entrainment of the vestibular system. These results suggest that even “cognitive” judgments of rhythms are influenced by bodily processes related to posture, balance, and movement. This may also explain why people enjoy banging their heads to music, or why they move their bodies when determining the metrical structure.

Dance

Music also makes us move, be it tapping one’s foot or finger, head-banging, or full-body dancing. Dance, or music-induced movement, has mostly been studied in individual or dyad settings. In a seminal paper, Toiviainen et al. (2010) established connections between metrical levels and movement directions: vertical movement was mostly aligned with the beat level, horizontal movement mostly with the bar level. Burger et al. (2013b), for instance, showed that dancers follow rhythmic and timbral structures of the music to a large extent: high pulse clarity, percussive sounds, and high spectral fluctuation in both low and high frequencies were embodied with an increased amount and speed of movement in several body parts. Van Dyck et al. (2013) found that the more present the bass drum, the more active and synchronized the participants’ movements. Moreover, techno music was embodied with higher acceleration and more overall movement compared to Latin, funk, and jazz (Burger & Toiviainen, 2020). Burger et al. (2013a) found relationships between perceived emotions and music-induced movement, such that happy music was embodied with complex, rotating movement, angry music with jerky movement and a lack of body rotation, and sad music with movement of low complexity. Individual characteristics, such as dancers’ personality traits, also affect how we move: Luck et al. (2010) found that extroverts moved more actively than introverts, while dancers high in the openness trait exhibited more open body movements.

Video examples related to Burger et al. (2013b): How does music with high pulse clarity affect movement?

How does spectral flux in high frequencies affect movement?

When comparing individuals dancing alone and in pairs, Carlson et al. (2018) showed that the amount of movement increases when dancing with a partner compared to dancing alone. Moreover, empathic dancers appeared to react and adapt more to their partners than less empathic dancers. Solberg and Jensenius (2019) studied how small groups (9-10 people) interacted while moving to electronic dance music in a club-like setting, detecting whole-group entrainment and more pronounced movement in the vertical dimension after the drop in the music (when the beat is reintroduced after a break).

Capturing motion to study music and the body

Different systems for collecting movement data

In order to conduct research on human movement, it is necessary to employ methods that precisely describe the motion. This can be achieved using various approaches, ranging from gathering still pictures of body postures with a photo camera, to capturing continuous movement with high-speed video or infrared cameras, to non-visual techniques that measure orientation, acceleration, or force. Besides sensing technologies to capture the motion, computer technology is commonly used to process and store the data and to represent the movement in numerical form, allowing quantitative analysis or real-time processing. Thus, motion capture (mocap) usually refers to motion representations in digital formats.

Motion data can be represented in several ways. Position/displacement data can be measured in two dimensions (i.e., on a plane) or in three dimensions (i.e., in a space). Additionally, motion data can be described in terms of three-dimensional orientation (referring to an object and its orientation in space). Such data is usually denoted in the form of six degrees of freedom (6DOF), that is, three-dimensional position plus three-dimensional orientation/rotation data.
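
As a concrete, purely illustrative representation, one 6DOF sample can be written as a simple data structure; the field names and units below are assumptions that merely mirror the definition above.

```python
from dataclasses import dataclass

@dataclass
class Sample6DOF:
    """One motion capture sample with six degrees of freedom."""
    x: float      # position along the lateral axis (e.g., mm)
    y: float      # position along the anterior-posterior axis
    z: float      # position along the vertical axis
    pitch: float  # rotation about the lateral axis (degrees)
    roll: float   # rotation about the anterior-posterior axis
    yaw: float    # rotation about the vertical axis

# an illustrative sample: 3-D position plus 3-D orientation
sample = Sample6DOF(x=120.5, y=-43.0, z=1512.8, pitch=2.1, roll=-0.4, yaw=87.9)
```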

There is a wide range of sensing and motion capture technologies available, ranging from inexpensive hand-held devices, such as mobile phones and game controllers, to high-precision tracking instruments used in film and game industries as well as in research, mainly in biomechanics and sports. The technologies mostly used in music-related research and applications comprise inertial, magnetic, mechanical, and optical (camera-based) tracking systems. The following list provides a short overview of the different systems.

  • Inertial systems: accelerometers and gyroscopes to measure acceleration and orientation/rotation
    • Affordability: inexpensive sensors, e.g., as implemented in smartphones or game controllers; high-end systems (e.g., whole-body suits) expensive
    • Usability: no direct visibility needed (sensors can be hidden); unaffected by light conditions or magnetic fields; requires a sensor for each joint of interest; body suit potentially motion-restricting, customized
    • Portability: yes, light and small sensors and self-contained system
    • Precision: the higher the system quality, the better; drift (position error) accumulates over the duration of the capture
    • Data: 6DOF derivative data (acceleration, angular velocity), yielding drift when integrated to position
    • Example: XSens Body Suit (https://www.xsens.com/motion-capture)
  • Magnetic systems: position and orientation of objects measured in a magnetic field
    • Affordability: expensive
    • Usability: no direct visibility needed; unaffected by light conditions; affected by other magnetic fields
    • Portability: yes
    • Precision: high accuracy in near-field; distortion of other magnetic fields and in larger areas
    • Data: 6DOF
    • Example: Polhemus (https://polhemus.com/motion-tracking/overview/)
  • Mechanical systems: gears, potentiometers, and bend sensors measuring angles between joints
    • Affordability: the better the system, the more expensive
    • Usability: no direct visibility needed; unaffected by light conditions or magnetic fields; can be worn on the body (e.g., suit or glove) or detached as external controller device; require a sensor for each joint of interest; body suit potentially bulky, motion-restricting, customized
    • Portability: yes
    • Precision: high
    • Data: angular
    • Example: Joystick
  • Optical systems: camera-based systems using light (usually either light in the visible or in the infrared spectrum) and different imaging techniques
    • Affordability: wide range from low-quality, inexpensive devices (e.g., a web camera) to high-end, expensive systems (e.g., infrared motion capture)
    • Usability: unaffected by magnetic fields; partly affected by light conditions; direct visibility needed; capture space limited by camera placement – infrared systems use reflective markers, which allow flexible configurations and are easy to attach, though they might constrain movement or fall off
    • Portability: yes, but restricted (large equipment)
    • Precision: the higher quality, the better
    • Data: 2- or 3-dimensional position data with regular video cameras; with infrared cameras also 6DOF
    • Example: Qualisys (https://www.qualisys.com)

How does infrared motion capture work?

As most music-related movement research uses infrared-based optical motion capture, the following section explains this technique in more detail. Infrared cameras work with light detectable in the infrared part of the electromagnetic spectrum. Such light has longer wavelengths and lower frequencies than visible light. Infrared optical tracking is based on an active source emitting pulses of infrared light at a very high frequency. In a passive system, this emitting source is usually attached to the camera (e.g., LEDs arranged as a ring around the camera lens), which in turn captures the reflections of this light produced by small, usually spherical markers attached to the tracked object. These markers are placed on points of interest, such as characteristic joints of the human body. The temporal resolution of the system is specified by the rate of the infrared light pulses.

Standard capture speeds in music-related applications range from 60 to 240 frames per second, which is sufficient for a wide range of music-related movement, such as playing most instruments and dancing. (Some systems can capture several thousand frames per second; however, a higher frame rate increases the amount of data without necessarily describing the movement more accurately, as human movement cannot exceed a certain speed.) Optical systems require direct visibility of the tracked objects (i.e., if a marker is hidden, e.g., by clothes, it will not be recorded). Each camera captures a two-dimensional image, commonly with white pixels indicating the markers on a black background. If several cameras are set up in a network around the capture space, data in three dimensions can be obtained (the system can construct the 3D location of a marker as soon as two cameras detect it). The use of several cameras also increases the field of view and thus minimizes the occurrence of hidden markers. It also reduces the measurement error, at least in some cases.
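
The reconstruction from two camera views can be made concrete with linear (DLT) triangulation, a standard textbook method; commercial systems may use different algorithms, and the toy projection matrices below are invented for the example.

```python
import numpy as np

def triangulate(P1, P2, uv1, uv2):
    """Linear (DLT) triangulation: recover a 3-D point from two 2-D
    observations and the cameras' 3x4 projection matrices."""
    A = np.vstack([
        uv1[0] * P1[2] - P1[0],   # x-constraint from camera 1
        uv1[1] * P1[2] - P1[1],   # y-constraint from camera 1
        uv2[0] * P2[2] - P2[0],   # x-constraint from camera 2
        uv2[1] * P2[2] - P2[1],   # y-constraint from camera 2
    ])
    _, _, Vt = np.linalg.svd(A)   # least-squares solution of A @ X = 0
    X = Vt[-1]
    return X[:3] / X[3]           # dehomogenize

# two toy cameras: one at the origin, one shifted along x
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])
marker = np.array([0.2, 0.1, 2.0, 1.0])            # ground truth (homogeneous)
uv1 = (P1 @ marker)[:2] / (P1 @ marker)[2]         # project into each camera
uv2 = (P2 @ marker)[:2] / (P2 @ marker)[2]
print(triangulate(P1, P2, uv1, uv2))               # ~ [0.2, 0.1, 2.0]
```

Real systems additionally solve the correspondence problem (which reflection in one camera image matches which in another) and combine more than two views.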

For capturing data in three dimensions, the first step of data collection is to calibrate the system so that it obtains the exact position and orientation of each camera with respect to the others and the floor (see the covered volume depicted in the Figure below). As a result, the calibration provides the origin and orientation of the Cartesian coordinate system relative to which the locations of the markers are determined. This approach yields time-series data representing the displacement of the markers in three dimensions (x, y, and z). If, additionally, the orientation of objects is of interest, so-called rigid bodies can be employed. Rigid bodies consist of at least three markers mounted on an object as a fixed cluster (i.e., with stable distances and angles). These markers establish their own coordinate system with the center of the marker cluster as origin and thus track 6DOF data (see Figure).

Left: Light ‘cones’ sent out by each camera to construct the 3-D space. The covered volume (where markers will be recorded) is depicted in blue. Right: Different data representations: Marker in three dimensions (x, y, z) and rigid body with three markers in six dimensions (6DOF – x, y, z, pitch, roll, yaw)
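
One common way to estimate a rigid body’s 6DOF pose from its marker cluster is the Kabsch algorithm, which finds the rotation and translation that best align the current cluster with a stored reference configuration. The sketch below is a generic implementation of that idea, not a vendor’s actual code.

```python
import numpy as np

def rigid_body_pose(ref, cur):
    """Estimate rotation R and translation t mapping a reference marker
    cluster (n x 3 array) onto its current position (Kabsch algorithm)."""
    ref_c = ref - ref.mean(axis=0)           # center both clusters
    cur_c = cur - cur.mean(axis=0)
    H = ref_c.T @ cur_c                      # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))   # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cur.mean(axis=0) - R @ ref.mean(axis=0)
    return R, t

# toy example: a four-marker cluster rotated 30 degrees about the vertical axis
ref = np.array([[0, 0, 0], [100, 0, 0], [0, 100, 0], [0, 0, 100]], dtype=float)
a = np.radians(30)
Rz = np.array([[np.cos(a), -np.sin(a), 0], [np.sin(a), np.cos(a), 0], [0, 0, 1]])
cur = ref @ Rz.T + np.array([10.0, 20.0, 30.0])
R, t = rigid_body_pose(ref, cur)             # recovers Rz and the offset
```

The rotation matrix R can then be converted to the pitch, roll, and yaw angles shown in the Figure.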

In order to capture whole-body movement, reflective markers are placed on characteristic joints of the human body, such as the head, arms, or legs (see Figure below). Where markers are placed is highly dependent on the task: for example, if the lower body is not relevant and is moreover usually hidden by an instrument, it is sensible to use markers on the upper body only. Markers can also be placed on instruments or other objects of interest. Optical motion capture systems basically record all reflections they detect without “knowing” what they represent. Thus, each marker needs to be correctly identified and labelled, usually manually (or semi-automatically) after the recording. After labelling, data are usually exported for further processing in external software applications, such as Matlab.

Different marker configurations. Left: Full body setup for (gross) whole-body movement. Right: Marker setup on the hand for fine motor control when playing for instance a musical instrument.

How to analyze mocap data?

Several tools exist for analyzing mocap data. One is the MoCap Toolbox (https://www.jyu.fi/hytk/fi/laitokset/mutku/en/research/materials/mocaptoolbox), a Matlab toolbox for visualizing and analyzing mocap data (Burger & Toiviainen, 2013). It is freely available online and offers basic functionality such as reading in data, plotting the data as time series or single-frame marker visualizations, creating animations, gap filling to interpolate missing frames when markers were hidden, data transformations and rotations (e.g., to transform the data to a local coordinate system), and calculating time derivatives to obtain velocity, acceleration, and jerk data. More advanced functionality encompasses kinetic analyses (e.g., kinetic energy), periodicity analysis, and the calculation of various movement features, such as the distance travelled, the bounding rectangle (the area covered by the movement), fluidity, or complexity of movement. After calculating these movement features, they can be related to, for instance, musical features, emotion ratings, or personality traits (see Section 5 above).
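
The core of some of these computations can be sketched with plain NumPy (the MoCap Toolbox itself is Matlab software with its own function names); the capture rate and the toy trajectory below are assumptions made for the example.

```python
import numpy as np

FS = 120.0                                    # assumed capture rate (frames/s)
rng = np.random.default_rng(0)
pos = np.cumsum(rng.normal(0, 1, (600, 3)), axis=0)   # toy marker trajectory (mm)

vel = np.gradient(pos, 1.0 / FS, axis=0)      # first time derivative: velocity
acc = np.gradient(vel, 1.0 / FS, axis=0)      # second derivative: acceleration
jerk = np.gradient(acc, 1.0 / FS, axis=0)     # third derivative: jerk

# distance travelled: sum of frame-to-frame displacements
dist = np.linalg.norm(np.diff(pos, axis=0), axis=1).sum()

# bounding rectangle: horizontal area covered by the movement (x-y plane)
area = np.ptp(pos[:, 0]) * np.ptp(pos[:, 1])
print(dist, area)
```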

Other analysis systems

Other systems exist that can be used to integrate and analyze mocap data. EyesWeb (http://www.infomus.org/eyesweb_ita.php), for instance, is an open platform that supports designing synchronized real-time multimedia systems and interfaces, including motion capture systems, video cameras, game controllers, multichannel audio, and physiological systems, and offers both real-time and offline analysis.

The Musical Gesture Toolbox (https://www.uio.no/ritmo/english/research/labs/fourms/software/musicalgesturestoolbox/mgt-matlab/index.html) is a toolbox (for Matlab, Python, Max, and Terminal) for analyzing music-related body motion, using sets of audio, video and motion capture data as source material (building upon MoCap Toolbox and MIRToolbox).

Real-time streaming

Some motion capture systems, like those by Qualisys, can stream motion capture data in real time, so that the 3D/6DOF coordinates are sent to a receiving software application (e.g., Max, http://cycling74.com) that, for instance, extracts parameters from the movement data to control sound parameters and create musical output, resulting in an interactive performance. EyesWeb has been used for such interactive performances, in which dance movements changed parameters of sound, music, and visuals. The set of Max abstractions called Modosc (https://github.com/motiondescriptors/modosc) offers the possibility to extract movement features from mocap data in real time for movement analysis and interactive sound design.
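
Here is a minimal sketch of this streaming idea in Python, using the python-osc package: incoming marker coordinates are mapped to pitch and loudness controls and forwarded to a sound engine. The OSC addresses, ports, frame rate, and the assumption that some bridge application translates the mocap stream into /marker messages are all hypothetical.

```python
import numpy as np
from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import BlockingOSCUDPServer
from pythonosc.udp_client import SimpleUDPClient

FS = 120.0                                   # assumed frame rate of the stream
synth = SimpleUDPClient("127.0.0.1", 9001)   # e.g., a Max patch listening here
state = {"prev": None}

def on_marker(address, x, y, z):
    """Map hand height to pitch and hand speed to loudness (values in mm)."""
    pos = np.array([x, y, z])
    if state["prev"] is not None:
        speed = np.linalg.norm(pos - state["prev"]) * FS        # mm per second
        synth.send_message("/synth/loudness", float(min(speed / 2000.0, 1.0)))
    synth.send_message("/synth/pitch", float(48 + 24 * np.clip(z / 2000.0, 0, 1)))
    state["prev"] = pos

dispatcher = Dispatcher()
dispatcher.map("/marker", on_marker)          # hypothetical address from a bridge
BlockingOSCUDPServer(("127.0.0.1", 9000), dispatcher).serve_forever()
```

In practice, Qualisys streams via its own real-time protocol/SDK, so a small bridge application (or abstractions such as Modosc inside Max) would perform the step that the /marker handler assumes here.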

References

  • Behne, K.-E. (1990). “Blicken Sie auf die Pianisten?!” Zur bildbeeinflussten Beurteilung von Klaviermusik im Fernsehen (Do you watch the pianists? On visually influenced judgements of piano music). Medienpsychologie, 2(2), 115–131.

  • Behne, K.-E., & Wöllner, C. (2011). Seeing or hearing the pianists? A synopsis of an early audiovisual perception experiment and a replication. Musicae Scientiae, 15(3), 324-342.

  • Broughton, M., & Stevens, C. (2009). Music, movement and marimba: an investigation of the role of movement and gesture in communicating musical expression to an audience. Psychology of Music, 37(2), 137–153. DOI: 10.1177/0305735608094511

  • Burger, B., Saarikallio, S., Luck, G., Thompson, M.R., & Toiviainen, P. (2013a). Relationships between perceived emotions in music and music-induced movement. Music Perception, 30(5), 519-535.

  • Burger, B., Thompson, M.R., Saarikallio, S., Luck, G., & Toiviainen, P. (2013b). Influences of rhythm- and timbre-related musical features on characteristics of music-induced movement. Frontiers in Psychology, 4:183.

  • Burger, B., & Toiviainen, P. (2020). Embodiment in Electronic Dance Music: Effects of musical content and structure on body movement. Musicae Scientiae, 24(2), 186-205.

  • Burger, B., & Toiviainen, P. (2013). MoCap Toolbox – A Matlab toolbox for computational analysis of movement data. In R. Bresin (Ed.), Proceedings of the 10th Sound and Music Computing Conference (SMC10) (pp. 172–178). Stockholm, Sweden: KTH.

  • Carlson, E., Burger, B., & Toiviainen, P. (2018). Dance like someone is watching: A social relations model study of music-induced movement. Music & Science, 1.

  • Chapados, C., & Levitin, D.J. (2008). Cross-modal interactions in the experience of musical performances: physiological correlates. Cognition, 108(3). 639-651.

  • Davidson, J.W. (1993). Visual perception of performance manner in the movements of solo musicians. Psychology of Music, 21, 103–113.

  • Hajdu, G. (2017). Embodiment and disembodiment in networked music performance. In C. Wöllner (ed.), Body, sound and space in music and beyond: Multimodal explorations (pp. 257–278). New York: Routledge.

  • Keller, P. E., Knoblich, G., & Repp, B. H. (2007). Pianists duet better when they play with themselves: On the possible role of action simulation in synchronization. Consciousness and Cognition, 16, 102–111.

  • Leman, M. (2007). Embodied music cognition and mediation technology. MIT Press.

  • Luck, G., Saarikallio, S., Burger, B., Thompson, M.R., & Toiviainen, P. (2010). Effects of the Big Five and musical genre on music-induced movement. Journal of Research in Personality, 44(6), 714-720.

  • Miranda, E.R., & Wanderley. M.M. (2006). New Digital Musical Instruments: Control and interaction beyond the keyboard. Madison, WI: A-R Editions.

  • Schutz, M., & Lipscomb, S. (2007). Hearing gestures, seeing music: vision influences perceived tone duration. Perception, 36(6), 888–897. DOI: 10.1068/p5635

  • Solberg, R.T., & Jensenius, A.R. (2019). Group behaviour and interpersonal synchronization to electronic dance music. Musicae Scientiae, 23(1), 111-134.

  • Todd, N.P.M.A., & Cody, F.W. (2000). Vestibular responses to loud dance music: A physiological basis of the “rock and roll threshold”. Journal of the Acoustical Society of America, 107, 496. DOI: 10.1121/1.428317

  • Toiviainen, P., Luck, G., & Thompson, M.R. (2010). Embodied meter: Hierarchical eigenmodes in music-induced movement. Music Perception, 28(1), 59-70.

  • Trainor, L.J., Gao, X., Lei, J.-J., Lehtovaara, K., & Harris, L.R. (2009). The primal role of the vestibular system in determining musical rhythm. Cortex, 45(1), 35–43.

  • Van Dyck, E., Moelants, D., Demey, M., Deweppe, A., Coussement, P., & Leman, M. (2013). The impact of the bass drum on human dance movement. Music Perception, 30(4), 349-359.

  • Varela, F.J., Thompson, E., & Rosch, E. (1991). The embodied mind: Cognitive science and human experience. MIT Press.

  • Vines, B.W., Krumhansl, C.L., Wanderley, M.M., & Levitin, D.J. (2006). Cross-modal interactions in the perception of musical performance. Cognition, 101(1), 80-113. DOI: 10.1016/j.cognition.2005.09.003.

  • Weiss, A.E., Nusseck, M. & Spahn, C. (2018). Motion types of ancillary gestures in clarinet playing and their influence on the perception of musical performance. Journal of New Music Research, 47(2), 129–142. DOI: 10.1080/09298215.2017.1413119

  • Wöllner, C. (2019). Anticipated sonic actions and sounds in performance. In M. Grimshaw, M. Walther-Hansen, & M. Knakkergaard (Eds.), The Oxford handbook of sound and imagination, vol. 2 (pp. 37–57). New York: Oxford University Press.

  • Wöllner, C., Deconinck, F.J.A., Parkinson, J., Hove, M.J., & Keller, P.E. (2012). The perception of prototypical motion: Synchronization is enhanced with quantitatively morphed gestures of musical conductors. Journal of Experimental Psychology: Human Perception and Performance, 38(6), 1390–1403. DOI: 10.1037/a0028130

Quiz

  1. Musicians move their bodies more than would be necessary for producing the sounds on their instruments. Why is this so?

  2. Are there any commonalities in musicians’ body movements across different instruments?

  3. What does “mapping” mean in the context of electronic music? And what are the challenges involved?

  4. How could body movement be used in artistic performances (e.g. creating sound/music)?

  5. Does the “rock and roll threshold” only work for rock and roll music?

  6. Why could it be easier to capture certain instruments using optical motion capture than others?

  7. What could be advantages and challenges when trying to track/record movement data outside the lab environment (e.g., during concerts or artistic performances)?



Topics

  • embodied cognition
  • music performance
  • dance
  • motion capture
  • interactive performance systems
  • physiological responses
  • audiovisual
  • expressiveness
