Conducting Data now live!

Still from one of the “above” videos.

Today we can announce that the first version (0.1) of the Capturing the Contemporary Conductor data is now live on Repovizz. We aim to make further data available during 2020, including c3d files via this blog.

The three datapacks can be found in the CtCC folder on the Repovizz site. They are CtCC_S1P1_01, CtCC_S2P1_01 and CtCC_S3P1_01.

Further technical details will soon be available, but each data pack contains the following:

Mocap marker data: Plug-in Gait (plus additional leg markers), HAWK hand markers for both hands, and a subset of the Southampton Face Marker Placement Protocol (some cleanup is still required for the remaining markers). 100Hz. Bones are included to help visualise the data.

Audio data: a stereo mix; room stereo pair; individual mics for: oboe, violin, trumpet, percussion, double-bass, bass-clarinet; direct feeds of electronic part, Rhodes piano and click-track. All stereo (apart from click-track), 24bit, 48kHz.

Video data: conductor close, conductor full, ensemble from above the conductor, 360 degree (unwrapped), whole ensemble from left, whole ensemble from right, and a composite video combining the others in a single frame. Each has visual timecode burned in.

EMG data: bicep and tricep data from each arm, using wireless EMG sensors. 1kHz.

Metadata: descriptions of the above and a PDF of the score.
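For anyone planning to work with the mocap data directly once the c3d files are released, they can be read in Python with an open-source library such as ezc3d. A minimal sketch (the file name and marker label below are placeholders, not the actual datapack contents):

    import ezc3d

    # Hypothetical file name; the released c3d files may be named differently.
    c3d = ezc3d.c3d("CtCC_S1P1_01.c3d")

    labels = c3d["parameters"]["POINT"]["LABELS"]["value"]   # marker names
    rate = c3d["parameters"]["POINT"]["RATE"]["value"][0]    # should be 100Hz
    points = c3d["data"]["points"]                           # 4 x nMarkers x nFrames

    print(len(labels), "markers at", rate, "Hz,", points.shape[2], "frames")

    # Example: pull out one marker's 3D trajectory (label purely illustrative).
    if "RFIN" in labels:                       # e.g. a right-hand finger marker
        idx = labels.index("RFIN")
        right_hand = points[:3, idx, :].T      # nFrames x 3 positions (mm)
        print(right_hand[:5])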

Motion data plus two camera angles in Repovizz

MOCO 2019

Setting up for MOCO 2019 presentation

Richard attended the 6th International Conference on Movement and Computing (MOCO 2019) in October in Tempe, Arizona to present a paper titled Force & Motion: Conducting to the Click. It was a very interesting conference, with a wide variety of applications involving movement and computing. A particularly interesting paper combined a number of Kinect devices to cover a large space – something which could be particularly useful when capturing in large or complex environments.

In our paper we took a section of the Captured piece where the conductors work to a click-track (which the performers can’t hear), so all three conductors are working to the same timing, giving a temporal ground truth for analysis. We then compared four sources of timing information for musical beats to study differences in the results: the click track itself, beats manually annotated from the video, marker position data from the conductors’ hands, and ground reaction force data from the force plate the conductors stood on.

There were a number of findings in the paper, but notably we were able to extract the beats from the force plate data with precision comparable to using the hand markers, providing a potential means of capturing conducting beats in live performance in a non-intrusive manner (no hand-held controller or attached sensors) and without the lighting or line-of-sight issues associated with optical systems.
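For the curious, the basic idea of recovering beats from the force plate signal can be sketched in a few lines of Python. This is only an illustration of peak-picking on the vertical ground reaction force, not the exact method used in the paper, and the filter settings and thresholds are placeholders:

    import numpy as np
    from scipy.signal import butter, filtfilt, find_peaks

    def beat_times_from_grf(fz, fs=1000.0, min_interval_s=0.3):
        """Rough beat estimates from the vertical ground reaction force fz (N).

        Illustrative only: low-pass filter the force signal and pick prominent
        peaks, assuming each conducted beat produces a force transient.
        """
        b, a = butter(2, 10.0 / (fs / 2), btype="low")   # 10Hz low-pass
        smooth = filtfilt(b, a, fz - np.mean(fz))
        peaks, _ = find_peaks(smooth,
                              distance=int(min_interval_s * fs),
                              prominence=0.5 * np.std(smooth))
        return peaks / fs                                # beat times in seconds

    # Comparing against the click (both arrays of times in seconds):
    # timing_error = beat_times_from_grf(fz) - click_times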

Our paper can be found here: https://dl.acm.org/doi/abs/10.1145/3347122.3347139

Testing…testing…

We are nearing the time when the conducting data will be ready for upload to this website and to Repovizz (repovizz.upf.edu). Repovizz is an open repository for multiple data streams, including mo-cap data, that allows visualisation and simultaneous playback of a wide variety of content. Our data includes multiple camera angles and multiple audio tracks, in addition to fully labelled, gap-filled mo-cap data and EMG sensor outputs. We’re now testing on Repovizz, as you can see in the screenshots below.

Initial test: Two video views, plus plain mo-cap data

This has taken many hours of editing and rendering A-V data, many, many hours of data processing in Vicon’s Nexus software and a number of other tools (including Mokka and Matlab), to finally get all the material time-aligned, cropped, correctly formatted and bundled into Repovizz datapacks ready for upload.
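To give a flavour of one of those steps (an illustrative sketch, not our actual scripts), mapping a 1kHz EMG stream onto the 100Hz mocap timeline after cropping both to a shared start and end time might look something like this:

    import numpy as np

    def crop_and_resample(signal, fs_in, t_start, t_end, fs_out=100.0):
        """Crop a stream to [t_start, t_end] seconds and resample it to fs_out.

        Illustrative only: linear interpolation onto the target timeline,
        e.g. 1kHz EMG onto the 100Hz mocap frame times.
        """
        t_in = np.arange(len(signal)) / fs_in
        t_out = np.arange(t_start, t_end, 1.0 / fs_out)
        return t_out, np.interp(t_out, t_in, signal)

    # e.g. emg_t, emg_100hz = crop_and_resample(emg_raw, 1000.0, 12.5, 312.5)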

Audio test: Multiple audio tracks successfully loaded.

The process from data capture through to publishing will be fully documented in the release, both to make sure that what the data contains is well understood and to aid any future conducting mo-cap studies.

EMG Test: Upper arm data, left and right bicep/tricep

The work is still ongoing, but we thought we’d share a few screenshots to whet appetites. The data will be available with open access online later this year.

Six camera angles, plus composite. Mo-cap coloured, points scaled and bones added.

Richard will be presenting a paper using some of the data from the project at MOCO 2019 next month… expect a blog post here afterwards.

As ever, our appreciative thanks to the project funders, the British Academy/Leverhulme Trust; to Vicon Motion Systems for their technical help and equipment loans; and to the conductors and instrumentalists who took part.

The CtCC team.

OCI Conducting Studies Conference

I was pleased to present a paper at the Oxford Conducting Institute Conducting Studies International Conference in June. There were some excellent papers, and it was great to talk with some like-minded people about all things conducting!

Here is the title and abstract of the paper:

Title: 
Capturing the Contemporary Conductor: Using Motion-Capture Technology to Study Conducting Gesture 

Abstract:
In September 2017 seven professional instrumentalists and three conductors (Holly Mathieson, Geoffrey Paterson and myself) convened in the Biomechanics Laboratory at the University of Southampton to rehearse and record my brand-new three-movement composition Captured: Three Mo-Cap Conducting Experiments for Small Ensemble.

More than one hundred motion-capture markers were attached to each conductor (face, hands and body) following a specially designed and tested protocol, which allowed the multi-disciplinary project team to capture precise three-dimensional representations of all movements made by the conductors with a high-end, high-resolution motion capture system. A multi-track audio recording of the ensemble and a wide range of video documentation were also collected. The mo-cap data set and other documentation are currently being prepared for future study by the team and other researchers through open online access.

In this paper I will discuss my contribution to the project, outlining some of the key conducting gestures I identified for capture by drawing on my own conducting experience and a review of relevant literature, considering how my compositional approach and preoccupations were informed by the unique context in which the new work would be used, and reflecting on my experiences as a conducting laboratory guinea pig. I will also use the data-set to undertake a comparative analysis of how the three conductors approached a short section of Captured.

This project is funded by the British Academy and the Leverhulme Trust. We are grateful to VICON who provided additional cameras and technical support. 

And here is a sneaky peek at one of my examples from the presentation, which shows that my beat gets smaller (I predicted that) and the ictus gets higher (didn’t know that!) as the tempo increases…
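For anyone wondering how such measures can be pulled out of the data, here is a rough sketch (illustrative only, and not the analysis used in the presentation) estimating per-beat amplitude and ictus height from the vertical trajectory of a hand marker:

    import numpy as np
    from scipy.signal import find_peaks

    def beat_size_and_ictus(z, fs=100.0, min_beat_s=0.25):
        """Per-beat amplitude and ictus height from a vertical hand trajectory z (mm).

        Illustrative only: treat local minima of z as ictus points and measure
        the drop from the preceding local maximum as the beat size.
        """
        minima, _ = find_peaks(-z, distance=int(min_beat_s * fs))
        maxima, _ = find_peaks(z, distance=int(min_beat_s * fs))
        sizes, heights = [], []
        for m in minima:
            prior = maxima[maxima < m]
            if len(prior):
                sizes.append(z[prior[-1]] - z[m])   # amplitude of the downward stroke
                heights.append(z[m])                # height of the ictus itself
        return np.array(sizes), np.array(heights)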


Capture Day!

After months of planning and two days of setting up equipment, Capture Day finally took place on 27th September 2017. The team arrived early to make sure the systems were up and running ready for the conductors and ensemble.

Bob Dimmock from Vicon setting up the additional motion capture cameras prior to Capture Day.

Three conductors – our own Ben Oliver, Holly Mathieson and Geoffrey Paterson – were captured in turn, each having rehearsal time with the ensemble before giving two performances, both of which were captured.

Ben Oliver in full capture mode!

Thanks to Vicon’s generosity in lending us additional cameras and devices for the capture system, as well as technical support to optimise the system, we managed to obtain high quality face, body and hand data from the conductors.

Conductor Holly Mathieson being marked up by Dan Halford

In addition to the Vicon motion data, we have video of the conductors and ensemble from 7 cameras (including a 360 degree video) in various locations, plus motion data from a Kinect 2, audio recordings from stereo and close microphones, and muscle data from four wireless EMG devices on the conductors’ biceps and triceps.

Conductor Geoff Paterson with markers and EMG sensors on view.

The day ran very smoothly, with Harry Matthews, Beth Walker and Sergiu Rusu (a student and recent graduates) helping with sticking and unsticking markers amongst other activities!

Richard Polfreman and Dan Halford keeping an eye on the tech!

We are deeply grateful to Bob Dimmock, Phil Bacon, Matt Oughton and everyone at Vicon for all their help and generosity in making this day a success. We’d also like to thank Geoff and Holly for being such willing guinea pigs, and also our ensemble, for their wonderful patience and playing: Anna Durance (oboe + electronics); Vicky Wright (bass clarinet); Julian Poore (trumpet); Joley Cragg (percussion); Liga Korne (electric piano); Aisha Orazbayeva (violin) and Dan Molloy (double bass).

Finally, of course our thanks to The British Academy and the Leverhulme Trust for their funding of the project.

A Visit from Vicon

This week we were especially pleased to welcome a team from Vicon Motion Systems to the University. Vicon are a world-leading developer of motion-capture systems, and they came to show us their face capture technologies so we could explore their potential use in our Capturing the Contemporary Conductor research project – a team of Richard Polfreman (Music), Benjamin Oliver (Music), Cheryl Metcalf (Health Sciences) and research assistant Dan Halford. Vicon mo-cap systems are used in biomechanics research as well as the entertainment industry, where facial expression capture is used to drive animated characters and bring them more realistically to life. Previous studies of conducting have highlighted the importance of facial expressions in communicating with the ensemble, so we are interested in capturing this information in addition to body motion, and beyond simple video recording.

Matt Oughton (EMEA Sales Manager), Dami Phillips (Technical Sales Engineer) and Katie Davies (Support Engineer) arrived at the motion capture lab with two boxes of kit, which we were keen to unpack and try out as soon as possible. First was their brand new Cara Lite system, recently announced at SIGGRAPH 2017. This two-camera system can be used markerless, with analysis software modelling the movement of the wearer’s face once recorded. The system certainly felt lighter than the previous Cara system, and this version is designed to be customisable to different client needs.

Research team member, composer and conductor Dr Benjamin Oliver wearing the Cara Lite motion capture system.

Next we tried the original Cara system, a 4-camera, marker-based solution providing true 3D motion data from the markers on the subject’s face. The marker layout is up to the user, so it can be customised for the particular level of detail needed.

Engineer Katie Davies helping Research Assistant Dan Halford try on the Cara face capture system.


The Cara headset fitted very snugly and felt secure and comfortable. It is heavier than the Cara Lite, but the openness of the camera mounts gave a clear line of sight to the score for conducting. The fact that the end result of the processed capture is 3D marker motion data (as the body motion data is) may also be helpful.

Dr Benjamin Oliver practicing with the Cara headset

Depending on the application, the main Vicon system can be used without a headset to capture facial expressions, but whether this is appropriate depends on the range of movement of the subject, the number of cameras in the system, the other markers being captured, etc. For the Capturing the Contemporary Conductor project, we are carrying out tests to decide which option will work best for us!

Our grateful thanks go to Matt Oughton and the team at Vicon for bringing their amazing systems to show us; it was a very interesting and helpful meeting.

‘Captured: Three Mo-Cap Experiments for conductor and small ensemble’

I’ve just finished a new piece called Captured: Three Mo-Cap Experiments for conductor and small ensemble. This work will be rehearsed and recorded on 27 September in the motion-capture laboratory at the University of Southampton by seven professional musicians and three conductors (me, Geoffrey Paterson and Holly Mathieson). The piece is scored for seven players (oboe, bass clarinet, trumpet, electric piano, percussion, violin & double bass) and is in three movements (or experiments).

Geoffrey Paterson

Each of the three experiments in this work is designed to challenge the conductors in different ways and facilitate understanding of how conductors work and the gestures they use to communicate effectively with performers.

For example, in conducting Experiment 1: “Wonky Blues” some of the following elements will need to be managed: asymmetric time signatures at different tempi; phrasing of sustained melodic ideas, sometimes in counterpoint; articulation and dynamic variation; syncopations; abrupt tempo changes; fermatas; tempo modulations (rit. & accel.); and cues. In Experiment 2: “Watch Out”, by contrast, the conductor has more of a controller/facilitator role, coordinating the ensemble through conducting flag notations and double arm gestures. The conductor in this second movement also has to negotiate conducting with a click track. In Experiment 3: “Three Rooms with the Same Wallpaper” the conductors will have to: understand and execute a range of metric modulations; deal with complex textures; manage homophonic texture coordination; conduct asymmetric time signatures at slow, medium and fast tempi; and shape lines.

Holly Mathieson

I am certain that each of the conductors will find their own way of managing the different challenges, but also that there will be shared techniques and approaches across all three of us. It will be fascinating to explore the data after the capture day.

My compositional approach and preoccupations were certainly informed by the context in which Captured… will be used, i.e. in an experiment, and I’m excited to see how the piece works.

How to capture conducting motion?

In recent months I’ve been looking at how conducting motion has been captured in the past, and the applications for which it has been used. Conducting motion has been explored in a number of different ways previously. While we are planning to use infra-red retro-reflective marker-based motion capture, it is useful to review work by others in the field, both for performance studies and for computational analysis and control.

In general, four main types of transducer have been used: optical, inertial, electromagnetic and bioelectrical, each of which has its own advantages and disadvantages. Bioelectrical sensing can be discreet, but it is an indirect measure of motion and subject to noise and other issues; electromagnetic detection can cover a large capture volume and has no line-of-sight problems, but suffers from electromagnetic interference and has historically required relatively large transmitters; inertial sensors can likewise cover a large space with no line-of-sight problems, but usually suffer from drift and are rather intrusive; optical systems range from simple video (with computational analysis) through to high-precision 3D retro-reflective or active-marker IR tracking, all of which require clear lines of sight and can have other issues such as ease of portability and susceptibility to lighting conditions.

Bioelectrical sensors have been used to read muscle activity from the bicep/tricep combination; notably, Marrin’s Conductor’s Jacket used EMG sensors to provide arm movement data for control of other systems or for study.

Electromagnetic sensors can provide 3-axis position and orientation data from a small sensing object placed in a generated magnetic field. These were also used in one version of the Conductor’s Jacket, and by Ilmonen and Takala for gesture recognition. Max Mathews’ Radio Baton electromagnetic controller has also been used in conducting experiments.

Inertial sensors have been used in a number of studies, including using Wiimotes (e.g. Bradshaw and Ng), as well as other MEMS devices in projects such as Augmented Conductor, mConduct, and Conducting Master. Inertial devices measure acceleration, orientation and magnetic data, which can be used to estimate the position of the sensor as it is moved, although drift remains an issue with these devices.
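The drift problem is easy to demonstrate: acceleration is integrated twice to obtain position, so even a tiny constant sensor bias produces a position error that grows quadratically with time. A toy example in Python (not modelling any particular device):

    import numpy as np

    # Dead-reckoning a stationary accelerometer with a small constant bias.
    fs = 100.0                           # sample rate (Hz)
    t = np.arange(0, 60, 1 / fs)         # one minute of "stationary" data
    accel = np.full_like(t, 0.01)        # true acceleration is zero; bias = 0.01 m/s^2

    velocity = np.cumsum(accel) / fs     # first integration
    position = np.cumsum(velocity) / fs  # second integration

    print(f"apparent drift after 60 s: {position[-1]:.1f} m")   # ~18 m from a tiny bias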

Optical approaches vary from simple video capture and manual analysis, to baton-mounted LEDs with computational video analysis, the use of 3D gaming controllers (e.g. Microsoft Kinect), and high-end optical mocap systems from the biomechanics and entertainment industries. In CtCC (this project!) we will be taking this final approach due to the potentially very high level of detail and precision achievable, but we may supplement this with data from other devices for comparison with other studies and assessment of the quality of other systems.

What does a conductor of contemporary music do?

Over the past few months I have been undertaking a literature review of books focused on conducting and music direction of contemporary music. This has included: conducting related pedagogical textbooks/manuals; books by and interviews with leading conductors; texts dealing with more general issues in performing contemporary music; and doctoral dissertations on issues relating to conducting.

The focus of this work has been on the physical aspects of conducting, in particular the musical intention of different gestures and approaches. There are, of course, numerous other facets to conducting including, for example, rehearsal techniques, personal and professional interaction with performers, and score preparation. These aspects, and many more, have been given much attention in the literature by scholars and practising conductors alike. There is also an emerging field of ‘conducting studies’ in which scholars have explored and written about the practice of conducting in terms of music semiotics and gesture studies, and from anthropological standpoints. Here, however, the concentration is on the physical gestures involved in conducting, with a specific focus on contemporary music practice.

The original intention was to create a taxonomy of conducting gestures, but it has become apparent that this is likely an impossible (and doomed) task, as there are so many different viewpoints, conducting techniques and schools of thought when it comes to thinking about the physical gestures required of a conductor. Numerous scholars and conductors have underlined the inherent individuality of different conductors, and that conducting ‘technique’ is fluid and hard to pin down to a series of specific gestures.

However, there are a number of key gestures, basic techniques and conventions that conductors are taught, or develop through practice, that allow them to communicate effectively (or not) with performers. The approach in this review, therefore, has been to try to summarise some of the different ideas and approaches to conducting technique as a starting point for thinking about what motions it might be useful to ‘capture’ in the Capturing the Contemporary Conductor project. This review is also helping me focus my ideas for writing a new piece for the session with live musicians and guest conductors in September 2017.

IMU Based Motion Capture

Ben Oliver wearing the Neuron motion capture suit
Ben Oliver wearing the Neuron motion capture suit, with Dan Halford

While the final capture session for the project will focus on using highly accurate optical motion capture, as a preliminary test we wanted to quickly capture some conducting motion data for analysis.

We had access to a Perception Neuron IMU mocap suit, designed for small animation/game studios and educational establishments. This uses up to 32 “Neurons”, small Inertial Measurement Units each housing a gyroscope, accelerometer and magnetometer. The Axis Neuron software translates this data into 3D position data for each Neuron, streams it in real time and can record it to disk.

Willing victim Dr Benjamin Oliver was strapped into the Neuron suit during a rehearsal with Workers Union Ensemble, which was also video and audio recorded so we could compare the data from each source.

We captured approximately 60 minutes of rehearsal time, which was then integrated by Research Assistant Dan Halford; a sample featuring Helen Papaioannou‘s Backscatter (2017) can be seen below.