[Summary: guest post from Tim – shared learning and notes from a hack day]
It’s not often that Rachel and I get to collaborate on work projects – but this weekend we had the opportunity at the Berklee Music Therapy Hack Day: a 24-hour event hosted by the Berklee College of Music to explore how different computer software and hardware could be used to enhance music therapy practice.
A lot of the experiments focussed on taking advantage of low-cost sensors like the Makey Makey, or the new movement-tracking capabilities in the web browser (e.g. this eye-tracking demo that you can try in Chrome with your webcam). These technologies can be useful in niche contexts where conventional musical instruments are difficult to use, and the current alternatives like Soundbeam are proprietary and prohibitively expensive.
However, we chose to focus on behind-the-scenes support for the practitioner, exploring how the day-to-day tasks of note-taking can be streamlined, making more time for reflective practice. The result was a prototype tool for music therapy indexing, note-taking and analysis.
Problems and a prototype
Many music therapists video or audio record their work, and use a process of ‘indexing’ to analyse sessions, identifying moments of progress and opportunities to develop practice with a client in future. Right now, that generally involves working on paper, in a spreadsheet, or in very clunky software. So the first challenge we set out to address was a simple interface for indexing music therapy content.
Fortunately, HTML5 (the latest version of the standard for web pages) includes a lot more features for dealing with audio and video, and by building on a number of fantastic open source tools like WaveSurfer.js we were able to rapidly create a browser-based tool for playing through a video or audio recording and adding simple annotations at particular points with keyboard shortcuts. The result is a list of key moments in a session, tagged with key words, so that the flow of key moments in the session can later be visualised.
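The core of the indexing idea can be sketched in a few lines. This is only an illustration, not the prototype’s actual code: the names (SessionIndex, addMoment, byTag) are hypothetical, and in the real tool a call like addMoment would be wired to a keyboard shortcut while WaveSurfer.js reports the current playback position.

```javascript
// Sketch of session indexing: timestamped, tagged annotations.
// All names here are illustrative, not the prototype's actual API.
class SessionIndex {
  constructor() {
    this.moments = [];
  }

  // Record a key moment at a playback time (in seconds) with a short tag
  // and an optional free-text note. In the prototype this would be
  // triggered by a keyboard shortcut while the recording plays.
  addMoment(time, tag, note = '') {
    this.moments.push({ time, tag, note });
    // Keep moments in playback order so the session flow reads top-to-bottom.
    this.moments.sort((a, b) => a.time - b.time);
  }

  // All moments matching a tag, for later visualisation of the session flow.
  byTag(tag) {
    return this.moments.filter((m) => m.tag === tag);
  }
}

const index = new SessionIndex();
index.addMoment(125.4, 'eye-contact', 'sustained gaze during call-and-response');
index.addMoment(61.0, 'vocalisation');
```

Keeping the annotations as plain timestamped records, separate from the audio itself, is what makes the later visualisation and export steps straightforward.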
The second thing we looked at was ‘rating scales’. There is a wide range of different rating scales used in Music Therapy: some might be used once or twice over a course of work with a client, others every session, and others at multiple points within a session – to understand in detail the areas where clients are progressing. Using the Glasgow Coma Scale as an example, we mocked up an interface for recording responses against the scale, and explored how radar charts could be used to visualise the results of a rating scale – both within an individual session record, and overlaying the responses from multiple sessions.
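The data side of that mock-up is simple enough to sketch. Here the scale definition uses the three components of the Glasgow Coma Scale (eye opening 1–4, verbal response 1–5, motor response 1–6); the function and variable names are hypothetical, and the normalisation step is just one plausible way to put differently-sized components on a shared radar-chart axis.

```javascript
// Sketch of turning rating-scale responses into radar-chart-ready data.
// Names are illustrative; the scale ranges follow the Glasgow Coma Scale.
const GCS = {
  'eye opening': { min: 1, max: 4 },
  'verbal response': { min: 1, max: 5 },
  'motor response': { min: 1, max: 6 },
};

// Normalise each response to 0..1 so components with different ranges
// can share one radial axis; one array per session can then be overlaid.
function radarData(scale, responses) {
  return Object.entries(scale).map(([dimension, { min, max }]) => ({
    dimension,
    value: (responses[dimension] - min) / (max - min),
  }));
}

const session1 = radarData(GCS, {
  'eye opening': 3,
  'verbal response': 4,
  'motor response': 5,
});
```

Overlaying the arrays from several sessions on one chart is then just a matter of drawing each normalised series in a different colour.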
Throughout this exploration we had to keep in mind a number of important considerations about patient confidentiality, informed consent, and data minimisation. A number of people at the hackathon asked whether it would be possible to use tools like SoundCloud for annotating audio: but music therapy session recordings are confidential patient records, and so should never be stored on a cloud service. Although we built the prototype to work through a web browser, it’s designed to run locally on a user’s own computer, ideally from an encrypted drive so that data is always stored securely.
A lot of the other presentations at the hackathon emphasised the potential for digital tools to be ‘self-documenting’: capturing detailed streams of data about a music therapy session. With a common format and some common approaches to visually representing this data, it may be possible to develop general approaches that bring sensor information together into client records – whilst also managing informed consent policies and encouraging good practice in data minimisation and data removal, keeping only the information that is important for the practitioner to improve their work.
The big advantage of a lightweight tool that structures records from Music Therapy sessions is that those records can be exported in different ways. For example, a clinical summary can be exported as text to go into formal patient notes systems, or extracts flagged during indexing can be converted into a summary audio file to share in team meetings.
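The text-export path is the simplest to illustrate. This sketch assumes a session record shaped like the indexing example above (a date plus a list of timestamped, tagged moments); both the record shape and the function name are hypothetical, not the prototype’s actual format.

```javascript
// Sketch of exporting an indexed session as plain text, ready to paste
// into a formal patient notes system. Record shape is illustrative.
function exportSummary(session) {
  const lines = [`Session: ${session.date}`, ''];
  for (const m of session.moments) {
    const mins = Math.floor(m.time / 60);
    const secs = String(Math.floor(m.time % 60)).padStart(2, '0');
    // One line per key moment: [m:ss] tag - optional note
    lines.push(`[${mins}:${secs}] ${m.tag}${m.note ? ' - ' + m.note : ''}`);
  }
  return lines.join('\n');
}
```

Because the underlying records are structured data rather than free text, the same session could equally be rendered as a timeline, a tag-frequency chart, or a cut list for assembling a summary audio file.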
The prototype is very rough-and-ready code right now, but it should be possible to extract some of the indexing features into small stand-alone open source tools, and I’ll be looking for time to work on that in the coming months. And perhaps other opportunities to develop the wider tool further will emerge.