The following interview was conducted via an email questionnaire in mid-2013 by Ryan Ross Smith.

RRS: When did you start using animated notation in your work?

JF: I started getting really interested in open-form and unusually formatted scores around 2003, when I was still a graduate student at Columbia. The first piece where I used real-time notation, though, was in 2005 (Glimmer, for chamber orchestra and audience participation). It began as a practical solution to a problem: I needed a way to communicate performance instructions (specifically, pitch and dynamic information) independently to each of the 25 players in the chamber orchestra, without a conductor, based on real-time audience input. Since the audience input was coming from light (glow sticks), I decided that the real-time notation should be based on light as well. I set up a series of computer-controlled LED light tubes, one on each music stand, and used color and brightness to instruct each musician independently as to what to play.

Basically, each light changed among four different colors, each of which signified a single note. Brightness mapped to dynamics, and a bit to timbre as well (e.g. sul ponticello for strings). Flashes represented accented attacks, and fade-ins/outs were gradual entrances from, and exits to, niente. Color changes were always preceded by a little "anticipatory" flash to help the musician mentally prepare to change notes. We made practice DVDs for the musicians so they could practice individually in advance of group rehearsal.
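
[Ed.: The mapping Freeman describes suggests a simple translation layer between audience input and per-player light instructions. Below is a minimal, hypothetical sketch of that idea in Python; the pitch set, color names, and `next_state` helper are illustrative assumptions, not Freeman's actual implementation.]

```python
from dataclasses import dataclass

# Hypothetical pitch assignment: four colors, each standing for one note.
# The actual notes per player in Glimmer are not specified in the interview.
COLOR_TO_PITCH = {"red": "C4", "green": "E4", "blue": "G4", "yellow": "B4"}

@dataclass
class LightState:
    color: str        # which of the four colors is lit -> which note to play
    brightness: float # 0.0-1.0 -> dynamic, from niente up to fortissimo

def next_state(current: LightState, audience_level: float,
               new_color: str | None = None) -> list[LightState]:
    """Return the frames to send to one player's LED tube.

    audience_level (0.0-1.0) is assumed to come from glow-stick tracking
    and is mapped directly onto brightness (dynamic). A color change is
    preceded by a brief full-brightness "anticipatory" flash so the
    musician can mentally prepare to change notes.
    """
    frames = []
    if new_color and new_color != current.color:
        frames.append(LightState(current.color, 1.0))   # anticipatory flash
        frames.append(LightState(new_color, audience_level))
    else:
        frames.append(LightState(current.color, audience_level))
    return frames

# Example: a player holding "red" (C4) crescendos, then is cued to a new note.
state = LightState("red", 0.2)
for frame in next_state(state, 0.8, new_color="blue"):
    print(frame.color, "->", COLOR_TO_PITCH[frame.color], f"@ {frame.brightness:.1f}")
```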

There is a lot of documentation of all of this online: http://www.jasonfreeman.net/glimmer

RRS: What led you to start using animated notation? [This could be aesthetic/artistic concerns, technological experimentation, a bet and/or dare, etc.]

JF: I started using real-time notation because I wanted a way to engage instrumental musicians more deeply in interactive systems. I wasn't really satisfied with the traditional model of a player performing from a score, or improvising with a tape, or even improvising with an interactive system that analyzed and responded to their playing. I wanted a chance for the system to directly influence what the player was playing, and not just vice versa. I also have always preferred working with acoustic musicians over electroacoustic sound, so this is also a technique that helps me use technology without relying on it too much for sound production.

A lot of this work has been focused on audience participation specifically, i.e. the audience's input into an interactive system drives the real-time score generation, and I think that's a particularly powerful way to create a new kind of connection between musicians onstage and audiences. But I've also used real-time notation in other contexts (e.g. laptop orchestras) where I see a need for something to connect two different entities involved in the performance.

RRS: Were there particular compositions/notational approaches/technologies/video games/etc. that exerted any influence over your [early and/or present] work?

JF: I think open-form scores in general (e.g. Stockhausen, Earle Brown) have had a big influence, as have works that engage audiences in unusual ways (particularly Fluxus) and sound art (e.g. Max Neuhaus). But ultimately I'd look back to Charles Ives; I think his notions of the role of amateurs in music-making, of the process of making music, and of the importance of music-making experiences have been a tremendous influence on my own approaches to music.

RRS: How would you describe your current work with animated notation?

JF: I'm trying to move away from "one-off" technological solutions and more towards generalizable frameworks that can help me (and potentially others) focus less on technical infrastructure and more on the key design challenges that emerge in interactive systems with real-time notation. To that end, my lab at Georgia Tech has developed LOLC, a text-based improvisation environment for laptop ensembles that incorporates real-time music notation as a first-class citizen. I used it to create SGLC, for 4 laptops and 4 acoustic instruments, and also to drive a collaboration with ETHEL (the string quartet in New York).

We're also finishing up a public release of massMobile, a general-purpose framework for audience participation via mobile phone. I've used it to power a number of my recent works (Sketching; Saxophone Etudes; Teamwork) that involve real-time notation in some way, and it's been transformative to be able to go so quickly from concept to implementation and to iterate on my design ideas so rapidly.

RRS: Where do you see your work with animated notation going in the future?

JF: I want to continue investing in these generalized frameworks, as I described above. I want to more carefully study how real-time notation is used in interactive systems from an HCI perspective. I also want to bring together some of my non-real-time work with dynamic notation (in projects like Piano Etudes and Graph Theory) with my real-time participatory work to see how these different trajectories can inform each other.

RRS: What potential, if any, does animated notation have for future work IN GENERAL?

JF: I think real-time notation stands at a productive intersection between composition, improvisation, and interactivity, and its greatest potential is in its ability to bridge the gaps between these modes and to engage performers and audiences with experimental music in a deep and meaningful way.

About

"Jason Freeman is an Associate Professor of Music in the College of Architecture at Georgia Tech. As a composer and computer musician, Freeman uses technology to create collaborative musical experiences in live concert performances and in online musical environments, utilizing his research in mobile music, dynamic music notation, and networked music to develop new interfaces for collaborative creativity. His music has been presented at major festivals and venues, including the Adrienne Arsht Center (Miami), Carnegie Hall (New York), the Lincoln Center Festival (New York), Transmediale (Berlin), and Sonar (Barcelona), and it has been covered in the New York Times, on National Public Radio, and in Wired and Billboard. Freeman received his B.A. in music from Yale University and his M.A. and D.M.A. in composition from Columbia University. "[1]

1. "About," distributedmusic.gatech.edu, accessed November 30, 2013, http://distributedmusic.gatech.edu/sandvox/about/.

  • Visit Jason's website: http://www.jasonfreeman.net/