Ever attended a chamber concert where the audience helped play the music? Thanks in part to MIT’s Eran Egozy, you can. Egozy is a clarinetist in Radius Ensemble, a Cambridge-based chamber music group that on March 5 will perform the world premiere of “12,” a new piece by composer Eun Young Lee. Unusually, the composition incorporates percussion sounds that certain audience members will deploy using their phones. Egozy — a co-founder and former chief technology officer at Harmonix, creators of the game “Guitar Hero” — created the digital program the audience members will use. He is also teaching courses in interactive, high-tech music at MIT, as a professor of the practice. MIT News spoke to Egozy about the new concert and his work.
Q. What is “12”?
A. “12” is a new work that Radius Ensemble commissioned for this concert season from Eun Young Lee, a Boston-area composer who teaches at Boston Conservatory. Her piece is inspired by the 12 signs of the zodiac. It’s a nice piece because it’s written specifically for us: Radius Ensemble is a group of nine musicians, and normally we don’t all play together in the same [pieces].
“12” is broken up into 12 short movements, each inspired by a particular zodiac sign. And each movement will feature one to four players, except for the last one, where we all play at the same time. We chose four of the 12 movements to have audience participation.
Q. How did you develop the concept of audience participation that you use in this piece?
A. We started by imagining what an audience participation experience could be like. There have been a few examples in the past, but not many. We wanted to create an experience where audience members can meaningfully join in the music-making process that is normally available only to the musicians on stage. I thought it would be fun to have people use their smartphones because they are comfortable, familiar devices, much like the instruments used by professional musicians. We want each audience member performing along with the musicians to feel that they have a real effect on what’s happening.
We opted to invite 12 audience members, with three smartphone players for each of the interactive movements. They will bring up a Web browser on their phone and type in a specific URL. On their screen, they will see a custom interface that I created, which allows them to play a variety of sounds we had determined would make sense within the context of the piece. These sounds are predominantly percussion sounds. Since Radius Ensemble has no percussionist, one of the motivations was to pick a set of sounds that is not normally available to the ensemble.
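The core of an interface like the one described above is a mapping from on-screen controls to the set of sounds allowed in a given movement. As a purely illustrative sketch (the class, sample names, and behavior here are assumptions, not Egozy's actual program), it might look something like this:

```typescript
// Hypothetical sketch only — not the actual "12" interface.
// In a real phone page, trigger() would play a decoded audio buffer
// (e.g. via the Web Audio API); here it just records the event so the
// sketch stays self-contained and runnable.

type SampleName = "woodblock" | "shaker" | "tam-tam" | "bongo";

class PercussionPad {
  private played: SampleName[] = [];

  // Each interactive movement could offer its own palette of sounds.
  constructor(private palette: SampleName[]) {}

  // Returns true if the tapped sound belongs to this movement's palette.
  trigger(sample: SampleName): boolean {
    if (!this.palette.includes(sample)) return false; // not offered here
    this.played.push(sample); // stand-in for actual audio playback
    return true;
  }

  triggeredSoFar(): number {
    return this.played.length;
  }
}

// One movement's pad, limited to two percussion sounds:
const pad = new PercussionPad(["woodblock", "shaker"]);
pad.trigger("woodblock"); // accepted
pad.trigger("tam-tam");   // rejected: outside this movement's palette
```

Restricting each participant to a small, movement-specific palette is one plausible way to keep the contributed sounds coherent with what the ensemble is playing.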
We have had two rehearsals to test the technology and gauge people’s reactions. Folks have been thoughtful and listened to what the musicians onstage are playing. In rehearsal, [stand-in] audience members experimented a little, then felt more confident about interjecting their sounds into the piece. And that’s the whole point: to have these sounds join the instrumental music to create the complete movement.
Q. You are now teaching at MIT and have introduced a new course this semester, 21M.359 (Interactive Music Systems). What is that about?
A. The course is actually very related to the piece, “12.” The course asks the question: How do you get computers to produce music in an intuitive way, particularly if you’re not a musician? We explore sound production, music theory, visualization of music, and human-computer interaction design.
The students learn these fundamental concepts during the first part of the semester, and then apply them in a final project where they create a system that accepts real-time user input and produces something musically and graphically interesting. Sadly, I had to turn away many students due to over-enrollment. I’m teaching 18 [students], but I’ll be offering the class every semester going forward. The most fun part, for me, is working with students one-on-one on their final projects.