
Sound and technology unlock innovation at MIT

Cross-disciplinary projects at MIT probe the technological and aesthetic limits of sound.
[Image: MIT student Ben Bloomberg stands behind the soundboard at a Jacob Collier concert, December 2018. Photo: Justin Knight Photography]
[Image: MIT Bose Challenge winners, Runner's High. Image courtesy of Runner's High.]
[Image: "The Laughing Room." Image courtesy of ARTificial Intelligence.]
[Image: Project Daredevil, 2018 Creative Arts Competition first prize recipients. Photo courtesy of the artists.]

Sound is a powerfully evocative medium, capable of conjuring authentic emotions and unlocking new experiences. This fall, several cross-disciplinary projects at MIT probed the technological and aesthetic limits of sound, yielding new innovations and perspectives: motion-sensing headphones that help joggers maintain a steady pace, virtual reality technology that lets blind people experience comic book action, and projects that challenge our very relationship with technology.

Sound as political participation

“Sound is by nature a democratic medium,” says Ian Condry, an anthropologist and professor in MIT’s Department of Global Studies and Languages, adding that “sound lets us listen around the margins and to follow multiple voices coming from multiple directions.”

That concept informed this year’s Hacking Arts Hackathon Signature Hack, which Condry helped coordinate. The multi-channel audio installation sampled and abstracted audio excerpts from recent presidential inaugural addresses, then blended them with breathing sounds that the team recorded from a live audience. Building on this soundtrack, two team members acted as event DJs, instructing the audience to hum and breathe in unison, while their phones — controlled by an app created for the hackathon — played additional breathing and humming sounds.
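
For a rough sense of how such a layered soundtrack might be assembled, here is a minimal sketch, assuming two mono 16-bit WAV files at the same sample rate stand in for the sampled speech and the recorded breathing. The file names and gain values are hypothetical; this is not the team's code.

```python
# Minimal sketch (not the hackathon's actual code) of layering a speech
# excerpt with a recorded breathing loop into one audio bed.
import wave
import numpy as np

def read_wav(path):
    """Read a mono 16-bit WAV file into a float array in [-1, 1]."""
    with wave.open(path, "rb") as w:
        rate = w.getframerate()
        data = np.frombuffer(w.readframes(w.getnframes()), dtype=np.int16)
    return rate, data.astype(np.float32) / 32768.0

rate, speech = read_wav("inaugural_excerpt.wav")    # hypothetical speech sample
_, breath = read_wav("audience_breathing.wav")      # hypothetical breathing loop

# Tile the breathing loop to the speech length, then mix at a lower gain
# so the words sit on top of the breath texture.
breath = np.resize(breath, speech.shape)
mix = np.clip(0.8 * speech + 0.4 * breath, -1.0, 1.0)

with wave.open("soundtrack_layer.wav", "wb") as out:
    out.setnchannels(1)
    out.setsampwidth(2)
    out.setframerate(rate)
    out.writeframes((mix * 32767).astype(np.int16).tobytes())
```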

“We wanted to play with multiple streams of speech and audio,” says Adam Haar Horowitz, a second-year master’s student at the MIT Media Lab, and member of the winning team. “Not just the words, which can be divisive, but the texture and pauses between the words.”

A guy walks into a library…

What happens when artificial intelligence decides what’s funny? Sound and democracy figured prominently in "The Laughing Room," an installation conceived by a team including author, illustrator, and MIT PhD candidate Jonny Sun and Stephanie Frampton, MIT associate professor of literature, as part of her project called ARTificial Intelligence, a collaboration between MIT Libraries and the Cambridge Public Library.

Funded in part by a Fay Chandler Faculty Creativity Seed Grant from the MIT Center for Art, Science and Technology (CAST), "The Laughing Room" invited public library visitors into a set that evoked a television sitcom living room, where they told stories or jokes that were analyzed by the room’s AI. If the algorithm determined a story was funny, it played a recorded laugh track. "The Laughing Room" — along with the AI’s algorithmic calculations — was then broadcast on screens in "The Control Room," a companion installation at MIT’s Hayden Library.
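
The trigger loop of an installation like this can be sketched roughly as follows. The scoring function below is a deliberately crude stand-in for whatever model the installation actually used, which the article does not detail; the threshold and cue phrases are illustrative assumptions.

```python
# Toy sketch of a laugh-track trigger: score an incoming transcript for
# "funniness" and fire the laugh track when the score clears a threshold.
# The scorer is a placeholder, not the installation's real model.
import random

LAUGH_THRESHOLD = 0.25   # illustrative value

def funniness_score(transcript: str) -> float:
    """Placeholder scorer; a real system would use a trained classifier."""
    cues = ("walks into", "knock knock", "so i said", "punchline")
    hits = sum(cue in transcript.lower() for cue in cues)
    return min(1.0, 0.3 * hits + 0.1 * random.random())

def maybe_play_laugh_track(transcript: str) -> bool:
    score = funniness_score(transcript)
    if score >= LAUGH_THRESHOLD:
        print(f"[laugh track] score={score:.2f}")   # trigger playback here
        return True
    print(f"[silence] score={score:.2f}")
    return False

maybe_play_laugh_track("A guy walks into a library and asks for a burger...")
```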

While fun for the public, the project also mined more serious issues. “There is a tension in society around technology,” says Sun, “between the things technology allows you to do, like having an algorithm tell you your joke is funny, and the price we pay for that technology, which is usually our privacy.”

Using sound to keep the pace

How can audio augmented reality enhance our quality of life? That challenge was explored by more than 70 students from multiple disciplines who competed in the Bose MIT Challenge in October. The competition, organized by Eran Egozy, professor of the practice in music technology and an MIT graduate who co-founded Harmonix, the company that developed iconic video games Guitar Hero and Rock Band, encourages students to invent real-life applications for Bose AR, a new audio augmented reality technology and platform.

This year’s winning entry adapted Bose’s motion-sensing AR headphones to enable runners to stay on pace as they train. When the runner accelerates, the music is heard behind them; when their pace slows, the music sounds as if it’s ahead of them.
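
A minimal sketch of that pacing idea, independent of the Bose AR platform itself, might map the gap between current and target pace to a virtual source position ahead of or behind the listener. The target pace and gain below are illustrative assumptions, not the team's parameters.

```python
# Sketch of the pacing idea (not the team's actual Bose AR code): place the
# music's virtual source ahead of or behind the runner based on pace error.
def music_offset_meters(current_pace_m_s: float,
                        target_pace_m_s: float,
                        gain: float = 3.0) -> float:
    """Positive = source ahead of the runner, negative = behind."""
    # Running faster than target -> music falls behind; slower -> music pulls ahead.
    return gain * (target_pace_m_s - current_pace_m_s)

for pace in (2.5, 3.0, 3.5):          # m/s, around a 3.0 m/s target
    offset = music_offset_meters(pace, target_pace_m_s=3.0)
    where = "ahead" if offset > 0 else "behind" if offset < 0 else "centered"
    print(f"pace {pace:.1f} m/s -> music {abs(offset):.1f} m {where}")
```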

“I’d joined hackathons at my home university,” said Dominic Co, a one-year exchange student in architecture from the University of Hong Kong and member of the three-person winning team. “But there’s such a strong culture of making things here at MIT. And so many opportunities to learn from other people.”

Creating a fuller picture with sound

Sound — and the technology that delivers it — has the capacity to enhance everyone’s quality of life, especially the 8.4 million Americans without sight. That was the target audience of Project Daredevil, which won the MIT Creative Arts Competition last April.

Daniel Levine, a master’s candidate at the MIT Media Lab, teamed with Matthew Shifrin, a sophomore at the New England Conservatory of Music, to create a virtual-reality system for the blind. The system’s wearable vestibular-stimulating helmet enables the sightless to experience sensations like flying, falling, and acceleration as they listen to an accompanying soundtrack.
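
One way to picture that synchronization is a cue list keyed to soundtrack timestamps, as in the rough sketch below. The cue times, effect names, and helmet interface are hypothetical and are not drawn from the team's implementation.

```python
# Hedged sketch of cueing motion sensations to an audio timeline, in the
# spirit of Project Daredevil (not the team's code; names are hypothetical).
import time

CUES = [
    (2.0, "acceleration"),   # seconds into soundtrack -> sensation to deliver
    (5.5, "flying"),
    (9.0, "falling"),
]

def send_to_helmet(effect: str) -> None:
    """Stand-in for whatever interface drives the vestibular helmet."""
    print(f"helmet <- {effect}")

start = time.monotonic()
for timestamp, effect in CUES:
    # Wait until the soundtrack reaches the cue point, then fire the effect.
    delay = timestamp - (time.monotonic() - start)
    if delay > 0:
        time.sleep(delay)
    send_to_helmet(effect)
```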

Shifrin approached Levine two years ago for help in developing an immersive 3-D experience around the Daredevil comic books — a series whose superhero, like Shifrin, is blind. When Shifrin was a child, his father read Daredevil aloud to him, carefully describing the action in every panel. Project Daredevil has advanced that childhood experience using technology.

“Because of Dan and his engineering expertise, this project has expanded far beyond our initial plan,” says Shifrin. “It’s not just a thing for blind people. Anyone who is into virtual reality and gaming can wear the device.”

A beautiful marriage of art and technology

Another cross-disciplinary collaboration in sound and technology that produced elegant results this fall is the ongoing partnership between CAST Visiting Artist Jacob Collier and MIT PhD candidate Ben Bloomberg.

Bloomberg, who completed his undergraduate and master’s studies at MIT, studied music and performance design with Tod Machover, the Muriel R. Cooper Professor of Music and Media and director of the Media Lab’s Opera of the Future group. Bloomberg discovered Collier’s music videos online about four years ago; he then wrote the artist to ask whether he needed any help in adapting his video performances to the stage. Fortunately, the answer was yes.

Working closely with Collier, Bloomberg developed a computerized audio/visual performance platform that enables the charismatic composer and performer to move seamlessly from instrument to instrument on stage and sing multiple parts simultaneously. The duo continues to develop and perfect the technology in performance. “It’s like a technological prosthesis,” says Bloomberg, who has worked with dozens of artists, including Björk and Ariana Grande.
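
As a loose illustration of one ingredient of a "sing multiple parts at once" rig, the sketch below derives harmony pitches at fixed intervals above a detected lead pitch. The intervals and the approach are illustrative assumptions only, not a description of Bloomberg's platform.

```python
# Illustrative sketch: derive harmony frequencies a live harmonizer might
# synthesize above a lead vocal pitch (not Bloomberg's actual system).
HARMONY_INTERVALS = [4, 7, 12]   # semitones: major third, perfect fifth, octave

def harmony_frequencies(lead_hz: float) -> list:
    """Return frequencies at fixed intervals above the detected lead pitch."""
    return [lead_hz * 2 ** (semitones / 12) for semitones in HARMONY_INTERVALS]

print([round(f, 1) for f in harmony_frequencies(220.0)])  # 220 Hz = A3 lead vocal
```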

While technology has opened the door to richer sound explorations, Bloomberg firmly places it in an artistic realm. “None of this would make any sense were it not for Jacob’s amazing talent. He pushes me to develop new technologies, or to find new ways to apply existing technology. The goal here isn’t to integrate technology just because we can, but to support the music and further its meaning.”

Explorations in sound continue into 2019 with the innovative annual performance series MIT Sounding. Highlights of the 2018-2019 season include a collaboration with the Boston Modern Orchestra Project in honor of MIT Institute Professor John Harbison’s 80th birthday, the American premiere of the Spider’s Canvas, a virtual 3-D reconstruction of a spider’s web with each strand tuned to a different note, and residencies by two divergent musicians: the Haitian singer and rapper BIC and the innovative American pianist Joel Fan performing works by MIT composers.
