Topic

Computer vision


Displaying 151 - 165 of 179 news clips related to this topic.

BetaBoston

MIT researchers have developed a new algorithm that allows robots to work together to efficiently serve drinks, Nidhi Subbaraman writes for BetaBoston. Subbaraman explains that the technique provides a “smarter approach to collaboration, preparing for possible missteps like dropping a bottle, or picking up the wrong one.” 

The Daily Beast

Charlotte Lytton writes for The Daily Beast about SenseGlass, a mirror created by graduate student Javier Hernandez that uses Google Glass technology to register physiological and emotional changes in the viewer. “I believe mirrors are a great platform for health monitoring as we use them [every day],” says Hernandez. 

The Guardian

“MIT PhD student Abe Davis has developed video technology that reveals an object’s hidden properties,” writes Joanna Goodman for The Guardian. “Davis uses high-speed silent video to capture and reproduce sound, including music and intelligible speech, from the vibrations it makes on inanimate objects.”

Popular Science

Popular Science reporter Levi Sharpe writes that MIT researchers have developed an object recognition system that can accurately identify and distinguish items. “This system could help future robots interact with objects more efficiently while they navigate our complex world,” Sharpe explains. 

Financial Times

Financial Times reporter Richard Waters writes about how graduate student Abe Davis’ motion magnification research could be used to create more realistic virtual worlds. Waters writes that Davis’ work presents the “possibility of capturing and manipulating real-world objects in virtual space.”

BBC News

In this video, BBC Click’s LJ Rich explores how researchers at MIT CSAIL have devised a system that can reconstruct sound from a video recording. “I think what’s really different about this technology is that it provides you with a way to image this information,” says graduate student Abe Davis.

MarketWatch

MarketWatch reporter Sally French writes that researchers from MIT CSAIL have developed an algorithm that can be used to predict how memorable a person’s face is. “The algorithm was created from a database of more than 2,000 images that were awarded a ‘memorability score’ based on human volunteers’ ability to remember the pictures,” French writes. 

Forbes

Steven Rosenbaum highlights PhD student Abe Davis’ TED talk in a piece for Forbes. Rosenbaum writes that Davis “has co-created the world’s most improbable audio instrument.”

The Washington Post

Rachel Feltman writes for The Washington Post about how MIT researchers have developed new technology that can amplify microscopic movements invisible to the human eye. “MIT researchers recently published a study in which they extracted intelligible audio by analyzing the movements of a nearby bag of chips,” Feltman writes.

Wired

A team of MIT researchers has developed an algorithm that will help NASA crews clean up debris in space, reports Nick Stockton for Wired. The research will allow crews to clear pieces of satellites spinning so wildly that they would typically be dangerous to collect.

CNN

Heather Kelly of CNN reports on how MIT researchers have developed a new technique to recreate audio from silent video. "We showed that we can determine pretty reliably the gender of a speaker from low-quality sound we managed to recover from a tissue box," says Dr. Michael Rubinstein. 

PBS NewsHour

Colleen Shalby reports for the PBS NewsHour on the “visual microphone” developed by MIT researchers that can detect and reconstruct audio by analyzing the sound waves traveling through objects. 

Bloomberg Businessweek

Bloomberg Businessweek reporter Drake Bennett writes about how MIT researchers have developed a technique for extracting audio by analyzing the sound vibrations traveling through objects. Bennett reports that the researchers found that sound waves could be detected even when using cell phone camera sensors. 

ABC News

Alyssa Newcomb of ABC News reports on how MIT researchers have developed a new method that can recover intelligible audio by videotaping everyday objects and translating their vibrations back into sound. 

NPR

NPR’s Melissa Block examines the new MIT algorithm that can translate visual information into sound. Abe Davis explains that by analyzing sound waves traveling through an object, “you can start to filter out some of that noise and you can actually recover the sound that produced that motion.”