Topic

Augmented and virtual reality


Motherboard

In a new data sonification project, a team of MIT researchers has translated the vibrations of a spider’s web into music, writes Maddie Bender for Motherboard. The team "used the physics of spiderwebs to assign audible tones to a given string’s unique tension and vibration," writes Bender. "Summing up every string’s tone created an interactive model of a web that could produce sound through manipulation or VR navigation."
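The clip doesn’t specify the researchers’ actual mapping, but the idea of assigning a tone to each strand from its tension can be sketched with the standard vibrating-string formula, f = (1/2L)·√(T/μ); every strand parameter below is hypothetical:

```python
import math

def string_frequency(tension, length, density):
    """Fundamental frequency of an ideal string (Mersenne's law):
    f = (1 / 2L) * sqrt(T / mu), with tension T in newtons,
    length L in meters, and linear density mu in kg/m."""
    return math.sqrt(tension / density) / (2 * length)

# Hypothetical web strands: (tension, length, linear density).
strands = [(0.002, 0.05, 1e-6), (0.004, 0.03, 1e-6)]

# One audible tone per strand; "summing up" the tones yields the
# web's overall sound, as the clip describes.
tones = [string_frequency(t, length, mu) for t, length, mu in strands]
```

With these toy values the strands land in the hundreds-of-hertz range, i.e. comfortably audible; tighter or shorter strands map to higher pitches, which is what lets a listener distinguish parts of the web.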

Gizmodo

A team of MIT researchers has translated the vibrations of a spider’s web into music, reports Isaac Schultz for Gizmodo. “Spiders live in this vibrational universe,” says Prof. Markus Buehler. “They live in this world of vibrations and frequencies, which we can now access. One of the things we can do with this instrument with this approach is we can, for the first time, begin to feel a little bit like a spider or experience the world like the spider does.”

Forbes

Forbes contributor Andrea Morris spotlights how MIT researchers have created a virtual reality experience that allows people to experience a spider web’s vibrations as music. "The team is working on a study exploring the boundaries between the kinds of compositions we humans create from synthetic instruments and our own conventional tuning, and compositions created from instruments that have been crafted and tuned by other biological beings, like spiders," writes Morris. 

New Scientist

MIT researchers have created a new audio-visual virtual reality that can provide a sense of what it’s like to be a spider by converting a spider web’s vibrations into sounds that humans can hear, reports Ian Morse for New Scientist. “The spider web can be viewed as an extension of the body of the spider, in that it lives within it, but also uses it as a sensor,” says Prof. Markus Buehler. “When you go into the virtual reality world and you dive inside the web, being able to hear what’s going on allows you to understand what you see.”

The Boston Globe

Writing for The Boston Globe, Prof. D. Fox Harrell, Francesca Panetta and Pakinam Amer of the MIT Center for Advanced Virtuality explore the potential dangers posed by deepfake videos. “Combatting misinformation in the media requires a shared commitment to human rights and dignity — a precondition for addressing many social ills, malevolent deepfakes included,” they write.

Gizmodo

Researchers at MIT and UMass Lowell have developed a completely flat fisheye camera lens. These lenses “could be used as depth sensors in smartphones, laptops, and wearables,” writes Victoria Song for Gizmodo. “The team also believes there could be medical applications—think imaging devices like endoscopes.”

TechCrunch

MIT researchers have designed a completely flat wide-angle lens that can produce clear, 180-degree images, reports Darrell Etherington for TechCrunch. “The engineers were able to make it work by patterning a thin wafer of glass on one side with microscopic, three-dimensional structures that are positioned very precisely in order to scatter any inbound light in precisely the same way that a curved piece of glass would,” writes Etherington.

Fortune

Researchers at MIT’s Center for Advanced Virtuality have created a deepfake video of President Richard Nixon discussing a failed moon landing. “[The video is] meant to serve as a warning of the coming wave of impressively realistic deepfake false videos about to hit us that use A.I. to convincingly reproduce the appearance and sound of real people,” write Aaron Pressman and David Z. Morris for Fortune.

Boston 25 News

Boston 25’s Chris Flanagan reports that MIT researchers developed a website aimed at educating the public about deepfake technology and misinformation. “This project is part of an awareness campaign to get people aware of what is possible with both AI technologies like our deepfake, but also really simple video editing technologies,” says Francesca Panetta, XR creative director at MIT’s Center for Advanced Virtuality.

Space.com

MIT researchers created a deepfake video and website to help educate the public about the dangers of deepfakes and misinformation, reports Mike Wall for Space.com. “This alternative history shows how new technologies can obfuscate the truth around us, encouraging our audience to think carefully about the media they encounter daily,” says Francesca Panetta, XR creative director at MIT’s Center for Advanced Virtuality.

Scientific American

Scientific American explores how MIT researchers created a new website aimed at exploring the potential perils and possibilities of deepfakes. “One of the things I most love about this project is that it’s using deepfakes as a medium and the arts to address the issue of misinformation in our society,” says Prof. D. Fox Harrell.

New York Times

Prof. D. Fox Harrell speaks with New York Times reporter Joshua Rothkopf about the educational potential of deepfake technology. “To have the savvy to negotiate a political media landscape where a video could potentially be a deepfake, or a legitimate video could be called a deepfake, I think those are cases people need to be aware of,” says Harrell.

The Daily Beast

Daily Beast reporter David Axe spotlights graduate student Guillermo Bernal’s work developing virtual reality avatars that can convey realistic human emotions. “As this medium moves forward, this and other tools are what will help the field of virtual reality expand from a medium of surface-level experience to one of deep, emotionally compelling human-to-human connection,” Bernal explains.

U.S. News & World Report

U.S. News & World Report contributor Linda Childers spotlights how the Sloan School of Management is integrating virtual reality tools into its curriculum. Prof. John Sterman explains that a climate simulation game “teaches our business students skills such as improvising, negotiating and public speaking,” adding that, “it reinforces how their decisions can have consequences that last for decades.”

Boston Globe

Boston Globe columnist Scott Kirsner highlights the RealityVirtually event that took place at MIT, bringing together nearly 450 people from 35 countries to create new software for virtual and augmented reality headsets. Kirsner writes that the event “offered an incredible glimpse into the nascent medium’s potential and possibilities.”