
Topic

Augmented and virtual reality



Forbes

Kirin Sinha ’14 founded Illumix, a technology company that transforms 2D models into interactive 3D models to provide immersive AR experiences, reports Rebecca Suhrawardi for Forbes. “Illumix is enabling high-quality and real-time virtual try-on which has implications for the fashion industry ranging from higher conversion rates, fewer returns, and more environmentally-friendly,” says Sinha.

WHDH 7

MIT researchers have created a new headset, called X-AR, that can help users find hidden or lost items by sending a wireless signal to any item that has a designated tag on it, reports WHDH. The augmented reality headset “allows them to see things that are otherwise not visible to the human eye,” explains Prof. Fadel Adib. “It visualizes items for people and then it guides them towards items.” 

Boston.com

Boston.com reporter Ross Cristantiello writes that MIT researchers have developed a new augmented reality headset that combines computer vision and wireless perception to allow users to track and find objects hidden from view. “The system relies on radio frequency signals that can pass through everyday materials like cardboard, plastic, and wood,” Cristantiello explains.

The Daily Beast

MIT engineers have developed an augmented reality headset that uses RFID technology to allow wearers to find objects, reports Tony Ho Tran for The Daily Beast. “The device is intended to assist workers in places like e-commerce warehouses and retail stores to quickly find and identify objects,” writes Tran. “It can also help technicians find tools and items they need to assemble products.” 

Popular Science

An augmented reality headset developed by MIT engineers, called X-AR, uses RFID technology to help users find hidden objects, reports Andrew Paul for Popular Science. “X-AR’s creators were able to guide users with nearly 99 percent accuracy to items scattered throughout a warehouse testing environment,” writes Paul. “When those products were hidden within boxes, the X-AR still even boasted an almost 92 percent accuracy rate.” 

The Economist

MIT researchers devised a new way to arrange LED pixels to create screens with a much higher resolution than is currently possible, reports The Economist. The new technique, which involves stacking micro-LEDs, could also be used to make “VR images that appear far more lifelike than today’s.”

Politico

Prof. Cynthia Breazeal discusses her work exploring how artificial intelligence can help students impacted by Covid, including refugees or children with disabilities, reports Ryan Heath for Politico. “We want to be super clear on what the role is of the robot versus the community, of which this robot is a part of. That's part of the ethical design thinking,” says Breazeal. “We don't want to have the robot overstep its responsibilities. All of our data that we collect is protected and encrypted.”

Mashable

Mashable spotlights how MIT’s baseball pitching coach is using motion capture technology to help analyze and teach pitching techniques. Using the technology, Coach Todd Carroll can “suggest real-time adjustments as a player is pitching so that just one session using the technology improves their game.”

Reuters

MIT researchers have created 3D models of spiderwebs to help transform the web’s vibrations into sounds that humans can hear, writes Angela Moore for Reuters. “Spiders utilize vibrations as a way to communicate with the environment, with other spiders,” says Prof. Markus Buehler. “We have recorded these vibrations from spiders and used artificial intelligence to learn these vibrational patterns and associate them with certain actions, basically learning the spider’s language.” 

Motherboard

In a new data sonification project, a team of MIT researchers have translated the vibrations of a spider’s web into music, writes Maddie Bender for Motherboard. The team "used the physics of spiderwebs to assign audible tones to a given string’s unique tension and vibration," writes Bender. "Summing up every string’s tone created an interactive model of a web that could produce sound through manipulation or VR navigation."

Gizmodo

A team of MIT researchers have translated the vibrations of a spider’s web into music, reports Isaac Schultz for Gizmodo. “Spiders live in this vibrational universe,” says Prof. Markus Buehler. “They live in this world of vibrations and frequencies, which we can now access. One of the things we can do with this instrument with this approach is we can, for the first time, begin to feel a little bit like a spider or experience the world like the spider does.”

Forbes

Forbes contributor Andrea Morris spotlights how MIT researchers have created a virtual reality experience that allows people to experience a spider web’s vibrations as music. "The team is working on a study exploring the boundaries between the kinds of compositions we humans create from synthetic instruments and our own conventional tuning, and compositions created from instruments that have been crafted and tuned by other biological beings, like spiders," writes Morris. 

New Scientist

MIT researchers have created a new audio-visual virtual reality experience that can provide a sense of what it’s like to be a spider by converting a spider web’s vibrations into sounds that humans can hear, reports Ian Morse for New Scientist. “The spider web can be viewed as an extension of the body of the spider, in that it lives within it, but also uses it as a sensor,” says Prof. Markus Buehler. “When you go into the virtual reality world and you dive inside the web, being able to hear what’s going on allows you to understand what you see.”

The Boston Globe

Writing for The Boston Globe, Prof. D. Fox Harrell, Francesca Panetta and Pakinam Amer of the MIT Center for Advanced Virtuality explore the potential dangers posed by deepfake videos. “Combatting misinformation in the media requires a shared commitment to human rights and dignity — a precondition for addressing many social ills, malevolent deepfakes included,” they write.

Gizmodo

Researchers at MIT and UMass Lowell have developed a completely flat fisheye camera lens. These lenses “could be used as depth sensors in smartphones, laptops, and wearables,” writes Victoria Song for Gizmodo. “The team also believes there could be medical applications—think imaging devices like endoscopes.”