
Topic

Computer vision


Displaying 1 - 15 of 191 news clips related to this topic.

Cambridge Day

Cambridge Day reporter Zoe Beketova, a student in MIT’s Graduate Program in Science Writing, visits Prof. Xuanhe Zhao’s lab to get a hands-on look at the group’s ultrasound wristband that can map movements of the human body using sound waves, part of the group’s work aimed at changing “how we gather information from inside the body.” Says Zhao: “The mission of my lab is really merging humans with machines and AI. We believe there’s a huge opportunity [with] this interface.”

Tech Briefs

Prof. Xuanhe Zhao speaks with Tech Briefs reporter Andrew Corselli about his team’s work developing an ultrasound wristband that precisely tracks a wearer’s hand movements in real time and can communicate these motions to a robot or a virtual environment. “For the future of human society, humanized robots will do lots of different work for us. For that work, we need a dexterous robotic hand,” explains Zhao. “We believe this ultrasound wristband, based on variable imaging, could be the future of really knowing the human hand motions.”

Popular Science

MIT researchers have developed an ultrasound wristband that can transmit a user’s motions to a robotic hand or a virtual environment, reports Mack Degeurin for Popular Science. “Volunteers wearing the device could direct the robotic hand to grab tennis balls, make hand signs, and even play notes on a piano,” Degeurin explains. “That same technique can also be applied to digital environments, which means future wearers could control a phone screen without ever touching it, or interact with virtual reality in ways that feel more immersive.” 

Scientific American

MIT researchers have developed “GelSight,” a system that provides robots with a sense of touch, reports Ben Guarino for Scientific American. “GelSight can identify by touch the tiny letters spelling out LEGO on the stud of a toy brick,” explains Guarino. 

Interesting Engineering

MIT researchers have developed a deep-learning model “capable of predicting the precise movements, divisions, and restructuring of thousands of cells during the embryo’s transition from a simple cluster to a complex organism,” reports Mrigakshi Dixit for Interesting Engineering. “This model currently provides a sneak peek into the fruit fly’s earliest developmental stage,” explains Dixit. “In the future, it could be used to predict how more complex tissues, organs, and organisms develop.” 

CNN

In a video for CNN, graduate student Alex Kachkine explains his work developing a method using AI to create a reversible polymer film that could be used to restore damaged oil paintings, making the process faster than manual restoration. Kachkine explains that he hopes his work helps “get more paintings out of storage and into public view as there are many paintings that are damaged that I would love to see and it’s a real shame that there aren’t the resources necessary to restore them.” 

New York Times

Graduate student Alex Kachkine speaks with New York Times reporter Ephrat Livni about his work creating a new AI technique for restoring paintings, and how his research on microchips helped inspire the development. Microchips “require very high degrees of precision,” Kachkine explains. “And it turns out a lot of the techniques we use to achieve that level of precision are applicable to art restoration.” Kachkine adds that he hopes conservators will be able to “leverage the benefits” of the techniques he gleaned from engineering to preserve “really valuable cultural heritage.”

The Guardian

Writing for The Guardian, Prof. Carlo Ratti highlights his work using “AI to compare footage of public spaces from the 1970s with recent video” from the same locations in Boston, New York and Philadelphia. “The findings are striking: people walk faster, linger less, and are less likely to meet up,” explains Ratti. “By using AI to study urban public spaces, we can gather data, pick out patterns and test new designs that could help us rethink, for our time, our modern versions of the agora – the market and main public gathering place of Athens.”

Fast Company

Prof. Philip Isola speaks with Fast Company reporter Victor Dey about the impact and use of agentic AI. “In some domains we truly have automatic verification that we can trust, like theorem proving in formal systems. In other domains, human judgment is still crucial,” says Isola. “If we use an AI as the critic for self-improvement, and if the AI is wrong, the system could go off the rails.”

Gizmodo

Researchers at MIT have developed a new tool, called Meschers, that allows users to create detailed computer representations of mathematically impossible objects, reports Gayoung Lee for Gizmodo. “In addition to creating aesthetically quirky objects,” Lee explains, “Meschers could eventually assist in research across geometry, thermodynamics, and even art and architecture.”

Dezeen

A study by researchers at MIT has found that “pedestrians are walking 15 percent faster and stopping to linger 14 percent less than they used to,” reports Rima Sabina Aouf for Dezeen. “Using computer vision and artificial intelligence to analyze videos of four public spaces across three American cities, the study found that walking speeds rose notably between 1980 and 2010, while instances of people lingering or interacting with others fell,” writes Aouf. 

Newsweek

Researchers at MIT have found that “pedestrians in three major northeastern U.S. cities – Boston, New York and Philadelphia – are moving 15 percent faster than they did in 1980,” reports Lucy Notarantonio for Newsweek. Notarantonio explains: “The researchers hope their work will inform how cities design and redesign public areas — especially at a time when digital polarization is reshaping how people connect in real life.”

Ars Technica

Graduate student Alex Kachkine has developed a new technique that “uses AI-generated polymer films to physically restore damaged paintings in hours,” reports Benj Edwards for Ars Technica. “Kachkine's method works by printing a transparent ‘mask’ containing thousands of precisely color-matched regions that conservators can apply directly to an original artwork,” explains Edwards. “Unlike traditional restoration, which permanently alters the painting, these masks can reportedly be removed whenever needed. So it's a reversible process that does not permanently change a painting.” 

The Guardian

Guardian reporter Ian Sample highlights how graduate student Alex Kachkine has developed a new approach to restoring age-damaged artwork in hours. “The technique draws on artificial intelligence and other computer tools to create a digital reconstruction of the damaged painting,” explains Sample. “This is then printed on to a transparent polymer sheet that is carefully laid over the work.”

Nature

Graduate student Alex Kachkine speaks with Nature reporter Amanda Heidt about his work developing a new method for restoring damaged artwork. The method uses “digital tools to create a ‘mask’ of pigments that can be printed and varnished onto damaged paintings,” explains Heidt. The method “reduces both the cost and time associated with art restoration and could one day give new life to many of the paintings held in institutional collections — perhaps as many as 70% — that remain hidden from public view owing to damage.”