
Topic: Computer vision


Displaying 46–60 of 179 news clips related to this topic.

Wired

Wired reporter Will Knight spotlights how MIT researchers have shown that “an AI program trained to verify that code will run safely can be deceived by making a few careful changes, like substituting certain variables, to create a harmful program.”

TechCrunch

MIT researchers have developed a new robot, dubbed RF Grasp, that can sense hidden objects using radio waves, reports Brian Heater for TechCrunch. “The tech allows RF Grasp to pick up things that are covered up or otherwise out of its line of vision,” writes Heater.

Gizmodo

Researchers at MIT and UMass Lowell have developed a completely flat fisheye camera lens. These lenses “could be used as depth sensors in smartphones, laptops, and wearables,” writes Victoria Song for Gizmodo. “The team also believes there could be medical applications—think imaging devices like endoscopes.”

TechCrunch

MIT researchers have designed a completely flat wide-angle lens that can produce clear, 180-degree images, reports Darrell Etherington for TechCrunch. “The engineers were able to make it work by patterning a thin wafer of glass on one side with microscopic, three-dimensional structures that are positioned very precisely in order to scatter any inbound light in precisely the same way that a curved piece of glass would,” writes Etherington.

Fast Company

Fast Company reporter Adele Peters spotlights how, as part of an effort to reduce Covid-19 risk for health care workers, researchers from MIT and Brigham and Women’s Hospital developed a new system that enables remote vital sign monitoring. “We started to think about how we could protect healthcare providers and minimize contact with folks that might be infectious,” says Prof. Giovanni Traverso.

CNN

CNN reporter Allen Kim spotlights how researchers from MIT and Brigham and Women’s Hospital have modified a robotic dog so that it can be used to help measure a patient’s vital signs. “The researchers expect to focus on triage applications in the short term, with the goal of ultimately deploying robots like this to patients' hospital rooms to continuously monitor them and let doctors check in on them without ever having to step into the room,” writes Kim.

CBS Boston

CBS Boston features how researchers from MIT and Brigham and Women’s Hospital equipped a robot from Boston Dynamics with technology to enable remote vital sign monitoring.

Bloomberg

In this video, Bloomberg News spotlights how researchers from MIT and Brigham and Women’s Hospital have developed a new system that facilitates remote monitoring of a patient’s vital signs, as part of an effort to help reduce healthcare workers’ Covid-19 risk. Researchers have successfully measured temperature, breathing rate, pulse rate and blood oxygen saturation in healthy patients.

Boston Herald

Researchers from MIT and Brigham and Women’s Hospital have repurposed a robotic dog from Boston Dynamics with technology that enables doctors to remotely measure a patient’s vital signs, reports Rick Sobey for The Boston Herald. “Using four cameras mounted on the dog-like robot, the researchers have shown that they can measure skin temperature, breathing rate, pulse rate and blood oxygen saturation in healthy patients,” writes Sobey.

ZDNet

A new tool developed by MIT researchers sheds light on the operations of generative adversarial network models and allows users to edit these machine learning models to generate new images, reports Daphne Leprince-Ringuet for ZDNet. "The real challenge I'm trying to breach here," says graduate student David Bau, "is how to create models of the world based on people's imagination."

The Verge

Verge reporter James Vincent writes that researchers at the MIT-IBM Watson AI Lab have developed an algorithm that can transform selfies into artistic portraits. The algorithm is “trained on 45,000 classical portraits to render your face in faux oil, watercolor, or ink,” Vincent explains.

BBC

Paul Carter of BBC’s Click highlights CSAIL research to teach a robot how to feel an object just by looking at it. This will ultimately help the robot “grip better when lifting things like the handle of a mug,” says Carter.

Gizmodo

Gizmodo reporter Victoria Song writes that MIT researchers have developed a new system that can teach a machine how to make pizza by examining a photograph. “The researchers set out to teach machines how to recognize different steps in cooking by dissecting images of pizza for individual ingredients,” Song explains.

CNN

Using a tactile sensor and web camera, MIT researchers developed an AI system that allows robots to predict what something feels like just by looking at it, reports David Williams for CNN. “This technology could be used to help robots figure out the best way to hold an object just by looking at it,” explains Williams.

Forbes

Forbes contributor Charles Towers-Clark explores how CSAIL researchers have developed a database of tactile and visual information that could be used to allow robots to infer how different objects look and feel. “This breakthrough could lead to far more sensitive and practical robotic arms that could improve any number of delicate or mission-critical operations,” Towers-Clark writes.