Topic

Computer vision

Displaying 1–15 of 145 news clips related to this topic.

The Wall Street Journal

CSAIL researchers have developed a robotic arm equipped with a sensorized soft brush that can untangle hair, reports Douglas Belkin for The Wall Street Journal. “The laboratory brush is outfitted with sensors that detect tension," writes Belkin. “That tension reads as pain and is used to determine whether to use long strokes or shorter ones.”

TechCrunch

TechCrunch reporter Kyle Wiggers spotlights how MIT researchers have developed a new computer vision algorithm that can identify images down to the individual pixel. The new algorithm is a “vast improvement over the conventional method of ‘teaching’ an algorithm to spot and classify objects in pictures and videos,” writes Wiggers.

Gizmodo

Gizmodo reporter Andrew Liszewski writes that MIT researchers “used a high-resolution video camera with excellent low-light performance (the amount of sensor noise has to be as minimal as possible) to capture enough footage of a blank wall that special processing techniques were able to not only see the shadow’s movements, but extrapolate who was creating them.”

TechCrunch

TechCrunch reporter Brian Heater spotlights RFusion, a fully integrated robotic arm developed by MIT researchers that “uses an RF antenna and camera mounted to an arm to find lost objects.”

The Wall Street Journal

Wall Street Journal reporters Angus Loten and Kevin Hand spotlight how MIT researchers are developing robots with humanlike senses that will be able to assist with a range of tasks. GelSight, a technology developed by CSAIL researchers, outfits robot arms with a small gel pad that can be pressed into objects to sense their size and texture, while another team of researchers is “working to bridge the gap between touch and sight by training an AI system to predict what a seen object feels like and what a felt object looks like.”

The Economist

Graduate student Shashank Srikant speaks with The Economist about his work developing a new model that can detect bugs and vulnerabilities that have been maliciously inserted into computer code.

The Wall Street Journal

MIT researchers have developed a new robot that can help locate hidden items using AI and wireless technologies, reports Benoit Morenne for The Wall Street Journal. “The latest version of the robot has a 96% success rate at finding and picking up objects in a lab setting, including clothes and household items,” writes Morenne. “In the future, this home helper could also retrieve a specific wrench or screwdriver from a toolbox and assist a human in assembling a piece of furniture.”

Mashable

MIT researchers have developed a new robot with a tactile sensing finger that can find objects buried in sand or rice, reports Emmett Smith for Mashable. “The robot could eventually perform other underground duties like identifying buried cables or disarming bombs or land mines.”

Mashable

Mashable spotlights Strolling Cities, a video project from the MIT-IBM Watson AI Lab, which uses AI to allow users to imagine what different words would look like as a location. “Unlike other image-generating AI systems, Strolling Cities creates fictional cities every time,” Mashable notes.

Fast Company

Fast Company reporter Mark Wilson spotlights Strolling Cities, a new AI video project developed by researchers from the MIT-IBM Watson AI Lab, which recreates the streets of Italy based on millions of photos and words. “I decided that the beauty and sentiment, the social, historical, and psychological contents of my memories of Italy could become an artistic project, probably a form of emotional consolation,” says Mauro Martino of the MIT-IBM Watson AI Lab. “Something beautiful always comes out of nostalgia.”

TechCrunch

TechCrunch reporter Brian Heater writes that MIT researchers have developed a new robotic finger, dubbed the Digger Finger, that can sense and identify objects underground. “It’s a useful skill that could someday be deployed for landmines, finding underground cables and a variety of other tasks.”

Wired

Wired reporter Will Knight spotlights how MIT researchers have shown that “an AI program trained to verify that code will run safely can be deceived by making a few careful changes, like substituting certain variables, to create a harmful program.”

TechCrunch

MIT researchers have developed a new robot, dubbed RF Grasp, that can sense hidden objects using radio waves, reports Brian Heater for TechCrunch. “The tech allows RF Grasp to pick up things that are covered up or otherwise out of its line of vision,” writes Heater.

Gizmodo

Researchers at MIT and UMass Lowell have developed a completely flat fisheye camera lens. These lenses “could be used as depth sensors in smartphones, laptops, and wearables,” writes Victoria Song for Gizmodo. “The team also believes there could be medical applications—think imaging devices like endoscopes.”

TechCrunch

MIT researchers have designed a completely flat wide-angle lens that can produce clear, 180-degree images, reports Darrell Etherington for TechCrunch. “The engineers were able to make it work by patterning a thin wafer of glass on one side with microscopic, three-dimensional structures that are positioned very precisely in order to scatter any inbound light in precisely the same way that a curved piece of glass would,” writes Etherington.