Paul Carter of BBC’s Click highlights CSAIL research on teaching a robot to predict what an object feels like just by looking at it. This will ultimately help the robot “grip better when lifting things like the handle of a mug,” says Carter.
Using a tactile sensor and web camera, MIT researchers developed an AI system that allows robots to predict what something feels like just by looking at it, reports David Williams for CNN. “This technology could be used to help robots figure out the best way to hold an object just by looking at it,” explains Williams.
Forbes contributor Charles Towers-Clark explores how CSAIL researchers have developed a database of tactile and visual information that could allow robots to infer how different objects look and feel. “This breakthrough could lead to far more sensitive and practical robotic arms that could improve any number of delicate or mission-critical operations,” Towers-Clark writes.
MIT researchers have created a new system that enables robots to identify objects using tactile information, reports Darrell Etherington for TechCrunch. “This type of AI also could be used to help robots operate more efficiently and effectively in low-light environments without requiring advanced sensors,” Etherington explains.
Fast Company reporter Michael Grothaus writes that CSAIL researchers have developed a new system that allows robots to determine what objects look like by touching them. “The breakthrough could ultimately help robots become better at manipulating objects,” Grothaus explains.
A new sensory glove developed by MIT researchers provides insight into how humans grasp and manipulate objects, reports The Economist. The glove will not only “be useful in programming robots to mimic people more closely when they pick objects up,” but also could “provide insights into how the different parts of the hand work together when grasping things.”
A new glove embedded with sensors can enable AI systems to identify the shape and weight of different objects, writes HealthDay reporter Dennis Thompson. Using the glove, “researchers have been able to clearly unravel or quantify how the different regions of the hand come together to perform a grasping task,” explains MIT alumnus Subramanian Sundaram.
New Scientist reporter Chelsea Whyte writes that MIT researchers have developed a smart glove that enables neural networks to identify objects by touch alone. “There’s been a lot of hope that we’ll be able to understand the human grasp someday and this will unlock our potential to create this dexterity in robots,” explains MIT alumnus Subramanian Sundaram.
MIT researchers have developed a low-cost electronic glove equipped with sensors that can use tactile information to identify objects, reports Katherine Wu for NOVA Next. Wu writes that the glove is “easy and economical to manufacture, carrying a wallet-friendly price tag of only $10 per glove, and could someday inform the design of prosthetics, surgical tools, and more.”
Wired reporter Aarian Marshall spotlights how Prof. Sarah Williams has been developing digital tools to help map bus routes in areas that lack transportation maps. “The maps show that there is an order,” Williams explains. “There is, in fact, a system, and the system could be used to help plan new transportation initiatives.”
A study by MIT researchers examines the historical impact of technology on the labor market in an attempt to better understand the potential effect of AI systems, reports Adi Gaskell for Forbes. “The authors propose a number of solutions for improving data on the skills required in the workforce today, and from that the potential for AI to automate or augment those skills,” Gaskell explains.
Fast Company reporter Michael Grothaus writes that CSAIL researchers have developed a deep learning model that could predict whether a woman might develop breast cancer. The system “could accurately predict about 31% of all cancer patients in a high-risk category,” Grothaus explains, which is “significantly better than traditional ways of predicting breast cancer risks.”
WCVB-TV’s Jennifer Eagan reports that researchers from MIT and MGH have developed a deep learning model that can predict a patient’s risk of developing breast cancer in the future from a mammogram image. Prof. Regina Barzilay explains that the model “can look at lots of pixels and variations of the pixels and capture very subtle patterns.”
HealthDay News reporter Amy Norton writes that MIT researchers have developed an AI system that can help predict a woman’s risk of developing breast cancer and provide more personalized care. “If you know a woman is at high risk, maybe she can be screened more frequently, or be screened using MRI,” explains graduate student Adam Yala.
In an article about how the social messaging app WhatsApp could have a large influence on the upcoming election in India, the Financial Times spotlights postdoctoral associate Kiran Garimella’s work examining how misinformation spreads in India through platforms such as WhatsApp.