
Topic

Machine learning


Displaying 511 - 525 of 751 news clips related to this topic.

STAT

Diana Cai writes for STAT about Prof. Markus Buehler’s new research to turn amino acids into music. “Buehler thinks the technology could help in understanding genetic diseases caused by misfolded proteins,” writes Cai, noting that, “AI may conceivably ‘hear’ patterns of misfolding that could distinguish dangerous mutations from harmless ones.”

Motherboard

In a new study, Prof. Markus Buehler converted 20 types of amino acids into a 20-tone scale to create musical compositions. “Those altered compositions were converted back into a conceptual amino acid chain, which enabled the team to generate variations of proteins that have never been seen in nature,” writes Becky Ferreira for Motherboard.
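The residue-to-tone mapping described above can be illustrated with a toy sketch. This is not Buehler's actual method (which derives its sonification from protein physics); it simply shows the idea of assigning each of the 20 standard amino acids a distinct pitch so a sequence becomes a melody. The base note and the alphabetical ordering are arbitrary choices for this illustration.

```python
# Toy illustration: map the 20 standard amino acids (one-letter codes)
# onto a 20-tone scale, so a protein sequence becomes a note sequence.

AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"  # the 20 standard one-letter codes
BASE_MIDI = 48  # C3; an arbitrary starting pitch for this sketch


def sequence_to_tones(seq: str) -> list[int]:
    """Map each residue in a protein sequence to a MIDI note number."""
    scale = {aa: BASE_MIDI + i for i, aa in enumerate(AMINO_ACIDS)}
    return [scale[aa] for aa in seq.upper() if aa in scale]


print(sequence_to_tones("MKT"))  # three residues -> three notes
```

Running the mapping in reverse (notes back to residues) is what lets altered compositions be converted into new conceptual amino acid chains, as the article describes.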

Gizmodo

Gizmodo reporter Victoria Song writes that MIT researchers have developed a new system that can teach a machine how to make pizza by examining a photograph. “The researchers set out to teach machines how to recognize different steps in cooking by dissecting images of pizza for individual ingredients,” Song explains.

CNN

Using a tactile sensor and web camera, MIT researchers developed an AI system that allows robots to predict what something feels like just by looking at it, reports David Williams for CNN. “This technology could be used to help robots figure out the best way to hold an object just by looking at it,” explains Williams.

Forbes

Forbes contributor Charles Towers-Clark explores how CSAIL researchers have developed a database of tactile and visual information that could be used to allow robots to infer how different objects look and feel. “This breakthrough could lead to far more sensitive and practical robotic arms that could improve any number of delicate or mission-critical operations,” Towers-Clark writes.

TechCrunch

MIT researchers have created a new system that enables robots to identify objects using tactile information, reports Darrell Etherington for TechCrunch. “This type of AI also could be used to help robots operate more efficiently and effectively in low-light environments without requiring advanced sensors,” Etherington explains.

Fast Company

Fast Company reporter Michael Grothaus writes that CSAIL researchers have developed a new system that allows robots to determine what objects look like by touching them. “The breakthrough could ultimately help robots become better at manipulating objects,” Grothaus explains.

TechCrunch

TechCrunch reporter Darrell Etherington writes that MIT researchers have developed a system that can predict a person’s trajectory. The tool could allow “robots that typically freeze in the face of anything even vaguely resembling a person walking in their path to continue to operate and move around the flow of human foot traffic.”

Motherboard

Motherboard reporter Rob Dozier writes about Glitch, an MIT startup that uses machine learning to design clothing. “These tools are meant to empower human designers,” explains graduate student Emily Salvador. “What I think is really cool about these creative-focused AI tools is that there’s still this really compelling need for a human to intervene with the algorithm.”

Forbes

Forbes reporter Joe McKendrick highlights a Nature review article by MIT researchers that calls for expanding the study of AI. “We’re seeing the rise of machines with agency, machines that are actors making decisions and taking actions autonomously,” they write. “This calls for a new field of scientific study that looks at them not solely as products of engineering and computer science.”

Economist

A new sensory glove developed by MIT researchers provides insight into how humans grasp and manipulate objects, reports The Economist. The glove will not only “be useful in programming robots to mimic people more closely when they pick objects up,” but also could “provide insights into how the different parts of the hand work together when grasping things.”

HealthDay News

A new glove embedded with sensors can enable AI systems to identify the shape and weight of different objects, writes HealthDay reporter Dennis Thompson. Using the glove, “researchers have been able to clearly unravel or quantify how the different regions of the hand come together to perform a grasping task,” explains MIT alumnus Subramanian Sundaram.

New Scientist

New Scientist reporter Chelsea Whyte writes that MIT researchers have developed a smart glove that enables neural networks to identify objects by touch alone. “There’s been a lot of hope that we’ll be able to understand the human grasp someday and this will unlock our potential to create this dexterity in robots,” explains MIT alumnus Subramanian Sundaram.

PBS NOVA

MIT researchers have developed a low-cost electronic glove equipped with sensors that can use tactile information to identify objects, reports Katherine Wu for NOVA Next. Wu writes that the glove is “easy and economical to manufacture, carrying a wallet-friendly price tag of only $10 per glove, and could someday inform the design of prosthetics, surgical tools, and more.”

VentureBeat

Researchers from MIT and a number of other institutions have found that grammar-enriched deep learning models had a better understanding of key linguistic rules, reports Kyle Wiggers for VentureBeat. The researchers found that an AI system provided with knowledge of basic grammar, “consistently performed better than systems trained on little-to-no grammar using a fraction of the data, and that it could comprehend ‘fairly sophisticated’ rules.”