
Topic: Artificial intelligence


Displaying 736 - 750 of 1216 news clips related to this topic.

Gizmodo

Gizmodo reporter Victoria Song writes that MIT researchers have developed a new system that can teach a machine how to make pizza by examining a photograph. “The researchers set out to teach machines how to recognize different steps in cooking by dissecting images of pizza for individual ingredients,” Song explains.

CNN

Using a tactile sensor and web camera, MIT researchers developed an AI system that allows robots to predict what something feels like just by looking at it, reports David Williams for CNN. “This technology could be used to help robots figure out the best way to hold an object just by looking at it,” explains Williams.

Forbes

Forbes contributor Charles Towers-Clark explores how CSAIL researchers have developed a database of tactile and visual information that could be used to allow robots to infer how different objects look and feel. “This breakthrough could lead to far more sensitive and practical robotic arms that could improve any number of delicate or mission-critical operations,” Towers-Clark writes.

TechCrunch

MIT researchers have created a new system that enables robots to identify objects using tactile information, reports Darrell Etherington for TechCrunch. “This type of AI also could be used to help robots operate more efficiently and effectively in low-light environments without requiring advanced sensors,” Etherington explains.

Fast Company

Fast Company reporter Michael Grothaus writes that CSAIL researchers have developed a new system that allows robots to determine what objects look like by touching them. “The breakthrough could ultimately help robots become better at manipulating objects,” Grothaus explains.

TechCrunch

TechCrunch reporter Darrell Etherington writes that MIT researchers have developed a system that can predict a person's trajectory. The tool could allow “robots that typically freeze in the face of anything even vaguely resembling a person walking in their path to continue to operate and move around the flow of human foot traffic.”

Times Higher Education

During a Times Higher Ed summit, Prof. Shigeru Miyagawa, senior associate dean for open learning, emphasized the importance of integrating attention to ethical implications into AI education, reports Paul Baskin. “My plan is to educate a new generation of young people who will have intuition behind computational thinking,” says Miyagawa.

Mashable

Mashable highlights how MIT researchers have developed a new system of computationally simple robots inspired by biological cells that can connect in large groups to move around, transport objects and complete tasks. Mashable explains that robots made up of simple components “could enable more scalable, flexible and robust systems.”

Motherboard

Motherboard reporter Rob Dozier writes about Glitch, an MIT startup that uses machine learning to design clothing. “These tools are meant to empower human designers,” explains graduate student Emily Salvador. “What I think is really cool about these creative-focused AI tools is that there’s still this really compelling need for a human to intervene with the algorithm.”

VentureBeat

Researchers from MIT and a number of other institutions have found that grammar-enriched deep learning models had a better understanding of key linguistic rules, reports Kyle Wiggers for VentureBeat. The researchers found that an AI system provided with knowledge of basic grammar “consistently performed better than systems trained on little-to-no grammar using a fraction of the data, and that it could comprehend ‘fairly sophisticated’ rules.”

Mashable

In this video, Mashable highlights how CSAIL researchers have developed a new system that can help lift heavy objects by mirroring human activity. The system uses sensors that monitor muscle activity and detect changes in the user’s arm.

Gizmodo

In an article for Gizmodo, Dell Cameron writes that graduate student Joy Buolamwini testified before Congress about the inherent biases of facial recognition systems. Buolamwini’s research on face recognition tools “identified a 35-percent error rate for photos of darker skinned women, as opposed to database searches using photos of white men, which proved accurate 99 percent of the time.”

Wired

Wired reporter Lily Hay Newman highlights graduate student Joy Buolamwini’s Congressional testimony about the bias of facial recognition systems. “New research is showing bias in the use of facial analysis technology for health care purposes, and facial recognition is being sold to schools,” said Buolamwini. “Our faces may well be the final frontier of privacy.”

Popular Science

Popular Science reporter Rob Verger writes that MIT researchers have developed a new mechanical system that can help humans lift heavy objects. “Overall the system aims to make it easier for people and robots to work together as a team on physical tasks,” explains graduate student Joseph DelPreto.

TechCrunch

MIT and the U.S. Air Force “are teaming up to launch a new accelerator focused on artificial intelligence applications,” writes Danny Crichton for TechCrunch. The goal is that projects developed in the MIT-Air Force AI Accelerator would be “addressing challenges that are important to both the Air Force and society more broadly.”