
Topic

Algorithms


Displaying 196–210 of 567 news clips related to this topic.

TechCrunch

Researchers at MIT and Brown University created an interactive data system that “could give everyone AI superpowers,” writes Darrell Etherington for TechCrunch. Known as “Northstar,” the system can instantly generate machine-learning models that produce useful predictions from existing data sets, explains Etherington.

CNN

Using a tactile sensor and web camera, MIT researchers developed an AI system that allows robots to predict what something feels like just by looking at it, reports David Williams for CNN. “This technology could be used to help robots figure out the best way to hold an object just by looking at it,” explains Williams.

Forbes

Forbes contributor Charles Towers-Clark explores how CSAIL researchers have developed a database of tactile and visual information that could be used to allow robots to infer how different objects look and feel. “This breakthrough could lead to far more sensitive and practical robotic arms that could improve any number of delicate or mission-critical operations,” Towers-Clark writes.

TechCrunch

MIT researchers have created a new system that enables robots to identify objects using tactile information, reports Darrell Etherington for TechCrunch. “This type of AI also could be used to help robots operate more efficiently and effectively in low-light environments without requiring advanced sensors,” Etherington explains.

Fast Company

Fast Company reporter Michael Grothaus writes that CSAIL researchers have developed a new system that allows robots to determine what objects look like by touching them. “The breakthrough could ultimately help robots become better at manipulating objects,” Grothaus explains.

TechCrunch

TechCrunch reporter Darrell Etherington writes that MIT researchers have developed a system that can predict a person’s trajectory. The tool could allow “robots that typically freeze in the face of anything even vaguely resembling a person walking in their path to continue to operate and move around the flow of human foot traffic.”

Bloomberg News

Bloomberg News spotlights how MIT researchers have developed a fleet of autonomous boats that can automatically latch onto one another. Bloomberg notes that the boats will be able to “transport goods and people, collect trash and assemble into floating stages and bridges.”

Boston Globe

Boston Globe reporter Martin Finucane writes that MIT researchers have developed an automated latching system that could enable a fleet of autonomous boats to connect to docking stations and other boats. Finucane explains that in turbulent water, “after a missed first attempt, the system can autonomously adapt, repositioning the roboat and latching.”

Popular Mechanics

Popular Mechanics reporter Daisy Hernandez writes that MIT researchers have developed autonomous aquatic boats that can target and latch onto one another to form new structures. Hernandez writes that the boats were conceived “as a way to explore new modes of transportation and help improve traffic flow.”

Mashable

Mashable highlights how MIT researchers have developed a new system of computationally simple robots inspired by biological cells that can connect in large groups to move around, transport objects and complete tasks. Mashable explains that robots made up of simple components “could enable more scalable, flexible and robust systems.”

TechCrunch

TechCrunch reporter Darrell Etherington writes that MIT researchers have developed a new system that enables autonomous boats to latch onto one another to create new structures. Etherington explains that the researchers envision fleets of autonomous boats forming “on-demand urban infrastructure, including stages for concerts, walking bridges or even entire outdoor markets.”

Motherboard

Motherboard reporter Rob Dozier writes about Glitch, an MIT startup that uses machine learning to design clothing. “These tools are meant to empower human designers,” explains graduate student Emily Salvador. “What I think is really cool about these creative-focused AI tools is that there’s still this really compelling need for a human to intervene with the algorithm.”

Economist

A new sensory glove developed by MIT researchers provides insight into how humans grasp and manipulate objects, reports The Economist. The glove will not only “be useful in programming robots to mimic people more closely when they pick objects up,” but also could “provide insights into how the different parts of the hand work together when grasping things.”

HealthDay News

A new glove embedded with sensors can enable AI systems to identify the shape and weight of different objects, writes HealthDay reporter Dennis Thompson. Using the glove, “researchers have been able to clearly unravel or quantify how the different regions of the hand come together to perform a grasping task,” explains MIT alumnus Subramanian Sundaram.

New Scientist

New Scientist reporter Chelsea Whyte writes that MIT researchers have developed a smart glove that enables neural networks to identify objects by touch alone. “There’s been a lot of hope that we’ll be able to understand the human grasp someday and this will unlock our potential to create this dexterity in robots,” explains MIT alumnus Subramanian Sundaram.