Showing robots how to do your chores
By observing humans, robots learn to perform complex tasks, such as setting a table.
Computer model of face processing could reveal how the brain produces richly detailed visual representations so quickly.
Researchers discover that no magic is required to explain why deep networks generalize despite going against statistical intuition.
Technique may help scientists more accurately map vast underground geologic structures.
Weather’s a problem for autonomous cars. MIT’s new system shows promise by using “ground-penetrating radar” instead of cameras or lasers.
Tech-based solutions sought for challenges in work environments, education for girls and women, maternal and newborn health, and sustainable food.
MIT duo uses music, videos, and real-world examples to teach students the foundations of artificial intelligence.
PatternEx merges human and machine expertise to spot and respond to hacks.
In a Starr Forum talk, Luis Videgaray, director of MIT’s AI Policy for the World Project, outlines key facets of regulating new technologies.
A deep-learning model identifies a powerful new drug that can kill many species of antibiotic-resistant bacteria.
MIT graduate student is assessing the impacts of artificial intelligence on military power, with a focus on the US and China.
The mission of SENSE.nano is to foster the development and use of novel sensors, sensing systems, and sensing solutions.
By organizing performance data and predicting problems, Tagup helps energy companies keep their equipment running.
Researchers develop a more robust machine-vision architecture by studying how human vision responds to changing viewpoints of objects.
Three-day hackathon explores methods for making artificial intelligence faster and more sustainable.