Singing in the brain
MIT neuroscientists have identified a population of neurons in the human brain that respond to singing but not other types of music.
Professor and cognitive neuroscientist recognized for groundbreaking work on the functional organization of the human brain.
MIT neuroscientists have developed a computer model that can answer that question as well as the human brain.
Computational modeling shows that both our ears and our environment influence how we hear.
Study suggests this area of the visual cortex emerges much earlier in development than previously thought.
A new machine-learning system helps robots understand and perform certain social interactions.
Neuroscientists find the internal workings of next-word prediction models resemble those of language-processing centers in the brain.
When asked to classify odors, artificial neural networks adopt a structure that closely resembles that of the brain’s olfactory circuitry.
We seem to be wired to calculate not the shortest path but the “pointiest” one, facing us toward our destination as much as possible.
Brain and cognitive sciences professor will lead the Institute’s interdisciplinary initiative to advance research in natural and artificial intelligence.
EECS faculty head of artificial intelligence and decision making honored for significant and extended contributions to the field of AI.
Adding a module that mimics part of the brain can prevent common errors made by computer vision models.
What’s SSUP? The Sample, Simulate, Update cognitive model developed by MIT researchers learns to use tools like humans do.
Recurrent processing via the prefrontal cortex, necessary for rapid visual object recognition in primates, provides a key insight for developing brain-like artificial intelligence.
In some situations, people judge whether an action is right or wrong by asking, “What if everyone did that?”