
Topic

Machine learning


Displaying 706–720 of 828 news clips related to this topic.

Co.Design

MapLite, a new system developed by CSAIL, aims to help autonomous vehicles navigate uncharted areas, writes Jesus Diaz for Co.Design. “[I]f autonomous cars can reach the millions of people who live beyond the city and are unable to pilot their own vehicles,” said graduate student Teddy Ort, “they will be uniquely capable of providing mobility to those who have very few alternatives.”

The Verge

Writing for The Verge, Angela Chen highlights advances in AI that are allowing researchers to discover and understand new materials at a rapid pace. Chen cites a study co-authored by Assistant Prof. Elsa Olivetti, who “developed a machine-learning system that scans academic papers to figure out which ones include instructions for making certain materials.”

PBS NOVA

MIT researchers have developed “the first artificial system to mimic the way the brain interprets sound – and it rivals humans in its accuracy,” reports Samia Bouzik for NOVA Next. “The research offers a tantalizing new way to study the brain…[and] could boost some neuroscience research into the fast track,” writes Bouzik.

Co.Design

After several years of experimentation, graduate student Arnav Kapur developed AlterEgo, a device to interpret subvocalization that can be used to control digital applications. Describing the implications as “exciting,” Katharine Schwab at Co.Design writes, “The technology would enable a new way of thinking about how we interact with computers, one that doesn’t require a screen but that still preserves the privacy of our thoughts.”

The Guardian

AlterEgo, a device developed by Media Lab graduate student Arnav Kapur, “can transcribe words that wearers verbalise internally but do not say out loud, using electrodes attached to the skin,” writes Samuel Gibbs of The Guardian. “Kapur and team are currently working on collecting data to improve recognition and widen the number of words AlterEgo can detect.”

Popular Science

Researchers at the Media Lab have developed a device, known as “AlterEgo,” which allows an individual to discreetly query the internet and control devices by using a headset “where a handful of electrodes pick up the minuscule electrical signals generated by the subtle internal muscle motions that occur when you silently talk to yourself,” writes Rob Verger for Popular Science.

New Scientist

A new headset developed by graduate student Arnav Kapur reads the small muscle movements in the face that occur when the wearer thinks about speaking, and then uses “artificial intelligence algorithms to decipher their meaning,” writes Chelsea Whyte for New Scientist. Known as AlterEgo, the device “is directly linked to a program that can query Google and then speak the answers.”

The Guardian

In a forthcoming book excerpted in The Guardian, Alex Beard describes Prof. Deb Roy's project to record his infant son's learning behaviors. Beard explains that while Roy set out to create machines that learned like humans, he was ultimately blown away by “the incredible sophistication of what a language learner in the flesh actually looks like and does.” “The learning process wasn’t decoding, as he had originally thought, but something infinitely more continuous, complex and social.”

WGBH

A recent study from Media Lab graduate student Joy Buolamwini addresses errors in facial recognition software that create concern for civil liberties. “If programmers are training artificial intelligence on a set of images primarily made up of white male faces, their systems will reflect that bias,” writes Cristina Quinn for WGBH.

Quartz

In a new working paper, Prof. Daron Acemoglu and his co-author argue that the rise in automation is linked to the aging of the blue-collar population. “The study shows that workers feeling the brunt of automation in lost jobs and lower wages are between the ages of 36 and 55. Those findings should make it easier for policy makers to track down the most affected workers—and help them survive the robot rush,” writes Ana Campoy for Quartz.

Mashable

Mashable highlights a robotic system, developed by researchers at MIT and Princeton, that can pick up, recognize, and place assorted objects. The researchers created an algorithm that allows the robot to “grab and sort objects (such as medicine bottles) into bins, making it a potential timesaver for medical experts.”

Financial Times

In an article for Financial Times, CSAIL Director Daniela Rus explains why humans should collaborate rather than compete with AI. “Technology and people do not have to be in competition,” writes Rus. “Collaborating with AI systems, we can augment and amplify many aspects of work and life.”

Quartz

Lecturer Luis Perez-Breva writes for Quartz about why most retail corporations’ definition of AI is flawed. “'AI' is at its best when we program it to address problems that are hard for humans; when not used to upskill humans, however, all it does is shift work from employees to customers,” Perez-Breva writes.

Xinhuanet

AI leader SenseTime is the first company to join the MIT Intelligence Quest since its launch, writes Xinhua editor Xiang Bo. “As the largest provider of AI algorithms in China, we are very excited to work with MIT to lead global AI research into the next frontier,” said Xu Li, CEO of SenseTime.

Financial Times

A video from Financial Times highlights work being done by CSAIL to develop robot teams. Prof. Daniela Rus discusses how partnering robots has the potential to “form much more adaptive and complex systems that will be able to take on a wider set of tasks.”