
Topic

Machine learning


Displaying 721 - 735 of 793 news clips related to this topic.

Bloomberg

IBM has invested $240 million to develop a new AI research lab with MIT, reports Jing Cao for Bloomberg News. “The MIT-IBM Watson AI Lab will fund projects in four broad areas, including creating better hardware to handle complex computations and figuring out applications of AI in specific industries,” Cao explains. 

CNBC

CNBC reporter Jordan Novet writes that MIT and IBM have established a new lab to pursue fundamental AI research. Novet notes that MIT, “was home to one of the first AI labs and continues to be well regarded as a place to do work in the sector.”

Boston Globe

Boston Globe reporter Andy Rosen writes that MIT and IBM have established a new AI research lab. “It’s amazing that we have a company that’s also interested in the fundamental research,” explains Anantha Chandrakasan, dean of the School of Engineering. “That’s very basic research that may not be in a product next year, but provides very important insights.”

Fortune

Writing for Fortune, Barb Darrow highlights how IBM has committed $240 million to establish a new joint AI lab with MIT. Darrow explains that “the resulting MIT–IBM Watson AI Lab will focus on a handful of key AI areas including the development of new 'deep learning' algorithms.”

New Scientist

New Scientist reporter Matt Reynolds writes that MIT researchers have developed a new system that can determine how much pain a patient is experiencing. “By examining tiny facial expressions and calibrating the system to each person, it provides a level of objectivity in an area where that’s normally hard to come by,” explains Reynolds. 

Newsweek

An algorithm developed by Prof. Iyad Rahwan and graduate student Bjarke Felbo has been trained to detect sarcasm in tweets that use emojis, writes Josh Lowe for Newsweek. After reading over 1 billion tweets with emojis, the algorithm learned to predict “which emoji would be associated with a given tweet based on its emotional tone,” explains Lowe.
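The idea Lowe describes, learning from a large corpus which emoji tends to accompany a tweet, can be caricatured with a tiny word-count model. This is a toy stand-in, not the researchers' neural network, and the training tweets below are invented for illustration:

```python
from collections import Counter, defaultdict

# Invented toy training data: (tweet text, emoji that appeared with it).
TRAIN = [
    ("great job love it", "😍"),
    ("love this so much", "😍"),
    ("yeah sure that went great", "🙄"),   # sarcastic tone
    ("oh sure another delay great", "🙄"),
]

def train(pairs):
    """Count how often each word co-occurs with each emoji."""
    counts = defaultdict(Counter)          # word -> Counter of emojis
    for text, emoji in pairs:
        for word in text.split():
            counts[word][emoji] += 1
    return counts

def predict(counts, text):
    """Predict the emoji whose co-occurrence score with the words is highest."""
    score = Counter()
    for word in text.split():
        score.update(counts.get(word, Counter()))
    return score.most_common(1)[0][0] if score else None

model = train(TRAIN)
print(predict(model, "love love love"))
```

The real system learned far subtler associations (including sarcasm cues) from over a billion tweets; the point here is only the shape of the task: text in, most-associated emoji out.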

Wired

Wired reporter Liz Stinson writes that researchers from MIT and Google have developed a new algorithm that can automatically retouch images on a mobile phone. “The neural network identifies exactly how to make it look better—increase contrast a smidge, tone down brightness, whatever—and apply the changes in under 20 milliseconds,” Stinson explains. 
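The pipeline Stinson describes, predicting a small set of global edits and then applying them cheaply to every pixel, can be sketched with hand-written rules standing in for the learned network. All thresholds and adjustment values here are invented for illustration:

```python
# Toy sketch of the idea, not the MIT/Google model: a "policy" suggests
# simple global adjustments for an image, which are then applied per pixel.
# In the real system a neural network predicts the edits; here the policy
# is a hand-written stand-in. Pixels are grayscale values in [0, 1].

def suggest_edits(pixels):
    """Pretend-predictor: brighten dark images, boost low contrast."""
    mean = sum(pixels) / len(pixels)
    brightness = 0.2 if mean < 0.4 else 0.0
    contrast = 1.3 if (max(pixels) - min(pixels)) < 0.5 else 1.0
    return brightness, contrast

def apply_edits(pixels, brightness, contrast):
    """Scale each pixel about the mean (contrast), then shift it (brightness)."""
    mean = sum(pixels) / len(pixels)
    return [min(1.0, max(0.0, (p - mean) * contrast + mean + brightness))
            for p in pixels]

dark = [0.1, 0.2, 0.3, 0.2]
b, c = suggest_edits(dark)
out = apply_edits(dark, b, c)
print(out)
```

Applying the edits is just cheap per-pixel arithmetic, which is why the described system can run in under 20 milliseconds on a phone once the adjustments are chosen.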

NPR

CSAIL researchers have developed an artificial neural network that generates recipes from pictures of food, reports Laurel Dalrymple for NPR. The researchers input recipes into an AI system, which learned the “connections between the ingredients in the recipes and the photos of food,” explains Dalrymple.

Wired

A team of researchers from MIT and Princeton participating in the Amazon Robotics Challenge are using GelSight technology to give robots a sense of touch, reports Tom Simonite for Wired. Simonite explains that the “rubbery membranes on the robot’s fingers are tracked from the inside by tiny cameras as they are deformed by objects it touches.”
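The sensing principle Simonite describes, a camera watching the membrane deform from the inside, can be reduced to a minimal sketch: treat the observed deformation as a grid of depth values and flag cells that move past a threshold as contact. This is a toy illustration, not the GelSight software, and the threshold and grids are invented:

```python
# Toy contact detection: depth_grid holds per-cell membrane deformation
# (0.0 = undeformed). Cells deformed past the threshold count as touch.
def contact_map(depth_grid, threshold=0.05):
    """Return a grid of booleans: True where the membrane registers contact."""
    return [[d > threshold for d in row] for row in depth_grid]

flat = [[0.0, 0.01], [0.02, 0.0]]        # nothing touching
pressed = [[0.0, 0.12], [0.30, 0.0]]     # an object pressing two cells
print(contact_map(pressed))
```

The actual system recovers much richer information (contact geometry, texture, slip) from the camera images, but the input/output shape is the same: deformation in, a touch signal out.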

USA Today

In this video for USA Today, Sean Dowling highlights Pic2Recipe, the artificial intelligence system developed by CSAIL researchers that can predict recipes based on images of food. The researchers hope the app could one day be used to help “people track daily nutrition by seeing what’s in their food.”

BBC News

Researchers at MIT have developed an algorithm that can identify recipes based on a photo, writes BBC News reporter Zoe Kleinman. The algorithm, which was trained using a database of over one million photos, could be developed to show “how a food is prepared and could also be adapted to provide nutritional information,” writes Kleinman.

New Scientist

MIT researchers have developed a new machine learning algorithm that can look at photos of food and suggest a recipe to create the pictured dish, reports Matt Reynolds for New Scientist. Reynolds explains that “eventually people could use an improved version of the algorithm to help them track their diet throughout the day.”

Wired

CSAIL researchers have trained an AI system to look at images of food, predict the ingredients used, and even suggest recipes, writes Matt Burgess for Wired. The system could also analyze meals to determine their nutritional value or “manipulate an existing recipe to be healthier or to conform to certain dietary restrictions," explains graduate student Nick Hynes.
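One way to picture the step these clips describe, going from predicted ingredients to a suggested recipe, is as a lookup that returns the stored recipe whose ingredient list overlaps the prediction most. This is a toy sketch with invented recipes, not the CSAIL system, which works from a database of over a million real recipes:

```python
# Invented mini recipe database: name -> set of ingredients.
RECIPES = {
    "pancakes": {"flour", "egg", "milk", "butter"},
    "omelette": {"egg", "butter", "cheese"},
    "salad": {"lettuce", "tomato", "olive oil"},
}

def match_recipe(predicted_ingredients, recipes=RECIPES):
    """Return the recipe with the highest Jaccard overlap with the prediction."""
    def jaccard(a, b):
        return len(a & b) / len(a | b)
    return max(recipes, key=lambda name: jaccard(predicted_ingredients,
                                                 recipes[name]))

# Ingredients that an image model might have predicted from a photo:
print(match_recipe({"egg", "cheese"}))   # -> omelette
```

Analyses like the nutritional ones Hynes mentions would then operate on the matched recipe's ingredient list rather than on the photo itself.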

Forbes

Forbes reporter Kevin Murnane writes about how MIT researchers have used a computer vision system to examine how several American cities physically improved or deteriorated over time. Murnane writes that the study “provides important support for nuanced versions of traditional theories about why urban neighborhoods change over time.”

United Press International (UPI)

UPI reporter Amy Wallace writes that MIT researchers have applied a computer vision system to help quantify the physical improvement of American neighborhoods. The researchers found that “density of highly educated residents, proximity to central business districts and other attractive areas, and the initial safety score assigned by the computer system are strongly related to improvements.”