
Topic: Computer Science and Artificial Intelligence Laboratory (CSAIL)


Displaying 601 - 615 of 759 news clips related to this topic.

TechCrunch

TechCrunch reporter Brian Heater spotlights a new device developed by MIT researchers that can wirelessly monitor sleep. “Thanks to new AI technology, the system is now able to translate subtle movement into meaningful information about the subject’s sleep patterns, including sleep stages (light/deep/R.E.M.), movement and breathing rate.”

Boston Globe

MIT researchers have developed a new device that can track sleep patterns using radio waves, reports Alyssa Meyers for The Boston Globe. The researchers plan to “use the device to study how Parkinson’s disease affects sleep,” Meyers explains, adding that it could also be helpful with “studying Alzheimer’s disease, insomnia, sleep apnea, and epilepsy.”

Wired

Wired reporter Liz Stinson writes that researchers from MIT and Google have developed a new algorithm that can automatically retouch images on a mobile phone. “The neural network identifies exactly how to make it look better—increase contrast a smidge, tone down brightness, whatever—and apply the changes in under 20 milliseconds,” Stinson explains. 

NPR

CSAIL researchers have developed an artificial neural network that generates recipes from pictures of food, reports Laurel Dalrymple for NPR. The researchers input recipes into an AI system, which learned patterns and “connections between the ingredients in the recipes and the photos of food,” explains Dalrymple.

Wired

A team of researchers from MIT and Princeton participating in the Amazon Robotics Challenge are using GelSight technology to give robots a sense of touch, reports Tom Simonite for Wired. Simonite explains that the “rubbery membranes on the robot’s fingers are tracked from the inside by tiny cameras as they are deformed by objects it touches.”

USA Today

In this video for USA Today, Sean Dowling highlights Pic2Recipe, the artificial intelligence system developed by CSAIL researchers that can predict recipes based on images of food. The researchers hope the app could one day be used to help “people track daily nutrition by seeing what’s in their food.”

BBC News

Researchers at MIT have developed an algorithm that can identify recipes based on a photo, writes BBC News reporter Zoe Kleinman. The algorithm, which was trained using a database of over one million photos, could be developed to show “how a food is prepared and could also be adapted to provide nutritional information,” writes Kleinman.

New Scientist

MIT researchers have developed a new machine learning algorithm that can look at photos of food and suggest a recipe to create the pictured dish, reports Matt Reynolds for New Scientist. Reynolds explains that “eventually people could use an improved version of the algorithm to help them track their diet throughout the day.”

Wired

CSAIL researchers have trained an AI system to look at images of food, predict the ingredients used, and even suggest recipes, writes Matt Burgess for Wired. The system could also analyze meals to determine their nutritional value or “manipulate an existing recipe to be healthier or to conform to certain dietary restrictions,” explains graduate student Nick Hynes.

CBS News

CBS This Morning’s Dana Jacobson explores how MIT researchers are developing technology to enable robots to assist with disaster response, including a robotic cheetah and a system that 3-D prints robots. Prof. Russ Tedrake says that “there's a natural transition from the robots in the labs now into the robots doing meaningful work.”

BBC News

CSAIL researchers have developed drones that can drive and fly through a city-like setting, reports Gareth Mitchell for BBC News. The goal for this research is to have the vehicles “coordinate with each other and make intelligent decisions when they fly and drive,” says graduate student Brandon Araki. 

Fox News

FOX News reporter Grace Williams writes that MIT researchers have developed a new system to assist people with visual impairments in navigating their surroundings. “We wanted to primarily complement the white cane to allow users with visual impairments to quickly assess their environment in a contactless manner,” explains graduate student Robert Katzschmann. 

New Scientist

New Scientist reporter Evelyn Wang writes that a study by MIT researchers finds that “the question of whether a scrambled Rubik’s cube of any size can be solved in a given number of moves is what’s called NP-complete – that’s maths lingo for a problem even mathematicians find hard to solve.”

BBC News

Prof. Daniela Rus and graduate student Robert Katzschmann speak with BBC reporter Gareth Mitchell about the device they developed to help the visually impaired navigate. Rus explains that they applied the technologies used for autonomous driving to develop a system that can “guide a visually impaired person in the same way a suite of sensors can guide a self-driving car.”

Wired

Wired reporter Tom Simonite highlights how Prof. Daniela Rus is developing technology that enables a car’s computer to take control from human drivers to help prevent accidents. “Eventually everyone will get to autonomy, but the technology’s not ready yet,” explains Rus. “This is an intermediate step we can take to make driving safer in the meantime.”