Topic

Machine learning


TechCrunch

TechCrunch writer Devin Coldewey reports on ReSkin, an AI project focused on developing a new electronic skin and fingertip meant to expand robots’ sense of touch. The ReSkin project is rooted in GelSight, a technology developed by MIT researchers that allows robots to gauge an object’s hardness.

Axios

Axios reporter Alison Snyder writes that a new study by MIT researchers demonstrates how AI algorithms could provide insight into the human brain’s processing abilities. “Predicting the next word someone might say — like AI algorithms now do when you search the internet or text a friend — may be a key part of the human brain's ability to process language,” writes Snyder.

Scientific American

Using an integrative modeling technique, MIT researchers compared dozens of machine learning algorithms to brain scans as part of an effort to better understand how the brain processes language. The researchers found that “neural networks and computational science might, in fact, be critical tools in providing insight into the great mystery of how the brain processes information of all kinds,” writes Anna Blaustein for Scientific American.

Gizmodo

Gizmodo reporter Andrew Liszewski writes that MIT researchers “used a high-resolution video camera with excellent low-light performance (the amount of sensor noise has to be as minimal as possible) to capture enough footage of a blank wall that special processing techniques were able to not only see the shadow’s movements, but extrapolate who was creating them.”

TechCrunch

TechCrunch reporter Brian Heater spotlights RFusion, a fully integrated robotic arm developed by MIT researchers that “uses an RF antenna and camera mounted to an arm to find lost objects.”

Scientific American

Scientific American reporter Sophie Bushwick writes that MIT researchers have developed a new system that can interpret shadows that are invisible to the human eye. “The system can automatically analyze footage of a blank wall in any room in real time, determining the number of people and their actions,” writes Bushwick.

Axios

Axios reporter Marisa Fernandez writes that researchers from MIT and Wilson Labs will be analyzing data from seven organ procurement organizations as part of an effort to better understand the American organ procurement system. "Working with this data is a first step towards making better decisions about how to save more lives through organ procurement and transplantation,” says Prof. Marzyeh Ghassemi. “We have an opportunity to use machine learning to understand potential issues and lead improvements in transparency and equity.”

The Wall Street Journal

Wall Street Journal reporter Sara Castellanos spotlights Prof. Markus Buehler’s work combining virtual reality with sound waves to help detect subtle changes in molecular motions. Castellanos notes that Buehler and his team recently found that “coronaviruses can be more lethal or infectious depending on the vibrations within the spike proteins that are found on the surface of the virus.”

Boston Globe

Boston Globe reporter Hiawatha Bray spotlights Venti Technologies, an MIT startup developing self-driving cargo trucks for seaports. “The trucks can automatically transport containers to dockside, where cranes can load them onto ships,” writes Bray. “Or they can pick up containers as they’re unloaded, and move them to staging areas where they can be transferred to other ships.”

VICE

Vice reporter Radhamely De Leon spotlights how researchers from MIT and Carnegie Mellon University have created “a search engine tool that shows what Google search results appear in different countries or languages, highlighting key differences in the algorithm between regions.”

Economist

Graduate student Shashank Srikant speaks with The Economist about his work developing a new model that can detect computer bugs and vulnerabilities that have been maliciously inserted into computer code.

Clinical OMICs

Koch Institute fellow Dr. Rameen Shakur and his colleagues have developed a new computer tool that could allow doctors to personalize treatments for patients with inherited heart disease. “In areas such as cardiology and oncology, where large amounts of clinical and genetic data need to be analyzed, adopting a computer-based approach…can make diagnosis, outcome prediction and treatment more effective and efficient,” writes Helen Albert for Clinical OMICs.

Mashable

Mashable spotlights Strolling Cities, a video project from the MIT-IBM Watson AI Lab, which uses AI to allow users to imagine what different words would look like as a location. “Unlike other image-generating AI systems, Strolling Cities creates fictional cities every time,” Mashable notes.

STAT

A recent review by MIT researchers finds that “only about 23% of machine learning studies in health care used multiple datasets to establish their results, compared to 80% in the adjacent field of computer vision, and 58% in natural language processing,” writes Casey Ross for STAT. “If the performance results are not reproduced in clinical care to the standard that was used during [a study], then we risk approving algorithms that we can’t trust,” says graduate student Matthew McDermott. “They may actually end up worsening patient care.”

Times Higher Education

Times Higher Education reporter Simon Baker writes that Media Lab researchers have developed a new machine learning model that can predict which research studies will have the highest impact. The tool has the potential to “aid funders and research evaluators in making better decisions and avoiding the kind of biases and gaming that occurred with simpler metric assessments.”