
Topic

Human-computer interaction


Displaying 31 - 45 of 85 news clips related to this topic.

Forbes

Prof. Jacob Andreas explored the concept of language-guided program synthesis at CSAIL’s Imagination in Action event, reports research affiliate John Werner for Forbes. “Language is a tool,” said Andreas during his talk. “Not just for training models, but actually interpreting them and sometimes improving them directly, again, in domains, not just involving languages (or) inputs, but also these kinds of visual domains as well.”

Science

In conversation with Matthew Hutson at Science, Prof. John Horton discusses the possibility of using chatbots in research instead of humans. As he explains, a change like that would be similar to the transition from in-person to online surveys: “People were like, ‘How can you run experiments online? Who are these people?’ And now it’s like, ‘Oh, yeah, of course you do that.’”

Forbes

Researchers from MIT have found that using generative AI chatbots can improve the speed and quality of simple writing tasks, but that the chatbots often lack factual accuracy, reports Richard Nieva for Forbes. “When we first started playing with ChatGPT, it was clear that it was a new breakthrough unlike anything we've seen before,” says graduate student Shakked Noy. “And it was pretty clear that it was going to have some kind of labor market impact.”

The Conversation

Writing for The Conversation, postdoc Ziv Epstein SM ’19, PhD ’23, graduate student Robert Mahari and Jessica Fjeld of Harvard Law School explore how the use of generative AI will impact creative work. “The ways in which existing laws are interpreted or reformed – and whether generative AI is appropriately treated as the tool it is – will have real consequences for the future of creative expression,” the authors note.  

Mashable

Researchers at MIT have developed a drone that can be controlled using hand gestures, reports Mashable. “I think it’s important to think carefully about how machine learning and robotics can help people to have a higher quality of life and be more productive,” says postdoc Joseph DelPreto. “So we want to combine what robots do well and what people do well so that they can be more effective teams.”

National Geographic

National Geographic reporter Maya Wei-Haas explores how the ancient art of origami is being applied to fields such as robotics, medicine and space exploration. Wei-Haas notes that Prof. Daniela Rus and her team developed a robot that can fold to fit inside a pill capsule, while Prof. Erik Demaine has designed complex, curving fold patterns. “You get these really impressive 3D forms with very simple creasing,” says Demaine.

Fortune

MIT researchers have found that “automation is the primary reason the income gap between more and less educated workers has continued to widen,” reports Ellen McGirt for Fortune. “This single one variable…explains 50 to 70% of the changes or variation between group inequality from 1980 to about 2016,” says Prof. Daron Acemoglu.

Politico

Prof. Daron Acemoglu speaks with Politico reporter Derek Robertson about his new study examining the impacts of automation on the workforce and economy. “This discussion gets framed around ‘Will robots and AI destroy jobs, and lead to a jobless future,’ and I think that's the wrong framing,” says Acemoglu. “Industrial robots may have reduced U.S. employment by half a percent, which is not trivial, but nothing on that scale [of a “jobless future”] has happened — but if you look at the inequality implications, it's been massive.”

TechCrunch

TechCrunch reporter Brian Heater spotlights a new study by Prof. Daron Acemoglu that examines the impact of automation on the workforce. “We’re starting with a very clear premise here: in 21st-century America, the wealth gap is big and only getting bigger,” writes Heater. “The paper, ‘Tasks, Automation, and the Rise in U.S. Wage Inequality,’ attempts to explore the correlation between the growing income gap and automation.”

Popular Science

Popular Science reporter Andrew Paul writes that a study co-authored by Institute Prof. Daron Acemoglu examines the impact of automation on the workforce over the past four decades and finds that “‘so-so automation’ exacerbates wage gaps between white and blue collar workers more than almost any other factor.”

Politico

Prof. Cynthia Breazeal discusses her work exploring how artificial intelligence can help students impacted by Covid, including refugees or children with disabilities, reports Ryan Heath for Politico. “We want to be super clear on what the role is of the robot versus the community, of which this robot is a part of. That's part of the ethical design thinking,” says Breazeal. “We don't want to have the robot overstep its responsibilities. All of our data that we collect is protected and encrypted.”

TechCrunch

TechCrunch reporter Brian Heater spotlights new MIT robotics research, including a team of CSAIL researchers “working on a system that utilizes a robotic arm to help people get dressed.” Heater notes that the “issue is one of robotic vision — specifically finding a method to give the system a better view of the human arm it’s working to dress.”

The Economist

Prof. Julie Shah speaks with The Economist about her work developing systems to help robots operate safely and efficiently with humans. “Robots need to see us as more than just an obstacle to maneuver around,” says Shah. “They need to work with us and anticipate what we need.”

The Boston Globe

Assaf Biderman ‘05, associate director of the MIT SENSEable City Lab, discusses his startup Superpedestrian, a transportation robotics company that has developed electric scooters available in over 60 cities across the world. “I think we hit the holy grail of micromobility, which is detecting when you’re on the sidewalk every time and stopping or slowing the vehicle,” said Biderman.

TechCrunch

A new study by MIT researchers finds people are more likely to interact with a smart device if it demonstrates more humanlike attributes, reports Brian Heater for TechCrunch. The researchers found “users are more likely to engage with both the device — and each other — more when it exhibits some form of social cues,” writes Heater. “That can mean something as simple as the face/screen of the device rotating to meet the speaker’s gaze.”