Learning to think critically about machine learning

A multidisciplinary team of graduate students helps infuse ethical computing content into MIT’s largest machine learning course.

Caption:
A multidisciplinary group of graduate students led an effort to infuse one of MIT’s largest machine learning courses with material related to ethical computing, data and model bias, and fairness in machine learning, as part of the Social and Ethical Responsibilities of Computing initiative. Clockwise from top left: Marion Boulicault, Dheekshita Kumar, Serena Booth, and Rodrigo Ochigame.
Credits:
Image: MIT News. Photos courtesy of participants. Background photo by Christopher Harting. Boulicault photo by Jon Sachs. Ochigame photo by Gretchen Ertl.

Students in the MIT course 6.036 (Introduction to Machine Learning) study the principles behind powerful models that help physicians diagnose disease or aid recruiters in screening job candidates.

Now, thanks to the Social and Ethical Responsibilities of Computing (SERC) framework, these students will also stop to ponder the implications of these artificial intelligence tools, which sometimes come with their share of unintended consequences.

Last winter, a team of SERC Scholars worked with instructor Leslie Kaelbling, the Panasonic Professor of Computer Science and Engineering, and the 6.036 teaching assistants to infuse weekly labs with material covering ethical computing, data and model bias, and fairness in machine learning. SERC Scholars collaborate in multidisciplinary teams to help postdocs and faculty develop new course material; the 6.036 effort was initiated in the fall of 2019 by Jacob Andreas, the X Consortium Assistant Professor in the Department of Electrical Engineering and Computer Science.

Because 6.036 is such a large course, the more than 500 students enrolled in the 2021 spring term grappled with these ethical dimensions alongside their efforts to learn new computing techniques. For some, it may have been their first experience thinking critically in an academic setting about the potential negative impacts of machine learning.

The SERC Scholars evaluated each lab to develop concrete examples and ethics-related questions to fit that week’s material. Each brought a different toolset. Serena Booth is a graduate student in the Interactive Robotics Group of the Computer Science and Artificial Intelligence Laboratory (CSAIL). Marion Boulicault was a graduate student in the Department of Linguistics and Philosophy, and is now a postdoc in the MIT Schwarzman College of Computing, where SERC is based. And Rodrigo Ochigame was a graduate student in the Program in History, Anthropology, and Science, Technology, and Society (HASTS) and is now an assistant professor at Leiden University in the Netherlands. They collaborated closely with teaching assistant Dheekshita Kumar, MEng ’21, who was instrumental in developing the course materials.

They brainstormed and iterated on each lab, while working closely with the teaching assistants to ensure the content fit and would advance the core learning objectives of the course. At the same time, they helped the teaching assistants determine the best way to present the material and lead conversations on topics with social implications, such as race, gender, and surveillance.

“In a class like 6.036, we are dealing with 500 people who are not there to learn about ethics. They think they are there to learn the nuts and bolts of machine learning, like loss functions, activation functions, and things like that. We have this challenge of trying to get those students to really participate in these discussions in a very active and engaged way. We did that by tying the social questions very intimately with the technical content,” Booth says.

For instance, in a lab on how to represent input features for a machine learning model, they introduced different definitions of fairness, asked students to consider the pros and cons of each definition, and then challenged them to think about which features should be input into a model to make it fair.
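The published labs build these exercises around the course's own material; purely as an illustrative sketch (the functions and toy data below are hypothetical and not drawn from the 6.036 labs), two commonly discussed fairness definitions, demographic parity and equal opportunity, can be contrasted in a few lines of Python:

```python
# A minimal, hypothetical sketch contrasting two common fairness definitions
# on toy binary predictions; none of this is taken from the 6.036 labs.

def demographic_parity_gap(y_pred, group):
    """Gap in positive-prediction rates between group 0 and group 1."""
    def rate(g):
        return sum(p for p, grp in zip(y_pred, group) if grp == g) / group.count(g)
    return abs(rate(0) - rate(1))

def equal_opportunity_gap(y_true, y_pred, group):
    """Gap in true-positive rates (among truly positive examples) between groups."""
    def tpr(g):
        preds = [p for p, t, grp in zip(y_pred, y_true, group) if grp == g and t == 1]
        return sum(preds) / len(preds)
    return abs(tpr(0) - tpr(1))

# Toy data: model predictions, true labels, and a binary group attribute.
y_pred = [1, 0, 1, 1, 0, 1, 0, 0]
y_true = [1, 0, 1, 0, 0, 1, 1, 0]
group  = [0, 0, 0, 0, 1, 1, 1, 1]

print(demographic_parity_gap(y_pred, group))         # 0.5: group 0 receives more positive predictions
print(equal_opportunity_gap(y_true, y_pred, group))  # 0.5: qualified members of group 1 fare worse
```

A classifier can satisfy one definition while violating the other, which is the kind of trade-off students were asked to weigh.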

Four labs have now been published on MIT OpenCourseWare. A new team of SERC Scholars is revising the other eight based on feedback from instructors and students, focusing on sharpening learning objectives, filling in gaps, and highlighting important concepts.

An intentional approach

The students’ efforts on 6.036 show how SERC aims to collaborate with faculty in ways that suit them, says Julie Shah, associate dean of SERC and professor of aeronautics and astronautics. The team adapted the SERC process to the unusual scale of the course and its tight time constraints.

SERC was established more than two years ago through the MIT Schwarzman College of Computing as an intentional approach to bringing faculty from divergent disciplines together in a collaborative setting to co-create and launch new course material focused on socially and ethically responsible computing.

Each semester, the SERC team invites about a dozen faculty members to join an Action Group dedicated to developing new curricular materials (there are several SERC Action Groups, each with a different mission). They are purposeful in whom they invite, and seek to include faculty members who will likely form fruitful partnerships in smaller subgroups, says David Kaiser, associate dean of SERC, the Germeshausen Professor of the History of Science, and professor of physics.

These subgroups of two or three faculty members hone their shared interest over the course of the term to develop new ethics-related material. But rather than one discipline serving another, the process is a two-way street; every faculty member brings new material back to their course, Shah explains. Faculty are drawn to the Action Groups from all of MIT’s five schools.

“Part of this involves going outside your normal disciplinary boundaries and building a language, and then trusting and collaborating with someone new outside of your normal circles. That’s why I think our intentional approach has been so successful. It is good to pilot materials and bring new things back to your course, but building relationships is the core. That makes this something valuable for everybody,” she says.

Making an impact

Over the past two years, Shah and Kaiser have been impressed by the energy and enthusiasm surrounding these efforts.

They have worked with about 80 faculty members since the program started, and more than 2,100 students took courses that included new SERC content in the last year alone. Those students aren’t all necessarily engineers — about 500 were exposed to SERC content through courses offered in the School of Humanities, Arts, and Social Sciences, the Sloan School of Management, and the School of Architecture and Planning.

Central to SERC is the principle that ethics and social responsibility in computing should be integrated into all areas of teaching at MIT, so it becomes just as relevant as the technical parts of the curriculum, Shah says. Technology, and AI in particular, now touches nearly every industry, so students in all disciplines should have training that helps them understand these tools, and think deeply about their power and pitfalls.

“It is not someone else’s job to figure out the why or what happens when things go wrong. It is all of our responsibility and we can all be equipped to do it. Let’s get used to that. Let’s build up that muscle of being able to pause and ask those tough questions, even if we can’t identify a single answer at the end of a problem set,” Kaiser says.

For the three SERC Scholars, it was uniquely challenging to carefully craft ethical questions when there was no answer key to refer to. But thinking deeply about such thorny problems also helped Booth, Boulicault, and Ochigame learn, grow, and see the world through the lens of other disciplines.

They are hopeful the undergraduates and teaching assistants in 6.036 take these important lessons to heart, and into their future careers.

“I was inspired and energized by this process, and I learned so much, not just the technical material, but also what you can achieve when you collaborate across disciplines. Just the scale of this effort felt exciting. If we have this cohort of 500 students who go out into the world with a better understanding of how to think about these sorts of problems, I feel like we could really make a difference,” Boulicault says.
