
Topic

Equity and inclusion


Co.Design

Recent research from graduate student Joy Buolamwini shows that facial recognition programs, which are increasingly being used by law enforcement, are failing to identify non-white faces. “When these systems can’t recognize darker faces with as much accuracy as lighter faces, there’s a higher likelihood that innocent people will be targeted by law enforcement,” writes Katharine Schwab for Co.Design.

Gizmodo

Writing for Gizmodo, Sidney Fussell explains that a new Media Lab study finds facial-recognition software is most accurate when identifying men with lighter skin and least accurate for women with darker skin. The software analyzed by graduate student Joy Buolamwini “misidentified the gender of dark-skinned females 35 percent of the time,” explains Fussell.

Quartz

A study co-authored by MIT graduate student Joy Buolamwini finds that facial-recognition software is less accurate when identifying darker skin tones, especially those of women, writes Josh Horwitz of Quartz. According to the study, these errors could cause AI services to “treat individuals differently based on factors such as skin color or gender,” explains Horwitz.

The New York Times

Steve Lohr writes for the New York Times about graduate student Joy Buolamwini’s findings on the biases of artificial intelligence in facial recognition. “You can’t have ethical A.I. that’s not inclusive,” Buolamwini said. “And whoever is creating the technology is setting the standards.”

NPR

Graduate student Joy Buolamwini is featured on NPR’s TED Radio Hour explaining the racial bias of facial recognition software and how these problems can be rectified. “The minimum thing we can do is actually check for the performance of these systems across groups that we already know have historically been disenfranchised,” says Buolamwini.

Smithsonian Magazine

In an article co-written for Smithsonian, Prof. John Van Reenen writes about an analysis he and his colleagues conducted examining how socioeconomic background, race and gender can impact a child’s chances of becoming an inventor. The researchers found that “young people’s exposure to innovators may be an important way to reduce these disparities and increase the number of inventors.”

Boston Globe

Prof. Junot Díaz speaks with Boston Globe reporter James Sullivan about his new children’s book, “Islandborn.” The book was inspired by two of his godchildren, who asked him to write a book featuring kids that looked like them. Díaz related to their request, noting that as a child, he felt “the world I was immersed in wasn’t represented at all.”

WHDH 7

WHDH speaks with MIT staff member Maia Weinstock, who designed the original concept for the Women of NASA LEGO set. Weinstock explained that she is “really excited to see teachers and parents and kids tell me their stories of how they are going to use the set.”

Boston Globe

Writing for The Boston Globe, MIT graduate student Matthew Claudel argues that innovation efforts should be focused on being more socially inclusive. “Municipalities that foster accessible innovation for livelihoods will reap the benefits of greater livability. It is those places, rather than techno-hubs that prize quick, marketable lifestyle amenities, that will emerge as the smartest cities of the future.”

The Boston Globe

Bryan Marquard of The Boston Globe writes about the legacy of Paul Gray, the 14th president of MIT, who died at 85 and was known for his efforts to increase diversity at MIT. Gray was a “transformative administrator who enrolled at MIT as an electrical engineering student in 1950 and retired in 1997 as chairman of the MIT Corporation, the institute’s governing body,” writes Marquard. 

Bloomberg Businessweek

Bloomberg Businessweek reporter Arianne Cohen profiles graduate student Joy Buolamwini, who founded the Algorithmic Justice League in an effort to make people more aware of the biases embedded in AI systems. “We’re using facial analysis as an exemplar to show how we can include more inclusive training data in the first place,” says Buolamwini of her work.

Guardian

Graduate student Joy Buolamwini speaks with Guardian reporter Ian Tucker about her work fighting algorithmic biases. Buolamwini explains that she is “trying to identify bias, to point out cases where bias can occur so people can know what to look out for, but also develop tools where the creators of systems can check for a bias in their design.”

Boston Globe

Boston Globe reporter Kay Lazar writes about an event held at MIT “aimed at building alliances among diverse groups now also targeted for hate.” The event was organized by the Massachusetts chapter of the Indian American Forum for Political Education. 

BBC News

BBC News reporter Zoe Kleinman writes that graduate student Joy Buolamwini has developed an initiative aimed at tackling algorithmic bias. “If we are limited when it comes to being inclusive that’s going to be reflected in the robots we develop or the tech that’s incorporated within the robots,” says Buolamwini.

Careers & the disABLED

Paul Parravano, co-director of the Office of Government and Community Relations, speaks with Careers & the disABLED about working at MIT. Parravano notes that MIT works “hard to hire people who come from different backgrounds. There’s a clear benefit when you have people from different backgrounds and experiences. That’s how you best solve problems.”