
Topic: Social media

Displaying 31 - 45 of 151 news clips related to this topic.

The Wall Street Journal

Writing for The Wall Street Journal, Prof. Sherry Turkle argues that fixing social media platforms will also require changing how we approach empathy and disagreement. “We lose out when we don’t take the time to listen to each other, especially to those who are not like us,” writes Turkle. “We need to learn again to tolerate difference and disagreement.”

Boston Globe

A study by MIT researchers finds that crowdsourced fact-checking of news stories by laypeople tends to be just as effective as professional fact-checking, writes David Scharfenberg for The Boston Globe. The researchers found that “even when pooling a relatively small number of laypeople’s evaluations, the correlation between laypeople’s and fact-checkers’ evaluations was about the same as the correlation among the fact-checkers’.”

Mashable

Mashable reporter Matt Binder writes that a new study by MIT researchers finds that crowdsourced fact-checking of news stories can be as effective as using professional fact-checkers. “The study is positive news in the sense that everyday newsreaders appear to be able to, mostly, suss out misinformation,” writes Binder.

Salon

Salon reporter Amanda Marcotte spotlights a study by MIT researchers that finds correcting misinformation on social media platforms often leads to people sharing more misinformation. Research affiliate Mohsen Mosleh explains that after being corrected, Twitter users “retweeted news that was significantly lower in quality and higher in partisan slant, and their retweets contained more toxic language.”

Fast Company

Fast Company reporter Mark Wilson writes that a new study by researchers from MIT and Google finds that simple user experience interventions can help stop people from sharing misinformation on Covid-19. “Researchers introduced several different prompts through a simple popup window, all with a single goal: to get people to think about the accuracy of what they’re about to share,” writes Wilson. “When primed to consider a story’s accuracy, people were up to 20% less likely to share a piece of fake news.”

Fast Company

Fast Company reporter Arianne Cohen writes that a new study by MIT researchers explores how polite corrections to online misinformation can lead to further sharing of incorrect information. The researchers found that after being politely corrected for sharing inaccurate information, “tweeters’ accuracy declined further—and even more so when they were corrected by someone matching their political leanings.”

Boston Globe

A new study by MIT researchers finds that attempting to correct misinformation on social media can lead to users sharing even less accurate information, reports Hiawatha Bray for The Boston Globe. “Being publicly corrected by another person makes them less attentive to what they retweet,” explains Prof. David Rand, “because it shifts their attention not to accuracy but toward social things like being embarrassed.”

Motherboard

A new study by MIT researchers finds that correcting people who were spreading misinformation on Twitter led to people retweeting and sharing even more misinformation, reports Matthew Gault for Motherboard. Prof. David Rand explains that the research is aimed at identifying “what kinds of interventions increase versus decrease the quality of news people share. There is no question that social media has changed the way people interact. But understanding how exactly it's changed things is really difficult.” 

Slate

Graduate student Crystal Lee speaks with Slate reporter Rebecca Onion about a new study that illustrates how social media users have used data visualizations to argue against public health measures during the Covid-19 pandemic. “The biggest point of diversion is the focus on different metrics—on deaths, rather than cases,” says Lee. “They focus on a very small slice of the data. And even then, they contest metrics in ways I think are fundamentally misleading.”

The New Yorker

New Yorker reporter Benjamin Wallace-Wells spotlights new research from the MIT Initiative on the Digital Economy, which shows “just telling people the accurate immunization rates in their country increased, by five per cent, the number who said that they would get the vaccine.”

Fox News

A new study by MIT researchers finds that political beliefs can help bring people together on social media networks, reports Brooke Crothers for Fox News. On both sides, users were roughly three times more likely to form social ties with strangers who identify with the same party, compared to “counter-partisans.”

The Washington Post

In an opinion piece for The Washington Post, Prof. Sinan Aral discusses the rise of GameStop stock and the real-world impact of social media. “The past week’s events exposed several potential sources of economic instability,” writes Aral. “If the social media crowd’s opinion alone drives market value, the market goes where the herd takes it, without the constraints of economic reality.”

TechCrunch

TechCrunch reporter Devin Coldewey writes that a new study co-authored by MIT researchers finds that debunking misinformation is the most effective method of addressing false news on social media platforms. “The team speculated as to the cause of this, suggesting that it fits with other indications that people are more likely to incorporate feedback into a preexisting judgment rather than alter that judgment as it’s being formed,” writes Coldewey.

Forbes

Forbes contributor Wayne Rush spotlights Prof. David Rand’s research examining how to most effectively combat the spread of misinformation. “They forget to think about whether it’s true, but rather how many likes they’ll get,” says Rand of why people share misinformation on social media. “Another feature of social media is that people are more likely to be friends with people who share common ideas.”

Fortune

Prof. Sinan Aral speaks with Fortune reporter Danielle Abril about how social media companies can more effectively respond to misinformation and hate speech, following the attack on the U.S. Capitol. “This has been a steady momentum build of reaction by social media platforms,” says Aral. “This is a culmination of an understanding of social media companies that they need to do more [and] that the laissez-faire attitude isn’t going to cut it.”