Topic: Social media


Mashable

A new working paper by researchers from MIT and Yale finds that “80 percent of Americans think social media companies should take action to reduce the spread of misinformation,” reports Rachel Kraus for Mashable. “Our data suggests [Musk’s views] are not representative,” says Prof. David Rand. "A lot of people in Silicon Valley have this kind of maybe libertarian, extreme free speech orientation that I don't think is in line with actually how most Americans and social media platform users think about things."

Popular Science

Using machine learning techniques, MIT researchers analyzed social media sentiment around the world during the early days of the Covid-19 pandemic and found that the “pandemic precipitated a dramatic drop in happiness,” reports Charlotte Hu for Popular Science. “We wanted to do this global study to compare different countries because they were hit by the pandemic at different times,” explains Prof. Siqi Zheng, “and they have different cultures, different political systems, and different healthcare systems.”

The Washington Post

Writing for The Washington Post, Prof. Sinan Aral explores the information war underway over traditional and social media about the Russian invasion of Ukraine. “While it is hard to pinpoint the extent to which the information war is contributing to the overwhelming international unity against Putin’s aggression,” writes Aral, “one thing is clear: Social media, mainstream media and the narrative framing of the invasion of Ukraine undoubtedly will play an important role in how this conflict ends.”

The Boston Globe

Research fellow Maham Javaid writes for The Boston Globe about the role TikTok has played in the Russian invasion of Ukraine. “TikTok is undoubtedly playing multiple roles in this war,” writes Javaid. “One of which is that the war and its accompanying acts of brutality are being documented and disseminated across the world.”

STAT

STAT reporter Faye Flam spotlights research from Prof. David Rand, University of Regina Prof. Gordon Pennycook and their colleagues that shows people “really want to share accurate information but give in to the temptation to share juicy bits of gossip they think will please their friends or that make them look good.”

Bloomberg

Prof. David Rand and Prof. Gordon Pennycook of the University of Regina in Canada found that people improved the accuracy of their social media posts when asked to rate the accuracy of the headline first, reports Faye Flam for Bloomberg. “It’s not necessarily that [users] don’t care about accuracy. But instead, it’s that the social media context just distracts them, and they forget to think about whether it’s accurate or not before they decide to share it,” says Rand.

Wired

Writing for Wired, Prof. Nicholas De Monchaux compares the clear division between digital and physical reality presented in The Matrix films with life in real cities where the physical and virtual worlds are increasingly merging. “This new world is inhabited by our digital shadows,” writes De Monchaux. “They follow our steps in the real one and are born from the data trail we leave when we post on social media, search on Google Maps, order things from Amazon, or leave reviews on restaurant sites.”

Gizmodo

Gizmodo reporter Shoshana Wodinsky spotlights a new study by MIT researchers that finds videos are not likely to sway public political opinion more than their textual counterparts. “It’s possible that as you’re scrolling through your newsfeed, video captures your attention more than text would,” says Prof. David Rand. “You might be more likely to look at it. This doesn’t mean that the video is inherently more persuasive than text – just that it has the potential to reach a wider audience.”

The Wall Street Journal

Writing for The Wall Street Journal, Prof. Sherry Turkle argues that fixing social media platforms will also require rethinking how we approach empathy and disagreement. “We lose out when we don’t take the time to listen to each other, especially to those who are not like us,” writes Turkle. “We need to learn again to tolerate difference and disagreement.”

The Boston Globe

A study by MIT researchers finds that crowdsourced fact-checking of news stories by laypeople can be just as effective as professional fact-checking, writes David Scharfenberg for The Boston Globe. The researchers found that “even when pooling a relatively small number of laypeople’s evaluations, the correlation between laypeople’s and fact-checkers’ evaluations was about the same as the correlation among the fact-checkers’.”

Mashable

Mashable reporter Matt Binder writes that a new study by MIT researchers finds that crowdsourced fact-checking of news stories can be as effective as using professional fact-checkers. “The study is positive news in the sense that everyday newsreaders appear to be able to, mostly, suss out misinformation,” writes Binder.

Salon

Salon reporter Amanda Marcotte spotlights a study by MIT researchers that finds correcting misinformation on social media platforms often leads to people sharing more misinformation. Research affiliate Mohsen Mosleh explains that after being corrected, Twitter users “retweeted news that was significantly lower in quality and higher in partisan slant, and their retweets contained more toxic language.”

Fast Company

Fast Company reporter Mark Wilson writes that a new study by researchers from MIT and Google finds that simple user experience interventions can help stop people from sharing misinformation on Covid-19. “Researchers introduced several different prompts through a simple popup window, all with a single goal: to get people to think about the accuracy of what they’re about to share,” writes Wilson. “When primed to consider a story’s accuracy, people were up to 20% less likely to share a piece of fake news.”

Fast Company

Fast Company reporter Arianne Cohen writes that a new study by MIT researchers explores how polite corrections to online misinformation can lead to further sharing of incorrect information. The researchers found that after being politely corrected for sharing inaccurate information, “tweeters’ accuracy declined further—and even more so when they were corrected by someone matching their political leanings.”

The Boston Globe

A new study by MIT researchers finds that attempting to correct misinformation on social media can lead to users sharing even less accurate information, reports Hiawatha Bray for The Boston Globe. “Being publicly corrected by another person makes them less attentive to what they retweet,” explains Prof. David Rand, “because it shifts their attention not to accuracy but toward social things like being embarrassed.”