


NOVA

NOVA program host Alok Patel explores the benefits of sharing personal data without compromising personal privacy. He visits Prof. Ramesh Raskar’s Camera Culture Lab, where Raskar discusses the concept of “smashing data,” the process of extracting useful information while obscuring private data. “Achieving privacy and benefits of the data is available, and it’s just a matter of convincing large companies to play along those rules,” said Raskar.

Interesting Engineering

MIT researchers have developed a machine-learning accelerator chip to make health-monitoring apps more secure, reports Aman Tripathi for Interesting Engineering. “The researchers subjected this new chip to intensive testing, simulating real-world hacking attempts, and the results were impressive,” explains Tripathi. “Even after millions of attempts, they were unable to recover any private information. In contrast, stealing data from an unprotected chip took only a few thousand samples.”

Government Technology

Senior Lecturer Luis Videgaray speaks with Government Technology reporter Nikki Davidson about concerns facing emerging AI programs and initiatives. Videgaray underscores the importance of finding vendors “who are willing to protect the data in a way that is appropriate and also provides the state or local government agency with the required degree of transparency about the workings of the model, the data that was used for training and how that data will interact with the data supplied by the customer.”


Gizmodo

MIT researchers have developed a new technique aimed at protecting images from AI generators, reports Kyle Barr for Gizmodo. The program uses “data poisoning techniques to essentially disturb pixels within an image to create invisible noise, effectively making AI art generators incapable of generating realistic deepfakes based on the photos they’re fed,” writes Barr.
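The idea of bounded, nearly invisible pixel perturbations can be sketched as below. This is a toy illustration with made-up names and parameters (`perturb_pixels`, `epsilon`), not the MIT tool’s actual algorithm, which computes the noise adversarially against a target model rather than randomly:

```python
import random

def perturb_pixels(pixels, epsilon=4, seed=0):
    """Add bounded random noise to 8-bit pixel values.

    Toy stand-in for data poisoning: real image-protection tools
    optimize the perturbation against a target model instead of
    sampling it randomly.
    """
    rng = random.Random(seed)
    return [min(255, max(0, p + rng.randint(-epsilon, epsilon))) for p in pixels]

original = [128] * 16            # a flat gray row of pixels
protected = perturb_pixels(original)
# The change stays within the epsilon budget, so it is visually negligible
assert all(abs(a - b) <= 4 for a, b in zip(original, protected))
```

The key property is that the perturbation is capped per pixel, so humans see essentially the same image while a model’s feature representation can be pushed off target.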


TechCrunch

TechCrunch reporter Kyle Wiggers spotlights DynamoFL, a company founded by Christian Lau PhD ’20 and Vaikkunth Mugunthan PhD ’22 that is building a platform for federated learning, a technique for preserving data privacy in AI systems.
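Federated learning in general can be sketched as a FedAvg-style loop: clients train locally and share only model parameters, never raw data. The toy below uses a one-parameter model and made-up client data; it illustrates the general technique, not DynamoFL’s platform:

```python
# Each client fits a one-parameter model y = w to its own data locally;
# only the fitted parameter, never the raw data, is sent to the server.

def local_update(data, w, lr=0.1, steps=20):
    """A few gradient-descent steps on mean squared error, run on-device."""
    for _ in range(steps):
        grad = sum(2 * (w - x) for x in data) / len(data)
        w -= lr * grad
    return w

client_data = [[1.0, 2.0, 3.0], [10.0, 11.0], [5.0]]  # stays on each device
global_w = 0.0
for _ in range(5):
    updates = [local_update(d, global_w) for d in client_data]
    global_w = sum(updates) / len(updates)  # server averages parameters only

# global_w approaches the average of the clients' local optima
assert abs(global_w - (2.0 + 10.5 + 5.0) / 3) < 0.1
```

The privacy argument is architectural: the server aggregates updates without ever observing any client’s dataset, and production systems typically layer secure aggregation or differential privacy on top.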

The Wall Street Journal

Prof. Emeritus Stuart Madnick and Keri Pearlson, executive director of Cybersecurity at MIT Sloan, write for The Wall Street Journal about the seven actions that corporate leaders can take to ensure that employees contribute to maintaining a secure organization. “New vulnerabilities emerge every day, as malicious cybersecurity actors find fresh ways to attack or infiltrate organizations. Technology can help, but it can only do so much,” write Madnick and Pearlson. “Just as important is a culture where all employees fill in the gaps—by noticing anomalies, questioning things that might look legitimate but are slightly off in some way, or stopping compromised processes that would otherwise proceed.”

The Wall Street Journal

Prof. Emeritus Stuart Madnick writes for The Wall Street Journal about the importance of transparency when companies are impacted by cyberattacks. “It has become clear that through laws and regulations, we need to increase the quantity, quality, and timeliness of cyberattack reporting,” writes Madnick. “Only by having more detailed information on who is getting attacked, how they are getting attacked and what is being stolen can everybody begin to arm themselves with the right defenses.”


Forbes

Zero-knowledge proof (ZKP), a cryptographic method invented by three MIT researchers in 1985, enables one party to prove that a statement is true without revealing the sensitive information behind it, reports Victor Shilo for Forbes. “ZKP has the potential to protect privacy in a wide range of cases,” writes Shilo. “By implementing ZKP, businesses and society can evolve to ‘open data 2.0’ where daily transactions are completed in today’s digital economy but without disclosing unnecessary sensitive information.”
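A classic concrete instance is the Schnorr identification protocol, in which a prover demonstrates knowledge of a secret exponent x with y = g^x mod p without revealing x. The sketch below uses deliberately tiny toy parameters; real deployments use cryptographically large groups:

```python
import random

p, q, g = 23, 11, 2        # toy group: g has prime order q modulo p
x = 7                      # prover's secret
y = pow(g, x, p)           # public value derived from the secret

def schnorr_round(rng):
    """One honest round of the Schnorr identification protocol."""
    r = rng.randrange(q)               # prover's fresh randomness
    t = pow(g, r, p)                   # commitment sent to the verifier
    c = rng.randrange(q)               # verifier's random challenge
    s = (r + c * x) % q                # response; s alone leaks nothing about x
    return pow(g, s, p) == (t * pow(y, c, p)) % p

# An honest prover passes every round; each transcript reveals nothing about x
assert all(schnorr_round(random.Random(i)) for i in range(20))
```

The verification works because g^s = g^(r + c·x) = t·y^c in the group, while the response s is masked by the fresh randomness r, which is what makes the transcript zero-knowledge.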


CNBC

CNBC reporter Dain Evans writes about how researchers from MIT’s Digital Currency Initiative and the Federal Reserve Bank of Boston are exploring what a digital currency might look like in America. “I think that if there is a digital dollar, privacy is going to be a very, very important part of that,” says Neha Narula, director of the Digital Currency Initiative at the MIT Media Lab.

The Boston Globe

Writing for The Boston Globe, Prof. Ramesh Raskar underscores the importance of ensuring that every American has the opportunity to receive the Covid-19 vaccine without cost and without giving up their privacy. “By effectively communicating the privacy benefits of decentralized data collection and anonymized data reporting, mobile apps might diminish barriers to vaccination that exist due to privacy concerns,” writes Raskar.

The Wall Street Journal

MIT researchers have developed a new model that helps quantify a company’s security risk and estimate possible financial losses, reports Catherine Stupp for The Wall Street Journal. The tool “collects encrypted data from companies about recent incidents and analyzes the anonymized information to determine the probability of different kinds of attacks more broadly,” writes Stupp.

Financial Times

In a letter to the Financial Times, graduate student Daniel Aronoff underscores his view that central banks should proceed with caution when considering digital currencies. “The US Fed and other central banks are wise to embark on the research that will keep them at the forefront of knowledge and, possibly, enable them to develop implementations that are safe and beneficial to the citizens of their countries,” writes Aronoff.

Financial Times

The Financial Times has named Prof. Tim Berners-Lee its “Boldness in Business” Person of the Year for his work aimed at providing people with more control over how their personal data is used online, reports John Thornhill. “We know how to fire rockets into the sky. We should be able to build constructive social networks,” says Berners-Lee.

Fast Company

MIT researchers have found that it’s easy to reidentify anonymized data compiled in massive datasets, reports Kelsey Campbell-Dollaghan for Fast Company. The findings show that urban planners, tech companies and designers, “who stand to learn so much from these big urban datasets,” writes Campbell-Dollaghan, “need to be careful about whether all that data could be combined to deanonymize it.”


Xinhua

A new study by MIT researchers provides evidence that compiling massive anonymized datasets of people’s movement patterns can put their private data at risk, reports the Xinhua news agency. The researchers found that “data containing ‘location stamps’ – information with geographical coordinates and time stamps – could be used to easily track the mobility trajectories of how people live and work.”
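The re-identification risk such studies describe rests on the "unicity" of mobility traces: even a few coarse (location, time) stamps often match exactly one person in a dataset. A toy illustration, with entirely made-up traces and place names:

```python
# Anonymized traces: no names, just sets of (place, hour) location stamps.
traces = {
    "user_a": {("mall", 9), ("office", 10), ("gym", 18)},
    "user_b": {("mall", 9), ("office", 10), ("cafe", 18)},
    "user_c": {("home", 9), ("office", 13), ("gym", 18)},
}

def matches(observed):
    """Pseudonyms whose trace contains every observed (place, hour) stamp."""
    return [u for u, t in traces.items() if observed <= t]

# A single stamp is ambiguous...
assert len(matches({("mall", 9)})) == 2
# ...but two stamps already single out one individual.
assert matches({("mall", 9), ("gym", 18)}) == ["user_a"]
```

Scaling this up, the published research found that a handful of outside observations (a tweet with a location, a credit-card purchase) can suffice to pick one trajectory out of millions, which is why removing names alone does not anonymize mobility data.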