For several years, the Human Dynamics research group at the MIT Media Lab has been using the standard sensors in smartphones to collect data about people’s social interactions, drawing surprising conclusions about the way political opinions, dietary habits and illnesses — among other things — spread through populations.
Now, the group is making its phone-based data-collection system available as a free, open-source download so that other researchers, and people interested in the burgeoning phenomenon of “self-tracking,” can not only use it but also help expand it, incorporating it into other applications or providing it with new features and functions.
“There are a lot of other research groups that are reinventing the wheel,” says Nadav Aharony, a PhD student in the group who led the software’s development. “We felt that we were so advanced in this field that we wanted to share this.”
Aharony; graduate student Wei Pan; Human Dynamics group leader Sandy Pentland, the Toshiba Professor of Media Arts and Sciences; MIT affiliate Cory Ip SM ’11; and Inas Khayal, a visiting scholar from the Masdar Institute in Abu Dhabi, described the system in a paper presented at the UbiComp ubiquitous-computing conference in Beijing in September. Together with Cody Sumter, a master’s student, and Alan Gardner ’05, a software developer whose participation in the project was funded by a grant from Google, the researchers turned the system into a user-friendly software package that was officially released today.
The system, dubbed Funf, has two main components: One is an application called Funf Journal, which runs on phones that use Google’s Android operating system and governs the collection and exportation of sensor data. The other is a set of tools for managing and visualizing that data, which run on a desktop or laptop computer.
Funf Journal provides intuitive checkbox menus that allow users to specify, for instance, how frequently a given phone sensor — an accelerometer, say, or the GPS receiver — will collect data and for how long. Sets of sensor configurations can be saved and loaded when appropriate: A self-tracker might want to perform frequent measurements only during the morning commute, for instance. The desktop application can also broadcast configuration updates to participants in a study, if they’ve granted it access to their phones.
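To make this concrete, here is a minimal sketch, in Java, of how a per-probe schedule of the kind set through those menus might be modeled. The class and field names are illustrative assumptions only, not Funf’s actual configuration format.

// Illustrative sketch only: a simplified stand-in for the per-probe schedules
// that Funf Journal exposes through its checkbox menus. Class and field names
// are hypothetical; this is not Funf's real configuration format.
import java.time.Duration;
import java.time.LocalTime;

public class ProbeSchedule {
    final String probeName;     // e.g. "accelerometer" or "gps"
    final Duration period;      // how often to start a sampling burst
    final Duration duration;    // how long each burst lasts
    final LocalTime activeFrom; // optional daily window, e.g. the morning commute
    final LocalTime activeTo;

    ProbeSchedule(String probeName, Duration period, Duration duration,
                  LocalTime activeFrom, LocalTime activeTo) {
        this.probeName = probeName;
        this.period = period;
        this.duration = duration;
        this.activeFrom = activeFrom;
        this.activeTo = activeTo;
    }

    // True if the probe should be sampling at the given time of day.
    boolean isActiveAt(LocalTime now) {
        return !now.isBefore(activeFrom) && now.isBefore(activeTo);
    }

    public static void main(String[] args) {
        // Sample the accelerometer for 10 seconds every 2 minutes,
        // but only during a 7-9 a.m. commute window.
        ProbeSchedule commute = new ProbeSchedule("accelerometer",
                Duration.ofMinutes(2), Duration.ofSeconds(10),
                LocalTime.of(7, 0), LocalTime.of(9, 0));
        System.out.println(commute.isActiveAt(LocalTime.of(8, 15))); // prints true
    }
}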
Chain reaction
In the Funf framework, the controller for each sensor is known as a “probe.” But since the raw data generated by phone sensors can be difficult for novices to interpret — and time-consuming even for experts — Funf Journal comes with a number of higher-level probes that can look for patterns in the sensor data. The “activity monitor” probe, for instance, can distinguish the accelerometer data typical of, say, a phone on the hip of someone being jostled on a subway train from the data produced when the same person is walking briskly or climbing stairs. It can thus provide a single numerical score for the user’s physical activity over any specified time span. Each of the higher-level probes is configurable through the same type of menu that governs the sensors themselves.
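As a rough illustration of what such a higher-level probe does, the sketch below reduces a window of raw accelerometer readings to a single score by averaging how much the signal changes from sample to sample. It conveys the idea only; it is not the group’s actual activity-recognition algorithm.

// Minimal sketch of the idea behind a higher-level "activity" probe:
// collapse a window of raw accelerometer readings into one activity score.
// This illustrates the concept, not Funf's actual activity monitor.
import java.util.List;

public class ActivityScore {
    // Mean magnitude of sample-to-sample acceleration change; larger values
    // suggest vigorous movement (stair climbing) rather than small jostles.
    static double score(List<double[]> samples) { // each sample is {x, y, z}
        double total = 0;
        for (int i = 1; i < samples.size(); i++) {
            double[] a = samples.get(i - 1), b = samples.get(i);
            double dx = b[0] - a[0], dy = b[1] - a[1], dz = b[2] - a[2];
            total += Math.sqrt(dx * dx + dy * dy + dz * dz);
        }
        return samples.size() > 1 ? total / (samples.size() - 1) : 0;
    }

    public static void main(String[] args) {
        List<double[]> jostled = List.of(
                new double[]{0.0, 0.0, 9.8}, new double[]{0.1, 0.0, 9.7},
                new double[]{0.0, 0.1, 9.8});
        List<double[]> climbing = List.of(
                new double[]{1.5, 0.3, 11.2}, new double[]{-1.0, 0.8, 8.1},
                new double[]{2.2, -0.5, 12.0});
        System.out.printf("jostled: %.2f, climbing: %.2f%n",
                score(jostled), score(climbing));
    }
}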
Funf Journal comes with roughly 30 probes built in. But the Media Lab team is eager for developers outside MIT to invent additional high-level probes, and probes that use the data generated by those probes, and so on. Less tech-savvy users could still publish configuration settings that they’ve found useful for particular tasks. “You can imagine a free marketplace of these configurations and also of these probes,” Aharony says.
Visualization software can represent a Funf user’s geographical movements as a “heat map,” where colors at the red end of the spectrum designate highly frequented areas and colors at the blue end designate occasionally frequented areas. This video displays two Funf users’ movement patterns, updated each day over the course of a month.
Video: Nadav Aharony
Aharony and his colleagues also provide developers with an application programming interface, which would allow them to incorporate probes or other Funf features into their own programs without explicitly using Funf Journal.
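The sketch below suggests what listener-style access to a probe’s output could look like from a developer’s point of view. The interface and method names are invented for illustration and are not Funf’s actual API.

// Hypothetical sketch of an application subscribing to a probe's output
// through a listener-style API. Names and signatures are invented for
// illustration; this is not Funf's real programming interface.
import java.util.Map;
import java.util.function.Consumer;

interface Probe {
    String name();
    // Register a callback that receives each new reading as a key/value map.
    void addListener(Consumer<Map<String, Object>> listener);
}

class WifiScanProbe implements Probe {
    private Consumer<Map<String, Object>> listener = data -> { };

    public String name() { return "wifi_scan"; }

    public void addListener(Consumer<Map<String, Object>> l) { listener = l; }

    // In a real framework the OS would drive this; here we fake one reading.
    void emitFakeReading() {
        listener.accept(Map.of("ssid", "example-network", "rssi", -61));
    }
}

public class ListenerDemo {
    public static void main(String[] args) {
        WifiScanProbe probe = new WifiScanProbe();
        probe.addListener(data -> System.out.println(probe.name() + ": " + data));
        probe.emitFakeReading();
    }
}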
Under the hood
Many of those features address problems that arose chronically during the Human Dynamics group’s own research. One is power management: Frequent use of a phone’s sensors can quickly drain its battery, so Funf automatically adopts power-saving strategies such as delaying energy-intensive tasks until the phone is plugged in or postponing a GPS reading if the accelerometers and gyros indicate that the phone has remained stationary since the last one.
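The following sketch captures the shape of one such rule: skip an expensive GPS fix when cheaper motion sensors indicate the phone has not moved since the last one. The names and threshold are assumptions for illustration, not Funf’s actual logic.

// Sketch of a power-saving rule of the kind described above: skip an
// expensive GPS fix if cheaper motion sensors say the phone hasn't moved.
// Names and the threshold value are illustrative, not Funf's actual logic.
public class GpsGate {
    private static final double STATIONARY_THRESHOLD = 0.05; // arbitrary units

    private double motionSinceLastFix = 0; // accumulated accelerometer activity

    // Called for each accelerometer reading between GPS fixes.
    void onMotionSample(double magnitude) { motionSinceLastFix += magnitude; }

    // Decide whether a scheduled GPS reading is worth the battery cost.
    boolean shouldTakeGpsFix(boolean pluggedIn) {
        if (pluggedIn) return true;                       // power is cheap, just sample
        return motionSinceLastFix > STATIONARY_THRESHOLD; // otherwise only if we moved
    }

    // Reset the motion counter once a fix has actually been taken.
    void onGpsFixTaken() { motionSinceLastFix = 0; }

    public static void main(String[] args) {
        GpsGate gate = new GpsGate();
        gate.onMotionSample(0.01);                        // barely moved
        System.out.println(gate.shouldTakeGpsFix(false)); // false: skip the fix
        gate.onMotionSample(0.2);                         // walked somewhere
        System.out.println(gate.shouldTakeGpsFix(false)); // true: take a fix
        gate.onGpsFixTaken();
    }
}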
Another is privacy: by default, Funf encrypts any data stored on the phone or uploaded to a server. The app can also be configured so that uploaded data remains anonymous, or so that the phone reports only conclusions drawn from its analysis of sensor data, not the raw data itself.
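Two of those ideas, replacing a raw identifier with a one-way hash and uploading a derived conclusion rather than raw samples, can be sketched as follows; the code illustrates the concepts and is not Funf’s implementation.

// Sketch of two privacy techniques mentioned above: anonymizing an identifier
// with a one-way hash, and reporting a conclusion instead of raw sensor data.
// Illustrative only; not Funf's actual implementation.
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;
import java.util.List;

public class PrivacyDemo {
    // One-way hash so the server never sees the real device identifier.
    static String anonymize(String deviceId) throws Exception {
        byte[] digest = MessageDigest.getInstance("SHA-256")
                .digest(deviceId.getBytes(StandardCharsets.UTF_8));
        StringBuilder hex = new StringBuilder();
        for (byte b : digest) hex.append(String.format("%02x", b));
        return hex.toString();
    }

    // Upload a conclusion ("active minutes today") instead of raw accelerometer samples.
    static int activeMinutes(List<Double> perMinuteActivityScores, double threshold) {
        return (int) perMinuteActivityScores.stream().filter(s -> s > threshold).count();
    }

    public static void main(String[] args) throws Exception {
        System.out.println(anonymize("device-identifier-example"));
        System.out.println(activeMinutes(List.of(0.1, 0.9, 1.4, 0.2), 0.5) + " active minutes");
    }
}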
“It’s right-on architecturally in terms of its modularity and openness,” says Deborah Estrin, director of the Center for Embedded Networked Sensing and a professor of computer science at the University of California at Los Angeles. “It really draws on the many years of experience that their group has had innovating in this space.”
Estrin’s own research group uses cellphone data collection to assist in preventive medicine, and she says that she can “definitely” envision using the Funf system in her future work. “We have some other capabilities that aren’t yet implemented in that platform,” she says, “but because of this probe architecture, it allows for both contributing things that we have that conform to that architecture as well as incorporating pieces [of Funf], even if it’s not taken on wholesale.”
Pentland has been talking to Hal Abelson, the Class of 1922 Professor of Computer Science and Engineering and one of the lead developers of the Google App Inventor, about creating a system with an intuitive graphical interface that would let users with little prior programming experience develop new Funf probes. “We’re hoping to help build a community around the framework,” Aharony says.