LectureScape uses data on viewing behavior to help make video lectures more intuitive, dynamic, and effective for users.


What 6.9 million clicks tell us about how to fix online education


Press Contact

Adam Conner-Simons
Email: aconner@csail.mit.edu
Phone: 617-324-9135
MIT Computer Science & Artificial Intelligence Lab

The rise of online education and massive open online courses (MOOCs) has prompted much naysaying about their effectiveness, with detractors citing single-digit completion rates and short-lived pilot programs.

Amidst all the arguments about “flipped classrooms” and “hybrid learning,” however, few people have actually analyzed what makes MOOCs work (or fail): the content. Online learners spend most of their time watching videos — but are the videos any good?

This year edX, the online learning platform co-run by MIT and Harvard University, gave researchers at MIT’s Computer Science and Artificial Intelligence Lab (CSAIL) data on the second-by-second viewing habits of more than 100,000 learners perusing more than 6.9 million video sessions.

In a paper published this spring, the CSAIL team outlined some key findings on what online learners want from videos. These include:

  • Brevity (viewers generally tune out after six minutes)
  • Informality, with professors seated at a desk, not standing behind a podium
  • Lively visuals rather than static PowerPoint slides
  • Fast talkers (professors seen as the most engaging spoke at 254 words per minute)
  • More pauses, so viewers can soak in complex diagrams
  • Web-friendly lessons (existing videos broken into shorter chunks are less effective than ones crafted for online audiences)

These insights form the basis of the CSAIL team’s LectureScape, a “YouTube for MOOCs” that seeks to reinvent how online learners watch videos.

LectureScape uses data on viewing behavior — particularly the “interaction peaks” that correspond to points of interest or confusion — to present MOOC videos in a way that’s more intuitive, dynamic, and effective:

  • A timeline shows which parts other users have most frequently watched
  • An interactive transcript lets users enter keywords to find relevant segments
  • A mechanism automatically creates word clouds and summaries of individual sections, as well as the whole presentation
  • Content from popular slides automatically appears in the following slide, as users will likely want to refer back to that information

In summary, viewers can consume videos more efficiently, skipping specific sections or repeating trickier ones, without having to slog through the whole video.
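The "interaction peaks" that drive the timeline could be computed roughly as follows. This is a toy illustration, not the researchers' actual method: the session format (watched start/end intervals), the bin size, and the mean-based peak threshold are all assumptions for the sketch.

```python
from collections import Counter

def watch_histogram(sessions, video_len, bin_size=5):
    """Count how many viewing sessions covered each bin of the timeline.

    `sessions` is a list of (start_sec, end_sec) watched intervals -- a
    simplified stand-in for the second-by-second logs the article describes.
    """
    bins = Counter()
    for start, end in sessions:
        for b in range(start // bin_size, end // bin_size + 1):
            bins[b] += 1
    return [bins.get(b, 0) for b in range(video_len // bin_size + 1)]

def interaction_peaks(hist, threshold=None):
    """Return bin indices that are local maxima above a threshold."""
    if threshold is None:
        # Naive cutoff: the mean bin count across the whole video.
        threshold = sum(hist) / max(len(hist), 1)
    return [i for i in range(1, len(hist) - 1)
            if hist[i] > threshold
            and hist[i] >= hist[i - 1]
            and hist[i] > hist[i + 1]]
```

For example, four sessions of a 120-second video binned at 10 seconds yield a histogram whose single peak marks the stretch most sessions covered; a real system would smooth the histogram and separate play, pause, and seek events before peak-finding.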

A demo of LectureScape, a “YouTube for MOOCs” designed by MIT researchers that seeks to reinvent how online learners watch videos

Video courtesy of the researchers

Juho Kim, a graduate student in electrical engineering and computer science, says that the group’s previous work on the tutorial-focused platform ToolScape demonstrated that users learn more effectively with this type of interface. He says that traditional MOOC metrics, such as completion rates, are “too simplistic,” and don’t account for the many learners seeking specific skills (versus intending to formally finish a course).

LectureScape was developed by Kim alongside former postdoc Philip Guo; EECS graduate students Carrie Cai and Shang-Wen Li; EECS professor Rob Miller; and Harvard's Krzysztof Gajos. Kim will present the research at the ACM Symposium on User Interface Software and Technology (UIST) in Honolulu in October.

Kim says the next steps for LectureScape include personalized lecture-video recommendations, in the style of Netflix, as well as “on-demand expansion,” which includes links to relevant videos to clarify potentially confusing topics.

He also hopes to implement the tool on a larger scale to quantify its effect on student engagement and performance. The technology can be easily applied to existing MOOC content, as the tool uses machine learning to automatically segment videos based on visual cues.
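Segmentation from visual cues could be sketched as a frame-differencing pass that flags sharp visual changes, such as a slide transition. This is a crude illustration, not the team's actual machine-learning pipeline; the grayscale-vector frame representation and the difference threshold are assumptions.

```python
def segment_boundaries(frames, threshold=0.2):
    """Flag frame indices where visual content changes sharply.

    `frames` is a list of equal-length grayscale pixel vectors with values
    in [0, 1]; a large mean absolute difference between consecutive frames
    is treated as a segment boundary (e.g. a slide change).
    """
    boundaries = []
    for i in range(1, len(frames)):
        prev, cur = frames[i - 1], frames[i]
        diff = sum(abs(a - b) for a, b in zip(prev, cur)) / len(cur)
        if diff > threshold:
            boundaries.append(i)
    return boundaries
```

On a synthetic clip of three dark frames followed by three bright frames, this marks a single boundary at the transition; a learned model would add features like text detection and speaker cuts on top of raw pixel differences.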

Topics: Computer Science and Artificial Intelligence Laboratory (CSAIL), Computer science and technology, Crowdsourcing, User interfaces


Great data, thank you!

MOOCs are to online learning what the Hummer is to driving. Overblown, inefficient and difficult to justify, especially if learning is the objective. Why no mention of the thousands of high-quality, interactive courses available through Open Educational Resources? They are free and, unlike MOOCs, have validated methods for determining learning outcomes.

The conclusion that viewers "want" shorter videos is based on a flawed methodology in the paper. Viewers are more likely to finish a shorter video, or watch a larger fraction of it, of course. This is how the paper defines "engagement," not because it's a particularly good measure but because they couldn't come up with a better one. The conclusion that shorter videos are somehow better for education is not valid.

I humbly suggest changing the title of this article to "What 6.9 Million Clicks Tell Us About How to Fix MOOCs" -- since the article is about MOOCs in particular and not online education as a whole. As it stands, that click-bait title only adds to the conflation of MOOCs and "online education."

If the results of the study are intended to apply broadly to the entire spectrum of online education, then some actual content to that effect would be helpful.

Agreed 100%. I practice many of these points in my in-class lectures. I have to deliver 50- or 75-minute lectures at a stretch. Maybe a quiz after every 5 to 10 minutes would be helpful.

Thank you so much for writing this article. @thinkhmm's apposite comment aside, the content is well written, clearly expressed, and, hallelujah, has references. Thanks for making me think about my own practice.

The mere title of this article attests to the popularity of MOOCs. As many others have commented, it is also immensely helpful that the researchers at MIT actually studied the effectiveness of MOOCs in order to produce a model of what a more effective program might look like. I would like to suggest that anyone interested in MOOCs take a look around courseworld.org. It takes the 'Netflix' approach (though, unlike Netflix, access to the website is free) and offers videos organized by content and searchable by keywords. The videos come from YouTube and vary in terms of length and subject.

The points about video from the spring paper have been considered best practices for online educational videos for many, many years.

Would be interested in seeing other functionalities, such as a) instructors tagging video timeline segments to learning outcomes at the program, course or unit/lesson level, and/or b) allowing students to rate which vid segments were most helpful to them in reaching a specific outcome.

Did anyone analyze offline courses? I think the results would be far worse.
