Integrating Algorithmic and Behavioral Approaches to Crowdsourcing

With the advent of crowdsourcing and crowd work, human intelligence can be brought to bear on a wide variety of useful tasks on short notice. However, because of the varying attention spans of and inevitable errors made by human contributors, much research has focused on algorithmic and AI-based approaches to optimizing the quality of results aggregated from human input. In this talk, we present several projects illustrating how human behavior in crowdsourcing can be not only modeled but also managed. We show how the attention of crowd workers can be accurately predicted from a combination of features, allowing for targeted interventions. We present an experiment showing how different financial incentives, even at the same overall level of payment, can produce a trade-off between quality and speed. Finally, we present results comparing paid workers to volunteers and examining the attention span of workers on Amazon Mechanical Turk. Our work sets the stage for richer crowdsourcing algorithms that can adapt to, and even take advantage of, differences in human behavior.

Joint work with Ece Kamar, Eric Horvitz, and Yiling Chen.

Speaker Bios

Andrew Mao is a fourth-year PhD student at Harvard University, where he is advised by Yiling Chen. He is particularly interested in using experimental and empirical techniques to study the behavior of human agents in online settings, especially in crowdsourcing and online communities. Andrew received a Yahoo! Key Scientific Challenges award in 2011 and is currently doing his second internship at Microsoft Research. He obtained a B.S.E. in Computer Science and a B.S. in Economics from the University of Pennsylvania in 2009.

Date:
Speakers:
Andrew Mao
Affiliation:
Harvard University/MSR Intern

Series: Microsoft Research Talks