Joint Crowdsourcing of Multiple Tasks
Allocating tasks to workers so as to get the greatest amount of high-quality output with as few resources as possible is an overarching theme in crowdsourcing research. This problem is complicated by the lack of information about the available workers' skills and by the unknown difficulty of the tasks to be solved. Moreover, if a crowdsourcing platform customer is limited to a fixed-size worker pool for completing a large batch of jobs, such as identifying a particular object in a collection of images or comparing the quality of many pairs of artifacts in crowdsourcing workflows, she inevitably faces a tradeoff between getting a few of these tasks done well and getting many done poorly. In this paper, we propose a framework called JOCR (Joint Crowdsourcing, pronounced "Joker") for analyzing joint allocations of many tasks to a pool of workers. JOCR encompasses a broad class of common crowdsourcing scenarios, and we pose the challenge of developing efficient algorithms for it.
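To make the quality-versus-quantity tradeoff concrete, consider a minimal sketch, under assumptions not drawn from the paper: a fixed budget of worker-assignments, workers who are each correct independently with some probability p, and per-task quality given by majority voting over k redundant assignments. All names and numbers below (majority_accuracy, tradeoff, budget=105, p=0.7) are hypothetical illustrations, not JOCR's actual formalism.

    import math

    def majority_accuracy(p: float, k: int) -> float:
        """Probability that a majority of k independent workers, each
        correct with probability p, returns the right answer (k odd)."""
        need = k // 2 + 1  # smallest number of correct votes that wins
        return sum(math.comb(k, i) * p**i * (1 - p)**(k - i)
                   for i in range(need, k + 1))

    def tradeoff(budget: int, p: float) -> None:
        """With `budget` total worker-assignments, assigning k workers per
        task lets budget // k tasks be attempted, each answered correctly
        with probability majority_accuracy(p, k)."""
        for k in (1, 3, 5, 7):
            tasks = budget // k
            acc = majority_accuracy(p, k)
            print(f"k={k}: {tasks} tasks covered, "
                  f"per-task accuracy {acc:.3f}, "
                  f"expected correct answers {tasks * acc:.1f}")

    if __name__ == "__main__":
        tradeoff(budget=105, p=0.7)  # hypothetical numbers for illustration

Running this toy model shows the tension the abstract describes: small k covers many tasks at low per-task accuracy, while large k yields fewer but more reliable results; a joint allocation framework must navigate this spectrum, and with heterogeneous, unknown worker skills and task difficulties the choice is far harder than in this idealized setting.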