Crowdsourcing systems are widely used to solve problems that require human intervention. Despite the growing adoption of the crowdsourcing paradigm, there are no established guidelines or tangible recommendations for task design with respect to key parameters such as task length, monetary incentive, and the time required for task completion. In this paper, we propose tuning these parameters based on findings from extensive experiments and analysis of categorization tasks. We examine the behavior of workers who complete categorization tasks in order to identify measures that make task design more effective.
Authors: Ujwal Gadiraju, Patrick Siehndel, Besnik Fetahu, Ricardo Kawase