Article

Storm crowds: Evidence from Zooniverse on crowd contribution design

Journal

RESEARCH POLICY
Volume 51, Issue 1, Pages -

Publisher

ELSEVIER
DOI: 10.1016/j.respol.2021.104414

Keywords

Citizen science; Crowdsourcing; Difference-in-differences; Natural experiment; Zooniverse

Categories

Funding

  1. Alfred P. Sloan Foundation [G-2011-10-11]


This research examines the impact of platform design on crowdsourcing contributions, focusing on tolerance to incompleteness and its effect on users' willingness to contribute. Using a quasi-experimental approach on Zooniverse, the authors find that a format change reducing tolerance to incompleteness led to fewer total edits but more complete edits, accompanied by a decline in the quality of those complete edits.
What is the impact of platform design on crowdsourcing contributions? The proliferation of platforms with distributed content production, such as Wikipedia, Zooniverse, and others, has led to scholarly interest in understanding why individuals contribute to them. One stream of research has investigated contributor motivations, while another growing stream, scattered across several disciplines, has explored the effect of platform design on contributions. One important design element is the extent to which incomplete, or partial, contributions are possible, a concept we refer to in this paper as tolerance to incompleteness. We explore the relationship between this design element and crowds' willingness to contribute in the context of Zooniverse, the world's largest citizen science platform. Our quasi-experimental empirical approach exploits a format change that decreased tolerance to incompleteness in one Zooniverse project. The results of a difference-in-differences estimation show that after the format change, editors contributed fewer total edits, but more complete edits than predicted in the absence of a change. Users also spent less time contributing to the project post-change. Moreover, we find a trade-off between the quantity and quality of complete edits, with the quality of complete edits lower post-change. Our findings have implications for the design of a growing number of crowdsourcing platforms that involve simple, independent, and well-structured tasks.
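For readers unfamiliar with the estimation strategy named in the abstract, the sketch below shows a generic two-group, two-period difference-in-differences regression on simulated panel data. The variable names (edits, treated, post, user_id, period) and the data-generating process are illustrative assumptions only; they are not the authors' specification or the Zooniverse data.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(42)

# Simulated panel: users observed before/after a format change, in either
# the project that changed format ("treated") or a comparison project.
n_users, n_periods = 400, 10
df = pd.DataFrame(
    [(u, t) for u in range(n_users) for t in range(n_periods)],
    columns=["user_id", "period"],
)
df["treated"] = (df["user_id"] < n_users // 2).astype(int)
df["post"] = (df["period"] >= n_periods // 2).astype(int)

# Outcome built so treated users contribute fewer total edits post-change,
# mirroring the direction of the paper's headline result.
df["edits"] = (
    5.0
    + 0.4 * df["treated"]
    + 0.2 * df["post"]
    - 1.0 * df["treated"] * df["post"]
    + rng.normal(0.0, 1.0, len(df))
)

# Two-group, two-period DiD: the coefficient on treated:post estimates the
# effect of the format change on contribution quantity, with standard
# errors clustered by user.
model = smf.ols("edits ~ treated * post", data=df)
result = model.fit(cov_type="cluster", cov_kwds={"groups": df["user_id"]})
print(result.summary().tables[1])
print("DiD estimate:", result.params["treated:post"])
```

In practice a study like this would use the actual contribution panel and likely richer fixed effects; the point of the sketch is only to show how the interaction term captures the pre/post difference between the changed project and its comparison group.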

Authors


Reviews

Primary Rating

4.7
Not enough ratings

Secondary Ratings

Novelty
-
Significance
-
Scientific rigor
-

Recommended

No Data Available