Abstract
The layering of games over crowdsourcing tasks has become an increasingly viable approach to motivating participation in crowdsourcing projects. A crowdsourcing game uses entertainment to generate useful outputs as byproducts of gameplay. Since crowdsourcing relies on individuals' participation, understanding actual contribution patterns is an important area of research, yet this phenomenon has not been adequately investigated. In this paper, we build on our prior research to examine whether a non-game-based mobile app for crowdsourcing location-based content yields different types of contributions than a game-based app. Our content analysis of 2,386 contributions across both apps reveals nine categories, divided into (1) those that provide information for navigation or for others to learn about a specific place, and (2) those meant for personal expression related to the content creator or those around him/her, with the location serving as a backdrop. The distribution of categories varied between the apps, indicating that the features afforded by the game shaped behavior differently from the non-game-based approach to crowdsourcing. Analysis of the contributions also suggests that the game-based app produced substantially more informational content than the non-game-based one.
| Original language | English |
| --- | --- |
| Pages (from-to) | 119-127 |
| Number of pages | 9 |
| Journal | Proceedings of the Association for Information Science and Technology |
| Volume | 54 |
| Issue number | 1 |
| DOIs | |
| Publication status | Published - Jan 2017 |
| Externally published | Yes |
Bibliographical note
Publisher Copyright: © 2017 by Association for Information Science and Technology
ASJC Scopus Subject Areas
- General Computer Science
- Library and Information Sciences
Keywords
- content analysis
- crowdsourcing games
- evaluation
- human computation
- mobile content