Do you know Mechanical Turk? It’s a web platform hosted by Amazon.com that allows you to perform small (and usually repetitive) tasks and earn a little money. Anybody can set up a task, and anybody can perform one, which is why Mechanical Turk is called a “marketplace for work” (there are numerous other platforms like this). For the person who sets up a task, it’s like having a human-powered machine working for you. And for researchers, it’s a fantastic playing field for exploring crowdsourcing, because of the sheer number and diversity of people who perform tasks on the platform for little money! This post highlights three pieces of research that explore why people from India and the United States are active on Mechanical Turk.
The first piece of research simply explored participants’ motivations to participate. I’ve already blogged about a study on motivations to participate in co-creation, but here it’s a little different. Why? Because people are not asked to be creative, but simply to rate product reviews, compare images, verify contact information, or perform other small mechanical tasks. A very recent paper written by three German researchers showed that payment was by far the strongest motivation of participants (see below). They surveyed an international sample of workers, mainly Americans and Indians, which are the two most represented nationalities on the platform.
However, their research revealed a bias: “Indian participants were […] generally rating between […] higher than those from the US. This clearly indicates the presence of an Acquiescent Response Style (ARS) bias”. Acquiescent response bias is a population’s tendency to systematically agree with survey questions. Even though the authors corrected for this bias by standardizing the data around a reference mean score, one can ask whether motivations differ between cultures. The German academics chose not to address this question, but others did:
Panos Ipeirotis has been studying Mechanical Turk for years, and to my knowledge he was the first to address cross-cultural differences between participants. In a survey, he simply asked workers to indicate why they participate in Mechanical Turk’s marketplace. Like the German researchers, he found that money is a primary motivating reason for workers to participate, but he also found that “significantly more workers from India participate on Mechanical Turk because [it’s] a primary source of income, while in the US most workers consider Mechanical Turk a secondary source of income”.
He also found that American participants participate far more for fun and to kill time. These results, he says, are “not surprising given the average income level of an Indian worker vs. the income level of the US workers”. And indeed, “Workers based in India have significantly lower incomes [than workers in the United States] and more than 55% of the workers declared an income of less than $10,000/year”. This seems quite intuitive, and it’s only a survey (not a peer-reviewed, published research paper), but it’s one of the few data sets that compare crowdsourcing participation across countries.
Another recent piece of evidence was presented in April 2010 in Atlanta. In Who are the Crowdworkers? Shifting Demographics in Mechanical Turk, researchers from the Department of Informatics at the University of California showed that MTurk workers were becoming increasingly international, with a significant population of young, well-educated Indian workers. The following graphs illustrate one consequence of this shift: more Indians rely on Mechanical Turk to earn money:
Building on Ipeirotis’ data as well as on their own surveys, the researchers describe how the population of MTurkers is evolving towards full-time work, especially in low-income countries. “Amazon’s platform supports broader trends in organizational management of using freelance and part-time labor that can be hired and fired as company needs fluctuate,” they say. Their study has a significant number of methodological limitations, but the findings probably reflect a fundamental trend anyway! Regarding those methodological issues, see the following piece of research, reported here, which takes a much more rigorous look at cross-cultural differences.
Whether payment amounts to an appreciable sum depends on the worker and the socio-economic context (Antin & Shaw, 2012)
Specifically, Judd Antin from Yahoo! Research and Aaron Shaw from Berkeley examined how culture affects people’s response behavior. They examined pretty much the same motivations as Ipeirotis, but they also wanted to find out whether people’s tendency to deny socially undesirable traits and claim socially desirable ones was at play. And the answer is yes: people over-report some motivations because they are socially more acceptable. Specifically, Americans heavily over-reported money as a motivator, while Indians over-reported sense of purpose and under-reported killing time and fun. So what?
Many online systems are designed on the basis of survey studies […]. If these studies misrepresent user motivations because of social desirability bias, the result may be non-optimal design decisions
As crowdsourcing goes global, this is definitely a subject worth exploring in future studies!