Yesterday we had a conversation about Human Ubiquitous Computing (Human UbiComp) with Lukas Biewald, founder of CrowdFlower; Bjoern Hartmann, creator of the Mechanical Turk cats book; and Aaron Koblin, a conceptual artist who works with Amazon Mechanical Turk (AMT) to create art such as Ten Thousand Cents and The Sheep Market.
Human UbiComp has been described by Professor Zittrain as “fungible networked brainpower,” or the ability for strangers to pay other strangers small amounts of money to complete menial tasks on the internet. The concept and its potential problems are described in a video by Professor Zittrain and, more academically, in a paper.
One of the first things we discussed was CrowdFlower’s quality control metrics, which are absent from other Human UbiComp sites such as Amazon Mechanical Turk. These metrics let CrowdFlower offer higher reliability of work as an incentive justifying the cost of using the platform.
CrowdFlower sends tasks specifically to African refugee camps, in areas where data plans are inexpensive and work is difficult to come by. A companion app called Give Work allows users to complete the same tasks sent to these refugee camps; matching the two sets of answers reveals subjective factors, such as cultural idioms or understandings, that make certain tasks difficult for international communities to complete.
On CrowdFlower, as on any successful Human UbiComp platform, tasks must be clear enough that they have an almost “pass/fail” nature to them, said Biewald. But at the same time, these tasks inherently contain some degree of ambiguity.
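CrowdFlower’s actual quality-control metrics are not public, but the general idea behind pass/fail task design can be sketched as a “gold standard” check: mix tasks with known-good answers into a worker’s queue and grade the worker against them. The function names and the 0.7 threshold below are hypothetical, purely for illustration.

```python
# Illustrative sketch only: not CrowdFlower's real algorithm.
# "Gold" tasks are tasks whose correct answer is already known;
# they are mixed invisibly into a worker's queue.

def score_worker(answers, gold):
    """Fraction of a worker's answers that match the gold answers.

    answers: dict mapping task_id -> the worker's answer
    gold:    dict mapping task_id -> the trusted answer
    Only tasks that have a gold answer count toward the score.
    """
    scored = [task for task in answers if task in gold]
    if not scored:
        return None  # no overlap with gold tasks; can't judge this worker
    correct = sum(1 for task in scored if answers[task] == gold[task])
    return correct / len(scored)

def trusted(answers, gold, threshold=0.7):
    """Pass/fail gate: accept a worker's judgments only above a threshold."""
    score = score_worker(answers, gold)
    return score is not None and score >= threshold
```

Because each gold task has one unambiguous answer, grading is mechanical, which is exactly why tasks need that near pass/fail clarity in the first place; the residual ambiguity Biewald mentions is what the threshold absorbs.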
Bjoern Hartmann, after completing a book built entirely from Mechanical Turk content (amazing cat stories, to be exact), began wondering what the Turkers (as they’re called amongst those in the know) thought of his use of their paid content for a book. He did the obvious and asked them for their opinion of the book, on the site itself, for a small payment, just like any other job on the site. He found more criticism in the comments on a Boing Boing entry about his work than he found from posting within the community.
We repeated this protocol for our course, asking about Human UbiComp as a whole, and the results can be found here.
Aaron Koblin used Mechanical Turk to solicit 10,000 individual sheep drawings. He received 662 non-sheep drawings in total, and only one submission asked him, the creator of the task, “why are you doing this?” He also received numerous emails after the project from people wanting to draw and submit sheep for free. He found that some people would spend up to 45 minutes on a particular sheep drawing, while others would complete theirs in one minute or less.
Are these types of work legitimate labor? What concerns do they raise for the workers, the employers, and the websites that facilitate them? We’ve divided the three most tangible, pressing problems into three separate posts so that you can leave your comments on each accordingly: