Fig. 1. A taxonomy of crowdsourcing
work were proposed to solve the problem of geometric reasoning on MTurk.
Named entity annotation
- Named entity recognition is used to identify and categorize textual references to objects in the world, such as persons and organizations. MTurk is a very promising tool for annotating large-scale corpora, such as Arabic nicknames, Twitter data, large email datasets, and medical named entities.
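The task shape of named entity annotation can be illustrated with a minimal gazetteer-lookup sketch. Real annotation pipelines use trained models or human annotators, and the entity dictionary below is purely illustrative:

```python
# Illustrative gazetteer mapping surface strings to entity types.
GAZETTEER = {
    "Amazon Mechanical Turk": "ORGANIZATION",
    "Barack Obama": "PERSON",
}

def tag_entities(text):
    """Return (span, type) pairs for gazetteer entries found in the text."""
    return [(name, etype) for name, etype in GAZETTEER.items()
            if name in text]

tag_entities("Barack Obama discussed Amazon Mechanical Turk.")
# -> both gazetteer entries are matched
```

A crowdsourced workflow replaces the dictionary lookup with worker judgments, but the output — typed spans over text — is the same.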
- Opinions are subjective preferences. Gathering opinions from the crowd can be achieved easily in a crowdsourcing system. Mellebeek et al. used the crowdsourcing paradigm to classify Spanish consumer comments. They demonstrated that non-expert MTurk annotations outperformed expert annotations across a variety of classifiers.
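Studies like this typically collect several non-expert judgments per item and aggregate them by majority vote before training classifiers. A minimal sketch of that aggregation step (the labels and item ids are illustrative, not from the cited work):

```python
from collections import Counter

def aggregate_labels(annotations):
    """Majority-vote aggregation of crowd labels.

    annotations: dict mapping item id -> list of labels from workers.
    Returns dict mapping item id -> winning label (ties broken by
    the order labels were first seen).
    """
    return {item: Counter(labels).most_common(1)[0][0]
            for item, labels in annotations.items()}

# Three workers each label two consumer comments as positive/negative.
crowd = {
    "c1": ["pos", "pos", "neg"],
    "c2": ["neg", "neg", "neg"],
}
aggregate_labels(crowd)  # -> {'c1': 'pos', 'c2': 'neg'}
```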
- Obviously, humans possess common-sense knowledge about the world, but computer programs do not. Many studies have focused on collecting common-sense knowledge on MTurk.
- Humans have to read through every document in a corpus to determine its relevance to a set of test queries. Alonso et al. proposed crowdsourcing for relevance evaluation, so that each crowdsourcing worker performs only a small evaluation task.
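The core idea is to partition a large judgment job into many small tasks (HITs), each pairing one query with a handful of documents. A minimal sketch, assuming a simple fixed batch size (the names and sizes are illustrative):

```python
def make_hits(doc_ids, query, batch_size=5):
    """Partition a document list into small relevance-judgment tasks,
    each pairing one query with at most batch_size documents."""
    return [{"query": query, "docs": doc_ids[i:i + batch_size]}
            for i in range(0, len(doc_ids), batch_size)]

hits = make_hits([f"d{i}" for i in range(12)], query="crowdsourcing")
# -> 3 tasks: two of 5 documents and one of 2
```

Each resulting task is small enough for a single worker, and the per-document judgments can later be merged into a full relevance assessment.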
Natural language annotation
- Natural language annotation is a task that is easy for humans but currently difficult for automated processes. Recently, researchers have investigated MTurk as a source of non-expert natural language annotation, which is a cheap and quick alternative to expert annotation. Akkaya et al. showed that crowdsourcing for subjectivity word sense annotation is reliable. Callison-Burch and Dredze demonstrated success in creating data for speech and language applications at a very low cost. Gao and Vogel showed that crowdsourcing workers outperformed experts on word alignment tasks in terms of alignment error rate. Jha et al. showed that it is possible to build an accurate prepositional phrase attachment corpus with crowdsourcing workers. Parent and Eskenazi demonstrated a way to cluster dictionary definitions in MTurk. Skory and Eskenazi submitted open cloze tasks to MTurk workers and discussed ways to evaluate the quality of the results of these tasks.
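Alignment error rate (AER), the metric used to compare workers and experts on word alignment, scores a predicted alignment A against gold "sure" links S and "possible" links P (with S ⊆ P) as AER = 1 − (|A∩S| + |A∩P|) / (|A| + |S|). A short sketch with made-up alignment links:

```python
def alignment_error_rate(sure, possible, predicted):
    """AER = 1 - (|A∩S| + |A∩P|) / (|A| + |S|), where sure ⊆ possible.

    Each alignment is a set of (source_index, target_index) links.
    Lower is better; 0.0 means the prediction covers all sure links
    and contains only possible links.
    """
    a, s, p = set(predicted), set(sure), set(possible)
    return 1.0 - (len(a & s) + len(a & p)) / (len(a) + len(s))

# Illustrative alignments for a short sentence pair.
sure = {(0, 0), (1, 1)}
possible = sure | {(2, 1)}
predicted = {(0, 0), (1, 1), (2, 1), (3, 2)}
alignment_error_rate(sure, possible, predicted)  # -> roughly 0.167
```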
- Junk email cannot be identified without humans understanding its content. Some anti-spam mechanisms such as Vipul's Razor use human votes to determine whether a given email is spam.
B. Information Sharing System
Websites can help to share information easily among Internet users. Some crowdsourcing systems aim to share various types of information among the crowd. For monitoring noise pollution, Maisonneuve designed a system called NoiseTube which enables citizens to measure their personal exposure to noise in their everyday environment by using GPS-equipped mobile phones as noise sensors. The geo-localised measures and user-generated metadata can be automatically sent and shared online with the public to contribute to the collective noise mapping of cities. Moreover, Choffnes et al. utilized crowdsourced contributions to monitor service-level network events and studied the impact of network events on services from the perspective of end users. Furthermore, many popular information sharing systems have been launched on the Internet, as shown in the following:
- Wikis are online encyclopedias that are written by Internet users, and the writing is distributed in that essentially anyone can contribute to the wiki.
- Yahoo! Answers is a general question-answering forum that provides automated collection of human-reviewed data at Internet scale. Such human-reviewed data are often required by enterprise and web data processing.
- Yahoo! Suggestion Board is an Internet-scale feedback and suggestion system.
- 43things also collects goals from users, and in turn it provides a way for users to find other users who have the same goals, even if those goals are uncommon.
- Flickr is a popular photo-sharing site that provides a mechanism for users to caption their photos. These captions are already being used as alternative text.
- Delicious is a social bookmarking site on the Internet, studied by Golder and Huberman.
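The collective noise mapping that NoiseTube performs can be sketched as aggregating geo-tagged decibel readings over a latitude/longitude grid. This is a minimal illustration of the aggregation step only; the function, grid resolution, and readings are assumptions, not details from the NoiseTube system:

```python
import math
from collections import defaultdict

def noise_map(readings, cell_deg=0.01):
    """Average geo-tagged decibel readings per grid cell.

    readings: iterable of (lat, lon, dB) tuples from phone sensors.
    cell_deg: grid resolution in degrees (about 1 km at the equator).
    Returns dict mapping (cell_row, cell_col) -> mean dB level.
    """
    cells = defaultdict(list)
    for lat, lon, db in readings:
        key = (math.floor(lat / cell_deg), math.floor(lon / cell_deg))
        cells[key].append(db)
    return {k: sum(v) / len(v) for k, v in cells.items()}

# The first two readings fall in the same cell; the third in another.
readings = [(50.001, 8.001, 60.0), (50.002, 8.003, 70.0),
            (50.051, 8.001, 80.0)]
noise_map(readings)  # -> two cells, averaging 65.0 dB and 80.0 dB
```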
Vipul's Razor website, http://sourceforge.net/projects/razor
Wikipedia, the free encyclopedia, http://en.wikipedia.org
Yahoo! Answers, http://answers.yahoo.com/
Yahoo! Suggestion Board, http://suggestions.yahoo.com/
43things, a website for collecting goals from users, http://www.43things.com/
Yahoo!'s Flickr, http://www.flickr.com/