Speakers: Stuart Dunn (King’s College London, Department of Digital Humanities) and Anne Alexander (CRASSH / Cambridge Digital Humanities Network).
The Cambridge Digital Humanities Network is organising a small workshop for researchers at the University of Cambridge who are interested in exploring crowdsourcing methods in more detail or in sharing their experiences of crowdsourcing with colleagues. Participants will be asked to contribute a short case study of their own experience using crowdsourcing, or to outline a research problem which they believe could be tackled using these methods. Group discussion of the participants' case studies and research problems will form the second half of the workshop. As space is restricted, participants must be either research students or staff at the University of Cambridge; however, if there is enough interest, we will organise a larger follow-up event next academic year.
If you are interested in taking part, please email Dr Anne Alexander, Cambridge Digital Humanities Network Co-ordinator (firstname.lastname@example.org), a short summary of your research interests and a paragraph outlining the case study or research problem you want to discuss, by 18th June 2013. Lunch will be provided, so please let us know of any special dietary needs in advance.
What is crowdsourcing?
Researchers in a wide range of academic disciplines have begun to experiment with 'crowdsourcing': creating or mobilising online communities of volunteers to assist them in their research. As Dunn (2013) remarks, 'crowdsourcing' in an academic context generally differs significantly from the meaning of the term in the business context where it was originally coined, to describe the 'outsourcing' to 'the crowd' of functions previously performed by employees (Howe, 2006). By contrast, academic crowdsourcing projects may have more in common with other kinds of participatory research initiatives. 'Citizen science', for example, described in a recent report by the UK Environmental Observation Framework as "the involvement of volunteers in science", is sometimes used interchangeably with 'crowdsourcing' as a label for research projects which involve an online, open call for participation by the public (see EOF, 2012, for example).
The scale of public participation in academic research projects using crowdsourcing methods is impressively large. The Zooniverse projects (www.zooniverse.org) have more than 800,000 registered users, while the pioneering game Foldit, in which participants play with digital models of proteins, has more than 300,000. Trove, which crowdsources annotations and corrections to scanned newspaper text in the collections of the National Library of Australia, has around 75,000 users who have produced nearly 100 million lines of corrected text since 2008. A notable feature of many of these projects is that the bulk of the work by 'the crowd' is actually done by a small core of highly committed volunteers (see Hagon, 2013 for a discussion of this phenomenon in Trove).
The tasks which volunteers perform in academic crowdsourcing projects vary widely across disciplines, but a common feature of many projects is that they mobilise large numbers of people to undertake tasks which computers cannot yet do effectively (such as pattern recognition or reading handwriting), at a scale which research institutions will not fund by employing professionals to carry out the same work. Paul Hagon estimates that the volunteers who corrected scanned text in Trove carried out work to the value of 12 million Australian dollars. Other crowdsourcing projects invite volunteers to share their localised or specialised knowledge with researchers or cultural heritage institutions, for example by enriching catalogue descriptions of museum and gallery holdings.
The combination of methods used by academic crowdsourcing projects thus requires building research teams which cross traditional disciplinary boundaries. Crowdsourcing methods are also a form of public engagement, as their use depends on the mobilisation of volunteers who are either not academic researchers or not employed by the project concerned.
Major funders are beginning to support research projects using crowdsourcing methods: last year the Arts and Humanities Research Council (AHRC) funded a scoping study on the use of crowdsourcing methods in the Humanities (Dunn and Hedges, 2012).
References

Howe, Jeff (2006) ‘Crowdsourcing: a definition’

Dunn, Stuart (2013) ‘More than a business model: crowd-sourcing and impact in the humanities’

Dunn, Stuart and Hedges, Mark (2012) Crowd-sourcing in the Humanities: A scoping study (a longer version of the study is also available online)

Hagon, Paul (2013) ‘Trove crowdsourcing behaviour’

Citizen Science Alliance (2013) Citizen science projects

UK Environmental Observation Framework (2012) Understanding Citizen Science and Environmental Monitoring