Useful methodology texts for the study of digital activism are listed below. A (★) indicates that the text substantively references the digital context. Suggest an addition or edit by emailing Mary at mjoyce AT uw DOT edu.
Klandermans, B., & Staggenborg, S. (2002). Methods of social movement research. Minneapolis: University of Minnesota Press.
This edited volume collects essays on different methods of social movement analysis, each by a different author. Methods covered include survey research, discourse analysis, semi-structured interviewing, participant observation, case studies, network analysis, historical research, event analysis, and macro-organizational analysis; a concluding chapter addresses blended methods. Contributors include some of the most respected scholars in the field, including Donatella della Porta, Dieter Rucht, and Mario Diani.
Krippendorff, K. (1980). Content analysis: an introduction to its methodology. Beverly Hills: Sage Publications.
Krippendorff is a founding father of content analysis and this is the definitive text. Where Neuendorf’s book (below) is a practical introduction for those who want to pick up the methodology quickly, Krippendorff goes into more detail and is more theoretical. For example, while Neuendorf says that iterative coder training is necessary, Krippendorff suggests a process and strategy for variable refinement: developing more exhaustive and mutually exclusive categories to describe a frequently occurring dimension in the text (p. 129). He also notes that providing coders with correct answers at the end of a training round is useful for self-training, but that coders should not rely on conversation or other unreplicable methods to increase reliability. Learn the method with Neuendorf. Go deeper (and get more theory) with Krippendorff.
Lombard, M., Snyder-Duch, J., & Bracken, C. C. (2002). Content Analysis in Mass Communication: Assessment and Reporting of Intercoder Reliability. Human Communication Research, 28(4), 587–604.
Though intercoder reliability is fundamentally important in judging the validity of content analysis findings, detailed standards have been lacking. This article seeks to set them. The authors evaluate different ways of calculating agreement (a term they prefer to reliability) and propose Krippendorff’s alpha (α) and Cohen’s kappa (κ) as the preferred measures, though the latter generates lower scores because it is a conservative measure. As standards for acceptable reliability, they propose .90 or greater as acceptable to all, .80 or greater as acceptable in most situations, and .70 for exploratory research. The article closes with 10 proposed intercoder reliability standards, including a minimum of three staged calculations of agreement over the life of a study and a prohibition on the exclusive use of simple (average) agreement.
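The difference between simple (average) agreement and a chance-corrected coefficient such as Cohen’s kappa can be illustrated with a short sketch. The coder labels below are invented for illustration; they are not from the article:

```python
from collections import Counter

def percent_agreement(a, b):
    """Simple (average) agreement: share of units coded identically."""
    return sum(x == y for x, y in zip(a, b)) / len(a)

def cohens_kappa(a, b):
    """Cohen's kappa: observed agreement corrected for chance agreement."""
    n = len(a)
    p_o = percent_agreement(a, b)
    ca, cb = Counter(a), Counter(b)
    # Expected chance agreement from each coder's marginal category proportions
    p_e = sum((ca[c] / n) * (cb[c] / n) for c in set(a) | set(b))
    return (p_o - p_e) / (1 - p_e)

# Two hypothetical coders classifying six messages
coder1 = ["protest", "protest", "other", "protest", "other", "other"]
coder2 = ["protest", "other",   "other", "protest", "other", "protest"]

print(percent_agreement(coder1, coder2))  # ≈ 0.667
print(cohens_kappa(coder1, coder2))       # ≈ 0.333
```

The same data yield a much lower kappa than raw agreement because kappa discounts the matches two coders would produce by chance, which is why Lombard et al. describe it as a conservative measure.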
Neuendorf, K. A. (2001). The content analysis guidebook. Thousand Oaks, CA: Sage Publications.
This is a great introductory textbook for those wishing to teach themselves content analysis methods. Though technically a textbook, it is short and very readable. All the basic concepts (units, sampling, reliability, etc.) are clearly defined and the text goes into detail on such practical questions as how to select an intercoder reliability coefficient and how to use archives to identify the message population, which is necessary for sampling.
★ Van Selm, M., & Jankowski, N. (2004). Content analysis of internet-based documents. In M. van Selm & N. Jankowski (Eds.), Researching new media: An advanced-level textbook. Thousand Oaks, CA: Sage Publications.
If you don’t have time to read Neuendorf, read this. Read both if you can, since this article goes into more detail on the digital context while Neuendorf has more detail on the overall method. The article identifies features of online texts (called hypertexts) that distinguish them from other units of content analysis, such as nonlinearity, multimedia (i.e., hypertexts are more than just text), lack of durability, and disconnection from a time and place of creation. The authors then go through the steps of content analysis research design – research questions, media selection, unitizing, sampling, coding, and analysis – and describe how each differs when the content is online.
Koopmans, R., & Rucht, D. (2002). Protest event analysis. In B. Klandermans & S. Staggenborg (Eds.), Methods of social movement research (pp. 231-259). Minneapolis: University of Minnesota Press.
Protest event analysis (PEA) emerged as a method in the 1960s and ’70s. It uses content analysis (see above) of public records, particularly news reports, to describe and interpret protests (public message events directed against a policy or entity). Once a research question is selected, a unit of analysis must be determined, described in detail, and delineated from similar but distinct phenomena. Though newspapers are the most common type of source, multiple sources are preferable, because any medium will have some form of selection bias (not all events that occurred are recorded; see below). Suggestions on sources, variables, coding, and sampling are also provided.
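Defining the unit of analysis in PEA amounts to specifying, in advance, which attributes of an event will be recorded. A minimal sketch of such a record is below; the field names are illustrative assumptions, not Koopmans and Rucht’s actual codebook:

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class ProtestEvent:
    """One unit of analysis in a hypothetical PEA coding scheme.

    Field names are illustrative, not drawn from any published codebook.
    """
    date: str                          # when the event occurred
    location: str                      # where it took place
    claim: str                         # the policy or entity targeted
    form: str                          # e.g. "march", "petition", "online campaign"
    participants: Optional[int] = None # reported size, if any source gives one
    sources: List[str] = field(default_factory=list)  # outlets reporting the event

# A hypothetical event coded from two sources, which helps offset
# the selection bias of relying on any single newspaper
event = ProtestEvent(
    date="2011-02-01",
    location="Cairo",
    claim="government resignation",
    form="demonstration",
    sources=["Al-Ahram", "The Guardian"],
)
```

Writing the scheme down this explicitly forces the delineation step the chapter describes: an event that cannot fill these fields is, by definition, outside the unit of analysis.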
Earl, J., Martin, A., McCarthy, J. D., & Soule, S. A. (2004). The use of newspaper data in the study of collective action. Annual Review of Sociology, 30(1), 65–80.
Newspapers have become the standard source for building event databases on collective action. Though they are authoritative and provide detailed event information, they remain problematic, particularly with regard to selection bias (which subset of events gets covered) and description bias (the veracity of the coverage). The article includes a useful literature review of past uses of newspapers to construct collective action event databases, as well as of critiques of the practice. Tips for detecting bias are also provided. Though not mentioned in the article, newspapers are particularly unreliable in their coverage of digital activism, another example of selection bias.
Social Network Analysis
No resources listed yet. Make a suggestion by emailing Mary at the address above!