Describing the Journal of Quantitative Description: Digital Media
Things happen quickly on the 'net
Writing this blog has already paid off. In one of my first posts, I argued for the importance of quantitative description, particularly of digital media, and floated a wild dream about how to make it happen: “We need a Journal of Quantitative Description.”
Last week, I announced that I’m co-founding the Journal of Quantitative Description: Digital Media with Eszter Hargittai and Andy Guess. There’s plenty of information at the journal website.
The last two months have been a whirlwind of activity putting this all together. We’re extremely grateful to the members of our Editorial and Advisory Boards for committing to make the journal a success. (Seriously, we’re honored to be working with so many impressive scholars.) The support thus far has been encouraging; there seems to be widespread agreement that quantitative descriptive knowledge of digital media is currently being undersupplied. Our primary disciplines are Political Science and Communication, but we don’t expect to be constrained by disciplinarity; the Boards also have members based in Sociology, Economics, Business, and Computer Science. We expect to publish the first issue sometime in early 2021, and it will of course be completely open-access.
I want to thank Andy and Eszter for making this happen so quickly; I’m still coming to terms with all the different challenges running a journal entails, but I’m confident we can pull it off!
In the rest of this blog post, I discuss how this effort relates to what I see as my job as a researcher. I’m ostensibly a “political methodologist” (that is, I was hired on a “methods” line and the majority of classes I’ve taught have been primarily methodological rather than substantive), and while my department and disciplinary colleagues have been wonderfully flexible in allowing me to follow my research interests where they lead, I still see my primary task as that of a political methodologist.
In an earlier post, I argued that “Meta-Science is Political Methodology”, where the latter is broadly defined as “vetting or inventing research practices that enhance the validity of political science knowledge production.” This used to happen at the level of the individual statistical test within a given paper, but the ongoing meta-scientific turn emphasizes improving the quality of the “aggregate output of social science knowledge. It wants to ensure that the body of knowledge being produced has desirable statistical properties.”
With that in mind, from a political science perspective (and that's just one of many for the journal): the Journal of Quantitative Description: Digital Media is a contribution to the meta-science subfield of political methodology. (I know this subfield doesn’t exist yet, but I see it as inevitable; I’m just going to keep saying it until it happens.) In addition to innovation in the object of methodological research, meta-science requires innovation in the format of methodological research. The received form of this research is inextricably tied up with the history of the discipline and of the subfield, which over the past four decades has become almost synonymous with the Society for Political Methodology and its flagship journal, Political Analysis.
I’d love to read a history of this institutional evolution, but I don’t think one exists. In general, we need more disciplinary histories. Quantitative Political Science in the postwar US mold is a rapidly evolving animal, and we lack the reflexive impulses of Sociology, Anthropology’s revulsion at its own imperialistic origins, or the overwhelming self-regard of Economics that lead these disciplines to study themselves so explicitly. So what follows are merely my impressions.
Just within my short time paying attention to this professionally, though, there has been a significant shift away from the centrality of the “derive a new estimator/scaling technique” cliché that once defined methodological work. There are still methodologists doing cutting-edge work in these areas (particularly in causal inference, where we seem to have eclipsed other disciplines), but more of the work is now applied, meeting practitioners where they are.
Computing technology has evolved rapidly, creating an expanded set of research techniques and data sources available to a much wider range of social scientists. Consider the explosion of social trace data, text data, and large-scale administrative or voting records. Methodologists have responded by working both to establish the validity of these practices and data sources and to enhance their widespread usability. The latter represents a novel expansion of the scope of methods research. For example, it is now common for Political Analysis papers to include an associated R package that lowers the “costs” for other researchers.
The status of these R packages appears hard-won. My impression is that many grad students who invested significant energy in developing these tools for the scientific community feel that they did not receive sufficient recognition. This stems from the primacy of the paper as the unit of research. Developing a new estimator makes it possible to extract more knowledge from a given dataset; developing an R package “merely” makes it easier to do so. In the era of grand theories “supported” empirically through spurious cross-country regressions, the analogous contribution by the methodologist had to be similarly grand. Overall, though, the impact of these R packages on the aggregate scale and quality of social scientific knowledge produced has been extremely impressive.
Three things have been happening in parallel that should shift the form of political methodology further away from received forms:

1. Technology is moving us to a world of abundance in terms of the raw materials of social science.

2. Humanity has become dramatically larger and more interconnected.

3. The scope of knowledge production has become more local and in need of synthesis/aggregation.
So again: while it is necessary to continue pushing the frontiers of causal inference, that work is super hard, and I’m not personally good enough at the math.
And it’s equally important for political methodologists to develop practical techniques designed to improve the aggregate production of social scientific knowledge. One of my favorite recent examples of this is Broockman, Kalla, and Sekhon’s paper “The Design of Field Experiments With Survey Outcomes: A Framework for Selecting More Efficient, Robust, and Ethical Designs.”
This isn’t exactly a niche choice of example, and the approach has already been taken up by dozens of practicing experimentalists. In many ways, the paper looks like old-school political methodology: it was published in PA and it offers a rigorous description of the statistical properties of a new technique. And while the technique has desirable bias-reduction and ethical properties, the thrust of the contribution is a radical reduction in the cost of running a class of field experiments.
I highlight this paper to say that my proposed expanded conception of political methodology is already well underway. Methodologists with a firm grasp of where the field is heading are already doing excellent work at the margin where it is most needed. But I think that smuggling the meta-scientific innovation into the received form of a PA paper with a novel estimator is no accident; my suspicion is that the authors felt that “merely” demonstrating a method for cutting costs by 90% would not be adequately rewarded through traditional means.
This is my own interpretation, but the contribution of this paper will not primarily be in improving the quality of any given experiment reported in any given paper, but rather in increasing the number of such experiments that can be run for a fixed budget (and, perhaps optimistically, run with proper power analyses), and thus the aggregate amount of social science knowledge.
So I do think that treating the topic more explicitly at every stage of the knowledge production process will help “normalize” (in the sense of normal science) meta-science. That means defining contributions in this vein, getting meta-science onto more grad syllabi and onto the academic agenda, accepting meta-scientific innovations that don’t look like traditional PA papers, and ultimately allocating tenure lines for people who specialize in meta-science.
Given the pace of change in the best of times, this will take at least a decade. In the meantime, I hope that scholars with sufficient leeway in their career cycles will think along meta-scientific lines: improving scientific practice so as to increase the aggregate amount of knowledge being produced. That is my primary motivation for co-founding the JQD:DM; let’s see how it works!