

Should public administration scholarship be subjected to the same transparency standards as political science? A response to Lars Tummers

This is an expanded response to Dr. Lars Tummers’ essay on transparency in Public Administration Review’s Speak Your Mind section. The original is located here. I also wrote about qualitative vs. quantitative standards a few weeks back but had not published that essay; it is now live and can be read here.

Professor Lars Tummers wrote an excellent essay for Public Administration Review‘s blog Speak Your Mind that raises fundamental issues with which I am very familiar, as I’ve taken part in these discussions in the political science realm from a qualitative researcher’s perspective. Many of the responses to Lars’ essay have addressed these points at much greater length than I possibly could, and I can’t do justice to the debate in a few short lines, so I’m just going to raise a few points from a very personal perspective.

1) I am wholeheartedly in favor of more open science, replications, transparency, pre-registration and a number of elements of a more robust, causal-mechanism-driven social science. I work with experimental methods (I was originally trained as a chemical engineer, so I did experiments before experiments were cool in social sciences) and therefore I espouse many of the views of several posters here, including Professor Tummers.

2) I am trained as a political scientist who works in a public administration department. As a result, I have read (and written about) these debates, which were actually happening in political science well before public administration caught up to them. The debate hasn’t ended and is still heated. The LaCour and Green and Alice Goffman controversies have only elevated the level of discussion and underscored the need for an open conversation about this.

3) I do, however, conduct qualitative (primarily interview-based, ethnographic and discourse-analysis) research. I don’t want to enter into a discussion about whether the epistemology and ontology of the social sciences should be divided into qualitative and quantitative methods. That debate is never going to end, so I strongly recommend reading Tom Pepinsky and Jay Ulfelder on why the discussion of qual/quant divides may actually be detrimental rather than useful.

4) Because I conduct ethnographic work in very vulnerable communities, I am rather wary of sharing raw data (e.g. my field notes) with anybody and leaving it open to their interpretation. Identifying vulnerable populations in any kind of social science reporting is frowned upon, and many would see it as actually harmful. The IRB principles are there for a reason: we need to protect the communities we study and avoid any kind of harm to them.

5) Let’s not forget that I’m a pre-tenure, tenure-track professor, a factor that disincentivizes me from sharing raw data. What if someone who writes or analyzes data faster than I do produces an analysis before I possibly can? My tenure committee isn’t going to judge me on data production; it will judge me on publications in high-impact journals.

6) Even when I do quantitative analysis, I very often create my own datasets, and therefore I am wary of sharing them before I have exploited them (see point 5). Transparency is rad, and I’m all for letting people judge the quality and rigor of my work by examining my databases and how I processed them (e.g. by publishing code), BUT I am not ok with someone publishing a paper with my database BEFORE I get a chance to do so. Again, we are not rewarding data production; we are rewarding journal article and book publication.

7) What I think is missing from the conversation is a series of tables (sorry, I think in diagrams and tables) laying out possible scenarios and then describing EXACTLY what the TOP guidelines would require from a researcher to comply with them. For example: my next paper is on the politics of water privatization in Mexico. Do I need to publish the raw data? Do I need to submit the raw transcripts of my interviews? Should I post them anonymized (anonymization will reduce the ability of researchers who might want to replicate my study to learn about the contextual elements of my analysis)? That’s what is missing: a simple, visual, easy guide covering several scenarios, with MANY examples from the qualitative and interpretive traditions.

In summary, do I think public administration scholarship could use more transparency? For sure. We need to be able to test whether claims in PA journals can actually be sustained by the evidence presented and the data collected. But PA suffers from the same emerging chasm between quantitative and qualitative research. I don’t want PA to fetishize quantitative analysis as more rigorous. There is plenty of qualitative, interpretive, ethnographic research that is robust and analytical. I want a more rigorous PA scholarship and stronger, more robust, testable research designs. That’s what I want, and that’s what I hope we can all contribute to.

(And yes, I just saw the 2016 IPMJ summary article on ethnography in public management research, BTW.) I also participated in the PMRC session where Ospina, Esteve and their PhD student presented their analysis; to quote Esteve from Twitter: “7.5% qual studies in top PA journ. last 5 years. Presenting it at PMRC & S. Ospina :)”

Thanks also to Andy Whitford, Steve Kelman, Lars Tummers and Don Moynihan for excellent discussions on Twitter on this topic. I leave you with one of the discussions I enjoyed the most, in the tweets below:
