About

QUALITY is an ERC-funded research project (2017-2022) hosted by CRASSH, University of Cambridge, and Erasmus University Rotterdam.

Good public policy should make society healthier, happier, safer, and more productive. Social scientists can help: they can look for the causes that promote or prevent these outcomes. Some do this by building models, which they test against large data sets using statistical tools. Others use non-formal tools, applied to a small number of cases: tracing processes, comparing cases, interviewing participants, writing ethnographies. Policymakers can then try to weigh up all this ‘quantitative’ and ‘qualitative’ evidence in order to judge the likely effects of a policy.

There are established ways of weighing up the evidence from multiple quantitative studies, and of organizing the evidence from multiple qualitative studies. There are also excellent research designs that yield both quantitative and qualitative evidence. However, it’s currently unclear how to weigh up the evidence from multiple qualitative studies, either on their own or together with quantitative studies. For example, how should one judge the efficacy of a policy when the findings from quantitative studies contradict the findings from qualitative ones? This poses a momentous problem: it exposes one’s causal judgements, and any policies based upon them, to an increased risk of error.

This project aims to solve this problem by bringing together cutting-edge work in epistemology with the expertise of leading social scientists. It will analyse some prominent qualitative studies in sociology and political science, and it will contrast them with some prominent quantitative studies based on econometrics. This project will then determine whether there is, despite their differences, a handful of basic heuristics for causal inference that underlie both types of approach. If there is, we will use these heuristics to develop a collection of templates for weighing up quantitative and qualitative evidence.

If you have questions about this project, please email the project administrator.


QUALITY is funded by the European Research Council under the European Union’s Horizon 2020 Framework Programme for Research and Innovation (ERC grant agreement no. 715530).

People

Principal Investigator: Christopher Clarke (Senior Research Associate)

Post-Doctoral Research Associate: Rosie Worsdale (2018-2022)

Post-Doctoral Research Associate: Jack Wright

Project Administrator: Zeynep Kacmaz-Milne


Project Visitors

Michael Marcusa, Michaelmas 2018

Meghan Tinsley, Easter 2019

Lennart Ackermans, Easter 2019, Summer 2021 and Easter 2022

Catherine Herfeld, Easter 2022

Naftali Weinberger, Easter 2022


Project Summary

Qualitative and Quantitative Social Science: Unifying the Logic of Causal Inference?

Overview of Project QUALITY

Good public policy should make society healthier, happier, safer, and more productive. Social scientists can help: they can look for the causes that promote or prevent these outcomes. Some do this by building models, which they test against large datasets using statistical tools. Others use non-formal tools, applied to a small number of cases: tracing processes, comparing cases, interviewing participants, writing ethnographies. Policymakers can then try to weigh up all this ‘quantitative’ and ‘qualitative’ evidence in order to judge the likely effects of a policy.

There are established ways of weighing up the evidence from multiple quantitative studies, and of organizing the evidence from multiple qualitative studies. There are also excellent research designs that yield both quantitative and qualitative evidence. However, it’s currently unclear how to weigh up the evidence from multiple qualitative studies, either on their own or together with quantitative studies. For example, how should one judge the efficacy of a policy when the findings from quantitative studies contradict the findings from qualitative ones? This poses a momentous problem: it exposes one’s causal judgements, and any policies based upon them, to an increased risk of error.

QUALITY aims to solve this problem by bringing together cutting-edge work in epistemology with the expertise of leading social scientists. It will analyse some prominent qualitative studies in sociology and political science, and it will contrast them with some prominent quantitative studies based on econometrics. QUALITY will then determine whether there is, despite their differences, a handful of basic heuristics for causal inference that underlie both types of approach. If there is, QUALITY will use these heuristics to develop a collection of templates for weighing up quantitative and qualitative evidence.

The Problem

Between 1998 and 2001 the ‘Save our Schools’ programme (SOS) was piloted across 19 elementary schools in the United States. This programme introduced a highly structured curriculum to the pilot schools, including ‘learning preparation’ lessons and specialized tuition for children who were struggling. To measure its effectiveness, the programme evaluators relied upon several ‘quantitative’ and ‘qualitative’ studies. The latter comprised a school site visit, a case study, a focus group interview and an open-ended survey of teachers. All the studies were designed with a common definition of a ‘good school’ in mind. Despite this, the qualitative studies invariably found that the programme caused these 19 schools to improve, whereas the quantitative studies invariably found that the programme had, if anything, the opposite effect.

The SOS programme is a striking illustration of a general problem: it is not clear how to proceed when the quantitative evidence conflicts with the qualitative evidence, as it did for the SOS programme. Even more generally, it is not clear how to proceed when there is also a conflict amongst the quantitative evidence itself, or amongst the qualitative evidence itself—in addition to the conflict between the two. As a consequence, when policy-makers want to draw causal conclusions from such an evidence base, they are forced to rely on their intuition alone. This makes policy decisions less transparent than they aspire to be. More importantly, it raises concerns about whether the resulting policies accurately reflect the balance of all the available evidence. And so these policies run an increased risk of being ineffective, wasteful or even harmful.

This problem requires an urgent solution. Firstly, the above predicament arises frequently when developing policies on education, healthcare, social welfare, economic development, technology, drug control, and policing, especially when there is a wide range of evidence to draw upon. Secondly, this question is especially timely given recent trends in policy making, in particular the commitment to base policy on the widest available range of quantitative and qualitative evidence. Thirdly, this issue is not only of pressing practical import for policy-makers and programme evaluators. It also raises a theoretical problem for social researchers more generally. How should social scientists weigh up conflicting evidence from a diverse range of qualitative and quantitative studies, in order to draw conclusions about the causes of a given phenomenon?

QUALITY aims to shed light on this problem by bringing together cutting-edge work in epistemology with the expertise of leading social scientists.

Existing Work on the Problem

There are established ways to weigh up conflicting evidence from multiple quantitative studies. There are also promising suggestions for organizing the evidence from multiple qualitative studies. And there are excellent research designs that yield both quantitative and qualitative evidence. But how should social scientists draw causal conclusions from such designs? How should they weigh up conflicting evidence from multiple qualitative studies, either on their own or together with quantitative studies?

At present, there are four strategies for doing so:

(1)  One suspends judgement until more compelling evidence is collected.

(2)  One sets aside any of the existing studies that have minor methodological flaws. The hope is that the remaining studies will agree with each other.

(3)  One rates each study for its evidential quality, as well as for the effect size that the study claims to find. One then draws a causal conclusion by taking an average of these effect sizes, weighted by the quality of associated studies. This is a more general strategy than simple ‘vote counting’, which assumes implausibly that all studies are, in effect, of equal evidential weight.

(4)  One follows the advice of King, Keohane, and Verba in their highly influential manifesto Designing Social Inquiry (1994). Designing Social Inquiry proposes that qualitative methods in political science and sociology are—at their best—analogues of standard quantitative methods, whose epitome is multivariate regression analysis. If this is correct, then the standard techniques for ‘pooling’ the evidence from quantitative studies can be carried across to qualitative studies: simply ‘code’ the data in a quantitative format. (It follows, however, that there are no distinctive uses for which the evidence from qualitative studies can be employed. What’s more, such evidence is less probative than evidence from typical quantitative studies: it is less precise, and it describes a smaller number of cases.)
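For illustration only, strategy (3) can be sketched in a few lines of code. The numbers below are made up, and `weighted_average_effect` and `vote_count` are hypothetical helper functions written for this sketch, not part of the project or of any established meta-analysis software:

```python
def weighted_average_effect(studies):
    """Strategy (3): average the effect sizes reported by the studies,
    weighted by each study's evidential quality.
    Each study is an (effect_size, quality_weight) pair."""
    total_weight = sum(weight for _, weight in studies)
    return sum(effect * weight for effect, weight in studies) / total_weight

def vote_count(studies):
    """Simple 'vote counting' for contrast: every study counts equally,
    and only the direction of each finding matters."""
    positives = sum(1 for effect, _ in studies if effect > 0)
    negatives = sum(1 for effect, _ in studies if effect < 0)
    if positives > negatives:
        return "positive"
    if negatives > positives:
        return "negative"
    return "tied"

# Hypothetical evidence base: two studies find a positive effect,
# one (low-quality) study finds a negative effect.
studies = [(0.30, 0.9), (0.10, 0.5), (-0.20, 0.2)]
print(weighted_average_effect(studies))  # 0.175
print(vote_count(studies))               # positive
```

Note how the weighted average lets the high-quality study dominate, whereas vote counting treats all three studies as equally forceful — the implausible assumption the text mentions.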

Shortcomings of Existing Work

The four strategies above fall far short of a satisfactory solution to our target problem, that of drawing causal conclusions from a diverse and conflicting evidence base.

(1)  Consider the strategy of suspending judgement until new, more compelling evidence is collected. This strategy does not solve our target problem concerning causal inference. It merely postpones it. And, if the newly collected evidence turns out to be in conflict as well, then causal inference will be postponed indefinitely. And so, whenever a policy decision needs to be made imminently, this strategy fails.

(2)  The strategy of setting aside studies with minor flaws is also unsatisfactory. Firstly, there is no guarantee that the remaining studies will all agree with each other. So this strategy is available only in limited circumstances. Secondly, pursuing this strategy results in useful evidence being ignored. After all, when a study exhibits minor methodological flaws, this is no barrier to the study providing tentative evidential support for its conclusions.

(3)  What about the strategy of taking a weighted average? As it stands, this strategy leaves open the question of how to evaluate the ‘evidential weight’ (with respect to causal inference) of any given study. But this is precisely the question that needs an answer. So this strategy is a prologue to solving our target problem; it is not a solution in itself. More worryingly, this strategy makes the implausible assumption that the evidential weights of the studies are independent of one another, and can therefore simply be added. But why should ten replications of the same study, for example, count as forcefully as ten studies that each employ different methods?

(4)  Can qualitative methods be subsumed by quantitative ones, as King, Keohane, and Verba urge? Many think not. Critics of Designing Social Inquiry charge, firstly, that it ignores the distinctive (and perhaps less direct) contribution that qualitative methods make to causal inference. Typical qualitative studies possess special virtues that typical quantitative studies lack. Secondly, critics charge that it is not grounded in a well-motivated epistemology. Thus it fails to recognize the considerable problems facing regression-based studies of non-experimental data. Its subsumption of qualitative methods under existing quantitative methods is therefore unwarranted, they argue. At the very least, the relationship between quantitative and qualitative methods in the social sciences remains vigorously contested.

The Solution

QUALITY aims to solve our target problem, while avoiding these shortcomings. We plan to do so as follows.

(1)  Identify some social science literatures in which both qualitative and quantitative evidence has been brought to bear in order to draw causal inferences. For example: What are the major causes of violent conflict? To what extent do active labour market programmes reduce unemployment? What are the major causes of the wage gap between women and men, and of the wage gap between different ethnic groups?

(2)  Identify, in these literatures, some prominent exemplars of causal inferences that use qualitative methods. Reconstruct each of them as deductively valid arguments, or as hypothetico-deductive arguments, or as inferences to the best explanation. In doing so, we will uncover the local background assumptions and the general causal principles or ‘heuristics’ to which these arguments appeal.

(3)  Do the same for a variety of exemplary inferences, in these literatures, that use quantitative methods.

(4)  Having taxonomized and analyzed several distinct forms of causal inference across the social sciences, attempt to unify the general causal principles (causal heuristics) to which they appeal. Can these various heuristics all be understood as special cases of a handful of more basic heuristics? These basic heuristics—if they exist and whatever they turn out to be—will thereby do justice to both quantitative and qualitative methods.

In doing the above, we will need to work out how the following concepts bear on the project (if at all):

  • The idea that causal hypotheses are contrastive hypotheses
  • The framework of inference to the best explanation, and explanatory “loveliness”
  • The imperative that social theory must respect individuals’ agency
  • The distinction between micro-social and macro-social research
  • The notion of construct validity
  • The idea that counterfactual conditionals are vague
  • The application of the interventionist account of causation to the social world
  • The problem of under-consideration of rival hypotheses in the social sciences
  • The use of super-empirical virtues in theory choice
  • The appropriateness of probability and chance claims in the social sciences
  • The Causal Markov, Minimality and Faithfulness principles from the causal modelling literature
  • The distinction between Bayesian, Likelihoodist and Classical approaches to statistical inference

Events

Events 2021-22
Why do women leave philosophy?
5 May 2022 12:30 - 14:30, Room S1, Alison Richard Building, 7 West Road, Cambridge, CB3 9DT
The causal analysis of racial discrimination
19 May 2022 16:00 – 17:30, Room S1, Alison Richard Building, 7 West Road, Cambridge, CB3 9DT
‘Mixed methods’ workshop
6 Jun 2022 - 7 Jun 2022 09:00 - 17:00, Møller Institute, Storey's Way, Cambridge CB3
Events 2020-21
Understanding Gendered Violence: Session One
28 Apr 2021 2:00pm - 5:00pm, Online, via Zoom
Understanding Gendered Violence: Session Two
5 May 2021 2:00pm - 5:00pm, Online, via Zoom
Understanding Gendered Violence: Session Three
12 May 2021 3:00pm - 5:00pm, Online, via Zoom
Events 2019-20
Cumulative Social Science and the Question of Well-Formed Research Problems
23 Oct 2019 4:00pm - 6:00pm, Room S2, Alison Richard Building, Cambridge, CB3 9DT
Making Singular Risk Decisions
25 Oct 2019 4:00pm - 6:00pm, Room S2, Alison Richard Building, Cambridge, CB3 9DT
CANCELLED Rethinking Structural Violence: Gender, Testimony, and the Interpretive Wealth Gap
27 Nov 2019 5:30pm - 7:00pm, Kaetsu Centre Conference Room, Murray Edwards College, Huntingdon Road, Cambridge, CB3 0DF
CANCELLED Understanding Gendered Violence: The Value of Testimonial and Qualitative Evidence
28 Nov 2019 - 29 Nov 2019 All day, Kaetsu Centre Conference Room, Murray Edwards College, Huntingdon Road, Cambridge, CB3 0DF
POSTPONED Understanding Gendered Violence: The Value of Testimonial and Qualitative Evidence
6 Apr 2020 - 7 Apr 2020 All day, Kaetsu Centre Conference Room, Murray Edwards College, Huntingdon Road, Cambridge, CB3 0DF
Events 2018-19
Sex Work: What Can Quantitative and Qualitative Approaches Teach Us? – The Sociological Underground
8 Oct 2018 3:00pm - 4:00pm, CRASSH Meeting Room, Alison Richard Building, Cambridge, CB3 9DT
Sex Work: What Can Quantitative and Qualitative Approaches Teach Us? – Comparing Markets and Law
15 Oct 2018 3:00pm - 4:00pm, CRASSH Meeting Room, Alison Richard Building, Cambridge, CB3 9DT
Sex Work: What Can Quantitative and Qualitative Approaches Teach Us? – The Swedish Case Study
22 Oct 2018 3:00pm - 4:00pm, CRASSH Meeting Room, Alison Richard Building, Cambridge, CB3 9DT
Sex Work: What Can Quantitative and Qualitative Approaches Teach Us? – Creeping Neo-Abolitionism
29 Oct 2018 3:00pm - 4:00pm, CRASSH Meeting Room, Alison Richard Building, Cambridge, CB3 9DT
Sex Work: What Can Quantitative and Qualitative Approaches Teach Us? – Comparative Research Methods
5 Nov 2018 3:00pm - 4:00pm, CRASSH Meeting Room, Alison Richard Building, Cambridge, CB3 9DT
On the Trade-Offs of Causal Inference
7 Nov 2018 3:30pm - 5:00pm, Room SG1, Alison Richard Building, Cambridge, CB3 9DT
Sex Work: What Can Quantitative and Qualitative Approaches Teach Us? – Trafficking
12 Nov 2018 3:00pm - 4:00pm, CRASSH Meeting Room, Alison Richard Building, Cambridge, CB3 9DT
Youth and the Jihadist Temptation: Social Scientific Dilemmas on the Tunisian Street
15 Nov 2018 4:00pm - 5:30pm, Room S1, Alison Richard Building, Cambridge, CB3 9DT
Sex Work: What Can Quantitative and Qualitative Approaches Teach Us? – Paradigm Problems
19 Nov 2018 3:00pm - 4:00pm, CRASSH Meeting Room, Alison Richard Building, Cambridge, CB3 9DT
Sex Work: What Can Quantitative and Qualitative Approaches Teach Us? – Violence and Law
26 Nov 2018 3:00pm - 4:00pm, CRASSH Meeting Room, Alison Richard Building, Cambridge, CB3 9DT
Measuring Wellbeing by Eliciting Preferences? – What Preferences Really Are
22 Jan 2019 1:00pm - 2:00pm, CRASSH Meeting Room, Alison Richard Building, Cambridge, CB3 9DT
Measuring Wellbeing by Eliciting Preferences? – Theorising Child Wellbeing
29 Jan 2019 1:00pm - 2:00pm, Room S1, Alison Richard Building, Cambridge, CB3 9DT
Measuring Wellbeing by Eliciting Preferences? – Preference, Evidence and Welfare
5 Feb 2019 1:00pm - 2:00pm, CRASSH Meeting Room, Alison Richard Building, Cambridge, CB3 9DT
Measuring Wellbeing by Eliciting Preferences? – What are Adaptive Preferences?
12 Feb 2019 1:00pm - 2:00pm, Room S1, Alison Richard Building, Cambridge, CB3 9DT
Measuring Wellbeing by Eliciting Preferences? – Preference Purification
19 Feb 2019 1:00pm - 2:00pm, CRASSH Meeting Room, Alison Richard Building, Cambridge, CB3 9DT
Feminist Methodologies and Feminist Epistemologies
20 Feb 2019 4:00pm - 6:00pm, Room S2, Alison Richard Building, Cambridge, CB3 9DT
Measuring Wellbeing by Eliciting Preferences? – Taking their Word for it: Theories of Disability
26 Feb 2019 1:00pm - 2:00pm, CRASSH Meeting Room, Alison Richard Building, Cambridge, CB3 9DT
Measuring Wellbeing by Eliciting Preferences? – Is Wellbeing Possible in Illness?
5 Mar 2019 1:00pm - 2:00pm, CRASSH Meeting Room, Alison Richard Building, Cambridge, CB3 9DT
Measuring Wellbeing by Eliciting Preferences? – Evaluating the Quality of Dying and Death
12 Mar 2019 1:00pm - 2:00pm, CRASSH Meeting Room, Alison Richard Building, Cambridge, CB3 9DT

Selected Publications

Jack Wright, ‘Are economists’ self-perceptions as epistemically superior self-defeating?’ in A Modern Guide to Philosophy of Economics, ed. Harold Kincaid & Don Ross, (Cheltenham: Edward Elgar Publishing Limited, 2021): pp.127-145.

Anna Alexandrova, Robert Northcott & Jack Wright, ‘Back to the big picture’, Journal of Economic Methodology 28, 1 (January 2021): pp.54-59.

Christopher Clarke, ‘Functionalism and the role of psychology in economics’, Journal of Economic Methodology 27, 4 (July 2020): pp.292-310.

Rosie Worsdale & Jack Wright, ‘My objectivity is better than yours: contextualising debates about gender inequality’, Synthese 199 (September 2020): pp.1659-1683.

Jack Wright & Tiago Mata, ‘Epistemic Consultants and the Regulation of Policy Knowledge in the Obama Administration’, Minerva 58 (December 2020): pp.535-558.

Christopher Clarke, ‘The Correlation Argument for Reductionism’, Philosophy of Science 86, 1 (January 2019): pp.76-97.

CENTRE FOR RESEARCH IN THE ARTS, SOCIAL SCIENCES AND HUMANITIES

Tel: +44 1223 766886
Email: enquiries@crassh.cam.ac.uk