Photo: Crowd of people (abstract)
In the quest for maximum scientific precision, economics as a discipline has embraced quantitative methods as its modus operandi, claims Vijayendra Rao in his World Bank working paper, Can Economics Become More Reflexive?
A philosophical divide runs across the social science landscape, along the familiar fault line between methods, or rather, the choices of method. Economists are expected to use quantitative methods. Cultural anthropologists are supposed to be ethnographers, relying on qualitative methods such as participant observation.
Economics’ fixation with quantitative analysis and outcomes is a fairly recent, 20th-century phenomenon. [1] And the strides made in empirical economics over the last three decades, fondly referred to as the “credibility revolution”, have only accelerated and fortified this trend. [2] Econometrics, the lifeblood of applied economics, does two things remarkably well: (1) tease out causal relationships based on hypotheses, and (2) understand and overcome the challenges of making robust inferences about large populations from smaller samples.
What is missing in this detached, scientific method, crucial as it is for collecting and analyzing data at scale, is a true connection between researcher and researched; its absence ends up “accentuating the pre-existing social and cultural distance” between the two. The researcher observes from afar, without coming into close contact with the research subjects, and investigates from the perspective of their own lived reality rather than that of the people being studied.
The polar opposite of this approach is ethnographic research. The researcher immerses themself in one or a few communities, actively participating in and observing the daily lives and habits of the community for an extended period. [3] By focusing on processes (hard to quantify) rather than outcomes (eminently quantifiable) and prioritizing authentic description over accurate measurement, this approach attempts to bridge the gap between researcher and researched via “deep empathetic understanding, rather than maintaining what anthropologists see as a fictive analytic distance.”
The caveats associated with the second approach should be apparent to the reader by now (hint: they are the exact strengths of empirical economics). Methods like participant observation focus on a small number of communities and therefore do not yield representative data, making it difficult to draw generalizable conclusions.
Sociologist Michael Burawoy takes this dichotomy a step further in his 1998 paper The Extended Case Method and argues there are essentially two models of social science: positivist and reflexive. In the positivist approach, embodied by economists, the researcher limits their immersion in the world they study, insulating themself from the subjects, observing from the outside, and investigating through intermediaries (enumerators, for example). In sharp contrast, the reflexive approach, characterized by ethnography, “embraces engagement not detachment as the road to knowledge.”
The question is, can we have the best of both worlds in economic research?
Photo: A farmer cuts grass at a paddy field in Shariatpur, Dhaka
Rao in his 2022 paper suggests that we can, and he puts forward four interconnected principles that could help us achieve an idyllic union of research methods:
First is the notion of cognitive empathy. Sociologist Mario Small (2018), who coined the term, describes it as “the ability to understand a person’s predicament as they [themselves] understand it.” Walk a mile in their shoes, in other words. And to achieve this greater cognitive empathy, it is imperative to reduce the distance between the scholar and the subjects of the study, who are most likely to have very different socioeconomic backgrounds. This can be achieved only through qualitative enquiry. Cognitive empathy can help researchers form sound hypotheses and analytical frameworks for their quantitative studies and can better explain the findings, bringing nuance to the results.
This reorientation of research approach is not novel to the field of economics. T. Scarlett Epstein’s influential work in rural South India (1962) is a great example of the use of an eclectic mix of qualitative and quantitative data (field observations, interviews, and survey data) to study patterns of employment, agrarian relations, migration, caste, and social class. Instances of economists successfully mixing methods can be found sprinkled throughout the decades. [4] Still, economics tends to view insights from qualitative work as anecdotal rather than “hard” data, which deeply constrains how the available information can be used and the kinds of questions research can ask.
Second is the idea of narrative data. As the term suggests, information can take other forms, different from the statistical/numerical form generally used in applied economics. Information and insights can be gleaned from interviews, documents, texts, images, videos, and even other narrative forms. In fact, the survey method, where respondents are directed to pick precise responses from a set of predetermined options, is an “unnatural interaction that can lead to misinterpretation of what [the respondents] are trying to communicate.”
Narrative data is the bread and butter of sociologists. Michèle Lamont, in her book The Dignity of Working Men (2000), unpacked the “inner logic of racism” by examining the language and grammar of lower-middle-class white and black men in the United States and immigrants in France. The data was collected through two hours of open-ended interviews with each individual. Similar textual analyses of village deliberative meetings have been carried out in recent decades, where narrative data was judiciously recorded, transcribed, and analyzed. [5] However, all of these research projects had relatively small samples, indicative of the time-intensive nature of narrative data collection.
This is where natural language processing (NLP), which is the use of machine learning to analyze textual data, comes into play. Recent advancements in the field have allowed researchers, including economists, to study a larger variety of questions. [6] Since NLP can be used to statistically analyze a greater volume of narrative data with relative ease, it has expanded researchers’ ability to “make sense of high-dimensional textual data”, opening new vistas of research opportunity in social science. Mixed-method research, in particular, can gain a lot from utilizing such tools.
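To make the idea concrete, here is a minimal, purely illustrative sketch of the kind of analysis NLP enables: it uses Python’s scikit-learn library to extract rough “themes” from a handful of invented meeting-transcript snippets. Nothing here comes from the studies cited in this piece; the transcripts, the model choice (TF-IDF plus non-negative matrix factorization), and the parameters are all stand-ins.

```python
# Illustrative sketch only: toy "theme" extraction from hypothetical
# meeting-transcript snippets using scikit-learn (TF-IDF + NMF).
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import NMF

# Hypothetical stand-ins for transcribed narrative data.
transcripts = [
    "The council discussed repairing the irrigation canal before the monsoon.",
    "Residents asked why funds for the school building had not been released.",
    "Several speakers raised the shortage of drinking water in the hamlet.",
    "The meeting debated how road repair contracts were awarded last year.",
]

# Turn free-form text into a weighted term matrix.
vectorizer = TfidfVectorizer(stop_words="english")
term_matrix = vectorizer.fit_transform(transcripts)

# Factorize the matrix into a small number of latent themes.
model = NMF(n_components=2, random_state=0)
model.fit(term_matrix)

# Show the words that weigh most heavily in each theme.
terms = vectorizer.get_feature_names_out()
for i, weights in enumerate(model.components_):
    top_terms = [terms[j] for j in weights.argsort()[::-1][:5]]
    print(f"Theme {i + 1}: {', '.join(top_terms)}")
```

In a real mixed-method study the same logic is applied to thousands of transcripts rather than four sentences, which is precisely what makes such tools attractive for analyzing narrative data at scale.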
The use of NLP, however, does not automatically make economic analysis more reflexive. For example, a systematic review of 104 studies across disciplines investigating “hate speech, and racial and gender bias in social media” by Matamoros-Fernández and Farkas (2021) finds an “absence of researchers’ reflexive dialogue with their object of study.” The verdict is clear: the onus is on the researcher to genuinely immerse themself in the study environment and the research participants’ lives.
Third, a step back from the seemingly singular focus on outcomes in empirical economics will require increased attention to studying processes. Here again, there are examples of recent work which showcase the benefits of sensibly mixing methods. [7]
To illustrate this point, Rao cites his own work on testing an intervention to improve the quality of citizen engagement in rural India (Rao, Ananthpur, and Malik, 2017). The team surveyed households and key informants in a hundred randomly selected villages before and after the intervention, and a 10% sub-sample of villages was chosen for qualitative study (half in treatment, half in control). Five trained ethnographers lived in the villages and conducted monthly participant interviews over two years. The qualitative data informed the development of the quantitative survey instruments. Interestingly, despite a lack of difference in outcomes between treatment and control groups, the ethnography helped the researchers understand why the intervention failed (variations in the quality of facilitation, lack of top-down support, and difficulties in confronting persistent inequality).
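For readers who think in code, the sampling design described above can be sketched in a few lines. This is a toy illustration under stated assumptions, not the authors’ actual procedure; the village identifiers and the random seed are invented.

```python
# Toy illustration (not the authors' code) of the design described above:
# 100 villages randomly assigned to treatment or control, with a 10%
# qualitative sub-sample drawn evenly from the two arms.
import random

random.seed(0)  # fixed seed so the illustration is reproducible

villages = [f"village_{i:03d}" for i in range(1, 101)]  # hypothetical IDs
random.shuffle(villages)

treatment, control = villages[:50], villages[50:]  # random assignment
qualitative_sample = random.sample(treatment, 5) + random.sample(control, 5)

print("Villages with embedded ethnographers:", sorted(qualitative_sample))
```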
The fourth and final piece of the puzzle is the participation of respondent as analyst. All of the above points extol the benefits of reducing the distance between researcher and researched. The idea of considering the research subject as an active participant or co-producer of the research takes the concept of cognitive empathy to its logical conclusion: the researcher and researched are now on an equal footing, equal partners working together seamlessly to investigate and uncover the truth. However, the direct participation of respondents in research is an approach yet to be fully explored.
If a successful marriage of methods is an end we can strive towards (and Rao believes we both can and should), then the dichotomy between positivist and reflexive social science is a false one. In that case, the attempt to find a middle ground would be a worthwhile exercise for economists.
[1] Charles Booth’s seminal multi-volume work Life and Labour of the People in London (1892–97) triangulated household surveys, participant observation, and open-ended interviews, combining quantitative and qualitative data collected by Booth and his team to create detailed “poverty maps” covering London’s four million residents.
[2] The credibility revolution was the movement towards improved reliability in empirical economics through a focus on the quality of research design and the use of more experimental and quasi-experimental methods. Developing in the 1990s and early 2000s, this movement was aided by advances in theoretical econometric understanding, but it was especially driven by research studies that focused on the use of clean and credible research designs.
[3] Anthropologist Bhrigupathi Singh, in his ethnography, Poverty and the Quest for Life (2015), describes the ethnographic method as “a two-phase process…In the first phase, one is a hunter-gatherer, pursuing targets, collecting impressions. The next stage of labour is of a more settled cultivator, as we move from impressions to expressions…When impressions are organized and attached to concepts, they turn into thoughts and expression.”
[4] Economic theorists Christopher Bliss and Nicholas Stern (1982), inspired by Epstein, spent eight months in a North Indian village studying land tenure arrangements, labour and credit markets, and risk and uncertainty. Their conversations, experiences, and observations informed the development of new theoretical models as well as the nature of the survey data they collected. Christopher Udry, another development economist heavily influenced by anthropology and ethnography, describes his method as “iterative field research” (Udry, 2003). During their fieldwork in Ghana, Udry and Markus Goldstein realized that otherwise similar plots owned by women were less productive because women were not able to “invest” in the land by keeping it fallow: they had to keep the plots constantly cultivated to avoid having them taken away. This qualitative finding proved crucial in their attempt to demonstrate that institutional factors affect economic efficiency (2008).
[5] Village deliberative meetings are spaces where citizens can speak freely about public issues and work with the village council to make decisions on public goods and resources. Sanyal and Rao (2019) recorded 300 village meetings in South India. Their textual analysis showed that the meetings varied from state to state, even though the villages belonged to the same linguistic community, because of differences in state government policy. Collins, Morduch, Rutherford, and Ruthven (2009) studied the money management of people living on less than two dollars a day by conducting open-ended, narrative interviews of 250 households instead of a conventional survey collecting data on credit, savings, and assets. The authors reconstructed balance sheets and cash-flow statements from the interviews to find patterns that conventional surveys had not captured.
[6] Parthasarathy et al. (2019) used NLP methods to analyze 100 transcripts of village meetings, an endeavour that took six months with machine learning rather than the ten years manual analysis would have required. Gentzkow et al. (2019) studied transcripts of speeches in the US Congress to develop an estimator measuring the degree of partisanship between 1873 and 2016.
[7] In his ethnography Streetwise, Elijah Anderson (1990) distils fourteen years of fieldwork between 1975 and 1989 studying interactions between whites and blacks in the Philadelphia neighbourhood he lived in and a neighbouring, largely black, area. Drawing on Erving Goffman’s work on stigma (1963) and strategic interaction (1969), he paints a picture of how Reagan-era cuts to public spending, and the decline of the city’s industrial base, led to shifts in the way the two neighbourhoods interacted.
Eradul Kabir is a Communications and Knowledge Management Officer at the BRAC Institute of Governance and Development (BIGD).
Photo 1 from Max Pixel, used under CC0 1.0 Universal.
Photo 2 by Syed Rifat Hossain, used under the Unsplash licence.