Measuring Susceptibility to Misinformation

Smartphones around the world have become an increasingly important access point to the internet, and therefore information. In 2022, 85% of broadband connections across low- and middle-income countries were on mobile devices [1]. The near omnipresence of smartphones in households across the world could fundamentally shift the way that we access information. However, quantitatively capturing susceptibility to misinformation in a survey is a difficult task, one that requires both an understanding of local misinformation as well as a tactical approach to sensitivities around reporting. Over the past four years, our team at Inclusion Economics has sought to better understand how women engage with mobile technology, and what interventions can support women’s technological engagement in a shifting digital landscape. With funding from the WEE-DiFine Initiative, we are implementing an innovative approach to measuring susceptibility to misinformation, or “fake news,” among rural residents in Chhattisgarh, India.

Through exploratory qualitative interviews with rural residents of Raipur district, it became increasingly apparent that local citizens with whom we spoke were aware of the existence of “fake news” and even had their own identification heuristics. Some reported skepticism about news shared through WhatsApp forwards, while others indicated that they only trusted vernacular newspapers and hyper-localized news apps regarding nearby incidents. In addition, many respondents identified viral or sensational news as untrustworthy. Our scoping also indicated that respondents were indeed vulnerable to “fake news”, as they often reported viral “fake news” headlines as true (e.g., eating garlic helps prevent COVID-19). The scoping process revealed that we needed to further explore patterns of news consumption and trust in sources of information among our study population.

Example of an infographic used in the misinformation module. The text reads, “Mobile phone towers spread COVID-19.” | Photo Credit: Inclusion Economics India Centre at IFMR

With qualitative insights in hand, our team further tailored the module to the local context, accounting for the economic conditions, political landscape, norms, literacy levels, and noteworthy regional events in Chhattisgarh. The final module consists of a series of meme-style infographics related to COVID-19. During interviews, participants were shown the infographics and asked to determine the likelihood that each one was true. We focused the misinformation module on COVID-19, rather than on politics or financial scams, because misinformation regarding COVID-19 was widespread, especially on social media apps, but not politicized, and ensuring the safety of our field team was paramount.

We also conducted significant scoping to finalize answer choices. For example, we experimented with asking respondents to report a percentage likelihood that a story was true. However, this request proved too complex and abstract. Respondents understood Likert scales better, particularly when the scales had fewer answer choices. As a result, we settled on a 5-point scale, ranging from 1 (completely false) to 5 (completely true).

Despite challenges, our team successfully implemented a module capturing susceptibility to misinformation alongside our evaluation of the spread of mobile technology and internet access in Chhattisgarh, India. Keys to success included substantial formative work that identified widespread and non-political rumours, as well as a design sensitive to respondents’ comfort levels regarding estimation. Additional lessons learned are presented below.

Important lessons from piloting the misinformation module

  • Brevity wins. Most respondents could not read text-heavy infographics and grew impatient when the enumerators read them aloud. Simple, concise infographics were more comprehensible for respondents.
  • Align infographics with the local context. Some infographics referred to national politics or pertained to other states in India. Our respondents were unfamiliar with this content and were unwilling to comment on its veracity. They were far more comfortable assessing news relevant to Chhattisgarh state.
  • Identify translation gaps. Infographics related to health that mentioned words like menopause and migraines did not translate well into the local languages and were, therefore, difficult for respondents to understand. When translating the infographics, a local expert should be consulted to ensure that the language makes sense in a local context.
  • Stay aware of regional politics. Because elections were upcoming in Chhattisgarh state during our pilot phase, we had to be careful about politically motivated infographics. This caution was important to prevent our activity from being misinterpreted as partisan, which could have created unsafe working conditions for our field staff and exacerbated pre-existing political tensions.
  • Weigh granularity versus accuracy. Respondents generally had difficulty understanding more granular Likert scales (e.g., we experimented with an 11-point scale). More complex scales elicited a greater proportion of extreme or mid-point values.

Finalizing the module

In total, the misinformation module included three “true” and four “fake” news infographics, generating a sufficient number of data points to infer susceptibility to misinformation. The module ended by “debiasing” the respondents, informing them of which infographics were true and which were false. This step aimed to curtail the spread of misinformation, which was important for upholding the ethical standards of the research. Responses to all seven news items will be converted into binary variables (correct versus incorrect identification) and compiled into a misinformation index. This index will then be normalized against the control group mean in our analysis.
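The scoring described above can be sketched in code. This is a minimal illustration, not our actual analysis pipeline: the item names, the rule that Likert responses of 1–2 count as judging an item "false" and 4–5 as "true" (with the midpoint scored incorrect), and standardizing by the control group's mean and standard deviation are all assumptions made for the example.

```python
import pandas as pd

# Hypothetical item labels: True marks a genuine news item,
# False marks a fake one (three true, four fake, as in the module).
ITEM_TRUTH = {f"item_{i}": truth for i, truth in
              enumerate([True, True, True, False, False, False, False], 1)}

def to_binary(response, is_true):
    """Code a 5-point Likert response (1 = completely false,
    5 = completely true) as 1 if the respondent judged the item
    correctly, else 0. Treating 1-2 as 'false' and 4-5 as 'true'
    is an assumption; the midpoint (3) is scored as incorrect here."""
    if is_true:
        return int(response >= 4)
    return int(response <= 2)

def misinformation_index(df, control_mask):
    """Average the seven binary scores into a raw index, then
    standardize against the control group (assumed mean/SD scaling)."""
    binaries = pd.DataFrame({
        col: df[col].apply(to_binary, is_true=truth)
        for col, truth in ITEM_TRUTH.items()
    })
    raw = binaries.mean(axis=1)
    ctrl = raw[control_mask]
    return (raw - ctrl.mean()) / ctrl.std()
```

A respondent who rates every true item 4–5 and every fake item 1–2 scores a raw index of 1.0; one who does the reverse scores 0.0. Standardizing against the control group then expresses each respondent's score in control-group standard deviations.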

Measuring Misinformation in Rural Chhattisgarh, India | Photo Credit: Inclusion Economics India Centre at IFMR

What’s Next?

In collaboration with our data collection partner, we recently completed interviews, which included the misinformation module, with over 20,000 women and men across 13 districts in Chhattisgarh. The research team will dedicate the coming months to cleaning and analyzing this novel survey data, which will enable us to shed light on whether and how phones unlock access to financial services, change economic outcomes, and affect women’s and men’s ability to identify misinformation.

 


[1] GSMA. (2023). The Mobile Gender Gap Report: 2023.
