Adapting Data Collection Methods During COVID-19 to Generate and Communicate Insights

Challenge

Restrictions brought on by COVID-19 delayed or halted travel and gatherings across the globe. For evaluation specialists, these restrictions introduced significant challenges and, in some cases, effectively stopped in-person monitoring and evaluation (M&E) activities. In March 2020, Guidehouse was at the height of data collection for a global evaluation commissioned by the U.S. Department of State's (DOS) Bureau of Educational and Cultural Affairs (ECA) when we had to pivot quickly to address these obstacles and maintain momentum toward completing the evaluation. The challenge became: beyond the logistical troubleshooting, how do you continue data collection activities virtually while the world adjusts to events so dramatic and emotional in so many people's lives?

Background

In October 2018, ECA commissioned Guidehouse to evaluate the Mandela Washington Fellowship for Young African Leaders, the flagship program of the Young African Leaders Initiative (YALI). The program’s purpose is to invest in the next generation of African leaders as they spur growth and prosperity, strengthen democratic governance, and enhance peace and security across sub-Saharan Africa. This evaluation sought to understand the impact the Mandela Washington Fellowship has had on its alumni, their home communities, and U.S. communities since its inception in 2014.

The evaluation design included collecting quantitative and qualitative primary data from a variety of sources, including surveying and interviewing Fellows, U.S. organizers of the Fellowship, U.S. community members who engaged with the Fellowship, and individuals from Fellows’ home communities.   

Additionally, the Guidehouse evaluation team conducted social media research and analysis (SMRA) to understand and identify trends.

The Guidehouse team conducted international in-person interviews between August 2019 and November 2019. In March 2020, when U.S.-based data collection (surveys and interviews) was scheduled to start, the team was grounded. Guidehouse and ECA knew that, for the safety of communities and the research team, interviews would have to be conducted virtually.

Solution

To address the constraints of shifting to all-virtual data collection, Guidehouse and ECA responded proactively, adjusting our approach to account for the new operating environment while keeping empathy a central focus. Below are three key steps we took to ensure the evaluation could continue in line with its original timeline.

  • Develop a Virtual Engagement Approach: Given the uncertainty around lockdowns and travel, Guidehouse and ECA had to quickly create a contingency plan for this once-in-a-century event. We reviewed the data collection instruments to ensure we could administer all questions within the allotted time (as virtual interviews can be taxing). Additionally, we extended the survey response timeframe to allow ample time and space for individuals to provide their thoughts. Further, we communicated quickly and consistently with key stakeholders to assure them that data collection was continuing and that we would remain flexible as issues arose. In particular, because we were working with university-based populations, maximum flexibility was required amid quickly shifting events around the transition to virtual learning.
  • Use Familiar Tools: As the world was getting used to virtual methods of connecting in the early days of the pandemic, we understood that a thoughtful and simple approach would be best for evaluation participant engagement. We reviewed existing tools and identified solutions that could help us seamlessly connect with our most important audiences. This included providing multiple options for virtual platforms to mitigate technical challenges and learning curves for unfamiliar systems.
  • Adapt to Participants’ Needs: During an uncertain time, we aimed for maximum flexibility to support our respondents. While several steps were taken, key actions included:
      • Expanding ‘virtual field visits’ from a few days to a week to provide a larger scheduling window for respondents.
      • Providing consent forms via email so respondents could review them before or during interviews.
      • Conducting frequent check-ins between Guidehouse and ECA to brainstorm solutions to emerging challenges.

Impact

  • Completion of key tasks on time and within budget: Despite changes to the original evaluation design and presumed operating environment, the evaluative findings, conclusions, and recommendations were produced and delivered to ECA on time.
  • Increased Response Rate: On average, the evaluation team saw higher-than-anticipated response rates for U.S. interviews, compared with the rates projected for the originally planned in-person interviews.
  • Calendar Availability: Coordinating interviews was more flexible, as interviewees were no longer limited by logistical hurdles or travel time, and the evaluation team had a broader window in which to schedule interviews.
