Exploring Our International Research Capabilities
Mathematica specializes in impact evaluations using experimental and quasi-experimental designs. Over the past 40 years, we have formulated and customized random assignment procedures to meet the needs of evaluations of diverse programs and demonstrations. In numerous projects, we have helped programs in the United States conduct random assignment at the individual, institutional, and other programmatically relevant levels. More recently, we have broadened this experience to other countries as well. For an evaluation of farmer training in Armenia, we randomly assigned villages to a training intervention using a phased-in approach. The random selection was done electronically in a public setting, in front of village mayors and key stakeholders, to ensure transparency and secure buy-in from all constituencies. In El Salvador, we are helping implement random assignment of students to a scholarship program to assess its impact on educational attainment. When random assignment is not possible, we help our clients identify the most rigorous study design feasible, given financial and programmatic considerations. For example, we are using regression discontinuity methods in evaluations in Jamaica, Burkina Faso, and Armenia, and in a study in Mexico, we used propensity score matching to identify a relevant comparison group.
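A phased-in village randomization of the kind described above can be sketched in a few lines of Python. This is an illustrative sketch only, not Mathematica's actual procedure; the village names, number of phases, and seed are hypothetical. Using a fixed seed makes the draw reproducible, which supports the kind of transparent, auditable public selection described for Armenia.

```python
import random

def assign_phases(villages, n_phases=2, seed=2011):
    """Randomly order villages and split them into treatment phases.

    In a phased-in design, every unit eventually receives the
    intervention; randomizing the order lets later phases serve as
    a comparison group for earlier ones.
    """
    rng = random.Random(seed)          # fixed seed: reproducible, auditable draw
    order = villages[:]
    rng.shuffle(order)
    size = -(-len(order) // n_phases)  # ceiling division
    return {f"phase_{i + 1}": order[i * size:(i + 1) * size]
            for i in range(n_phases)}

# Hypothetical village list for illustration
villages = ["Aragats", "Byurakan", "Karbi", "Oshakan", "Parpi", "Ujan"]
phases = assign_phases(villages)
```

Because the assignment is a deterministic function of the seed, stakeholders observing the public draw can rerun it and verify that no village's assignment was altered afterward.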
Our sampling statisticians are experts in developing and implementing sample designs for surveys of individuals, programs, and schools. Our staff have developed sampling weights that account for multiplicity in the sample frame and for nonresponse. We have applied this sampling expertise in other countries as well as in projects in the United States. For example, Mathematica helped design the oversampling of rural households as part of the Integrated Survey of Living Standards (ISLS) national household survey in Armenia, which is used to generate national poverty estimates. Although finding an appropriate sample frame can be challenging in developing countries, our staff have worked with local stakeholders to identify sample frames that address the relevant study questions. For example, in Burkina Faso, we designed a procedure to build a sampling frame in rural villages and a process for selecting a sample stratified by income. Similarly, for the Farmer Practices Survey in Armenia, Mathematica worked with local staff and data collectors to develop a strategy for constructing a sample frame of eligible farmers in villages.
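The design-weighting logic referred to above can be illustrated with a minimal sketch: each respondent's base weight is the inverse of the selection probability within its stratum, inflated by a within-stratum nonresponse adjustment. The strata, counts, and the assumption that nonresponse is adjusted within stratum are hypothetical simplifications for illustration, not the ISLS design.

```python
def base_weight(pop_size, n_sampled):
    """Inverse of the selection probability within a stratum."""
    return pop_size / n_sampled

def nonresponse_adjust(weight, n_sampled, n_responded):
    """Inflate weights so respondents also represent nonrespondents."""
    return weight * n_sampled / n_responded

# Hypothetical strata: (population, households sampled, households responding).
# Rural households are oversampled relative to their population share.
strata = {
    "urban": (10_000, 500, 450),
    "rural": (4_000, 400, 320),
}

weights = {
    name: nonresponse_adjust(base_weight(pop, n), n, r)
    for name, (pop, n, r) in strata.items()
}
```

A quick check on such weights is that the weighted count of respondents in each stratum recovers that stratum's population total, so national estimates are not distorted by the deliberate rural oversample.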
Mathematica has the staff and systems necessary to ensure high-quality data collection. Our international data collection work spans designing survey instruments and methodology, developing and delivering training for local data collection and entry staff, and building data collection and monitoring systems that meet high standards. We tailor our quality assurance approach to the resources and technological constraints of our clients. In our evaluation of Jamaica’s social safety net initiative, the Programme of Advancement Through Health and Education (PATH), we provided technical assistance to the Statistical Institute of Jamaica on instrument development and field procedures to maximize response rates in a longitudinal survey of 5,000 households. In a study evaluating interventions to improve girls’ education in Burkina Faso, we subcontracted with a local data collection firm to collect relevant data. For the Farmer Practices Survey in Armenia, we helped develop terms of reference for the data collection firm selected by the local evaluation team, assisted in developing the instrument, and provided guidance on sampling strategy and approaches to data collection.
Mathematica is committed to building local capacity in all aspects of survey research—instrument development, data collection design, training, and survey systems—by training local staff and providing new tools to achieve desired outcomes. We work collaboratively with local institutions, government agencies, and consultants in different countries to train staff, build capacity for program monitoring and rigorous evaluation of social programs, and give stakeholders the knowledge they need to support evaluation strategies. For staff in Mexico’s Social Development Ministry, we conducted a one-day course on impact evaluation design. For our study of the Millennium Challenge Account Program in Armenia, we conducted presentations for government staff and other key stakeholders to build their acceptance of randomized experiments as a rigorous evaluation method. We also trained Armenia’s National Statistical Service staff on best practices for collecting high-quality household survey data. Our study for Mexico’s Human Development Ministry provided technical assistance to local evaluators on evaluation design and other technical issues. As part of our work on the evaluation of Jamaica’s PATH safety net reform, we provided technical assistance in instrument development and field procedures for staff at the Statistical Institute of Jamaica, the local agency responsible for data collection.