ALSE conducted Executive Seminars with participants from the Afghan Ministry of Education (MoE) and NGO education staff. These seminars provided basic training in research design and methods. In the spring of 2016, ALSE conducted an individual-level assessment of the research capacities of the MoE's Research and Evaluation Unit (R&EU).
Executive Seminars
First Executive Seminar: Introduction to Field Experiments
This five-day seminar was conducted in coordination with NYU Abu Dhabi in October 2013 and built on and expanded the training that some MoE participants had already received from either the World Bank Strategic Impact Evaluation Fund (SIEF) program in New Delhi or the International Institute for Educational Planning (IIEP) training in Paris.
Professors Burde, Middleton, and Samii provided introductory training on how to design and implement field experiments. Topics included avoiding selection bias through randomization, units and methods of randomization, defining and measuring outcomes, and using a mixed-methods approach in field experiments. Through case-study group work, participants had an opportunity to apply what they had learned and design a randomized controlled trial (RCT). The course also included a visit to the Abu Dhabi Education Council to learn about regional colleagues' research projects, as well as a tour of New York University's Abu Dhabi campus and a discussion with David McGlennon, NYU Abu Dhabi Vice Provost of Research Administration and University Partnerships.
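As a concrete illustration of the randomization topics covered above (this sketch is not part of the seminar materials, and the village identifiers are purely hypothetical), the short Python example below randomly assigns villages, the unit of randomization, to treatment and control groups. Because assignment is determined by chance alone, treatment status cannot depend on the characteristics that would otherwise cause selection bias.

```python
import random

# Hypothetical village identifiers -- purely illustrative, not ALSE data.
villages = [f"village_{i:02d}" for i in range(1, 21)]

random.seed(2013)        # fixed seed so the assignment can be reproduced
random.shuffle(villages)

half = len(villages) // 2
treatment = sorted(villages[:half])  # e.g., villages offered the intervention
control = sorted(villages[half:])    # comparison villages

print("Treatment:", treatment)
print("Control:  ", control)
```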
Presentation Slides
- The Importance of Mixed Methods in Field Experiments
- Day 1: Introduction to Impact Evaluation
- Day 2: Selection Bias
- Day 3: Randomized Control Trials
- Day 4 (Session 1): Defining and Measuring Outcomes
- Day 4 (Session 2): Mixed Methods Approaches
- Day 5: Using Impact Evaluation Evidence
Second Executive Seminar: Qualitative Methods
Professor Burde and two senior research scholars from NYU provided a three-day training in qualitative methods for select MoE and NGO staff in Kabul in December 2013. Participants learned how to develop questions best suited for qualitative research, as well as how to collect and analyze qualitative data.
Presentation Slides
Readings
- LeCompte, M. D., & Schensul, J. J. (1999). Collecting Ethnographic Data (Vol. 1), Designing and conducting ethnographic research (pp.127-146). Rowman Altamira.
- Rubin, H. J., & Rubin, I. S. (2011). Designing Research for the Responsive Interviewing Model, Qualitative interviewing: The art of hearing data (pp. 41-57). Sage. Chicago
- Small, M. L. (2009). 'How many cases do I need?' On science and the logic of case selection in field-based research. Ethnography, 10(1), 5-38.
Third Executive Seminar: Capacities and Best Practices in Evaluation
The most recent seminar, a five-day Executive Seminar held in early March 2015, focused on randomized controlled trials (RCTs) as an ideal method for evaluating education interventions. It included a review of materials from the first seminar (Introduction to Field Experiments), as well as case studies of three RCTs conducted to assess education interventions.
Participants also watched webinars about two RCTs (INSIGHTS into Children's Temperament and Opportunities for Equitable Access to Quality Basic Education).
Presentation Slides
Professor McClowry's Webinar Videos
- Part 1: Introduction to INSIGHTS
- Part 2: Does INSIGHTS work?
- Part 3: Measurement tools, data collection, and analyses
- Part 4: Findings
- Interview with Professor McClowry
Readings
- Burde, D. & Linden, L. (2013). Bringing Education to Afghan Girls: A Randomized Controlled Trial of Village-Based Schools. American Economic Journal: Applied Economics. 5(3): 27-40.
- O'Connor, E. E., Cappella, E., McCormick, M. P., & McClowry, S. G. (2014). An examination of the efficacy of INSIGHTS in enhancing the academic learning context. Journal of Educational Psychology, 106(4), 1156-1169.
- Torrente, C., Aber, J. L., Witteveen, D., Gupta, T., Johnston, B., Shivshanker, A., Annan, J., & Bundervoet, T. (2013). Baseline report: Teacher survey results. Unpublished manuscript.
Research Capacity Assessment Resources
The overall goal of the research capacity assessment was to identify and prioritize areas for improving the research and evaluation skills of the Ministry of Education's Research and Evaluation Unit (R&EU) staff. The assessment was based on the following materials:
ALSE asks that you cite the assessment materials using the following format:
Burde, D., J. Middleton, and C. Samii. 2016. Assessment of Learning Outcomes and Social Effects of Community-Based Education: A Randomized Field Experiment in Afghanistan. [Insert assessment material name]. New York: Steinhardt School, New York University.