Conference Agenda

Overview and details of the sessions of this conference.

Please note that all times are shown in the time zone of the conference.

Session Overview
Session: ID 733 - Experimental Impact Analysis for Program Improvement
Time: Friday, 10 June 2022, 12:40pm - 1:40pm

Location: Room - Karavanen 8

Session Themes:
Theme 4: Methodological shift: transforming methodologies

Presentations
ID: 733
Meet the Authors (Submission of abstracts 2022)
Themes: Theme 4: Methodological shift: transforming methodologies
Keywords: experimental evaluation, program improvement, adaptation, transformation, learning

Experimental Impact Analysis for Program Improvement

Chair(s): Laura R. Peck (Abt Associates, United States of America)

Discussant(s): Laura R. Peck (Abt Associates)

The Editors’ introduction to this book states the following: “Impact evaluation is central to the practice and profession of evaluation. Emerging in the Great Society Era, the field of evaluation has deep roots in the social experiments of large-scale demonstration programs—Campbell’s utopian ideas of an Experimenting Society. Since then, the fervent search for “what works”—for establishing the impact of social programs—has taken on many different forms. From the early emphasis on experimental and quasi-experimental designs, through the later emergence of systematic reviews and meta-analysis, and onwards to the more recent and sustained push for evidence-based practice, proponents of experimental designs have succeeded in bringing attention to the central role of examining the effectiveness of social programs (however we choose to define it). There is a long and rich history of measuring impact in evaluation.

The landscape of impact evaluation designs and methods has grown and continues to grow. Innovative variants of and alternatives to traditional designs and approaches continue to emerge and gain prominence, addressing not only “what works” but also “what works, for whom, and under what circumstances” (Stern et al., 2012). For the novice (and perhaps even the seasoned) evaluator, the broadening array of designs and methods, not to mention the dizzying array of corresponding terminology, may evoke a mixed sense of methodological promise and peril, opportunity and apprehension. How can randomization be applied across multiple treatments, across multiple treatment components, and across stages of a program process? What exactly is the difference between multistage, staggered, and blended impact evaluation designs? And are there any practical and methodological considerations that one should pay particular attention to when applying these designs in real-world settings?

These are but a few of the questions answered in Laura Peck’s Experimental Evaluation Design for Program Improvement. Grounded in decades of scholarship and practical experience with real-world impact evaluation… this book is an important contribution to the growing landscape of impact evaluation. With her aim to identify a broader range of designs and methods that directly address causal explanation of “impacts,” Peck opens new frontiers for impact evaluation. Peck directly challenges, and correctly so, the longstanding perception that experimental designs are unable to get inside the black box of how, why, and for whom social programs work.”

The objective of this meet-the-author session is to familiarize a broader audience with the methods detailed in the book, in the hope that they will prove useful in evaluation practice.

Biographies
Laura R. Peck is a Principal Scientist at Abt Associates, the author of Experimental Evaluation Design for Program Improvement (2020, SAGE Publishing), and a co-author of a public policy textbook; she has published more than 40 peer-reviewed journal articles. Dr. Peck specializes in innovative ways to estimate program impacts in experimental and quasi-experimental evaluations.

