Dr. Elizabeth Tipton is an Associate Professor of Statistics, a Faculty Fellow in the Institute for Policy Research, and the co-Director of the Statistics for Evidence-Based Policy and Practice (STEPP) Center at Northwestern University. Her research focuses on the development of methods for improving generalizations from experiments, both through the design and analysis of field experiments and through the use of meta-analysis.
Dr. Rob Olsen is a Senior Fellow at the George Washington Institute of Public Policy at George Washington University. His research focuses on the generalizability of randomized trials in education. Olsen is the Principal Investigator of a study testing different sampling methods for selecting sites for impact studies, and he is the co-Principal Investigator of a study testing different regression methods for predicting the impacts of educational interventions in individual schools using data from multi-site randomized trials. He also consults for national research organizations on how to design and implement impact studies for greater generalizability.
In studies that evaluate the impacts of educational interventions, educators and policymakers typically want to know whether an intervention improved outcomes. However, researchers cannot provide a simple yes/no answer because all impact estimates carry statistical uncertainty. Researchers have traditionally used statistical significance and p-values to characterize this uncertainty, but these statistics are often misinterpreted and cannot tell educators and policymakers how likely it is that the intervention had an effect.
This webinar lays out a Bayesian framework for interpreting impact estimates without the pitfalls of relying only on p-values. This framework allows researchers to calculate the probability that an intervention had a meaningful effect, given the impact estimate and prior evidence on similar interventions. The webinar will explain key concepts and introduce an easy-to-use Excel tool for applying this framework. With these concepts and tools, researchers can draw more accurate and interpretable lessons from impact findings to support informed decisions about educational interventions. The webinar is based on IES’s recently released guide on this topic.
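To make the calculation concrete, the sketch below shows one common way such a framework combines an impact estimate with prior evidence: conjugate normal-normal updating. This is a minimal illustration, not the guide’s Excel tool; the function name, inputs, and example numbers are assumptions for illustration, with effects expressed in standard deviation units.

```python
from math import erf, sqrt

def posterior_prob_meaningful(est, se, prior_mean, prior_sd, threshold=0.0):
    """Probability that the true effect exceeds `threshold`, combining a
    normal prior (summarizing evidence on similar interventions) with a
    normally distributed impact estimate via conjugate updating."""
    prior_prec = 1.0 / prior_sd**2   # precision of the prior evidence
    data_prec = 1.0 / se**2          # precision of the new impact estimate
    post_var = 1.0 / (prior_prec + data_prec)
    post_mean = post_var * (prior_prec * prior_mean + data_prec * est)
    # P(true effect > threshold) under the normal posterior
    z = (threshold - post_mean) / sqrt(post_var)
    return 1.0 - 0.5 * (1.0 + erf(z / sqrt(2.0)))

# Illustrative numbers only: an estimate of 0.15 SD with SE 0.10, combined
# with prior evidence centered at 0.05 SD (SD 0.10), gives roughly a 0.92
# probability that the true effect is positive.
print(round(posterior_prob_meaningful(0.15, 0.10, 0.05, 0.10), 3))
```

Note how this reframes the question: instead of asking whether the estimate is statistically significant, the output is a direct probability statement about the effect, which is the kind of quantity educators and policymakers can act on.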
Dr. John Deke is an economist and senior fellow at Mathematica. His work has focused primarily on the statistical methodologies used in impact evaluations and systematic reviews, especially in K-12 education.
Dr. Mariel Finucane is a principal statistician at Mathematica. Her work uses Bayesian hierarchical modeling and tree-based methods to study social policies.
In the era of open science practices, researchers are more frequently being asked to share data from their studies. Multiple federal agencies and private funders have added policies and requirements for their grantees and contractors to share study data. Consistent with this priority and the Standards for Excellence in Education Research (SEER), the Institute of Education Sciences (IES) requires researchers to make data from IES-funded studies available to others, while also protecting the rights of study participants and meeting the requirements of data providers. Such policies, along with growing interest in open science practices, mean that many education researchers are looking for practical, achievable strategies for making their data available to others.
In this 90-minute webinar, Ruth Neild from Mathematica will present an overview of IES’s guide, Sharing Study Data: A Guide for Education Researchers, and a panel of researchers will share their expertise and experiences with three key aspects of sharing study data.
The session will include a panel discussion and audience Q&A.
High-quality impact studies of education interventions provide the strongest evidence about whether interventions improve academic outcomes. Yet information about whether and by how much a tested intervention improves outcomes is only part of the story. To learn why and how impact findings vary and to support the broader use of effective interventions, educators need to understand how, and under what conditions, an intervention was implemented. High-quality implementation research can contribute to this understanding. To encourage this important work, the Institute of Education Sciences (IES), through the Standards for Excellence in Education Research (SEER), offers key recommendations for documenting treatment implementation and contrast.
In this 90-minute webinar, Carolyn Hill from MDRC and Lauren Scher from Mathematica will present an overview of IES’s recently released guide, Conducting Implementation Research in Impact Studies of Education Interventions. They will be joined by a panel of researchers who will share their experiences conducting implementation research as part of high-quality impact studies of education interventions.
The session will include a panel discussion and audience Q&A.
High-quality impact studies of education interventions provide the strongest evidence about whether interventions improve academic outcomes. Information about the contexts in which interventions are implemented may provide further insight into impact findings, such as why impacts occurred in some sites but not others. Practitioners also want to know the contexts in which an intervention was studied so they can assess whether it is a good fit for their situation. Yet the possible context features to collect, analyze, and report in a study are numerous and are likely to differ from intervention to intervention and from study to study. This leaves researchers with the daunting task of deciding which context features to prioritize.
In this webinar, Carolyn Hill from MDRC and Lauren Scher from Mathematica will describe a general ‘4R lenses’ approach that researchers can draw on to make informed decisions about which context features to include in their studies. The approach, which begins during study design, involves identifying and assessing (1) Requirements of interest holders, (2) prior Research, (3) Relevance to the current study settings, and (4) whether the features are Realistic to collect for the current study. The webinar builds on the Institute of Education Sciences’ 2023 guide Conducting Implementation Research in Impact Studies of Education Interventions and the corresponding August 2023 webinar. The session will include audience Q&A.
Recruitment is an essential component of successful education research. To support an understanding of what works, for whom, and under what conditions, researchers need to recruit samples that are both large enough and able to generalize to populations of interest. While recruiting districts and schools to participate in research has always been challenging, there is some evidence to suggest that it has become even more challenging since the COVID-19 pandemic.
This 90-minute webinar will offer practical strategies for researchers who are recruiting districts and schools to participate in education research. These strategies address common factors that may influence districts’ and schools’ ability and willingness to participate in research, including alignment with priorities and needs, staff capacity, beliefs about random assignment, and comfort with providing data. Dana Robinson, a researcher and experienced study recruiter at Mathematica, will open the webinar with an overview of Practical Strategies for Recruiting Districts and Schools for Education Impact Studies, a brief she recently co-authored to help researchers implement the Institute of Education Sciences’ (IES) Standards for Excellence in Education Research. She will then facilitate an interactive discussion with a panel composed of researchers who conduct recruitment and district staff who make decisions about study participation or support research staff. Panelists will discuss their experiences recruiting and participating in research and will answer questions posed by webinar attendees.