SREE-IES Webinar Series

Applying Standards for Excellence in Education Research

The Institute of Education Sciences (IES) has launched the Standards for Excellence in Education Research (SEER) to make education research more transparent, actionable, and focused on consequential outcomes. To support this effort, IES is producing a series of practical guides on how researchers can implement SEER to improve the quality and relevance of their studies.

SREE is excited to partner with IES to sponsor a series of webinars, each covering the recommendations from a specific SEER-aligned guide. These webinars are free, open to the public, and relevant to all researchers who want to ensure their studies are useful to policymakers and educators. The webinar series is also co-sponsored by APPAM.

PREVIOUS WEBINARS

Webinar 1: Enhancing the Generalizability of Impact Studies in Education

May 16, 2022

Webinar Recording

Generalizability in education research refers to how well the results of a study apply to broader populations of interest to educators and policymakers. However, in studies that evaluate the impacts of educational interventions, the generalizability of findings is often unclear because each study is conducted with only a particular sample. Educators and policymakers may question whether such findings can help them decide how best to support the populations they serve.

This webinar lays out recommendations that researchers can follow to enhance generalizability when planning impact studies in education. It walks researchers through the key steps involved in identifying a target population of schools, developing a list of schools in this population, and selecting a representative sample. It also provides an overview of steps to recruit schools into the study, assess and adjust for differences between the sample and population, and report generalizability appropriately. The webinar is based on IES's recently released guide on this topic.
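To make the "assess and adjust for differences" step concrete, here is a minimal sketch in Python of one common diagnostic: standardized mean differences between a recruited sample of schools and its target population. The variable names and data are hypothetical, and this illustrates only one piece of the process the guide covers, not the guide's own tooling.

    # Minimal sketch: compare a recruited school sample with its target
    # population on observed characteristics. Data are hypothetical.
    import pandas as pd

    def standardized_differences(sample: pd.DataFrame, population: pd.DataFrame) -> pd.Series:
        """Standardized mean difference for each school characteristic,
        scaled by the population standard deviation."""
        return (sample.mean() - population.mean()) / population.std()

    # Hypothetical school-level covariates for the target population
    population = pd.DataFrame({
        "enrollment": [450, 600, 320, 510, 780, 390],
        "pct_frl": [0.62, 0.48, 0.71, 0.55, 0.40, 0.66],  # % free/reduced-price lunch
    })
    sample = population.iloc[[0, 2, 5]]  # the schools actually recruited

    # Differences near zero suggest the sample resembles the population on
    # these characteristics; large differences flag a need for adjustment.
    print(standardized_differences(sample, population))

Differences well away from zero would signal that reweighting or another adjustment, as discussed in the webinar, may be needed before generalizing the study's impact estimates.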

Dr. Elizabeth Tipton is an Associate Professor of Statistics, a Faculty Fellow in the Institute for Policy Research, and co-Director of the Statistics for Evidence-Based Policy and Practice (STEPP) Center at Northwestern University. Her research focuses on the development of methods for improving generalizations from experiments, both through the design and analysis of field experiments and through the use of meta-analysis.

Dr. Rob Olsen is a Senior Fellow at the George Washington Institute of Public Policy at George Washington University. His research focuses on the generalizability of randomized trials in education. Olsen is the Principal Investigator of a study testing different sampling methods for selecting sites for impact studies, and he is the co-Principal Investigator of a study testing different regression methods for predicting the impacts of educational interventions in individual schools using data from multi-site randomized trials. He also consults for national research organizations on how to design and implement impact studies for greater generalizability.

Webinar 2: The BASIE (BAyeSian Interpretation of Estimates) Framework for Interpreting Findings from Impact Evaluations

June 13, 2022

Webinar Recording

In studies that evaluate the impacts of educational interventions, educators and policymakers typically want to know whether an intervention improved outcomes. However, researchers cannot provide a simple yes/no answer because all impact estimates carry statistical uncertainty. Researchers have often used statistical significance and p-values to assess this uncertainty, but these statistics are widely misinterpreted and cannot tell educators and policymakers how likely it is that the intervention actually had an effect.

This webinar lays out a Bayesian framework for interpreting impact estimates without the pitfalls of relying only on p-values. This framework allows researchers to calculate the probability that an intervention had a meaningful effect, given the impact estimate and prior evidence on similar interventions. The webinar explains key concepts and introduces an easy-to-use Excel tool for applying the framework. With these concepts and tools, researchers can draw more accurate and interpretable lessons from impact findings to support informed decisions about educational interventions. The webinar is based on IES's recently released guide on this topic.
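To give a concrete sense of the calculation the framework supports, here is a minimal sketch in Python, assuming a normal prior that summarizes evidence from similar interventions and a normal likelihood for the new impact estimate. All numbers are hypothetical, and the guide's own tool is the Excel workbook mentioned above.

    # Minimal sketch of a normal-normal Bayesian update: combine a prior on
    # the true effect with a new impact estimate, then report the probability
    # of a meaningful effect. All values are hypothetical effect sizes.
    from scipy.stats import norm

    prior_mean, prior_sd = 0.02, 0.10   # prior evidence from similar interventions
    estimate, se = 0.15, 0.08           # new study's impact estimate and standard error

    # Conjugate update: posterior precision is the sum of the two precisions
    post_var = 1 / (1 / prior_sd**2 + 1 / se**2)
    post_mean = post_var * (prior_mean / prior_sd**2 + estimate / se**2)

    # Probability the intervention had a meaningful effect (here, > 0.05 SD)
    p_meaningful = 1 - norm.cdf(0.05, loc=post_mean, scale=post_var**0.5)
    print(f"Posterior mean: {post_mean:.3f}; P(effect > 0.05): {p_meaningful:.2f}")

Unlike a p-value, the final number is a direct probability statement about the intervention's effect, which is the kind of quantity the BASIE framework is designed to deliver.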

Dr. John Deke is an economist and senior fellow at Mathematica. His work has focused primarily on the statistical methodologies used in impact evaluations and systematic reviews, especially in K-12 education.

Dr. Mariel Finucane is a principal statistician at Mathematica. Her work uses Bayesian hierarchical modeling and tree-based methods to study social policies.

 

Webinar 3: Sharing Study Data: Implications for Education Researchers

March 28, 2023, 2:00pm - 3:30pm ET

Webinar Recording

In the era of open science, researchers are increasingly asked to share data from their studies. Multiple federal agencies and private funders have adopted policies requiring their grantees and contractors to share study data. Consistent with this priority and the Standards for Excellence in Education Research (SEER), the Institute of Education Sciences (IES) requires researchers to make data from IES-funded studies available to others, while also protecting the rights of study participants and meeting the requirements of data providers. These policies, along with growing interest in open science practices, mean that many education researchers are looking for practical, achievable strategies for making their data available to others.

In this 90-minute webinar, Ruth Neild from Mathematica will present an overview of IES’s guide, Sharing Study Data: A Guide for Education Researchers, and a panel of researchers will share their expertise and experiences with three key aspects of sharing study data:

  • Managing disclosure risks
  • Documenting and organizing data for other researchers
  • Depositing data for access by other researchers

The session will include a panel discussion and audience Q&A.

Panelists:

  • John Czajka is a Mathematica Senior Fellow Emeritus. Dr. Czajka has expertise in disclosure risk management and extensive experience with statistical applications of program administrative data, the analysis of survey data from large national samples, panel studies, and specialized surveys.
  • Jessica Logan is an Associate Professor of Special Education at Vanderbilt University’s Peabody College. Dr. Logan is a quantitative methodologist who focuses on improving the data management and data quality practices of researchers in the education sciences, encouraging transparency in science, and improving education researchers’ statistical conclusion validity.
  • John Diamond is MDRC’s lead for Data Management and Integrity and a senior research associate for the Postsecondary Education policy area. He chairs MDRC’s Data Integrity Board, Data Management Initiative, and Data Security Team, and coordinates MDRC’s data sharing with ICPSR at the University of Michigan.

 

Webinar 4: Conducting Implementation Research in Impact Studies of Education Interventions 

August 16, 2023, 2:00pm - 3:30pm ET

Webinar Recording

High-quality impact studies of education interventions provide the strongest evidence about whether interventions improve academic outcomes. Yet information about whether and by how much a tested intervention improves outcomes is only part of the story. To learn why and how impact findings vary, and to support the broader use of effective interventions, educators need to understand how, and under what conditions, an intervention was implemented. High-quality implementation research can contribute to this understanding. To encourage this important work, the Institute of Education Sciences (IES) articulates, through the Standards for Excellence in Education Research (SEER), key recommendations for documenting treatment implementation and contrast.

In this 90-minute webinar, Carolyn Hill from MDRC and Lauren Scher from Mathematica will present an overview of IES's recently released guide, Conducting Implementation Research in Impact Studies of Education Interventions. They will be joined by a panel of researchers who will share their experiences conducting implementation research as part of high-quality impact studies of education interventions, including:

  • Addressing some common issues when measuring intervention fidelity
  • Mitigating challenges when measuring implementation contrast
  • Integrating implementation and impacts

The session will include a panel discussion and audience Q&A.

Panelists: 

  • Barbara Goodson, Principal Associate at Abt Associates, has led numerous implementation and impact evaluations of educational programs for children from birth through grade 12. She leads implementation evaluation technical assistance for grantees in the U.S. Department of Education's Education Innovation and Research (EIR) program and has worked with more than 300 grantees to measure and report on fidelity of implementation.
  • Catherine Darrow, Associate Director of Research at J-PAL North America, leads J-PAL's research activities producing evidence to better understand whether and how programs work to reduce poverty in North America. She has designed and directed rigorous impact and implementation evaluations of school-based programs serving children through young adults.
  • Howard Bloom, Former MDRC Chief Social Scientist, led MDRC’s development and application of experimental and quasi-experimental methods from 1999 to 2017. For the previous two decades, he taught research methods, program evaluation, and applied statistics to public policy graduate students at Harvard and NYU, where he received the university-wide Great Teacher Award in 1993. For his many methodological contributions, Dr. Bloom received the Peter Rossi Award from APPAM in 2010 and was inducted into the National Academy of Education in 2019.

Webinar 5: Prioritizing and Selecting Context Features in Education Impact Studies: A '4R Lenses' Approach 

August 7, 1:00pm - 2:30pm ET

View the recording here!

High-quality impact studies of education interventions provide the strongest evidence about whether interventions improve academic outcomes. Information about the contexts in which interventions are implemented can provide further insight into impact findings, such as why impacts occurred in some sites but not others. Practitioners also want to know the contexts in which an intervention was studied so they can assess whether it is a good fit for their situation. Yet the possible context features to collect, analyze, and report in a study are numerous and likely to differ from intervention to intervention and from study to study. This leaves researchers with the daunting task of deciding which context features to prioritize.

In this webinar, Carolyn Hill from MDRC and Lauren Scher from Mathematica will describe a general ‘4R lenses’ approach that researchers can draw on to make informed decisions about which context features to include in their studies. The approach, which begins during study design, involves identifying and assessing (1) Requirements of interest holders, (2) prior Research, (3) Relevance to the current study settings, and (4) whether the features are Realistic to collect for the current study. The webinar builds on Conducting Implementation Research in Impact Studies of Education Interventions, the Institute of Education Sciences’ 2023 guide, and the corresponding August 2023 webinar. The session will include audience Q&A.

 

Webinar 6: Challenges and Strategies for Study Recruitment in Education Research 

August 21, 1:00pm - 2:30pm

View the recording online here.

Recruitment is an essential component of successful education research. To support an understanding of what works, for whom, and under what conditions, researchers need to recruit samples that are large enough and that generalize to populations of interest. Recruiting districts and schools to participate in research has always been challenging, and there is some evidence that it has become even more difficult since the COVID-19 pandemic.

This 90-minute webinar will offer practical strategies for researchers who are recruiting districts and schools to participate in education research. These strategies address common factors that may influence districts' and schools' ability and willingness to participate in research, including alignment with priorities and needs, staff capacity, beliefs about random assignment, and comfort with providing data. Dana Robinson, a researcher and experienced study recruiter at Mathematica, will open the webinar with an overview of Practical Strategies for Recruiting Districts and Schools for Education Impact Studies, a brief she recently co-authored to help researchers implement the Institute of Education Sciences' (IES) Standards for Excellence in Education Research. She will then facilitate an interactive panel discussion among researchers who conduct recruitment and district staff who make decisions about study participation or who support research staff. Panelists will discuss their experiences recruiting for and participating in research and will answer questions posed by webinar attendees.

Panelists:

  • Sarah Dickson is the Director of External Research at Chicago Public Schools. She develops the strategic direction of district research partnerships, manages and supports external research partners, and oversees the implementation of research projects across the district. In conjunction with the district’s senior leadership, Sarah works to establish priority research areas and develop research projects to support the improvement of practice in those areas.
  • Rinat Fried is a Research Associate at Oakland Unified School District. She focuses on elementary literacy data and works with external researchers to support the district’s participation in research. Rinat has worked with researchers from Mathematica, Northwestern University’s Center for Education Efficacy, Excellence, and Equity, Harvard’s Graduate School of Education, and the Center on Reinventing Public Education at Arizona State University.
  • Jason Snipes is a Senior Research Scientist and the Director of Applied Research at the Regional Educational Laboratory West at WestEd. He has over 27 years of experience conducting rigorous applied research, with a particular emphasis on collaborating with districts and schools to develop, implement, evaluate, and disseminate strategies to improve outcomes among Black and Latinx students. Jason currently co-leads a research-practice partnership with the San Francisco Unified School District focused on achieving equitable discipline outcomes for Black students.
  • Anja Kurki is a Managing Researcher at the American Institutes for Research. She has over 20 years of experience conducting large, national impact evaluations and specializes in recruitment and implementation oversight. Anja has successfully led recruitment of districts and schools for several national IES evaluations on topics including teacher professional development, parent engagement, and multi-tiered systems of support for students.