Data, Data Everywhere and Not a Drop to Drink: The Importance of Researcher-Practitioner Partnerships

Mar 29, 2018

OET blog post #5
Photo by Sebastiaan ter Burg, licensed under CC BY 2.0

Editor’s Note: This blog originally appeared on the U.S. Department of Education’s Office of Educational Technology (OET) Medium page and is being re-posted with their permission. You can follow the full series at https://medium.com/building-evaluation-capacity.

As the focus on evidence-based practices has increased in education, practitioners are increasingly using data to make decisions to improve learning. While practitioners have access to more data than ever before, limited capacity and training often inhibit the impact of that data on improving practice. Researcher-practitioner partnerships (RPPs), by design, can help to overcome these common challenges in the field.

The Learning Accelerator (TLA), Distinctive Schools in Chicago, and Leadership Public Schools in the Bay Area recently wrapped up two RPPs that generated lessons representative of wider challenges in the field. These partnerships demonstrated that school systems need support in:

  • Generating testable hypotheses from their questions of interest
  • Determining which data can be used to answer these questions (including research design questions like what constitutes a reasonable comparison group)
  • Analyzing these data to unearth accurate, relevant findings
  • Using their data and evidence in their decision-making processes to inform future practice

It’s clear that school systems face many real and current needs in incorporating data and evidence into their decision-making processes.

The Challenge

Distinctive Schools and Leadership Public Schools were both interested in scaling their personalized learning initiatives and wanted to better understand what was working in their pilots, with an emphasis on which parts of their model they should scale and which parts they should improve. In both cases, they sought to answer the question, “How should we scale an initiative that is working for us?” They collected and had access to data from a combination of teacher and student perception/attitude surveys, along with formative and summative test scores for academic and non-academic outcomes of interest. Although they had developed systems for collecting the data they needed to make decisions about what and how to scale, they did not have the internal capacity to analyze the data and extract the necessary information to make these decisions. This is where researchers from TLA were able to step in to help. Not only do researchers have the necessary skills and capacity to translate data into practical insights, they also need access to real classrooms and data — well-designed RPPs can benefit both partners.

However, as currently designed, RPPs themselves have limited capacity. For a lean research organization like TLA, there are only so many RPPs that staff can participate in, and by extension, only so many school systems they can support. In fact, school systems outnumber researchers (nationally and internationally) to the extent that dedicated, face-to-face RPPs are not possible in every case.

Therefore, while RPPs are one good approach to improving research capacity, school systems need other resources to support their use of data and evidence. Resources like the Rapid Cycle Evaluation Coach (RCE Coach) can both augment the number of possible RPPs and provide potential solutions to the data-decision mismatch that some school systems face.

The RCE Coach Solution

Two of the lessons learned from TLA’s RPPs were that the data school systems need likely already exist, and that the data are locally relevant. As mentioned above, an RPP is an ideal solution for tapping into the unique skills and capacity needed to extract information from data. For districts that don’t have access to RPPs, or to extend what can be done with limited resources, the RCE Coach models and scaffolds the RPP approach, which can support district data use.

The RCE Coach allows school systems, ideally with the guidance of a researcher, to make data-informed decisions by asking plain-language questions about what was implemented, with whom, and how. It also provides supplemental resources for users who are interested in learning more, or who would like to engage in more reflection before answering the questions in the RCE Coach. Finally, the RCE Coach generates a report that appropriately contextualizes the findings and presents them in an easy-to-understand format. The immediate and direct impact of the RCE Coach is to make good use of the often overwhelming, yet relevant, data that already exist in school systems across the country. Further, this approach helps to build the capacity of school systems to be more sophisticated consumers of research and to use evidence in their decision-making processes.

Undoubtedly, researchers and practitioners need to be more integrated with each other for evidence-based educational practices to improve the teaching and learning experiences of millions of educators and students across the country. RPPs are one mechanism for providing this support. The RCE Coach extends that capacity by providing critical supports to build research capacity in school systems and allow educators to learn from the data they have.

The opinions expressed are those of the author(s) and do not represent those of Mathematica Policy Research.