As I noted in my last post, the work of the Research Alliance has been a challenging and rewarding “balancing act.” This balancing act takes a variety of forms that influence almost every aspect of our work and that we attend to on a nearly daily basis. In this post, I provide some further reflections on the balance we seek in ensuring that our work is highly relevant to the concerns of key stakeholders in NYC’s public schools and, at the same time, rigorous enough to meet the standards of high-quality (i.e., credible, valid, reliable) education research.
A central tenet of our mission is to conduct studies on questions that matter to the City’s public schools. We pursue relevance through engagement with education stakeholders, including district leadership and staff, administrators and educators in schools, the unions and other support organizations, and increasingly parents and students. The intent of this engagement is to ensure that stakeholders inform the priorities for our work and that they, in turn, are informed by the findings and lessons that emerge from our studies.
While we aim to prioritize the questions that stakeholders want answered, we recognize that it may not always be feasible or appropriate to subject those questions to the type of systematic qualitative and quantitative research methods that can support valid and reliable findings and conclusions. For example, at the founding of the Research Alliance, there was deep interest in knowing whether the newly established Mayoral control of the City’s schools was more or less effective than the longstanding elected Board of Education. While this was a vitally important political and operational question, we did not see a viable research method that could yield answers that were valid, in the sense that one could infer that Mayoral control per se caused changes in the system, and reliable, in the sense that the relationship between Mayoral control and changes in the system was not merely random or episodic. Instead, we embarked on a series of studies that examined the impact of more specific initiatives that the new administration was undertaking in its efforts to improve educational opportunities and outcomes. These included studies of changes in student achievement that coincided with the implementation of “Children First” initiatives in elementary and middle schools, the closing of persistently low-performing high schools, the opening of new small schools of choice, and the implementation of the Expanded Success Initiative, among others.
Finally, another essential aspect of balancing relevance and rigor in our work involves candor and transparency about the strengths and limitations of our studies. Even studies that meet high standards for education science have important limitations in the range of questions they can answer well and in the degree to which their findings can be applied to circumstances beyond the typically narrow conditions of the analysis. While we would like to maximize the relevance and potential impact of the evidence we produce by stating findings and implications as definitively as possible, we often need to temper that definitiveness with caveats about the rigor of the methods that were feasible under the operational realities of the study.
As the Research Alliance evolves over the next 15 years of work, I look forward to supporting its effort to “live at the hyphen” of the priorities we seek to balance: relevance-rigor, opportunity-strategy, collaboration-independence, and so on.
Sincerely,
James J. Kemple