
Center for Educational Technologies projects have ended (except Challenger Learning Center) and are no longer funded.

NASA Explorer Schools Evaluation

The Center for Educational Technologies® served as the primary evaluator for the first three years (2003-2006) of the NASA Explorer Schools (NES) project. This study is relevant to educators and policymakers examining school reform initiatives designed to improve science, technology, engineering, and mathematics (STEM) education. The findings will be particularly useful to scientists and educators affiliated with NASA who want to build on student interest in space science and exploration with classroom tools and resources that help teachers increase student learning in STEM disciplines.

Evaluation Reports

The NASA Explorer Schools final report integrates the results of five previous interim reports and provides an impact analysis of the first three years of the NASA Explorer Schools intervention. The report describes how qualitative and quantitative methods were blended in the theory-based evaluation framework to determine what impact the NES program has had on participating schools, administrators, teachers, students, and families.

Executive Summary
Full Report

The case study reports cover 29 randomly selected NASA Explorer Schools. Each of the 29 school reports demonstrates how the theory-based research design was used to investigate school-based implementations of the NES schoolwide STEM education intervention. Each case study school profile was scored using the NES grounded-theory rubric designed to measure school success in achieving the six anticipated outcomes.

2003 Case Study Report and Scored Rubric
2004 Case Study Report and Scored Rubric
2005 Case Study Report and Scored Rubric

The Center for Educational Technologies presented highlights of its NASA Explorer Schools evaluation at Evaluation 2007: Evaluation and Learning, the annual conference of the American Evaluation Association, held Nov. 7-10, 2007, in Baltimore.

The session, led by Laurie Ruberg and Karen Chen of the Center for Educational Technologies, was titled "Value-added Assessment: Teacher Training Designed to Improve Student Achievement." Travel for this presentation was partially funded by the West Virginia Space Grant Consortium.

In addition, Chen presented a poster session at the conference on the NES evaluation. "Promoting STEM Through Professional Development: Learning from Evaluation" was coauthored by Ruberg and Judy Martin.

Ruberg also presented a paper detailing the NES evaluation at the 2008 annual meeting of the American Educational Research Association, held March 24-28 in New York City. The paper, "Applying Blended Research Methods to School-based Intervention Evaluation," was selected from more than 12,000 submitted proposals. A presentation is also available. Ruberg's paper discusses the mixed-method analysis of qualitative and quantitative data using the theory-based evaluation framework that the evaluation team applied to the NES project. Travel for this presentation was partially funded by the West Virginia Space Grant Consortium.

Interim Reports

Brief 1 (McGee, Hernandez, & Kirby, 2003) established that the NES program had identified and engaged under-served schools, teachers, and students with a comprehensive portfolio of curriculum and professional development supports.
Brief 2 (Hernandez, McLaughlin, Kirby, Reese, & Martin, 2004) evaluated the summer 2003 workshops and found participants were very positive about the professional development opportunities they received.
Brief 3 (Hernandez, McGee, Reese, Kirby, & Martin, 2004) reviewed the results from the first year of implementation and offered lessons for improving coherence and design decisions in team organization, participation, and professional development supports.
Brief 4 (Davis, Palak, Martin, & Ruberg, 2006) introduced a logic model for the evaluation plan that outlined the key areas of impact, how they would be evaluated, and the data sources. It described how the NES logic model was implemented within the mixed-method approach. A summary of findings and recommendations was offered for the next steps for the program and its evaluation.
Brief 5 (Ruberg, Martin, & Chen, 2006) summarized findings from the data collected during the 2005-2006 academic year. The discussion section of this report and the attached appendices documented successes and challenges that NES experienced in its third year. The data collection and analysis addressed the six anticipated outcomes of the program:

  • Increase student interest and participation in science, technology, engineering, and mathematics.
  • Increase student knowledge about careers in science, technology, engineering, and mathematics.
  • Increase student ability to apply science, technology, engineering, and mathematics concepts and skills in meaningful ways.
  • Increase the active participation and professional growth of educators in science.
  • Increase the academic assistance for and technology use by educators in schools with high populations of under-served students.
  • Increase family involvement in children's learning.