Evaluation
Learning Innovations at WestEd will conduct the external evaluation of the proposed project. The evaluation, led by Dr. Sue Henderson, Senior Research Associate, will include two key components: 1) formative evaluation and 2) summative evaluation of project outcomes.

Learning Innovations has a long history of evaluating NASA programs. Dr. Henderson is the lead evaluator on the current Martin GCCE project. Other NASA work includes the National Academy for Science and Mathematics Education Leadership, the NASA Earth to Sky Project, and the NASA Museum Alliance evaluation. Dr. Henderson is also the lead evaluator on the Pennsylvania State University MSP “Earth and Space Science Education for the Middle Grades” and the NSF Geosciences Directorate-funded program Transforming Earth System Science Education, a partnership among UNH, PSU, ECSU, and Dillard University.

Logic Model
During the first year of program implementation, evaluators will work with project staff to create a logic model that clearly articulates a theory of action for the project. The goal of this process is to support and guide the formative and summative evaluation, as well as to build a common understanding among project staff of the project goals and activities, anticipated outcomes, and evaluation activities.

Formative Evaluation
Formative data gathering will focus primarily on project development and implementation, providing feedback to staff on course development and implementation activities. The evaluation questions will address the following: the degree to which the project team is able to effectively teach climate content; the degree to which the project team models effective use of NASA climate data; the triumphs and challenges in applying workshop lessons to undergraduate science educator preparation; and the flexibility of the program to adjust to the emerging needs and concerns of MSI faculty.

Summative Evaluation
Essential to the project is the tracking and analysis of data on the project outcomes in cooperation with the project team. Specific outcomes and evaluation questions identified through the logic model development will likely include:

  1. How does the project affect the content knowledge of STEM education faculty at participating minority-serving institutions? To what extent do these STEM faculty understand climatological impacts, feedback mechanisms, and drivers? To what extent do they understand, and are they able to use, NASA data to examine the impact of climate on the world’s forests and oceans?
  2. To what extent does the program provide STEM education faculty with models of instruction enriched through appropriate use of NASA climate data? To what extent do STEM education faculty lead their students to construct an understanding of climate science by engaging in the scientific process with NASA tools?
  3. To what extent are the STEM faculty able to motivate and engage their undergraduate education students in climate science, making it relevant and connected to their environment and lives?
  4. To what extent do the undergraduate pre-service teachers use the lessons taught by and integrate the NASA tools used by the faculty trainers?

Analysis Techniques
Excel and SPSS will be used to analyze quantitative data as needed. Descriptive analyses, as well as psychometric analyses of surveys and measurement tools, will be run to assess their reliability and validity. Content analysis techniques will be used to analyze qualitative data. The following table outlines the timeline for evaluation activities.

Evaluation Activity | Year 1 | Year 2 | Year 3
Logic Model development | X | |
Development of online questionnaires, focus group protocols, and document analysis rubrics | X | |
Conduct pre- and post-participant focus groups | X | X |
Pre- and post-online questionnaires of students, teachers, and scientists | X | X | X
Analysis of program outcome data | X | X | X
Revision of protocols and questionnaires as needed | | X | X
Informal reporting of formative findings to leadership team | X | X | X
Annual reporting of summative findings | X | X | X
Presentations at national conferences such as AERA, AEA, and NSTA | | | X
Publication of findings in peer-reviewed journals | | | X
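As one illustration of the psychometric reliability checks described under Analysis Techniques, the sketch below computes Cronbach's alpha, a standard internal-consistency coefficient for survey instruments. It is written in Python rather than SPSS purely for illustration, and the Likert-scale responses are hypothetical, not project data.

```python
import numpy as np

def cronbach_alpha(scores):
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    scores = np.asarray(scores, dtype=float)
    n_items = scores.shape[1]
    item_vars = scores.var(axis=0, ddof=1)       # variance of each item
    total_var = scores.sum(axis=1).var(ddof=1)   # variance of respondents' total scores
    return (n_items / (n_items - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical responses: 5 respondents x 4 Likert items (1-5 scale)
responses = np.array([
    [4, 5, 4, 4],
    [3, 3, 3, 4],
    [5, 5, 4, 5],
    [2, 2, 3, 2],
    [4, 4, 4, 3],
])
print(round(cronbach_alpha(responses), 2))  # → 0.92
```

Values above roughly 0.7 are conventionally taken to indicate acceptable internal consistency for a survey scale; low values would prompt the revision of protocols and questionnaires scheduled in Years 2 and 3.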