We are Nena Bloom and Lori Rubino-Hare, the internal evaluator and principal investigator, respectively, of the Advanced Technological Education project Geospatial Connections Promoting Advancement to Careers and Higher Education (GEOCACHE). GEOCACHE is a professional development (PD) project that aims to enable educators to incorporate geospatial technology (GST) into their classes, to ultimately promote careers using these technologies. Below, we share how we collaborated on creating a rubric for the project’s evaluation.

One important outcome of effective PD is participants’ mastery of new knowledge and skills (Guskey, 2000; Haslam, 2010). GEOCACHE defines “mastery” as participants’ effective application of the new knowledge and skills in educator-created lesson plans.

GEOCACHE helps educators teach their content through Project Based Instruction (PBI) that integrates GST. In PBI, students collaborate and critically examine data to solve a problem or answer a question. Educators were provided 55 hours of PD, during which they experienced model lessons integrated with GST content. Educators then created lesson plans tied to the curricular goals of their courses, infusing opportunities for students to learn appropriate subject matter through the exploration of spatial data. “High-quality GST integration” was defined as opportunities for learners to collaboratively use GST to analyze and/or communicate patterns in data to describe phenomena, answer spatial questions, or propose solutions to problems.

We analyzed the educator-created lesson plans using a rubric to determine whether GEOCACHE PD supported participants’ ability to apply the new knowledge and skills effectively within lessons. We believe this is a more objective indicator of PD effectiveness than self-report measures alone. Rubrics, widely used to assess student performance, also provide meaningful information for program evaluation (Davidson, 2004; Oakden, 2013). A rubric articulates a clear standard and a set of criteria for identifying different levels of performance quality. The objective is to understand participants’ average skill level on the particular dimensions of interest. Davidson (2004) proposes that rubrics are useful in evaluation because they help make judgments transparent. In program evaluation, scores for each criterion are aggregated across all participants.
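To make the aggregation step concrete, here is a minimal sketch of averaging rubric scores per criterion across participants. The criteria names, the 1–4 scale, and the scores themselves are all hypothetical illustrations, not GEOCACHE’s actual rubric or data:

```python
from statistics import mean

# Hypothetical rubric scores: one dict per participant's lesson plan,
# mapping each criterion to a rubric level (1-4).
scores = [
    {"gst_integration": 3, "pbi_alignment": 4, "data_analysis": 2},
    {"gst_integration": 4, "pbi_alignment": 3, "data_analysis": 3},
    {"gst_integration": 2, "pbi_alignment": 4, "data_analysis": 4},
]

def aggregate(scores):
    """Average each criterion across all participants' lesson plans."""
    criteria = scores[0].keys()
    return {c: round(mean(s[c] for s in scores), 2) for c in criteria}

print(aggregate(scores))
```

The resulting per-criterion averages indicate, at a glance, which skills participants mastered and where the PD might need strengthening.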

Practices we used to develop and utilize the rubric included the following:

  • We developed the rubric collaboratively with the program team to create a shared understanding of performance expectations.
  • We focused on aligning the criteria and expectations of the rubric with the goal of the lesson plan (i.e., to use GST to support learning goals through PBI approaches).
  • Because good rubrics existed but were not entirely aligned with our project goal, we chose to adapt existing technology-integration rubrics (Britten & Cassady, 2005; Harris, Grandgenett, & Hofer, 2010) and PBI rubrics (Buck Institute for Education, 2017) to include GST use, rather than start from scratch.
  • We checked that the criteria at each level were clearly defined, to ensure that scoring would be accurate and consistent.
  • We pilot tested the rubric with several units, using several scorers, and revised accordingly.
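The pilot-testing step above implies checking that several scorers apply the rubric consistently. A minimal sketch of one such check, simple percent agreement between two raters, follows; the rater scores are hypothetical, and a project might instead use a chance-corrected statistic such as Cohen’s kappa:

```python
def percent_agreement(scorer_a, scorer_b):
    """Share of items on which two scorers assigned the same rubric level."""
    if len(scorer_a) != len(scorer_b):
        raise ValueError("Scorers must rate the same set of items")
    matches = sum(a == b for a, b in zip(scorer_a, scorer_b))
    return matches / len(scorer_a)

# Hypothetical levels two raters assigned to the same six pilot lesson plans.
rater_1 = [3, 4, 2, 3, 1, 4]
rater_2 = [3, 4, 3, 3, 1, 4]

print(percent_agreement(rater_1, rater_2))  # 5 of 6 items agree
```

Low agreement on a criterion would signal that its level descriptions need revision before the rubric is used for the full evaluation.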

This authentic assessment of educator learning informed the evaluation. It provided information about the knowledge and skills educators were able to master and how the PD might be improved.


References and resources

Britten, J. S., & Cassady, J. C. (2005). The Technology Integration Assessment Instrument: Understanding planned use of technology by classroom teachers. Computers in the Schools, 22(3), 49-61.

Buck Institute for Education. (2017). Project design rubric. Retrieved from http://www.bie.org/object/document/project_design_rubric

Davidson, E. J. (2004). Evaluation methodology basics: The nuts and bolts of sound evaluation. Thousand Oaks, CA: Sage Publications, Inc.

Guskey, T. R. (2000). Evaluating professional development. Thousand Oaks, CA: Corwin Press.

Harris, J., Grandgenett, N., & Hofer, M. (2010). Testing a TPACK-based technology integration assessment instrument. In C. D. Maddux, D. Gibson, & B. Dodge (Eds.), Research highlights in technology and teacher education 2010 (pp. 323-331). Chesapeake, VA: Society for Information Technology and Teacher Education.

Haslam, M. B. (2010). Teacher professional development evaluation guide. Oxford, OH: National Staff Development Council.

Oakden, J. (2013). Evaluation rubrics: How to ensure transparent and clear assessment that respects diverse lines of evidence. Melbourne, Australia: BetterEvaluation.

About the Authors

Nena Bloom


Evaluation Coordinator, Center for Science Teaching and Learning, Northern Arizona University

Nena Bloom, Ed.D., is an evaluator of STEM education projects focusing on opportunities for, and persistence and retention of, K-20 students in STEM majors and fields. She has been an evaluator for both formal and informal education projects since 2007. She has evaluated projects funded by a number of STEM education initiatives, including the National Science Foundation (NSF), the National Institutes of Health, and private foundations. Nena also conducts STEM education research on professional development for educators and how educators support student learning in STEM. Before becoming an evaluator, Nena worked as a project coordinator for education projects and as a middle school science teacher.

Lori Rubino-Hare


Professional Development Coordinator, Center for Science Teaching and Learning, Northern Arizona University

Lori Rubino-Hare, M.Ed., taught elementary and middle school in Arizona for 13 years. In 2008, she became a professional development coordinator at the Center for Science Teaching and Learning at Northern Arizona University and is currently principal investigator on two NSF-funded projects: Geospatial Connections Promoting Advancement to Careers and Higher Education (GEOCACHE) and the Power of Data Project, which received a 2015 Esri Special Achievement in GIS Award. Lori has facilitated professional development and researched best practices in teaching with GIS since 2009.



EvaluATE is supported by the National Science Foundation under grant number 2332143. Any opinions, findings, and conclusions or recommendations expressed on this site are those of the authors and do not necessarily reflect the views of the National Science Foundation.