
At Pacific Research and Evaluation, a key strategy to ensure a successful ATE evaluation throughout the life of the grant is to develop a strong evaluation plan at the beginning of the project. The evaluation plan serves as a road map of research activities that both the evaluator and the grant team can reference throughout the grant to stay on track. In order to build a strong ATE evaluation plan, it’s important to consider 1) the process of developing the plan and 2) the components to include within the plan. These steps are described in more detail below and have been used for a current ATE evaluation at Gateway Technical College, as well as other ATE evaluations.

The Process: Knowledge Building Phase and Participatory Approach

At Pacific Research and Evaluation, we put a strong emphasis on planning efforts when our partnership begins with an organization—whether that is during the grant writing phase or after the grant has been awarded—in order to quickly get up to speed on project goals and objectives.

This can be achieved by facilitating an evaluation kickoff meeting that allows the grant team to share project goals and progress, obstacles they have faced, their initial thoughts on proposed evaluation activities, and their input on how they prefer to engage with the evaluator throughout the process.

Following the kickoff meeting, it can also be helpful for the evaluator to facilitate informational interviews or conversations with individual members of the project team or other key stakeholders. These conversations help the evaluator to develop relevant research questions and determine useful data collection activities.

Finally, the process for developing the evaluation plan includes a review cycle in which the project team reviews the draft plan and provides input; the resulting updates ensure the plan's usefulness and the grant team's buy-in on the evaluation. This participatory approach is facilitated by the evaluator, who works with the project team to determine whose voices should be included in the process – such as members of the project team, institution administrators, external partners, or advisory board members.

Once key stakeholders are defined, evaluators share the evaluation plan with the target stakeholders and gather feedback through methods that work best for the particular stakeholder group; this can include a meeting facilitated by evaluators and dedicated to receiving feedback on the evaluation plan, an opportunity to provide tracked changes and comments on the evaluation plan, or an online form with prompts for gathering feedback.

From there, the evaluator reviews the feedback and collaborates with the project team to determine what updates may be needed to the evaluation plan; those updates are then made to the evaluation plan and shared with the stakeholder group. If not all feedback is reflected in the updated evaluation plan, this can be a good opportunity to explain the reasoning for this decision.

The Evaluation Plan: What to Include

Following the knowledge building phase, we at Pacific Research and Evaluation apply what we learned about the grant to the components included within the plan. A number of components can make an evaluation plan useful. Strong evaluation plans tend to include:

  • The purpose and approach of the evaluation. This section defines the reason for implementing an evaluation and how it will be useful, as well as any evaluation approaches that will be utilized, such as participatory, culturally responsive, or utilization-focused evaluation.
  • Evaluation questions. These questions help guide the study, define the scope of the evaluation, and help derive insights that inform decision-making and improvements.
  • Formative and summative data collection activities. This section discusses the data collection activities that will be included in the study (e.g., student survey, faculty focus group, etc.) while detailing which evaluation questions will be addressed through each activity, as well as the target audience and implementation method.
  • Sampling plans. This section describes how participants, sites, or data sources will be selected to ensure that the evaluation findings are representative and aligned to the study’s goals.
  • Data analysis techniques. This section explains what qualitative and quantitative tools will be utilized to analyze the data and the type of statistical analysis that will be performed. This section can also describe how data will be stored securely.
  • A timeline of evaluation activities and deliverables. This information can be presented in a table that includes each activity, the date it will be delivered, and the group that is responsible for the activity. It is helpful to include drafts and review periods as well.
  • Deliverables, which can include reports, infographics, and change management plans. This section describes the specific deliverables that will be provided throughout the evaluation, with details on what type of information will be included in each deliverable.

The strategies described above have led to the development of strong evaluation plans for ATE evaluations and beyond.

About the Authors

Regina Wheeler

Senior Research Manager, Pacific Research & Evaluation

Regina Wheeler is a Senior Research Manager at Pacific Research & Evaluation (PRE) based in Portland, Oregon. Since joining PRE in 2012, she has led research and evaluation projects nationwide across multiple sectors, including NSF ATE and other STEM projects, K-12 and post-secondary education initiatives, and workforce development programs. Her work emphasizes participatory and utilization-focused evaluation. A member of the American Evaluation Association, Regina holds an M.A. in Educational Psychology from the University of Colorado Boulder and has been practicing applied research for 15 years.

Creative Commons

Except where noted, all content on this website is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License.


EvaluATE is supported by the National Science Foundation under grant number 2332143. Any opinions, findings, and conclusions or recommendations expressed on this site are those of the authors and do not necessarily reflect the views of the National Science Foundation.