The cascading impact of COVID-19, the resulting global pandemic, and its effect on our daily lives have caused many shifts. For higher education, the COVID-19 pandemic has prompted questions about what we can or cannot do. We’ve had to rethink the “how” while not impacting the “what.”

Since before the pandemic, project teams with IT apprenticeship programs at Columbus State Community College (CSCC) have worked with evaluators Dr. Michael FitzGerald and Dr. Julia Siwierka of The Rucks Group to ensure that the program’s activities are of the highest quality.

During the pandemic, aligning evaluation and project activities has been especially important. The partnership exemplifies what we call cooperative evaluation, and it has served as the basis for innovation within CSCC’s projects during this time of intense change.

Cooperative evaluation involves four components: continuous improvement, evaluation as co-creators, challenging norms, and engraining the importance of evaluation within the culture. Here are some examples of these components at work:

Continuous Improvement: After every program event, a survey is sent to gather student input to inform future project events. In 2020, program staff and faculty launched a new co-curricular opportunity, a virtual cyber meet-up group. It was essential to gather student input on what would be perceived as value-added. Insights gained from the survey after the initial meeting were used to structure the subsequent meeting, a brainstorming session to frame the full programming calendar. After the brainstorm, all that was left to finalize were the dates and topic leads. Without a survey, program staff and faculty might have designed a program calendar that disregarded student interests and expectations.

Evaluation as Co-Creators: Together, program staff and evaluation team members designed a 2020 end-of-year feedback session for student apprentices and employers. We developed a pre-session survey that allowed project leads to ask deeper questions and identify topics for further discussion with apprentices. Involving evaluation team members from the very beginning of the feedback-gathering process instilled a sense of cooperation and collaboration rather than of third-party review.

Challenging Norms: Formalized evaluation has revealed participant preferences and new potential best practices. As a result, we’re considering how some of the past year’s innovations might be sustained post-COVID. Specifically, we learned from employers that, in some ways, they prefer virtual networking (via Microsoft Teams) to in-person connections. While abandoning in-person networking is unlikely, we will ensure that the best of our virtual engagements are carried forward.

Culturally Engrained: Project leads’ commitment to evaluation extends far beyond ensuring grant compliance. To name just a couple of examples: Project team members review data collected through post-event surveys. This builds “evaluative thinking” among the whole team. And evaluation team members are frequently engaged in meetings related to design, impact, and department-wide updates.

These examples show a partnership with some history: working together before and through an extraordinary period of time. But cooperative evaluation can enrich any project.

Here’s one step you can take right now toward cooperative evaluation: Consider at what stage in project implementation you are engaging members of your evaluation team.

Whenever a conversation is about the question, “What are we hoping to achieve?” evaluation team members should be involved.

If they’re not, you risk failing to deliver against stated expectations. More importantly, you might miss the opportunity to identify and capitalize upon innovative new practices.

About the Authors

David Cofer

Project Manager

Dave Cofer is an EEEL project manager overseeing a portfolio of experiential learning programs. He provides leadership for both existing and new experiential learning programs in the areas of manufacturing, information technology, and financial services. In this capacity, Dave works closely with evaluation partner The Rucks Group with an eye towards continuously improving all aspects of program delivery and positively impacting the experience of all program stakeholders. Dave earned a Master of Arts in workforce development and education with an emphasis in human resource development from The Ohio State University.

EvaluATE is supported by the National Science Foundation under grant number 2332143. Any opinions, findings, and conclusions or recommendations expressed on this site are those of the authors and do not necessarily reflect the views of the National Science Foundation.