I have evaluated a number of ATE projects, all of them collaborative efforts among several four-year institutions and two-year community colleges. One of the projects’ overarching goals is to train college instructors as well as elementary-, middle-, and high-school teachers, advancing their skills in additive manufacturing and/or smart manufacturing.

The training is delivered through train-the-trainer studios (TTSs), which provide novel hands-on learning experiences to program participants. As with any program, the evaluation of such projects needs to be informed by rich data that captures participants’ entire experience, including the knowledge they gain.

Here’s one lesson I’ve learned from evaluating these projects: how much participants feel they matter to the project strongly shapes the quality of the data collected.

As the evaluator, you can help participants see that the data they are asked to provide (about the technical knowledge they gained, how they apply it, and their perceptions of all aspects of the training) will benefit the overall program and will benefit them, directly or indirectly.

If participants feel that their role is minor and that the information they provide will not matter, they will offer only the barest amount of information, regardless of the data collection method. If they understand the importance of their participation, they are more likely to provide rich data.

How can you make them feel valued?

Establish good rapport with each participant, particularly when the groups are of manageable size. Interact informally with each participant throughout the training workshops. Inquire about their professional work, and ask about the supports they might need when they return to their workplaces.

The responses to the open-ended questions on most of my workshop evaluations have been rich and detailed, far more so than the responses from participants I surveyed remotely without ever meeting. Program participants want to connect with a real person, not a remote evaluator. When in-person connections are not possible, explore other innovative ways of establishing rapport with individual participants before and during the program.

How can you improve the quality of data they will provide?

Sell the evaluation. Make it clear how the evaluation findings will be used and how the results will specifically benefit the participants and their constituents, directly or indirectly.

Share success stories. During the training workshops I have evaluated, I’ve shared previous success stories with participants to show them what they, too, are capable of accomplishing.

The time and energy you spend building these connections with participants will result in high-quality evaluation data, ultimately helping the program serve participants better.

About the Author

George Chitiyo

Professor of Educational Research and Evaluation, Tennessee Tech University

Dr. George Chitiyo is a professor of educational research and evaluation at Tennessee Technological University with over 15 years of experience in program evaluation. His expertise lies in quantitative methods as well as mixed-methods research. He has led evaluations for projects funded by the National Science Foundation (NSF), the United States Department of Agriculture (USDA), and state agencies such as the Tennessee Department of Health and the Tennessee Higher Education Commission. Dr. Chitiyo has evaluated multiple NSF-funded initiatives—including ATE, S-STEM, SaTC, and REU programs—centered on STEM education, instructional innovation, and advanced manufacturing. His research also explores diverse topics, such as chess-based learning, assessment literacy, and the psychosocial impact of HIV/AIDS in sub-Saharan Africa. He is widely published in peer-reviewed journals and actively engaged in mentoring graduate students and collaborating across interdisciplinary teams.

Creative Commons

Except where noted, all content on this website is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License.

EvaluATE is supported by the National Science Foundation under grant number 2332143. Any opinions, findings, and conclusions or recommendations expressed on this site are those of the authors and do not necessarily reflect the views of the National Science Foundation.