Hi! We are Val Marshall, Megan Zelinsky, and Lyssa Wilson Becho from EvaluATE, the evaluation hub for the National Science Foundation’s (NSF) Advanced Technological Education (ATE) program. For over two decades, EvaluATE has administered an annual monitoring survey to capture ATE program activities. We refer to this survey as the ATE Survey. Over time, we’ve learned some lessons on survey development and adaptation in a STEM context. We hope that these lessons will be helpful for you in evaluating STEM or another evolving and adapting landscape.

Lesson Learned #1

Survey adaptation is necessary to keep up with a constantly evolving landscape. Innovation and technological advancement are a given in STEM. To keep up with our ever-changing community, we develop new survey sections and questions to capture the full breadth of program activities. We then collect feedback on new survey sections through stakeholder focus groups and interviews. Although this is an annual monitoring survey, we find responsiveness to the current landscape more valuable than maintaining consistency across questions from year to year. One of the best ways we've found to review new survey questions is through think-aloud sessions.

Lesson Learned #2 

Making space for researchers to collaborate on the ATE Survey provides a valuable service back to our community. The ATE Survey is the only program-wide, publicly available data source for the ATE program. EvaluATE collaborates with researchers interested in career and technical education to include new questions on the ATE Survey each year that can advance their research. Overall, this practice helps to spur new research within the ATE program at minimal cost to the collaborating research team.

Lesson Learned #3

Developing inclusive questions is a continuous, collaborative, and conscientious process. NSF recognizes the importance of broadening participation of historically marginalized students in STEM. A step toward evaluating that progress is collecting data on the diversity of students served by projects. Gathering student demographics is not an easy task, especially since the ATE Survey asks projects to report demographics for students who may or may not have self-reported this information. We found that no single question structure works best for all respondents. Therefore, we provide training and assistance to respondents on collecting and restructuring their data to fit the survey questions. For more information on how ATE projects define and measure diversity, equity, and inclusion, check out the webinar “What Gets Measured Gets Done: Exploring ATE Evaluators’ and Principal Investigators’ Attention to Diversity, Equity, and Inclusion.”

Lesson Learned #4

Relationships and trust are essential for survey engagement and relevance. We stay on top of current and future program initiatives by cultivating a strong relationship with our funder, NSF, and by building trust within the ATE community. We build relationships with respondents through conferences, workshops, webchats, and one-on-one conversations. We maintain open channels to communicate about the ATE Survey and provide technical assistance. We also encourage transparency by producing public reports and an interactive online data dashboard. This helps to build a culture around the survey within the ATE community, which can increase survey response rates. For example, the ATE Program Lead or those involved in service projects may encourage participation or send out survey reminders to the larger community. For an additional resource, see this blog post: Tips for Building and Strengthening Stakeholder Relationships.

We hope the lessons we have shared will help you remain flexible, responsive, and inclusive as you approach survey development and adaptation to meet the needs of your community!

 

This blog was originally published on AEA 365.

About the Authors

Lyssa Wilson Becho


Principal Research Associate, The Evaluation Center, Western Michigan University

Lyssa is the Director of EvaluATE and leads the training elements of EvaluATE, including webinars, workshops, resources, and evaluation coaching. She also works with Valerie on strategy and reporting for the ATE annual survey. Lyssa is a principal research associate at The Evaluation Center at Western Michigan University and co-principal investigator for EvaluATE. She holds a Ph.D. in evaluation and has 7 years of experience conducting evaluations for a variety of local, national, and international programs.

Valerie Marshall


Project Manager, The Evaluation Center, Western Michigan University

Valerie has served as a project manager at The Evaluation Center since August 2019. In this role, Valerie works on a variety of local, state, and federally funded evaluation and research projects, including EvaluATE. She is also a doctoral student in the Interdisciplinary Ph.D. in Evaluation program at Western Michigan University. Prior to The Evaluation Center, Valerie worked on research and evaluation projects focused on behavioral health, homelessness and poverty, and social policy in both private and non-profit sectors.

Megan Zelinsky


Senior Research Associate, The Evaluation Center, Western Michigan University

Creative Commons

Except where noted, all content on this website is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License.


EvaluATE is supported by the National Science Foundation under grant number 1841783. Any opinions, findings, and conclusions or recommendations expressed on this site are those of the authors and do not necessarily reflect the views of the National Science Foundation.