Hi! We are Val Marshall, Megan Zelinsky, and Lyssa Wilson Becho from EvaluATE, the evaluation hub for the National Science Foundation’s (NSF) Advanced Technological Education (ATE) program. For over two decades, EvaluATE has administered an annual monitoring survey to capture ATE program activities, which we refer to as the ATE Survey. Over time, we’ve learned some lessons about survey development and adaptation in a STEM context. We hope these lessons will be helpful as you evaluate STEM or any other evolving and adapting landscape.
Lesson Learned #1
Survey adaptation is necessary to keep up with a constantly evolving landscape. Innovation and technological advancement are a given in STEM. To keep pace with our ever-changing community, we develop new survey sections and questions to capture the full breadth of activities occurring across the program. We then collect feedback on new survey sections through stakeholder focus groups and interviews. Although this is an annual monitoring survey, we find responsiveness to the current landscape more valuable than maintaining year-to-year consistency across questions. One of the best ways we’ve found to review new survey questions is through think-aloud sessions.
Lesson Learned #2
Holding space for researchers to collaborate on the ATE Survey provides a valuable service back to our community. The ATE Survey is the only program-wide, publicly available data source for the ATE program. Each year, EvaluATE collaborates with researchers interested in career and technical education to include new questions on the ATE Survey that can advance their research. Overall, this practice helps spur new research within the ATE program at minimal cost to the collaborating research team.
Lesson Learned #3
Developing inclusive questions is a continuous, collaborative, and conscientious process. NSF recognizes the importance of broadening the participation of historically marginalized students in STEM. A step toward evaluating that progress is collecting data on the diversity of students served by projects. Establishing student demographics is not an easy task, especially since the ATE Survey asks projects to report demographics for students who may or may not have self-reported this information. We found that no single question structure fits all respondents. Therefore, we provide training and assistance to help respondents collect and restructure their data to fit the survey questions. For more information on how ATE projects define and measure diversity, equity, and inclusion, check out the webinar “What Gets Measured Gets Done: Exploring ATE Evaluators’ and Principal Investigators’ Attention to Diversity, Equity, and Inclusion.”
Lesson Learned #4
Relationships and trust are essential for survey engagement and relevance. We stay on top of current and future program initiatives by cultivating a strong relationship with our funder, NSF, and by building trust within the ATE community. We build relationships with respondents through conferences, workshops, webchats, and one-on-one conversations. We maintain open channels to communicate about the ATE Survey and to provide technical assistance. We also encourage transparency by producing public reports and an interactive online data dashboard. This helps build a culture around the survey within the ATE community, which can increase survey response rates. For example, the ATE Program Lead or those involved in service projects may encourage participation or send out survey reminders to the larger community. For an additional resource, see this blog post: Tips for Building and Strengthening Stakeholder Relationships.
We hope the lessons we have shared will help you remain flexible, responsive, and inclusive as you approach survey development and adaptation to meet the needs of your community!
This blog was originally published on AEA 365.