How can ATE project staff, and STEM educators in general, tell whether the strategies they are implementing to increase diversity are actually reaching the targeted students, and whether those students find those strategies helpful?

I’m very passionate about using evaluation and data to support the National Science Foundation’s (NSF’s) goal of broadening participation in STEM education. In IWITTS’ CalWomenTech Project, we provided technical assistance to seven community colleges in California between 2006 and 2011 to help them recruit female students into technology programs where they were underrepresented and retain them. Six of the seven CalWomenTech colleges increased female enrollment in targeted introductory technology courses, and four colleges substantially increased both female and male completion rates (six colleges increased male retention). So how could the CalWomenTech colleges tell, during the project, whether the strategies they were implementing were helping female technology students?

The short answer is: the CalWomenTech colleges knew because 1) the project was measuring increases in female (and male) enrollment and completion numbers in as close to real time as possible; and 2) they asked the female students in the targeted classes whether they had experienced project strategies, whether they found those strategies helpful, and which strategies they had not yet encountered but wanted to.
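To make the first point concrete, here is a minimal sketch of the kind of near-real-time tracking described above. The enrollment and completion counts are invented for illustration; they are not CalWomenTech data.

```python
# Hedged sketch: completion rates by gender for one targeted course.
# The counts below are invented for illustration only.
enrolled = {"female": 12, "male": 48}
completed = {"female": 10, "male": 40}

for gender, total in enrolled.items():
    rate = completed[gender] / total
    print(f"{gender} completion rate: {rate:.0%}")
```

Tracking simple ratios like these each term, rather than waiting for end-of-project reporting, is what let the colleges see whether the numbers were moving while there was still time to adjust.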

What I want to focus on here is how the CalWomenTech Project was able to use the findings from those qualitative surveys. The external evaluators for the CalWomenTech Project developed an anonymous “Survey of Female Technology Course Students” that was distributed among the colleges. The survey combined questions about the classroom retention strategies instructors had been trained on as part of the project, questions about recruitment strategies, and student demographics. The first time we administered the survey, 60 female students responded (out of 121 surveyed) across the seven CalWomenTech colleges. Each college was also provided with the female survey data filtered to its own students.
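As a rough illustration of how a project team might work with such a survey export, here is a minimal Python sketch. The table layout and column names (“college”, “strategy”, “experienced”, “helpful”) are my assumptions for illustration, not the evaluators’ actual instrument.

```python
import pandas as pd

# Overall response rate: 60 of the 121 students surveyed responded.
surveyed, respondents = 121, 60
print(f"Response rate: {respondents / surveyed:.0%}")  # -> 50%

# Hypothetical rows standing in for the anonymous survey export.
responses = pd.DataFrame({
    "college": ["College A", "College A", "College B"],
    "strategy": ["leadership role", "peer study group", "leadership role"],
    "experienced": [True, False, True],
    "helpful": [True, None, True],
})

# Give each college a view filtered to its own students only.
college_a = responses[responses["college"] == "College A"]
print(college_a)
```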

Fifty percent or more of the 60 survey respondents reported exposure to over half the retention strategies listed in the survey. One of the most important outcomes of the survey was that the CalWomenTech colleges were able to use the results to choose which strategies to focus on. Instructors who reviewed the results during a site visit or monthly conference call came up with ways to start incorporating the strategies female students requested into their classrooms. For example, after seeing how many female students wanted to try out a leadership role in class, one STEM instructor planned to assign leadership roles in group projects randomly, so that men would not take the leadership role more often than women.
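Here is a small sketch of the kind of tabulation behind numbers like these, using invented data. One row per respondent per strategy is my assumed layout, and the column names are hypothetical.

```python
import pandas as pd

# Invented mini-extract: one row per respondent per strategy.
responses = pd.DataFrame({
    "respondent": [1, 1, 2, 2, 3, 3],
    "strategy": ["leadership role", "study group"] * 3,
    "experienced": [False, True, False, True, True, True],
    "want_to_try": [True, False, True, False, False, False],
})

# Share of respondents reporting exposure to each strategy.
exposure = responses.groupby("strategy")["experienced"].mean()
print(exposure)  # strategies at or above 0.50 were widely experienced

# Among students who had NOT experienced a strategy, how many want to?
not_yet = responses[~responses["experienced"]]
print(not_yet.groupby("strategy")["want_to_try"].mean())
```

The second tabulation is what surfaces requests like the leadership-role example above: strategies students have not yet experienced but want to are natural candidates for instructors to prioritize next.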

To hear about more evaluation lessons learned, watch the webinar “How well are we serving our female students in STEM?” or read more about the CalWomenTech survey of female technology students here.

Human Subjects Alert: If you administer a survey such as this to a specific group of students and there are only a few of them in the program, the survey is not truly anonymous. Be very careful about how responses are shared and with whom, since this kind of survey includes confidential information that could harm respondents if disclosed.

About the Author

Donna Milgram


Principal Investigator, CalWomenTech Scale Up Project; Executive Director, IWITTS

Donna Milgram is currently the Principal Investigator (PI) of the National Science Foundation (NSF) funded CalWomenTech Scale Up Project and of the Increasing the Number of Women in Technical Careers: Online Professional Development of Leadership Teams at Community Colleges ATE Project. Ms. Milgram has been PI of five NSF projects, including the original CalWomenTech Project, which NSF highlighted in 2009 for demonstrating significant achievement and program effectiveness to its Advisory Committee for Government Performance and Results Act (GPRA) Performance Assessment. Ms. Milgram has spoken and conducted training throughout the U.S. and Canada and has written extensively on evidence-based strategies for recruiting female students to STEM and retaining female (and male) students in STEM. She received a Reader’s Choice Award at the 2013 International Technology & Engineering Education Association (ITEEA) Annual Conference for her cover article, “How to Recruit Women & Girls to the STEM Classroom,” published in Technology and Engineering Teacher magazine. Her presentations and publications include two peer-reviewed papers she presented at the 2010 and 2011 American Society for Engineering Education (ASEE) Annual Conference & Exposition and a third paper presented at the joint National Association for Multicultural Engineering Program Advocates (NAMEPA) and Women in Engineering ProActive Network (WEPAN) 2010 Conference. Ms. Milgram has testified before the U.S. Congress, presented at multiple ATE National Principal Investigators Conferences, and conducted trainings for ATE Projects and Centers.



EvaluATE is supported by the National Science Foundation under grant number 2332143. Any opinions, findings, and conclusions or recommendations expressed on this site are those of the authors and do not necessarily reflect the views of the National Science Foundation.