Recently, I attended the Building Pathways and Partnerships in STEM for a Global Network conference, hosted by the State University of New York (SUNY) system. The conference focused on innovative practices in STEM higher education aimed at increasing retention, completion, and cultural diversity.

As an evaluator, it was enlightening to hear about new practices being used by higher education faculty and staff to encourage students, particularly students in groups traditionally underrepresented in STEM, to stay enrolled and get their degrees. These included:

  • Research opportunities! Students should be exposed to real research if they are going to engage in STEM. This is important not only for four-year degree students but also for community college students, whether they plan to continue their education or move into the workforce.
  • Internships (PAID!) are crucial for gaining practical experience before entering the workforce.
  • Partnerships, partnerships, partnerships. Internships and research opportunities are most useful if they are with organizations outside of the school. This means considerable outreach and relationship-building.
  • One-on-one peer mentoring. Systems where upper-level students work directly with new students to help them get through tough classes or labs have been shown to keep students enrolled not only in STEM programs, but in college in general.

The main takeaway from this conference is that the SUNY system is being more creative in engaging students in STEM and is making a concerted effort to help underrepresented students. This trend is not limited to New York; many colleges and universities are focusing on these issues.

What does all this mean for evaluation? Evidence is more important than ever to sort out what types of new practices work and for whom. Evaluation designs and methods need to be just as innovative as the programs they are reviewing. As evaluators, we need to channel program designers’ creativity and apply our knowledge in useful ways. Examples include:

  • Being flexible. Many methods are brand new or new to the institution or department, so implementers may tweak them along the way. This means we need to pay attention to how we assess outcomes, perhaps taking guidance from Patton’s Developmental Evaluation work.
  • Considering cultural viewpoints. We should always be mindful of the diversity of perspectives and backgrounds when developing instruments and data collection methods. This is especially important when programs are meant to improve underrepresented groups’ outcomes. Think about how individuals will be able to access an instrument (online, paper) and pay attention to language when writing questionnaire items. The American Evaluation Association provides useful resources for this: http://aea365.org/blog/faheemah-mustafaa-on-pursuing-racial-equity-in-evaluation-practice/
  • Thinking beyond immediate outcomes. What do students accomplish in the long term? Do they go on to earn higher degrees? Do they get jobs that match their expectations? If you can’t measure these due to budget or timeline constraints, help institutions design ways to do this themselves. Doing so can help them continue to identify program strengths and weaknesses.

Keep these in mind, and your evaluation can provide valuable information for programs geared to make a real difference.

About the Author

Sarah Singer

Research Associate, Hezel Associates

As a research associate with Hezel Associates, Ms. Singer’s primary focus is STEM education and workforce development research and evaluation. Her expertise includes project management and quantitative and qualitative data collection and analysis. Her current projects include evaluation of an ATE project with River Valley Community College in New Hampshire and several community college-focused U.S. Department of Labor program evaluations. Previously, Ms. Singer served as coordinator for PK-12 recycling education at a municipal recycling agency and an environmental scientist at SRC, Inc. Ms. Singer holds a master’s in public administration with a concentration in environmental policy from Syracuse University.

Creative Commons

Except where noted, all content on this website is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License.

EvaluATE is supported by the National Science Foundation under grant number 2332143. Any opinions, findings, and conclusions or recommendations expressed on this site are those of the authors and do not necessarily reflect the views of the National Science Foundation.