Over the years, my affiliation with EvaluATE has significantly contributed to my growth as an evaluator. Recently, EvaluATE paid for me to attend the 2023 conference of the Center for Culturally Responsive Evaluation and Assessment (CREA), held October 3 through 6, 2023, in Chicago. (If you’re interested in attending, the next CREA conference will be held in spring 2025. Information about EvaluATE funding for conferences and other professional development opportunities can be found here.)

As defined on the center’s website, CREA (housed at the University of Illinois Urbana-Champaign)

is an interdisciplinary endeavor that brings researchers together … to address the growing need for policy-relevant studies that take seriously the influences of cultural norms, practices, and expectations in the design, implementation, and evaluation of social and educational interventions. [Hence], the creation of an interdisciplinary evaluation Center is grounded in the need for designing and conducting evaluations and assessments that embody cognitive, cultural and interdisciplinary diversity.

The sessions, attended by a diverse audience of conference-goers, supported and elevated my evaluation practice, expanded my knowledge base, stimulated my thinking, and broadened my perspective on inclusivity in evaluation and assessment. Overall, the conference’s takeaways call on all of us to consider unlearning some (not all) past practices in order to work toward more impactful and culturally responsive evaluations, assessments, and outcomes. Additional takeaways beckon us to:

  • Consider that assessment validity and evaluation conclusions are inextricably linked: validity is key to ensuring high-quality assessment and measurement, which in turn shapes the quality of evaluation conclusions.
  • Be aware of how our identity and experiences influence our work. To conduct high-quality evaluations, it is important to remain mindful of who we are and how our experiences can positively or negatively influence our work.
  • Plan ahead so that evaluation findings can help disrupt inequities. When planning an evaluation, think about how and what data can be used to shed light on inequities and help advance action to redress them.
  • Remember that storytelling is different from telling a story. Tell an accurate, data-informed story that can help advance equity. Don’t make assumptions (i.e., don’t simply tell your own stories about what you observe); use data to document the “why” behind inequities. For example, assuming that women are not interested in STEM careers is not an accurate description of the cause of gender inequities in STEM, and it does not help programs advance change to address the inequity.
  • Be open to new ways of doing things, especially when working with populations with whom you do not identify. For example, one presentation suggested that incorporating images into assessments would help respondents relax and reduce stress, possibly yielding better test results. This was suggested as an especially good idea for Gen-Zers. Since I do not identify as a Gen-Zer, my first assumption was that these images might be distracting. Nonetheless, I am pushing myself to accept approaches that fall outside my comfort zone but better serve the needs of the populations I work with.

In reflecting on the preceding points, we must recognize that trends and practices may sit in generational opposition to one another. Even so, staying open-minded to emerging approaches and ways of doing things is essential if we want to make our work as useful as possible and help programs improve.

CREA is to be applauded for forming a research initiative that heightens evaluators’ awareness of the importance of inclusivity not only in the evaluation and assessment of sponsored programs but also in traditional modes of determining individual and group comprehension of subject matter.

About the Author

Dr. Diana Pollard McCauley Williams

Education Administrator, Independent

Diana received her bachelor’s degree in elementary education with a minor in mathematics from Cheyney University; master’s degrees in mathematics education (Temple University) and library science (Villanova University); and a doctorate in higher education administration (Temple University). Her professional experience spans teaching, administration, educational sales, consulting, and grant evaluation. For almost two decades, she has served as a grant evaluator of projects funded by the U.S. Departments of Education and Labor, the National Science Foundation, and private organizations. Diana has long been on the front line of educational, political, and youth-focused projects, garnering recognition for her service from numerous entities.

Creative Commons

Except where noted, all content on this website is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License.

EvaluATE is supported by the National Science Foundation under grant number 1841783. Any opinions, findings, and conclusions or recommendations expressed on this site are those of the authors and do not necessarily reflect the views of the National Science Foundation.