Benchmarking is a process for comparing your organization’s activities and achievements with those of other organizations. In the business world, benchmarking emphasizes measuring one’s performance against organizations “known to be leaders in one or more aspects of their operations” (asq.org). In education contexts, benchmarking tends to be more about comparing an institution’s performance with that of its peer institutions. This may be done using data from the National Community College Benchmark Project (nccbp.org/benchmarks) and the National Survey of Student Engagement (nsse.iub.edu),1 which has data specific to community colleges as well as four-year institutions. In short, benchmarking can be used to assess organizational performance against what is typical or exceptional, depending on your needs.

The ATE survey, conducted annually since 2000, provides aggregate information about ATE-funded projects and centers. The survey data may be used for comparing your individual project or center against the program as a whole. Such a comparison could be used to make a case for addressing a continuing need within the ATE program or to demonstrate your grant’s performance in relation to the program overall. For example, one concern throughout the ATE program and NSF is the participation of women and underrepresented minorities. Based on the 2014 survey of ATE grantees, we know that

  • 42 percent of students served by the ATE program are from minority groups that are underrepresented in STEM; in comparison, individuals from these minority groups make up 31 percent of the U.S. population.
  • 25 percent of students in ATE are women, compared with 51 percent of the population; only in biotechnology does the percentage of women reflect that of the U.S. population.

These and other demographic data may be used to help your project or center assess how it’s doing with regard to broadening participation, in comparison with the ATE program as a whole or within your discipline. Similarly, information about ATE project and center practices may help you gain insights into grant operations. Results from the 2014 ATE survey indicate that

  • 85 percent of projects and centers collaborated with business and industry; of those, 63 percent obtained information about workforce needs from their collaborators.
  • 90 percent of ATE grantees have engaged an evaluator; most evaluators (84%) are external to both the institution and the grant.

Check out our ATE survey fact sheets and data snapshots to identify data points that you can use to assess your performance against other ATE projects and centers: evalu-ate.org/annual_survey. If you would like a tailored snapshot report to assist your project or center with benchmarking against the ATE program, email corey.d.smith@wmich.edu. To see a demonstration of how to compare grant-level, program-level, and national-level data, go to evalu-ate.org/resources/video-data1.
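
If you want to experiment with this kind of comparison yourself, the short Python sketch below shows one way to line up grant-level, program-level, and national-level percentages side by side. The grant-level numbers are hypothetical placeholders for your own project’s data; the program-level and national-level figures are the 2014 ATE survey results cited above. This is a minimal illustration, not an EvaluATE tool.

# A minimal sketch: compare hypothetical grant-level percentages against
# ATE program-level and U.S. national-level benchmarks (2014 ATE survey figures).

BENCHMARKS = {
    # metric: (ATE program level, U.S. national level), in percent
    "Underrepresented minority students": (42, 31),
    "Women students": (25, 51),
}

# Hypothetical percentages for your own project or center (replace with your data).
grant_level = {
    "Underrepresented minority students": 35,
    "Women students": 40,
}

def compare(grant, benchmarks):
    """Print each metric with grant-, program-, and national-level values."""
    header = f"{'Metric':<38}{'Grant':>8}{'ATE':>8}{'U.S.':>8}"
    print(header)
    print("-" * len(header))
    for metric, (program_pct, national_pct) in benchmarks.items():
        grant_pct = grant.get(metric)
        grant_str = f"{grant_pct}%" if grant_pct is not None else "n/a"
        print(f"{metric:<38}{grant_str:>8}{program_pct:>7}%{national_pct:>7}%")

if __name__ == "__main__":
    compare(grant_level, BENCHMARKS)

Running the script prints a small three-column table; swapping in your own project’s percentages, or adding other survey data points you track, is all that is needed to repurpose it.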

Keep in mind that the ATE program should not be used as a proxy for all technician education in the U.S. See Corey Smith’s article on page 3 for a list of other sources of secondary data that may be of use for planning, evaluation, and benchmarking.

1 Both entities restrict data access to institutional members.

About the Authors

Lori Wingate


Executive Director, The Evaluation Center, Western Michigan University

Lori has a Ph.D. in evaluation and more than 20 years of experience in the field of program evaluation. She is co-principal investigator of EvaluATE and leads a variety of evaluation projects at WMU focused on STEM education, health, and higher education initiatives. Dr. Wingate has led numerous webinars and workshops on evaluation in a variety of contexts, including CDC University and the American Evaluation Association Summer Evaluation Institute. She is an associate member of the graduate faculty at WMU. She, along with Dr. Kelly Robertson, led the development of The Evaluation Center's online training program, Valeo (valeoeval.com).


EvaluATE is supported by the National Science Foundation under grant number 2332143. Any opinions, findings, and conclusions or recommendations expressed on this site are those of the authors and do not necessarily reflect the views of the National Science Foundation.