Research.gov is the online reporting system used by all National Science Foundation grantees. The system is designed to accommodate reporting on all types of work supported by NSF—from research on changes in ocean chemistry to the development of technician education programs. Not all NSF programs require grantees to conduct project-level evaluations, so the Research.gov system does not have a specific section for reporting evaluation results. This may leave some ATE grantees wondering where and how they are supposed to include information from their evaluations in their annual reports. There is no one right way to do this, but here is my advice:

Upload your external evaluation report as a supporting file in the Accomplishments section of the Research.gov system. If the main body of this report exceeds 25 pages, be sure that it includes a 1- to 3-page executive summary that highlights key findings and conclusions. Although NSF program officers are very interested in project evaluation results, they simply do not have time to read lengthy, detailed reports for all the grants they oversee.

Highlight key findings from your evaluation in the Activities, Outcomes, and Impacts sections of your annual report, as appropriate. For example, if you have data on the number and type of individuals served through your grant activities and their satisfaction with that experience, include some of these findings or related conclusions as you report on your activities. If you have data on changes brought about by your grant work at the individual, organizational, or community levels, summarize that evidence in your Outcomes or Impacts sections.

The Impacts section of the annual report is for describing how projects

  • developed human resources by providing opportunities for research, teaching, and mentoring
  • improved capacity of underrepresented groups to engage in STEM research, teaching, and learning
  • provided STEM experiences to teachers, youth, and the public
  • enhanced the knowledge base of the project’s principal discipline or other disciplines
  • expanded physical (labs, instrumentation, etc.) or institutional resources to increase capacity for STEM research, teaching, and learning.

Many—not all—of these types of impacts are relevant to the projects and centers supported by the ATE program, which is focused on improving the quality and quantity of technicians in the workforce. It is appropriate to indicate “not applicable” if you don’t have results that align with these categories. If you happen to have other types of results that don’t match these categories, report them in the Outcomes section of the Research.gov reporting system.

Refer to the uploaded evaluation report for additional information. Each section in the Research.gov reporting system has an 8,000-character limit, so it's unlikely you can include detailed evaluation results. (To put that in perspective, this article has 3,515 characters.) Instead, convey key findings or conclusions in your annual report and point readers to the uploaded evaluation report for the details.

Finally, if the evaluation reveals problems that point to a need to change how the project is being implemented, include that information in the Changes/Problems section of the report. One reason that evaluation is required for all ATE projects is to support continuous improvement. If the evaluation shows that something is not working as well as expected, it's best to be transparent about the problem and how it is being addressed.

About the Author

Lori Wingate


Executive Director, The Evaluation Center, Western Michigan University

Lori has a Ph.D. in evaluation and more than 20 years of experience in the field of program evaluation. She is co-principal investigator of EvaluATE and leads a variety of evaluation projects at WMU focused on STEM education, health, and higher education initiatives. Dr. Wingate has led numerous webinars and workshops on evaluation in a variety of contexts, including CDC University and the American Evaluation Association Summer Evaluation Institute. She is an associate member of the graduate faculty at WMU. She, along with Dr. Kelly Robertson, led the development of The Evaluation Center's online training program, Valeo (valeoeval.com).

Creative Commons

Except where noted, all content on this website is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License.

EvaluATE is supported by the National Science Foundation under grant number 2332143. Any opinions, findings, and conclusions or recommendations expressed on this site are those of the authors and do not necessarily reflect the views of the National Science Foundation.